David D. Thornburg, Associate Editor
Of Babbages And Things
Computer jargon and concepts have permeated our language in strange ways. This came home to me one night when I heard a caller on a talk show say that she had trouble "interfacing" with her partner. I guess this is just a reflection of the pervasiveness of computer technology. Every new technology spawns its own vocabulary, and computers are no exception.
In fact, the computer industry has provided us with both a rich assortment of words and a rich collection of concepts that alter how we think about our world. While the words of technology wax and wane in popularity, the concepts are longer-lived. This makes it easy to misjudge the newness of a concept we have just encountered. When that happens, a brief look at history often shows that what we thought was new was known long ago. I got caught in one of these historical time warps last spring, while teaching a graduate-level computer course at Stanford University in which I introduced a model of program design that I called a microworld.
To my way of thinking, microworlds are made of two kinds of things: objects and operators. The objects have certain attributes, and the operators work on these objects to create new instances of them. These new instances may inherit some or all of the attributes of the old objects. Sound like gobbledygook? Read on.
For example, the microworld of arithmetic contains objects we call numbers. These numbers have attributes (they may be integers, decimals, imaginary numbers, etc.). The operators of arithmetic include addition, subtraction, multiplication, and so on. These operators combine number objects to produce new numbers. Notice that this way of thinking about arithmetic has nothing to do with computers.
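To make the idea concrete, here is a minimal sketch in Python (a modern stand-in this column long predates; the idea itself is language-independent). The Number class, its kind attribute, and the add operator are names I have invented for illustration, not anything from the original course.

```python
# A sketch of the arithmetic microworld: Number objects carry an
# attribute describing what kind of number they are, and operators
# combine old objects into new ones that inherit that attribute.

class Number:
    def __init__(self, value, kind):
        self.value = value   # the number itself
        self.kind = kind     # an attribute: "integer", "decimal", ...

    def __repr__(self):
        return f"Number({self.value!r}, kind={self.kind!r})"

def add(a, b):
    """An operator: combine two Number objects into a new Number.

    The new instance inherits the "integer" attribute only when both
    parent objects carry it; otherwise it is a decimal.
    """
    kind = "integer" if a.kind == b.kind == "integer" else "decimal"
    return Number(a.value + b.value, kind)

# Usage: two objects go in, a new object comes out.
print(add(Number(2, "integer"), Number(3, "integer")))    # Number(5, kind='integer')
print(add(Number(2, "integer"), Number(0.5, "decimal")))  # Number(2.5, kind='decimal')
```

Note that the operator builds a new object rather than changing an old one, which is exactly the "new instances inherit attributes of the old objects" behavior described above.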
Computer Microworlds
Because we have devised ways to represent both numbers and their operations inside a computer, the microworld of arithmetic is a natural domain to implement on one. Of course, arithmetic is not the only microworld we have. Word processing, for example, is a microworld whose objects are letters and whose operators include insert and delete.
What I like about this concept is that it provides a framework for creating flexible computer programs in nearly any domain. To build a microworld, one has to identify the objects and operators, and then build representations of these in the computer using a suitable programming language.
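Following that recipe for the word-processing microworld mentioned above gives something like the sketch below, again in Python and again with every name hypothetical: the objects are letters (held here as a list of characters making up a document), and insert and delete are operators that take one document and yield a new one.

```python
# A sketch of the word-processing microworld: letters are the objects,
# and insert/delete are operators that produce new documents.

def insert(doc, position, letter):
    """Operator: a new document with 'letter' placed at 'position'."""
    return doc[:position] + [letter] + doc[position:]

def delete(doc, position):
    """Operator: a new document with the letter at 'position' removed."""
    return doc[:position] + doc[position + 1:]

# Usage: operators compose, each step producing a new instance.
doc = list("cat")
doc = insert(doc, 1, "h")   # ['c', 'h', 'a', 't']
doc = delete(doc, 0)        # ['h', 'a', 't']
print("".join(doc))         # hat
```

A real word processor would of course add many more objects (words, paragraphs) and operators (move, search, replace), but the framework stays the same: identify the objects and operators, then represent them in a suitable language.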
I thought this way of looking at programming was fairly new, but I soon received the shock of my life while reading a collection of papers about Charles Babbage and the Analytical Engine, a nineteenth-century predecessor of the digital computer. At the end of one article translated into English by Ada Augusta, Countess of Lovelace, were some notes added by the Countess:
In studying the action of the Analytical Engine, we find that the peculiar and independent nature of the considerations which in all mathematical analysis belong to operations, as distinguished from the objects operated upon and from the results of the operations performed upon those objects, is very strikingly defined and separated. It is well to draw attention to this point, not only because its full appreciation is essential to the attainment of any very just and adequate general comprehension of the powers and mode of action of the Analytical Engine, but also because it is one which is perhaps too little kept in view in the study of mathematical science in general.
So Much For Arithmetic
Lest you think she had only mathematics on her mind, she went on to say:
By the word operation, we mean any process which alters the mutual relation of two or more things, be this relation of what kind it may. This is the most general definition, and would include all subjects in the universe.
The Analytical Engine embodied the basic concepts of today's computers, but nineteenth-century craftsmen lacked the technology to build it. Though it was not constructed in Babbage's lifetime, his dreams and Ada's ideas finally came to light a century later.
So the next time you toss computer jargon into your conversation to be trendy, remember that you may simply be following the lead of some British inventors from the 1800s!