Ivor Catt

First published in New Scientist, 6 March 1969, pp 501-502
When Charles Babbage, the British mathematician, designed the first computer, or calculator, in the middle of the nineteenth century, his interests included subjects, such as the calculation of logarithms, which demanded that a great deal of tedious, repetitive arithmetic calculation be carried out. He called his machine a calculating machine.
The modern electronic computer was brought into being during the Second World War by the need to do many tedious calculations on such things as shell trajectories, and was likewise originally thought of as just a calculating machine. Data processing was a completely separate industry, basically concerned with punched-card sorting, and the first data-processing machines were just overgrown punched-card sorters.
The two industries have now merged and today a computer is generally regarded as a machine primarily concerned with processing data. The electronic computer has thus made a radical change in the space of a few years, from arithmetic calculator to data processor.
The question now arises as to whether data processing is the main function that we want electronic computers to perform, now and in the future. Will future generations, looking back, say that the electronic computer started off as an arithmetic calculator, very soon turned into a data processor, and there it remained? This does not seem a plausible sequence of events, and we can surely expect the electronic computer to go through further mutations. Its present position, as a hybrid between the arithmetic calculator and the data processor, seems rather unstable.
Seeing the whole situation
It would surely be to our advantage to attempt to foresee future developments, rather than let our industry grow like Topsy, with no ideas within or without it as to where we are heading. That way, we could run into a blind alley, which would be bad for our self-esteem, our prestige and our pockets. As I see it, the only positive ideas at present are in the ICL Stevenage basic language machine, and in the theory that the computer will link up with the communications industry. However, these ideas seem to be just extensions of the data-processing idea, and not very fundamental.
In the basic language machine, we try to unscramble the Tower of Babel that is the modern data-processing machine so that it may process data a little more efficiently. By linking up with the communications industry, we transfer the data from punched cards to telephone or other communication lines, so that the cards do not have to be carried about.
At the risk of being dubbed impractical, let us reculer pour mieux sauter: draw back in order to make a better leap. And while we are about it, let us do a thorough job by retracing our steps as far back as our minds can conceive. If we do so, reminding ourselves what our business is supposed to be about, there appear to be two basic factors: human and sociological needs, and the capabilities of electronic technology.
The common denominator of most human and sociological needs is what I call "Situation Analysis and Manipulation". The clearest example of a "situation", as I use the word in this phrase, is a topological or geographical environment, where the altitude varies across a two-dimensional surface and the altitude at any point can be specified - a well-mapped piece of land, for example. "Manipulation" in this case would be achieved by altering the altitudes by means of a bulldozer. A more complex example of a situation is a geographical environment which is altering with time, so that a complete description of the situation must contain the time dimension as well as the two horizontal dimensions. Again, manipulation could be achieved with a bulldozer.
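To make the notion concrete, here is a minimal sketch in present-day programming notation (all the names and structure are illustrative, not part of any existing machine): a situation represented as a grid of altitudes, a manipulation that levels part of it as a bulldozer would, and a time-varying situation as a sequence of such grids.

```python
# Illustrative sketch only: a "situation" as a grid of altitudes, and a
# "manipulation" that levels part of it, as a bulldozer would.

def make_terrain(rows, cols, altitude_fn):
    """Describe a situation: the altitude at every (x, y) point is specified."""
    return [[altitude_fn(x, y) for y in range(cols)] for x in range(rows)]

def bulldoze(terrain, x0, x1, y0, y1, target_altitude):
    """Manipulate the situation: flatten a rectangular region."""
    for x in range(x0, x1):
        for y in range(y0, y1):
            terrain[x][y] = target_altitude

terrain = make_terrain(8, 8, lambda x, y: x + y)   # a simple slope
bulldoze(terrain, 2, 5, 2, 5, 0)                   # level the middle

# A situation altering with time adds the time dimension:
# one complete grid per instant.
history = [make_terrain(8, 8, lambda x, y, t=t: (x + y + t) % 5)
           for t in range(3)]
```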
Closed system design
What about our data-processing machine, the digital computer of today? Does its organization lend itself to the analysis and manipulation of situations? It would certainly be surprising if it did, since the machine is rooted in the sorting of punched cards (sequential data processing) and arithmetic calculation (again sequential). This tendency is reinforced by the fact that, up to the present time, all memories which were technically practicable, such as magnetic core memories, have been passive. This means that extracting data for analysis from a memory requires the insertion of energy into it, and we can only insert a limited amount of energy at any one instant. The result is that all the time we are blind to most of the information in the memory, and data has to be extracted sequentially, bit by bit. When analysing a situation we, like a short-sighted man, would have only enough vision (energy) to see a very small area at any one time.
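The word-at-a-time constraint is easy to model. In the illustrative sketch below (not a description of any real machine), a passive memory yields one word per access, so that answering even the simple question "which words contain this value?" degenerates into a sequential scan of the whole store.

```python
# Illustrative model of a passive (core-like) memory: energy can be
# inserted at only one point per cycle, so only one word is visible at a
# time and any search becomes a word-by-word sequential scan.

memory = [17, 42, 9, 42, 3, 42, 25, 8]

def read_word(address):
    """One access yields one word; the rest of the store stays dark."""
    return memory[address]

def find_all(value):
    matches = []
    for address in range(len(memory)):   # blind to all but one word at a time
        if read_word(address) == value:
            matches.append(address)
    return matches

print(find_all(42))   # -> [1, 3, 5], after eight sequential accesses
```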
The two requirements of sequential data processing and sequential arithmetic calculation, backed up by the historical accident that we had no active "parallel" memories, so that our machine memories were oriented towards sequential work, created what Koestler called a "closed system". This then proceeded to reject any evidence which did not indicate that all we needed to do was process data sequentially. A "closed system" will reject new developments, and oppose any attempt to profit from them. The unreasonably slow development of semiconductor integrated-circuit and large-scale-integration memory is not due to technical barriers; it must be caused by other factors, such as the ideology of a "closed system".
The first integrated circuit memory, the Honeywell Transitron 16-bit memory, has had all the weaknesses of a passive memory (in this case, core memory) forced into its design. The first emitter-coupled logic (as opposed to transistor-transistor logic) 16-bit memory, made by Motorola, has been similarly "gelded". However, there is the possibility of progress, in spite of the fact that we seem to be trapped in the "closed system" of sequential data processing. This is because the modern data-processing machine has become so highly developed that the problems of its internal organization have taken on, to a small degree, the appearance of the larger problems of the real world outside the machine.
The problem of efficient time sharing - running more than one programme on the machine at the same time - is a situation, albeit a rather simple one, calling for analysis and manipulation. The problem of memory usage in a big machine is again a situation. The data-processing machine, for reasons of speed, cost and efficiency, has a hierarchy of memories, ranging from very large, slow ones to very small, fast ones. The problem of where information is and should be stored is a situation calling for analysis and manipulation.
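As a toy model of that last situation (hypothetical, not any particular machine's scheme), consider a two-level hierarchy in which the machine must continually decide which words deserve the small, fast level, demoting the least recently used word when that level is full.

```python
# Toy model of the memory-placement situation: a small fast store in front
# of a large slow one, evicting the least-recently-used word when full.
from collections import OrderedDict

FAST_CAPACITY = 4
fast = OrderedDict()                                 # small, fast level
slow = {addr: addr * 10 for addr in range(1000)}     # large, slow level

def fetch(addr):
    if addr in fast:                  # already well placed: fast access
        fast.move_to_end(addr)
        return fast[addr]
    value = slow[addr]                # badly placed: slow access, so we
    fast[addr] = value                # manipulate the situation by promoting
    if len(fast) > FAST_CAPACITY:     # this word and demoting the least
        fast.popitem(last=False)      # recently used one
    return value

for addr in (1, 2, 3, 4, 5, 1):
    fetch(addr)
print(list(fast))   # -> [3, 4, 5, 1]: word 2 has been demoted
```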
Not surprisingly, the tool that is being developed to deal with these situations is an active memory, a semiconductor integrated-circuit memory. These have only recently become technically practicable. They are generally described as "associative memory", "content addressable memory" (CAM) or "slave memory". There is almost no literature on the subject, either in published papers or in textbooks.
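By way of contrast with the sequential scan sketched earlier, the following rough functional model (again illustrative only) shows what a content addressable memory offers: every cell compares itself against the search word simultaneously, so a search yields one match bit per word in a single memory cycle rather than one read per word.

```python
# Functional sketch of a content addressable memory: every cell carries a
# comparator, so a search interrogates all words simultaneously and yields
# one match bit per word in a single memory cycle.

class ContentAddressableMemory:
    def __init__(self, words):
        self.words = list(words)

    def match(self, key, mask=~0):
        """All cells compare against (key & mask) at once; conceptually
        this loop happens simultaneously in hardware."""
        return [(word & mask) == (key & mask) for word in self.words]

cam = ContentAddressableMemory([17, 42, 9, 42])
print(cam.match(42))   # -> [False, True, False, True], in one cycle
```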
My suggestion is that we should do all we can to encourage designers to allow the content addressable memory to look outward at the world outside the machine and try to deal with external situations, as well as look inward and handle situations internal to the machine. I feel confident that this will happen in time of its own accord. But in the absence of a conscious effort on our part it will take ten years instead of two, and our increasingly sophisticated and complicated data-processing machines will meanwhile come to be regarded with justifiable cynicism as dinosaurs which are very good at solving the wrong problem.
This article is not a discussion of content addressable memories. However, it has to be said that the coming of active memories ushers in a new era of memory, where we have a whole range of possibilities. At the one extreme is the equivalent of passive (core) memory, where only a small portion of the information can be analysed and manipulated at one time. Then we proceed to content addressable systems, which have a small amount of processing (analysing) capability distributed throughout the memory. Then come the more complex types of memory, with a greater degree of processing capability distributed around the memory.
I like to call the old passive type of memory "simple memory", and memory with any distributed processing capability "complex memory". We shall have to feel our way towards more sophisticated, complex memory, and our minds cannot at present conceive the direction of further development. But let us make an effort to get onto the first rung of the ladder by pushing forward in the use of content addressable memories, so that we have the chance, with more complex memories, to develop further towards a machine with the capacity to analyse and manipulate situations in general.
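One step beyond the content addressable memory, a "complex memory" in this sense would let every cell apply a small operation to itself, so that a whole-memory manipulation takes one cycle rather than one cycle per word. A speculative sketch, with all names invented for illustration:

```python
# Speculative sketch of a "complex memory": a little processing logic at
# every cell, so an operation such as "increment every word that matches"
# sweeps the whole memory at once rather than word by word.

class ComplexMemory:
    def __init__(self, words):
        self.words = list(words)

    def apply_where(self, predicate, operation):
        """Each cell tests and transforms itself; conceptually one memory
        cycle, with the processing distributed through the array."""
        self.words = [operation(w) if predicate(w) else w
                      for w in self.words]

mem = ComplexMemory([17, 42, 9, 42])
mem.apply_where(lambda w: w == 42, lambda w: w + 1)
print(mem.words)   # -> [17, 43, 9, 43]
```

How much logic to distribute, and at what granularity, is exactly the range of possibilities left open above.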