From number crunching to information technology and beyond.

Silvio Hènin
Roche S.p.A. (Monza - Italy)
SILVIO.HENIN@Roche.COM
 



"The world is moving so fast these days that the [one] who says it can't be done is generally interrupted by someone doing it" - Elbert Hubbard
 
 

Introduction

I switch on my mobile phone to send a friend an SMS message; I have already connected to the Internet to check the weather forecast and the stock quotations, while listening to music from a CD and while my wife watches a film on DVD with 5-channel Dolby Surround sound. On the way home, I let the GPS navigation system guide me while my son plays with a hand-held video game.
Only twenty years ago, all this looked like science fiction. Everything has happened so quickly... how did it happen?
All these devices have one thing in common: inside each of them is a tiny binary electronic computer. I want to tell you the story of this technological conquest, the digital processing of information, which was born to carry out mathematical calculations; the story of the ideas, and of the dreamers, who wrote it.

Numbers and calculations

We are in the year 2001, the year Stanley Kubrick and Arthur C. Clarke chose for their Space Odyssey. Man's voyage to Jupiter in pursuit of the monolith has not taken place, but something far more important has arrived, affecting our private, social and economic life: the PC and the way PCs are connected together, the Internet.

I like to recall the opening images of the film, in which a bone, thrown by an ape-like ancestor of ours, whirls through space until it becomes a space station. I examine the bone at close range and discover on its surface some small carvings, notches arranged in a non-random manner. Counting them, I realize that they may well represent the cycles of the moon.

Archaeological findings show that 35,000 years ago man had already learnt to represent quantities by means of symbols. His outstanding brain succeeded in counting days, sheep, moon phases and the steps of a path, and in transforming them into something physically different which allowed him not only to record quantities but also to operate on them (two sheep plus three sheep = five sheep; two notches plus three notches = five notches). The step from the symbolic representation of quantities (writing) to their processing (mathematics) by means of simple, repeatable procedures (algorithms) was surely a long one, but by then an obligatory one. The most ancient known handbooks of arithmetic describing the procedures for carrying out operations, the Ahmes papyrus and the Sumerian tablets, date back to the 18th-16th centuries before the common era.

Together with philosophy, mathematics and geometry were the truly important inventions of the ancient civilizations (Babylonians, Egyptians, Indians), and they were developed to an extraordinarily high degree of logical and structural perfection by the Greeks (Pythagoras, Archimedes, Thales, Euclid and many others). Surprising is the "unreasonable effectiveness of mathematics" (E. Wigner) in helping man explore the microcosm and the macrocosm, in understanding, or at least conjecturing, how the universe works, and in forecasting its behaviour. Wigner underlines how incredible it is that theorems proposed five hundred years before Christ for purely theoretical purposes turned out, more than two thousand years later, to be decisive for our physical knowledge of the universe, with the quantification of science begun by Bacon, Galileo and Kepler and completed by Newton in his Philosophiae Naturalis Principia Mathematica. In the opinion of some epistemologists, all this shows how natural it is for man to represent the world in mathematical terms so as to be able to act freely on symbols, it being impossible for him to act on the motion of the stars!

But let us go back in time. Why did the ancients need calculation, say up to the late Middle Ages? Astrology and astronomy, to forecast the motions of the stars and their effects on the events of human life (the horoscope). Economics, to divide estates and herds among heirs (100 sheep and 3 sons: what a problem!) and to collect taxes and duties. Engineering, to foretell how many stones would be needed to build a pyramid. Further on in time we find even more complex problems: interest on loans, customs duties, the calculation of the date of Easter.

Handling numbers, or rather operating on them (adding, subtracting, multiplying, dividing), at the time of cuneiform characters or of Roman numerals was not an easy task. The problem was got around with the most primitive means of calculation: the abacus. In its most rudimentary form, the abacus was a set of pebbles (calculus = pebble) arranged in little holes dug in the sand (abacus from the Arabic 'âbâq = dust, sand). One hole for units, one for tens, one for hundreds and so on. To add 5 and 6, it was enough to drop 5 pebbles into the units hole, which already contained 6; however, since more than 9 pebbles were not allowed in any hole, 10 pebbles were taken out, leaving 1 pebble in the units hole and putting 1 pebble in the tens hole (provided, of course, that people counted in base 10): the carry had been invented.

The use of the abacus was still widespread in the 18th century of our era, and it was applied to the calculation of State finances, too: the title of Chancellor of the Exchequer, one of the highest offices of the British Government, takes its name from a version of this instrument (the chequered counting board). In the Orient its use was still common after the Second World War: a famous contest pitted a Japanese clerk with his abacus against an American soldier with an electric calculating machine - the Japanese won four rounds out of five!

But already around the year 1000 A.D. a new means of representing quantities appeared in Europe, the so-called Arabic numerals (they actually originated in India, and the Arabs adopted them in the 8th century). They were introduced into Western culture by the Italian mathematician Fibonacci in 1202. Their innovative features were positional notation and the zero acting as "place-keeper" (from sifr -> zephyr -> zero and cipher). Arabic numerals made it possible to simplify the procedures of the four operations and to execute them in a mechanical way. In practice, the arrangement of the pebbles on the abacus dematerialized into the symbols themselves, with the advantage that these did not roll away! A simple, easily repeatable procedure was born: knowing by heart the addition and multiplication tables for pairs of digits, any operation could be performed with the sole support of paper and pen. A procedure had been invented, or rather an algorithm (the name derives from the Arab mathematician Al-Khwarizmi, author in the 9th century A.D. of the first treatise on reckoning with the Indian numerals).
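
As a modern illustration (mine, not the article's), the paper-and-pen procedure just described - single-digit sums known by heart, plus a carry passed from column to column - can be sketched in a few lines of Python:

```python
# A minimal sketch (hypothetical, for illustration): schoolbook addition of two
# numbers written in positional notation, using only single-digit additions and
# the carry - the "algorithm" the text describes.

def add_positional(a_digits, b_digits, base=10):
    """Add two numbers given as lists of digits, least significant digit first."""
    result, carry = [], 0
    for i in range(max(len(a_digits), len(b_digits))):
        a = a_digits[i] if i < len(a_digits) else 0
        b = b_digits[i] if i < len(b_digits) else 0
        total = a + b + carry          # a single-digit sum, known "by heart"
        result.append(total % base)    # the digit that stays in this column
        carry = total // base          # the carry passed to the next column
    if carry:
        result.append(carry)
    return result

# 256 + 187 = 443; digits are stored least-significant first
print(add_positional([6, 5, 2], [7, 8, 1]))   # -> [3, 4, 4]
```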

The superiority of the new notation over the old was recognized very slowly: four hundred years after Fibonacci it still could not be used in public deeds and contracts, and for a long time Roman and Arabic numerals lived side by side, sometimes in odd combinations.

The seventeenth and eighteenth centuries

Don't believe that complex mechanisms were unknown in ancient times. In 1901, in the Aegean Sea off the island of Antikythera, sponge divers brought to the surface treasures from a Roman ship sunk around 87 B.C. Among the amphorae and statues there was also a fragment of a strange set of gears. A gamma-ray analysis performed under the direction of the historian Derek de Solla Price showed that it was a high-precision astronomical device which made it possible to compute the phases of the moon for any year. Precision technology was forgotten in the Middle Ages and reappeared only with the clock-making of the 14th century, to mark the hours of prayer in monasteries and to call the faithful to devotions. Clocks cannot be considered calculators in the true sense of the word, but the technology created to make them was used to build the first calculating machines.

Up to the 16th century the need for calculation was still very limited. Apart from merchants, bankers and administrators of the royal finances, the great majority of the population - the rich and learned classes included - was unable to carry out a two-figure multiplication, not to mention the poor, among whom illiteracy reigned supreme. Consider that, to sit on a jury in the eighteenth century, it was necessary to demonstrate the ability to count to ten! Between the end of the fifteenth century and the seventeenth, a series of events triggered a technological revolution that could not do without calculation, and demanded ever greater precision. Let us mention just two examples:

1) The discovery of the New World (1492) started the great explorations by sea. Safe and repeatable routes had to be defined; it was necessary to "fix the position", that is, to know where the ship was at any moment by calculating the latitude and longitude of the ship or of a place. The division of the American colonies between Spain and Portugal was ratified in 1506 by a bull of Pope Julius II, precisely along the meridian 46°30' West. An error of just one degree would have given more than 111 km of territory to one of the two rivals.

2) A new implement of war had appeared: the cannon. To hit the target, the aim, the weight of the ball and the quantity of powder had to be determined carefully: again, calculations.

But the greatest impulse came from the birth of modern science. Thanks to Newton, the description of the physical world was no longer only qualitative but became quantitative too, incorporating mathematics. Thus the trajectory of a bullet and the orbit of a planet became calculable.

Between the 15th and the 18th century many calculating and measuring instruments appeared, such as the astrolabe, the sextant and Galileo's geometric and military compass, which made it possible to solve practical problems rapidly, without recourse to the abacus or to paper and pen; above all, the operator did not need great mathematical skill. The three instruments just quoted were analog devices, as they reproduced the quantities to be calculated by means of geometric quantities proportional to them. This kind of calculating machine evolved right up to our own century, using electronics too, but was eventually replaced completely by digital machines. For reasons of space, we will say no more about their history.

It was in the seventeenth century that the first true digital (that is, numerical, finite-state) machines appeared. The best known is surely the Pascaline, designed by the French philosopher and mathematician Blaise Pascal to ease the work of his father, a tax collector. The Pascaline could add and subtract by means of a clever system of cog-wheels with automatic carry propagation. About fifty machines were built, thanks also to the privilege granted to Pascal in 1645 by Louis XIV, the "Roi Soleil". Other attempts preceded the Pascaline, such as the so-called Napier's Bones of 1617, which gained some popularity, and Schickard's machine of 1623, of which only a sketch remains, sent by Schickard to his friend Kepler.

In the second half of the 17th century another important name enters the history of calculating machines: that of the mathematician Gottfried Leibniz, known above all as the inventor of the differential calculus. Thanks to a technical trick (the stepped drum), Leibniz's machine, completed in 1694, could also carry out multiplications and divisions. The mechanism was not very fortunate: it was not easy to use, it broke down easily, and its carry mechanism was muddled to say the least.

In our history, Leibniz must be remembered for at least three other reasons:

1) inspired by the Chinese symbols of the I Ching, he proposed binary notation, today the basis of all computers (anticipated in some way by Francis Bacon);

2) he recognized the importance of the mechanization of calculation, stating that "... it is unworthy of excellent men to lose hours like slaves in the labor of calculation which could safely be relegated to anyone else if machines were used";

3) he contributed to the formalization of symbolic logic, suggesting its use in philosophical reasoning as well. Indeed, he imagined that "... in the future, philosophers would settle their disputes as accountants do, just by taking up pens and calculators (abaci) and proclaiming: let us compute!"

The technology of Pascal and Leibniz, and of other inventors of the same age (Morland, Grillet, Poleni, Stanhope, Hahn, to mention only a few), remained - though continuously improved - at the base of many instruments of the following centuries, down to the first calculators that met with commercial success. The 19th century saw many patents for calculating machines, such as the arithmometer of Thomas de Colmar (1820), the calculating machines of the American Baldwin (1875) and the Swede Odhner (1874), and the first key-driven machines, such as the "macaroni box" of the American Felt (1885).

Between the end of the nineteenth century and the beginning of the twentieth, the times were ripe for the emergence of a market able to absorb these products and, therefore, of a flourishing industry in the field. The reasons must be sought mainly in the rapid development of production and commerce that followed the industrial revolution, and in the rise of a business class which needed instruments for the control of warehouses, sales, invoices and budgets. To meet this demand, efficient and economical calculating machines were needed. Before the First World War, important brands were established - Friden, Burroughs, Monroe, Marchant, Felt & Tarrant, Brunsviga, NCR, Olivetti, Facit, Remington - which enjoyed remarkable commercial success up to the Fifties.

Mechanical calculating machines for the four operations were a species destined to die out with the advent of computers. Still in the 1960s, however, the last heirs of the Pascaline and of Leibniz's machine were being sold as pocket or desk calculators. I personally remember the last exemplar of the species, the Curta, a little jewel of precision mechanics based on a miniaturized version of Leibniz's stepped drum, of which some 140,000 units were sold up to 1972.
 

The industrial revolution and the new needs for scientific calculation

Calculating machines could solve many problems, but they did not yet make the work of mathematicians easier. Calculating complex functions was long and tedious work, prone to error. Think of a simple expression such as y = 5x³ + 6x² + 7x + 3, to be evaluated for each value of x between 1 and 100. For each point, a term-by-term evaluation requires 6 multiplications and 3 additions (some 900 operations in all), and every partial result has to be transcribed (without mistakes), as well as the final result (again without mistakes).
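
Today the same tabulation is a few lines of code (a sketch of mine, not the article's), run in a fraction of a second:

```python
# Illustrative sketch: tabulating y = 5x^3 + 6x^2 + 7x + 3 for x = 1..100,
# the kind of drudgery that motivated mechanical and automatic calculation.

def y(x):
    return 5 * x**3 + 6 * x**2 + 7 * x + 3

table = [(x, y(x)) for x in range(1, 101)]   # the whole numerical table at once
print(table[:3])   # -> [(1, 21), (2, 81), (3, 213)]
```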

The helping instruments of the time were the so-called numerical tables. Already in the seventeenth century many tables had been published: multiplication tables, logarithms, trigonometric functions, and special tables for navigation or actuarial work. Tables were calculated by hand under the direction of one or more expert mathematicians, who planned the work of a troop of "computers" (up to about 1950 the term computer meant a man or a woman charged with repetitive calculations) in a veritable assembly line of figures! Calculation and transcription errors occurred often, and reliability was poor.

What was needed was a machine able not only to carry out the individual operations automatically and in the exact sequence, but also to keep partial results in a sort of memory and to print the final results directly. Ideally, it should be possible to instruct the machine anew each time, so that it would execute a different sequence according to the problem to be solved. This dream, already expressed in embryo by Leibniz, occupied the imagination of many; the first to express it fully and turn it into a reason for living was Charles Babbage. Babbage's scientific interests ranged from pure mathematics to philosophy, from politics to industrial organization, from railway technology to metaphysics. He was not an easy man; his passions and his hatreds made life difficult for him and for those around him, but his contributions to British science were enormous: suffice it to say that he was appointed to the Lucasian chair of mathematics at Cambridge - Newton's chair - when he was 35!

Babbage lived in the period of the Empire's greatest industrial expansion, a moment when the power of steam was the symbol and essence of progress. As a mathematician, Babbage was particularly bothered by the frequent inaccuracies of numerical tables. While he was still a student at Cambridge, his friends once asked him what he was thinking: "I am thinking that all these tables [pointing to the logarithms] might be calculated by steam!" A few years later, in 1823, he proposed a first project to automate the calculation of polynomial functions by the "method of differences", using a steam-driven mechanical system, and submitted the idea to the Royal Society, which judged it worthy of government support. The troubled history of the government funding (as much as 17,000 pounds of the day), of the continual delays, of the quarrels with the machinist Clement and, finally, of the failure of the project has provided material for more than one book. Consider that in the same years a modest Swedish printer, Scheutz, managed to build a difference engine which was much simpler and cheaper but which worked, and of which three copies were sold.
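
The "method of differences" itself deserves a small aside (my sketch, not the article's): once the leading differences of a polynomial are set up, every further table value is obtained by additions alone, which is exactly what the engine's columns of wheels mechanized:

```python
# Illustrative sketch of the method of differences behind Babbage's engine.

def initial_differences(values, order):
    """Seed the engine: first value and leading differences from a few known entries."""
    rows = [values[:]]
    for _ in range(order):
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return [row[0] for row in rows]

def tabulate(diffs, count):
    """Extend the table using additions only, as the engine's wheels did."""
    d = diffs[:]
    out = []
    for _ in range(count):
        out.append(d[0])
        for i in range(len(d) - 1):   # each difference absorbs the one below it
            d[i] += d[i + 1]
    return out

# y = x^2 + x + 41 for x = 0..3 gives 41, 43, 47, 53; two orders of differences suffice
seed = initial_differences([41, 43, 47, 53], order=2)   # -> [41, 2, 2]
print(tabulate(seed, 8))   # -> [41, 43, 47, 53, 61, 71, 83, 97]
```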

Babbage's fickleness and perfectionism, which cost him the distrust of many contemporaries (though not of his dearest friends), did not dim his even more futuristic dream, begun in 1834: the Analytical Engine.

Analytical Engine was the name Babbage gave to the project of a true programmable calculator, provided with the basic elements that can still be recognized in a computer today:

a) a calculating unit (the "mill") which carried out the four basic operations (today, the CPU)
b) a memory (the "store") in which to keep partial results (today, the RAM)
c) the instructions (the "barrel") which the machine would follow in sequence to solve the problem (today, microprogramming)
d) a unit for the input of data and programs
e) a unit for the output of results.

The program and the input data were to be recorded on punched cards like those used in the Jacquard loom.

Babbage himself published very little about the Analytical Engine. A meeting held in Turin prompted Luigi Menabrea (a future prime minister of Italy) to publish a description of it in a Swiss journal. This article was then translated and annotated by Ada Lovelace, Byron's daughter and Babbage's friend, since then remembered as the first programmer in history.

The Analytical Engine was never built, except for the central part of the mill, which was completed in 1906 by Babbage's son Henry. The device was so complex, and the precision required of its parts so high, that construction costs and times would have been prohibitive. And even if it had been built and had worked, no one would have bought it: in the nineteenth century human labour (even that of "computers") was cheap enough not to be replaced by a machine.

In any case, Babbage left a heritage which, after being forgotten for about a century, reappears, more or less acknowledged, in the first electromechanical computers of the Thirties. If he were alive today, would he recognize in our PCs of plastic and silicon the descendants of his fantastic creature of brass and steel, powered by steam?
 

The turn of the century (nineteenth to twentieth)

The progress of commercial and industrial activity, together with the increasing complexity of State organization in the most advanced countries, produced a great need for data handling at the end of the 19th century. The American census is a typical case. In the United States, population growth made the processing of census data ever more difficult: by 1840 the inhabitants of the United States had grown to 17 million. In the meantime, the range of information collected became ever broader (school education, ethnic origin, immigration, occupation, health conditions), further sharpening the problem. The crisis came with the census of 1880, whose processing lasted seven years, nearly overlapping with the next one.

The solution came from a young engineer with an interest in statistics, Herman Hollerith, employed by the Bureau of the Census. Hollerith resorted to a simple trick: punched cards. The idea was suggested to him by a figure well known to health librarians: J.S. Billings, founder of the National Library of Medicine and of the Index Medicus.

At first, Hollerith's machine was only a card sorter, able to distribute the punched cards into different collection drawers according to the data they contained. The card stopped under a press fitted with metal pins; each pin that met a hole passed through it and dipped into a small cup of mercury, closing an electric circuit. This caused the appropriate drawer to open. The next model made counting automatic, too: the same electric circuit advanced a counter. Thanks to this system, processing the data of the 1890 census required only six weeks! In the following years, Hollerith's machines were used for the censuses of the Austro-Hungarian Empire, Canada, the Kingdom of Italy and the Russian Empire.
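
In modern data-processing terms, what the tabulator did electromechanically - route each card to a bin according to one field and advance a counter - is a group-and-count. The analogy below is mine, not the article's, and the "cards" are hypothetical:

```python
# A loose modern analogy to Hollerith's sorter/tabulator: sort records into bins
# by one field and keep running counts of another.

from collections import Counter, defaultdict

# hypothetical census "cards": (state, occupation)
cards = [("NY", "farmer"), ("NY", "clerk"), ("OH", "farmer"), ("NY", "farmer")]

bins = defaultdict(list)      # the collection drawers
counters = Counter()          # the counting dials

for state, occupation in cards:
    bins[state].append((state, occupation))   # the drawer that opens for this card
    counters[occupation] += 1                 # the dial that advances one step

print(counters)   # -> Counter({'farmer': 3, 'clerk': 1})
```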

Hollerith's machines were simply sorters, organizers and counters of cards. Despite their great commercial success, we were still far from the computer as we understand it today and as Babbage had conceived it fifty years earlier. Hollerith's story nevertheless represents a very significant step in the evolution of calculating machines, for at least two reasons. From the technical viewpoint, his was the first device to exploit electricity to transfer information from the card to the counter. From the commercial viewpoint, the success of the idea led Hollerith to establish the Tabulating Machine Company in 1896, which was to become International Business Machines (IBM) in 1924.
 

1935 - 1945

Until the end of the Thirties, the use of electricity in calculating instruments was limited to Hollerith-type machines, even though other electrical applications were already established facts: the telegraph (1844), the telephone (1876), the electric light bulb (1879), radio (1901), television (1927) and radar (1935). We may well imagine how many people thought of using electricity to carry out calculations as well. Few, however, went beyond timid experiments. Three names must be remembered as the first builders of calculators that worked by electricity: Konrad Zuse, Howard Aiken and George Stibitz.

In 1936, in Germany, an engineer named Konrad Zuse, completely outside the academic world, began building an automatic machine to solve the calculation problems involved in designing aircraft wings, analyses that forced him into long and repetitive computations. In his living-room, with only a few tools, he first produced a binary mechanical memory, to which he soon connected a mechanical calculating unit and a program unit controlled by old movie film punched by hand. He called this model V1 (Versuchsmodell 1), but later changed the name to Z1 so as not to confuse it with the flying bombs of the same name! Aware of the poor reliability and slowness of this machine, in 1939 Zuse built a second one, the Z2, which still had a mechanical memory but whose calculating unit was electromechanical, operated by relays.

In the following years Zuse completed a real, working electromechanical computer, the Z3, which was demonstrated in 1941 to an audience of engineers and scientists, arousing great interest. Not yet satisfied, living in a Berlin under continual bombardment where it was difficult even to find food, Zuse built the Z4 with a mechanical memory (relays were by now unobtainable); the machine was finished in 1944. Zuse managed to protect the Z4 from the destruction of the war and from the hands of the Allies by hiding it in a cellar in the small Bavarian village of Hindelang. Once calm had returned, Zuse transferred it to the Swiss Federal Institute of Technology (ETH) in Zurich, where it remained in operation for 15 years. Up to 1951, this machine remained the only working computer in continental Europe.

Zuse's story is emblematic for at least five reasons: 1) his contribution was completely original, since he was isolated not only from the rest of the world but even from German research; 2) he conceived a binary representation of numbers, the one adopted by all modern computers; 3) he independently arrived at an architecture that Babbage had already proposed; 4) he invented the first programming language (Plankalkül, 1943-45); 5) he faced the problem in an extremely practical and simple way: the estimated cost of the Z2 was only 6,500 US $.

In the same years, in the United States, another pioneer faced the same problems as Zuse. Howard Aiken, of Harvard University, was in some sense Babbage's continuer in our century, perhaps the only one who had read Babbage's writings and who dedicated his own achievements to him (in Harvard's cellars, Aiken found a fragment of the Difference Engine, a gift from Babbage's grandson). In 1939 Aiken began the construction of an electromechanical programmable calculator, assembling parts of commercial IBM machines into a single structure, with the help and financing of that by then powerful company. In 1944 the Automatic Sequence Controlled Calculator (later renamed Harvard Mark I, owing to disagreements with IBM) was completed and working. It was a monster: 17 metres long, 2.5 metres high, about 4.5 tons in weight, containing 760,000 parts and 800 km of wiring. Its cost exceeded 300,000 US $.

The Mark I worked actively until 1958, solving many mathematical problems, but Aiken's conceptions were decidedly out of date: 1) it was still a decimal calculator; 2) its programming facilities were limited; 3) like Babbage, Aiken still thought of the computer essentially as a machine for producing accurate numerical tables. Typical of Aiken's mentality is his statement that about ten computers would be enough for the whole world!

The third figure, Stibitz, was a scientist at Bell Telephone Laboratories who had rather more practical problems to solve. His electromechanical calculator, the Complex Number Calculator (CNC), was built with the sole aim of speeding up the operations required in the design of telephone circuits. The CNC was not a true general-purpose computer, but it must be remembered for one reason (besides being one of the first electromechanical computers): it was the subject of the first experiment in remote processing, when in 1940 a terminal in Hanover, New Hampshire, operated the computer in New York over a telephone link, at a distance of 364 km. An ancestor of the Internet?
 

The war and its needs for calculation. Electronics enters computers.

In the preceding section we saw how, during the Thirties and Forties, there was a great spur to build automatic, programmable computers, and how it finally became possible to obtain the public or private funding indispensable for such large projects. New sciences had created new needs for calculation: aerodynamics, to design ever faster and more manoeuvrable aircraft; ballistics, to print accurate firing tables for the artillery; nuclear physics, to understand and exploit atomic energy; cryptanalysis, to read the enemy's secret messages. In short: needs created by war!

The first electronic, vacuum-tube calculators arose precisely from these pressing wartime needs: ENIAC in the USA and Colossus in England. The use of thermionic valves in place of relays had already been suggested at the end of the Thirties. The vacuum tube (do you remember the old radios, with their glass-and-metal objects that heated up like ovens?) can act as a switch working a thousand times faster than one of Aiken's relays and incomparably faster than one of Babbage's cog-wheels. As early as 1939 Helmut Schreyer, a friend of Zuse, had written his doctoral thesis on the subject and had built a prototype vacuum-tube calculating unit, which was unfortunately lost during the war. On the other side of the Atlantic, John Vincent Atanasoff of Iowa State College built the prototype of a simple vacuum-tube computer for solving systems of linear equations.

In 1943 two scientists at the Moore School of Electrical Engineering of the University of Pennsylvania, J. Mauchly and J.P. Eckert, proposed to the United States Army a revolutionary solution for producing firing tables. The project was accepted, financed by the Army and named Electronic Numerical Integrator And Computer (ENIAC). I will not dwell on the interesting history of this project, which on its fiftieth anniversary provided material for many books, conferences and Internet sites. In the autumn of 1945 ENIAC was completed, thanks to a human and financial effort that only the pressure of war could guarantee. The machine was enormous: 19,000 vacuum tubes, 1,500 relays, 200 kW of power consumption, 200 square metres of floor space. All this for a computer whose performance looks laughable when compared with a pocket calculator of today. The total cost of ENIAC was 486,804 US $.

ENIAC's architecture does not allow us to consider it the direct forerunner of today's electronic computers: its numeration was decimal rather than binary, programming was done in the hardware by moving cables and switches, and the memory was not separate from the calculating unit. In these respects it was a step backwards compared with Babbage's idea and Zuse's computers. ENIAC was the most powerful and fastest machine of its time only because of its vacuum tubes and its sheer size: a "brute force" approach.

An electronic project of similar size had preceded ENIAC in England: Colossus. It took shape in the remarkably effective organization set up at Bletchley Park for decoding the secret German messages encrypted by the Enigma, Lorenz and Siemens machines. Churchill referred to this motley group of mathematicians, linguists and chess players as his "... goose that laid the golden egg but never cackled", because of the precision and constancy with which they managed to read the secret messages of the Axis forces. Among these scientists one figure stands out, a certain Alan Turing: we will meet him again in the following pages.
Colossus was finished in December 1943, two years before ENIAC, and contained 1,700 vacuum tubes. It read messages from punched paper tape at a speed of 5,000 characters per second. Ten Colossi were built; at the end of the war nearly all of them were dismantled on Churchill's orders and their plans burnt for reasons of secrecy. Nothing was known about the machine until the Seventies, when war historians began gathering testimony and documents. Today, books, Internet sites and even novels and films about the story of Bletchley Park, Enigma and Colossus have turned it into a myth.

The post-war period: von Neumann's architecture

Until the end of the war, the evolution of computers had been marked by isolated, independent efforts, by dreamer-inventors, and by pressing problems solved with sometimes hurried approaches tailored to specific situations.
What was still lacking was a theory that would allow the design of a truly general-purpose programmable automatic calculator, able to solve any problem describable by an algorithm - that is, any problem for which a general procedure can in principle be stated.

Here we must go back in time to recall some theoretical contributions which later allowed the problem to be faced in a more rational way. Already in the eighteenth century, logic and mathematics had begun to flow together into the science that would be called symbolic logic, according to which every piece of human reasoning might be treated like a mathematical equation. After Leibniz, another important theorist formalized this calculus: George Boole. From there to the attempt to automate reasoning on a machine, the step was short. Many scholars of logic tried, more or less ingenuously, to build mechanisms that would automate Boole's operations. Given the cultural presence of machinery in the nineteenth century, it is not surprising that efforts to mathematize or quantify the rules of logic should issue in the design of logic machines, such as those of Jevons in England and Marquand in the United States. Marquand's friend Charles S. Peirce even suggested replacing the rods and strings of the machine with electrical circuits. But these devices served no practical purpose and remained at best aids to understanding the structure of logical reasoning.
The development of this line of thinking soon led to the understanding that: 1) it was possible to build a machine which automatically carried out the fundamental operations of logic using only the binary values true and false (1 and 0); 2) a fundamental equivalence existed between these operations and the behaviour of electronic or electromechanical relays; 3) numerical calculation, too, could be reduced to the extremely simple truth tables of symbolic logic.
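
Point 3 is easy to see with a small example of my own (not from the article): adding two bits reduces entirely to truth tables, with XOR giving the sum and AND giving the carry - precisely the operations a relay or vacuum-tube circuit can perform:

```python
# Illustrative sketch: binary addition built from logical operations alone.

def half_adder(a, b):
    """Add two single bits using only logic: XOR for the sum, AND for the carry."""
    return a ^ b, a & b                 # (sum, carry)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2                  # carry out if either stage carried

# 1 + 1 with no incoming carry: sum 0, carry 1 (binary 10 = decimal 2)
print(full_adder(1, 1, 0))   # -> (0, 1)
```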

In the Thirties a young English mathematician came on stage, Alan Turing, who was to leave a remarkable mark on the theory of calculating machines. To attack the so-called Entscheidungsproblem (could there exist, at least in principle, a definite method or process by which all mathematical questions could be decided?), in 1936 Turing conceived - on paper - a finite-state (digital) mechanism which read data and instructions from an unbounded tape. Turing's machine could carry out any logical-mathematical procedure as a sequence of single, simple, limited operations. The Turing machine was not the design of a real device, but only a mental model used to prove a theorem; nevertheless, its simple, essential architecture and its completely general nature are found again in modern computers.
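
To make the idea concrete, here is a toy machine of my own devising (not Turing's notation): a table of rules, a tape and a read/write head are enough to carry out a computation - in this case, adding 1 to a binary number written on the tape:

```python
# A compact Turing-machine sketch: rules map (state, symbol) to (write, move, next state).

def run_turing(tape, rules, state="start", head=0):
    cells = dict(enumerate(tape))                # the tape; blank cells read as ' '
    while state != "halt":
        symbol = cells.get(head, " ")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# Rules to add 1 to a binary number (the head starts at the leftmost digit):
rules = {
    ("start", "0"): ("0", "R", "start"),   # scan right to the end of the number
    ("start", "1"): ("1", "R", "start"),
    ("start", " "): (" ", "L", "carry"),   # past the right end: go back and carry
    ("carry", "1"): ("0", "L", "carry"),   # 1 plus carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),    # 0 plus carry -> 1, done
    ("carry", " "): ("1", "L", "halt"),    # carried past the left end
}

print(run_turing("1011", rules))   # 11 + 1 = 12 -> '1100'
```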

After the war, Turing designed a computer (christened ACE) able to switch at will from numerical work to algebra, code-breaking, file handling or chess playing. Later, in February 1947, he envisaged a national computing centre with remote terminals, and the prospect of the machine taking over more and more of its own programming work. But not a single component of the ACE was assembled in time, and Turing found himself without any influence on the engineering of the project. The lack of cooperation, so different from the wartime spirit, he found deeply frustrating. Only an experimental prototype, the Pilot ACE, was built, in 1950. In the meantime, Turing had moved to Manchester, where one of the first English electronic computers was being built.

The name that has remained most closely linked with the present structure of the computer is, in any case, that of a mathematician of Hungarian origin, John von Neumann. He worked at the Institute for Advanced Study (IAS) in Princeton and, during the war, was a consultant to many advanced military research projects, among them the Manhattan Project.

In 1944, almost by chance, von Neumann learnt of the ENIAC project and immediately became very interested in it. He went to the Moore School, where he grasped the potential of what was being built, and joined the team. He brought together the needs of the Los Alamos laboratory (and of the Manhattan Project) with the capabilities of the Moore School engineers who were building ENIAC.

He immediately saw the intrinsic limits of ENIAC, due to its inelegant approach based purely on the quantity of components. In 1945 he wrote the famous "First Draft of a Report on the EDVAC" (Electronic Discrete Variable Automatic Computer), which may be considered the Magna Charta of computing. The draft describes what would go down in history as the "von Neumann architecture", consisting of:

1. a memory containing both data and instructions, whose locations can be read from and written to in any desired order;
2. a calculating unit capable of performing both arithmetic and logical operations on the data;
3. a control unit which interprets the instructions retrieved from memory and can select alternative courses of action on the basis of the results of previous operations.

It is strongly reminiscent of Babbage's Analytical Engine, isn't it? The real novelty lies in the first item, the concept of the stored program. Every program instruction, like any piece of data to be processed, is accessible in any order in the same memory, instead of being written sequentially on a punched tape as in Aiken's Mark I. The computer can even modify its own program on the basis of previous results. This idea spread like a virus through the whole small community of computer people, above all among those who in 1946 attended the Moore School's first course, "Theory and Techniques for Design of Electronic Digital Computers", which set off all the projects of the first generation of electronic, binary, general-purpose programmable machines - the computers as we know them today.
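
The stored-program idea can be shown in miniature. The toy machine below is my own illustration (not the EDVAC design): instructions and data share one memory, and a control loop repeatedly fetches, decodes and executes - it could even overwrite its own instructions, as noted above:

```python
# A toy stored-program machine: program and data live in the same memory.

memory = [
    ("LOAD", 10),    # 0: accumulator <- memory[10]
    ("ADD", 11),     # 1: accumulator <- accumulator + memory[11]
    ("STORE", 12),   # 2: memory[12] <- accumulator
    ("HALT", None),  # 3: stop
    None, None, None, None, None, None,
    7,               # 10: data
    35,              # 11: data
    0,               # 12: the result will be written here
]

acc, pc = 0, 0                       # accumulator and program counter
while True:
    op, addr = memory[pc]            # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[12])   # -> 42
```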

To tell the truth, the paternity of the invention cannot be attributed to von Neumann alone. Many others at the Moore School contributed to the result, among them Mauchly and Eckert themselves. Unfortunately, the failure to acknowledge their contribution caused resentments and misunderstandings which lasted into the following decades, divided the small community and still divide historians of computing today.

Nor was von Neumann the first to complete a machine with the architecture that bears his name. The construction of the IAS computer at Princeton proceeded very slowly and was finished only in 1950. It was preceded by the British machines of Manchester (the Mark I and its 1948 prototype) and Cambridge (the EDSAC of 1949).

Soon the enormous potential of these new instruments began to be explored in the commercial and industrial sectors too, until then dominated by IBM's tabulating machines. ENIAC's own designers, Eckert and Mauchly, left the academic world to build UNIVAC, which for some years reigned supreme in the field of electronic computers for commercial use.

It is curious to recall that in October 1947 the directors of J. Lyons & Company, a British catering company famous for its teashops (!), decided to take an active role in designing and building their own electronic computer for their accounting and management activities: LEO (Lyons Electronic Office). This eventually led to the creation of a computer division of Lyons, which produced and sold dozens of computers in England and abroad.

Within a few years, similar initiatives arose in nearly all the industrialized countries, aimed at building experimental models to test the various technologies available. By 1955, as many as 34 makes of general-purpose electronic computer were in operation throughout the world. The age of experimentation was ending, and these big machines were becoming an irreplaceable instrument in the academic as well as the commercial and industrial worlds. This was the proof of the great versatility of the binary electronic computer: it could handle, without distinction, numerical data for nuclear physics, the budget of a company or the catalogue of a library. The Rubicon had been crossed and there was no turning back: on the contrary, progress would march on ever more briskly, at an exponential pace.
 

The Fifties - Seventies

After 1950 the challenge became the typical one of the free market: to lower costs, increase speed and reliability, reduce size, and conquer markets. Many technological innovations contributed to achieving these targets. Let us mention just a few of them:
1) the invention of the transistor, and then of the integrated circuit, replaced the expensive and unreliable vacuum tubes, reducing both size and cost;
2) the new internal memories (magnetic cores, thin film) made the stored-program concept fully practical;
3) the improvement of magnetic storage (first tape, then disk) dramatically increased the quantity of data and programs that could be recorded, while just as rapidly decreasing the cost of each recorded bit;
4) fast printers, keyboards and video terminals allowed a simpler and more direct interaction with the machine, replacing the old punched cards.

To be sure, today's situation was still far away. Computers were still big and expensive; their use was entrusted exclusively to mathematicians assisted by electronic engineers: a priesthood of initiates, to whom the user handed over the problem to be solved and from whom he awaited the results. This was still the age of batch processing: a task was assigned to a programmer, and a run lasting hours or days ended in a small heap of punched cards containing the results - a process impossible to interrupt or modify once under way.
Things started to change with the invention of programming languages (COBOL, FORTRAN), which extended control of the machines beyond the priesthood of initiates. Programming times and costs fell, and computing became accessible to a larger group of users.

As for hardware, in the early Seventies new actors appeared on a market until then dominated by IBM and the "seven dwarfs" (Sperry, CDC, Honeywell, Burroughs, General Electric, RCA, NCR). First came Digital with its minicomputers, then Intel with its microprocessors. In 1971 Intel produced the first microprocessor, the 4004, concentrating a whole calculating unit on a single integrated circuit a few millimetres across and sold for only 200 US $. Three years later it was followed by the more powerful 8080. In 1975 MITS shipped one of the first personal computers, the Altair 8800, a complete computer the size of a large box, at a price of only 397 US $. In 1976 Steve Wozniak and Steve Jobs founded Apple Computer in the family garage: it would become a multimillion-dollar company and play a major role in the computer industry.

The second computer revolution, after the first one of 1946, was under way; the days of the mainframe and of the whole culture of "EDP centres" were numbered. The personal computer was preparing to become an everyday instrument, like the television set and the car. The evolution of microprocessors began to follow "Moore's law" - roughly every 18 months, power doubles and cost halves - putting within everyone's reach, for a few thousand dollars, a computer remarkably more powerful and faster than the old ENIAC.
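
A back-of-the-envelope calculation (my arithmetic, not the author's) shows what such doubling implies: from the 4004 of 1971 to 2001 is thirty years, that is twenty 18-month periods, hence a factor of about a million:

```python
# Rough illustration of compound doubling under Moore's law.
periods = (2001 - 1971) * 12 / 18     # number of 18-month periods in 30 years
print(periods, 2 ** periods)          # -> 20.0 1048576.0  (about a million-fold)
```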

A decisive contribution to this process of "computer democratization" came from other factors too, such as the improvement of the interaction between human and machine. The graphical user interfaces we know today in Windows and on the Macintosh were developed in the Seventies at the Xerox Palo Alto Research Center, together with the famous mouse (1968), developed by Douglas Engelbart at the Stanford Research Institute. The personal computer became a friendly, even "pleasant" instrument, to the point of being named "Machine of the Year" by Time in 1982.
 

The network: the computer is no longer alone

On October 29, 1969, at 10.30 p.m. local time, Charles Kline tried to link the UCLA computer with that of the Stanford Research Institute, 500 km away. The first message transmitted was the command "LOGIN". This shy digital babble was the first cry of a phenomenon which today affects economic, social and even personal life far more than the Moon landing of a few months earlier.

Like the other stories we have reviewed, the Internet too was born of dreams and of necessities. The roots of the dream go back a long way: we find them expressed by one of the fathers of the Internet, J.C.R. Licklider. In 1963 Licklider wrote his historic memo addressed to the "Members and Affiliates of the Intergalactic Computer Network". The memo outlined Licklider's conviction that computers could help researchers share information, and his vision of a day when communities of people with common interests would be able to discuss them entirely on-line.

Necessity - that powerful fulfiller of dreams - was born of the political and military situation of the Cold War. American anxiety over the launch of Sputnik in 1957 urged President Eisenhower to establish the Advanced Research Projects Agency (ARPA), whose task was to maintain US technological and military superiority. Besides atomic weapons and space vehicles, electronic computers had become irreplaceable research and management instruments, and thus a strategic resource. In 1962 a researcher at RAND, Paul Baran, published a report entitled "On Distributed Communications Networks", in which he imagined "... a communication network which will allow several hundred major communications stations to talk with one another after an enemy attack".

Thus the ARPANET project began. There were many difficulties to overcome; determined dreamers were needed too, such as Vinton Cerf, Jon Postel, Larry Roberts, Bob Kahn, Leonard Kleinrock and Douglas Engelbart.

In 1969 the first four nodes of ARPANET were linked (UCLA, UCSB, SRI and the University of Utah). ARPANET grew rapidly and was presented to the public in 1972 at the International Conference on Computer Communications in Washington. The presentation was a success. ARPANET by then linked as many as 23 host computers, and in 1973 it crossed the Atlantic to link University College London as its first European site.

When we speak of public presentation and international diffusion, we should remember that this involved a very restricted circle of engineers and scientists tied to the university and military establishment of the NATO countries. Those who - though belonging to the world of research and the universities - had no access to this instrument began to feel excluded from a very powerful means of communication. In 1974 BBN opened to the public a data-transmission system similar to ARPANET, called TELENET. In 1979 the computing centres of some American universities (Duke and UNC) linked their computers, creating USENET. From the cooperation between the City University of New York and Yale University, BITNET (Because It's Time NETwork!) was born. An interesting experiment was the radio linking of computers across the Hawaiian archipelago, christened with the evocative name ALOHAnet. The term Internet began to circulate, indicating the interconnection of several public and private networks with one another: a meta-network.

Many were the needs pressing for the development of the Internet. As we have seen, the computing resources of the Sixties and Seventies were still few and expensive; with the network it became possible to delegate a long, complex calculation to a distant computer centre which "rented out" processing time. Another keenly felt need was the exchange of large quantities of data and the search for information in remote databases (think of access to MEDLINE, to stay in the field of medicine). Other applications of the network were born stealthily, almost for fun: a paradigmatic example is electronic mail. In 1972 Ray Tomlinson of BBN wrote the first e-mail program (and chose the famous @ sign). Already by the following year, messages made up 65% of all traffic on the net. The first mailing list was created in 1975 by Steve Walker, gathering science-fiction lovers from all over the world.

In 1990 ARPANET ceased to exist; for security reasons, the military had shifted their traffic to a separate network, MILNET. The risk of a disastrous blackout of scientific communications was averted by the far-sightedness of the American National Science Foundation, whose NSFNET linked the main American research centres. In the same years, nearly all the industrialized countries created their own scientific, commercial and industrial networks.

At the end of the Eighties, the Internet linked hundreds of thousands of computers and gave access to tens of millions of documents - not only texts, but also images, sounds and animations. The universal library dreamt of by many philosophers of the past, yet impenetrable to the uninitiated, like the one described by Borges, was taking shape. At CERN in Geneva a young physicist, Tim Berners-Lee, had to face exactly this problem of ferreting out documents and linking them together. With his 1989 paper "Information Management: A Proposal" he suggested a solution which was to become a true standard.

The problem of managing an ever-growing number of documents had already been felt by a scientific adviser to F.D. Roosevelt, Vannevar Bush. Aware of the exponential growth of scientific literature, Bush tried to devise an automatic system that would allow the creation of "wholly new forms of encyclopedias... ready-made with a mesh of associative trails running through them...", and he expressed this vision in his 1945 article "As We May Think". The technology of the time was still inadequate for the realization of this dream: the machine he designed, the MEMEX, was based on a complex system of microfilms and optical readers and was never built.

In his book "Literary Machine" of 1960, another "visionary", Ted Nelson, imagined that: "... the future of humanity is at the interactive computer screen, that the new writing and movies will be interactive and interlinked". He coined the term Hypertext to represent this whole of virtual links among different documents. Finally, dream, necessity and technology met with the arrival of computers and of the network, where the different fragments of knowledge may be spread on servers troughout the world and where the reader/navigator has to choose the ways to follow. The term world wide web indicates this mesh of links very well.
 

Today (and tomorrow?)

What I have called the second computer revolution is now behind us. Moore's law still holds, and computers keep getting smaller, cheaper and more powerful, hiding where we least expect them: in the washing machine, in the car's on-board computer, in our pocket diary, in the DVD player.

But the computer is no longer the complicated machine that solves equations: it has become a commodity whose inner workings are no longer perceived by the user, and it carries out tasks utterly different from those it was invented for:
1) it manages information (processing, filing and retrieving it), whether texts, images, sounds, the vital parameters of a patient, or anything else that can be represented by a binary number, that is, digitized (audio CDs, DVDs, digital cameras and camcorders, electronic publishing, virtual reality, ...);
2) it manages communication, not only among machines but also, and above all, among individuals and institutions. In this field it is replacing all the traditional channels, such as the telephone, the mail, newspapers, radio and television (audio and video streaming, e-books, e-mail, MP3, ...).

In the setting created by this revolution I want to mention just a couple of trends worth thinking about, leaving the rest to futurologists and computer gurus.
The first is the shift of value (and of cost) from hardware, in the heroic early days, first to software and then to information. Today a PC could even be given away free, the investment being recovered through the sale or rental of software (a similar experiment has already been tried in the USA). Tomorrow even programs may be free, and it will be information that is sold or rented (music, video, books).

The second big trend concerns the Internet. Here too the original aims - remote processing and the exchange of documents within the scientific community - have largely been supplanted by its use for personal and social communication and for the promotion and sale of goods and services. The Internet will increasingly become a marketplace and a meeting point, a sort of huge virtual shopping centre, with its places for fun and socializing but also with its libraries and classrooms.

We still expect many miracles from the digital management of information: machines able to understand and reply by voice, able to cope with the human communication code, so redundant and imprecise (yet so efficient for us); in short, we expect a little more intelligence. For now, artificial intelligence remains largely a topic of scientific discussion in the ivory towers of academia: the most powerful computer in the world, with its billions of logic circuits, is more stupid than a small bug with its few thousand neurons. Perhaps the inheritance of the machines of Babbage, Zuse and von Neumann, built after all to carry out the four arithmetic operations, is precisely the great limit to be overcome in the future. We shall see.