The reason I started this blog was basically that I had been reading "Turing's Cathedral", George Dyson's book about the life of John von Neumann. The reason I am writing about the book now is that I am currently reading a biography of Alan Turing. Add some Konrad Zuse, and you have the beginning of the digital age. I will focus mostly on the first part of the book, about the invention of the digital. Whether machines will one day be able to replicate themselves I will leave to the transhumanists.
“What began as an isolated 5-kilobyte matrix is now expanding by over two trillion transistors per second (a measure of the growth in processing and memory) and five trillion bits of storage capacity per second (a measure of the growth in code). Yet we still face the same questions that were asked in 1953. Turing’s question was what it would take for machines to begin to think. Von Neumann’s question was what it would take for machines to begin to reproduce.”
Judged from today's perspective, the development at the IAS (Institute for Advanced Study) in Princeton reads like a startup story: John von Neumann hires experts from all over the world in order to build a new product – the von Neumann architecture. It is even a story about open source, because there is a lot of conflict about patents and copyright ownership. Still today, people discuss who built the first computer. But in the first half of the 20th century several intelligent people tried to build a calculating machine – a universal machine – and only cooperation made it the breakthrough technology that shapes today's technology.
This being said: the most important chapter in the book is about the ENIAC – a computer von Neumann only studied, but did not build. The Electronic Numerical Integrator and Computer was an army project supervised by Herman Goldstine and "an absolutely pioneer venture, the first complete automatic, all-purpose digital electronic computer" (von Neumann). The ENIAC was the first project to substitute machine power for human brain power – "virtually all the algorithms that humans had devised for carrying out calculations needed reexamination".
But two problems remained: lack of storage and switching between different operations. Both were solved by the invention of the Electronic Discrete Variable Automatic Computer (EDVAC). Looking back, the ENIAC and the EDVAC tell the story that "the idea of the stored program, as we know it now, and which is a clear-cut way of achieving a universal computer, wasn't invented overnight (…). Rather it evolved gradually", says a quote from Jan Rajchman in the book:
"The functional elements of the computer were separated into a hierarchical memory, a control organ, a central arithmetic unit, and input/output channels, making distinctions still known as the 'von Neumann architecture' today."
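To make the four organs concrete, here is a toy sketch in Python – my own illustration, not from the book, with invented instruction names: a single memory holds both program and data, a control loop fetches and decodes, an accumulator plays the arithmetic organ, and a list stands in for the output channel.

```python
# Toy von Neumann machine: memory, control, arithmetic, output.
# Opcodes and memory layout are invented for this illustration.

def run(memory):
    pc, acc, output = 0, 0, []       # control organ: program counter + accumulator
    while True:
        op, arg = memory[pc]         # fetch from the single hierarchical memory
        pc += 1
        if op == "LOAD":             # move a number from memory into the accumulator
            acc = memory[arg][1]
        elif op == "ADD":            # central arithmetic unit
            acc += memory[arg][1]
        elif op == "PRINT":          # output channel
            output.append(acc)
        elif op == "HALT":
            return output

program = [
    ("LOAD", 4), ("ADD", 5), ("PRINT", 0), ("HALT", 0),
    ("DATA", 2), ("DATA", 3),        # data stored right next to the code
]
```

Running `run(program)` returns `[5]`: the instructions and the numbers 2 and 3 live side by side in one and the same memory.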
Von Neumann basically combined "the practical experience derived from the ENIAC with the theoretical possibilities of Turing's Universal Machine": "A third kingdom of mathematics was taking form. The first kingdom was the realm of mathematical abstractions alone. The second kingdom was the domain of numbers applied, under the guidance of mathematicians, to the real world. In the third kingdom, numbers would assume a life of their own." According to Andrew Hodges, the Turing machine "offered a bridge, a connection between abstract symbols, and the physical world". In fact, Turing wasn't the first one trying to bridge the two worlds:
In 1679, Leibniz imagined a digital computer in which binary numbers were represented by spherical tokens, governed by gates under mechanical control (…). "Leibniz had invented the shift register – 270 years ahead of its time. In the shift registers at the heart of the Institute for Advanced Study computer (and all processors and microprocessors since), voltage gradients and pulses of electrons have taken the place of gravity and marbles, but otherwise they operate as Leibniz envisioned in 1679."
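Leibniz's marble gates translate almost directly into code. Here is a minimal sketch (mine, not Dyson's) of a four-stage shift register in Python, where each clock tick moves every bit one stage along:

```python
def shift(register, incoming):
    """One clock tick: a new bit enters at one end,
    every stored bit moves one stage, the last bit falls out."""
    return [incoming] + register[:-1]

reg = [0, 0, 0, 0]          # four empty stages
for bit in [1, 0, 1, 1]:    # pulses arriving at the input
    reg = shift(reg, bit)
```

After the four ticks the register holds `[1, 1, 0, 1]` – the input stream in transit, which is exactly what the IAS machine's registers did, only with voltage pulses instead of marbles.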
Basically, the computer von Neumann dreamt of would be a cathedral for Turing's idea: "'Words coding the orders are handled in the memory just like numbers', explained von Neumann, breaking the distinction between numbers that mean things and numbers that do things. Software was born." This is the central part of the book and the story behind the book's title. What makes all the books about the 1930s and 1940s worth reading is understanding the basic concept of the computer, the vision these machines were built with, and why those visions are still not met today. Human thinking is just not logic, and ever since Leibniz, Turing and von Neumann, computers are exactly that: logic.
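The collapse of the line between "numbers that mean things and numbers that do things" fits in a few lines of code. In this sketch – my own toy encoding, not the IAS instruction set – the program is nothing but numbers in memory, and the machine can read its own opcodes as ordinary data:

```python
# Toy encoding of mine: 1 = LOAD addr, 2 = ADD addr, 4 = HALT
def run(mem):
    pc, acc = 0, 0
    while True:
        op, arg = mem[pc], mem[pc + 1]
        pc += 2
        if op == 1:
            acc = mem[arg]    # the address may hold data -- or code
        elif op == 2:
            acc += mem[arg]
        elif op == 4:
            return acc

# "LOAD 0" reads the program's own first opcode as a plain number:
mem = [1, 0, 4, 0]
```

`run(mem)` returns 1 – the opcode for LOAD, fetched as if it were data. Orders in memory are handled just like numbers.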
"Turing summarized the essence (and weakness) of this convoluted argument in 1947, saying that 'in other words then, if a machine is expected to be infallible, it cannot also be intelligent.' Instead of trying to build infallible machines, we should be developing fallible machines able to learn from their mistakes."
Today, computers may be much faster at calculating, drawing conclusions and even making decisions. But even today, they cannot solve every problem (see the halting problem). Quantum computers won't change this.
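The halting problem argument itself fits in a few lines of hypothetical Python. The oracle `halts` below cannot exist – it is stubbed only so the contradiction can be spelled out:

```python
def halts(f):
    """Hypothetical oracle deciding whether calling f() ever terminates.
    Turing proved no such total procedure can exist; this stub merely
    marks the assumption made for the sake of contradiction."""
    raise NotImplementedError("no such decider exists")

def trouble():
    # If halts(trouble) answered True, trouble would loop forever;
    # if it answered False, trouble would halt immediately. Either
    # way the oracle is wrong -- Turing's diagonal contradiction.
    if halts(trouble):
        while True:
            pass
```

Whatever answer the oracle gives about `trouble`, `trouble` does the opposite, so no program can play the role of `halts` for all inputs.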
As long as humans build computers, we have to do it according to a plan. Computers are based on logic; they don't doubt: "The paradox of artificial intelligence is that any system simple enough to be understandable is not complicated enough to behave intelligently, and any system complicated enough to behave intelligently is not simple enough to understand."
The main difference, as Stan Ulam puts it, is that "the brain is a statistical, probabilistic system, with logic and mathematics running as higher-level processes", whereas the computer is "a logical, mathematical system". But he adds that the computer can be improved by "higher-level statistical, probabilistic systems, such as human language and intelligence".
This is why Dyson (as well as Google and Facebook) comes to the conclusion that search engines and social networks are "the most successful new developments in computing":
"They are nonlinear hybrids between digitally coded and pulse-frequency-coded systems and are leaving linear, all-digital systems behind. (…) Web 2.0 is our code word for the analog increasingly supervening upon the digital – reversing how digital logic was embedded in analog components, sixty years ago."
Summary: It still amazes me how computers came into existence and that at their very core everything is still binary numbers and some Boolean algebra. Of course, over the past decades we have developed these techniques into powerful algorithms living in devices that get smarter with every generation, but at the very core still lies the logical thinking of Hilbert, Gödel and Turing. Right now, we are trying to teach computers how we speak. The next step will probably be to understand how we behave. But since computers move in these old tracks, I doubt they will ever be creative – or well… surprising.
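That "binary and some Boolean algebra" is not a figure of speech: addition itself is just AND, OR and XOR wired together. A small sketch of my own, chaining a full adder into a ripple-carry adder:

```python
def full_adder(a, b, carry_in):
    """One column of binary addition, built purely from Boolean gates."""
    s = a ^ b ^ carry_in                         # XOR yields the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # AND/OR yield the carry
    return s, carry_out

def add_bits(x, y):
    """Ripple-carry addition of two little-endian bit lists."""
    carry, out = 0, []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 3 + 5, written as little-endian bits: [1, 1, 0] + [1, 0, 1]
```

`add_bits([1, 1, 0], [1, 0, 1])` returns `[0, 0, 0, 1]`, i.e. 8 – everything a processor's arithmetic unit does, reduced to the logic of Hilbert's world.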
"Turing's Cathedral" by George Dyson. Published in 2012 by Vintage Books. $16.95.
Extra: All the books one can find about the invention of the first electronic computers, also touch on the topic of the scientists’ exodus after the fascist Nazi regime had taken control over Germany. Probably the brightest generation of scientists ever completely left the center of science, Göttingen, in order to leave for Princeton. 80 years later, German government demands for a German Silicon Valley, a German Google or Facebook.