In 1953 there were only 53 kilobytes (KB) of Random Access Memory (RAM) on planet Earth.
Think about that for a second. What’s the average size of just one of those emails in your overflowing inbox?
It is amazing to think how far we have come in such a short space of time. Nor were those 53 KB all in one computer: they were scattered across machines around the planet.
What’s more amazing to think about is what still lies around the corner. If you compare the timeline of computing to the timeline of the motor car you’ll quickly realise that we’ve not even reached Ford’s Model T stage yet.
Even in the 1980s my home computer – a Commodore 64 – while being incredibly liberating was also extremely limited. It was called the “64” for a reason: 64 KB was the sum total of RAM available. While it was a hell of a lot smaller than the 1953 computers, it’s fair to say that the capacity for storing code hadn’t improved a great deal in the intervening years.
Before reading Turing’s Cathedral by George Dyson, I thought I was pretty much clued up on the origins and development of computing. After all, I knew quite a bit about Alan Turing and Charles Babbage, and had even dabbled myself with the hypothetical complexities of a Universal Turing Machine at university (it went badly).
Turns out I was only scratching the surface.
What was news to me was the role played by the fascinating Hungarian refugee John von Neumann at the Institute for Advanced Study in Princeton, who went on to deliver in practice what Turing had achieved in theory. Part mathematician, part economist, and full-on genius, von Neumann had the vision, military connections (*cough*), and willpower to bring together all of the various jigsaw pieces needed to get the first computers up and running.
Nor did I know that this first computer was employed for two very distinct projects. During the day it was used by “Johnny” von Neumann and his crew to test blast calculation theories for the development of the atomic bomb; by night it was run by a semi-crazy Norwegian-Italian called Nils Barricelli, who was working on artificial intelligence and digital life-forms; his end goal was to amplify electrical randomness into free will.
Or, as Dyson puts it, “one program was dedicated to destroying life as we know it, the other to generating life of unknown forms”.
It was interesting to read too about the battles the early computer scientists at Princeton had to fight, which were as much about prejudice as about engineering obstacles. Trapped somewhere in a grey zone between mathematics and engineering, the early computer scientists and engineers were viewed by some of the resident mathematicians and liberal arts practitioners with distaste: the introduction of their “machinery” brought an unnecessary “factory element” into their Cambridge-copied ivory towers. The computer engineers in particular bore the brunt of some outlandish snobbishness: they were not invited to the faculty tea.
But the invention of the computer goes back even further than these halcyon Princeton days: it was none other than Sir Francis Bacon who, back in 1623, established that all communication could be encoded digitally: “the transposition of two letters by five placings will be sufficient for thirty-two differences”. In other words he found that you could represent the alphabet in binary form, much like Morse code.
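Bacon’s insight amounts to the observation that five binary “placings” give 2⁵ = 32 combinations – more than enough for a 26-letter alphabet. A quick sketch in Python (the letter-to-code mapping here is a modern illustration, not Bacon’s original cipher):

```python
# Five two-symbol "placings" give 2**5 = 32 combinations,
# enough to cover a 26-letter alphabet.

def encode(text):
    """Encode letters as 5-bit binary strings: A=00000, B=00001, ..."""
    return [format(ord(c) - ord('A'), '05b') for c in text.upper() if c.isalpha()]

print(encode("BACON"))  # ['00001', '00000', '00010', '01110', '01101']
```

Swap the 0s and 1s for any two distinguishable symbols – two typefaces, dots and dashes – and you have Bacon’s point: any alphabet, and hence any message, can be carried by a binary channel.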
Where Bacon left off, the granddaddy of computing stepped in. Long before the Babbage Engine, the Turing Machine, the MANIAC, or indeed any of those first fundamental computing machines, it was Leibniz in 1679 who conceived of a machine that could run computations using – get this – marbles. The presence of a marble would represent a “1”, and the absence of a marble “0”. By passing marbles through a succession of physical gates, numbers could be added or subtracted.
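The same gate-by-gate logic is easy to model in code. Here is a toy sketch of binary addition in Leibniz’s spirit – marble present = 1, absent = 0 – with each gate stage combining two marble positions and a carry (the gate structure is a modern illustration, not Leibniz’s actual design):

```python
# A toy model of the marble machine: a marble (1) or no marble (0)
# passes through gates; chaining adder stages adds two binary numbers.

def full_adder(a, b, carry_in):
    """One gate stage: add two marble positions plus an incoming carry."""
    total = a + b + carry_in
    return total % 2, total // 2  # (sum bit, carry out)

def add_binary(x_bits, y_bits):
    """Add two equal-length bit lists, least-significant bit first."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    if carry:
        result.append(carry)
    return result

# 5 (101) + 3 (011), least-significant bit first -> 8 (1000)
print(add_binary([1, 0, 1], [1, 1, 0]))  # [0, 0, 0, 1]
```

Whether the bits are marbles, relays, or electrical pulses makes no difference to the logic – which is exactly the point of the next paragraph.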
The problem Leibniz solved with marbles was the same problem later solved more elaborately by Turing and executed by von Neumann. The only difference was that the Princeton outcasts used electrical pulses instead.
Still, this makes me think of an interesting alternate universe. In this alternate world, rather than use handheld computing devices, we would have to visit giant arcade halls stuffed with millions of pinball machines in order to run our Excel spreadsheets. That would certainly make accountancy a more attractive career proposition.
The obvious drawback is that in this alternative Leibniz world, you’d need a grid of 424,000 marbles to store less information than you find in your average email.
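That 424,000 figure follows directly from 1953’s planetary total of 53 KB, at one marble per bit (taking 1 KB = 1,000 bytes):

```python
# One marble per bit: how many marbles to hold 1953's entire 53 KB of RAM?
kilobytes_1953 = 53
marbles = kilobytes_1953 * 1000 * 8  # bytes -> bits, one marble each
print(marbles)  # 424000
```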
That’s a big bag of marbles and certainly not nearly as handy as an iPhone. Weird to think you’ve got the atomic bomb to thank for that.
Turing’s Cathedral by George Dyson is out now.