Turing's Cathedral: The Origins of the Digital Universe
M**J
Breathtaking in scope, depth, and originality
The early history of computing is usually presented in a simple linear fashion: Atanasoff, Mauchly and Eckert, Turing and the Enigma project, von Neumann, and the postwar explosion. That's the way I learned it in college in the 70s, and the way just about every book presents it. It's correct, insofar as it goes, but it leaves out a tremendous amount of richness and detail that George Dyson relates in this book. His narrative consists of over a dozen parallel, interrelated stories, each concentrating on one person or project and on how it fits into the whole.

The story begins with the history of Princeton, New Jersey, and the two men most responsible for the creation of the Institute for Advanced Study: Abraham Flexner and Oswald Veblen, nephew of the economist Thorstein Veblen. Flexner and Veblen shared a vision of creating a place in which the world's greatest thinkers, able to interact freely and freed from the mundane obligations of teaching and practical applications, would advance the world's knowledge on a heretofore unprecedented scale. In so doing they inadvertently created one of the era's greatest centers for applied research into computing.

Turing and von Neumann make their appearances here, of course, along with Mauchly, Eckert, Oppenheimer, Ulam, Freeman Dyson (the author's father), and other notables of the era. But Dyson also tells the story of a number of pioneers and contributors to the design, construction, and above all the theory of computation who have been overlooked by history. Most remarkable, perhaps, is Nils Barricelli, who could justifiably be called the founder of computational biology. Working in the early 1950s with a computer having less computational power and memory than a modern-day sewing machine, he created a one-dimensional artificial universe in order to explore the relative power of mutation and symbiosis in the evolution of organisms.
His work led to a number of original discoveries and conclusions that would only be rediscovered or proposed decades later, such as the notion that genes originated as independent organisms, like viruses, that combined to create more complex organisms.

There's an entire chapter on a vacuum tube, the lowly 6J6, a dual triode created during the war that combined several qualities necessary for the creation of a large-scale computer: simplicity, ruggedness, and economy. It fulfilled one of von Neumann's guiding principles for the project: don't invent anything. That is, don't waste time inventing where solutions already exist. By the nature of its relative unreliability and wide production tolerances relative to project goals, it also helped stimulate a critical line of research: how to create reliable systems from unreliable components, something more important now than ever in this era of microprocessors and memory chips with millions and even billions of components on a chip.

The chapter on Alan Turing is particularly good, covering as it does much of his work that has been neglected in biographies and presenting a much more accurate description of his contributions to computational science. The great importance of his conceptual computer, the "Turing machine," is not, as is commonly stated in popular works, that it can perform the work of any other computer. It is that it demonstrated how any possible computing machine can be represented as a number, and vice versa. This allowed him to construct a proof that there exist uncomputable problems; i.e., there are programs for which it cannot be determined a priori whether they will eventually halt.
This was strongly related to Gödel's work on the incompleteness of formal systems, and to Hilbert's program of fully formalizing mathematics, which Turing's result helped lay to rest.

What makes this a particularly exceptional book is the manner in which Dyson connects the stories of individuals involved in the birth of electronic computing with the science itself. He does an exceptional job of explaining difficult topics like Gödel incompleteness, the problem of separating noise from data, and the notion of computability in a way that the intelligent reader who may not have advanced math skills will understand. More importantly, he understands the material well enough to know which are the critical concepts and accomplishments of these pioneers of computing, and doesn't fall into the trap of repeating the errors of far too many popular science writers. The result is a thoroughly original, accurate, and tremendously enjoyable history. Strongly recommended to anyone curious about the origins of computers and, more importantly, the science of computing itself.
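The machines-as-numbers point is easy to make concrete. Below is my own illustrative sketch in Python, not code from the book: a Turing machine is nothing but a finite transition table, here the classic two-state "busy beaver," and the `encode` helper is a hypothetical Gödel-style numbering that turns the whole machine into a single integer.

```python
# Minimal Turing machine simulator (an illustrative sketch, not from the book).
# A machine is just a transition table: (state, symbol) -> (write, move, next_state).

BUSY_BEAVER_2 = {                      # the classic 2-state "busy beaver"
    ("A", 0): (1, +1, "B"),
    ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"),
    ("B", 1): (1, +1, "HALT"),
}

def run(table, max_steps=1000):
    """Run a machine on a blank tape; return (steps, ones) or None if no halt."""
    tape, head, state = {}, 0, "A"
    for step in range(1, max_steps + 1):
        write, move, state = table[(state, tape.get(head, 0))]
        tape[head] = write
        head += move
        if state == "HALT":
            return step, sum(tape.values())
    return None    # gave up -- and in general, no procedure can decide this

def encode(table):
    """Gödel-style numbering: serialize the whole machine as one integer."""
    text = repr(sorted(table.items()))
    return int.from_bytes(text.encode(), "big")
```

Here `run(BUSY_BEAVER_2)` returns `(6, 4)`: the machine halts after six steps, leaving four ones on the tape, while `encode(BUSY_BEAVER_2)` yields the (large) number that represents the machine itself, which is exactly the correspondence the review describes.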
A**R
How it came from bit
The physicist John Wheeler, famous for his neologisms, once remarked that the essence of the universe could be boiled down to the phrase "it from bit," signifying the creation of matter from information. This description encompasses the digital universe which now so completely pervades our existence. Many moments in history could lay claim to creating this universe, but as George Dyson marvelously documents in "Turing's Cathedral," the period between 1945 and 1957 at the Institute for Advanced Study (IAS) in Princeton is as good a candidate as any.

Dyson's book focuses on the pioneering development of computing during the decade after World War II and essentially centers on one man: John von Neumann. Von Neumann is one of the very few people in history to whom the label "genius" can authentically be applied. The sheer diversity of fields to which he made important contributions beggars belief; Wikipedia lists at least twenty, ranging from quantum mechanics to game theory to biology. Von Neumann's mind ranged across a staggeringly wide expanse of thought, from the purest of mathematics to the most applied nuclear weapons physics. The book recounts the pathbreaking efforts of von Neumann and his team to build a novel computer at the IAS in the late 1940s. Today, when we are immersed in a sea of computer-generated information, it is easy to take the essential idea of a computer for granted. That idea was not the transistor or the integrated circuit or even the programming language, but the groundbreaking notion that you could have a machine where both data AND the instructions for manipulating that data could be stored in the same place, encoded in a common binary language. That was von Neumann's great insight, which built upon Alan Turing's abstract idea of a computing machine. The resulting concept of a stored program is at the foundation of every single computer in the world.
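The stored-program idea can be demonstrated with a toy machine. The sketch below is my own illustration, vastly simpler than the IAS design: instructions and data are plain numbers living in the same memory, so a program can even overwrite its own code.

```python
# A toy stored-program machine (my sketch, far simpler than the IAS design).
# Instruction format: [opcode, operand]; opcodes: 0=HALT, 1=LOAD, 2=ADD, 3=STORE.

def run(memory):
    acc, pc = 0, 0
    while memory[pc] != 0:                 # 0 = HALT
        op, arg = memory[pc], memory[pc + 1]
        if op == 1:
            acc = memory[arg]              # LOAD: accumulator <- mem[arg]
        elif op == 2:
            acc += memory[arg]             # ADD:  accumulator += mem[arg]
        elif op == 3:
            memory[arg] = acc              # STORE: could overwrite code itself!
        pc += 2
    return memory

# Cells 0-7 hold the program, cells 8-10 hold the data -- one shared memory.
program = [
    1, 8,      # LOAD  mem[8]
    2, 9,      # ADD   mem[9]
    3, 10,     # STORE mem[10]
    0, 0,      # HALT
    2, 3,      # data: the numbers 2 and 3
    0,         # the result (2 + 3) lands here
]
```

After `run(program)`, cell 10 holds 5. The point is that nothing in memory distinguishes "code" from "data"; only what the machine does with a cell gives it meaning.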
The IAS computer practically validated this concept and breathed life into our modern digital universe. By present standards its computing power was vanishingly small, but the technological future it unleashed has been limitless.

Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there, and the culture of pure thought that often looked down on von Neumann's practical computational tinkering. Secondly, it discusses the provenance of von Neumann's ideas, which partly arose from his need to perform complex calculations of the events occurring in a thermonuclear explosion. These top-secret calculations were quietly run at night on the IAS computer and in turn were used to tweak the computer's workings; as Dyson pithily puts it, "computers built bombs, and bombs built computers." Von Neumann also significantly contributed to the ENIAC computer project at the University of Pennsylvania. Thirdly, Dyson brings us evocative profiles of a variety of colorful and brilliant characters clustered around von Neumann who contributed to the intersection of computing with a constellation of key scientific fields that are now at the cutting edge.

There was the fascinating Stan Ulam, who came up with a novel method for calculating complex processes, the Monte Carlo technique, that is used in everything from economic analysis to biology. Ulam, who was one of the inventors of thermonuclear weapons, originally used the technique to calculate the multiplication of neutrons in a hydrogen bomb. Then there was Jule Charney, who set up some of the first weather pattern calculations, early forerunners of modern climate models. Charney was trying to implement von Neumann's grand dream of controlling the weather, but neither he nor von Neumann could anticipate chaos and the fundamental sensitivity of weather to tiny fluctuations.
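The Monte Carlo idea fits in a few lines. The sketch below is my own toy example, estimating pi rather than neutron multiplication: run many random trials and average, and the estimate converges on the true value.

```python
# A toy Monte Carlo estimate (an illustrative sketch, not Ulam's actual
# neutron calculation): approximate pi by sampling random points in the
# unit square and counting how many land inside the quarter circle.
import random

def estimate_pi(samples=100_000, seed=0):
    rng = random.Random(seed)          # fixed seed, so the run is reproducible
    inside = sum(
        1
        for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    # The quarter circle covers pi/4 of the unit square, hence the factor 4.
    return 4 * inside / samples
```

With 100,000 samples the estimate typically lands within a couple of hundredths of pi. The same pattern, averaging over random trials when a direct calculation is intractable, is what carries the technique from bomb physics to economic analysis and biology.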
Dyson's book also pays due homage to an under-appreciated character, Nils Barricelli, who used the IAS computer to embark on a remarkable set of early experiments that sought to duplicate evolution and artificial life. In the process Barricelli discovered fascinating properties of code, including replication and parasitism, that mirrored some of the great discoveries taking place in molecular biology at the time. As Dyson tells us, there were clear parallels between biology and computing: both depended on sequences of code, although biology thrived on error-prone duplication (leading to variation) while computing actively sought to avoid it. Working on computing and thinking about biology, von Neumann anticipated the genesis of self-reproducing machines, which have fueled the imagination of both science fiction fans and leading researchers in nanotechnology.

Finally, Dyson introduces us to the remarkable engineers who were at the heart of the computing projects. Foremost among them was Julian Bigelow, a versatile man who could both understand code and fix a car. Bigelow's indispensable role in building the IAS computer brings up an important point: while von Neumann may have represented the very pinnacle of abstract thought, his computer wouldn't have gotten off the ground had Bigelow and his group of bright engineers not gotten their hands dirty. Great credit also goes to the two lead engineers on the ENIAC project, J. Presper Eckert and John Mauchly, who were rather unfairly relegated to the shadows and sidetracked by history. Dyson rightly places as much emphasis on the nitty-gritty engineering hurdles behind the IAS computer as he does on its lofty mathematical underpinnings. He makes it clear that the ascendancy of a revolutionary technology requires both novel theoretical ideas and fine craftsmanship.
Unfortunately, in this case the craftsmanship was ultimately trampled by the institute's mathematicians and humanists, which only added to its reputation as a refuge for ivory-tower intellectuals who considered themselves above pedestrian concerns like engineering. At the end of the computing project the institute passed a resolution forbidding any kind of experimentation from ever taking place again; Freeman Dyson (the author's father, who once worked on a nuclear spaceship and genuinely appreciates engineering detail) was one of the few dissenting voices. But this was not before the IAS project spawned a variety of similar machines which partly underlie today's computing technology.

All these accounts are supplemented with gripping stories about weather prediction, the US thermonuclear program, evolutionary biology, and the emigration of European intellectuals like Kurt Gödel and von Neumann to the United States. The book does have its flaws, though. For one thing, it focuses too heavily on von Neumann and the IAS. Dyson says relatively little about Turing himself, about the pioneering computing efforts at Manchester and Cambridge (the first stored-program computer was in fact the Manchester "Baby" machine), and about the equally seminal development of information theory by Claude Shannon. James Gleick's "The Information" and Andrew Hodges's "Alan Turing: The Enigma" might be useful complements to Dyson's volume. In addition, Dyson often meanders into one too many digressions that break the flow of the narrative; for instance, do we really need to know so much about Kurt Gödel's difficulties in obtaining a visa? And do we need to get bogged down in minutiae such as the starting dates and salaries of every member of the project and the list of items on the cafeteria menu?
Details like these might put casual readers off.

Notwithstanding these gripes, the book is beautifully written and exhaustively researched, with copious quotes from the main characters. It's certainly the most detailed account of the IAS computer project that I have seen. If you want to know about the basic underpinnings of our digital universe, this is a great place to start, even with its omissions. All the implications, pitfalls, and possibilities of multiple scientific revolutions can be connected in one way or another to that little machine running quietly in a basement in Princeton.
S**L
A historical account
It's more a history of the people involved. It's a bit boring at the start, but after the chapter on John von Neumann it gets interesting. Bought this book after buying The MANIAC by Benjamin Labatut.
M**K
Great content, but too small a font for comfortable reading
The book's content is great; highly recommend. Maybe just try to get a different edition: this one has a very small font and is not comfortable reading. I ended up reading it on Kindle.
M**P
Gift
Present for someone who loves computer science history - went down well!
A**H
A gripping history of the electronic digital computer
George Dyson knows how to weave the many people involved, and their sometimes tangled life stories, into a gripping narrative that paints a picture, comprehensible even to laypeople, of how in the Princeton of the 1940s and 50s the development of atomic weapons and the development of electronic digital computers went hand in hand, and of how the design principles of those computers still determine the architecture of today's machines. In many places a fascinating portrait of the decisive figures, above all John von Neumann.
G**O
This is a great book
....but you must love the physics of the magic thirty years (1920-1950) to fully appreciate it, and to appreciate the real-life tales of the unparalleled cluster of geniuses who, driven by the wars ravaging Europe, reassembled in America. True, the atomic and nuclear weapons that resulted were not things to be proud of, but without them we wouldn't have computers and a lot of other things, as you will discover reading this book.