Sunday, August 10, 2014

Turing's Cathedral

Before the flatscreen liquid crystal display, there was the cathode ray tube, or CRT. Those electronic furnaces were the standard display device of desktop computing in the 70s, 80s, and 90s. Before then, back when 1024 bits was a very large memory array and the largest computer in the world had 40 of those, the CRT was a storage device: numbers were written to the screen as blobs of light, and those blobs of light - 0s and 1s - were read back by another beam. And yet, for all their seeming primitiveness, much of what we do now was done before in just that environment.

Weather modeling was tested and proved. Thermonuclear reactions were modeled - and so was stellar evolution... and so was the evolution of the digital world, the computer modeling itself as a projection of the binary future we now live in without question. Stanislaw Ulam, John von Neumann, Oswald Veblen, Robert Oppenheimer, Freeman Dyson, and Alan Turing were among the roughly 25 or 30 pioneers, innovators, and visionaries of applied mathematics and frontline engineering, defined by a Platonic vision of the smartest people given free rein to do what they wanted without worrying about money.

In this detailed history of the earliest days of computing, the guest star, Alan Turing, is only a visitor. The center belongs to John von Neumann, easily a man of Newtonian genius who thought deeply and clearly from the broadest abstractions to the minutest technical detail. The atomic age and the information age brachiated from the trunk of Budapest's noble Jewry, its cafes, its huge houses of commerce. That culture was reflected in the fineries of university life across Eastern Europe and within the heart of Europe, Germany. And when the weeds of ignorance and hatred overran the garden of culture and learning, those who could came here. Among them, for a time, was Alan Turing. Though England was nominally a democracy, Turing would eventually be hounded to suicide by the oppressive and repressive politics of national security. That same drive enveloped the Institute for Advanced Study and drove barbed tendrils into its structure, stinging Oppenheimer, and then those who had sided with or against him.

Through all of that, the IAS machine ran continuously, a niggling 40 banks of 1024 bits modeling thermonuclear explosions and thus the evolution of stars. We can click on AccuWeather.com because meteorologists 70 years ago proved their theories with a 24-hour forecast that took 24 hours to run on the world's first computers. Our digital world of iPads and Androids and wearable tech evolved from an array of vacuum tubes that was used to model the Darwinian evolution of an array of bits in a competitive environment.
