In Man and the Computer, John G. Kemeny analyzed aspects of society and its institutions, suggesting ways they might be improved through the application of computers. Many of Kemeny's predictions were largely correct. Some were very wrong. To be fair, the future of computers moved so fast that even William Gibson gave up cyberpunk for mainstream fiction.
| American Museum of Natural History Special Edition |
Generally, John Kemeny failed to anticipate the invention of the microcomputer and the resultant home and office desktops and laptops that are ubiquitous today. Although he wrote briefly about the advantages of video telephones, nothing he presented applies to the iPod. And much of what the iPod actually does – playing recorded music, taking pictures, giving directions to drivers – was not suggested for the video telephone.
Throughout the book, two complementary ideas frame the thesis: a regional network of mainframes will put personal terminals in the hands of millions, and this technology will empower "social analysts" to attempt solutions to our problems. Chief among those problems is traffic control in metropolitan centers, but that only reflects the full range of challenges from overcrowding, overpopulation, and overconsumption. In short, John Kemeny was a fascist. But a nice fascist. He would never put anyone into a concentration camp for their beliefs, but he did see government authorities and academic experts as the central forces that can and should define and drive social progress.
Writing about business applications for computers, he outlined the power of management information systems. He did not suggest that entrepreneurs would find computers helpful. He certainly did not see a role for entrepreneurs in creating new modes of computing. He did call for private entities to provide time-sharing mainframes and databases, but only to counterbalance the potential threat of government monopoly over information and access to it. If John Kemeny read any Friedrich Hayek, it could only have appeared as an indecipherable alien language, written in Roman letters.
That aside, this book was a seminal work and deserves attention for its positive attributes.
Early on (35-37) he grants validity to the importance of games on the computer. Gaming builds familiarity with the system, demystifying the computer.
That describes my experience meeting my wife in 1977 in front of an HP 9830 on which our physics instructor at Lansing Community College, Claude Watson, taught “BASIC for Arts and Science.”
“At Dartmouth we do not consider these recreational uses frivolous. First of all, they are an important resource for recreation in a residential college environment. But, more importantly, for many inexperienced users the opportunity for playing games against a computer is a major factor in removing psychological blocks that frighten the average human being away from free use of machines. Indeed, we are proud of the fact that one of the places that Dartmouth students take their dates to “show off” is the computation center. While they are likely to play several games there, they are also quite likely to show off with programs that they themselves have written.” (35)
As a teacher, Kemeny’s insights into the value of computer-aided instruction were accurate. (Ch. 7; 72-84; but also throughout) First, he acknowledged the importance of drill-and-practice. Beyond that, he recognized that one of the best ways to learn something is to teach it and we do that when we write a program to carry out a task. The three best uses from an educator’s point of view are rapid calculation, information retrieval, and the creation of algorithms. (80)
Kemeny correctly identified the need for a society of autodidacts, people who can teach themselves what they want to know by accessing databanks, journal articles, and lectures. (81-82). Books will remain important and pleasurable (83), but their content must be supplemented with interactive learning sessions.
Imagining "The Library of the Future" (Ch. 8; 85-98), Kemeny paints a detailed portrait of the product while totally missing the delivery mechanism. Focused on fiche images stored in card files and indexed by 100-word abstracts, he did not expect that I could put "John Leonard Riddell" into Google Scholar and be led directly to a PDF of a lost work, Riddell's 1836 monograph Memoir on the Nature of Miasm and Contagion. While some PDFs are image-only, this one is word-searchable. I can quote easily and exactly to show Riddell's assertion that disease is caused by germs ("animiculae").
| Thomas Kurtz (Mac) and John Kemeny (PC) |
For all of the prognostication and soothsaying, Kemeny in 1972 vastly underestimated the length, breadth, depth, and flow of the information revolution he would live to see welling up before he passed away too early in 1992 from heart failure. He calculated limits expressed in transmissions of megabits per second, delivering pages of storage, for dollars of rental time. Today we throw out computers with more capacity than the network he dreamed of in 1972 for 1980.
Of course, he could not see the limitations and contradictions that would evolve from within a successful informatic revolution.
Kemeny wrote about the need for "computer programmers." We are all computer programmers, but like oncologists who have their moles removed by dermatologists, we are specialists, too often as limited in our skills with computers as our parents before us. The other day at work, I saw two young managers scan receipts to JPEG files with their iPods to facilitate the filing of expense reports. As their technical writer, I know that neither of them knows much about word processing software, or they would not have Arial in their Calibri and they would know what the red squiggly underline is trying to tell them. I warrant that none of the 40 project managers on my current infrastructure integration project could write a program to translate Roman numerals into Arabic numbers, though many of them are certified for Microsoft Project. (In 2007, I myself completed an undergraduate "computer literacy" requirement by taking a class in Java, a kind of BASIC done up in Rococo.)
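For the curious, the exercise I doubt my project managers could complete fits in a dozen lines. This is my own sketch, in Python rather than Kemeny's BASIC:

```python
# Convert a Roman numeral to an Arabic (decimal) integer.
# A sketch of the exercise mentioned above, not anything from Kemeny's book.

ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50,
                "C": 100, "D": 500, "M": 1000}

def roman_to_arabic(numeral: str) -> int:
    """Sum the symbol values, subtracting when a smaller value precedes a larger one."""
    total = 0
    numeral = numeral.upper()
    for i, ch in enumerate(numeral):
        value = ROMAN_VALUES[ch]
        # Subtractive notation: IV = 4, IX = 9, XC = 90, and so on.
        if i + 1 < len(numeral) and ROMAN_VALUES[numeral[i + 1]] > value:
            total -= value
        else:
            total += value
    return total

print(roman_to_arabic("MCMXCII"))  # 1992, the year Kemeny died
```

It assumes well-formed input; rejecting malformed numerals like "IIII" would be the second half of the exercise.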
This post comes with some irony. As I read the book, I made notes in the front, a couple of words and a page number for each. Then I copied the page numbers and tags into this Word document to start the article. To get them into order, I highlighted and sorted. Seeing 111 ahead of 87, I went back and prepended zeroes to the two-digit integers. Then I resorted. Like much else, technology stays the same the more it changes.
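The sorting quirk behind my zero-padding workaround is easy to demonstrate. A few lines of Python (my choice of language, standing in for Word's sort feature) show why "111" lands ahead of "87":

```python
# Text sorting compares character by character, so "1" < "8"
# puts "111" before "87" regardless of numeric value.
pages = ["111", "87", "35", "130"]
print(sorted(pages))                      # ['111', '130', '35', '87'] -- wrong order

# The manual fix described above: pad every number to the same width.
print(sorted(p.zfill(3) for p in pages))  # ['035', '087', '111', '130']

# The programmatic fix: tell the sort to compare numerically.
print(sorted(pages, key=int))             # ['35', '87', '111', '130']
```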
Finally, Kemeny outlines a problem in simulation (traffic control) at some depth (130-135). He is correct that the general purpose digital computer allows flexible creation and testing of models. One promise not addressed was the testing of existing models.
Today's controversies over anthropogenic global warming bring this to the fore. A model is based on assumptions. Data tests those premises. Real validation or falsification can come only from new data not used to build the original model, or from different models applied to the original data. Aside from some curve-fitting with least-squares, we seem not to do much of this, certainly not in the social sciences. If the essence of John Kemeny's apology for computing is to be accepted, then modeling – both making and testing – must become a requirement for informatic literacy.
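The discipline described above – fit a model on one dataset, then judge it against data it never saw – can be sketched with an ordinary least-squares line fit. The numbers here are made up for illustration; the point is the separation of fitting data from validation data:

```python
# Fit y = a + b*x by least squares on "original" data, then check the
# fit against new observations the model never saw. Data is invented
# purely to illustrate the train/validate split.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (intercept, slope)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Data used to build the model.
train_x = [1, 2, 3, 4, 5]
train_y = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = fit_line(train_x, train_y)

# Validation: new observations, never shown to the fit.
test_x = [6, 7]
test_y = [12.0, 14.1]
errors = [abs((a + b * x) - y) for x, y in zip(test_x, test_y)]
print(f"intercept={a:.2f} slope={b:.2f} max held-out error={max(errors):.2f}")
```

If the held-out errors blow up, the model fit the original data without capturing anything real – exactly the failure mode that in-sample curve-fitting cannot detect.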
Postscript -- Speaking of literacy, to check the grammatical future in English, I googled "english grammar future tense" and read a Wikipedia article, then paged forward and read more at www.lousywriter.com. John Kemeny knew.