A History of Modern Computing: Second Edition
Author: Paul E. Ceruzzi
List Price: $22.95
Publisher: MIT Press (01 May, 2003)
Sales Rank: 63,004
Average Customer Rating: 3.67 out of 5
Customer Reviews
Rating: 4 out of 5
4 STARS for Ceruzzi
Very informative and exciting.
Ceruzzi made it simple for all to understand how computers came about, from the 1940s until today.
Rating: 1 out of 5
USA-centric and flawed
Here in America we say that books are Eurocentric, but we have no name for the corresponding phenomenon in our own cultural life, so "American-centric" is my term of art for books that narrate culture and technology as if no interesting developments happen beyond our shores. The consequences of this ignorance, as we have seen, can be deadly, for one of the reasons for non-Western extremism is our instinct to treat non-Western participation in our culture and technology with disdain.
Thus, as Ceruzzi fails to narrate, Algol is really the only common ancestor of usable programming languages, yet Ceruzzi dismisses Algol because it was not a commercial success. Algol was not a commercial success because IBM failed to support it in the decade after its 1958 debut, and then attempted to usurp it with the vaporware PL/I, for which IBM's programmers failed to produce an adequate compiler until the mid-1970s. Nonetheless, Algol's block structure proved to be the only rational way of thinking about program structure, in contrast to Fortran's.
But Ceruzzi not only naturalizes American technical praxis along the dimensions of geography, he also naturalizes it along a temporal axis in which the mainframe era was a failed try at modern praxis.
Thus the "colorful" Herb Grosch does get his picture in Ceruzzi's book...and with his goatee poor Herb looks slightly fraudulent.
Grosch's law, which held that a computer's power increases as the square of its cost (so that the largest machines deliver the cheapest computation), was obviously self-serving from the standpoint of Herb's employer, IBM. Herb left IBM in the late 1960s, and the history of how men like Herb were compromised, their feelings and thoughts occluded by corporate goals, remains unwritten.
Herb's law was falsified by the discovery in the late 1960s that large systems (such as MIT's Multics project) required software so complex that their promise could not be delivered, and today's law is Moore's law, which observes that the number of transistors on a chip, and with it the chip's "power", doubles roughly every two years as the components shrink.
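The exponential claim behind Moore's law can be made concrete with a little compounding arithmetic. The sketch below assumes the commonly quoted doubling period of roughly two years; the starting figure is the Intel 4004's widely reported transistor count of about 2,300 in 1971, used purely as an illustration:

```python
# Moore's law as commonly stated: transistor counts double roughly
# every two years. A quick compounding check (figures are illustrative).
def transistors(initial, years, doubling_period=2):
    """Project a transistor count forward under steady doubling."""
    return initial * 2 ** (years / doubling_period)

# 30 years of doubling every two years is 2**15 = 32,768-fold growth,
# taking ~2,300 transistors in 1971 to roughly 75 million by 2001.
print(round(transistors(2300, 30)))
```

The point of the exercise is that any fixed doubling period, however plausible it looks over a few decades, is an empirical regularity about a labor process, not a law of nature.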
Common to both "laws" is the naturalizing error of neoclassical economics, which acts as if history does not exist. Moore's law does appear to hold today, as chip designs deliver what is miscalled computer "power" (the "power" to deliver wrong answers at high speed should be deconstructed; it is really mere clock speed) at an exponentially increasing rate. But an historical perspective should remind us that this, too, shall pass.
Making smaller chips is a labor process, one that has damaged the water table of places like Silicon Valley, and one that represents the personal choices of venture capitalists to fund, entrepreneurs to entrep, and employees to work in moon suits that are damned itchy at the end of the day.
Moore's law, like so many "laws" of neoclassical economics, declares that in 1971 we stumbled upon a fact of nature, like Parson Malthus observing the lads cavorting with milkmaids. It is secretly normative (like so many laws of the dismal science) in that it commands us to conform to this fact of nature as a ticket to adulthood.
Perhaps "computers are takin' over." But a critical history of technology, which to me is the only study worthy of the name of history, would read against the grain. It would narrate world praxis in hardware and in software, as a 1999 article in the IEEE Annals of the History of Computing did when it showed how the Swedes got by in the 1960s without IBM mainframes. It would narrate victim history, including the very interesting history of computer programmers, who, it seems, have been an invisible class because they represent, all the way down, a counter-narrative to the dominant narrative of an autonomous technology to which we must conform. (For example, the biography of computer pioneer Ted Nelson is more interesting than that of John von Neumann.)
A very useful result of such a history would be applied retro computing, for while mainstream historians like Ceruzzi are laying the past to rest, libraries, universities, and other institutions are losing data by losing the software that formats and reads older data files. XML (Extensible Markup Language) tries to address this problem, as did Ted Nelson's Xanadu system, but technical innovations, useful as they are, by definition do not address existing Lotus 1-2-3 spreadsheets (or the moldering Algol compiler I discovered at Princeton).
I look to a book and software system on CD-ROM that would preserve, not the physical realization of outdated systems like the IBM 7090 or TRS-80, but their important feature, which is the "architecture" they presented to their actual programmers. While building a retro computing encyclopaedia would be a formidable task, it would be made easier by describing the architectural interface of each computer in a form that a modern system can "compile" into a program simulating the old computer, thereby presenting the user of the encyclopaedia with actual running examples of old software.
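The approach sketched above, describing an architectural interface as data and letting a modern system execute programs against it, might look something like this minimal sketch. Everything here is hypothetical and illustrative (the instruction names, the architecture dictionary, the 36-bit word borrowed loosely from machines like the IBM 7090); it is not any actual encyclopaedia format:

```python
# A minimal sketch of "architecture as data": a hypothetical historic
# machine is described declaratively, and a generic interpreter runs
# programs against that description. All names are illustrative.
ARCH = {
    "word_size": 36,       # e.g. a 7090-style 36-bit word (illustrative)
    "registers": ["AC"],   # a single accumulator
    "memory_words": 64,
}

def run(program, memory):
    """Interpret (opcode, operand) pairs against the declared architecture."""
    mask = (1 << ARCH["word_size"]) - 1
    ac = 0   # the accumulator declared in ARCH["registers"]
    pc = 0
    while True:
        op, arg = program[pc]
        pc += 1
        if op == "LOAD":       # AC <- memory[arg]
            ac = memory[arg]
        elif op == "ADD":      # AC <- AC + memory[arg], wrapping at word size
            ac = (ac + memory[arg]) & mask
        elif op == "STORE":    # memory[arg] <- AC
            memory[arg] = ac
        elif op == "HALT":
            return memory

# Run a three-instruction program: memory[2] = memory[0] + memory[1]
memory = [0] * ARCH["memory_words"]
memory[0], memory[1] = 2, 3
result = run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)], memory)
print(result[2])  # 5
```

A real encyclopaedia entry would of course need far richer descriptions (addressing modes, I/O, timing), but the design point stands: once the interface is data rather than hardware, every modern machine becomes a host for the old software.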
To modern-day crowds trooping through the Smithsonian, computers are physical objects. But actual programmers know that computers are ideas in the mind, and a retro encyclopaedia would be a fascinating narrative of how Turing's idea created the postmodern era. It would also make clear that the old fraud, Marx, was right, for the value computers have created for society consists in a deep labor of understanding architectures well enough to craft instructions, including the most despised yet most valuable instruction: "computer, here is a language in which I shall speak, and here is how you shall translate that language."
This is a grand yet critical narrative, for it shows that Leibniz was wrong: let us not calculate, sir; let us communicate. I probably expect too much of poor Mr. Ceruzzi, who appears to be of the tribe of people with whom I made acquaintance at Princeton: the humanists who honestly apply their narrative skills to technology. But it appears that in America, no one has answered Derrida's 1978 call for a critical reading of technology.
Rating: 4 out of 5
Paul E. Ceruzzi, a curator at the National Air and Space Museum, describes the development of computing, starting with its earliest history. He examines the beginnings of commercial computing from 1945 to 1956 and traces the history of computer hardware and software, dividing these developments into five- to ten-year periods. His book emphasizes technical development rather than personalities or business dynamics, a focus that contributes to its fairly dry, academic style. With this caveat, we [...] recommend the book primarily to those with a technological bent, such as professionals in operations and computer sciences, and academics in the field. If you are interested in the subject, you'll love this: Ceruzzi provides an informative and comprehensive saga, including extensive footnotes and a bibliography that runs about 80 pages.
· Inventing the Internet (Inside Technology)
· A History of Computing Technology, 2nd Edition
· Computer: A History of the Information Machine (The Sloan Technology Series)
· The Universal History of Computing: From the Abacus to the Quantum Computer
· From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry