vol. 27 no. 3, September, 2022

Book Reviews

Swade, Doron. The history of computing. A very short introduction. Oxford: Oxford University Press, 2022. xi, 139 p. ISBN 978-0-19-883175-4. £6.99/$11.95

Oxford's series of "very short" introductions appears to have been highly successful for the publisher: the series now numbers more than 700 and there have been translations into 46 languages. "Very short" may be a little misleading, however: true, the books are in a very small format, but with 130 pages of text, presented in, I think, 10 point Times New Roman, I estimate that the text runs to approximately 50,000 words.

However, in this volume in the series, the words are well chosen, and the author's background as engineer, historian, and museum curator in London's Science Museum makes him admirably fitted to tell us the story of computing. That the story is worth telling is signalled by the author quoting Marvin Minsky: "Computer-like devices are unlike anything which science has ever considered..." (pp. 1-2), and noting that if this is so, "If computers belong to a new category of object..." (p.2), is a new form of history required to deal with them?

In the rest of the first chapter the author reviews the various modes of writing the history of computing and concludes by describing a number of "master narratives". The first of these narratives marries calculation, automatic computing, information management, and communications, to create the present "information age". This narrative guides the rest of the book, with occasional reference to the other narratives, which are: the timeline narrative, i.e., the chronological account of development; the scientific tool narrative, in which the computer is presented in terms of its usefulness; and the information machine narrative, which brings us to the present general information management function of the computer, the Web and telecommunications.

The remainder of the book draws upon these narratives to tell the story, first of calculating machines, from the abacus (still used in various countries around the world), through the slide rule and various mechanical calculating devices, to the electronic calculator. The story moves on to the emergence of automatic computation, beginning with Charles Babbage and, crucially for the history of computing, Ada Lovelace. While Babbage conceived his "difference engine" as a tool for mathematics, it was Lovelace who made the essential insight that numbers could stand as representations of "entities other than quantity". If we assign numbers to non-arithmetical entities, we can manipulate them and then re-translate the results back to the original meanings.

Swade points out that Babbage had no impact at all on the development of the modern computer and that his work was not studied until the 1960s, by which time the modern mainframe computer was already widely used. For a time there was a competitor to the digital computer in the form of the analogue computer, so called because its operations are analogous to the phenomenon under study. The two forms of computation existed side by side well into the 1970s and there are still some applications for which analogue computers are most useful.

The peak of automatic computing was attained by the Harvard Mark I: "The machine was massive–51 feet long, 2 feet deep, it consisted of some 760,000 parts, and weighed 5 tons." (p. 62). It also cost half a million dollars but, finally, as one writer put it, it was "Babbage's dream come true." The machine was completed in 1944 and used to assist the war effort and, of course, at the same time, things were happening at Bletchley Park in the UK.

The shift from automatic computing to electronic computing was somewhat drawn out, and there have been competing claims as to which was the first genuine electronic computer. Some cite Colossus, built at Bletchley Park to aid in the decoding of German military messages, but Swade is "uncomfortable" about this, noting that it was not a general-purpose electronic computer but one designed for a single task, which the author describes as information management rather than automatic computing. ENIAC is a more appropriate candidate for the title of first general-purpose electronic computer.

As with Colossus, the war effort was behind the development of ENIAC, built at the University of Pennsylvania. However, ENIAC used a decimal system in its calculations, not the binary system, which had been used by Konrad Zuse in his Z-series of machines in the 1930s. The author makes it clear that the architecture of the modern electronic, digital computer did not emerge pure and perfect from these early developments. Different developers, mainly in the USA and the UK, explored different ideas for improving memory storage, speed, and efficiency. Ultimately, what emerged was UNIVAC, which caused a media sensation by correctly predicting the outcome of the 1952 Presidential election in the USA, and, as the author notes, "UNIVAC" became the generic name for a computer.

And now, in the present, what do we have? Personal computers, locally and globally networked, the World Wide Web with billions of users, social media, e-mail, messaging systems, and much more. Truly, as Marvin Minsky noted, science has seen nothing like it before. And not only science: humankind has never seen anything like it, and, especially through the mobile phone, the computer has become a device of multiple daily uses. Babbage would be amazed: the modern computer is not simply a mathematical calculating device, but a calculator, an information manager, a communication device, and more. Doron Swade's story of how we got here should be of interest to anyone who uses a computer, of any kind.

Professor Tom Wilson
Editor in Chief.

How to cite this review

Wilson, T.D. (2022). Review of: Swade, Doron. The history of computing. A very short introduction. Oxford: Oxford University Press, 2022. Information Research, 27(3), review no. R746. http://www.informationr.net/ir/reviews/revs746.html

Information Research is published four times a year by the University of Borås, Allégatan 1, 501 90 Borås, Sweden.