A discussion of the core concepts of modern computing and their basis in history. Dr Doron Swade offers a new analysis of the history of computing. Rather than a linear progression from the mechanical phase to the electromechanical and then to the electronic, he argues, the history of computing is better understood as the diverging and merging of a series of streams, each representing a distinct computational function or paradigm: Calculation, Automatic Computation, Information Management, Communication and the Electronic Information Age.
31 October 2013
The Grand Narrative
of the History of Computing
Dr Doron Swade
The history of automatic computing is an emerging field. Despite its youth, an unmistakable narrative has already emerged across historical surveys, canonical textbooks, and popular histories.
The Grand Narrative
The history of computing looks to its roots in early number systems and counting. The tale typically starts with Roman calculi (pebbles) and ends with the smartphone, taking in on its way the surge of devices of the European Enlightenment (logarithms, slide rules, mechanical calculators), Charles Babbage’s calculating engines, Herman Hollerith’s tabulators, the boom of office automation in the early decades of the 20th century and, finally, the electronics revolution.
The tale told in this way implies a developmental continuity of sorts, a comforting form of evolutionary gradualism – a story of innovatory highlights representing welcome and cheerful progress. Some episodes are framed as ‘rescue narratives’ which portray innovation as a response to pre-existing need (Babbage, Hollerith, internal stored program) and this suggests that innovation was by and large welcomed by long-suffering or deprived practitioners newly relieved from onerous toil. This was not always the case.
The tale, being time-serial, suggests a chain of historical causation, of rolling influence. But there are features of historical change that do not sit comfortably with this model of relentless determinism: the omission of long periods of dormancy (‘the hundred dark years’); the tendency to obscure spontaneous and sometimes simultaneous invention (Konrad Zuse); issues of nascency, latency, and acceptance (John Atanasoff and ENIAC, Jacquard); the deselection of parallel influences and usage (slide rules, Curta, abacus); and, above all, the suggestion of monocausal progression which turns out in many cases to be wishful.
If we shift from this technocentric or innovation-driven model to a user-based model and focus on the human activities that the devices relieve, aid or replace, the landscape opens up and we can devise a more dimensioned map better able to address some of the difficulties. Four separate threads can be identified: calculation, automatic computation, information management, and communications. Each has its traditions and distinct material culture.
If we now look at the standard chapter sequence of the grand narrative we can see that it has been assembled by splicing together episodes from different threads often with no strong historical or functional connection between the juxtaposed elements.
The separate threads merged during the revolution in solid state electronics, that is, the micro-chip or integrated circuit era of the 1970s, 80s and beyond. Products now abound that defy categorisation into traditional classes of object. Hitherto distinct functions and usage have been assimilated into new host devices. The ‘collapse of categories’ or ‘death of classes’ signals the end of the grand narrative. Or does it?
From Calculation to Computing
Unplaiting the thread of automatic computing from the larger braid offers an historiographic corrective that allows us to explore computing’s distinctive development. Electronic computing is widely seen as a gift of modernism. It is identified with the modern electronic age and this has had the effect of eradicating prehistory. It is as though the modern era with its rampant achievements stands alone and separate from the past.
We can identify a suite of core ideas in modern computing: ‘mechanical process’, digital logic, the notion of an algorithm, system architecture, software and universality, and the internal stored program. These ideas are unmistakably modern, though almost without exception they emerged in the 19th century, specifically in the work of Charles Babbage and Ada Lovelace.
With the exception of the internal stored program this set of features is explicit in the designs for Babbage’s Difference Engines and Analytical Engines. The Analytical Engine, conceived in 1833, marks the transition from calculation to computation and the designs embody just about all the logical features of a modern digital computer including user programmability, conditional branching, iteration, system architecture, parallel processing and micro-programming.
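The Difference Engine mechanised the method of finite differences, which reduces the tabulation of polynomial values to repeated addition alone, with no multiplication required. A minimal sketch in Python may make the principle concrete; the example polynomial and values are illustrative, not drawn from Babbage’s own tables:

```python
def difference_table(values):
    """Build the leading column of a finite-difference table from
    the first few tabulated values of a polynomial."""
    rows = [list(values)]
    while len(rows[-1]) > 1:
        row = rows[-1]
        rows.append([b - a for a, b in zip(row, row[1:])])
    return [row[0] for row in rows]

def tabulate(initial_diffs, count):
    """Extend the table using repeated addition only, as the
    engine's cascaded adding mechanisms did."""
    diffs = list(initial_diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # fold each difference into the order above it
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

# Illustrative polynomial f(x) = x**2 + x + 41, seeded at x = 0..3
seed = difference_table([41, 43, 47, 53])  # leading differences: 41, 2, 2, ...
print(tabulate([41, 2, 2], 5))             # -> [41, 43, 47, 53, 61]
```

Because the second difference of a quadratic is constant, the whole table unrolls from three seed numbers by addition alone, which is precisely what made the calculation mechanisable.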
Babbage’s calculating engines signal the start of the movement to automate computation. In broader cultural terms they can be seen as the extension of the metaphor of industrial production to mental activity, and the idea of the ‘industrialisation of thought’ is evidenced in contemporary descriptions of his machines.
‘What is music but number clothed in sound?’ The final step to generalised computing is found in the idea that number could represent entities other than quantity – letters of the alphabet, notes of music. The idea that a computer is a generalised machine for manipulating symbols according to logical rules, and that the power of the computer in its relevance to us derives from the representational power of number, comes not from Babbage but from Ada Lovelace, Lord Byron’s daughter, friend and collaborator of Babbage. Lovelace appears to have seen the significance and potential of generalised computing in terms that eluded her contemporaries, even Babbage.
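Lovelace’s insight, that arithmetic on numbers can stand in for operations on whatever the numbers represent, can be sketched in a few lines. The pitch encoding below (semitones above C) and the transposition example are modern illustrative choices, not anything Lovelace specified:

```python
# Numbers standing for symbols: notes encoded as semitones above C,
# so that musical transposition becomes pure addition on the codes.
PITCHES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def encode(notes):
    return [PITCHES.index(n) for n in notes]

def decode(numbers):
    return [PITCHES[n % 12] for n in numbers]

def transpose(notes, semitones):
    """Shift a phrase by adding a constant to its numeric codes."""
    return decode(n + semitones for n in encode(notes))

# 'Number clothed in sound': a C major triad raised a perfect fifth
print(transpose(["C", "E", "G"], 7))  # -> ['G', 'B', 'D']
```

The machine manipulates only numbers throughout; it is the encoding and decoding at the boundary that make the same operations meaningful as music, text, or anything else symbols can express.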
The startling congruence of modern and early ideas, articulated a hundred years apart, largely without a direct line of forward influence, leaves us with the conclusion, both reassuring and confining, that these ideas embody something fundamental about the nature of computing.
© Dr Doron Swade 2013