A Science of Operations: Machines, Logic and the Invention of Programming
Publisher: Springer Publishing Company, Incorporated
ISBN: 978-1-84882-554-3
Published: 25 February 2011
Pages: 350
Abstract

Today, computers fulfil a dazzling array of roles, a flexibility resulting from the great range of programs that can be run on them. A Science of Operations examines the history of what we now call programming, defined not simply as computer programming but more broadly as the definition of the steps involved in computations and other information-processing activities. This perspective highlights how the history of programming is distinct from the history of the computer, despite the close relationship between the two in the twentieth century. The book also discusses how the development of programming languages is related to disparate fields that attempted to give a mechanical account of language on the one hand, and a linguistic account of machines on the other.

Topics and features:
  • Covers the early development of automatic computing, including Babbage's mechanical calculating engines and the applications of punched-card technology
  • Examines the theoretical work of mathematical logicians such as Kleene, Church, Post and Turing, and the machines built by Zuse and Aiken in the 1930s and 1940s
  • Discusses the role that logic played in the development of the stored-program computer
  • Describes the standard model of machine-code programming popularised by Maurice Wilkes
  • Presents the complete table for the universal Turing machine in the Appendices
  • Investigates the rise of initiatives aimed at developing higher-level programming notations, and how these came to be thought of as languages that could be studied independently of a machine
  • Examines the importance of the Algol 60 language, and the framework it provided for studying the design of programming languages and the process of software development
  • Explores the early development of object-oriented languages, with a focus on the Smalltalk project

This fascinating text offers a new viewpoint for historians of science and technology, as well as for the general reader. The historical narrative builds the story in a clear and logical fashion, roughly following chronological order.

Reviews

G. Smith

Priestley makes an important distinction in his book: he unpacks programming from the hardware, which is an intriguing twist for the history of computing. This work will be of interest to programmers and computer scientists, as well as to a more general audience. The work is not ostensibly a history, but interestingly enough, it ties the scientific revolution of the 17th century to more contemporary programming elements. Priestley teases out an idea implicit in computing history: "to describe the history of the idea of a programming language." The history of programming here, then, straddles computing machinery and computer programming. The crux of the issue, and a central thesis of the book, is an apparent paradox: Fortran and Cobol were widely used from the time of their origin, so how is it that a language that was a relative failure in practical terms, Algol, later came to be regularly described as the most influential of the early programming languages (p. 225)? If Priestley is correct, and I believe he has produced enough evidence to demonstrate this point, Algol is not simply another programming language, "but rather a coherent and comprehensive research programme within which the Algol 60 report had the status of a paradigmatic achievement" (p. 225). This, then, is a paradigmatic moment in the sense articulated by the historian of science Thomas Kuhn. Algol is so significant that it establishes the first theoretical, paradigmatic framework for the subsequent process of software development.

The debate about machines and languages originates with the pioneering work of Francis Bacon. Bacon noted successes in the mechanical arts; in short, "experiments were mechanical aids to the senses" (p. 4). Bacon envisioned scientific progress guided by rules for the mind, improved by experiments in the mechanical arts. Charles Babbage took the critical next step by taking the metaphor of the machine literally, designing calculating engines that "combined the new, mechanical philosophy of algebra with the physical power made available by the machine-based industry of the industrial revolution" (p. 15). Babbage's innovation was to substitute a mechanical exercise for an intellectual process, one performed with a celerity and exactness previously unattainable. Babbage never built his automatic computing engines and, as Priestley stresses, he was not a pioneer of the computer. Instead, viewed in historical context, Babbage's machines were primarily "for the numerical evaluation of algebraic formulae" (p. 49). And yet, in a striking example of design convergence, Babbage anticipated later features of mid-20th-century computing architecture. In each instance, builders "were trying to develop computers that were essentially calculators" (p. 49).

Emil Post and Alan Turing next simplified "the execution of computations to the extent that it became plausible to imagine that the human element could in fact be replaced by machines" (p. 65). Characterized as "effective computability," this notion, coupled with the work of the machine builders, led to independent investigations that pushed calculation toward ever more complete automation, diminishing the human role. Only by appreciating both of these independent yet related endeavors, the theoretical and the practical, does the application of "the semantic notion of obeying a command, namely the universal machine" (that is, a computer) become possible (p. 98).
The commonly accepted idea in computing history is that modern computing began shortly after the end of the Second World War with various independent endeavors. What Priestley points out, however, is that despite the similarities among these pioneering efforts, the University of Pennsylvania's electronic machine, the ENIAC, has often been distinguished from the others, the relay machines. Given his interest in programming, Priestley demonstrates "that a common approach informs the design of all these machines, one which represents a distinctive stage in the development of automatic computation" (p. 100). The limitation of all of these devices, however, was the striking amount of manual intervention still required to carry out a significant computation (p. 118). By 1950, these devices were "obsolescent, partly because of their limited computational capacity, but also because the design principles on which they were based had been superseded" (p. 123). The ENIAC design paved the way for its successor, the EDVAC, which marked a significant advance. The received wisdom has been that the EDVAC, and its intertwining of the logical and the practical, was the watershed event, or even the necessary component, in ushering in the era of modern computing. Priestley emphasizes, though, that it was Turing's characterization of the computer as a general-purpose machine, rather than a specialized calculator, that led to the modern computer. The stored-program property was at first viewed as a technical feature, "not as the defining property of a new technology, as it later became" (p. 154).

Although this is a technical work on programming, it could benefit from illustrations, which would help it appeal to a wider readership.

Online Computing Reviews Service
