Red Rock Eater Digest - The Fall of Babbage's Theology

Tags: military, education, international, media, environment, internet-policy, cryptography, labor, ai, activism, cognitive-science, law, commerce, health, religion, writing
2001-01-03 · 17 min read

Source

Automatically imported from: http://commons.somewhere.com:80/rre/2001/RRE.The.Fall.of.Babbage..html

Content


---

This message was forwarded through the Red Rock Eater News Service (RRE). You are welcome to send the message along to others but please do not use the "redirect" option. For information about RRE, including instructions for (un)subscribing, see http://dlis.gseis.ucla.edu/people/pagre/rre.html

---

The Fall of Babbage's Theology

Philip E. Agre Department of Information Studies University of California, Los Angeles Los Angeles, California 90095-1520 USA

pagre@ucla.edu http://dlis.gseis.ucla.edu/pagre/

Revised version of a paper presented at the seminar on "People and Computers", University of Newcastle upon Tyne, September 1999.

This is a draft. Please do not quote from it or cite it. Version of 3 January 2001.

3600 words.

//1 Introduction

Computers are language machines. But language is also the principal residence of a society's historical memory. As a result, computers embody in their very design some of the deepest myths of the West. I propose to interpret the great car wrecks of information technology in terms of the dysfunctional ideas that have been written into it. The resulting critique builds on a generation of institutional and social studies of computing (Dutton 1998, Kling 1996); it rationally reconstructs the view of computing that this literature has set itself against. Technical talk always encodes stories about people and their lives. The challenge is to bring those stories to the surface, and to understand how computers are embedded in the cultural and political world. To perform such an analysis is not to question the intelligence and good will of technical people. The point, rather, is to recover an aspect of the collective unconscious as it has become manifest in our machines and the talk that surrounds them.

This is a social scientific approach to the analysis of technology. Its premise is that individual human beings participate in complicated social processes whose workings they are only partially aware of. This notion of partial awareness is important. An example is found in linguistics: few people can diagram the phonological structure of the language they speak. Field studies have even found people who are shocked to discover (by listening to a tape recording) which dialect of their own language they have been speaking (Blom and Gumperz 1986). The same principle applies to other social phenomena. Society exists nowhere except in people's interactions -- with one another and with the artifacts that they and others have produced. Yet those interactions generally have much more internal complexity than even the people who participate in them are aware of. Even individuals who might understand perfectly the goings-on in their own locality will generally not understand the role that their activities play in the much larger tableau of interlocking social processes that is spread across geographic and social space.

Because of this partial awareness, collective phenomena have a way of living through us. And it is these collective units of analysis -- institutions, economies, cultures, language -- that social scientists study. Of course, every social phenomenon has an individual level, and much of the purpose of social theory is to analyze the relations between individual and collective phenomena. When mistaken ideas about people and their lives are reproduced through computer-related talk and inscribed in computing machinery, that is a collective phenomenon that lives in the activities of individuals. The mistaken ideas get foisted on people through the ideologies they encounter in newspapers and classrooms, but they also get foisted through the computers themselves.

This technology-mediated distortion of ideas is a moral issue, and the question will arise of who is responsible for it. On one hand, the computer scientists cannot be held completely responsible for patterns of action whose workings and consequences they do not fully understand. Yet it verges on nihilism to accept that responsibility can never be localized as a result. The solution to this ethical puzzle, I would suggest, has two parts: (1) inasmuch as everybody is only partially aware of the social structures they perpetuate through their taken-for-granted everyday activities, the blame that falls on computer scientists is probably no greater than that which falls on anyone else; and (2) the engineers' main obligation is not to be perfectly self-aware, which is impossible, but to cultivate an ever deeper awareness of the inheritances and implications of their work. With that background in mind, I want to consider the particulars of the historical unconscious of computing.

//2 Babbage's theology

The history of computing is long and involved, and no individual can be anything more than a way-station on a long road. Charles Babbage, however, has a particularly strong claim to being identified as the inventor of computing, and more to the point he has a strong claim to having given modern form to the ideology of computing. Babbage is especially important as a transitional figure in the application of the medieval engineering worldview to the design of technology for the modern factory. For the medieval engineer, the purpose of engineering was to bring about God's perfect order on earth, and thereby to raise up the chaos of the physical and social world to something more like the crystalline rationality of heaven (Noble 1997, White 1978). This was Babbage's view as well.

Babbage viewed the factory as a microcosm, the engineer as an agent of God's order, and the computer as a means of administering the rational order of automation that God had directed the engineer to achieve (Schaffer 1994). Because the engineer was called to impose God's revealed order on the factory, he had no regard for the views of the people who worked there. In fact, the very existence of those people as human beings was hard to conceive in Babbage's system. Thus understood, it is obvious that Babbage's reading of religion was selective, and that the shortcomings of his engineering system do not reflect on religion. Nor is the point that Babbage's religious beliefs were the one unique cause of modern computing. The point, rather, is that Babbage's work represented one chapter in a much longer story of constant interaction between religious ideas and technical practices within the evolving context of political economy.

I believe that modern computing practice inherits the belief system that Babbage, following a long tradition, crystallized in his work. This belief system, I emphasize, is not something that is necessarily conscious and chosen. It structures a discourse and practice, and it is frequently rediscovered and recodified in one form or another by philosophers and ideologists. It lives, however, not in explicitly axiomatic form but in the vast network of practices that engineers learn and teach, and in the machines that they build, repair, study, extend, and modify. Each new generation of engineers is socialized into a way of doing things -- a way of talking, thinking, drawing, visualizing, calculating. Each of these many practices tends to interlock with the others, and the practices employed by each individual engineer must be compatible to some extent with those of others, and with the installed base of machines as they already exist. Engineering, like any other institution, is a tradition, and like most traditions it is only partially aware of what it consists of and how it could be different.

The tacit belief system that underlies the history of computing might be called Babbage's theology. It consists of six deep assumptions:

(1) Omniscience. The designer of a machine (and, by extension, of the total order in which the machine is intended to participate) has perfect knowledge of the task that the machine is to perform and the environment in which the machine is to operate. Although fine details might remain to be ironed out and new features might need to be added, nothing fundamental will be learned from observing the machine's use in practice, and no perspectives other than the designer's need to be taken into account in the design process.

(2) Omnipotence. The designer of a machine (and, by extension, of the work arrangements that the machine presupposes and requires) has the power to implement the design and to secure the cooperation of all of the parties who will interact with it. In practice this power is borrowed from the designer's employer, who also happens to own the factory, but that fact is nowhere acknowledged by the theology, which identifies the engineer and not the owner as the agent of God's rational order.

(3) Disembedding. The factory is a self-sufficient world unto itself, and can be considered apart from the rest of the world except for the flows of material inputs and outputs, which can be characterized simply and completely. The relevant environment for the design of the machine consists of nothing beyond the factory, the totality of which is under the designer's control.

(4) Perfection. The order that the designer institutes in the factory is optimal. In fact it is uniquely and ahistorically optimal, and no other order could equally well have been chosen. Once imposed it need never be changed.

(5) Discontinuity. The order instituted by design is wholly new. It need not contend with any inheritances, influences, or leftovers of the past. The current state of affairs need not be understood in any depth, since it will be entirely superseded by the design. The main exception arises when the information flows of the current order are mapped with the intention of reproducing them inside a machine. Nothing else about the current situation matters.

(6) Transcendence. The imposition of a perfect order upon the factory has the effect of lifting it above its messy, chaotic, arbitrary past. It will now partake of a qualitatively different nature more akin to the spiritual realm than the physical, or in medieval language to essences than to accidents.

The elements of this belief system are not independent, and in practice they tend to reinforce one another. They reinforce one another more tenaciously because the underlying unity has not been brought to the attention of the engineers whose work it organizes. The point, then, is not that the engineers are trying to do these things. It would be more accurate to say that they are trying not to do them. This is important. Because the six assumptions are so self-evidently false, engineering practices that conform to them routinely collide with the reality of industrial life. The question is what happens when those collisions occur. To the extent that the underlying logic of the belief system goes uncomprehended, repairs to engineering practice will inevitably be local and incremental. Because these local repairs will be artificial exceptions to the vast interlocking network of practices and ideas that is handed down through generations of engineering training, practice will tend to expel the innovation and reinstate the original problematic order. The history of engineering, including the history of computing, consists largely of this story: attempts at reform that relax the six assumptions followed by codifications that reinvent the underlying order. The underlying order of the assumptions reasserts itself because it is the only order that is remotely accessible from the worldview and practices that any given generation of engineers encounters.

//3 The need for reform

Once it is accepted that the practice of computing is indeed organized by the assumptions of Babbage's theology, the need for reform is obvious. And yet precisely because the assumptions of the theology are so insidious and so obstinate, it is important to articulate just why it is increasingly necessary to replace them. Some of the reasons why the assumptions are out of date might be enumerated as follows:

(1) The intrinsic evil and folly of trying to be God. Many of history's disasters have been caused by people who thought they possessed God-like powers, and by the political and technological cultures that supported them. Trying to be God is intrinsically corrupting because, among other things, it encourages the delusion of certainty and the neglect of information that does not fit the putatively revealed order. Engineering should certainly be organized by norms of rationality, but those norms are only useful to the extent that their premises are correct, that the information upon which they rely is complete, and that they are applied correctly, none of which is guaranteed to any human engineer. Designers who renounce omnipotence and omniscience, to start with, will be led to a profoundly different practice of design.

(2) The serial disasters that threaten both society and the profession of computing. Large software development projects routinely fail. The waste of resources from these failures can be astounding, but even worse is the danger that ensues when important social infrastructures do not function correctly. The foremost example is the air traffic control system in the United States, whose traffic continues to grow exponentially even as efforts to replace its ancient computing technology repeatedly fail.

(3) The ever more tangled environment of legacy systems. There once existed something called "computerization", which meant the transition from an environment without computers to an environment with computers. Because computers have become practically ubiquitous, at least in the developed world, computerization hardly exists any more. Instead, new computing systems necessarily join an ongoing ecology of computing resources, all of which make their own demands for compatibility (Kling, 1989: 506). The Y2K problems with digital representations of calendar dates made suddenly visible the tangled mass of legacy hardware and software upon which many organizations depend, even though the original programmers and even their source code are long gone. (A brief code sketch of the two-digit date defect follows this list.)

(4) The life cycle orientation. Managers of information systems have come to realize that a majority of costs are incurred after a system has been installed and put into use. These costs include maintenance, customization, and upgrades to new releases from the vendor. They also include the glue that integrates the often disparate elements of an organization's unique collection of computing resources. A new computer system is therefore not designed once-and-for-all; quite the contrary, the design process is simply one stage in a life cycle whose details, despite being unknowable almost by definition, must somehow be planned for in advance.

(5) The increasingly non-local nature of design. When systems are designed and built on a bespoke basis for particular facilities or organizations, the design can be adapted in some detail to the particulars of that environment, and the design process can likewise be decoupled from processes that might occur in other environments. Because of the economies of scale in software production, as well as the frequent need for compatibility, these isolated design activities are becoming obsolete. The whole idea of software development as design and programming is giving way to the global, political dynamics of standards. The development of standards, furthermore, interacts in complicated and unpredictable ways with the mass dynamics of their adoption. Nobody is an island any longer, and the viability of an organization's technical choices increasingly depends on the choices that other organizations make (Shapiro and Varian 1998).

(6) The ever-increasing efficacy of resistance. Computers are always used in the context of larger social arrangements, whether in homes, workplaces, transportation systems, military environments, or public spaces. Because many of these social arrangements are controversial, many people refuse to use computers in the way they are intended. Forms of resistance to computing include simple disuse, superficial use, appropriating the equipment for unforeseen purposes, security attacks, and sabotage. The earliest generations of computers were introduced into work environments that were already highly regimented, often replacing rationalized paperwork with its digital equivalents. But computing systems are increasingly designed for use in mobile and self-organized work, for professionals whose compliance cannot easily be coerced, and for consumers whose purchase of computer products is largely discretionary. These more autonomous computer users must increasingly be consulted and appeased if a new computer system is to succeed in practice.

(7) The increasing need for disciplinary knowledge in design. The concept of information promises to generalize across a wide range of applications. But different industrial sectors are increasingly developing their own distinctive disciplines of digital design, each of which requires a degree of immersion in that sector's particular knowledge and practices. Medical informatics, for example, contends with technological and administrative problems that differ from those of traditional industrial environments.

(8) The increasing centrality of business strategy in design. The architecture of any system that faces market competition, including open source systems that face proprietary competitors, must be designed with a view to the competitive landscape (Morris and Ferguson 1993). For example, a strategy should include a clear sense of which firms are competitors and which are allies, and the system's architecture should be modularized in a way that ensures that it will not require alliances with competitors or competition with allies.

(9) The increasing centrality of economic, legal, and policy issues in design. Computer system design initially arose as a branch of industrial automation, and as such it posed few distinctive problems beyond those that were already familiar from earlier generations of industrial practice. Having addressed basic problems of algorithms, systems analysis, and user interface, the computing profession is now struggling with the massive problem of interorganizational networking (Friedman 1989). As networked computing crosses organizational boundaries, all of the institutional complexity of those boundaries immediately comes to bear on design. Interorganizational systems often embody contracts and regulations, and they are subject to commercial and intellectual property law. Electronic commerce applications that conduct business with consumers face an especially complex legal environment. All of these social issues are capable of influencing fundamental architectural decisions, and must accordingly come to bear early in the design process. Yet they do not conform to the same logic as the requirements of traditional automation; a law, for example, is something quite different from an algorithm.

(10) The increasing complexity of human relationships mediated by the technology. Modern networked computing applications are capable of supporting complex contractual relationships, online community life, globally distributed workflow, banking settlements, military command structures, auctions, the ongoing news reports of extended families, and a wide variety of other institutionally specific forms of human relationship. Many aspects of these relationships can be supported with generic applications such as electronic mail. But more advanced distributed applications require an advanced understanding of the nature and dynamics of the human relationships that they support.
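What the two-digit date representations of point (3) actually break can be shown in a few lines. The sketch below is mine, not Agre's: it uses Python and hypothetical function names to illustrate, first, how interval arithmetic on two-digit years wraps around the century, and second, how the common "windowing" remediation merely guesses the intended century from a pivot value.

```python
# Hypothetical sketch of the Y2K defect: storing years in two digits
# makes interval arithmetic wrap around the century boundary.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive legacy-style subtraction on two-digit years."""
    return end_yy - start_yy

# A loan written in 1998 ("98") and maturing in 2003 ("03"):
print(years_elapsed(98, 3))   # -95, not the intended 5

# The usual remediation, "windowing", guesses the century from a pivot.
def expand_year(yy: int, pivot: int = 50) -> int:
    """Read two-digit years below the pivot as 20xx, the rest as 19xx."""
    return (2000 if yy < pivot else 1900) + yy

print(expand_year(3) - expand_year(98))   # 5, but only because the
                                          # pivot guess happens to be right
```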

//4 The nature of the problem

For all of these reasons, Babbage's theology will wreak increasing havoc until the practice of computing is reformed. Reform is impossible, however, without an understanding of how the theology is reproduced. It is not a simple catechism that could be changed by rewriting the introductory chapter of a textbook. Instead, it is distributed throughout the ideas and practices of computing, all of which must be reformed together if any of the individual reforms are going to succeed. Enumerating the many aspects of the problem is a vast undertaking, and so a few examples will have to suffice.

(1) Device orientation. One might suppose that computer science is the science of computers, but this is entirely misleading. A useful computer requires a story about the context in which it is used, and so computer science is largely the science of stories. Yet many of the stories that accompany computers are superficial. The real focus is on the computer itself, or on the strictly technical method that the computer embodies. The research process resembles a hammer looking for nails; having located something approximately nail-shaped, it is happy to return to the investigation of hammers.

(2) Reading a social story out of the device. It is often supposed that technology drives history, so that (for example) a decentralized computer network necessarily brings about a decentralized society. In practice, however, computer technologies are shaped by the existing institutions of society, and are appropriated by those institutions for their own purposes. Computers are a malleable technology, capable of inscribing a wide variety of stories about society. Attempts to predict social changes from the architecture of a computer may simply recover the mistaken ideology that was programmed into it, and they will certainly underestimate the complexity of the social forces that will take hold of it once it is deployed -- assuming that it is ever deployed at all (Orlikowski 1993).

(3) Language. The discourse of computing routinely redefines words in ways that occlude the complexity of the computer's embedding in its social environment. In ordinary usage, for example, the word "search" is capable of naming a complicated activity that employs diverse resources and is embedded in a variety of social relationships. The discourse of computing, however, redefines "search" to mean only one small part of this large phenomenon, namely the things that happen inside a computer once a "search" command is issued (Agre in press). In this way, terminology can render the social embedding of computers almost unthinkable. (A brief code sketch of this narrowed sense follows this list.)

(4) Reward systems. Research in computer science rewards theorems and systems, and it rewards them with little regard for the sophistication or the accuracy of the researcher's larger story about the social world in which the results of the research are embedded. Research that does not produce theorems and systems is not even regarded as computer science, or indeed even as research, in the great majority of computer science departments.

(5) The opposition between "soft" and "hard". The partition between computer science and other fields is enforced in large part by a series of invidious distinctions that are built into the disciplinary language. A prominent example is the notion that mathematics-based research methods are "hard", whereas others are "soft".
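To make point (3) above concrete, here is a sketch of what "search" denotes once the word has been narrowed to the things that happen inside the computer. The example is mine, not Agre's, and binary search merely stands in for the technical sense of the word: a well-specified procedure over a well-specified data structure, with the surrounding social activity already defined out of the problem.

```python
# "Search" in the narrow technical sense: everything beyond a sorted
# list and a query has been abstracted away before the code begins.

def binary_search(items: list[int], target: int) -> int:
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # prints 3
```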

//5 Conclusion

Once these unfortunate prejudices are finally overcome, what might a reformed practice of computing look like? Inherited patterns of thought shape the imagination powerfully enough that it is hard to know. Participatory (Schuler and Namioka 1993), concurrent (Nevins and Whitney 1989), and industrial (Baxter 1995) design certainly provide elements of an alternative model. More generally, one way to understand the entire critical literature on computing is as a search for a finite design, that is, a practice of design that acknowledges and takes seriously some hard facts: that designers are finite, that they are human, and that they are not God. This sounds obvious enough in the abstract, but like much else in the social world it is more easily said than done.

//* References

Philip E. Agre, Cyberspace as American culture, Science as Culture, in press.

Mike Baxter, Product Design: A Practical Guide to Systematic Methods of New Product Development, Chapman and Hall, 1995.

Jan-Petter Blom and John J. Gumperz, Social meaning in linguistic structures: Code-switching in Norway, in John J. Gumperz and Dell Hymes, eds, Directions in Sociolinguistics: The Ethnography of Communication, Oxford: Blackwell, 1986.

William H. Dutton, ed, Society on the Line: Information Politics in the Digital Age, Oxford University Press, 1998.

Andrew L. Friedman, Computer Systems Development: History, Organization and Implementation, Chichester, UK: Wiley, 1989.

Rob Kling, ed, Computerization and Controversy: Value Conflicts and Social Choices, second edition, Academic Press, 1996.

Charles R. Morris and Charles H. Ferguson, How architecture wins technology wars, Harvard Business Review 71(2), 1993, pages 86-97.

James L. Nevins and Daniel E. Whitney, eds, Concurrent Design of Products and Processes: A Strategy for the Next Generation in Manufacturing, New York: McGraw-Hill, 1989.

David F. Noble, The Religion of Technology: The Divinity of Man and the Spirit of Invention, New York: Knopf, 1997.

Wanda J. Orlikowski, Learning from Notes: Organizational issues in groupware implementation, The Information Society 9(3), 1993, pages 237-250.

Simon Schaffer, Babbage's intelligence: Calculating engines and the factory system, Critical Inquiry 21(1), 1994, pages 201-228.

Douglas Schuler and Aki Namioka, eds, Participatory Design: Principles and Practices, Hillsdale, NJ: Erlbaum, 1993.

Carl Shapiro and Hal Varian, Information Rules: A Strategic Guide to the Network Economy, Boston: Harvard Business School Press, 1998.

Lynn White, Jr., Medieval Religion and Technology: Collected Essays, Berkeley: University of California Press, 1978.

end
