Advanced Information Technology and Social Change
---
This message was forwarded through the Red Rock Eater News Service (RRE). Send any replies to the original author, listed in the From: field below. You are welcome to send the message along to others but please do not use the "redirect" command. For information on RRE, including instructions for (un)subscribing, send an empty message to rre-help@weber.ucsd.edu
---
Date: Mon, 22 Jun 1998 15:13:40 -0400
From: David Hakken
Advanced Information Technology and Social Change: The Worksite Connection
COSSA Congressional Breakfast Presentation, June 19, 1998
By David Hakken, Ph.D., Professor of Anthropology and Director, Policy Center, State University of New York Institute of Technology at Utica/Rome
1. Introduction
Understanding the relationship between technology and social change is important for several reasons. Perhaps the key one is the popularity of the "Computer" or "Cyberspace Revolution" (CR) idea, the notion that we are currently in the midst of a profound transformation to a new way of life brought about by computer technology. The CR View dominates discussion of "our times" and colors most social policy. Through changing work, it is believed, computers warrant change in labor market, education, and economic policy. To the extent this view misrepresents or oversimplifies the relationship of technology to social change, public policy is at risk. Thus, in considering what social science has to say about technology and social change, it makes sense to focus on computing at work as a key to broader social change.
I attempt today to summarize my own and others' research on computerized work. After a few clarifications of my "take" on computing, I will discuss
1. What some have improperly thought to be true about its connection to social change;
2. What we now think is probably true;
3. What we're not sure about; and
4. The policy implications of all this.
2. Advanced Information Technology at Work
As an anthropologist, my initial interest in work was piqued by the publication in 1972 of the report, Work in America, by the Federal Department of, then, Health, Education, and Welfare. In accounting for the "blue collar blues" and other problems evident at work, this report stressed a radical misfit between the "anachronistic authoritarianism" of the mass production worksite and the aspirations for meaningful work of a young, better-educated workforce.
The increasing role of Advanced Information Technology (AIT) in production of commodities has further complicated American workspaces. These complications are a preoccupation of what I refer to as my ethnography of cyberspace work. I use this phrase to refer to empirical study via extended fieldwork or "participant observation" of actual work processes involving, or "mediated by," computing, which may be the precursors of cyberspace. My focus is not just on machines but on the way people use them, the "computing actor network."
In the 1950s computers began to be integrated into offices and factories. In the 1970s, new forms of communication mediated by computers began to be used as well. The Advanced Information Technology of the 1990s integrates computer-mediated activities and communication. Our current period of enthusiasm over, e.g., "information infrastructures" follows from the new possibilities unleashed by AIT: Basically, representations of any work process that can be computerized can now be communicated anywhere there are computers. What have we learned in forty years of studying worksite computing?
3. What Some Have, It Turns Out Incorrectly, Thought to Be True about AIT at Work
One common presumption was that workspace computing would replace human work. This, after all, is what much of it was intended to do. While some groups of workers have been displaced by this new technology, other occupations have expanded. The more or less continuous overall growth of jobs in the United States contradicts this "disappearance of work" presumption at the general level. While some of the European economies have faced periods of consistent high unemployment, it makes more sense to attribute this to social and economic policy or short term adjustment to specific technology changes than to the general development of technology. While many good jobs -- some would even say the job itself as a stable social form -- have disappeared, this has not meant the end of work.
If we can represent by computer much activity previously done by humans, and have computers direct other computers, why haven't we reduced the need for workers? Worksite ethnography helps us understand this conundrum: Almost invariably, work mediated by computer turns out to be different. How this has happened is illustrated in insurance. Here, computer processing of policy writing has not been accompanied by fewer policy writers but by collection of more information on applicants. Another example is how it takes me just as long to write a paper using a word processor; I've merely increased the number of drafts. I like to think it has improved my writing, but sometimes I doubt it.
A second common but incorrect presumption was that new technology would generally democratize work. The capacity of AIT to send information quickly to everyone in an organization can support decentralized decision-making; however, it has not, in general, done so. Like many of my colleagues, I am pleased with the reduction in the number of paper memos that cluttered my mailbox, but they have been replaced by email "community messages" with less information. They let me know of events, but they tell me little about what others are thinking or which organizational policies are being currently emphasized, why, and how.
Perhaps the failure to democratize follows from the fact that information remains a precious resource in organizations. Whether information "wants" to be free or not, those who have it tend to retain control over it. In the use of intra-organizational electronic communications systems, there is a long-term trend toward shorter messages with less analysis. Indeed, as other tokens of authority, from executive parking spaces to the gray flannel suit, have faded, the power connected with control of access to key information has comparatively increased.
Many believe that new technology at work means a shift in the importance of various factors of production, from labor and capital to information and knowledge. My view that this is also a misconception is more controversial. A considerable body of work ethnography from the 1970s on has documented both the importance of the knowledge that workers brought to production in the past, and that they continue to bring in AIT-mediated work sites. Workers' detailed knowledge of how to do the work is of increasing importance in the "cyberfacture" processes described below. Certainly the relative importance of different types of workers has changed, as has the character of the knowledge contributed by each type, but these are shifts from one type of work knowledge to another, not a shift from workers to knowledge. The importance of venture capitalists in Silicon Valley similarly suggests little decline in the importance of capital. Perhaps popular consciousness has been confused by the evident decline in the power of workers to parlay their working knowledge into better deals with their employers. Through undermining the demand for older skills, AIT has something to do with this decline, but AIT has led to new groups of workers whose knowledge is in high demand, e.g., those able to address the millennium bug. The declining power of nations and trade unions to control terms of trade is an equally important factor in explaining this decline. Finally, popular consciousness encompasses contradictory conceptions of knowledge (for example, knowledge is held to be both highly generalizable and highly context-dependent). Lack of clarity about knowledge is too great for this notion of a shift to knowledge as the key factor in production to mean much more than a slogan. In sum, AIT has changed work less than is generally believed to be the case.
4. What We Now Think Is Probably True
What is undoubtedly true, however, is the importance of AIT artifacts, especially computers, as commodities. Indeed, this is probably their chief role in contemporary society. A large proportion of stock market growth is attributable, directly or indirectly, to the successful promotion, production, and sale of AIT. Postponement of Windows '98 really does threaten general well-being, at least in the short run. Consumers do spend money on computers that they might not have spent at all.
However, like the shift among kinds of workers, such a compositional shift in market content does not by itself mean a fundamental transformation in the general shape of society. Indeed, some of the most interesting applications of AIT in production actually extend the life of older forms. As a resident of New York's Mohawk Valley, I have access to some of the best rail track in the country. I ride the train less nowadays, however, because I've spent too many hours on sidings, watching higher priority, huge freight trains roll by. The retrofitting of computerized sensors to old right-of-way has integrated train hauling into "just in time" manufacturing, thereby extending current arrangements for carrying commercial cargo.
Another thing that we can say with confidence about AIT at work is that it plays an extremely important symbolic or ideological role. There is very little evidence to suggest that businesses buy computers only after "bottom line" strategic planning or detailed assessments of their need for increased information. Rather, field data suggest that computers are purchased in order to make organizations appear to be "up to date." This is one important source of the "productivity paradox," or the continued investment in huge amounts of AIT despite their failure to result in demonstrably increased output.
(I am not saying that computers don't increase productivity, only that most decisions to invest in them are made in the absence of data documenting such an increase. This is partly a measurement problem, following from the point noted above: Since computered work is typically done differently, evaluating its productivity seems like comparing apples and oranges.)
Meat cutting is one case where the role of AIT may be almost completely symbolic. Its trade journals manifest considerable interest in computer-mediated animal disassembly as part of the "modernization" program, yet very little actual machinery has made it onto the cutting floor (giving new meaning to the phrase, "Where's the beef?"). There, the chief technological innovation has been the addition of a third layer of leather to the "armour" protecting workers. Meanwhile, the industry has changed profoundly, with substantial reductions in wages, average periods of employment and therefore skill level, and unionization; substantial increases in injuries; and changes in the ethnic composition of the work force.
As an anthropologist, I have been particularly sensitive to cross-cultural and cross-national differences in AIT at work. There is, for example, an important historical difference between American and Scandinavian approaches to information system design at work. The standard American approach took the "turn key" system as the design ideal; that is, a system designed by engineers to replace workers and be self-contained, operable with the "turn of a key." In Scandinavia, system development emphasized extensive collaboration with those doing the work and likely to be using the system.
Recently, this Nordic "user orientation" has fed back into US design, in the forms of Participatory Design and Computer-Supported Cooperative Work. In the latter, for example, systems are designed to support work groups of interacting workers, not just individuals sitting in front of screens. This internationalization of PD and CSCW is indicative of the declining importance of national differences in approaches to computing.
There is reason to talk instead about a convergence in terms of two stages in work/AIT development. In the first stage, roughly from 1950 to 1985, computerization had much in common with previous forms of modern industry or "machinofacture." Emerging first in meat packing but most developed in the automated assembly line of the car companies, machinofacture concentrated upon the replacement of especially skilled work by self-powered machines. Accompanying introduction of machines in production was a radical expansion of ancillary work, especially a paper labor process running parallel to the "real" one. Via this paper parallel, engineers, clerks, secretaries, but especially managers drew information from the real workspace, processed it, and communicated the decisions made back into production.
Prior to 1985, except in Scandinavia, computerization tended to fit into this machinofacture pattern. After 1985, however, we begin to see a different, "cyberfacture" pattern emerging. Another parallel process emerges, one in which organization staff spend time in workshops, planning exercises, and classes on "corporate culture." Classes in occupational safety, English at work, and corporate citizenship occupy a considerable part of the modern meat cutter's time, as they do that of his counterparts in several other industries.
In cyberfactured worksites, the emphasis is on quality of performance and on teams. Computer systems are key to construction and communication of the information which such organizations spread more generally, as part of decentralizing control and allowing teams greater self-management. Computers are equally central to construction and communication of the centralizing symbols that sustain the performative force of this more "virtual" work, both among employees and to customers.
Cyberfacture forms of work seem to be emerging around the world. This is the case despite perhaps the most reliable generalization about AIT at work, the immense difficulty in anticipating either which particular AIT artifacts will develop or the specific changes with which they will be associated. Nonetheless, we can say with confidence that a variety of forms of public support have been essential to promoting the pace, while not dictating the form, of AIT development, and that study of AIT in use is essential if we wish to apprehend its connection to broader social change.
5. What We're Not Sure about
First, the long-term implications of cyberfacture are not at all clear. In some organizations, cyberfacture looks like an historic reversal of individuation, the core feature of Taylorization, the approach to work promoted as "scientific management." Taylorization separates each worker from the others, complete control of the worker's body being a step toward control of the work. At an organization like Sun Microsystems, or in many international futures trading offices, this process seems reversed. Elimination of individual offices and transformations of libraries, lounges, even lunch rooms into work spaces means that work cannot help but be more social. The group takes on many of the traditional roles of the boss, both supervisory and supportive.
Some social scientists see in these developments a shift in the historic balance between workers and managers, the real emergence of the workplace democracy for which computers have always had the potential. Others argue that it is merely a shift in the locus of control of the labor process, from control of the body to control of identity. Instead of robots replacing them, workers remain, but find their lives only in highly mediated, collectively constructed and monitored images on the screen. Pessimists also draw attention to the relative lack of interest shown by trade unions in Europe and the US in "high performance" organizational models.
Moreover, it is not at all clear whether cyberfacture will become the general pattern or remain an oddity in a few specialized places. Aspects of cyberfacture are spread broadly around the world, but not deeply; few organizations have managed to coordinate the changes in information systems and organizational structures that make up the "virtual work" model. Equally unclear are the geographic implications of the more recent forms of AIT-mediated work. Several social scientists have argued that computers disconnect work from any particular location: they "decouple space from place." Through virtual offices, telecommuting, or bio-engineering, there no longer needs to be a "there" at work. My sensitivity to this issue is the reason why I have referred here to "worksites" and "workspaces" rather than "workplaces". Yet others argue that industries like computer development and bio-technology reinforce the connection of space and place. Close, face-to-face, often university-mediated, personal interactions seem key to constructing the symbiotic "hothouse" relations of such apparently strategic industries.
6. Policy Implications
This far too rapid portrait of the proto-cyberspace worksite leads me to offer the following generalizations of relevance to US national policy. First, and apparently uncontroversial at the moment, there is value in continued public support for research on and development of computing. I would like to see even more emphasis given to evaluation of these programs of support, especially through the study of them in use.
In Scandinavia, for example, I became convinced of the need to shift away from information technology policy toward information policy, with less focus on hardware and more on how to promote its use. In contrast, we in the US build "information highways" in hopes that others will want to go where we've made it possible, and will want to strongly enough to build their own on-ramps. Instead, we could support groups in different organizations to identify information they'd like to share and then help them do so. Such a "bottom up" strategy can work, as long as networks remain open to further connections, that is, as long as they have an "open" architecture.
The Internet developed in just this way. It is a good example of how public support promoted broad accessibility generally by example and strategic funding; the World Wide Web developed along similar lines. I doubt that example and "carrot" funding will always be enough in cyberspace, however. Regulation, of labor as well as product markets, and of education, seems likely to be necessary.
With regard specifically to educational policy, I again stress the importance of increasing attention to the context of use. When I first started teaching computing almost twenty years ago, I often discouraged students from majoring in computer science. Better, I argued, to combine skill in the use of these machines with substantive knowledge of some other field, be it health, education, or automobiles.
The fluctuations in the demand for computer scientists over the years, despite their current, Y2K-related popularity, have reinforced this view. Educational programs that stress facility with things like participatory design, not just programming, can help cyberfacture become a truly new way to work.
(I will be happy to provide references to various aspects of the academic literatures summarized here. Please email me at hakken@sunyit.edu, or write me at
Policy Center
SUNY Institute of Technology
PO Box 3050
Utica, NY 13504.)
David Hakken
Professor of Anthropology
Director, Policy Center
SUNY Institute of Technology
PO Box 3050
Utica, NY 13504-3050
315-792-7437 (7503 FAX)
hakken@sunyit.edu