Automatically imported from: http://commons.somewhere.com:80/rre/1998/the.global.academic.vill.html
This web service brought to you by Somewhere.Com, LLC.
Information technology and higher education: The "global academic village" and intellectual standardization
Phil Agre http://communication.ucsd.edu/pagre/ June 1998
University administrators these days are planning for a world in which information technology is pervasive -- so pervasive, in fact, that the very institution of higher education begins to change. It is entirely possible, of course, that we can use information technology to improve higher education. But information technology is exceedingly flexible, and we will surely face numerous choices about how best to apply it. Some of those choices will be ethically straightforward matters of efficiency, best left to the experts. Other choices, however, will require us to reflect carefully on the values that a university ought to express. If we have learned anything from attempts to improve life using information technology, it is that significant improvements are only possible when institutions are rethought at a basic level. Some will argue that the necessary changes are inevitable, having been determined in advance by the technology. But such arguments should be examined with great care: in practice they will usually be found to encode important ethical stances that do admit reasonable alternatives.
Let us take an example. A recent letter to University of California faculty from the chair of the University's Academic Council, Sandra Weiss, discussed something called "course articulation", which she defines as "the degree to which students can build an additive degree program by taking courses either at different institutions or at the different campuses of one institution". This same idea is called "modularity" in Britain, where it was central to the higher education platform of the Thatcher and Major governments. On the motives behind course articulation, Professor Weiss explains that "we have moved into an era where individual campuses are becoming part of a larger academic community -- a "global academic village" so to speak". Information technology helps drive this trend, and Professor Weiss further explains that "[f]or technology-mediated coursework, we need to identify comparable content across courses that would be acceptable for transfer and also grapple with our expectations regarding traditional "face to face contact" between professor and student and among students themselves". (Quotes are from Professor Weiss's "Notes from the Chair" column in the May 1998 issue of the "Notices" of the University of California's Academic Senate.)
This sort of discussion refutes often-heard stereotypes of professors -- or "academic elites", as the new jargon would have it -- as Luddites engaged in bull-headed resistance to technologically driven institutional change. Quite the contrary, as Professor Weiss's letter illustrates, my own impression is that fundamental changes are being implemented as we speak, and that these changes are often taking place beneath the radar screens of most faculty, much less the broad public. Now, the University of California has perhaps the most robust traditions of shared governance of any public university in the world, and so the faculty here have no excuse if they are unaware that these things are going on. And although I have my differences with the University of California administration, I think the game is being played more or less fairly. Still, it is important that we step back and ask what we are getting ourselves into, and what choices we are actually making.
I believe that traditional practices of computer system design lead to an important phenomenon that I call "ontological standardization". When you write a computer program, almost the first step is to define the ontology that the program's data structures are going to reflect -- that is, what sorts of things you think the world is made out of, and therefore what sorts of data objects are going to be created and stored through the program's operation. The technical term for this is a "data model". In the case of higher education, one's ontology might include people, job titles, departments, courses, majors, and grades. The ontology, in other words, has (at least) those six components, and your program will only work right if everything the program needs to represent can be comfortably subsumed within one or more of those six categories.
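The point about data models can be made concrete. Here is a minimal sketch in Python (the essay names no programming language, and every class and field below is my own invention) of the six-category ontology just described:

```python
from dataclasses import dataclass
from typing import Optional

# A toy data model for a university, following the six-category
# ontology named in the text: people, job titles, departments,
# courses, majors, and grades. Class and field names are
# illustrations, not anything from the essay.

@dataclass
class Department:
    name: str

@dataclass
class JobTitle:
    title: str

@dataclass
class Person:
    name: str
    job_title: Optional[JobTitle] = None  # students carry no job title

@dataclass
class Course:
    code: str
    department: Department

@dataclass
class Major:
    name: str
    department: Department

@dataclass
class Grade:
    student: Person
    course: Course
    value: str
```

Anything the program needs to record must be subsumed under one of these six classes; an activity that fits none of them -- a seminar series that is not formally a course, say -- simply cannot be represented.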
In the old, unnetworked world, different organizations -- universities in this case -- all developed their ontologies somewhat independently of one another. Forces did exist toward what Walter Powell and Paul DiMaggio call "institutional isomorphism" -- for example, the frequent movement of administrators from one organization to another. In the world of networked computing, however, the forces for institutional isomorphism are greatly amplified. If each student makes only a single choice among hundreds of different four-year schools, it does not matter so much whether the internal workings of those schools can be mapped onto one another. But if we suddenly move to the opposite extreme by letting each student choose among those hundreds of schools for each course or even each class meeting, then suddenly the schools need to ensure that they mean the same thing by the very concept of a course or a class meeting. Thus far, this issue has arisen primarily in the context of mergers between corporations: if the two companies' computers don't talk to one another, say because each side has meant something different by a word like "employee" or "sales", then genuine havoc can result. Now, however, the same issue can arise in a wide variety of institutional contexts, even when separate organizations are not being formally merged. Ontological standardization, then, is what happens when most of the organizations in a given institutional field are required to harmonize what they mean by the most fundamental categories of their internal workings.
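The merger problem can be illustrated the same way. In the hypothetical sketch below, two schools both store a "course", but one counts semester credit hours and the other quarter units, so any program that moves records between them must first agree on a conversion. All names and figures are invented; the 2/3 factor is the common quarter-to-semester convention, and settling on such a factor is itself a small act of ontological standardization:

```python
# Hypothetical course records from two schools whose data models
# disagree about what a "course" is: one counts semester credit
# hours, the other quarter units.
semester_school = {"ECON 1": {"calendar": "semester", "credit_hours": 3}}
quarter_school = {"ECON 10": {"calendar": "quarter", "units": 4}}

def semester_equivalent(record):
    """Normalize a course record to semester credit hours.

    The 2/3 quarter-to-semester factor stands in for the kind of
    cross-institution agreement that ontological standardization
    requires; without it the two records cannot be compared at all.
    """
    if record["calendar"] == "semester":
        return record["credit_hours"]
    if record["calendar"] == "quarter":
        return record["units"] * 2 / 3
    raise ValueError("unknown calendar: %r" % record["calendar"])
```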
What are the practical consequences of this force? The trend toward course articulation is a good example. In the old days -- that is to say, the past up to and including right now -- universities competed on the basis of their distinctive programs: one university's economics department, for example, might be ranked above another university's economics department in some magazine survey. In such a world, each university is able to take its own distinctive approach, and then each program within a university is able to take its own distinctive approach within the overall context of the university. The University of Chicago emphasizes scholarship, for example, and Harvard emphasizes social networking. Each program is able to divide up its curriculum into courses however it likes in accordance with its own distinctive approach. And because decisions about program philosophy and course content are made locally by each faculty, on the basis of its own talents and its competition to attract the best students, the contents of courses and the boundaries between courses can change rapidly and flexibly to suit the evolving circumstances.
With ontological standardization, however, all of this threatens to change. It is, to be sure, a good thing to help students transfer between campuses: the possibility of transferring from a community college into the University of California system, for example, has been a basic part of California's higher education strategy for decades, even if the transfer students don't always have the easiest time of it. We need to recognize, however, that the ease of transferring courses between schools -- effectively assembling one's college education a la carte from among the offerings of a large number of potentially quite different programs -- may come at a significant price in intellectual diversity. If the internal modularity of degree programs must be coordinated centrally, or at least negotiated among numerous independent universities, then the result will be less flexibility and greater uniformity. Power over fine details of the curriculum will inevitably shift in the direction of accrediting organizations, university administrators, and other professional coordinators. Faculty may effectively lose the ability to write their own syllabi. The diversity of thinking and teaching at universities has long been important to the health of a free society. That is, for example, why professors get tenure once they have proven their abilities by passing through many levels of competition and testing. And it may be tempting to stereotype universities as having become dominated by one or another unpopular tendency as a pretext for standing by as the institution drifts into greater uniformity. But I think that would be a terrible mistake. We need to preserve the institutional conditions for a diversity of intellectual approaches.
As we decide how to use information technology in higher education, we face choices that follow a pattern. In the "old days", various important values -- in this case decentralization and diversity -- were guaranteed, or at least encouraged, by the limitations of the physical world. Universities were numerous and spread out, it was relatively difficult to transfer people and practices between them, and so different universities evolved along somewhat independent paths. Now, however, we only get that independence, that separate evolution and diversity of educational approach, if we actively choose it. We will make some of our choices out in the open. But we will make other choices implicitly, tacitly, as a seeming consequence of simply following through the logic that information technology imposes on us.
We have been disserved, I think, by "cyber" claims that information technology inherently and inevitably brings decentralization and diversity to the world. If my own argument has the slightest merit then this is not so, and indeed the opposite might be closer to the truth. I do not believe that technology has any essential and inevitable consequences, however. The traditional practices of computer system design first arose in military and industrial settings in which centralized coordination is a virtue, or at least in which centralization does not threaten important societal values. In higher education, however, it is a different story. Let us use technology when it helps us do our good work better. But let us not permit the technology and its customary practices to dictate important, value-laden changes in our institutions. And when the situation calls for it, let us develop new technology, or else wait until somebody develops it for us. The whole point of technology is to serve human purposes, but the burden of technology is that we must choose what those purposes are.