Handbook of Communication Research: Chapter on "Infrastructure"

How to Infrastructure

Susan Leigh Star

Geoffrey C. Bowker

Department of Communication

University of California at San Diego

La Jolla, CA

lstar@ucsd.edu bowker@ucsd.edu

INTRODUCTION

Resources appear, too, as shared visions of the possible and acceptable dreams of the innovative, as techniques, knowledge, know-how, and the institutions for learning these things. Infrastructure in these terms is a dense interwoven fabric that is, at the same time, dynamic, thoroughly ecological, even fragile. (Bucciarelli, 1994:131)

The central topic of this chapter is the infrastructure that subtends new media: overwhelmingly for our purposes this means the Internet in all its guises. We seek to explore the development and design of infrastructure; our main argument will be that a social and theoretical understanding of infrastructure is key to the design of new media applications in our highly networked, information convergent society.

When we think of infrastructure in a common-sense way, infrastructure is that which runs "underneath" actual structures -- railroad tracks, city plumbing and sewage, electricity, roads and highways, cable wires that connect to the broadcast grid and bring pictures to our TVs. It is that upon which something else rides, or works, a platform of sorts. This commonsense definition begins to unravel when we populate the picture, and begin to look at multiple, overlapping, and perhaps contradictory infrastructural arrangements. For the railroad engineer, the rails are only infrastructure when she or he is a passenger. Almost anyone can flip an electric switch, for a variety of purposes. When the switch fails, we are forced to look more deeply into the cause -- first check the light bulb, then the other appliances on the same circuit, then look at the circuit breaker box, then look down the block to see if it is a power outage in the neighborhood or city, and finally, depending on one's home repair skills, consider calling an electrician. In addition, increasingly many of us are faced with infrastructures, designed by one group, that may not work for us. For instance, someone in a wheelchair notices the tiny (and not so tiny) barriers in arrangements that the able-bodied consider "wheelchair accessible." Four steps can be a mountain if the specific conditions of usability are overlooked. So we have three separate themes here:

1. The moving target of infrastructure, from easy-to-use black box to active topic of work and research;

2. The breakdown of infrastructure that opens the taken-for-granted;

3. The relative usefulness of infrastructure for different populations.

One thing that unites each of these strands is the notion that infrastructure is not absolute, but relative to working conditions. It never stands apart from the people who design, maintain and use it. Its designers try to make it as invisible as possible, while leaving pointers to make it visible when it needs to be repaired or remapped. It is tricky to study for this reason.

Perhaps for these reasons, infrastructure is an often-neglected aspect of communication studies, except in areas such as the specific regulation of an infrastructure, or the history of one type (such as the telephone, the automobile, or the Internet). Both kinds of studies are invaluable, and certainly central to the discipline of communication. Communication infrastructures include printing, telegraph, radio, fax, television, the Internet/Web, and movie production and distribution. Under some circumstances the human body becomes infrastructure, such as in the study of face-to-face conversations. Interestingly, in the study of body language, for example, human emotions and situations are the infrastructure, and the targets of study are facial and body muscles, posture and proximity. This chapter outlines several facets of importance to theory and research on the design, development and use of infrastructure. We will also refer to some policy articles for purely policy-related infrastructure issues, such as the regulation of telecommunications or the standards-setting process for the Internet. For deeper coverage of these issues, see Chapters xx.

Given the above qualifications, what then are we left with that IS infrastructure? Can it be anything? Like many concepts in communication, the relational quality of infrastructure points to that which is between -- between people, mediated by tools, and emergent (Jewett and Kling, 1991). Like the term communication itself, and others such as meaning, audience, free speech, or regulation, the exact sense of the term, and its "betweenness," are both theoretical and empirical questions.

We will work from Star and Ruhleder’s (1996) definition of the salient features of infrastructure in order to bound and clarify the term.

Embeddedness. Infrastructure is sunk into, inside of, other structures, social arrangements and technologies;

Transparency. Infrastructure is transparent to use, in the sense that it does not have to be reinvented each time or assembled for each task, but invisibly supports those tasks;

Reach or scope. This may be either spatial or temporal -- infrastructure has reach beyond a single event or one-site practice;

Learned as part of membership. The taken-for-grantedness of artifacts and organizational arrangements is a sine qua non of membership in a community of practice (Lave and Wenger, 1991; Star, 1996). Strangers and outsiders encounter infrastructure as a target object to be learned about. New participants acquire a naturalized familiarity with its objects as they become members;

Links with conventions of practice. Infrastructure both shapes and is shaped by the conventions of a community of practice, e.g. the ways that cycles of day-night work are affected by and affect electrical power rates and needs. Generations of typists have learned the QWERTY keyboard; its limitations are inherited by the computer keyboard and thence by the design of today’s computer furniture (Becker, 1982);

Embodiment of standards. Modified by scope and often by conflicting conventions, infrastructure takes on transparency by plugging into other infrastructures and tools in a standardized fashion.

Built on an installed base. Infrastructure does not grow de novo; it wrestles with the inertia of the installed base and inherits strengths and limitations from that base. Optical fibers run along old railroad lines; new systems are designed for backward-compatibility; and failing to account for these constraints may be fatal or distorting to new development processes (Monteiro and Hanseth, 1996).

Becomes visible upon breakdown. The normally invisible quality of working infrastructure becomes visible when it breaks: the server is down, the bridge washes out, there is a power blackout. Even when there are back-up mechanisms or procedures, their existence further highlights the now-visible infrastructure.

Something that was once an object of development and design becomes sunk into infrastructure over time. Therefore an historical, archeological approach to the development of an infrastructure like the Internet needs complementary sociological, regulatory and technical studies.

SOCIO-HISTORICAL ANALYSES OF INFRASTRUCTURE

Much current work in the socio-historical analysis of infrastructure has derived from the field of science and technology studies (STS). Latour and Woolgar’s Laboratory Life helped open a window to a more qualitative, intensively observational set of studies of scientific work and practice. The authors drew particular attention to the role of the scientists’ information infrastructure (articles, graphs, pieces of paper) in their work. By the 1990s, STS researchers began to turn their attention to the design and use of computing and information technologies (see e.g. Star, 1995). Woolgar has called this the "technical turn" in STS. Using many of the same techniques as had laboratory studies of science, these researchers studied (and in some cases, worked in partnership with) the design, use, manufacture and distribution of information technology (Henderson, 1999; Lucena, 1995; Collins and Kusch, 1998, to name just a few)[1].

The combination of the technical turn and studies of materials brought infrastructure to the fore in STS (see e.g. Star and Ruhleder, 1996; Latour and Hernant, 1999). The ethnographic eye that helped reveal the inner workings of science or technology research and development was helpful in interpreting infrastructure. Arguments about standardization, selection and maintenance of tools, and the right materials for the job of knowledge production have slowly come into center stage via this synthesis (Clarke and Fujimura, 1992). Along with this has come a rediscovery of some of the tools germane to cognate disciplines which had previously analyzed material culture and the built environment. These have included, inter alia, fields such as architecture (where scholars sometimes read the built environment as a kind of text); literary theory (especially those aspects of literary theory that help surface hidden assumptions and narrative structure), social geography (where the values and biases inherent in such tools as maps are a lively topic of inquiry) and information science (Library Trends; Bowker and Star, 1999).

Thomas Hughes (1983), in his historical discussion of the rise of electricity networks in Europe and the United States, developed a suite of tools for analyzing the development and design of infrastructure. He drew attention to the importance of reverse salients – technological, social or political sticking points which can slow the development of an infrastructure. Crucially, he argued that the solution to a reverse salient does not have to be from the same register as the problem – there might be a political solution to a technical problem and so forth (cf. Latour, 1996). Thus, for example, the technical problem of low bandwidth communications in email was partially solved by the social solution of using a complex set of ‘emoticons’ to provide a layering of the flat ASCII text. The design implication here is that there is no possible a priori assignation of different tasks to different bodies of specialists in the design of communication infrastructures for new media: the emergent infrastructure itself represents one of a number of possible distributions of tasks and properties among hardware, software and people (see Tanenbaum, 1996 for a full discussion of the variety of possible representations of the Internet, with different distributions between them).

Hughes further pointed out that the early electricity networks drew off skills developed in the design of gas networks before them, and canals before that – each generation of engineers imports solutions from one infrastructure to the next. This trend has continued with the current design of the Web, and it has both positive and negative consequences. On the positive side, it points to the fact that skills can be usefully transferred from one medium to the next. On the negative side, it can lead to a lack of imagination. Paul David (1995) has written about the ‘productivity paradox’ that saw productivity decline with the introduction of the electric dynamo in factories in the nineteenth century and with the introduction of the computer in organizations in the twentieth. He makes the strong point that it was not until ways of thinking through the new technology emerged that improvements occurred – the dynamo made a bad steam engine; the computer a bad typewriter. This draws attention to the importance of the metaphors that we use to think through technology: Mark Stefik (1996) speaks of a set of ‘archetypes’ that have been used to drive design on the Internet. It is important to recognize that these archetypes might also be traps.

David’s argument works because infrastructures don’t just exist by themselves – they are embedded in organizations of many kinds. Bowker (1994b) has developed the concept of infrastructural inversion to describe the fact that historical changes frequently ascribed to some spectacular product of an age are often more a feature of an infrastructure permitting the development of that product: thus the rise of computing machinery in the late nineteenth century was consequent on (not causative of) changes in office organization; the twin rise of life expectancy and the medical profession at the same time was consequent on (not causative of) the development of water and sanitation infrastructures. Operating this inversion is a struggle against the tendency of infrastructure to disappear (except when breaking down). It means learning to look closely at technologies and arrangements which, by design and by habit, tend to fade into the woodwork (sometimes literally!). Infrastructural inversion foregrounds these normally invisible Lilliputian threads, and furthermore gives them causal prominence in many areas normally attributed to heroic actors, social movements, or cultural mores. The inversion is similar to the argument made by Becker (1982) in his book Art Worlds. Most history and social analysis of art has neglected the details of the infrastructure within which communities of artistic practice emerge, focusing instead on aesthetics as if divorced from these issues. Becker’s inversion examines the conventions and constraints of the material artistic infrastructure, and its ramifications. For example, the convention of musical concerts lasting about two hours ramifies throughout the producing organization. Parking attendants, unions, ticket takers, and theater rentals are arranged in cascading dependence on this interval of time. An eight-hour musical piece, which is occasionally written, means rearranging all of these expectations – which in turn is so expensive that such productions are rare. Similarly, paintings are usually about the size that will hang comfortably on a wall. They are also the size that fits rolls of canvas, the skills of framers, and the very doorways of museums and galleries. These constraints are mutable only at great cost, and artists must always consider them before violating them. The infrastructure designer must always be aware of the multiple sets of contexts her work impinges on: frequently a technical innovation must be accompanied by an organizational innovation in order to work – the design of socio-technical systems engages both the technologist and the organization theorist.

Common to a set of key texts analyzing infrastructure (Friedlander, 1995; Bud-Frierman, 1994; Clarke and Fujimura, 1992; Fischer, 1992; Rosenberg, 1982) is a problematizing of this relationship between background and foreground. A given infrastructure may have become transparent, but a number of significant political, ethical and social choices have without doubt been folded into its development – and this background needs to be understood if we are to produce thoughtful analyses of the nature of infrastructural work. For example, a road network may be designed politically to prevent access by the poor to a given facility (Davis (1997) makes this point about Sea World in San Diego; Winner (1986) about beach access in New York). Equally, a highly standardized form like the Census form folds in strong claims about the nature and importance of a particular set of ethnic categories (Bowker and Star, 1999). Web designers frequently wrap choices about the kind of visitor they want into their website (few sites are designed with access for the blind, say, since HTML is written in such a way that it does not force designers to produce alternative text for every image they produce); complex sites often require a high speed connection – the less wealthy user with a slow modem will be excluded by her own understandable impatience.

HOW INFRASTRUCTURE HAPPENS

Both standardization and classification are essential to the development of working infrastructures. Work done on standards committees and in setting up classification schemes is frequently overlooked in social and political analyses of infrastructure, and yet it is of crucial importance. In this section, we will review work that explores these issues.

There is no question that in the development of large scale information infrastructures, we need standards. In a sense this is a trivial observation – strings of bits traveling along wires are meaningless unless there is a shared set of handshakes among the various media they pass through. An email message is typically broken up into regular-sized chunks, and then wrapped in various ‘envelopes’, each of which represents a different layer of the infrastructure. A given message might be encoded using the MIME (Multipurpose Internet Mail Extensions) protocol - this will allow various other mail programs to read it. It might then be chunked into smaller parts, each of which is wrapped in an envelope designating its ultimate address and its order in the message. This envelope will be further wrapped in an envelope telling it how to enter the Internet. It will then quite possibly be wrapped in further envelopes telling it how to change from a configuration of electrons on a wire to a radio message beamed to a satellite and back down again (Tanenbaum 1996; Hurley and Keller 1999). Each envelope is progressively opened at the other end, and the original message, reassembled from its constituent parts, then appears ‘transparently’ on your desktop. It’s the standards that let this happen.
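
The layering just described can be sketched in code. The following Python fragment is only a toy illustration of the envelope-within-envelope idea, not of the real MIME, TCP or IP formats; the field names and the chunk size are invented for the example.

    # A toy illustration of protocol layering: each layer wraps the message
    # in its own "envelope" (header) on the way down, and strips it on the way up.
    # The header fields here are invented; real MIME, TCP and IP headers are far richer.

    def send(message, destination):
        chunks = [message[i:i + 8] for i in range(0, len(message), 8)]   # application-level chunking
        packets = []
        for order, chunk in enumerate(chunks):
            transport = {"order": order, "total": len(chunks), "payload": chunk}   # transport envelope
            network = {"destination": destination, "segment": transport}           # network envelope
            link = {"medium": "wire-or-radio", "datagram": network}                # link envelope
            packets.append(link)
        return packets

    def receive(packets):
        segments = [p["datagram"]["segment"] for p in packets]   # strip link and network envelopes
        segments.sort(key=lambda s: s["order"])                  # reassemble in the right order
        return "".join(s["payload"] for s in segments)

    packets = send("Working infrastructures standardize.", "reader@example.org")
    assert receive(packets) == "Working infrastructures standardize."

The message appears ‘transparently’ at the receiving end only because every layer agrees in advance on what its envelope looks like.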

One observation that we can make at once is that it’s standards all the way down: each layer of infrastructure requires its own set of standards. We might also say that it’s standards all the way up. There is no simple break point at which one can say that communication protocols stop and technical standards start. As a thought experiment, let us take the example of a scientific paper. I write a paper about palaeoecology for the journal Science. I know that my immediate audience will not be leading-edge palaeoecologists, but a wider community who know only a little about Devonian plant communities. So I wrap the kernel in an introduction and conclusion that discuss in general terms the nature and significance of my findings. A journalist might well write a more popular piece for the Perspectives section which will point to my paper. If this is my first paper for Science, then I will probably go through quite a complex set of negotiations in the peer review process in order to ensure the correct ‘handshake’ between my message and its audience – is it written at the right level, have I used the right tone of authority, and so forth. Yates (1989) has written about similar sets of standards/protocols that arose with the development of office memoranda in the nineteenth century; she and Orlikowski have observed similar effects in the development of email genres (Orlikowski et al. 1993). I then need to be sure that the right set of economic agreements has been put into place so that my paper can be found on the web by anyone at any time (Kahin and Keller 1997).

This common vision of disciplinary, economic and network protocols serves the purpose of highlighting a central fact about infrastructures. It is not just the bits and bytes that get hustled into standard form in order for the technical infrastructure to work. People’s discursive and work practices get hustled into standard form as well. Working infrastructures standardize both people and machines. A further example will clarify this point. In order for the large scale states of the nineteenth century to operate efficiently and effectively, the new science of statistics (of the same etymological root as the word ‘state’) was developed. People were sorted into categories, and a series of information technologies was put into place to provide an infrastructure for government work (regular ten-year censuses; special tables and printers; by the end of the nineteenth century, punch-card machines for faster processing of results). These standardized categories (male; African-American; professional, etc.) thus spawned their own set of technical standards (80-column sheets – later transferred to 80-column punch cards and computer screens…). They also spawned their own set of standardized people. As Alain Desrosières and Laurent Thévenot (1988) note, different categories for professional workers in the French, German and British censuses led to the creation of very different social structures and government programs around them. Early in the nineteenth century, the differences between professionals in one country or another did not make so much difference: by the end of the century these differences had become entrenched and reified – unions had formed around them, informal dress codes had been adopted, and so forth.

At both the technical and the social level, there is no guarantee that the best set of standards will win[2]. The process of creating standards for infrastructures is a long, tortuous, contingent one. The best known stories here are the adoption of the QWERTY keyboard (good for its time in preventing keys from jamming in manual typewriters; counterproductive now for most users in that it puts most of the work onto the left hand – a hardship for many, though one appreciated by the southpaws (left-handers) amongst us – and yet so entrenched that there is no end in sight for it (David, 1986)); the victory of the VHS standard over the technically superior Betamax standard; and the victory of DOS and its successors over superior operating systems.

So why does the best standard not always win? There are two sets of reasons for this. The first is that in an infrastructure you are never alone – no node is an island. Suppose there are 500 users of DOS for every one user of a Macintosh. A software developer has less motivation to write software for Macs, which have a much smaller potential user base. So the strong get stronger. Going down one level into the infrastructure, companies are more likely to write APIs (Application Program Interfaces - to allow communication between one program and another) for interoperation between a new program and an industry-standard one than between it and a rarely used one. More generally put, a new kind of economic logic has developed around the network infrastructures that have been created over the past 200 years – the logic of ‘network externalities’ (David, 1986; David and Greenstein, 1990; David and Rothwell, 1994). The logic runs as follows. If a person buys a telephone for $50 and there are only 5 people on the network, then owning a telephone is not worth very much, even if the five are fast friends. If 5000 more people buy telephones, then the first five have not had to lay out another cent, but their phones are suddenly much more valuable. This situation describes positive externalities. De facto standards (such as DOS, QWERTY, etc.) gain and hold on to their position largely through the development of positive externalities. The design lesson here is that integration with existing programs is key to the success of new infrastructural tools.
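
The arithmetic behind this logic can be made concrete with a toy calculation. The sketch below assumes, purely for illustration, that the value of a telephone to its owner grows in proportion to the number of other people it can reach; the price and the per-contact value are invented numbers.

    # Toy model of network externalities: the price of joining stays constant,
    # but the value of membership grows with every new member.
    PRICE = 50.0             # invented cost of a telephone
    VALUE_PER_CONTACT = 2.0  # invented value of being able to reach one other person

    def value_to_each_member(network_size):
        # Everyone you can call, except yourself.
        return VALUE_PER_CONTACT * (network_size - 1)

    for size in (5, 50, 5000):
        print(f"network of {size:>5}: pay {PRICE}, value to each member {value_to_each_member(size)}")
    # The first five subscribers "have not had to lay out another cent",
    # yet their telephones become far more valuable as the network grows.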

Arguably some of the most far-reaching decisions of the past fifty years have been taken by standards-setting bodies, although one does not find a ‘standards’ section of the bookshop alongside ‘history’ and ‘new age’. Consider, for example, the political and social implications of the ‘open source’ movement. This movement has a long history, running deep into the origins of the Internet, proclaiming the democratic and liberatory value of freely sharing software code. The Internet, indeed, was cobbled together out of a set of freely distributed software standards. The open source movement has been seen as running counter to the dominance of large centralized industries – the argument goes that it puts power over the media back into the hands of the people in a way that might truly transform capitalist society (Poster, 1995). This promise of cyberdemocracy is integrally social, political and technical. While there is much talk of an ‘information revolution’ (though be it noted there has been fairly regular talk of same since the 1830s – Bowker 1994a), there is not enough talk of the ways in which people and communities are in part constituted by the new infrastructure. One important example here is the development of patient support groups on the Internet – allowing sufferers of a rare illness to share moral support, information about new drugs and medical trials, and folk treatments. Such patient communities would have been impossible in the past in the case of very rare diseases; and even for more common diseases they permit patients from rural areas access to the dense information networks of the city. Groups such as these frequently organize face-to-face meetings after they have ‘met’ on the Internet (Baym, 2000).

There are many models for information infrastructures. The Internet itself can be cut up conceptually in a number of different ways. There is, over time and between models, a distribution of properties between hardware, software and people. Thus one can get two computers ‘talking’ to each other by running a dedicated line between them, or by preempting a given physical circuit (hardware solutions), or by creating a ‘virtual circuit’ (software solution) which runs over multiple different physical circuits. Or finally you can still (and this is the fastest way of getting terabits of data from San Francisco to LA) put a hard disk in a truck and drive it down… (Tanenbaum, 1996). Each kind of circuit is made up of a different stable configuration of wires, bits and people; but they are all (as far as the infrastructure itself is concerned) interchangeable. One can think of standards in the infrastructure as the tools for stabilizing these configurations. There is a continuum of strategies for standards setting. At one end of the spectrum is the strategy of one standard fits all. This can be imposed by government fiat (for example, the Navy’s failed attempt to impose Ada as the sole programming language for its applications) or can take the form of an emergent monopoly (for example, Microsoft Windows/NT). At the other end of the spectrum is the ‘let a thousand standards bloom’ model. In such a case, one concentrates on producing interfaces such as APIs (Application Program Interfaces, which permit two applications to share data) between programs, and on such standards as the ANSI/NISO Z39.50 information retrieval protocol. Z39.50 is a standard that has been developed to make a single enquiry over multiple databases; it has been very widely adopted in the library world. You can use whatever database program you wish to make your bibliographic database with, provided that the program itself subscribes to the standard. This means in essence that certain key fields like ‘author’, ‘title’ and ‘keyword’ will be defined in the database. Now, instead of having to load up multiple different database programs in order to search over many databases (a challenge of significance well beyond the library community, as large scale heterogeneous datasets come into center stage in a number of scientific and cultural fields), one can frame a single query using the Z39.50 standard and have the results returned seamlessly from many different sources. One way of describing these two extremes is that the former is the ‘colonial’ model of infrastructure development, where the latter is the ‘democratic’ model. There is some truth to the implied claim that one’s political philosophy will determine one’s choice; and there is some solace for democrats in the observation that with the development of the Internet the latter has almost invariably won out – most Internet standards have been cobbled together in such a way that they permit maximal flexibility and heterogeneity. Thus, for example, if one tracks the emergence and deployment of collaborative computing, one finds that large scale stand-alone programs have done much worse than less powerful programs that permit easy integration with one’s other programs and with the Internet (Star and Ruhleder, 1996).
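
The ‘thousand standards bloom’ strategy can be pictured schematically. The Python sketch below does not implement the Z39.50 protocol itself; it only illustrates the underlying idea that any number of differently built catalogues can be searched with one query, so long as each exposes the same agreed-upon fields. All of the names and records in it are invented.

    # Schematic illustration of querying many heterogeneous catalogues through
    # one shared set of fields, in the spirit of (but not implementing) Z39.50.
    SHARED_FIELDS = {"author", "title", "keyword"}

    class Catalogue:
        """Any database counts, as long as its records carry the shared fields."""
        def __init__(self, name, records):
            self.name = name
            self.records = records   # list of dicts keyed by the shared fields

        def search(self, field, value):
            if field not in SHARED_FIELDS:
                raise ValueError(f"{field!r} is not part of the shared standard")
            return [r for r in self.records if value.lower() in str(r.get(field, "")).lower()]

    def federated_search(catalogues, field, value):
        # One query, many sources: results come back "seamlessly" from each catalogue.
        return {c.name: c.search(field, value) for c in catalogues}

    library = Catalogue("library", [{"author": "Otlet, P.", "title": "Traité de documentation", "keyword": "documentation"}])
    archive = Catalogue("archive", [{"author": "Wells, H.G.", "title": "World Brain", "keyword": "encyclopedia"}])
    print(federated_search([library, archive], "author", "otlet"))

The design choice is that only the interface (the shared fields) is standardized; how each catalogue is built internally is left entirely open.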

There are utopian visions that when we get all the standards in place, there will be seamless access to the world’s store of information from any place on the planet. The World Brain project, for example, claims that it is developing: “… not a mere kitten brain, not an on-line library, but a true synthetic genius that will deliver us from war, poverty, and suffering” – compare also the utopian vision of the Principia Cybernetica. Gregory (1999) has called such efforts ‘incomplete utopian projects’. It is the incompleteness that we must turn to here: they always remain incomplete; they are always outnumbered, always outgunned by the forces against them. The World Brain, for example, echoes H.G. Wells’s long-forgotten project of the same name, announced just as optimistically some sixty years earlier: "An immense, an ever-increasing wealth of knowledge is scattered about the world today, a wealth of knowledge and suggestion that – systematically ordered and generally disseminated – would probably give this giant vision and direction and suffice to solve all the mighty difficulties of our age…" (Wells, 1938, pp. 66-67; cited in Rayward, 1999).

The reason for pointing to these persistent failures is that they reflect some key features of infrastructures. They show that infrastructural development and maintenance require work, a relatively stable technology and communication. The work side is frequently overlooked. Consider the claim in the 1920s that with the advent of microfiche, the end of the book was nigh – everyone would have their own personal libraries; we would no longer need to waste vast amounts of natural resources on producing paper; all the largest library would need would be a few rooms and a set of microfiche readers (Abbott 1988). A possible vision – and one that we should not discount just because it did not happen (MacKenzie (1990) reminds us that technological roads not taken can appear futile – anyone who has used a microfiche reader will attest that it is a most uncomfortable experience – whereas if the same resources had gone into the ‘failure’ as into the successful technology, it probably could have been otherwise). However, the microfiche dream, like the universal digital library, runs up against the problem that someone has to sit there and do the necessary photography/scanning; and this takes a huge amount of time and resources. It is easy enough to develop a potentially revolutionary technology; it is extremely hard to implement it.

Further, one needs a relatively stable technology. If one thinks of some of the great infrastructural technologies of the past (gas, electric, sewage and so forth) one can see that once the infrastructure is put into place it tends to have a long life. Electrical wiring from before World War II is still in use in many households; in major cities there is frequently no good map of sewage pipes, since their origin goes too far back in time. The Internet is only virtually stable – through the mediation of a set of relatively stable protocols (for this reason, Edwards (1998) suggests that we call it an internetwork technology rather than a Hughesian (Hughes, 1983) network technology such as the electricity network). However, there is nothing to guarantee the stability of vast datasets. At the turn of the twentieth century, Otlet developed a scheme for a universal library which would work by providing automatic electro-mechanical access to extremely well catalogued microfiches, which could link between each other (Rayward 1975). All the world’s knowledge would be put onto these fiches – his vision was a precursor to today’s hypertext. He made huge strides in developing this system (though he had the person power to achieve only a minuscule fraction of his goal). However, within forty years, with the development of computer memory, his whole enterprise was effectively doomed to languish, as it does today, in boxes in a basement – why retrieve information electromechanically using levers and gears when you can call it up at the speed of light from a computer? Much the same can be said of Bush’s never realized but inspirational vision of the Memex – an electromechanical precursor to the workstation (Bush et al. 1991). Large databases from the early days of the computer revolution are now completely lost. Who now reads punch cards – a technology whose first major use in this country was to deal with the massive datasets of the 1890 census, and which dominated information storage and handling for some seventy years? Closer to the present day, the ‘electronic medical record’ has been announced every few years since the 1960s – and yet it has not been globally attained. Changes in database architecture, storage capabilities of computers and ingrained organizational practices have rendered it a chimera. The development of stable standards, together with due attention to backwards compatibility, provides an in-principle fix to these problems. It can all unravel very easily, though. The bottom line is that no storage medium is permanent (CDs will not last anywhere near as long as books printed on acid-free paper) – so our emergent information infrastructure will require a continued maintenance effort to keep data accessible and usable as it passes from one storage medium to another and is analyzed by one generation of database technology after the next.

Finally, in order to really build an information infrastructure, you need to pay close attention to issues of communication. We can parse this partly as the problem of reliable metadata. Metadata (“data about data” – Michener et al. 1997) is the technical term for all the information that a single piece of data out there on the Internet can carry with it in order to provide sufficient context for another user to be able first to locate it and then to use it. The most widespread metadata standards are the Dublin Core, developed for library applications, and the Federal Geographic Data Committee (FGDC) standard, developed for Geographical Information Systems. If everyone can agree to standard names for certain kinds of data, then one can easily search for, say, authors over multiple databases. We have already seen this. Philosophically, however, metadata opens up into some far deeper problems. Take the example of biodiversity science. It is generally agreed that if we want to preserve a decent proportion of animal and floral life from the current great extinction event (one of six since the origin of life on this planet), then policymakers need the best possible information about current species size and distribution, as well as the ability to model the effect of potential and current environmental changes. In order to do this, you need to be able to bring together data from many different scientific disciplines using many different information technologies (from paper to supercomputer). Now imagine that I am measuring lake water acidity in the Midwest in order to see if there are any effects from recent acid rain. I might be lucky and come across a very large dataset going back eighty or a hundred years and giving lake water acidity levels at one-year intervals. However, this might well not be enough for me to actually use the data. It makes quite a difference if the measurement was taken immediately at the lake or later, when the samples were brought back to the laboratory – there will be different amounts of dissolved carbon dioxide in the sample. And as it happens, it makes a difference which technology of measurement I use – a new technology can produce a consistent jump in values for the same sample (see Bowker, 2000 for a detailed discussion of this and related cases of metadata problems). Now to make matters worse, I as a scientist am no expert in measuring lake water; I am an expert modeler, and I just want some data to plug into my model. But the point is that there is no such thing as pure data. You always have to know some context. And as you develop metadata standards you always need to think about how much information you need to give in order to make your information maximally useful over time. And here we circle back to the first difficulty with developing an information infrastructure – the more information that you provide in order to make the data useful to the widest community and over the longest time, the more work you have to do. Yet empirical studies have shown time and again that people will not see it as a good use of their time to preserve information about their data beyond what is necessary to guarantee its immediate usefulness – thus Fagot-Largeault (1989) describes the medical culture of producing quick and easy diagnoses on death certificates in order to meet the administrative need and free time to get on to the next (live) patient.
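
What ‘sufficient context’ might look like for the lake-water example can be sketched as a metadata record. In the illustration below the Dublin Core element names (title, creator, date, and so on) are real, but the measurement-context fields and all of the values are invented, simply to show why a bare column of acidity readings is not yet usable data.

    # An illustrative metadata record for the lake-water acidity example.
    # Dublin Core element names are real; the "context" fields and all values
    # are invented to show the kind of information a re-user would need.
    dataset_metadata = {
        "dublin_core": {
            "title": "Lake water pH, hypothetical Midwest survey",
            "creator": "Example Limnology Group",
            "date": "1920/2000",
            "format": "text/csv",
            "subject": "lake acidification",
        },
        "context": {
            "measurement_site": "in situ at lakeside",   # vs. later, back in the laboratory
            "instrument": "glass electrode pH meter",    # a change of instrument can shift values
            "instrument_changed_in": 1964,
            "sampling_interval": "annual",
        },
    }

    def usable_for_trend_analysis(metadata):
        # A modeler cannot simply "plug in" the numbers without this context.
        needed = {"measurement_site", "instrument", "sampling_interval"}
        return needed.issubset(metadata["context"])

    print(usable_for_trend_analysis(dataset_metadata))   # True only because the context was recorded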

So standards are necessary – from social protocols down to wiring size. Indeed, when you foreground the infrastructure of your lives, you will see that you encounter thousands of standards in a single day. However, as we have already seen, the development and maintenance of standards are complex ethical and philosophical problems – ethical, since decisions taken at the standards-development stage can have enormous implications for user communities; and philosophical, since standards frequently deal with very basic ways of splitting up the world (the base categories of Z39.50 sketch out an interestingly restricted ontology of ‘authorship’, for example). Further, standards implementation requires a vast amount of resources. Standards undergird our potential for action in the world, both political and scientific; they make the infrastructure possible.

Representation in a database can be a key means for a group or profession to gain recognition – yet there are frequently costs involved. For instance, a group of nurses in Iowa developed a Nursing Interventions Classification in order to carve out a place in hospital information systems for nursing work – however they recognized while doing this that the scheme put them in danger both of being stripped of some of their work (‘it doesn’t take a skilled professional to do this…’) and of losing some of the essence of their contribution in the scheme (it is hard to code for ‘humor’ for example) (Bowker and Star, 1999). Such cases have proliferated with the development of interlocking large scale information infrastructures[3].

The installed base of a particular infrastructure carries huge inertia. And yet infrastructures do change over time, sometimes transforming remarkably rapidly, sometimes apparently discontinuously (a country going from postal communication to cellular phones without passing through landlines, for example). This is made the more complicated by the ways in which the built environment (libraries, schools, offices, homes, traffic lights, airports, hospitals) is increasingly imbricated with information technologies of every sort (Taylor and van Every, 1993; 2000). These include sensors, databases, digital cameras, networked PCs, and all forms of display units. A deeper sense of this comes with the realization that these forms of information technology are increasingly integrated (they can share information via protocols and standards) and convergent (they are delivered through a single point, such as web pages embedded in a cellular telephone, where one's schedule for a connecting flight may be checked while on board another flight). In some cases, this may raise serious privacy and ethical issues, discussed in Chapter XX. An example would be the sale of a hospital to a pharmaceutical company, along with its records; the pharmaceutical company may use the list of patients and their illnesses for direct mail to niche markets based on, for example, a chronic condition[4].

ACCESS AND USABILITY ISSUES

One of the places where the relational aspects of infrastructure appear most strongly is in the barriers presented to the poor or those with no access to hardware or training; those with disabilities; and institutional racism and sexism that form barriers to usability. As aspects of the latter are covered in other chapters, we shall confine these remarks to resources and issues touching on infrastructure.

The sense that new networked media will create a two-class society -- the "information rich" and the "information poor" -- seems to be coming true. Access to training, hardware, and maintenance is in general much more difficult, if not impossible, for the poor. Attempts to remedy this on the part of community and non-profit organizations, NGOs, and governments have used a variety of means to lessen this phenomenon. As with development work, many of these efforts early on neglected local infrastructural considerations, such as the availability of telephones for dialup access, costs of maintenance and supplies, and security. A typical scenario would be that large companies would donate slightly outmoded computers to a "needy" classroom, not recognizing that these classrooms lacked telephone lines, and that the teachers, while dedicated, were so overloaded that they had no time to catch up on their own computer skills. More recent work has begun to take these considerations into account. An important part of this is working locally, cooperatively designing solutions that work in situ[5].

A cognate community organizing effort has come through the freenet and community networking social movements. Freenets are locally based, usually voluntary or public-library based efforts to provide at least basic internet access on a democratic basis[6]. One way to view these efforts is as a direct organizing effort to address gaps in current local infrastructure. People living in low-income communities have a keen sense of the failures of imposed solutions that do not take account of their circumstances or their lack of infrastructural resources (nor of their often rich resources that are not formally taken into account in requirements analysis, such as support from churches, existing reform movements, or the social networks through which information travels (Bishop, 2000))[7].

There has been much hope expressed that in the developing world, the new information infrastructure will provide the potential for a narrowing of the knowledge gaps between countries. Thus an effective global digital library would allow third world researchers access to the latest journals. Distributed computing environments (such as the GRID[8]) would permit supercomputer-grade access to computing for scientists throughout the world. The example of the use of cell-phone technology to leapfrog over the absence of landlines in some countries has opened the possibility of great leaps being made into the information future. As powerful as these visions are, they need to be tempered with some real concerns. The first is that an information infrastructure like the Internet functions like a Greek democracy of old – everyone who has access may be an equal citizen, but those without access are left further and further out of the picture. Further, as noted above with respect to the school environment, access is never really equal – the fastest connections and computers (needed for running the latest software) tend to be concentrated in the first world. Thirdly, governments in the developing world have indicated real doubts about the usefulness of opening their data resources out onto the Internet. Just as in the nineteenth century the laissez-faire economics of free trade was advocated by the developed countries with most to gain (because they had organizations in place ready to take advantage of emerging possibilities), so in our age the greatest advocates of the free and open exchange of information are developed countries with robust computing infrastructures. Some in developing countries see this as a second wave of colonialism – the first pillaged material resources and the second will pillage information. All of these concerns can be met through the development of careful information policies. Works such as Mansell and Wehn’s (1998) both point to the extent of work that needs to be done and open up the very real possibilities that are emerging.

Much has been written on the social barriers for all women, and for men of color, in feeling comfortable in computing environments, including the Web. Since before the Web, feminists have analyzed the "chilly climate" for girls and women learning computing, as well as the glass ceiling in the information technology industries. Holeton (1998) has several excellent chapters devoted to gender issues; the series Women, Work and Computerization, hosted by IFIP Special Interest Group 8.1 and issued every four years for more than a decade, provides critiques of identity, design, employment, and epistemology in the world of gender and computing (see for example Eriksson, Kitchenham, and Tijdens, 1991). The infrastructural aspects of this stretch in many directions, beginning with the early mathematics and physics (traditionally male and white dominated fields) orientation of much of computer science, forming a complex a priori barrier to advancement for women and for men of color. Other issues have included sexism and racism in the content of web sites and online conversations[9].

The Internet has proven a good resource for community organizing around issues of ecological racism, and for organizing indigenous peoples[10] (who are often scattered and unable to meet face-to-face on a regular basis). Within this movement, there are attempts to put dying or very remote indigenous languages (such as that of the Sámi people in Lapland) online, so that they may be remembered and spread. These are good examples of social movements and ethnic groups taking over the infrastructure for their own purposes[11]. An overlapping, but distinct, set of issues can be found in the infrastructural barriers faced by people with disabilities. Blind people now have available (if they can afford it, or if a social service agency can provide it) Braille terminals (which translate text to a Braille keyboard); fast text readers with optical character recognition that transform text into speech; and, on some web sites, alternative "text only" pages with short descriptions of the graphics involved. It is, however, an ongoing difficulty to convince web page designers to make this option consistently available. For those unable to move or to work a keyboard or voice recognition system, a number of augmenting devices have been developed over the years. These include mice that can be operated by tongues, eye movements, toes, or minor head movements. The impact of rapidly improving voice recognition systems on these communities is yet to be fully realized, especially for those living on restricted incomes, as is often true of disabled people. As well, the basic communication processes carried by email and the web make self-employment and new kinds of networks and social participation available to many of these people[12].

CONCLUSION: DESIGN IMPLICATIONS

In 1994, Stewart Brand wrote a wonderful book entitled How Buildings Learn: What Happens After They're Built. He pointed out that we tend to view the architect as the designer of a given building, and as a result we overlook the massive modifications that any building undergoes in the course of its life. Much the same point can be made about designing infrastructures. The work of design is in many ways secondary to the work of modification. A good information infrastructure is one that is stable enough to allow information to persist in time (word processor programs, for this reason, are generally ‘backwards compatible’ – meaning that the newer version of a program can open files created with an earlier version). However, it should also be modifiable – at the individual level, in terms of ‘tailorability’ (allowing users to modify it for their own purposes – see Nardi, 1993), and at the social level, in terms of being able to respond to emergent social needs (web standards have developed quickly in response to needs for image and video manipulation on the web).
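
The kind of backward compatibility that word processors provide can be sketched abstractly. The reader below works on an invented, toy file format; it illustrates only the design principle that a newer version of a program must still open files created by its predecessors, with newly added fields falling back to sensible defaults.

    # Toy sketch of backward compatibility: version 2 of a "document format"
    # adds a field, but the version 2 reader still opens version 1 files.
    # The format itself is invented for illustration.
    def read_document(raw):
        version, _, body = raw.partition("|")
        if version == "v1":
            return {"text": body, "styles": "plain"}          # old files get a default for the new field
        if version == "v2":
            styles, _, text = body.partition("|")
            return {"text": text, "styles": styles}
        raise ValueError(f"unknown format version {version!r}")

    old_file = "v1|Infrastructure is relational."
    new_file = "v2|bold-headings|Infrastructure is relational."
    assert read_document(old_file)["text"] == read_document(new_file)["text"]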

Designing for flexibility is not an easy task. In general, the required flexibility is emergent. It is clear that you do not need a single great architect for an infrastructure. Eric Raymond’s classic, though flawed, article ‘The Cathedral and the Bazaar’ (Raymond, 1999) says that unlike in the old days when a set of master blueprints was needed to build a cathedral, now we can see very large scale infrastructures (he is talking of the Linux operating system) being designed in more of an evolutionary, distributed process. The flaw we are referring to is that his history is wrong – cathedrals themselves often did not have master planners (Turnbull, 1993): they were built in just such an ad hoc fashion. Raymond does, however, make the very strong point that complex, interoperating systems (and, we would add, cathedrals) can be built in a distributed fashion, providing you have a good set of protocols in place for receiving and vetting changes. These social arrangements (the protocols) are best developed and maintained by standards bodies containing representatives of all major stakeholders.

We have argued throughout this chapter that there are significant ethical and political concerns in the design of infrastructures. The Scandinavian school of ‘participatory design’ has successfully responded to this challenge. It developed in part in response to union demands for input into the design of information technologies that would be implanted in the workplace. A key feature of its design process has been the use of ethnographers of work practice to analyze the ways in which work is carried out, in order to anticipate and design for social and cultural effects as well as to produce a technically efficient design (see Bowker et al., 1997, Introduction). This design process has been adopted by some major ‘infrastructure’ companies such as Intel, Hewlett-Packard and Apple. This movement suggests a complementary way of reading the design implications of our analysis of infrastructure in this chapter: the most important thing is for the user of the infrastructure to first become aware of the social and political work that the infrastructure is doing and then seek ways to modify it (locally or globally) as need be. Infrastructures subtend complex ecologies: their design process should always be tentative, flexible and open.

References:

Abbate, Janet. 1999. Inventing the Internet. Cambridge, Mass: MIT Press.

Abbate, J., and B. Kahin, eds. 1995. Standards Policy for Information Infrastructure. Cambridge, MA: MIT Press.

Abbott, Andrew. 1988. The System of Professions: An Essay on the Division of Expert Labor. Chicago: University of Chicago Press.

Baym, Nancy K. 2000. Tune In, Log On: Soaps, Fandom, and Online Community. Thousand Oaks, Calif.: Sage Publications.

Becker, Howard S. 1982. Art Worlds. Berkeley, CA: University of California Press.

Berg, Marc, 1998. "Medical Work and the Computer-Based Patient Record: A Sociological Perspective". Methods of Information in Medicine: 37: 294-301.

Bowker, Geoffrey C. 1994a. “Information Mythology: the world of/as information.” Pp. 231-247 in Information Acumen: the Understanding and Use of Knowledge in Modern Business, edited by Lisa Bud-Frierman. London: Routledge.

Bowker, Geoffrey C. 1994b. Science on the Run: Information Management and Industrial Geophysics at Schlumberger, 1920-1940. Cambridge, MA: MIT Press.

Bowker, Geoffrey C. forthcoming 2000. ‘Mapping Biodiversity’, International Journal of GIS.

Bowker, Geoffrey C. and Susan Leigh Star. 1999. Sorting Things Out: Classification and its Consequences. Cambridge, MA: MIT Press.

Bowker, Geoffrey, Susan Leigh Star, William Turner and Les Gasser, eds. 1997. Social Science, Technical Systems, and Cooperative Work : Beyond the Great Divide. Mahwah, NJ : Lawrence Erlbaum Associates.

Brand, Stewart. 1994. How Buildings Learn: What Happens after they're Built. New York: Viking Press.

Bucciarelli, L. L. 1994. Designing engineers. Cambridge, MA: MIT Press.

Bud-Frierman, Lisa (ed.). 1994. Information Acumen: The Understanding and Use of Knowledge in Modern Business. London: Routledge.

Bush, Vannevar, James M. Nyce, and Paul Kahn. 1991. From Memex to hypertext : Vannevar Bush and the mind's machine. Boston: Academic Press.

Callon, Michel. 1992. “The Dynamics of Techno-economic Networks.” Pp. 72-102 in Technological change and company strategies : economic and sociological perspectives, edited by Rod Coombs, Paolo Saviotti and Vivien Walsh. London: Harcourt Brace Jovanovich.

Chrisman, Nicholas R. 1997. Exploring Geographic Information Systems. New York : J. Wiley & Sons.

Clarke, Adele. Disciplining Reproduction : Modernity, American Life Sciences, and "The Problems Of Sex." Berkeley: University of California Press.

Clarke, Adele E. and Joan H. Fujimura (eds). 1992. The Right Tools For The Job: At Work in Twentieth-Century Life Sciences. Princeton, NJ: Princeton University Press.

Collins, Harry M. and Martin Kusch. 1998. The Shape of Actions: What Humans and Machines Can Do. Cambridge, MA: MIT Press.

David, Paul A. 1987. "Some New Standards for the Economics of Standardization in the Information Age." In Economic Policy and Technological Performance, edited by Partha Dasgupta and Paul Stoneman. Cambridge: Cambridge University Press.

David, Paul A. 1995. "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox." The American Economic Review 80 (May 1990): 355. See also Thomas K. Landauer, The Trouble with Computers: Usefulness, Usability, and Productivity. Cambridge, Mass.: MIT Press.

David, Paul A. 1986. "Understanding the Economics of QWERTY: The Necessity of History." In Economic History and the Modern Economist, edited by W.N. Parker. Oxford: Basil Blackwell.

David, Paul A. and Shane Greenstein. 1990. "The Economics of Compatibility Standards: An Introduction to Recent Research." Economics of Innovation and New Technology 1: 3-41.

David, Paul A. and Geoffrey S. Rothwell. 1994. Standardization, Diversity and Learning: Strategies for the Coevolution of Technology and Industrial Capacity. Stanford, CA: Center for Economic Policy Research, Stanford University.

Davis, Susan G. 1997. Spectacular Nature: Corporate Culture and the Sea World Experience. Berkeley: University of California Press.

Desrosières, Alain, and Laurent Thévenot. 1988. Les Catégories Socio-professionnelles. Paris: Découverte.

Edwards, Paul N. 1998. Y2K: Millennial Reflections on Computers as Infrastructure. History and Technology. 15, pp.7-29.

Eriksson, I., B. Kitchenham, and K. Tijdens, eds. 1991. Women, Work and Computerization: Understanding and Overcoming Bias in Work and Education. Amsterdam: North Holland.

European Commission Report COSTA4. 1998. Hetland, Per and Hans-Peter Meyer-Dallach, eds. Domesticating the World Wide Webs of Information and Communication Technology. ISBN 92-828-3361-5. Volume 7. of Making the Global Village Local? Series. Brussels: European Commission.

Fagot-Largeault, Anne. 1989. Causes de la Mort: Histoire Naturelle et Facteurs de Risque. Paris: Librairie Philosophique J. Vrin.

Fischer, Claude S. 1992. America Calling: A Social History of the Telephone to 1940. Berkeley: University of California Press.

Friedlander, Amy. 1995. Emerging Infrastructure: The Growth of Railroads. Reston, VA: Corporation for National Research Initiatives.

Graham, Stephen and Simon Marvin. 1996. Telecommunications and the City. London: Routledge.

Greenbaum, Joan and Morten Kyng. 1991. Design at Work: Cooperative Design of Computer Systems. Hillsdale, NJ: Lawrence Erlbaum Associates.

Grindley, Peter. 1995. Standards, strategy, and policy : cases and stories. Oxford England ; New York: Oxford University Press.

Henderson, Kathryn. 1999. On Line And On Paper : Visual Representations, Visual Culture, And Computer Graphics In Design Engineering. Cambridge, MA: MIT Press.

Hughes, Thomas P. 1983. Networks of Power: Electrification in Western Society, 1880-1930. Baltimore: Johns Hopkins University Press.

Hurley, Deborah, and James Keller. 1999. The first 100 feet : options for Internet and Broadband access. Cambridge, Mass: MIT Press.

Holeton, Richard, ed. 1998. Composing Cyberspace: Identity, Community, and Knowledge in the Electronic Age. Boston: McGraw-Hill.

Jewett, Tom and Rob Kling. 1991. "The Dynamics of Computerization in a Social Science Research Team: A Case Study of Infrastructure, Strategies, and Skills." Social Science Computer Review 9: 246-275.

Kahin, Brian, and James Keller. 1997. Coordinating the Internet. Cambridge, MA: MIT Press.

Kling, Rob, Spencer Olin, and Mark Poster, eds. 1991. Postsuburban California: The Transformation of Orange County since World War II. Berkeley: University of California Press.

Kolko, Beth, Lisa Nakamura, and Gilbert Rodman, eds. 2000. Race in Cyberspace. NY: Routledge.

Latour, Bruno. 1996. Aramis or the Love of Technology. Cambridge, MA: Harvard University Press.

Latour, Bruno and Emilie Hermant. 1999. Paris: Ville Invisible. Paris: La Découverte / Les Empêcheurs de penser en rond.

Library Trends, 1998. Special Issue on “How Classifications Work: Problems and Challenges in an Electronic Age.” Fall 47(2), 185-340.

Lucena, Juan. 1995. In Handbook of Science and Technology Studies, edited by Sheila Jasanoff, Gerald E. Markle, James C. Petersen, and Trevor Pinch. Thousand Oaks, CA: Sage.

MacKenzie, Donald A. 1990. Inventing Accuracy: An Historical Sociology of Nuclear Missile Guidance. Cambridge, MA: MIT Press.

Michener, William K., James W. Brunt, John J. Helly, Thomas B. Kirchner, and Susan G. Stafford. 1997. “Nongeospatial Metadata for the Ecological Sciences.” Ecological Applications 7: 330-342.

Namioka, A. and D. Schuler. 1993. Participatory Design: Principles and Practices. Hillsdale, NJ: Lawrence Erlbaum Associates.

Orlikowski, Wanda, JoAnne Yates, Kazuo Okamura, and Masayo Fujimoto. 1993. “Shaping electronic communication: the structuring and metastructuring of technology in use.” : 43.

Mansell, Robin, Uta Wehn, and United Nations Commission on Science and Technology for Development. 1998. Knowledge Societies: Information Technology for Sustainable Development. Oxford; New York: Published for and on behalf of the United Nations by Oxford University Press.

Poster, Mark. 1995. Cyberdemocracy.

Monmonier, Mark. 1995. Drawing the Line: Tales of Maps and Cartocontroversy. NY: Henry Holt.

Nardi, Bonnie A. 1993. A Small Matter of Programming: Perspectives on End User Computing. Cambridge, MA: MIT Press.

Raymond, Eric S. 1999. The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. Beijing; Cambridge, MA: O'Reilly.

Rayward, W. Boyd. 1975. The Universe of Information: The Work of Paul Otlet for Documentation and International Organisation. Moscow: Published for International Federation for Documentation (FID) by All-Union Institute for Scientific and Technical Information (VINITI).

Rayward, W. Boyd. 1999. "H. G. Wells's Idea of a World Brain: A Critical Re-Assessment." Journal of the American Society for Information Science (JASIS).

Rosenberg, Nathan. 1982. Inside the Black Box: Technology and Economics. Cambridge, UK: Cambridge University Press.

Schuler, Douglas. 1996. New community networks: Wired for change. New York: ACM Press.

Stefik, Mark. 1996. Internet Dreams: Archetypes, Myths, and Metaphors. Cambridge, MA: MIT Press.

Suchman, Lucy and Michael Lynch, eds. 1990. Representation in Scientific Practice. Cambridge, MA: MIT Press.

Star, Susan Leigh, and Karen Ruhleder. 1996. "Steps toward an Ecology of Infrastructure: Design and Access for Large Information Spaces." Information Systems Research 7: 111-134.

Summerton, Jane, ed. 1994. Changing large technical systems. Boulder, CO: Westview Press.

Tanenbaum, Andrew S. 1996. Computer networks. Upper Saddle River, N.J.: Prentice Hall PTR.

Taylor, James R., and Elizabeth J. Van Every. 2000. The Emergent Organization: Communication As Its Site and Surface. Mahwah, NJ: Lawrence Erlbaum Associates.

Taylor, James R., and Elizabeth J. Van Every. 1993. The Vulnerable Fortress: Bureaucratic Organization and Management in the Information Age. Toronto: University of Toronto Press.

Turnbull, David. 1993. Maps are Territories: Science Is An Atlas: A Portfolio Of Exhibits. Chicago: University of Chicago Press.

Turnbull, David. 1993. "The Ad Hoc Collective Work of Building Gothic Cathedrals with Templates, String, and Geometry." Science, Technology & Human Values 18: 315-343.

Wells, H. G. 1938. World Brain. London: Methuen.

Winner, Langdon. 1986. "Do Artifacts Have Politics?" In The Social Shaping of Technology: How the Refrigerator Got its Hum, edited by Judy Wajcman and Donald MacKenzie. Milton Keynes: Open University Press.

Woolgar, S. 1991. "The Turn to Technology in Social Studies of Science." Science, Technology & Human Values, 16:20-50.

Yates, JoAnne. 1989. Control through Communication: The Rise of System in American Management. Baltimore: Johns Hopkins University Press.

-----------------------

[1] In addition, several detailed studies of material culture of scientific work began to appear, many of which began to pick up aspects of infrastructure (see e.g. Clarke, 1998).

[2] See Callon (1992) for a fine general discussion of the development of techno-economic networks; compare Grindley (1995) and Abbate and Kahin (1995).

[3] See Suchman and Lynch (1990) for a good overview.

[4] Graham and Marvin (1996) provide a comprehensive set of richly theoretical readings on the relationship between the built urban environment and information/telecommunications infrastructure, ranging from the home and surveillance to city maps, teleshopping, manufacturing, and the city itself as a node in an informatic network. From the socio-informatic-regional perspective, Kling, Olin and Poster (1991) do a detailed analysis of the information economy of Orange County, California. Star (1999) suggests some methodological challenges and approaches for the "ethnography of infrastructure," beginning with the built environment and its classifications and standards.

[5] Much good work along these lines can be found at the Participatory Design Conferences (PDC) held in tandem with the Computer-Supported Cooperative Work (CSCW) annual conferences. More information can be found at the site for Computer Professionals for Social Responsibility (CPSR): . This is also a good site for reviewing privacy and free speech issues. See also the foundational textbook by Greenbaum and Kyng (1991), the collection by Namioka and Schuler (1993), and the European Commission report on globalizing and localizing information and communication technology (1998).

[6] A good overview can be found in Schuler (1996). A listing of specific freenets can be found at . The listing is fairly international, though largely confined to the global North and Australia. Bytes for All is an e-zine (electronic magazine) devoted to universal access to IT, and can be found at: . A specific example of federated projects in the city of Chicago gives links to several neighborhood organizations and their emerging community-driven infrastructures: .

[7] Schön, Sanyal, and Mitchell (1999) present a variety of compelling chapters about community organizing efforts, public policy, and social empowerment, including many detailed case studies and theoretical foundations.

[8] The term refers to a "grid" of high-performance research networks. The web-based grid portal helps computer scientists, scientists and engineers by simplifying and consolidating access to advanced computing systems. It is one of a number of initiatives allowing distributed scientific collaboration.

[9] See, for example, Kalí Tal on "The Unbearable Whiteness of Being: African American Critical Theory and Cyberculture," ; Lisa Nakamura on "Race In/For Cyberspace," , which addresses racism in MUDs and MOOs; and the University of Iowa Communication Department's resource page on race and gender in cyberspace: . Kolko, Nakamura and Rodman (2000) review some interesting and wide-ranging cases.

[10] A good link to indigenous peoples' internet resources (mostly US, Canada, and Australia) can be found at: .

[11] See Summerton (1994) for a good overview of case studies of social movements aimed at changing large-scale infrastructure.

[12] A comprehensive home page sponsored by WGBH, US public television, gives advice and links for access for people with several types of disabilities (and some international links): .
