


Samuel E. Trosow

Qualifying Examination

June 6, 1999

1. Is technology neutral? Discuss at least 4 different perspectives from social theorists on this question, and explain the implications of these perspectives on policy discourse and decision-making.

Table of Contents

I. Introduction

II. Bell’s Post-Industrial Society

III. Marxist Theory and the Frankfurt School

IV. Critical Technology Studies

V. Neil Postman’s Technopoly

VI. Implications for Policy Discourse and Decision Making

Conclusion

References

I. Introduction

This essay will discuss various perspectives on the neutrality of technology. Andrew Feenberg (1991) divides theories of technology into two major categories, the instrumental and the substantive. The instrumental theory, which is the predominant viewpoint, sees technology as a neutral tool ready to serve the purposes of its users. Under this view, technology is devoid of intrinsic valuative content and can be used for whatever ends the user desires. Feenberg notes that under this theory the typical response is unreserved commitment to the employment of a particular technology if it suits an instrumental purpose; if one wishes to make exceptions on moral grounds, the price is reduced efficiency (1991: 6). In contrast, the substantive theory of technology, which is based on the works of Martin Heidegger (1977) and Jacques Ellul (1964), infuses substantive value into technology itself.

Feenberg (1992: 8) describes this substantive impact by noting that technology is not simply a means but has become an environment and a way of life.

Daniel Bell’s theory of post-industrial society as information society will be used as an exemplar of the instrumental theory of technology, as it has had significant influence on contemporary mainstream thought. Associated with this viewpoint are a variety of other writers who take an optimistic if not celebratory stance towards technology and who maintain its value-neutrality. The second perspective, Marxism/neo-Marxism and critical theory, stands in sharp distinction to the first group and strongly contests the claim that technology is a neutral force. The third perspective has been called Critical Technology Studies and includes the works of Lewis Mumford, Jacques Ellul and Langdon Winner. As a fourth perspective, a work by Neil Postman will be considered. The discussion of these theorists will be followed by a discussion of Privacy Enhancing Technologies (PETs) as an example of the implications of these perspectives on policy discourse and decision-making.

Before turning to these different theorists, the definitions of technology and of the neutrality of technology need to be considered. One explanation for the vast divergence of opinion on this issue is the definitional problem: just as opinions differ on the underlying question of the neutrality of technology, so too they differ on the definition of technology. At the risk of oversimplification, these differences may be thought of as extremes on a continuum. At the narrowest end, technology is simply the physical tool, implement or machine. At its broadest, technology is a complex system consisting not only of the physical components but also of their relationships with each other, with those who use them, and with the processes in which they are embedded.

Rejecting the narrow “implement” view of technology, Norman Balabanian (1993) proposes five dimensions to technology. First is the physical dimension of hardware (tools, instruments and machines), structures (buildings, roads, networks) and materials. The second dimension is the “know-how” consisting of procedures, processes and methods. The third consists of the personnel who have been provided with the procedural “know-how” necessary to manipulate the objects of the physical dimension. The fourth dimension is more complex and is often overlooked in simplistic definitions of technology. An organizational structure is a mechanism of management and control that involves a series of relationships and linkages between physical objects, processual know-how and personnel. The fifth dimension is political and economic power. Balabanian argues that while it is implicit in the organizational dimension, it is important to explicitly acknowledge it as a component of technology.

Langdon Winner also takes a broad approach to the definition of technology and distinguishes between technology and apparatus. The latter is “…the class of objects we normally refer to as technological -- tools, instruments, machines, appliances, weapons, gadgets -- which are used in accomplishing a wide variety of tasks” (1979: 11). He also notes that “technology” applies to some forms of social organization such as factories, workshops, armies and bureaucracies. James Beniger (1986: 9) defines technology as “roughly equivalent to that which can be done, excluding only those capabilities that occur naturally in living systems.”

It seems intuitive that as one proceeds through the progressive layers of these dimensions, making the definition of technology more complex, the assertion of the “neutrality of technology” becomes less plausible. But what does the “neutrality of technology” mean? Balabanian (1993: 23) identifies various aspects of “neutral technology.” Primarily, it is a passive tool in which no values are embedded, and it is apolitical in that it is not concerned with relations of power or domination. Neutral technology exists as an inanimate object waiting to be used, for better or worse, by a human being; if it is used harmfully, the fault lies with the human operator.

It should be noted that though conceptually distinct, the notions of neutral technology and autonomous technology are closely related and both indicative of the instrumental as opposed to substantive theory of technology.

II. Bell’s Post-Industrial Society

One of the key aspects of Daniel Bell’s post-industrial theory is the central role of technology as a social determinant. Bell’s (1973) post-industrial thesis is amplified and updated in his later essay on the information society (Bell, 1980), in which he explicitly links post-industrialism to the information society. Bell identifies three general elements of the post-industrial/information society. The first is the change from a goods-producing to a service economy. The second is the centrality of the codification of theoretical knowledge as a driving force in society. Third, he points to the creation of “intellectual technology” as the key tool of production, which he analogizes to machine technology in industrial society. By “intellectual technology,” Bell refers to methods that seek to substitute an algorithm, or decision rules, for intuitive judgments. These algorithms represent a “formalization” of judgments and their routine application to varied situations. Bell says that to the extent that intellectual technology is becoming predominant in the management of organizations and enterprises, it is as central a feature of post-industrial society as machine technology was in industrial society (1980: 504-505).

While all three elements are inter-related, Bell identifies the second point as “the axial principle” of the post-industrial society, noting that when theoretical knowledge is “codified” it becomes the “director of social change” (1980: 501). This is consistent with his earlier formulation (1973: 343-344) that in identifying a new and emerging social system, one must look for a defining characteristic:

“What has now become decisive for society is the new centrality of theoretical knowledge, the primacy of theory over empiricism, and the codification of knowledge into abstract systems of symbols that can be translated into many different and varied circumstances… If the dominant figures of the past hundred years have been the entrepreneur, the businessman, and the industrial executive, the “new men” are the scientists, the mathematicians, the economists, and the engineers of the new intellectual technology.”

Bell identifies rational social ordering as the goal of the new intellectual technology:

“The goal of the new intellectual technology is, neither more nor less, to realize a social alchemist’s dream: the dream of “ordering” the mass society. In this society today, millions of persons daily make billions of decisions about what to buy, how many children to have, whom to vote for, what job to take, and the like… If the computer is the tool, then decision theory is its master. Just as Pascal sought to play dice with God, and the physiocrats attempted to draw an economic grid that would array all exchanges among men, so the decision theorists seek their own tableau entier -- the compass of rationality, the “best” solution to the choices perplexing men” (1973: 33).

This is one of the clearest indications of Bell’s adherence to the instrumentalist school. Postman (1993) would call it “deification of technology” while Marcuse (1964) would call it “reification of technology.”

Several critics of post-industrial theory have emphasized the problematic nature of the instrumental view of technology. Jennifer Slack (1984: 146) argues that it is vital to abandon ideas of the neutrality of technology; instead, she sees technologies as both “causes and effects that are integrally related to the environment.” Carolyn Marvin (1988: 4) notes that “instrumental-centered” approaches to technology overlook how technologies are a “series of arenas for negotiating issues crucial to the conduct of social life.”

III. Marxist Theory and the Frankfurt School

Karl Marx’s conception of technology runs directly counter to the notion of neutral and value-free technology. For Marx, purposeful production is the basic activity of man and the manner in which production is organized is a question of social relations. The relationship between human agency and production technologies is explicitly stated in Marx’s Grundrisse (1973: 706):

“Nature builds no machines, no locomotives, railways, electric telegraphs, self-acting mules, etc. These are products of human industry; natural material transformed into organs of the human will over nature, or of human participation in nature. They are organs of the human brain, created by the human hand; the power of knowledge objectified.”

The tradition of critical Marxism carried on by theorists associated with the Frankfurt School continues the strong critique of the role of technology in advanced industrial society. In One Dimensional Man, Herbert Marcuse (1964: xv) argues that the technical apparatus of production and distribution has become “totalitarian to the extent to which it determines not only the socially needed occupations, skills and attitudes but also individual needs and aspirations.” Technology thus serves to institute more effective and complete forms of social control. As a result of these totalitarian features, Marcuse asserts that:

“the traditional notion of the ‘neutrality’ of technology can no longer be maintained. Technology as such cannot be isolated from the use to which it is put; the technological society is a system of domination which operates already in the concept and construction of techniques” (Id, xvi).

Marcuse stresses the point that while advanced industrial society is a technological universe, it is at the same time a political universe. He cites Marx to directly confront the notion of neutral technology:

“One may still insist that the machinery of the technological universe is “as such” indifferent towards political ends -- it can revolutionize or retard a society. An electronic computer can serve equally a capitalist or socialist administration; a cyclotron can be an equally efficient tool for a war party or a peace party. This neutrality is contested in Marx’s controversial statement that the “hand-mill gives you society with the feudal lord; the steam-mill society with the industrial capitalist.” And this statement is further modified in Marxian theory itself: the social mode of production, not technics is the basic historical factor. However, when technics becomes the universal form of material production, it circumscribes an entire culture; it projects a historical totality -- a ‘world’” (p. 154).

This suggests that technology, when it reaches a certain level of development in a society, becomes effectively able to cloak the actual interests for which it acts. Indeed, in a subsequent passage, Marcuse (1964:168-169) says:

“The universal effectiveness and productivity of the apparatus under which [man and nature] are subsumed veil the particular interests that organize the apparatus. In other words, technology has become the great vehicle of reification -- reification in its most mature and effective form. The social position of the individual and his relation to others appear not only to be determined by objective qualities and laws, but these qualities and laws seem to lose their mysterious and uncontrollable character; they appear as calculable manifestations of (scientific) rationality.”

More recently, Marxian-informed theorists such as Harry Braverman (1974), Richard Edwards (1979) and David Noble (1984) have continued to show how technology is consciously utilized to drive down labor costs, deskill work and transfer authority from the worker to management.

IV. Critical Technology Studies

David Hess (1997) groups a number of social theorists concerned with technology under the category of Critical Technology Studies. He includes here such influential theorists as Jacques Ellul (The Technological Society, 1964) and Lewis Mumford (Technics and Civilization, 1964). Ellul wrote that technology (technique) was operating under its own inner logic and that mankind was submitting to its imperatives in a suicidal manner. Mumford distinguished between large, system-centered technologies, which are unstable, and weaker yet resourceful human-centered ones. Hess also includes Langdon Winner in this group. Winner (1977) noted that while in the past technology had seldom been seen as a primary subject matter for political or social inquiry, this was beginning to change. This previous neglect of technology was turning into what Winner called a “virtual obsession,” the interest being fueled by rapid developments in the technical sphere that outpace the ability of individuals and social systems to adapt.

Winner directly confronts the notion of “technics out of control,” a major concern expressed by Ellul and others. But in contrast to Ellul’s sense of resignation and pessimism, Winner calls for a sustained examination and critique of the uncritical embrace of new technologies. In particular, he challenges the notion that technological design choices are not politically driven. In subsequent works (1980, 1986) he asks the question: do artifacts have politics? As an example, he points to the bridges over the parkways on Long Island that were designed so that they would clear the curbs below at about nine feet, thereby allowing access for private cars but not for city buses. Winner contends this deliberate design decision was meant to exclude inner-city residents from reaching suburban beaches by public transit, a result which would not have been feasible through a direct measure ordering the segregation of beach facilities. Further examples of the politics of artifacts regarding the issue of personal privacy in the networked environment will be considered in the final section.

V. Neil Postman’s Technopoly

While difficult to place within any particular school of thought, Neil Postman’s (1993) critique of technological society is also worthy of mention. He defines technopoly as a state of mind and a state of culture consisting of the “deification of technology” in which:

“…culture seeks its authorizations in technology, finds its satisfactions in technology, and takes its orders from technology. This requires the development of a new kind of social order, and of necessity leads to the rapid dissolution of much that is associated with traditional beliefs” (1993: 71).

Postman’s analysis is based on a tripartite classification of cultures: tool-using, technocracy and technopoly. In tool-using cultures, which were formerly predominant but are now disappearing, tools did one of two things. First, they were devised to solve particular problems of physical life. Second, they served the symbolic world of art, myth or ritual. What Postman emphasizes about tool-using culture is that social customs and spiritual ideas acted as guiding forces and the tools were always consistent with the culture into which they were introduced. Social customs determined the development of tools, so technology was explicitly not an autonomous or independent force.

In the stage of technocracy, tools play a central role. Because it is possible for tools to attack and even supersede the culture, religion, tradition and social norms have to “fight for their lives” (p. 28). Postman gives the example of the telescope as a technology that attacked the fundamental propositions of prevailing theology. The printing press and the mechanical clock are other technologies often credited with this function. In the stage of technocracy, however, tradition and technology co-existed as opposing world-views in an uncomfortable tension (p. 48). In Postman’s technopoly (which he says has only arisen in the United States), the thought-world of the traditional disappears. Postman credits Frederick Taylor’s system of scientific management as the scaffolding upon which present-day technopoly is built because it contained:

“…the first explicit and formal outline of the assumptions of the thought-world of Technopoly. These include the beliefs that the primary, if not the only, goal of human labor and thought is efficiency; that technical calculation is in all respects superior to human judgment; …that subjectivity is an obstacle to clear thinking; that what cannot be measured either does not exist or is of no value; and that the affairs of citizens are best guided and conducted by experts” (p. 51).

In contrasting technocracy with technopoly, Postman maintains that “[t]echnocracy does not have as its aim a grand reductionism in which human life must find its meaning in machinery and technique. Technopoly does” (p. 52).

While Postman’s critique of technopoly has much in common with the work of Marxist, feminist and other critical theorists, he is primarily reflecting on the loss of spiritual meaning, and his general world outlook is decidedly idealist. He explicitly identifies the “erosion of symbols” and the “loss of narrative” as the “most debilitating consequences of Technopoly” (p. 171). While he identifies commercial interests as being at the core of this loss of meaning, he also views religion as the world’s greatest narrative and would include it in the school curriculum as a measure of resistance.

While Postman identifies the historical influence of Taylorism on the development of technopoly, he fails to directly confront post-industrial theory. Such an explicit linking of Taylor’s scientific management with Bell’s post-industrialism is made by Robins and Webster (1987: 105), who “see the postindustrialists’ information society as a direct continuation of Taylorism/Fordism -- as Social Taylorism.”

While Postman’s Technopoly does not explicitly treat the neutrality of technology as such, the work is nonetheless important on this issue. In his strong categorization of technology as an autonomous force, Postman is implicitly arguing its neutrality as well. But this is only at a descriptive level. In a normative sense, Postman is explicit that the loss of meaning endemic in technopoly can and should be reversed.

In many ways, Postman’s analysis is similar to Langdon Winner’s (1977) notion that as a civilization, we are losing mastery over the tools we have created.

Postman’s work is also suggestive of Jacques Ellul’s (1964) The Technological Society in that Ellul is similarly concerned with the erosion of moral values in a society overrun by technical systems. Also like Ellul, Postman presents a description of a society negatively impacted by an autonomous technology that operates under its own inner logic.

VI. Implications for Policy Discourse and Decision Making

This section considers the implications of the neutrality of technology debate for contemporary policy discourse and decision making. First, it is important to emphasize that the question of the neutrality of technology is part of the broader question of neutrality that permeates social analysis at various levels. In various disciplines, the neutrality debates take different forms and consider different phenomena. For example, in political science, theorists argue about the role of the state. Pluralists view the state as a neutral arena for the negotiation of different interests. Critics of pluralism, on the other hand, emphasize an asymmetry of power relationships and deny the neutral nature of the state apparatus and policy-making processes. In librarianship, the concept of neutrality has long been a foundational principle of library service. But critics argue that the ideal of political neutrality in American librarianship creates a vacuum which is filled by the most powerful and influential elements in society and causes the profession to be manipulated by the ruling elite (Blanke, 1989; Harris, 1986). In economics, the idea of the market as a neutral and self-regulating allocative mechanism is a central concept of the neo-classical model. This is challenged by the more normative viewpoint of political economy that emphasizes the value-laden aspects of economic processes.

Framing the neutrality problem in such broader terms helps make the articulation of its policy implications more explicit. What cuts across all of these fields, and what is particularly evident in the instance of technology, is that neutrality is illusory. It is an ideological mask that effectively keeps difficult questions from being raised.

In order to illustrate the implications of these perspectives on policy discourse and decision-making, it is useful to focus on an area of current interest, information privacy. In particular, the notion of “Privacy Enhancing Technologies” will be considered as an end goal for designers of new systems that may have the opportunity to collect, store and disseminate personal data about particular users of the system. This discussion emphasizes that the privacy-destructive or privacy-enhancing nature of an information system is contingent on a system of control which inextricably links the policy-making and system-design processes. The example of how information systems may be designed in either privacy-destructive or privacy-enhancing ways also illustrates Winner’s contention that artifacts do indeed have politics.

Dorothy Glancy (1995) observes that the most effective way to protect individual privacy in the digital age is to design technological tools so that they prevent or limit the identification of individuals. Strategies for enhancing privacy as a feature of information systems may be pursued in the policy arena in the form of direct legal regulation. But they may also be pursued in the realm of system design itself. Lessig (1996) makes the point that the protection of an interest may involve a combination of both direct (legal) and indirect (economic, social and technological) forms of regulation. This section will emphasize a specific form of indirect regulation: the design of information systems with privacy considerations in mind.

Yet caution should be taken not to totally disassociate the policy and design realms, as they are interrelated and deeply embedded in one another. This is well illustrated by John Bowers (1992), who borrows Langdon Winner’s (1980) example of the bridges over the parkways on Long Island (discussed supra) and Bruno Latour’s (1988) example of the Paris subway network to illustrate his point that “artefacts have politics.” Latour pointed to the decision of the radical Paris municipality in the late nineteenth century to design subway tunnels which were smaller than the coaches of the private railroad companies. This was done to preclude the ability of a subsequent right-wing majority to privatize the municipalized rail system. Bowers extends his claim that “artefacts have politics” to the formalisms of computer systems. He contends that these formalisms generate representations through the operation of “rules over some vocabulary” (p. 234). To formalize something is to give it a representation. Bowers characterizes modern institutions as “reservoirs for representation” into which multiple paths flow (p. 244). But as representations accumulate, they need to be ordered and disciplined so their sheer volume does not overwhelm the user and so the flows may be used for future action. Bruno Latour’s (1987: 232) notion of “centres of calculation” provides a metaphoric site where such rationalization occurs. To be brought to the centre, representations need to be mobile and durable, and once there they need to be combinable without destroying the input (Bowers, 1992: 245).

Bowers then asks how formalizations are used in securing influence at a distance, and the associated question of who gets to see the work of the formalization. In answering this, he notes that representation and re-representation involve taking an inscription away to a centre where it is subjected to interpretation, selection, combination and recombination without the participation or knowledge of the represented (p. 252).

Bowers calls for alternative design strategies that incorporate several features of Scandinavian participatory design. Formalisms should be viewed as critical objects of practice, and their meanings should be open to negotiation and debate. Most significantly, they should not be tied to a centre of calculation which is private and open to only a few. They should be open, not privatized, and constantly subject to revision and renegotiation (p. 256). Bowers’ discussion of formalism provides a useful theoretical framework for considering the relationship between privacy and design.

To what extent do the socio-technological imperatives of efficiency present themselves in previously autonomous realms and impose the reordering of activities? Julie Cohen (1996) identifies the act of reading as such a previously autonomous realm. While her concern centers on the loss of the reader’s privacy and anonymity, the implications of Bowers’ model could portend an even broader impact: the loss of reading as an autonomous realm of human activity.

The consideration of privacy concerns may be thought of as endogenous to system design and not simply left as a problem to be addressed outside of the design process. The concern for privacy is an emerging issue in the literature of systems design, human-computer interaction (HCI), computer-mediated communication (CMC) systems and computer-supported collaborative work (CSCW). Garrett and Lyons (1993) outline the general requirements for the successful implementation of an Electronic Copyright Management System (ECMS). These include a consideration of the requirements of different stakeholders and a recognition that these requirements may conflict. The identification, discussion and mediation of these potential conflicts are seen as part of the design process. This contrasts with many accounts of the design process which focus entirely on the need for technological performativity in the interest of revenue enhancement and market efficiency. The need to comprehend the position of all stakeholders is also recognized by Imprimatur (1997), which has carefully attempted to isolate the key concepts relating to digital transmission. Privacy is among these core issues and is identified as an important aspect in the design of rights management systems.

Herbert Burkert (1997) discusses “privacy enhancing technologies” (PETs) as technological and organizational concepts that aim at protecting personal identity. They are distinguished from data-security technologies (which are a necessary, but not sufficient, condition for privacy protection) in that PETs “seek to eliminate the use of personal data altogether or to give direct control over revelation of personal information to the person concerned” (p. 125). Four types of PET concepts (subject, object, transaction and system oriented) are suggested as a heuristic device to sort out the different ways in which privacy may be addressed in the system design process. Subject-oriented concepts seek to eliminate or reduce the ability to personally identify the subject. This can be done by assigning a proxy, an identifier which is either untraceable or traceable only pursuant to a given set of rules and procedures. These procedures may involve a trusted third party or be based on a technical device utilizing encryption.
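By way of illustration only, the following sketch (my own, not drawn from Burkert; the names and parameters are hypothetical) suggests how a subject-oriented concept might look in software. A trusted third party holds a secret key and derives a keyed, one-way pseudonym for each user, so that an application records activity against a proxy identifier that can be linked back to the person only under the trusted party’s rules.

import hashlib
import hmac
import secrets

class PseudonymService:
    """Hypothetical trusted third party that maps identities to proxy identifiers."""

    def __init__(self) -> None:
        # The secret key never leaves the trusted party.
        self._key = secrets.token_bytes(32)

    def pseudonym_for(self, identity: str) -> str:
        # Keyed one-way hash: stable for a given user, but not reversible
        # by anyone who sees only the pseudonym.
        return hmac.new(self._key, identity.encode(), hashlib.sha256).hexdigest()

if __name__ == "__main__":
    trusted_party = PseudonymService()
    # The application stores transactions against the proxy,
    # never against the person's real identifier.
    proxy = trusted_party.pseudonym_for("alice@example.org")
    print("store this instead of the identity:", proxy)

On these assumptions, re-identification requires the cooperation of the trusted party, which can condition that cooperation on an agreed set of rules and procedures.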

The object-oriented concept is concerned with the traces that an object of the transaction may carry, and seeks to remove them without eliminating the object of the transaction. Transaction-oriented concepts are concerned with the automatic destruction of residual records that result from the transaction process. Examples include security videotapes or records containing the substance of a transaction. Library circulation systems which disassociate the patron’s identity from the circulation record when the book is returned combine these transaction and subject oriented concepts. In contrast, bookstore records that retain this link as potential marketing data utilize neither.
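Again purely as an illustrative sketch (my own, with hypothetical names, and not a description of any actual library system), the circulation example might combine the transaction- and subject-oriented concepts as follows: the link between patron and item exists only while the item is on loan, and check-in destroys that link while retaining only an anonymous use count.

from dataclasses import dataclass, field

@dataclass
class CirculationSystem:
    """Hypothetical circulation system that forgets the patron on check-in."""
    active_loans: dict = field(default_factory=dict)     # item_id -> patron_id
    checkout_counts: dict = field(default_factory=dict)  # item_id -> anonymous tally

    def check_out(self, item_id: str, patron_id: str) -> None:
        self.active_loans[item_id] = patron_id

    def check_in(self, item_id: str) -> None:
        # Destroy the patron-item link; keep only an anonymous count.
        self.active_loans.pop(item_id, None)
        self.checkout_counts[item_id] = self.checkout_counts.get(item_id, 0) + 1

if __name__ == "__main__":
    library = CirculationSystem()
    library.check_out("QA76.9", "patron-1138")
    library.check_in("QA76.9")
    assert "QA76.9" not in library.active_loans  # no residual personal record

The bookstore system mentioned above would differ only in that the link would be retained, rather than destroyed, at the close of the transaction.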

System-oriented concepts combine these in order to create “zones of interaction” where the subject’s identity is hidden, where the object being handled bears no traces and where no record of the transaction is created or maintained. Burkert provides the Catholic confession, anonymous crisis intervention centers and e-mail anonymizer services as examples of such system-oriented concepts (pp. 127-128).
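As a final illustration (again a hypothetical sketch of my own, not an account of any actual anonymizer service), a system-oriented concept in the spirit of the e-mail anonymizer example might hide the sender’s identity, strip identifying traces from the message itself, and keep no record of the transaction:

import email
from email import policy
from email.message import EmailMessage

# Headers that could identify the sender or trace the message's path.
IDENTIFYING_HEADERS = ("From", "Reply-To", "Sender", "Received", "Message-ID")

def anonymize(raw_message: bytes) -> EmailMessage:
    """Return the message with identifying traces removed; nothing is logged."""
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    for header in IDENTIFYING_HEADERS:
        del msg[header]                         # remove traces from the object
    msg["From"] = "anonymous@remailer.invalid"  # the subject's identity is hidden
    return msg

if __name__ == "__main__":
    raw = b"From: alice@example.org\r\nTo: bob@example.org\r\nSubject: hello\r\n\r\nhi"
    print(anonymize(raw))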

Burkert recognizes several achievements of PET design, including shifting the burden of legitimization to those who want to collect and maintain personal information and decreasing reliance on the principle of consent. He emphasizes that addressing privacy on the design level is an essential precondition to addressing privacy on the political level because, in technology-oriented societies, the ability to “do something” is an important element in strategies of legitimization (p. 129).

Similarly, Victoria Bellotti (1997: 68) treats privacy as a central design issue for developers of potentially intrusive technologies. She notes that “[w]hile control and feedback about information and interpersonal access are fundamental to privacy,… they are also key resources for successful communication and collaboration among people. It is therefore doubly important that they be considered in the design of networked and collaborative computing and communication systems.”

The very discussion of PETs signifies an acceptance of the substantive theory of technology and a rejection of the instrumental view.

Conclusion

In conclusion, one’s viewpoint on the question concerning the “neutrality of technology” is directly relevant to the ongoing discourse surrounding information policy. As the privacy example indicates, the design of technological systems cannot be divorced from their political, economic and social effects. Looking at the issue from the broadest possible vantage point, as suggested above by Balabanian, helps keep the totality of these relationships in mind. But the view that technology is neutral masks this relationship and acts as a justification for the continuation of policies that will only bring Ellul’s worst fears closer to reality. As Martin Heidegger (1977: 4) warned: “…we remain unfree and chained to technology, whether we passionately affirm or deny it. But we are delivered over to it in the worst possible way when we regard it as something neutral.”

References

Balabanian, Norman. (1993). “The Neutrality of Technology: A Critique of Assumptions,” Chapter 2 in Critical Approaches to Information Technology in Librarianship: Foundations and Applications, John Buschman, ed., Westport, CT: Greenwood Press. (pp. 15-40).

Bell, Daniel. (1973) The Coming of Post-Industrial Society: A Venture in Social Forecasting. New York: Basic Books.

Bell, Daniel. (1980). “The Social Framework of the Information Society,” in The Microelectronics Revolution: The Complete Guide to the New Technology and its Impact on Society, Tom Forester, ed., Cambridge, MA: MIT Press: 500-549.

Bellotti, Victoria. (1997). “Design for Privacy in Multimedia Computing and Communications Environments,” in Technology and Privacy: The New Landscape, Philip E. Agre and Marc Rotenberg, eds., Cambridge, MA: MIT Press: 63-98.

Blanke, Henry T. (1989). “Librarianship and Political Values: Neutrality or Commitment?” Library Journal 114: 39-43.

Bowers, John. (1992). “The Politics of Formalism,” in Contexts of Computer Mediated Communication, Lea, Martin, ed., New York: Harvester Wheatsheaf: 232-261.

Braverman, Harry. (1974). Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century. New York: Monthly Review Press.

Cohen, Julie E. (1996). “A Right to Read Anonymously: A Closer Look at “Copyright Management” in Cyberspace,” Connecticut Law Review, 28: 981-1039.

Edwards, Richard. (1979). Contested Terrain: The Transformation of the Workplace in the Twentieth Century. London: Heinemann.

Ellul, Jacques. (1964). The Technological Society (trans. John Wilkinson). New York: Vintage Books.

Feenberg, Andrew. (1991). Critical Theory of Technology. New York: Oxford University Press.

Garrett, John R. and Patrice A. Lyons. (1993). “Toward An Electronic Copyright Management Information System,” Journal of the American Society for Information Science 44(8): 468-473.

Glancy, Dorothy J. (1995). “Privacy and Intelligent Transportation Technology,” Santa Clara Computer & High Technology Law Journal, 11: 151.

Harris, Michael H. (1986) “State, Class and Cultural Reproduction: Towards a Theory of Library Service in the United States,” Advances in Librarianship, 14 (Wesley Simonton, ed.) New York: Academic Press.

Heidegger, Martin. (1977). The Question Concerning Technology and Other Essays (trans. William Lovitt). New York: Harper & Row.

Hess, David J. (1997). Science Studies: An Advanced Introduction. New York: New York University Press.

Imprimatur. (1997). Consensus Forum 1997. Rights Limitations and Exceptions: Striking a Proper Balance. Available online.

Latour, Bruno. (1988). “The Prince for Machines as Well as for Machinations,” in Technology and Social Process, B. Elliot, ed., Edinburgh: Edinburgh University Press.

Lessig, Lawrence. (1996). “Intellectual Property and Code,” St. John’s Journal of Legal Commentary, 11: 635.

Marcuse, Herbert. (1964). One-Dimensional Man. Boston: Beacon Press.

Marvin, Carolyn. (1988). When Old Technologies Were New: Thinking About Electric Communications in the Late 19th Century. Oxford: Oxford University Press.

Marx, Karl. (1973). Grundrisse: Foundations of the Critique of Political Economy. [1857-1858] translated by Martin Nicolaus. New York: Vintage Books.

Mumford, Lewis. (1964). Technics and Civilization. New York: Harcourt, Brace and World.

Noble, David. (1984). Forces of Production: A Social History of Industrial Automation. New York: Alfred A. Knopf.

Winner, Langdon. (1980). “Do Artifacts Have Politics?” Daedalus, 109: 121.
