Throughout this volume, some messages referenced are cited by their "Message-ID," which should allow anyone interested to access the original messages through Google Groups (http://groups.google.com).
1 A Note on Terminology: There is still debate about how to refer to Free Software, which is also known as Open Source Software. The scholarly community has adopted either FOSS or FLOSS (or F/LOSS): the former stands for the Anglo-American Free and Open Source Software; the latter stands for the continental Free, Libre and Open Source Software. Two Bits sticks to the simple term Free Software to refer to all of these things, except where it is specifically necessary to differentiate two or more names, or to specify people or events so named. The reason is primarily aesthetic and political, but Free Software is also the older term, as well as the one that includes issues of moral and social order. I explain in chapter 3 why there are two terms.
3 So, for instance, when a professional society founded on charters and ideals for membership and qualification speaks as a public, it represents its members, as when the American Medical Association argues for or against changes to Medicare. However, if a new group—say, of nurses—seeks not only to participate in this discussion—which may be possible, even welcomed—but to change the structure of representation in order to give themselves status equal to doctors, this change is impossible, for it goes against the very aims and principles of the society. Indeed, the nurses will be urged to form their own society, not to join that of the doctors, a proposition which gives the lie to the existing structures of power. By contrast, a public is an entity that is less controlled and hence more agonistic, such that nurses might join, speak, and insist on changing the terms of debate, just as patients, scientists, or homeless people might. Their success, however, depends entirely on the force with which their actions transform the focus and terms of the public. Concepts of the public sphere have been roundly critiqued in the last twenty years for presuming that such "equality of access" is sufficient to achieve representation, when in fact other contextual factors (race, class, sex) inherently weight the representative power of different participants. But these are two different and overlapping problems: one cannot solve the problem of pernicious, invisible forms of inequality unless one first solves the problem of ensuring a certain kind of structural publicity. It is precisely the focus on maintaining publicity for a recursive public, over against massive and powerful corporate and governmental attempts to restrict it, that I locate as the central struggle of Free Software.
Gender certainly influences who gets heard within Free Software, for example, but it is a mistake to focus on this inequality at the expense of the larger, more threatening form of political failure that Free Software addresses. And I think there are plenty of geeks—man, woman and animal—who share this sentiment.
4 Wikipedia is perhaps the most widely known and generally familiar example of what this book is about. Even though it is not identified as such, it is in fact a Free Software project and a "modulation" of Free Software as I describe it here. The non-technically inclined reader might keep Wikipedia in mind as an example with which to follow the argument of this book. I will return to it explicitly in part 3. However, for better or for worse, there will be no discussion of pornography.
5 Although the term public clearly suggests private as its opposite, Free Software is not anticommercial. A very large amount of money, both real and notional, is involved in the creation of Free Software. The term recursive market could also be used, in order to emphasize the importance (especially during the 1990s) of the economic features of the practice. The point is not to test whether Free Software is a "public" or a "market," but to construct a concept adequate to the practices that constitute it.
8 Critiques of the demand for availability and the putatively inherent superiority of transparency include Coombe and Herman, "Rhetorical Virtues" and "Your Second Life?"; Christen, "Gone Digital"; and Anderson and Bowery, "The Imaginary Politics of Access to Knowledge."
9 This description of Free Software could also be called an "assemblage." The most recent source for this is Rabinow, Anthropos Today. The language of thresholds and intensities is most clearly developed by Manuel DeLanda in A Thousand Years of Nonlinear History and in Intensive Science and Virtual Philosophy. The term problematization, from Rabinow (which he channels from Foucault), is a synonym for the phrase "reorientation of knowledge and power" as I use it here.
11 The genealogy of the term commons has a number of sources. An obvious source is Garrett Hardin's famous 1968 article "The Tragedy of the Commons." James Boyle has done more than anyone to specify the term, especially during a 2001 conference on the public domain, which included the inspired guest-list juxtaposition of the appropriation-happy musical collective Negativland and the dame of "commons" studies, Elinor Ostrom, whose book Governing the Commons has served as a certain inspiration for thinking about commons versus public domains. Boyle, for his part, has ceaselessly pushed the "environmental" metaphor of speaking for the public domain as environmentalists of the 1960s and 1970s spoke for the environment (see Boyle, "The Second Enclosure Movement and the Construction of the Public Domain" and "A Politics of Intellectual Property"). The term commons is useful in this context precisely because it distinguishes the "public domain" as an imagined object of pure public transaction and coordination, as opposed to a "commons," which can consist of privately owned things/spaces that are managed in such a fashion that they effectively function like a "public domain" is imagined to (see Boyle, "The Public Domain"; Hess and Ostrom, Understanding Knowledge as a Commons).
12 Marcus and Fischer, Anthropology as Cultural Critique; Marcus and Clifford, Writing Culture; Fischer, Emergent Forms of Life and the Anthropological Voice; Marcus, Ethnography through Thick and Thin; Rabinow, Essays on the Anthropology of Reason and Anthropos Today.
13 The language of "figuring out" has its immediate source in the work of Kim Fortun, "Figuring Out Ethnography." Fortun's work refines two other sources, the work of Bruno Latour in Science in Action and that of Hans-Jörg Rheinberger in Toward a History of Epistemic Things. Latour describes the difference between "science made" and "science in the making" and how the careful analysis of new objects can reveal how they come to be. Rheinberger extends this approach through analysis of the detailed practices involved in figuring out a new object or a new process—practices which participants cannot quite name or explain in precise terms until after the fact.
15 The literature on "virtual communities," "online communities," the culture of hackers and geeks, or the social study of information technology offers important background information, although it is not the subject of this book. A comprehensive review of work in anthropology and related disciplines is Wilson and Peterson, "The Anthropology of Online Communities." Other touchstones are Miller and Slater, The Internet; Carla Freeman, High Tech and High Heels in the Global Economy; Hine, Virtual Ethnography; Kling, Computerization and Controversy; Star, The Cultures of Computing; Castells, The Rise of the Network Society; Boczkowski, Digitizing the News. Most social-science work in information technology has dealt with questions of inequality and the so-called digital divide, an excellent overview being DiMaggio et al., "From Unequal Access to Differentiated Use." Beyond works in anthropology and science studies, a number of works from various other disciplines have recently taken up similar themes, especially Adrian MacKenzie, Cutting Code; Galloway, Protocol; Hui Kyong Chun, Control and Freedom; and Liu, Laws of Cool. By contrast, if social-science studies of information technology are set against a background of historical and ethnographic studies of "figuring out" problems of specific information technologies, software, or networks, then the literature is sparse. Examples of anthropology and science studies of figuring out include Barry, Political Machines; Hayden, When Nature Goes Public; and Fortun, Advocating Bhopal. Matt Ratto has also portrayed this activity in Free Software in his dissertation, "The Pressure of Openness."
16 In addition to Abbate and Salus, see Norberg and O'Neill, Transforming Computer Technology; Naughton, A Brief History of the Future; Hafner, Where Wizards Stay Up Late; Waldrop, The Dream Machine; Segaller, Nerds 2.0.1. For a classic autodocumentation of one aspect of the Internet, see Hauben and Hauben, Netizens.
17 Kelty, "Culture's Open Sources"; Coleman, "The Social Construction of Freedom"; Ratto, "The Pressure of Openness"; Joseph Feller et al., Perspectives on Free and Open Source Software; see also http://freesoftware.mit.edu/, organized by Karim Lakhani, which is a large collection of work on Free Software projects. Early work in this area derived both from the writings of practitioners such as Raymond and from business and management scholars who noticed in Free Software a remarkable, surprising set of seeming contradictions. The best of these works to date is Steven Weber, The Success of Open Source. Weber's conclusions are similar to those presented here, and he has a kind of cryptoethnographic familiarity (that he does not explicitly avow) with the actors and practices. Yochai Benkler's Wealth of Networks extends and generalizes some of Weber's argument.
19 Despite what might sound like a "shoot first, ask questions later" approach, the design of this project was in fact conducted according to specific methodologies. The most salient is actor-network theory: Latour, Science in Action; Law, "Technology and Heterogeneous Engineering"; Callon, "Some Elements of a Sociology of Translation"; Latour, Pandora's Hope; Latour, Reassembling the Social; Callon, Laws of the Markets; Law and Hassard, Actor Network Theory and After. Ironically, there have been no actor-network studies of networks, which is to say, of particular information and communication technologies such as the Internet. The confusion of the word network (as an analytical and methodological term) with that of network (as a particular configuration of wires, waves, software, and chips, or of people, roads, and buses, or of databases, names, and diseases) means that it is necessary to always distinguish this-network-here from any-network-whatsoever. My approach shares much with the ontological questions raised in works such as Law, Aircraft Stories; Mol, The Body Multiple; Cussins, "Ontological Choreography"; Charis Thompson, Making Parents; and Dumit, Picturing Personhood.
20 I understand a concern with scientific infrastructure to begin with Steven Shapin and Simon Schaffer in Leviathan and the Air-Pump, but the genealogy is no doubt more complex. It includes Shapin, The Social History of Truth; Biagioli, Galileo, Courtier; Galison, How Experiments End and Image and Logic; Daston, Biographies of Scientific Objects; Johns, The Nature of the Book. A whole range of works explore the issue of scientific tools and infrastructure: Kohler, Lords of the Fly; Rheinberger, Toward a History of Epistemic Things; Landecker, Culturing Life; Keating and Cambrosio, Biomedical Platforms. Bruno Latour's "What Rules of Method for the New Socio-scientific Experiments" provides one example of where science studies might go with these questions. Important texts on the subject of technical infrastructures include Walsh and Bayma, "Computer Networks and Scientific Work"; Bowker and Star, Sorting Things Out; Edwards, The Closed World; Misa, Brey, and Feenberg, Modernity and Technology; Star and Ruhleder, "Steps Towards an Ecology of Infrastructure."
22 In addition, see Lippmann, The Phantom Public; Calhoun, Habermas and the Public Sphere; Latour and Weibel, Making Things Public. The debate about social imaginaries begins alternately with Benedict Anderson's Imagined Communities or with Cornelius Castoriadis's The Imaginary Institution of Society; see also Chatterjee, "A Response to Taylor's 'Modes of Civil Society'"; Gaonkar, "Toward New Imaginaries"; Charles Taylor, "Modes of Civil Society" and Sources of the Self.
1. Geeks and Recursive Publics
1 For the canonical story, see Levy, Hackers. Hack referred to (and still does) a clever use of technology, usually unintended by the maker, to achieve some task in an elegant manner. The term has been successfully redefined by the mass media to refer to computer users who break into and commit criminal acts on corporate or government or personal computers connected to a network. Many self-identified hackers insist that the criminal element be referred to as crackers (see, in particular, the entries on "Hackers," "Geeks" and "Crackers" in The Jargon File, http://www.catb.org/~esr/jargon/, also published as Raymond, The New Hackers' Dictionary). On the subject of definitions and the cultural and ethical characteristics of hackers, see Coleman, "The Social Construction of Freedom," chap. 2.
2 One example of the usage of geek is in Star, The Cultures of Computing. Various denunciations (e.g., Barbrook and Cameron, "The California Ideology"; Borsook, Technolibertarianism) tend to focus on journalistic accounts of an ideology that has little to do with what hackers, geeks, and entrepreneurs actually make. A more relevant categorical distinction than that between hackers and geeks is that between geeks and technocrats; in the case of technocrats, the "anthropology of technocracy" is proposed as the study of the limits of technical rationality, in particular the forms through which "planning" creates "gaps in the form that serve as 'targets of intervention'" (Riles, "Real Time," 393). Riles's "technocrats" are certainly not the "geeks" I portray here (or at least, if they are, it is only in their frustrating day jobs). Geeks do have libertarian, specifically Hayekian or Feyerabendian leanings, but are more likely to see technical failures not as failures of planning, but as bugs, inefficiencies, or occasionally as the products of human hubris or stupidity that is born of a faith in planning.
4 Geeks are also identified often by the playfulness and agility with which they manipulate these labels and characterizations. See Michael M. J. Fischer, "Worlding Cyberspace" for an example.
6 On the subject of imagined communities and the role of information technologies in imagined networks, see Green, Harvey, and Knox, "Scales of Place and Networks"; and Flichy, The Internet Imaginaire.
8 Ibid., 33–48. Taylor's history of the transition from feudal nobility to civil society to the rise of republican democracies (however incomplete) is comparable to Foucault's history of the birth of biopolitics, in La naissance de la biopolitique, as an attempt to historicize governance with respect to its theories and systems, as well as within the material forms it takes.
10 Geertz, "Ideology as a Cultural System"; Mannheim, Ideology and Utopia. Both, of course, also signal the origin of the scientific use of the term proximately with Karl Marx's "German Ideology" and more distantly in the Enlightenment writings of Destutt de Tracy.
13 The depth and the extent of this issue are obviously huge. Ricoeur's Lectures on Ideology and Utopia is an excellent analysis of the problem of ideology prior to 1975. Terry Eagleton's books The Ideology of the Aesthetic and Ideology: An Introduction are Marxist explorations that include discussions of hegemony and resistance in the context of artistic and literary theory in the 1980s. Slavoj Žižek creates a Lacanian-inspired algebraic system of analysis that combines Marxism and psychoanalysis in novel ways (see Žižek, Mapping Ideology). There is even an attempt to replace the concept of ideology with a metaphor of "software" and "memes" (see Balkin, Cultural Software). The core of the issue of ideology as a practice (and the vicissitudes of materialism that trouble it) is also at the heart of works by Pierre Bourdieu and his followers (on the relationship of ideology and hegemony, see Laclau and Mouffe, Hegemony and Socialist Strategy). In anthropology, see Comaroff and Comaroff, Ethnography and the Historical Imagination.
19 The question of gender plagues the topic of computer culture. The gendering of hackers and geeks and the more general exclusion of women in computing have been widely observed by academics. I can do no more here than direct readers to the increasingly large and sophisticated literature on the topic. See especially Light, "When Computers Were Women"; Turkle, The Second Self and Life on the Screen. With respect to Free Software, see Nafus, Krieger, and Leach, "Patches Don't Have Gender." More generally, see Kirkup et al., The Gendered Cyborg; Downey, The Machine in Me; Faulkner, "Dualisms, Hierarchies and Gender in Engineering"; Grint and Gill, The Gender-Technology Relation; Helmreich, Silicon Second Nature; Herring, "Gender and Democracy in Computer-Mediated Communication"; Kendall, "'Oh No! I'm a NERD!'"; Margolis and Fisher, Unlocking the Clubhouse; Green and Adam, Virtual Gender; P. Hopkins, Sex/Machine; Wajcman, Feminism Confronts Technology and "Reflections on Gender and Technology Studies"; and Fiona Wilson, "Can't Compute, Won't Compute." Also see the novels and stories of Ellen Ullman, including Close to the Machine and The Bug: A Novel.
20 Originally coined by Stewart Brand, the phrase was widely cited after it appeared in Barlow's 1994 article "The Economy of Ideas."
21 On the genesis of "virtual communities" and the role of Stewart Brand, see Turner, "Where the Counterculture Met the New Economy."
24 The rest of this message can be found in the Silk-list archives at http://groups.yahoo.com/group/silk-list/message/2869 (accessed 18 August 2006). The reference to "Fling" is to a project now available at http://fling.sourceforge.net/ (accessed 18 August 2006). The full archives of Silk-list can be found at http://groups.yahoo.com/group/silk-list/ and the full archives of the FoRK list can be found at http://www.xent.com/mailman/listinfo/fork/.
26 Moore's Law—named for Gordon Moore, former head of Intel—states that the speed and capacity of computer central processing units (CPUs) doubles every eighteen months, which it has done since roughly 1970. Metcalfe's Law—named for Robert Metcalfe, inventor of Ethernet—states that the utility of a network equals the square of the number of users, suggesting that the number of things one can do with a network grows quadratically as members are added linearly.
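For readers who want the arithmetic behind these two claims, the following sketch (my own illustration, not from the text) computes the growth each law implies:

```python
# Growth implied by the two "laws" as stated in the note above.

def moore_capacity(years: float, doubling_months: float = 18.0) -> float:
    """Relative CPU capacity after `years`, doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

def metcalfe_utility(users: int) -> int:
    """Network utility as the square of the number of users."""
    return users ** 2

# Eighteen months yields exactly one doubling of capacity.
print(moore_capacity(1.5))  # 2.0
# Doubling the user base quadruples the network's utility: 200^2 / 100^2 = 4.
print(metcalfe_utility(200) // metcalfe_utility(100))  # 4
```

Over a decade the first function compounds to roughly a hundredfold increase, which is why these two observations, taken together, underwrote so much of the 1990s rhetoric the chapter describes.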
27 This quotation from the 1990s is attributed to Electronic Frontier Foundation's founder and "cyber-libertarian" John Gilmore. Whether there is any truth to this widespread belief expressed in the statement is not clear. On the one hand, the protocol to which this folklore refers—the general system of "message switching" and, later, "packet switching" invented by Paul Baran at RAND Corporation—does seem to lend itself to robustness (on this history, see Abbate, Inventing the Internet). However, it is not clear that nuclear threats were the only reason such robustness was a design goal; simply to ensure communication in a distributed network was necessary in itself. Nonetheless, the story has great currency as a myth of the nature and structure of the Internet. Paul Edwards suggests that both stories are true ("Infrastructure and Modernity," 216–20, 225n13).
28 Lessig, Code and Other Laws of Cyberspace. See also Gillespie, "Engineering a Principle" on the related history of the "end to end" design principle.
29 This is constantly repeated on the Internet and attributed to David Clark, but no one really knows where or when he stated it. It appears in a 1997 interview of David Clark by Jonathan Zittrain, the transcript of which is available at http://cyber.law.harvard.edu/jzfallsem//trans/clark/ (accessed 18 August 2006).
30 Ashish "Hash" Gulhati, e-mail to Silk-list mailing list, 9 September 2000, http://groups.yahoo.com/group/silk-list/message/3125.
31 Eugen Leitl, e-mail to Silk-list mailing list, 9 September 2000, http://groups.yahoo.com/group/silk-list/message/3127. Python is a programming language. Mojonation was a very promising peer-to-peer application in 2000 that has since ceased to exist.
32 In particular, this project focuses on the Transmission Control Protocol (TCP), the User Datagram Protocol (UDP), and the Domain Name System (DNS). The first two have remained largely stable over the last thirty years, but the DNS system has been highly politicized (see Mueller, Ruling the Root).
33 On Internet standards, see Schmidt and Werle, Coordinating Technology; Abbate and Kahin, Standards Policy for Information Infrastructure.
4 The Apple-Microsoft conflict was given memorable expression by Umberto Eco in a widely read piece that compared the Apple user interface to Catholicism and the PC user interface to Protestantism ("La bustina di Minerva," Espresso, 30 September 1994, back page).
5 One entry on Wikipedia differentiates religious wars from run-of-the-mill "flame wars" as follows: "Whereas a flame war is usually a particular spate of flaming against a non-flamy background, a holy war is a drawn-out disagreement that may last years or even span careers" ("Flaming [Internet]," http://en.wikipedia.org/wiki/Flame_war [accessed 16 January 2006]).
7 Message-ID: email@example.com. It should be noted, in case the reader is unsure how serious this is, that EGCS stood for Experimental/Enhanced GNU Compiler System, not Ecumenical GNU Compiler Society.
8 "Martin Luther, Meet Linus Torvalds," Salon, 12 November 1998, http://archive.salon.com/21st/feature/1998/11/12feature.html (accessed 5 February 2005).
10 Message-ID: firstname.lastname@example.org. In one very humorous case the comparison is literalized: "Microsoft acquires Catholic Church" (Message-ID: email@example.com).
11 Paul Fusco, "The Gospel According to Joy," New York Times, 27 March 1988, Sunday Magazine, 28.
12 See, for example, Matheson, The Imaginative World of the Reformation. There is rigorous debate about the relation of print, religion, and capitalism: one locus classicus is Eisenstein's The Printing Press as an Agent of Change, which was inspired by McLuhan, The Gutenberg Galaxy. See also Ian Green, Print and Protestantism in Early Modern England and The Christian's ABCs; Chadwick, The Early Reformation on the Continent, chaps. 1–3.
15 At a populist level, this was captured by John Perry Barlow's "A Declaration of the Independence of Cyberspace," http://homes.eff.org/~barlow/Declaration-Final.html.
23 One of the ways Adrian discusses innovation is via the argument of The Innovator's Dilemma, by the Harvard Business School professor Clayton Christensen. The book describes "sustaining vs. disruptive" technologies less as an issue of how technologies work or what they are made of, and more as an issue of how their success and performance are measured. See Adrian Gropper, "The Internet as a Disruptive Technology," Imaging Economics, December 2001, http://www.imagingeconomics.com/library/200112-10.asp (accessed 19 September 2006).
24 On kinds of civic duty, see Fortun and Fortun, "Scientific Imaginaries and Ethical Plateaus in Contemporary U.S. Toxicology."
25 There is, in fact, a very specific group of people called transhumanists, about whom I will say very little. I invoke the label here because I think certain aspects of transhumanism are present across the spectrum of engineers, scientists, and geeks.
26 See the World Transhumanist Association, http://transhumanism.org/ (accessed 1 December 2003) or the Extropy Institute, http://www.extropy.org/ (accessed 1 December 2003). See also Doyle, Wetwares, and Battaglia, "For Those Who Are Not Afraid of the Future," for a sidelong glance.
28 The computer scientist Bill Joy wrote a long piece in Wired warning of the outcomes of research conducted without ethical safeguards and the dangers of eugenics in the past, "Why the Future Doesn't Need Us," Wired 8.4 [April 2000], http://www.wired.com/wired/archive/8.04/joy.html (accessed 27 June 2005).
30 Eugen Leitl, e-mail to Silk-list mailing list, 16 May 2000, http://groups.yahoo.com/group/silk-list/message/2410.
31 Eugen Leitl, e-mail to Silk-list mailing list, 7 August 2000, http://groups.yahoo.com/group/silk-list/message/2932.
3. The Movement
1 For instance, Richard Stallman writes, "The Free Software movement and the Open Source movement are like two political camps within the free software community. Radical groups in the 1960s developed a reputation for factionalism: organizations split because of disagreements on details of strategy, and then treated each other as enemies. Or at least, such is the image people have of them, whether or not it was true. The relationship between the Free Software movement and the Open Source movement is just the opposite of that picture. We disagree on the basic principles, but agree more or less on the practical recommendations. So we can and do work together on many specific projects. We don't think of the Open Source movement as an enemy. The enemy is proprietary software" ("Why 'Free Software' Is Better than 'Open Source,'" GNU's Not Unix! http://www.gnu.org/philosophy/free-software-for-freedom.html [accessed 9 July 2006]). By contrast, the Open Source Initiative characterizes the relationship as follows: "How is 'open source' related to 'free software'? The Open Source Initiative is a marketing program for free software. It's a pitch for 'free software' because it works, not because it's the only right thing to do. We're selling freedom on its merits" (http://www.opensource.org/advocacy/faq.php [accessed 9 July 2006]). There are a large number of definitions of Free Software: canonical definitions include Richard Stallman's writings on the Free Software Foundation's Web site, www.fsf.org, including the "Free Software Definition" and "Confusing Words and Phrases that Are Worth Avoiding." From the Open Source side there is the "Open Source Definition" (http://www.opensource.org/licenses/). Unaffiliated definitions can be found at www.freedomdefined.org.
7 "Netscape Announces Plans to Make Next-Generation Communicator Source Code Available Free on the Net," Netscape press release, 22 January 1998, http://wp.netscape.com/newsref/pr/newsrelease558.html (accessed 25 September 2007).
8 On the history of software development methodologies, see Mahoney, "The Histories of Computing(s)" and "The Roots of Software Engineering."
9 Especially good descriptions of what this cycle is like can be found in Ullman, Close to the Machine and The Bug.
16 See Hamerly and Paquin, "Freeing the Source." The story is elegantly related in Moody, Rebel Code, 182–204. Raymond gives Christine Peterson of the Foresight Institute credit for the term open source.
17 From Raymond, The Cathedral and the Bazaar. The changelog is available online only: http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/.
19 On social movements—the closest analog, developed long ago—see Gerlach and Hine, People, Power, Change, and Freeman and Johnson, Waves of Protest. However, the Free Software and Open Source Movements do not have "causes" of the kind that conventional movements do, other than the perpetuation of Free and Open Source Software (see Coleman, "Political Agnosticism"; Chan, "Coding Free Software"). Similarly, there is no single development methodology that would cover only Open Source. Advocates of Open Source are all too willing to exclude those individuals or organizations who follow the same "development methodology" but do not use a Free Software license—such as Microsoft's oft-mocked "shared-source" program. The list of licenses approved by both the Free Software Foundation and the Open Source Initiative is substantially the same. Further, the Debian Free Software Guidelines and the "Open Source Definition" are almost identical (compare http://www.gnu.org/philosophy/license-list.html with http://www.opensource.org/licenses/ [both accessed 30 June 2006]).
20 It is, in the terms of Actor Network Theory, a process of "enrollment" in which participants find ways to rhetorically align—and to disalign—their interests. It does not constitute the substance of their interest, however. See Latour, Science in Action; Callon, "Some Elements of a Sociology of Translation."
23 For example, Castells, The Internet Galaxy, and Weber, The Success of Open Source both tell versions of the same story of origins and development.
4. Sharing Source Code
1 "Sharing" source code is not the only kind of sharing among geeks (e.g., informal sharing to communicate ideas), and UNIX is not the only shared software. Other examples that exhibit this kind of proliferation (e.g., the LISP programming language, the TeX text-formatting system) are as ubiquitous as UNIX today. The inverse of my argument here is that selling produces a different kind of order: many products that existed in much larger numbers than UNIX have since disappeared because they were never ported or forked; they are now part of dead-computer museums and collections, if they have survived at all.
2 The story of UNIX has not been told, and yet it has been told hundreds of thousands of times. Every hacker, programmer, computer scientist, and geek tells a version of UNIX history—a usable past. Thus, the sources for this chapter include these stories, heard and recorded throughout my fieldwork, but also easily accessible in academic work on Free Software, which enthusiastically participates in this potted-history retailing. See, for example, Steven Weber, The Success of Open Source; Castells, The Internet Galaxy; Himanen, The Hacker Ethic; Benkler, The Wealth of Networks. To date there is but one detailed history of UNIX—A Quarter Century of UNIX, by Peter Salus—which I rely on extensively. Matt Ratto's dissertation, "The Pressure of Openness," also contains an excellent analytic history of the events told in this chapter.
3 The intersection of UNIX and TCP/IP occurred around 1980 and led to the famous switch from the Network Control Protocol (NCP) to the Transmission Control Protocol/Internet Protocol that occurred on 1 January 1983 (see Salus, Casting the Net).
5 There is a large and growing scholarly history of software: Wexelblat, History of Programming Languages and Bergin and Gibson, History of Programming Languages 2 are collected papers by historians and participants. Key works in history include Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog; Akera and Nebeker, From 0 to 1; Hashagen, Keil-Slawik, and Norberg, History of Computing—Software Issues; Donald A. MacKenzie, Mechanizing Proof. Michael Mahoney has written by far the most about the early history of software; his relevant works include "The Roots of Software Engineering," "The Structures of Computation," "In Our Own Image," and "Finding a History for Software Engineering." On UNIX in particular, there is shockingly little historical work. Martin Campbell-Kelly and William Aspray devote a mere two pages in their general history Computer. As early as 1978, Ken Thompson and Dennis Ritchie were reflecting on the "history" of UNIX in "The UNIX Time-Sharing System: A Retrospective." Ritchie maintains a Web site that contains a valuable collection of early documents and his own reminiscences (http://www.cs.bell-labs.com/who/dmr/). Mahoney has also conducted interviews with the main participants in the development of UNIX at Bell Labs. These interviews have not been published anywhere, but are drawn on as background in this chapter (interviews are in Mahoney's personal files).
6 Turing, "On Computable Numbers." See also Davis, Engines of Logic, for a basic explanation.
9 A large number of editors were created in the 1970s; Richard Stallman's EMACS and Bill Joy's vi remain the best known. Douglas Engelbart is somewhat too handsomely credited with the creation of the interactive computer, but the work of Butler Lampson and Peter Deutsch in Berkeley, as well as that of the Multics team, Ken Thompson, and others on early on-screen editors, is surely more substantial in terms of the fundamental ideas and problems of manipulating text files on a screen. This story is largely undocumented, save in the computer-science literature itself. On Engelbart, see Bardini, Bootstrapping.
13 Ultimately, the Department of Justice case against IBM used bundling as evidence of monopolistic behavior, in addition to claims about the creation of so-called Plug Compatible Machines, devices that were reverse-engineered by meticulously constructing both the mechanical interface and the software that would communicate with IBM mainframes. See Franklin M. Fisher, Folded, Spindled, and Mutilated; Brock, The Second Information Revolution.
14 The story of this project and the lessons Brooks learned are the subject of one of the most famous software-development handbooks, The Mythical Man-Month, by Frederick Brooks.
15 The computer industry has always relied heavily on trade secret, much less so on patent and copyright. Trade secret also produces its own form of order, access, and circulation, which was carried over into the early software industry as well. See Kidder, The Soul of a New Machine, for a classic account of secrecy and competition in the computer industry.
16 On time sharing, see Lee et al., "Project MAC." Multics makes an appearance in nearly all histories of computing, the best resource by far being Tom van Vleck's Web site, http://www.multicians.org/.
17 Some widely admired technical innovations (many of which were borrowed from Multics) include the hierarchical file system; the command shell for interacting with the system; the decision to treat everything, including external devices, as the same kind of entity (a file); and the "pipe" operator, which allowed the output of one tool to be "piped" as input to another, facilitating the easy creation of complex tasks from simple tools.
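The pipe idea is easy to emulate in any language. A minimal sketch in Python (the function names here are illustrative stand-ins for the UNIX tools, not code from any UNIX source): each "tool" consumes a stream of lines and produces another, so tools can be chained the way a shell pipeline chains processes.

```python
# Each "tool" takes an iterable of lines and returns a result, so tools
# compose the way `grep pattern file | sort | wc -l` composes processes.

def grep(pattern, lines):
    """Keep only lines containing `pattern` (like the grep tool)."""
    return (line for line in lines if pattern in line)

def sort_lines(lines):
    """Yield the input lines in sorted order (like the sort tool)."""
    return iter(sorted(lines))

def wc_l(lines):
    """Count the lines that reach this stage (like wc -l)."""
    return sum(1 for _ in lines)

lines = ["pipe", "shell", "file system", "shell script", "kernel"]
# Rough equivalent of: ... | grep shell | wc -l
print(wc_l(grep("shell", lines)))  # 2
```

The design insight the note describes is visible even in this toy: complex behavior emerges from chaining simple, single-purpose filters.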
20 Ritchie's Web site contains a copy of a 1974 license (http://cm.bell-labs.com/cm/cs/who/dmr/licenses.html) and a series of ads that exemplify the uneasy positioning of UNIX as a commercial product (http://cm.bell-labs.com/cm/cs/who/dmr/unixad.html). According to Don Libes and Sandy Ressler, "The original licenses were source licenses. . . . [C]ommercial institutions paid fees on the order of $20,000. If you owned more than one machine, you had to buy binary licenses for every additional machine [i.e., you were not allowed to copy the source and install it] you wanted to install UNIX on. They were fairly pricey at $8000, considering you couldn't resell them. On the other hand, educational institutions could buy source licenses for several hundred dollars, just enough to cover Bell Labs' administrative overhead and the cost of the tapes" (Life with UNIX, 20–21).
21 According to Salus, this licensing practice was also a direct result of Judge Thomas Meaney's 1956 antitrust consent decree, which required AT&T to reveal and to license its patents for nominal fees (A Quarter Century of UNIX, 56); see also Brock, The Second Information Revolution, 116–20.
22 Even in computer science, source code was rarely formally shared, and more likely presented in the form of theorems and proofs, or in various idealized higher-level languages such as Donald Knuth's MIX language for presenting algorithms (Knuth, The Art of Computer Programming). Snippets of actual source code are much more likely to be found in printed form in handbooks, manuals, how-to guides, and other professional publications aimed at training programmers.
23 The simultaneous development of the operating system and the norms for creating, sharing, documenting, and extending it are often referred to as the "UNIX philosophy." It includes the central idea that one should build on the ideas (software) of others (see Gancarz, The Unix Philosophy and Linux and the UNIX Philosophy). See also Raymond, The Art of UNIX Programming.
24 Bell Labs threatened the nascent UNIX NEWS newsletter with trademark infringement, so "USENIX" was a concession that harkened back to the original USE users' group for DEC machines, but avoided explicitly using the name UNIX. Libes and Ressler, Life with UNIX, 9.
27 Ken Thompson and Dennis Ritchie, "The Unix Operating System," Bell System Technical Journal (1974).
31 Tanenbaum's two most famous textbooks are Operating Systems and Computer Networks, which have seen three and four editions respectively.
32 Tanenbaum was not the only person to follow this route. The other acknowledged giant in the computer-science textbook world, Douglas Comer, created Xinu and Xinu-PC (UNIX spelled backwards) in Operating System Design in 1984.
35 A recent court case between the Utah-based SCO (the current owner of the legal rights to the original UNIX source code) and IBM raised yet again the question of how much of the original UNIX source code exists in the BSD distribution. SCO alleges that IBM (and Linus Torvalds) inserted SCO-owned UNIX source code into the Linux kernel. However, the incredibly circuitous route of the "original" source code makes these claims hard to ferret out: it was developed at Bell Labs, licensed to multiple universities, used as a basis for BSD, sold to an earlier version of the company SCO (then known as the Santa Cruz Operation), which created a version called Xenix in cooperation with Microsoft. See the diagram by Eric Lévénez at http://www.levenez.com/unix/. For more detail on this case, see www.groklaw.com.
36 See Vinton G. Cerf and Robert Kahn, "A Protocol for Packet Network Interconnection." For the history, see Abbate, Inventing the Internet; Norberg and O'Neill, A History of the Information Processing Techniques Office. Also see chapters 1 and 5 herein for more detail on the role of these protocols and the RFC process.
38 The exception is a not unimportant tool called the Unix to Unix Copy Protocol, or uucp, which was widely used to transmit data by phone and formed the basis for the creation of Usenet. See Hauben and Hauben, Netizens.
41 See Andrew Leonard, "BSD Unix: Power to the People, from the Code," Salon, 16 May 2000, http://archive.salon.com/tech/fsp/2000/05/16/chapter_2_part_one/.
42 Norberg and O'Neill, A History of the Information Processing Techniques Office, 184–85. They cite Comer, Internetworking with TCP/IP, 6, for the figure.
5. Conceiving Open Systems
1 Quoted in Libes and Ressler, Life with UNIX, 67, and also in Critchley and Batty, Open Systems, 17. I first heard it in an interview with Sean Doyle in 1998.
2 Moral in this usage signals the "moral and social order" I explored through the concept of social imaginaries in chapter 1. Or, in the Scottish Enlightenment sense of Adam Smith, it points to the right organization and relations of exchange among humans.
3 There is, of course, a relatively robust discourse of open systems in biology, sociology, systems theory, and cybernetics; however, that meaning of open systems is more or less completely distinct from what openness and open systems came to mean in the computer industry in the period book-ended by the arrivals of the personal computer and the explosion of the Internet (ca. 1980–93). One relevant overlap between these two meanings can be found in the work of Carl Hewitt at the MIT Media Lab and in the interest in "agorics" taken by K. Eric Drexler, Bernardo Huberman, and Mark S. Miller. See Huberman, The Ecology of Computation.
5 General Motors stirred strong interest in open systems by creating, in 1985, its Manufacturing Automation Protocol (MAP), which was built on UNIX. At the time, General Motors was the second-largest purchaser of computer equipment after the government. The Department of Defense and the U.S. Air Force also adopted and required POSIX-compliant UNIX systems early on.
6 Paul Fusco, "The Gospel According to Joy," New York Times, 27 March 1988, Sunday Magazine, 28.
9 An excellent counterpoint here is Paul Edwards's The Closed World, which clearly demonstrates the appeal of a thoroughly and hierarchically controlled system such as the Semi-Automatic Ground Environment (SAGE) of the Department of Defense against the emergence of more "green world" models of openness.
11 McKenna, Who's Afraid of Big Blue? 178, emphasis added. McKenna goes on to suggest that computer companies can differentiate themselves by adding services, better interfaces, or higher reliability, ironically similar to arguments that the Open Source Initiative would make ten years later.
12 Richard Stallman, echoing the image of medieval manacled wretches, characterized the blind spot thus: "Unix does not give the user any more legal freedom than Windows does. What they mean by 'open systems' is that you can mix and match components, so you can decide to have, say, a Sun chain on your right leg and some other company's chain on your left leg, and maybe some third company's chain on your right arm, and this is supposed to be better than having to choose to have Sun chains on all your limbs, or Microsoft chains on all your limbs. You know, I don't care whose chains are on each limb. What I want is not to be chained by anyone" ("Richard Stallman: High School Misfit, Symbol of Free Software, MacArthur-certified Genius," interview by Michael Gross, Cambridge, Mass., 1999, 5, http://www.mgross.com/MoreThgsChng/interviews/stallman1.html).
13 A similar story can be told about the emergence, in the late 1960s and early 1970s, of manufacturers of "plug-compatible" devices, peripherals that plugged into IBM machines (see Takahashi, "The Rise and Fall of the Plug Compatible Manufacturers"). Similarly, in the 1990s the story of browser compatibility and the World Wide Web Consortium (W3C) standards is another recapitulation.
16 Eric Raymond, "Origins and History of Unix, 1969–1995," The Art of UNIX Programming, http://www.faqs.org/docs/artu/ch02s01.html#id2880014.
17 Libes and Ressler, Life with UNIX, 22. Also noted in Tanenbaum, "The UNIX Marketplace in 1987," 419.
19 A case might be made that a third definition, the ANSI standard for the C programming language, also covered similar ground, which of course it would have had to in order to allow applications written on one operating system to be compiled and run on another (see Gray, Open Systems, 55–58; Libes and Ressler, Life with UNIX, 70–75).
21 Thomas C. Hayes, "AT&T's Unix Is a Hit at Last, and Other Companies Are Wary," New York Times, 24 February 1988, D8.
23 Andrew Pollack, "Computer Gangs Stake Out Turf," New York Times, 13 December 1988, D1. See also Evelyn Richards, "Computer Firms Get a Taste of 'Gang Warfare,'" Washington Post, 11 December 1988, K1; Brit Hume, "IBM, Once the Bully on the Block, Faces a Tough New PC Gang," Washington Post, 3 October 1988, E24.
24 "What Is Unix?" The Unix System, http://www.unix.org/what_is_unix/history_timeline.html.
25 "About the Open Group," The Open Group, http://www.opengroup.org/overview/vision-mission.htm.
26 "What Is Unix?" The Unix System, http://www.unix.org/what_is_unix/history_timeline.html.
27 Larry McVoy was an early voice, within Sun, arguing for solving the open-systems problem by turning to Free Software. Larry McVoy, "The Sourceware Operating System Proposal," 9 November 1993, http://www.bitmover.com/lm/papers/srcos.html.
28 The distinction between a protocol, an implementation, and a standard is important. Protocols are descriptions of the precise terms by which two computers can communicate (i.e., a dictionary and a handbook for communicating). An implementation is the creation of software that uses a protocol (i.e., actually does the communicating); thus two implementations using the same protocol should be able to share data. A standard defines which protocol should be used by which computers, for what purposes. It may or may not define the protocol, but it will set limits on changes to that protocol.
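A toy sketch can make this distinction concrete (everything below is illustrative, not any real protocol). The *protocol* here says: a message is a 4-byte big-endian length followed by that many bytes of UTF-8 text. Two independently written *implementations* of that protocol can still exchange data.

```python
import struct

# Protocol (a description, not software): frame = 4-byte big-endian
# length, followed by that many bytes of UTF-8 text.

def encode_a(text):
    """Implementation A: builds a frame using the struct module."""
    payload = text.encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_b(frame):
    """Implementation B: parses the same frame with plain arithmetic."""
    length = int.from_bytes(frame[:4], "big")
    return frame[4:4 + length].decode("utf-8")

# Two implementations, one protocol: A's output is intelligible to B.
print(decode_b(encode_a("open systems")))  # open systems
```

A *standard*, in turn, would be a further rule, e.g., "all machines on this network must frame messages this way," constraining which protocol is used without itself being software.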
29 The advantages of such an unplanned and unpredictable network have come to be identified in hindsight as a design principle. See Gillespie, "Engineering a Principle," for an excellent analysis of the history of "end to end" or "stupid" networks.
33 On Usenet, see Hauben and Hauben, Netizens. See also Pfaffenberger, "'A Standing Wave in the Web of Our Communications.'"
36 There is little information on the development of open systems; there is, however, a brief note from William Stallings, author of perhaps the most widely used textbook on networking, at "The Origins of OSI," http://williamstallings.com/Extras/OSI.html.
37 Brock, The Second Information Revolution, is a good introductory source for this conflict, at least in its policy outlines. The Federal Communications Commission issued two decisions (known as "Computer 1" and "Computer 2") that attempted to deal with this conflict by trying to define what counted as voice communication and what as data.
41 The usable past of Giordano Bruno is invoked by Malamud to signal the heretical nature of his own commitment to openly publishing standards that ISO was opposed to releasing. Bruno's fate at the hands of the Roman Inquisition hinged in some part on his acceptance of the Copernican cosmology, so he has been, like Galileo, a natural figure for revolutionary claims during the 1990s.
42 Abbate, Inventing the Internet; Salus, Casting the Net; Galloway, Protocol; and Brock, The Second Information Revolution. For practitioner histories, see Kahn et al., "The Evolution of the Internet as a Global Information System"; Clark, "The Design Philosophy of the DARPA Internet Protocols."
43 Kahn et al., "The Evolution of the Internet as a Global Information System," 134–40; Abbate, Inventing the Internet, 114–36.
49 This can be clearly seen, for instance, by comparing the various editions of the main computer-networking textbooks: cf. Tanenbaum, Computer Networks, 1st ed. (1981), 2d ed. (1988), 3d ed. (1996), and 4th ed. (2003); Stallings, Data and Computer Communications, 1st ed. (1985), 2d ed. (1991), 3d ed. (1994), 4th ed. (1997), and 5th ed. (2004); and Comer, Internetworking with TCP/IP (four editions between 1991 and 1999).
51 The structure of the IETF, the Internet Architecture Board, and the ISOC is detailed in Comer, Internetworking with TCP/IP, 8–13; also in Schmidt and Werle, Coordinating Technology, 53–58.
6. Writing Copyright Licenses
1 The legal literature on Free Software expands constantly and quickly, and it addresses a variety of different legal issues. Two excellent starting points are Vetter, "The Collaborative Integrity of Open-Source Software" and "'Infectious' Open Source Software."
3 "The GNU General Public License, Version 2.0," http://www.gnu.org/licenses/old-licenses/gpl-2.0.html.
4 All existing accounts of the hacker ethic come from two sources: from Stallman himself and from the colorful and compelling chapter about Stallman in Steven Levy's Hackers. Both acknowledge a prehistory to the ethic. Levy draws it back in time to the MIT Tech Model Railroad Club of the 1950s, while Stallman is more likely to describe it as reaching back to the scientific revolution or earlier. The stories of early hackerdom at MIT are avowedly Edenic, and in them hackers live in a world of uncontested freedom and collegial competition, something like a writer's commune without the alcohol or the brawling. There are stories about a printer whose software needed fixing but was only available under a nondisclosure agreement; about a requirement to use passwords (Stallman refused, chose <return> as his password, and hacked the system to encourage others to do the same); about a programming war between different LISP machines; and about the replacement of the Incompatible Time-Sharing System with DEC's TOPS-20 ("Twenex") operating system. These stories are oft-told usable pasts, but they are not representative. Commercial constraints have always been part of academic life in computer science and engineering: hardware and software were of necessity purchased from commercial manufacturers and often controlled by them, even if they offered "academic" or "educational" licenses.
7 Copyright Act of 1976, Pub. L. No. 94–553, 90 Stat. 2541, enacted 19 October 1976; and Copyright Amendments, Pub. L. No. 96–517, 94 Stat. 3015, 3028 (amending §101 and §117, title 17, United States Code, regarding computer programs), enacted 12 December 1980. All amendments since 1976 are listed at http://www.copyright.gov/title17/92preface.html.
8 The history of copyright and software is discussed in Litman, Digital Copyright; Cohen et al., Copyright in a Global Information Economy; and Merges, Menell, and Lemley, Intellectual Property in the New Technological Age.
9 See Wayner, Free for All; Moody, Rebel Code; and Williams, Free as in Freedom. Although this story could be told simply by interviewing Stallman and James Gosling, both of whom are still alive and active in the software world, I have chosen to tell it through a detailed analysis of the Usenet and Arpanet archives of the controversy. The trade-off is between a kind of incomplete, fly-on-the-wall access to a moment in history and the likely revisionist retellings of those who lived through it. All of the messages referenced here are cited by their "Message-ID," which should allow anyone interested to access the original messages through Google Groups (http://groups.google.com).
10 Eugene Ciccarelli, "An Introduction to the EMACS Editor," MIT Artificial Intelligence Laboratory, AI Lab Memo no. 447, 1978, 2.
11 Richard Stallman, "EMACS: The Extensible, Customizable Self-documenting Display Editor," MIT Artificial Intelligence Laboratory, AI Lab Memo no. 519a, 26 March 1981, 19. Also published as Richard M. Stallman, "EMACS: The Extensible, Customizable Self-documenting Display Editor," Proceedings of the ACM SIGPLAN SIGOA Symposium on Text Manipulation, 8–10 June (ACM, 1981), 147–56.
12 Richard Stallman, "EMACS: The Extensible, Customizable Self-documenting Display Editor," MIT Artificial Intelligence Laboratory, AI Lab Memo no. 519a, 26 March 1981, 24.
13 Richard M. Stallman, "EMACS Manual for ITS Users," MIT Artificial Intelligence Laboratory, AI Lab Memo no. 554, 22 October 1981, 163.
14 Back in January of 1983, Steve Zimmerman had announced that the company he worked for, CCA, had created a commercial version of EMACS called CCA EMACS (Message-ID: firstname.lastname@example.org). Zimmerman had not written this version entirely, but had taken a version written by Warren Montgomery at Bell Labs (written for UNIX on PDP-11s) and created the version that was being used by programmers at CCA. Zimmerman had apparently distributed it by ftp at first, but when CCA determined that it might be worth something, they decided to exploit it commercially, rather than letting Zimmerman distribute it "freely." By Zimmerman's own account, this whole procedure required ensuring that there was nothing left of the original code by Warren Montgomery that Bell Labs owned (Message-ID: email@example.com).
21 Various other people seem to have conceived of a similar scheme around the same time (if the Usenet archives are any guide), including Guido van Rossum (who would later become famous for the creation of the Python scripting language). The following is from Message-ID: firstname.lastname@example.org:
/* This software is copyright (c) Mathematical Centre, Amsterdam,
 * Permission is granted to use and copy this software, but not for
 * profit,
 * and provided that these same conditions are imposed on any person
 * receiving or using the software. */
24 The main file of the controversy was called display.c. A version that was modified by Chris Torek appears in net.sources, Message-ID: email@example.com. A separate example of a piece of code written by Gosling bears a note that claims he had declared it public domain, but did not "include the infamous Stallman anti-copyright clause" (Message-ID: firstname.lastname@example.org).
33 With the benefit of hindsight, the position that software could be in the public domain also seems legally uncertain, given that the 1976 changes to USC §17 abolished the requirement to register, and thus rendered uncertain the status of code contributed to Gosling and incorporated into GOSMACS.
39 Joaquim Martillo, Message-ID: email@example.com: "Trying to forbid RMS from using discarded code so that he must spend time to reinvent the wheel supports his contention that 'software hoarders' are slowing down progress in computer science."
40 Diamond v. Diehr, 450 U.S. 175 (1981), was the Supreme Court decision that forced the patent office to grant patents on software. Interestingly, software patents had been granted much earlier, but went either uncontested or unenforced. An excellent example is patent 3,568,156, held by Ken Thompson, on regular expression pattern matching, granted in 1971.
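The technique at issue in Thompson's 1971 patent, matching text against a regular expression, is now a routine library feature in most programming languages. A minimal Python illustration (this shows the general technique only, not the specific matching method claimed in the patent):

```python
import re

# The pattern "ab*c" matches an 'a', then zero or more 'b's, then a 'c'.
pattern = re.compile(r"ab*c")

print(bool(pattern.fullmatch("abbbc")))  # True:  a, three b's, c
print(bool(pattern.fullmatch("axc")))    # False: 'x' breaks the pattern
```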
41 Calvin Mooers, in his 1975 article "Computer Software and Copyright," suggests that the IBM unbundling decision opened the doors to thinking about copyright protection.
43 Gosling's EMACS 264 (Stallman copied EMACS 84) is registered with the Library of Congress, as is GNU EMACS 15.34. Gosling's EMACS Library of Congress registration number is TX-3-407-458, registered in 1992. Stallman's registration number is TX-1-575-302, registered in May 1985. The listed dates are uncertain, however, since there are periodic re-registrations and updates.
46 A standard practice well into the 1980s, and even later, was the creation of so-called clean-room versions of software, in which new programmers and designers who had not seen the offending code were hired to re-implement it in order to avoid the appearance of trade-secret violation. Copyright law is a strict liability law, meaning that ignorance does not absolve the infringer, so the practice of "clean-room engineering" seems not to have been as successful in the case of copyright, as the meaning of infringement remains murky.
47 Message-ID: firstname.lastname@example.org. AT&T was less concerned about copyright infringement than they were about the status of their trade secrets. Zimmerman quotes a statement (from Message-ID: email@example.com) that he claims indicates this: "Beginning with CCA EMACS version 162.36z, CCA EMACS no longer contained any of the code from Mr. Montgomery's EMACS, or any methods or concepts which would be known only by programmers familiar with BTL [Bell Labs] EMACS of any version." The statement did not mention copyright, but implied that CCA EMACS did not contain any AT&T trade secrets, thus preserving their software's trade-secret status. The fact that EMACS was a conceptual design (a particular kind of interface, a LISP interpreter, and extensibility) that was very widely imitated had no apparent bearing on the legal status of these secrets.
49 The cases that determine the meaning of the 1976 and 1980 amendments begin around 1986: Whelan Associates, Inc. v. Jaslow Dental Laboratory, Inc., et al., U.S. Third Circuit Court of Appeals, 4 August 1986, 797 F.2d 1222, 230 USPQ 481, affirming that the "structure (or sequence or organization)" of software is copyrightable, not only the literal software code; Computer Associates International, Inc. v. Altai, Inc., U.S. Second Circuit Court of Appeals, 22 June 1992, 982 F.2d 693, 23 USPQ 2d 1241, arguing that the structure test in Whelan was not sufficient to determine infringement and thus proposing a three-part "abstraction-filtration-comparison" test; Apple Computer, Inc. v. Microsoft Corp., U.S. Ninth Circuit Court of Appeals, 1994, 35 F.3d 1435, finding that the "desktop metaphor" used in Macintosh and Windows was not identical and thus did not constitute infringement; Lotus Development Corporation v. Borland International, Inc. (94-2003), 1996, 513 U.S. 233, finding that the "look and feel" of a menu interface was not copyrightable.
50 The relationship between the definition of source and target befuddles software law to this day, one of the most colorful examples being the DeCSS case. See Coleman, "The Social Construction of Freedom," chap. 1; Gallery of CSS Descramblers, http://www.cs.cmu.edu/~dst/DeCSS/gallery/.
51 An interesting addendum here is that the manual for EMACS was also released at around the same time as EMACS 16 and was available as a TeX file. Stallman also attempted to deal with the paper document in the same fashion (see Message-ID: firstname.lastname@example.org, 19 July 1985), and this would much later become a different and trickier issue that would result in the GNU Free Documentation License.
54 See Coleman, "The Social Construction of Freedom," chap. 6, on the Debian New Maintainer Process, for an example of how induction into a Free Software project stresses the legal as much as the technical, if not more.
7. Coordinating Collaborations
1 Research on coordination in Free Software forms the central core of recent academic work. Two of the most widely read pieces, Yochai Benkler's "Coase's Penguin" and Steven Weber's The Success of Open Source, are directed at classic research questions about collective action. Rishab Ghosh's "Cooking Pot Markets" and Eric Raymond's The Cathedral and the Bazaar set many of the terms of debate. Josh Lerner and Jean Tirole's "Some Simple Economics of Open Source" was an early contribution. Other important works on the subject are Feller et al., Perspectives on Free and Open Source Software; Tuomi, Networks of Innovation; Von Hippel, Democratizing Innovation.
2 On the distinction between adaptability and adaptation, see Federico Iannacci, "The Linux Managing Model," http://opensource.mit.edu/papers/iannacci2.pdf. Matt Ratto characterizes the activity of Linux-kernel developers as a "culture of re-working" and a "design for re-design," and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the "pressure of openness" that "results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions" ("The Pressure of Openness," 112–38).
3 Linux is often called an operating system, which Stallman objects to on the theory that a kernel is only one part of an operating system. Stallman suggests that it be called GNU/Linux to reflect the use of GNU operating-system tools in combination with the Linux kernel. This not-so-subtle ploy to take credit for Linux reveals the complexity of the distinctions. The kernel is at the heart of hundreds of different "distributions" (such as Debian, Red Hat, SuSE, and Ubuntu Linux), all of which also use GNU tools, but which are often collections of software larger than just an operating system. Everyone involved seems to have an intuitive sense of what an operating system is (thanks to the pedagogical success of UNIX), but few can draw any firm lines around the object itself.
4 Eric Raymond directed attention primarily to Linux in The Cathedral and the Bazaar. Many other projects preceded Torvalds's kernel, however, including the tools that form the core of both UNIX and the Internet: Paul Vixie's implementation of the Domain Name System (DNS) known as BIND; Eric Allman's sendmail for routing e-mail; the scripting languages perl (created by Larry Wall), python (Guido van Rossum), and tcl/tk (John Ousterhout); the X Windows research project at MIT; and the derivatives of the original BSD UNIX, FreeBSD and OpenBSD. On the development model of FreeBSD, see Jorgensen, "Putting It All in the Trunk" and "Incremental and Decentralized Integration in FreeBSD." The story of the genesis of Linux is very nicely told in Moody, Rebel Code, and Williams, Free as in Freedom; there are also a number of papers, available through the Free/Opensource Research Community, http://freesoftware.mit.edu/, that analyze the development dynamics of the Linux kernel. See especially Ratto, "Embedded Technical Expression" and "The Pressure of Openness." I have conducted much of my analysis of Linux by reading the Linux Kernel Mailing List archives, http://lkml.org. There are also annotated summaries of the Linux Kernel Mailing List discussions at http://kerneltraffic.org.
5 Howard Rheingold, The Virtual Community. On the prehistory of this period and the cultural location of some key aspects, see Turner, From Counterculture to Cyberculture.
6 Julian Dibbell's "A Rape in Cyberspace" and Sherry Turkle's Life on the Screen are two classic examples of the detailed forms of life and collaborative ethical creation that preoccupied denizens of these worlds.
7 The yearly influx of students to the Usenet and Arpanet in September earned that month the title "the longest month," due to the need to train new users in use and etiquette on the newsgroups. Later in the 1990s, when AOL allowed subscribers access to the Usenet hierarchy, it became known as "eternal September." See "September that Never Ended," Jargon File, http://catb.org/~esr/jargon/html/S/September-that-never-ended.html.
10 Indeed, initially, Torvalds's terms of distribution for Linux were more restrictive than the GPL, including limitations on distributing it for a fee or for handling costs. Torvalds eventually loosened the restrictions and switched to the GPL in February 1992. Torvalds's release notes for Linux 0.12 say, "The Linux copyright will change: I've had a couple of requests to make it compatible with the GNU copyleft, removing the 'you may not distribute it for money' condition. I agree. I propose that the copyright be changed so that it conforms to GNU, pending approval of the persons who have helped write code. I assume this is going to be no problem for anybody: If you have grievances ('I wrote that code assuming the copyright would stay the same') mail me. Otherwise The GNU copyleft takes effect as of the first of February. If you do not know the gist of the GNU copyright, read it" (http://www.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.12).
14 Quoted in Zack Brown, "Kernel Traffic #146 for 17Dec2001," Kernel Traffic, http://www.kerneltraffic.org/kernel-traffic/kt20011217_146.html; also quoted in Federico Iannacci, "The Linux Managing Model," http://opensource.mit.edu/papers/iannacci2.pdf.
16 The original Apache Group included Brian Behlendorf, Roy T. Fielding, Rob Harthill, David Robinson, Cliff Skolnick, Randy Terbush, Robert S. Thau, Andrew Wilson, Eric Hagberg, Frank Peters, and Nicolas Pioch. The mailing list new-httpd eventually became the Apache developers list. The archives are available at http://mail-archives.apache.org/mod_mbox/httpd-dev/ and cited hereafter as "Apache developer mailing list," followed by sender, subject, date, and time.
17 For another version of the story, see Moody, Rebel Code, 127–28. The official story honors the Apache Indian tribes for "superior skills in warfare strategy and inexhaustible endurance." Evidence of the concern of the original members over the use of the name is clearly visible in the archives of the Apache project. See esp. Apache developer mailing list, Robert S. Thau, Subject: The political correctness question . . . , 22 April 1995, 21:06 EDT.
18 Mockus, Fielding, and Herbsleb, "Two Case Studies of Open Source Software Development," 3.
19 Apache developer mailing list, Andrew Wilson, Subject: Re: httpd patch B5 updated, 14 March 1995, 21:49 GMT.
20 Apache developer mailing list, Rob Harthill, Subject: Re: httpd patch B5 updated, 14 March 1995, 15:10 MST.
21 Apache developer mailing list, Cliff Skolnick, Subject: Process (please read), 15 March 1995, 3:11 PST; and Subject: Patch file format, 15 March 1995, 3:40 PST.
22 Apache developer mailing list, Rob Harthill, Subject: patch list vote, 15 March 1995, 13:21:24 MST.
23 Apache developer mailing list, Rob Harthill, Subject: apache-0.2 on hyperreal, 18 March 1995, 18:46 MST.
24 Apache developer mailing list, Cliff Skolnick, Subject: Re: patch list vote, 21 March 1995, 2:47 PST.
25 Apache developer mailing list, Paul Richards, Subject: Re: vote counting, 21 March 1995, 22:24 GMT.
28 Apache developer mailing list, Robert S. Thau, Subject: My Garage Project, 12 June 1995, 21:14 GMT.
30 Apache developer mailing list, Rob Harthill, Subject: Re: Shambhala, 30 June 1995, 14:50 MDT.
31 Apache developer mailing list, Rob Harthill, Subject: Re: Shambhala, 30 June 1995, 16:48 GMT.
32 Gabriella Coleman captures this nicely in her discussion of the tension between the individual virtuosity of the hacker and the corporate populism of groups like Apache or, in her example, the Debian distribution of Linux. See Coleman, The Social Construction of Freedom.
33 Apache developer mailing list, Robert S. Thau, Subject: Re: Shambhala, 1 July 1995, 14:42 EDT.
34 A slightly different explanation of the role of modularity is discussed in Steven Weber, The Success of Open Source, 173–75.
36 See Steven Weber, The Success of Open Source, 117–19; Moody, Rebel Code, 172–78. See also Shaikh and Cornford, "Version Management Tools."
37 Linus Torvalds, "Re: [PATCH] Remove Bitkeeper Documentation from Linux Tree," 20 April 2002, http://www.uwsg.indiana.edu/hypermail/linux/kernel/0204.2/1018.html. Quoted in Shaikh and Cornford, "Version Management Tools."
38 Andrew Orlowski, "'Cool it, Linus'—Bruce Perens," Register, 15 April 2005, http://www.theregister.co.uk/2005/04/15/perens_on_torvalds/page2.html.
39 Similar debates have regularly appeared around the use of non-free compilers, non-free debuggers, non-free development environments, and so forth. There are, however, a large number of people who write and promote Free Software that runs on proprietary operating systems like Macintosh and Windows, as well as a distinction between tools and formats. So, for instance, using Adobe Photoshop to create icons is fine so long as they are in standard open formats like PNG or JPG, and not proprietary formats such as GIF or Photoshop.
43 Gabriella Coleman, in "The Social Construction of Freedom," provides an excellent example of a programmer's frustration with font-lock in EMACS, something that falls in between a bug and a feature. The programmer's frustration is directed at the stupidity of the design (and implied designers), but his solution is not a fix but a work-around, and it illustrates how debugging does not always imply collaboration.
8. "If We Succeed, We Will Disappear"
1 In January 2005, when I first wrote this analysis, this was true. By April 2006, the Hewlett Foundation had convened the Open Educational Resources "movement" as something that would transform the production and circulation of textbooks like those created by Connexions. Indeed, in Rich Baraniuk's report for Hewlett, the first paragraph reads: "A grassroots movement is on the verge of sweeping through the academic world. The open education movement is based on a set of intuitions that are shared by a remarkably wide range of academics: that knowledge should be free and open to use and re-use; that collaboration should be easier, not harder; that people should receive credit and kudos for contributing to education and research; and that concepts and ideas are linked in unusual and surprising ways and not the simple linear forms that textbooks present. Open education promises to fundamentally change the way authors, instructors, and students interact worldwide" (Baraniuk and King, "Connexions"). (In a nice confirmation of just how embedded participation can become in anthropology, Baraniuk cribbed the second sentence from something I had written two years earlier as part of a description of what I thought Connexions hoped to achieve.) The "movement" as such still does not quite exist, but the momentum for it is clearly part of the actions that Hewlett hopes to achieve.
2 See Chris Beam, "Fathom.com Shuts Down as Columbia Withdraws," Columbia Spectator, 27 January 2003, http://www.columbiaspectator.com/. Also see David Noble's widely read critique, "Digital Diploma Mills."
3 "Provost Announces Formation of Council on Educational Technology," MIT Tech Talk, 29 September 1999, http://web.mit.edu/newsoffice/1999/council-0929.html.
4 The software consists of a collection of different Open Source Software cobbled together to provide the basic platform (the Zope and Plone content-management frameworks, the PostgreSQL database, the Python programming language, and the CVS version-control software).
5 The most significant exception has been the issue of tools for authoring content in XML. For most of the life of the Connexions project, the XML mark-up language has been well-defined and clear, but there has been no way to write a module in XML, short of directly writing the text and the tags in a text editor. For all but a very small number of possible users, this feels too much like programming, and they experience it as too frustrating to be worth it. The solution (albeit temporary) was to encourage users to make use of a proprietary XML editor (like a word processor, but capable of creating XML content). Indeed, the Connexions project's devotion to openness was tested by one of the most important decisions its participants made: to pursue the creation of an Open Source XML text editor in order to provide access to completely open tools for creating completely open content.
7 The movement is the component that remains unmodulated: there is no "free textbook" movement associated with Connexions, even though many of the same arguments that lead to a split between Free Software and Open Source occur here: the question of whether the term free is confusing, for example, or the role of for-profit publishers or textbook companies. In the end, most (though not all) of the Connexions staff and many of its users are content to treat it as a useful tool for composing novel kinds of digital educational material, not as a movement for the liberation of educational content.
9 Lessig's output has been prodigious. His books include Code and Other Laws of Cyberspace, The Future of Ideas, Free Culture, and Code: Version 2.0. He has also written a large number of articles and is an active blogger (http://www.lessig.org/blog/).
10 There were few such projects under way, though there were many in the planning stages. Within a year, the Public Library of Science had launched itself, spearheaded by Harold Varmus, the former director of the National Institutes of Health. At the time, however, the only other large scholarly project was the MIT OpenCourseWare project, which, although it had already agreed to use Creative Commons licenses, had demanded a peculiar one-off license.
11 The fact that I organized a workshop to which I invited "informants" and to which I subsequently refer as research might strike some, both in anthropology and outside it, as wrong. But it is precisely the kind of occasion I would argue has become central to the problematics of method in cultural anthropology today. On this subject, see Holmes and Marcus, "Cultures of Expertise and the Management of Globalization." Such strategic and seemingly ad hoc participation does not exclude one from attempting to later disentangle oneself from such participation, in order to comment on the value and significance, and especially to offer critique. Such is the attempt to achieve objectivity in social science, an objectivity that goes beyond the basic notions of bias and observer-effect so common in the social sciences. "Objectivity" in a broader social sense includes the observation of the conceptual linkages that both precede such a workshop (constituted the need for it to happen) and follow on it, independent of any particular meeting. The complexity of mobilizing objectivity in discussions of the value and significance of social or economic phenomena was well articulated a century ago by Max Weber, and problems of method in the sense raised by him seem to me to be no less fraught today. See Max Weber, "Objectivity in the Social Sciences."
12 Suntrust v. Houghton Mifflin Co., U.S. Eleventh Circuit Court of Appeals, 2001, 252 F. 3d 1165.
13 Neil Strauss, "An Uninvited Bassist Takes to the Internet," New York Times, 25 August 2002, sec. 2, 23.
14 Indeed, in a more self-reflective moment, Glenn once excitedly wrote to me to explain that what he was doing was "code-switching" and that he thought that geeks who constantly involved themselves in technology, law, music, gaming, and so on would be prime case studies for a code-switching study by anthropologists.
17 Hence, Boyle's "Second Enclosure Movement" and "copyright conservancy" concepts (see Boyle, "The Second Enclosure Movement"; Bollier, Silent Theft). Perhaps the most sophisticated and compelling expression of the institutional-economics approach to understanding Free Software is the work of Yochai Benkler, especially "Sharing Nicely" and "Coase's Penguin." See also Benkler, Wealth of Networks.
20 In particular, Glenn Brown suggested Oliver Wendell Holmes as a kind of origin point both for critical legal realism and for law and economics, a kind of filter through which lawyers get both their Nietzsche and their liberalism (see Oliver Wendell Holmes, "The Path of the Law"). Glenn's opinion was that what he called "punting to culture" (by which he meant writing minimalist laws which allow social custom to fill in the details) descended more or less directly from the kind of legal reasoning embodied in Holmes: "Note that [Holmes] is probably best known in legal circles for arguing that questions of morality be removed from legal analysis and left to the field of ethics. this is what makes him the godfather of both the posners of the world, and the crits, and the strange hybrids like lessig" (Glenn Brown, personal communication, 11 August 2003).
9. Reuse, Modification, Norms
1 Actor-network theory comes closest to dealing with such "ontological" issues as, for example, airplanes in John Law's Aircraft Stories, the disease atherosclerosis in Annemarie Mol's The Body Multiple, or in vitro fertilization in Charis Thompson's Making Parents. The focus here on finality is closely related, but aims at revealing the temporal characteristics of highly modifiable kinds of knowledge-objects, like textbooks or databases, as in Geoffrey Bowker's Memory Practices in the Sciences.
3 See Johns, The Nature of the Book; Eisenstein, The Printing Press as an Agent of Change; McLuhan, The Gutenberg Galaxy and Understanding Media; Febvre and Martin, The Coming of the Book; Ong, Ramus, Method, and the Decay of Dialogue; Chartier, The Cultural Uses of Print in Early Modern France and The Order of Books; Kittler, Discourse Networks 1800/1900 and Gramophone, Film, Typewriter.
4 There is less communication between the theorists and historians of copyright and authorship and those of the book; the former are also rich in analyses, such as Jaszi and Woodmansee, The Construction of Authorship; Mark Rose, Authors and Owners; St. Amour, The Copywrights; Vaidhyanathan, Copyrights and Copywrongs.
5 Eisenstein, The Printing Press as an Agent of Change. Eisenstein's work makes direct reference to McLuhan's thesis in The Gutenberg Galaxy, and Latour relies on these works and others in "Drawing Things Together."
7 On this subject, cf. Pablo Boczkowski's study of the digitization of newspapers, Digitizing the News.
8 Conventional here is actually quite historically proximate: the system creates a PDF document by translating the XML document into a LaTeX document, then into a PDF document. LaTeX has been, for some twenty years, a standard text-formatting and typesetting language used by some sectors of the publishing industry (notably mathematics, engineering, and computer science). Were it not for the existence of this standard from which to bootstrap, the Connexions project would have faced a considerably more difficult challenge, but much of the infrastructure of publishing has already been partially transformed into a computer-mediated and -controlled system whose final output is a printed book. Later in Connexions's lifetime, the group coordinated with an Internet-publishing startup called Qoop.com to take the final step and make Connexions courses available as print-on-demand, cloth-bound textbooks, complete with ISBNs and back-cover blurbs.
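The two-step pipeline described in this note (XML source translated to LaTeX, which is then compiled to PDF) can be illustrated with a minimal sketch. The tiny `<module>` schema and element names below are hypothetical illustrations, not Connexions's actual CNXML markup, and the final LaTeX-to-PDF compilation is only indicated in a comment since it requires an external engine such as pdflatex.

```python
# Sketch of an XML -> LaTeX translation step, under an assumed toy schema.
import xml.etree.ElementTree as ET

def module_to_latex(xml_text: str) -> str:
    """Translate a tiny XML module into a LaTeX document string."""
    root = ET.fromstring(xml_text)
    title = root.findtext("title", default="Untitled")
    # Each <para> element becomes a LaTeX paragraph.
    paras = [p.text or "" for p in root.findall("para")]
    body = "\n\n".join(paras)
    return (
        "\\documentclass{article}\n"
        f"\\title{{{title}}}\n"
        "\\begin{document}\n"
        "\\maketitle\n"
        f"{body}\n"
        "\\end{document}\n"
    )

latex = module_to_latex(
    "<module><title>Signals</title>"
    "<para>A signal is a function.</para></module>"
)
# In the full pipeline, the LaTeX string would be written to disk and
# compiled to PDF, e.g.: subprocess.run(["pdflatex", "module.tex"])
```

The point of the bootstrap is that only the first step had to be written from scratch; the LaTeX-to-PDF stage already existed as mature, standard infrastructure.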
10 On fixity, see Eisenstein's The Printing Press as an Agent of Change, which cites McLuhan's The Gutenberg Galaxy. The stability of texts is also questioned routinely by textual scholars, especially those who work with manuscripts and complicated variora (for an excellent introduction, see Bornstein and Williams, Palimpsest). Michel Foucault's "What Is an Author?" addresses a related but orthogonal problematic and is unconcerned with the relatively sober facts of a changing medium.
11 A salient and recent point of comparison can be found in the form of Lawrence Lessig's "second edition" of his book Code, which is titled Code: Version 2.0 (version is used in the title, but edition is used in the text). The first book was published in 1999 ("ancient history in Internet time"), and Lessig convinced the publisher to make it available as a wiki, a collaborative Web site which can be directly edited by anyone with access. The wiki was edited and updated by hordes of geeks, then "closed" and reedited into a second edition with a new preface. It is a particularly tightly controlled example of collaboration; although the wiki and the book were freely available, the modification and transformation of them did not amount to a simple free-for-all. Instead, Lessig leveraged his own authority, his authorial voice, and the power of Basic Books to create something that looks very much like a traditional second edition, although it was created by processes unimaginable ten years ago.
12 The most familiar comparison is Wikipedia, which was started after Connexions, but grew far more quickly and dynamically, largely due to the ease of use of the system (a bone of some contention among the Connexions team). Wikipedia has come under assault primarily for being unreliable. The suspicion and fear that surround Wikipedia are similar to those that face Connexions, but in the case of Wikipedia entries, the commitment to openness is stubbornly meritocratic: any article can be edited by anyone at any time, and it matters not how firmly one is identified as an expert by rank, title, degree, or experience: a twelve-year-old's knowledge of the Peloponnesian War is given the same access and status as an eighty-year-old classicist's. Articles are not owned by individuals, and all work is pseudonymous and difficult to track. The range of quality is therefore great, and the mainstream press has focused largely on whether Wikipedia is more or less reliable than conventional encyclopedias, not on the process of knowledge production. See, for instance, George Johnson, "The Nitpicking of the Masses vs. the Authority of the Experts," New York Times, 3 January 2006, Late Edition—Final, F2; Robert McHenry, "The Faith-based Encyclopedia," TCS Daily, 15 November 2004, http://www.techcentralstation.com/111504A.html.
13 Again, a comparison with Wikipedia is apposite. Wikipedia is, morally speaking, and especially in the persona of its chief editor, Jimbo Wales, totally devoted to merit-based equality, with users getting no special designation beyond the amount and perceived quality of the material they contribute. Degrees or special positions of employment are anathema. It is a quintessentially American, anti-intellectual-fueled, Horatio Alger–style approach in which the slate is wiped clean and contributors are given a chance to prove themselves independent of background. Connexions, by contrast, draws specifically from the ranks of intellectuals or academics and seeks to replace the infrastructure of publishing. Wikipedia is interested only in creating a better encyclopedia. In this respect, it is transhumanist in character, attributing its distinctiveness and success to the advances in technology (the Internet, wiki, broadband connections, Google). Connexions on the other hand is more polymathic, devoted to intervening into the already complexly constituted organizational practice of scholarship and academia.
14 An even more technical feature concerned the issue of the order of authorship. The designers at first decided to allow Connexions to simply display the authors in alphabetical order, a practice adopted by some disciplines, like computer science. However, in the case of the Housman example this resulted in what looked like a module authored principally by me, and only secondarily by A. E. Housman. And without the ability to explicitly designate order of authorship, many disciplines had no way to express their conventions along these lines. As a result, the system was redesigned to allow users to designate the order of authorship as well.
15 I refer here to Eric Raymond's "discovery" that hackers possess unstated norms that govern what they do, in addition to the legal licenses and technical practices they engage in (see Raymond, "Homesteading the Noosphere"). For a critique and background on hacker ethics and norms, see Coleman, "The Social Construction of Freedom."
16 Bruno Latour's Science in Action makes a strong case for the centrality of "black boxes" in science and engineering for precisely this reason.
17 I should note, in my defense, that my efforts to get my informants to read Max Weber, Ferdinand Tönnies, Henry Maine, or Emile Durkheim proved far less successful than my creation of nice Adobe Illustrator diagrams that made explicit the reemergence of issues addressed a century ago. It was not for lack of trying, however.
20 In December 2006 Creative Commons announced a set of licenses that facilitate the "follow-up" licensing of a work, especially one initially issued under a noncommercial license.
21 Message from the cc-sampling mailing list, Glenn Brown, Subject: BACKGROUND: "AS APPROPRIATE TO THE MEDIUM, GENRE, AND MARKET NICHE," 23 May 2003, http://lists.ibiblio.org/pipermail/cc-sampling/2003-May/000004.html.
22 Sampling offers a particularly clear example of how Creative Commons differs from the existing practice and infrastructure of music creation and intellectual-property law. The music industry has long recognized the fact of sampling as something musicians do and has attempted to deal with it by making it an explicit economic practice; the music industry thus encourages sampling by facilitating the sale between labels and artists of rights to make a sample. Record companies will negotiate the length, quality, and quantity of sampling and settle on a price.
This practice is set opposite the assumption, also codified in law, that the public has a right to a fair use of copyrighted material without payment or permission. Sampling a piece of music might seem to fall into this category of use, except that one of the tests of fair use is that the use not impact any existing market for such uses, and the fact that the music industry has effectively created a market for the buying and selling of samples means that sampling now routinely falls outside the fair uses codified in the statute. Creative Commons licenses, on the other hand, say that owners should be able to designate their material as "sample-able," to give permission ahead of time, and by this practice to encourage others to do the same. They give an "honorable" meaning to the practice of sampling for free, rather than the dishonorable one created by the industry. It thus becomes a war over the meaning of norms, in the law-and-economics language of Creative Commons and its founders.
1 See http://cnx.org, http://www.creativecommons.org, http://www.earlham.edu/~peters/fos/overview.htm, http://www.biobricks.org, http://www.freebeer.org, http://freeculture.org, http://www.cptech.org/a2k, http://www.colawp.com/colas/400/cola467_recipe.html, http://www.elephantsdream.org, http://www.sciencecommons.org, http://www.plos.org, http://www.openbusiness.cc, http://www.yogaunity.org, http://osdproject.com, http://www.hewlett.org/Programs/Education/oer/, and http://olpc.com.
3 See especially Christen, "Tracking Properness" and "Gone Digital"; Brown, Who Owns Native Culture? and "Heritage as Property." Crowdsourcing fits into other novel forms of labor arrangements, ranging from conventional outsourcing and off-shoring to newer forms of bodyshopping and "virtual migration" (see Aneesh, Virtual Migration; Xiang, "Global Bodyshopping").
Posted by Christopher Kelty on May 8, 2008