Scholarship@Cornell Law: A Digital Repository

The Internet is a semicommons. Private property in servers and network links coexists with a shared communications platform. This distinctive combination both explains the Internet's enormous success and illustrates some of its recurring problems. Building on Henry Smith's theory of the semicommons in the medieval open-field system, this essay explains how the dynamic interplay between private and common uses on the Internet enables it to facilitate worldwide sharing and collaboration without collapsing under the strain of misuse. It shows that key technical features of the Internet, such as its layering of protocols and the Web's division into distinct "sites," respond to the characteristic threats of strategic behavior in a semicommons. An extended case study of the Usenet distributed messaging system shows that not all semicommons on the Internet succeed; the continued success of the Internet depends on our ability to create strong online communities that can manage and defend the infrastructure on which they rely. Private and common both have essential roles to play in that task, a lesson recognized in David Post's and Jonathan Zittrain's recent books on the Internet.


THE INTERNET IS A SEMICOMMONS
James Grimmelmann*

I. INTRODUCTION

As my contribution to this Symposium on David Post's In Search of Jefferson's Moose and Jonathan Zittrain's The Future of the Internet,2 I'd like to take up a question with which both books are obsessed: what makes the Internet work? Post's answer is that the Internet is uniquely Jeffersonian; it embodies a civic ideal of bottom-up democracy3 and an intellectual ideal of generous curiosity.4 Zittrain's answer is that the Internet is uniquely generative; it enables its users to experiment with new uses and then share their innovations with each other.5 Both books tell a story about how the combination of individual freedom and a cooperative ethos has driven the Internet's astonishing growth.
In that spirit, I'd like to suggest a third reason that the Internet works: it gets the property boundaries right. Specifically, I see the Internet as a particularly striking example of what property theorist Henry Smith has named a semicommons.6 It mixes private property in individual computers and network links with a commons in the communications that flow through the network.7 Both private and common uses are essential. Without the private aspects, the Internet would collapse from overuse and abuse; without the common ones, it would be pointlessly barren. But the two together are magical; their combination makes the Internet hum.
Semicommons theory also tells us, however, that we should expect difficult tensions between these two very different ways of managing resources.8 Because private control and open-to-all-comers common access necessarily coexist on the Internet, it has had to develop distinctive institutions to make them play nicely together. These institutions include the technical features and community norms that play central roles in Post's and Zittrain's books: everything from the layered architecture of the Internet's protocols9 to Wikipedia editors' efforts to model good behavior.10 As I'll argue, the dynamic interplay between private and common isn't just responsible for the Internet's success; it also explains some enduring tensions in Internet law, reveals the critical importance of some of the Internet's design decisions, and provides a fresh perspective on the themes of freedom and collaboration that Post and Zittrain explore.
Here's how I'll proceed: In Part II of this essay, I'll set up the problem. Part II.A will use Post's and Zittrain's books to describe two critical facts about the Internet: it's designed and used in ways that require substantial sharing and openness, and it's sublimely gigantic. Part II.B will explain why this openness is problematic for traditional property theory, which sees resources held in common as inherently wasteful. Part II.C will explain how commons theory can make sense of commonly held resources, but only at the price of introducing a new problem: an internal tension about the scale at which these resources should be held. The theory of tangible common-pool resources tells a tragic story that emphasizes the need to keep the group of those with access to the commons small. But the theory of peer-produced intellectual property tells a happier tale, one that emphasizes the importance of massive collaboration, of openness to as many people as possible.
In Part III, I'll resolve this tension between pressures for smallness and pressures for bigness by showing how a semicommons can accommodate both. Part III.A will introduce Henry Smith's theory of the semicommons, which he illustrates with the example of fields open to common grazing for sheep but held in private for farming. Part III.B will explain how treating the Internet as a semicommons elegantly transforms the small-and-private versus large-and-common antithesis into a compelling synthesis. Simultaneously treating network elements as private property and the "network" as a commons captures the distinctive benefits of both resource models. And in Part III.C I'll briefly describe how semicommons theory is implicit in Zittrain's argument.
In Part IV, I'll demonstrate the analytical power of this way of looking at the Internet: in particular, how it makes sense out of a wide range of technical and social institutions commonly seen online. Part IV.A will illustrate the importance of layering in enabling uses of the Internet at different scales to coexist. Part IV.B will discuss how user-generated content (UGC) sites solve semicommons governance problems. And Part IV.C will consider the role of boundary-setting in the failure of Usenet and the success of e-mail.
Finally, in Part V, I'll briefly argue that semicommons theory usefully helps us focus on the interdependence between private and common, rather than seeing them as implacable opposites.

II. PROPERTY AND THE PROBLEM OF SCALE
David Post and Jonathan Zittrain both link the Internet's extraordinary growth to its extraordinary openness. You don't need to ask anyone's official permission to create a new community or a new application online. Result: more freedom and more innovation, enabling the Internet to outcompete proprietary, controlled networks.11 But this openness, which draws on property-theoretic ideas about sustainable commons, comes with its own theoretical puzzle. Big things have a tendency to collapse under their own weight, and the Internet is nothing if not big.12 The conventional wisdom in property circles is that a commons in any finite resource becomes increasingly untenable as its scale increases.13 The intellectual "commons" that many intellectual property scholars celebrate escapes this trap because (and only because) information isn't used up when it's shared.14 That tells us why writers and musicians and inventors and programmers can benefit from robust sharing and a rich public domain, but it doesn't seem to be directly relevant to the underlying question of why the Internet didn't flame out spectacularly several orders of magnitude ago, as users took advantage of its openness to use up its available capacity. This part will articulate, in somewhat more detail, the nature of this theoretical tension between openness and size on the Internet. Part II.A will
discuss the problem of scale, using Post's musings on Jefferson as the point of departure. Part II.B will tell what I call the "Tragic" story within commons theory: that a commons can be a sustainable alternative to private property or direct regulation, but only for small-scale resources. Part II.C will tell a different story, which I call the "Comedic" one: that for nonrival information goods, where exhaustion isn't an issue, unrestricted sharing can have benefits far outweighing costs.

A. On Being the Right Size (for an Internet)

In Search of Jefferson's Moose takes its title, its cover art, and its central metaphor from the stuffed moose that Thomas Jefferson proudly displayed in Paris in 1787.15 Jefferson was serving as the United States' official representative in France, and he saw himself as a cultural and intellectual ambassador, not just a political one. The French naturalist Georges Buffon had written that New World animals were smaller than their Old World counterparts, owing to the defectively cold and wet American climate.16 While Jefferson's Notes on the State of Virginia attempted to refute this analysis with facts and figures, the moose offered a more demonstrative proof that American species could stand tall with the best that Europe had to offer.17 In Post's telling, the political subtext is hard to miss. Jefferson's moose was big, standing seven feet tall; it was novel, existing only on the North American continent;18 and it was robust, an example of the rude good health of North American wildlife.19 It was, in short, a metaphor for the newly formed United States, another product of North America. Contemporary political theory considered large republics inherently unstable, and the United States was the largest republic in human history.20 Jefferson's moose was meant to "dazzle" his visitors into what Post calls an "'Aha!' moment" of belief: that creatures of its size could thrive in the New World, and so could the equally large American republic.21

Jefferson's metaphor for the United States thus becomes Post's metaphor for the Internet. He's looking for a way to dazzle his readers into their own "Aha!" moments about it. Post wants his readers to believe that it really is there, that it really is something new, and that it really does work. Just as Jefferson's moose was meant to impress visitors with its scale, the first half of Jefferson's Moose is meant to impress readers with the Internet's scale.
This point bears emphasis. The Internet is sublimely large; in comparison with it, all other human activity is small. It has more than a billion users,22 who've created over two hundred million websites23 with more than a trillion different URLs,24 and send over a hundred billion e-mails a day.25 American Internet users consumed about ten exabytes of video and text in 2008; that's 10,000,000,000,000,000,000 bytes, give or take a few.26 Watching all the videos uploaded to YouTube alone in a single day would be a full-time job, for fifteen years.27 The numbers are incomprehensibly big, and so is the Internet.28

These statistics tell us beyond peradventure that the Internet has been wildly successful, but they don't by themselves tell us why. Post's answer is that the Internet is built in a uniquely Jeffersonian way. Technologically, it depends on bottom-up, self-organized routing.29 Its political and social structures, as well, are self-organized in a bottom-up fashion, with decisions made by local groups on the basis of consensus and voluntary association.30 These features are also characteristic of Jefferson's ideal democratic republic, making the Internet the truest realization yet of his political vision.31 Post's Internet users resemble the pioneers Jefferson expected to settle the American West: to build new lives and new communities for themselves on a firm foundation of liberty.32

Zittrain's theory of the Internet's success is that it's a generative system, open to unfiltered contributions from anyone and everyone.33 The Internet lets its users innovate and share their innovations with each other without being thwarted by gatekeepers who can veto proposed changes and system designers who can make change impossible in the first place.34 This greater openness to unanticipated developments gives the Internet a powerful flexibility: it can draw on the best of what all its users have come up with.35
Like Post's, this is a bottom-up story: these new protocols, technologies, and communities are being assembled by individuals, rather than being dictated from on high.
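As a sanity check on the scale statistics above, the fifteen-year YouTube figure can be reproduced with back-of-the-envelope arithmetic. A minimal sketch in Python; the upload rate of roughly twenty hours of video per minute is an assumption here (the figure widely reported circa 2009), not a number drawn from the essay:

```python
# Back-of-the-envelope check of the "fifteen years" claim, assuming
# (hypothetically) about 20 hours of video uploaded to YouTube per
# minute, as widely reported around 2009.
UPLOAD_HOURS_PER_MINUTE = 20

hours_uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24  # 28,800 hours

# A "full-time job": 8 hours a day, 250 working days a year.
full_time_hours_per_year = 8 * 250  # 2,000 hours

years_to_watch_one_day = hours_uploaded_per_day / full_time_hours_per_year
print(years_to_watch_one_day)  # 14.4 -- about fifteen years
```

Under those assumptions, one day's uploads come to roughly 14.4 working years of viewing, which is where the "fifteen years" figure comes from.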
Property theory has a word for this form of resource management: commons. Post focuses on the self-assembly inherent in the Internet Protocol (IP)36 and on self-governance,37 while Zittrain focuses on technical innovation38 and norm creation,39 but these are very much the same story. The Internet's users are individually empowered to use the network as they see fit. There's no Internet Tycoon who owns the whole thing and can kick everyone else off; there's no Internet Czar who sets the rules for everyone else. That makes the Internet, on this view, a nearly ideal commons: a resource that everyone has a privilege to use and no one has a right to control.
So far, so good. But there's a reason that Post calls it the "problem" of scale.40 Size is more than just proof of success; it also creates new and distinctive problems of its own. The biological metaphor is helpful. Following Haldane's classic On Being the Right Size, Post writes, "Large organisms are not and cannot be simply small organisms blown up to larger size."41 A moose blown up by a factor of ten, to be seventy feet tall instead of seven, would have one thousand times the body mass but only one hundred times the bone cross section. It would quite literally collapse under its own weight.
The same is true of computer networks: some architectures that work well with one hundred users fail embarrassingly with one million, or one billion. Post gives another back-of-the-envelope demonstration to show that a centrally operated Internet could not possibly have a strong enough skeleton42 to support all of the communications between its billion-plus users.43 Thus, the decentralization and ad hoc ethos that Post and Zittrain celebrate about the Internet are technological necessities. It took packet switching and distributed best-efforts routing to make a global network on the scale of the Internet feasible.
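Haldane's square-cube argument, which drives both the moose example and the network analogy above, can be verified directly: scaling a body by a linear factor k multiplies its mass (a volume) by k cubed but its bone cross section (an area) only by k squared, so the load each unit of bone must bear grows linearly in k. A quick illustrative sketch:

```python
def scaled_loads(k: float) -> tuple[float, float, float]:
    """Scale an organism by linear factor k.

    Returns (mass_ratio, bone_area_ratio, stress_ratio): mass grows
    with volume (k**3), supporting cross section with area (k**2),
    so the stress per unit of bone grows by a factor of k.
    """
    mass_ratio = k ** 3
    area_ratio = k ** 2
    return mass_ratio, area_ratio, mass_ratio / area_ratio

# A moose scaled up tenfold (seven feet tall becomes seventy):
mass, area, stress = scaled_loads(10)
print(mass, area, stress)  # 1000 100 10.0
```

A thousandfold mass resting on a hundredfold skeleton: tenfold the stress, and collapse.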
Property theory has its own problem of scale. If Buffon thought that nature made large New World wildlife impossible, and if political theorists thought human nature made large republics impossible, and if network engineers thought that physics made large decentralized networks impossible, then property theorists have long thought that large commons were self-defeating.44 Anything held in common would be overused or underproduced, and the larger the relevant community, the more severe the problem.45 As the Internet asymptotically approaches the whole of human experience, it would seem that its usability ought to be trending toward a limit of zero. The benefits of openness are clear, but so are the immense costs of wasteful and self-interested overuse. Since the Internet, like Jefferson's moose or the aerodynamically unlikely bumblebee that Zittrain uses as a metaphor for Wikipedia,46 unarguably is, this success requires explanation. To find one, we will need to delve deeper into property theory.
41. ... 424, 427 (1926) ("I find it no easier to picture a completely socialized British Empire or United States than an elephant turning somersaults or a hippopotamus jumping a hedge."). Had Haldane been exposed to the Internet, he might have noted that it has an inordinate fondness for pictures of cats.

42. The major networks that carry the heaviest volumes of Internet traffic are referred to as "backbones." See, e.g., ZITTRAIN, supra note 2, at 158. Note the use of the plural.

B. The Tragedy of the (Rival) Commons
In order to make sense of the Internet as a species of commons, we first need to situate the commons within property theory. Our starting point is the standard distinction between two kinds of resources: private goods and pure public goods.47 Private goods such as cars, farms, and handbags have two key characteristics. First, they're rival: one person's use of the good makes it unavailable for someone else. Only one person at a time can carry a handbag or plant the same furrow. Second, they're excludable: it's possible to prevent people from using the good. We can hold tightly to handbags and put fences around farms. These distinctions are illustrated in the following, wholly conventional figure:

[Figure: a two-by-two grid plotting rival against nonrival goods and excludable against nonexcludable goods.]

For rival goods, excludability has three salutary effects. First, it helps prevent wasteful underuse. If you own a resource but my proposed use is higher value than yours, it will be profitable for both of us for you to sell it to me.48 Second, it prevents dissipation of the resource's value as we fight over it; without excludability I could simply take the resource from you and you could take it back, ad nauseam. Third, it promotes efficient investment: companies will invest in creating or improving resources if they can also reap the gains from the increase in value.49

Things are trickier when excludability fails. Resources that are rival but not excludable are common goods (in the bottom-left quadrant of the diagram). Common goods are subject to a wasteful race that Garrett Hardin termed the "tragedy of the commons" in his influential 1968 article:50

Therein is the tragedy. Each man is locked into a system that compels him to increase his herd without limit--in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons. Freedom in a commons brings ruin to all.51

The tragedy flows from the lack of excludability. The herdsmen race to graze more sheep because no one can stop them, and they know that no one will stop the others. If excludability could be restored and the sheep could somehow be kept off the pasture, the race would be terminated before it got out of hand.52 As Hardin put it, "the necessity of abandoning the commons"53 would require "mutual coercion, mutually agreed upon"54 to establish effective restrictions on overuse.
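The individual incentives behind Hardin's parable can be made concrete in a toy payoff model. Every number below is invented for illustration (five herdsmen, a pasture whose per-sheep value falls as total grazing rises); the point is only that each herdsman's privately rational additions drive the pasture well past its socially optimal use:

```python
# Toy tragedy-of-the-commons model. All numbers are hypothetical:
# 5 herdsmen share a pasture, and each sheep is worth (12 - total
# sheep) to its owner, so every added animal degrades the pasture
# for everyone at once.

N = 5  # herdsmen

def value_per_sheep(total: int) -> int:
    return 12 - total

def payoff(own: int, total: int) -> int:
    return own * value_per_sheep(total)

# Each herdsman keeps adding sheep while one more raises HIS payoff,
# even though the damage from overgrazing falls on all five.
herds = [0] * N
changed = True
while changed:
    changed = False
    for i in range(N):
        total = sum(herds)
        if payoff(herds[i] + 1, total + 1) > payoff(herds[i], total):
            herds[i] += 1
            changed = True

total = sum(herds)
welfare = payoff(total, total)  # total value of all sheep grazed

# The socially optimal herd size maximizes total * (12 - total).
best = max(range(13), key=lambda t: t * value_per_sheep(t))

print(total, welfare)            # 10 20 -- the commons, overgrazed
print(best, payoff(best, best))  # 6 36  -- the efficient outcome
```

Self-interested grazing settles at ten sheep worth twenty units in all, while six sheep would have yielded thirty-six: each herdsman captured the full gain from his marginal animal while bearing only a fifth of its cost, exactly Hardin's logic.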
Hardin assumed that these restrictions could take one of two forms: "private property" or governmental "allocation."55 A sole private owner absorbs the full costs and benefits of using the pasture and therefore will choose the right number of sheep to graze.56 A government regulator, on the other hand, could allow multiple shepherds access, but limit the number of sheep each may graze so that the total comes out right. Either way, the key is to recreate excludability.57 On our diagram, these are moves from the bottom-left quadrant to the top-left.
Henry Smith has generalized this dichotomy by recasting it as a division between two organizational forms: exclusion and governance.58 Exclusion, which corresponds to private property, involves giving one designated gatekeeper complete control over the resource.59 In a governance regime, on the other hand, multiple users are allowed to use the resource, but are subject to rules specifying how, when, and in what ways.60 An exclusion regime puts a fence around the pasture and gives one person the key; a governance regime brands all the sheep and limits the number each person may graze.
The exclusion/governance distinction focuses on the institutional characteristics that matter, rather than on the formal label of "property" or "regulation." Exclusion and governance both depend on a source of authority to enforce the rules that recreate excludability. To use Hardin's term, "coercion" was essential.61 But, as scholars of the commons have recognized, that coercion need not come from above, in the form of the state.62 It could also come from below, from the other users of the resource themselves.
Elinor Ostrom's work shows that many communities have successfully managed common resources.63 Her list of successes includes Spanish irrigation ditches, Japanese forests, and even Swiss grazing meadows: Hardin's signature example of failure, turned on its head. These communities have created and then enforced on themselves a governance regime controlling use of their common resource. These bottom-up institutions are neither archaic holdovers nor illusory bulwarks; under the right circumstances, common ownership can thrive for hundreds of years.
It doesn't always work, though. Many common ownership regimes have succeeded, but many others have failed.64 The question thus becomes, what distinguishes the commons that work from the ones that suffer Hardin's tragic fate? Ostrom and others give lists of the core factors that make a commons sustainable, including good institutions to gather information about the resource, forums to discuss its management, graduated sanctions to punish misusers, and community participation in making and enforcing the rules.65

In this tradition, one factor stands out as essential to the success of a commons: community coherence. Mancur Olson's theory of collective action argues that small groups can better coordinate their actions than large ones.66 Ostrom emphasizes the importance of well-defined boundaries, not just around the resource, but around the community too.67 ... outsiders.70 Only a small, well-defined, tightly knit group can recognize outsiders to keep them away, monitor and act against its own members with the necessary intensity, and have a sufficiently strong incentive to bother. Size, in other words, ought to be fatal to commons self-management. All else being equal, a small and coherent community is more likely to succeed at running a commons; a large and diffuse one is more likely to botch the job. The bigger the group, the greater the tendency towards ruin.
Let's call this strain of commons theory the Tragic story. Its basic lesson is Hardin's: commonly held resources are vulnerable to self-interested overuse. That fate can be staved off through a variety of arrangements, including private property, governmental regulation, or common self-management. This last institutional form requires community members to craft their own rules of appropriate use, monitor each other's behavior, and punish violators.71 This is a fragile enterprise; only strong and well-defined communities will be able to sustain the constant work of self-control required. Success requires closing the commons off to outsiders; to throw the community open to them is to court disaster.
This Tragic story has been frequently told about the Internet. Telecommunications analysts predict an impending bandwidth crunch, as users deplete the limited supply of available connectivity.72 Telecommunications companies complain that selfish bandwidth hogs are destroying the Internet experience for other customers.73 Scholars warn that weak incentives to cooperate make Wikipedia unsustainable,74 or perhaps that online sharing more generally is doomed.75 The Tragic story explains skepticism about YouTube's business model76 and fears about the rise of malware, botnets, and denial-of-service attacks.77 If, as the Tragic story predicts, bigger means riskier, then the immense Internet ought to be an immense smoking ruin. Peer-to-peer file sharing and video downloads always look poised to overwhelm capacity in a host of I-want-mine overuse. Just as soon as a few more people crank up their usage, or one really clever bad apple figures out how to use it all up, it'll be the endgame for the Internet. As of this writing, though, the Internet still stands, the Tragic story notwithstanding. To make sense of why that might be, let's return to commons theory, another strand of which has come to almost exactly the opposite conclusion.

C. The Comedy of the (Nonrival) Commons
So far, we've been discussing rival resources, in the left-hand column of the diagram. Here, to repeat, the traditional economic view is that efficiency flows from excludability; commons theory accepts that view but offers a different, bottom-up way of creating exclusivity. Now, let's turn to the right-hand column, where nonrival resources dwell. Once again, traditional economic theory prizes excludability, albeit with considerably less certitude than for rival resources. But this time, commons theory takes an altogether more radical turn, arguing that exclusivity itself is overrated.
The starting point of the analysis is that many nonrival goods can be shared with others for much less than it costs to make them in the first place. Information in digital form, for example, can be copied and transmitted around the world for almost nothing. But even tangible goods can often be shared without imposing costs on current users: at 2:00 a.m., a second car on the road doesn't limit the first driver's ability to use the highway, too.78

76. See, e.g., Farhad Manjoo, Do You Think Bandwidth Grows on Trees?, SLATE, Apr. 14, 2009, http://www.slate.com/id/2216162/ ("YouTube has to pay for a gargantuan Internet connection to send videos to your computer and the millions of others who are demanding the most recent Dramatic Chipmunk mash-up .... [N]ot even Google can long sustain a company that's losing close to half a billion dollars a year.").

77. See, e.g., ZITTRAIN, supra note 2, at 43-54 (describing the "untenable" state of online security).

78. Brett M. Frischmann, An Economic Theory of Infrastructure and Commons Management, 89 MINN. L. REV. 917, 945-46 (2005). To be more precise, as Brett Frischmann explains, goods vary in their capacity to accommodate multiple uses. But even a good with a finite capacity can still be effectively nonrivalrous if that capacity is also renewable. Id. at 950-56. A stretch of highway may be able to accommodate 2000 cars per hour, but its use at 6:00 a.m. has essentially no effect on its ability to accommodate cars at 6:00 p.m. As long as we're beneath the level of use at which adding cars would create a traffic jam now, the highway is nonrival. Yochai Benkler has developed this point into a theory of "shareable" goods. See Yochai Benkler, Sharing Nicely: On Shareable Goods and the Emergence of Sharing as a Modality of Economic Production, 114 YALE L.J. 273, 330- (2004).
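Frischmann's renewable-capacity point can be sketched as a toy cost function: below the congestion threshold an extra user imposes no cost on anyone else, so the resource behaves as nonrival; above it, rivalry kicks in. The 2,000-cars-per-hour threshold echoes the highway example above; the delay cost is invented for illustration:

```python
# A renewable-capacity resource: below the congestion threshold an
# extra user imposes no cost on anyone else (effectively nonrival);
# above it, each extra car delays the others. The 2,000-cars-per-hour
# capacity echoes the highway example; the delay numbers are invented.
CAPACITY = 2000  # cars per hour before a jam forms

def marginal_cost_of_extra_car(cars_this_hour: int) -> int:
    """Cost the next car imposes on current users (minutes of delay)."""
    if cars_this_hour < CAPACITY:
        return 0  # spare capacity absorbs the use: nonrival
    return cars_this_hour - CAPACITY + 1  # congestion: rivalry kicks in

# 2:00 a.m.: a second car costs the first driver nothing.
print(marginal_cost_of_extra_car(1))     # 0
# Rush hour over capacity: each added car slows everyone down.
print(marginal_cost_of_extra_car(2500))  # 501
```

And because the capacity renews, heavy use at 6:00 a.m. says nothing about availability at 6:00 p.m.: each hour starts the function over at zero.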
Where the nonrival resource is also nonexcludable, and thus a pure public good, this is a problem. As soon as the seller has created the good--say, a photograph of a sheep--everyone else can have access to it for free; only suckers and patsies would pay for it if they didn't have to. But that leaves the photographer with no economic incentive to go out and spend days taking the perfect photograph, so she won't create it in the first place, which leaves no original for others to copy. Result: everyone loses.79

The conventional response here has been to focus on making the resource more excludable.80 This is the usual economic apology for granting a photographer copyright over her photographs, for example.81 It allows her to prevent the nonrival sharing that would otherwise flood the market with cheap copies and undercut her ability to recoup her costs.82 The same logic also explains various self-help substitutes for intellectual property, such as end-user license agreements and digital rights management (DRM): they all recreate excludability.83 If she can move all the way up the right-hand column and make the resource perfectly excludable, then she can capture the full social value of her work, giving her an efficient incentive for the optimal level of creativity.84 In this respect, at least, full excludability makes public nonrival goods look like private rival goods.85

But excludability's prevention of free riding is not a free lunch. For one thing, it's expensive to establish: IP laws have to be enforced, licenses have to be drafted, and DRM has to be programmed.86 For another, excludability can be harmful in itself. Even though the good (being nonrival) could be shared freely or cheaply, a rational owner will instead price it to maximize her profits. But that means she'll sell it for more than some people would have been willing to pay;87 the good ends up being used less than would have been socially efficient.88 Thus, the conventional economic narrative of intellectual property law is of a dialectic pitting her ex ante incentives to create the information good against the ex post value of broad access to it.89 Whatever balance we choose is likely to impose costs on both sides.

84. See Demsetz, supra note 49, at 300-03 (arguing that the resulting equilibrium "allocates resources efficiently to the production of the public good").

85. There are, however, other important ways in which they differ. Because these nonrival goods have high fixed (or first-copy) costs but very low marginal costs, there's an enormous competitive advantage to being the bigger competitor in a market. Your average costs will be lower than your competitors', helping you undercut their prices and seize the whole of the market. This gives these markets--one kind of "network industry"--distinctive ...
Scholars have therefore looked for ways to avoid the difficulties of finding market incentives to create public goods. One approach is right there in the name: the government could directly invest in these "public" goods. That's the conventional way of paying for physical public goods, like lighthouses.90 At times, governmental investment has also been used to pay for information goods, such as the NEA's grants to artists and prizes for scientific discoveries.91 Once government funding succeeds in bringing these goods into existence, they can be given away freely. Voilà: no costs from imposing excludability.92

Commons theory takes this idea--maximal circulation of information goods at no cost--and runs with it. The key move is the recognition that solving the ex post distribution problem can also, paradoxically, help solve the ex ante production problem.93 Making information more widely available doesn't just benefit passive couch-potato information consumers;87 it also helps other information producers.94 Information goods are critical inputs into the production of other information goods, so increasing their circulation gives creators more to work with. Information is the oxygen of the mind; lowering the cost of air lets minds breathe more freely.95

87. In a world in which she cannot price discriminate perfectly and costlessly, that is. If she could, perfect price discrimination would also in theory lead to an efficient outcome, one in which she appropriates all the value of the good, rather than other users.

92. On the other hand, having the government pay for it doesn't solve the problem of deciding how much to pay for it. Here, it's even more difficult to decide how much the government should spend for the sheep photograph. Since the photograph will ultimately be given away for free, the government will find it well-nigh impossible to learn how much each individual would have been willing to pay for it.
All creativity is influenced and inspired by what has come before; all innovation incrementally builds on past inventions. The public domain is not simply a negative space of the unprotected, but a positive resource of immense richness available to all.96 On this account, reducing the excludability of nonrival information goods will often lead to more information production, not less, because the reduced incentives for creators will be more than outweighed by the increased access to raw materials.97
In a further twist, scholars of the information commons have argued that often we don't need any external incentives for the production of information goods.98 In these cases, we can dispense with excludability completely. Some people take photographs of sheep because they want the pictures for themselves; others want to express a vision of pastoral serenity; still others want to hone their skills with a camera, or to show off those skills to potential employers. This diversity of motivations means that even though the vast majority of photographers in the world are unpaid, they're still enthusiastically snapping pictures. Steven Weber's studies of open source software and Yochai Benkler's theory of peer production emphasize that personal expression, generosity, reciprocity, desire to show off, and other purely social motivations can be just as strong as economic ones.99 When they think about the ideal scale of an information commons, these thinkers generally say, "the more the merrier."100

99. Indeed, one of the other virtues of commons theory is its willingness to recognize that "consumers" and "producers" are often the exact same people, that individuals move between these roles seamlessly in their cultural, social, and intellectual lives. See, e.g., Jack M. Balkin, Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society, 79 N.Y.U. L. REV. 1, 4 (2004) ("Freedom of speech . . . is interactive because speech is about speakers and listeners, who in turn become speakers themselves .... [I]ndividual speech acts are part of a larger, continuous circulation."). The idea, however, has led to some unfortunate portmanteaus. See, e.g., DON TAPSCOTT & ...
There are network effects from increased participation; the more people who are sharing with you, the greater the riches available for you to draw on. They don't cost you anything; indeed, they may actively help out your own creative processes, for example by pointing out bugs in your open-source software.101 If the community is engaged in cooking up a batch of informational stone soup, the larger the community grows, the richer the soup becomes, and the less of a burden the cooking places on any individual member.102 Moreover, as Benkler argues, increased community scale leads to more opportunities for productive collaboration, so that sharing catalyzes creativity and vice versa, accelerating the virtuous circle.103

Let's call this strain of commons theory the "Comedic" story.104 It applies to nonrival resources, and particularly to information goods. To review, its basic argument is that repudiating excludability is often better than embracing it. Since the resources are nonrival, free riding poses no threat of waste. Instead, a commons ensures the maximum possible use of valuable information, avoiding the waste associated with exclusive rights. The incentives to produce and share come from the internal and social motivations of participants, motivations that under the right circumstances may even be supplied by the commons itself.
Like its doppelgänger, the Comedic story has also frequently been told about the Internet.105 The blogosphere, built on an ethos of sharing one's own thoughts and linking to others', is numerically dominated by noncommercial blogs written for personal reasons; even bloggers who make money selling ads still give the actual words away.106 The last half-decade on the Web has been the great era of UGC sites like YouTube, Flickr, Facebook, and Twitter, all of which offer users access to content uploaded, for unpaid sharing, by other users. Sharing makes the Web go round.107 There's also a strong argument that many of these sharing-based sites are successfully outcompeting their more restricted competitors. Wikipedia's outrageous success, as compared with Nupedia, Citizendium, Knol, Encarta, and every other would-be online encyclopedia, could reasonably be attributed to its extraordinary openness to unfiltered contributions from anyone.108 And on a deeper level, one noted by both Post109 and Zittrain,110 the Internet is itself largely built with nonproprietary technical standards, free for anyone to reuse and implement for themselves.111 Even when private companies develop and commercialize services, they've thrived best when the companies have released well-documented public interfaces, free for anyone to use and build upon in making new mashup applications.112 Even much of the software on which the Internet itself runs is the freely shared product of open-source development, carried out collaboratively worldwide . . . on the Internet.113

III. THE SEMICOMMONS
It should by now be clear that the Tragic and Comedic stories point in diametrically opposite directions. The Tragic story embraces exclusion; it tells us that the only way to make a commons work is to make it small and jealously keep outsiders out.114 The Comedic story rejects exclusion; it tells us that the best way to make a commons thrive is to make it large and invite in as many participants as possible.115 As applied to the immensity of the Internet, the Comedic story predicts utopia and the Tragic story predicts utter devastation. The Internet's success at scale suggests that there must be something to the Comedic story's optimism, but so far, we have no good theoretical account that reconciles the two stories.
On Smith's account, a resource must satisfy two conditions to be a good candidate for semicommons ownership. There must be multiple possible uses of the resource that are efficient at different scales, so that one use is naturally private and one is naturally common. These uses must also have significant positive interactions with each other, so that there will be a benefit from allowing both rather than choosing one. The combination of scale mismatch and positive interactions offers rewards for mixing private and common.117

Smith's "archetypal example of a semicommons is the open-field system of medieval [Europe]."118 Sheep could be grazed freely across the fields of a village during fallow seasons, but during growing seasons, individual farmers had exclusive rights to their strips of land. The same fields were held in common for grazing and privately for farming: a semicommons. As he shows, the open-field system displayed both scale mismatch and positive interactions.119

First, the two valuable uses of its land, grazing sheep and raising crops, were efficient at different scales. Medieval grazing had scale economies: one shepherd could watch a large flock on a correspondingly large pasture.120 Medieval farming was a labor-intensive, small-scale affair. Each farmer could plow, seed, tend, and harvest only a limited quantity of land. Nor was there much benefit in teaming up; two men couldn't plow the same furrow, and combining holdings would have tempted each farmer to shoulder less than his share of the work.121 Thus, grazing made sense as a commons in which each villager was entitled to contribute sheep to a large flock grazed across large tracts of land. This is a natural governance regime: the extent of each villager's use could easily be monitored by counting his sheep. On the other hand, farming made sense as private property in which each villager farmed his own small plot of land. This is a natural exclusion regime: it's easy to tell who's harvesting from which piece of land.122

As for positive interactions, the same land could profitably be used for both farming and grazing. Land needs to sit fallow between growing seasons;123 a village might as well graze sheep during the off-season.124 Better still, the best source of fertilizer for the fields was the manure left behind by the sheep.125 Thus, the private plots of land worked better for their private purpose because they were also open to the common use of grazing sheep.
In addition to explaining why a semicommons might come into being, Smith's theory also explains some of the distinctive threats it will face from strategic behavior.126 I'd like to emphasize three. First, in a semicommons, users will be tempted not just to overuse the common resource, but to strategically dump the costs onto others' private portions, bearing none of the costs themselves.127 On a rainy day, when trampling hooves will do the most damage and create the most mud, a shepherd might be tempted to direct the herd onto someone else's plot of land and well away from his own.128 Second, users may be tempted to take expensive and socially wasteful precautions to guard against others' strategic uses: say, sitting outside all day in the rain to watch the shepherd.129 And third, private users will be tempted to disregard the commons use in pursuit of their private gain;130 imagine a farmer who decides to plant a profitable crop that's poisonous to sheep. The semicommons only makes sense if the benefits from combining the private and common uses outweigh the costs from these kinds of strategic behavior.131

Semicommons also have important strategies for dealing with these challenges. One is sharing rules, in which some of the private portions of the resource are collected and divided among the various users.132 Smith gives the example of general average in admiralty, in which all those with an interest in a ship or its cargo must contribute proportionately to reimburse anyone whose property is damaged in the course of avoiding a common peril.133 That eliminates the captain's temptation to throw other people's cargo overboard first. In our hypothetical village, pooling some of the crops after each season would be a sharing rule protecting victims of excessive trampling.
Another characteristic semicommons device is boundary-setting.134 Smith's example here is scattering; each villager's landholdings were divided into multiple small strips in different fields, rather than one larger plot.135 Scattering was costly; farmers sometimes got confused about which strip was theirs.136 But it also made abusive herding less attractive. With many thin strips, it's harder for the shepherd to park the sheep over his own plot while they poop, and over someone else's plot while they stomp. Getting the property boundaries right thus helps prevent strategic behavior.

B. The Internet Semicommons
Smith's semicommons model accurately describes the Internet. We'll see every element of it online: private and common uses of the same resource, efficient at wildly different scales, but productively intertwined; strategic behavior that also causes these uses to undermine each other;137 sharing rules and boundary-setting to keep the whole thing functioning.138 The productive but fraught interplay between private and common uses in a semicommons elegantly captures the tension between the Tragic and Comedic stories on the Internet.
Let's start with the private and common uses. On the one hand, the computers and network cables that actually make up the Internet are private personal property, managed via exclusion. My laptop is mine and mine alone. If I decide to laser-etch its case, or to wipe it clean and reinstall the operating system, no one can stop me. Nor can anyone else decide what outlet it's plugged into; if you try to treat it as common property and take it home with you, you'll be brought up on charges. The same goes for Rackspace's servers139 and Level3's fiber-optic network:140 private property, all of it.

On the other hand, as a communications platform,141 the Internet is remarkably close to a commons, managed via governance in the form of technical standards and protocols.142 Fill out an IP datagram with the 32-bit IP address of a computer you'd like to send a message to, and dozens of routers will cheerfully cooperate to get it there.143 The destination computer has also likely been configured to collaborate with you. Send it an HTTP GET message and you'll get back a Web page;144 send it an SMTP HELO and it will get ready to accept an e-mail from you.145 With this kind of support, yours for the asking, you and your friends can set up a new website, a new online application, a new protocol, a new peer-to-peer network, a new whatever you want. That's common use of a fairly profound sort, precisely as Post and Zittrain describe.146

Next, these private and common uses are efficient at different scales. Private property makes sense for individual computers and cables; the Tragic story warns us to restrict access.
Computers are rival: they can be stolen, hijacked, or crashed. Hardware remains expensive enough that people will try to get their hands on it (physically or virtually), and when they succeed, it creates real costs for others.148 Private property empowers individual owners to protect against laptop thieves, virus writers, and botnet wranglers.149 Moreover, private use of computers aligns incentives well. Computers require their owners to invest time, effort, and money: buying the hardware, setting up the software, keeping the power and bandwidth flowing. An exclusion regime both allows and encourages computer owners to select the configuration that's value-maximizing for them. I use my laptop to write papers, check e-mail, and surf the Web wirelessly; you use your broadband-connected desktop to run regressions and play World of Warcraft. Our ideal computers are profoundly different; asking us to play sysadmin for each other would only pile up the agency costs. All of this pushes towards small-scale private ownership.
On the other hand, commons use makes sense when we look at the Internet as a communications platform. The Comedic story tells us that where information exchange is concerned, we should design for the widest participation possible. A communications network's value plummets if it's fragmented.150 If you have something to say to even one other person, you may as well post it publicly, so that anyone else can take advantage of it. When you do, better to share with the world than with any smaller group.151

151. Lauren Gelman observes that users often post sensitive information in publicly accessible ways online. Her point is that public accessibility allows you to reach others who share your interests, even when you couldn't identify them at the time of the posting. The value of reaching them can outweigh even significant privacy risks of being noticed by outsiders. See Lauren Gelman, Privacy, Free Speech, and "Blurry-Edged" Social Networks, 50 B.C. L. REV. 1315, 1334-35 (2009).

Further, this large-scale communications platform wouldn't work efficiently if it were private. Scholars have noted the immense transaction costs of negotiating individual permission from every system owner,152 along with the potential anticommons holdup problems.153 It's almost a mantra at this point that if you want to create a successful online service, you need to make it free and freely available.154 And even as ISPs threaten to introduce stringent usage caps, they can barely manage to tell customers how much bandwidth they're actually using.155 The commons has a powerful hold on the Internet as communications platform.
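The entire "ask" that this commons makes of a cooperating server is a short, well-formed protocol message. As a minimal sketch of the HTTP GET and SMTP HELO greetings mentioned earlier (the hostnames are placeholder assumptions, and real sessions involve further back-and-forth):

```python
# A sketch of the two protocol greetings discussed above: the bytes a
# client sends to ask for a Web page, and the bytes that open an e-mail
# session. Hostnames are placeholders; real sessions involve more steps.

def http_get_request(host: str, path: str = "/") -> bytes:
    """Build the request a client sends to fetch a Web page."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n"
            f"\r\n").encode("ascii")

def smtp_helo(client_name: str) -> bytes:
    """Build the greeting that begins an SMTP mail session."""
    return f"HELO {client_name}\r\n".encode("ascii")

print(http_get_request("example.com").decode("ascii"))
print(smtp_helo("mail.example.org").decode("ascii"))
```

Nothing more is demanded: no payment, no credentials; any machine speaking the shared protocol will answer a stranger.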
In other words, the Tragic story is right, for the individual computers and wires that constitute the Internet. It recommends private property, which is indeed how these physical resources are held. And the Comedic story is also right, for those computers and wires considered as the communications platform that is the Internet. It recommends an unmanaged commons, which is indeed how this virtual resource is held.
Not only are both stories right, they're right at wildly different scales. Remember how there are over a billion users and over two hundred million websites on the Internet?156 That's a difference of eight or nine orders of magnitude between the scale of the individual computers on which private owners operate and the scale of the worldwide common platform. That divergence isn't a one-time anomaly; it's what the Internet does, every millisecond of every day, everywhere in the world. Satisfying Smith's other condition, these two uses "impact . . . on each other"157 profoundly and positively. You couldn't build a common global network without the private infrastructure to run it on, but most of the value of that infrastructure comes from the common global network dancing atop it. To see why it's the interaction that adds the value, remember that the private owners could disconnect their computers from the semicommons at any moment, and choose not to. Would you buy a computer incapable of being connected to the Internet? Neither would I. This is a semicommons that works.
Indeed, the Internet is probably the greatest, purest semicommons in history.
152. Paul Ohm has pointed out that just trying to meter or monitor these information flows, a necessary step in privatizing them, would in many circumstances be ruinously costly.

C. The Generative Semicommons
In a moment, we'll complete this portrait of the Internet semicommons by looking at its characteristic forms of strategic behavior. But first, I'd like to point out how semicommons theory offers a fresh perspective on generativity. The Future of the Internet describes the generativity of both the personal computer (PC)158 and the Internet,159 which map neatly onto the private and common aspects of the Internet.
A generative PC is one that its owner fully controls. Not a tethered appliance that's physically yours but practically under someone else's governance.160 Not a cloud service that you might be excluded from tomorrow.161 And not an insecure box actually under the control of a shadowy Elbonian hacker syndicate. No, the generative PC is your private property, yours to do with exactly as you choose.
The generative Internet, on the other hand, is defined by its connectedness and commonality. Once you have a great new hack, the best thing to do is to send it out to others, so they can replicate its benefits for themselves and build their own improvements on it. That means the network ought to be as common as possible; you should be able to share your innovations with anyone, not subject to any third party's veto. Although Zittrain rejects a "categorical" end-to-end principle,162 he treasures the way that the Internet's lack of control permits a "flexible, robust platform for innovation."163 The two stages of generativity, creating and sharing, thus map onto private and common, respectively. The cycle works best when the two are not just available, but conjoined. Generativity is the story of the Internet as innovating semicommons.

IV. CHALLENGES
As a semicommons, the Internet is valuable because it combines private and commons uses. Merely saying that it does, however, gives little guidance on how to make this coexistence work. Indeed, one of Smith's central points is that problems of strategic behavior are actually worse in a semicommons than in a pure commons.164 His account of the open-field system doesn't just explain why it might make sense to treat the village fields as a semicommons; it also describes how villagers were able to solve these strategic behavior problems.165 In so doing, he also provides a potentially more satisfying explanation of one of the characteristic features of the open fields: scattering.166

This part will do the same for the Internet: link the theoretical question of the mitigation of strategic behavior in a semicommons to well-known descriptive characteristics of the Internet. I'll argue that the success of many Internet technical features and online institutions can be cleanly explained in terms of their ability to overcome semicommons dilemmas. Part IV.A discusses "layering," one of the Internet's basic technical characteristics, in which one protocol provides a clean abstraction atop which another can run, and so on repeatedly. Layering helps mediate the private/common interface and helps prevent these uses from interfering with each other. Part IV.B takes up UGC sites, which exhibit a consistent correlation between a community and a particular piece of infrastructure. This linkage enables them to use governance rather than exclusion to detect and prevent misuse of their little corner of the Internet. And Part IV.C looks at some problems of boundary-setting, in particular how the Usenet system of distributed bulletin boards has failed to cope with strategic behavior. Its experience tells us much about the challenges of institution design on the Internet.

A. Layering
The term "layering" comes from computer science, where it describes the division of a system into components, each of which only interacts with the ones immediately "above" or "below" it.167 Programmers and system designers use layered architectures because they enable modularity: each component can be designed without needing to know how the others work, which reduces the complexity of the programming task and simplifies debugging by reducing interactions between components.168

On the Internet, layering is prevalent. When one user writes another an e-mail, the actual exchange typically involves six or more layers. The e-mail ("content" layer) is sent from one e-mail program to another ("application" layer) using the Simple Mail Transport Protocol (another application, but one operating at the service of e-mail programs), which uses the Transmission Control Protocol to open a stable connection from one computer to the other ("transport" layer). That connection, in turn, is made up of a sequence of discrete IP datagrams ("network" layer), which in turn are moved from one computer to the other using a network protocol such as Ethernet ("link" layer) that is tailored to run well on specific hardware like a Category 5e twisted-pair copper cable ("physical" layer). The e-mail program doesn't need to know how Ethernet works, and vice versa.169

166. Smith, supra note 6, at 146-54 (arguing that the border-setting semicommons explanation of scattering is superior to other economic explanations).

For our purposes, one layer is more equal than the others: the network layer, where IP is universal.170 Lower layers have a diversity of networking protocols and hardware; higher layers have a diversity of transports and applications. But there is only one network-layer protocol worthy of note on the Internet: IP.171 Everyone uses it. The Internet itself can be defined as the global network of computers using IP to communicate with each other.
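The e-mail walkthrough above can be rendered as a toy encapsulation exercise. This is a sketch, not real wire formats: each "header" here is a readable stand-in, and the point is only that every layer wraps the one above it without inspecting it.

```python
# A toy model of layered encapsulation: each layer prepends its own
# header and treats everything above it as opaque bytes. The headers
# are simplified textual stand-ins, not real protocol formats.

def wrap(layer: str, fields: dict, payload: bytes) -> bytes:
    """Prepend one layer's header; never look inside the payload."""
    header = ";".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"[{layer} {header}]".encode("ascii") + payload

body = b"Hello!"                                                # content
mail = wrap("SMTP", {"to": "b@example.org"}, body)              # application
seg = wrap("TCP", {"dport": 25, "seq": 1}, mail)                # transport
dgram = wrap("IP", {"dst": "192.0.2.7"}, seg)                   # network
frame = wrap("Ethernet", {"dst": "aa:bb:cc:dd:ee:ff"}, dgram)   # link

# The link layer sees only its own header; the message text rides
# inside, untouched by every layer below the application.
assert frame.startswith(b"[Ethernet") and frame.endswith(body)
```

The modularity claim falls out directly: the Ethernet step would work unchanged if the payload carried a Web page instead of an e-mail.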
As legal scholars have recognized, layering has policy implications.173 In particular, it permits different resource allocation regimes at different layers. The same network may be fully private at the physical layer (only the company IT manager can enter the room with the server), a limited-access common-pool resource at the link layer (only employees in the building can connect to it, but they can do so freely), a governed open-to-the-world common-pool resource at the application layer (the company allows outside e-mail connections but filters them for spam), and mixed commons and private at the content layer (outside users send the employees both proprietary company documents and freely shared jokes). These different regimes coexist: the same physical network is simultaneously participating in all of them.174 Any given electrical signal is meaningful at multiple layers.
169. See TANENBAUM, supra note 167, at 37-71.

170. To be more precise, this claim would also include the ancillary routing protocols, such as the Border Gateway Protocol (BGP) and the Routing Information Protocol (RIP), that tell IP-implementing systems which other computers they should forward IP traffic through. See COMER, supra note 143, at 249-93 (describing BGP and RIP).
171. The current version of IP in broad use is version 4; there is a worldwide effort underway to upgrade to version 6. But IP's universality makes this upgrade both technically challenging and politically contentious: any widely adopted change to IP will change the nature of the Internet itself. See DENARDIS, supra note 142, at 4; LAWRENCE LESSIG, CODE: AND OTHER LAWS OF CYBERSPACE 207-09 (1999) ("[W]e again will have to decide whether this architecture of regulability is creating the cyberspace we want. A choice. A need to make a choice.").

IP plays a critical role in this system of overlapping regimes; it is the layer on which the Internet is most fully a commons. Beneath it are privately owned hardware, managed networks, and (sometimes) proprietary protocols. Above it are specialized protocols that form direct connections between individual computers, tightly controlled applications, and copyrighted content. But IP itself is wide open: specify the IP address of a system you'd like your datagram to be delivered to, and dozens of computers along the way will voluntarily help get it there. Most of the time, no one asks for payment; no one inspects the contents; no one asks to see your signed authorization. This makes IP not just a universal layer sandwiched between more fragmented ones,175 but also a commons layer sandwiched between more private ones.
It was an inspired design decision; IP satisfies a lot of the desiderata for a commons. A Comedic commons should be as large as possible, as easy as possible to join, as minimally demanding as possible on its participants, and as flexible as possible in the uses to which it can be put. IP checks every one of those boxes. Not only is it universal in the sense of being widely used, it's also universal in the sense of being able to run on any kind of network.176 It's also a remarkably simple protocol; all it does is move datagrams from point A to point B.177 That makes it easier to implement and reduces the need for explicit coordination, both factors that make it easier to participate in the IP commons.178 Because IP's simplicity also precludes specialization for particular uses, it's suitable for almost every use.179 It can be a jack of all trades by being a master of none.180 The IP Internet is thus both easy for private infrastructure owners to join and easy for them to use profitably once they've joined.
This leaves, however, the problem of strategic behavior identified by Smith. That the Internet has a billion users is a sign of success, but that it has hundreds of millions of computers is a sign of challenge: that's a massive amount of resources to be devoting to the Internet semicommons. Every use of the IP commons imposes very real costs on the private infrastructure beneath it, and the natural question is why those costs don't overwhelm it. Routers can become overwhelmed with heavy traffic and drop packets; network links can become saturated; servers can crash.
This isn't merely a theoretical concern. Strategic behavior is everywhere in the Internet semicommons. Network backbone operators routinely route packets in ways designed to dump as much of the cost as possible on each other.181 Virus, worm, and malware authors use the commons to deliver their malicious software, hijack users' private computers, and send out spam, which in turn puts costs on private mail servers and actually degrades the content commons by polluting it with unwanted, distracting messages.182 We also see the wasteful precautionary costs identified by Smith: spam filters,183 CAPTCHAs,184 firewalls,185 and so on are the Internet equivalents of farmers sitting out in the rain watching to see where the sheep are driven. This is all rather discouraging, but, as Galileo apocryphally said, eppur si muove.186 The Internet does work, fortunes are made on it, and millions of afternoons are enjoyably frittered away reading Twilight fan fiction. The benefits from combining private and common uses online must outweigh the costs, notwithstanding all of these abuses, or the Internet would not exist.
Layering is one reason that the costs of strategic behavior and fencing are manageable. IP's simplicity creates a kind of forced sharing of infrastructure for users.181 A former housemate used to saturate our Internet connection running a peer-to-peer Gnutella node. It brought my Web surfing to a crawl, but it also brought his Web surfing to a crawl. There was no way for him to reach down further in the protocol stack and prioritize his physical computer over mine. It wasn't full internalization of the costs he was creating for the rest of us, but it was enough to convince him to moderate his file sharing once he figured out the connection.

181. For example, network A will hand off packets destined for network B's users as soon as possible, so that network B does the bulk of the work to deliver them.
Layering also has salutary boundary-setting effects.187 Because the IP layer hides those beneath it, it functions like scattering in preventing commons users from strategically targeting specific private pieces of infrastructure. It's impossible to know with certainty the path that a packet will follow, and if one link fails under overload, the packets will flow along another route. You can graze your traffic across particular networks, but you can't easily park all of it in one spot. Conversely, John Gilmore's quip that "[t]he Internet interprets censorship as damage and routes around it"188 captures the point that it's difficult to impinge on commons uses by targeting specific pieces of the private infrastructure. Take out one node and the Internet's overall flows will be largely unaffected.
Looking upwards in the protocol stack rather than down, as long as my ISP and other infrastructure providers implement IP without violating its layering abstractions (that is, as long as they don't "look inside" the IP packets), they have almost no choice but to provide a "neutral" network.189 Layering becomes an architectural constraint that prevents them from selectively choosing to block, alter, or slow my communications. That limits the power of these private owners to engage in self-interested bargaining with commons users; they can't go to Google and demand a premium for allowing its traffic to pass, or slow down all video content, or otherwise start tinkering with commons uses.190
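The architectural point can be made concrete with a toy forwarding function, a sketch under stated assumptions: exact-match lookup stands in for the longest-prefix matching real routers perform, and the field names are invented for readability. The structural claim is simply that if the decision reads only the destination address, the payload cannot influence it.

```python
# A toy forwarding decision that respects the layering abstraction: the
# next hop depends on the destination address and nothing else.
# Exact-match lookup is a simplification of real longest-prefix routing.

def next_hop(packet: dict, table: dict, default: str = "gateway") -> str:
    """Choose a next hop from the IP header alone; payload stays opaque."""
    return table.get(packet["dst"], default)

table = {"192.0.2.7": "hop-a", "198.51.100.3": "hop-b"}
video = {"dst": "192.0.2.7", "payload": b"<video stream>"}
email = {"dst": "192.0.2.7", "payload": b"<e-mail text>"}

# Identical treatment: the router cannot slow the video or favor the
# e-mail, because the payload never enters the decision.
assert next_hop(video, table) == next_hop(email, table) == "hop-a"
```

Discriminating among commons uses would require adding a parameter that reaches into the payload, which is exactly the layering violation the text describes.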

B. User-Generated Content Sites
Next, consider a large and important category of Internet activity: UGC sites. All of these sites face the Internet's semicommons problem in miniature. They typically run on private infrastructure supplied by a single entity, but anyone in the world can view and contribute to them.191 Some are tiny, like my blog, laboratorium.net, which runs on a server operated by a group of my friends and has a few comments per day.192 Others are gigantic, like YouTube, which runs on a massive Google server farm and serves up over a billion video views daily.193 Once more, we wouldn't observe the semicommons form unless it were worthwhile. If the juxtaposition of private and common uses weren't creating value, then either the private owners of the servers would turn them off or the common users would stop participating.194 In view of Smith's analysis of semicommons incentives, we should therefore ask how these sites produce value and how they keep costs under control.

187. Solum & Chung, supra note 173, discuss at length the policy virtues of respecting boundaries between layers.
The first answer is that on the Web, normal commons use can often be a source of value for server owners rather than a net cost. Consider my blog, for which I pay about $250 a year. Each pageview and comment costs me something, true, but it gives me more in return. By allowing readers to access my blog, I spread my ideas to a larger audience. By allowing comments, I learn from them. Even server owners motivated only by money can reap value from free commons use. The secret, as the first commercial bloggers discovered, is advertising.195 More users mean more ad revenue, so that users are like pooping sheep: well worth the bother.196 A site that is free to its users can take advantage of powerful Comedic effects. Any nonzero price requires some form of signup, login, and billing; in addition to being costly to implement, these exclusion mechanisms are surprisingly strong psychological impediments to participation.197 That means a huge, discontinuous jump in usage when access is truly open. On a site where users interact with each other in creative ways, this spike in usage has powerful feedback effects. My commenters don't just respond to me; they also riff on each others' thoughts. It is, in short, the Comedic story all over again: creating a community of sharers produces value for everyone involved.

194. In this respect, these particular Internet semicommons are more susceptible to Demsetzian explanations than many offline property systems and legal regimes, where the problem of collective action looms larger. See Harold Demsetz, Toward a Theory of Property Rights, 57 AM. ECON. REV. PROC. 347 (1967) (describing evolution of property rights as efficient response when value from more intensive use increases). Scholars, however, have raised difficult questions about the mechanism by which this evolution would take place. See, e.g., Saul Levmore, Two Stories About the Evolution of Property Rights, 31 J. LEGAL STUD. S421, S425-33 (2002) (noting ambiguity between optimistic Demsetzian story of the evolution of efficient property regimes and pessimistic story about selfish interest groups capturing value for themselves). A UGC site, however, as a resource, doesn't preexist the semicommons form (so that its users can hardly be accused of appropriating a commons for their exclusive use), and its users make individual voluntary decisions to take part when the rewards outweigh the costs (thus providing a straightforward mechanism for the collective decision to use a particular governance regime). This isn't to say that UGC sites are free of interest-group dynamics, or that they don't face collective action dilemmas, only that their initial development of a property system may pose less of a puzzle than the development of property systems in purely tangible online resources.

195. See ROSENBERG, supra note 106, at 178-85.

196. Sometimes, as with Twitter, the ad revenue isn't there yet (and may not ever be, if the skeptics are to be believed), but the prospect of monetizing the eyeballs justifies the upfront expenses of building the community.
The second piece of the puzzle is that the server owner doesn't disappear from the picture entirely. She retains residual exclusionary power.198 She may not exercise it ex ante, at least in the first instance. But she can and regularly does exercise it ex post, to target specific instances of abuse. If YouTube users flag an inappropriate video, YouTube will yank it.199 If I see a spam comment on my blog, I delete it. If the abuses are flagrant enough, the user is likely to be kicked off the site entirely, and eventually, to be blocked at the IP address level.200 In other words, YouTube and I are paying the monitoring costs to watch what commons users do on our private pieces of the Web. We're using governance to control behavior and self-help exclusion to enforce our decisions. The openness of our sites creates a classically Tragic scenario; we use our platform power to deter misuse.201 We're willing to pay these costs because the overall benefits to us of common usage are larger still.
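The flag-and-remove pattern described above can be sketched in a few lines of Python. Everything here is hypothetical (the class names, thresholds, and ban rule are invented for illustration); the point is simply the combination of ex ante openness with ex post, targeted exclusion.

```python
# Illustrative sketch only: names and thresholds are invented, not any
# real site's moderation API.

class Post:
    def __init__(self, post_id, author, body):
        self.post_id = post_id
        self.author = author
        self.body = body
        self.flags = 0

class Site:
    FLAG_THRESHOLD = 3   # flags needed before a post is pulled
    BAN_THRESHOLD = 2    # removals before the author is banned

    def __init__(self):
        self.posts = {}
        self.removals_by_author = {}
        self.banned = set()

    def publish(self, post):
        # Open ex ante: anyone not already banned may post without review.
        if post.author in self.banned:
            return False
        self.posts[post.post_id] = post
        return True

    def flag(self, post_id):
        # Ex post governance: users bear part of the monitoring cost.
        post = self.posts.get(post_id)
        if post is None:
            return
        post.flags += 1
        if post.flags >= self.FLAG_THRESHOLD:
            self._remove(post)

    def _remove(self, post):
        # Self-help exclusion: the owner deletes the abuse, and repeat
        # offenders lose access to the platform entirely.
        del self.posts[post.post_id]
        count = self.removals_by_author.get(post.author, 0) + 1
        self.removals_by_author[post.author] = count
        if count >= self.BAN_THRESHOLD:
            self.banned.add(post.author)
```

The site never screens contributions in advance; it pays monitoring costs only when commons users signal abuse, which is what makes openness affordable.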
197. Cf. Anderson, supra note 154, at 146.

198. Lest this seem unremarkable, keep in mind that it would be nearly inconceivable for an Internet backbone provider to decide sua sponte that it needed to block traffic from a particular IP address, and that when Comcast started blocking particular traffic, it drew an FCC investigation and injunctive relief.

201. See Jonathan Zittrain, The Rise and Fall of Sysopdom, 10 HARV. J.L. & TECH. 495, 501-06 (1997) (discussing role played by amateur "sysops" of online forums, newsgroups, and bulletin boards in fostering community). Zittrain's message in 1997 was pessimistic; he saw the sysop as a dying breed presiding over fragile communities. The Future of the Internet is far more optimistic about the potential of bottom-up collaboration and altruistic community creation in creating a healthy online society. One possible difference between then and now, I would submit, is that the benefits of linking these communities together on the Internet (putting the "commons" in "semicommons") are much clearer today. Zittrain's invocation of the "sysop" also leads us off into the world of bulletin-board systems (or "BBSes"). Time and space constraints don't permit me to discuss them in detail as an additional example of an online organizational form. Their basic model, however (privately owned servers, connected to the telephone network, accessible to anyone who wished to dial in via modem), fits the basic semicommons pattern described in this essay, and their history also illustrates the applicability of Smith's model. For more on BBSes, see generally BBS: THE DOCUMENTARY (Bovine Ignition Systems 2005); Textfiles.com, History, http://www.textfiles.com/history/ (last visited Apr. 10, 2010).
The fact that we have this residual platform-based power and are willing to use it creates its own countervailing dangers of strategic behavior. I could delete comments from people who disagree with me. YouTube could make it impossible for users to get their videos off the site. These are private uses that impose costs on the common uses, and they're well studied in the literature (if not usually in these terms).202 Abating these costs is itself a question in institution design.
But note that within the semicommons framework we can still describe the problem as overall cost minimization across a number of different forms of strategic behavior and strategic behavior prevention.
A third critical point about these sites is that their success depends on their users. A coherent and motivated user community can collectively exercise governance to prevent abuse. Some users guard the private infrastructure from commons overuse. Craigslist knows which posts are spam because its users tell it.203 Other users defend the commons from its own enemies. Wikipedia's first line of defense against lies and propaganda is eagle-eyed users who look for self-interested or bad-faith edits and undo them.204 A strong user community can even defend the commons from the private infrastructure owner, as Facebook discovered when it tried to introduce privacy-invading advertising technologies.205 Scale matters in this story. The Tragic story reminds us that smaller groups will be better at self-monitoring and enforcement. And that's exactly what we see on the Internet, after a fashion. "The Internet" as a whole doesn't have a generic "flag this content for removal" button. Instead, the pruning and weeding that help UGC sites flourish take place on those sites. The division of the Web into distinct "sites" makes it easier for close-knit communities to form.
These local communities, in turn, are coupled to each other in important ways.206 Individual blogs are connected to each other in a network of linking, quotation, and conversation.207 YouTube took off because it offered users the trivially easy capacity to embed videos in other Web pages, that is, to bridge the YouTube community and others.208 This is a modular structure: tight community coupling within a site and looser coupling across sites.209 This arrangement makes each site's Tragic internal governance problem more manageable while also facilitating Comedic conversations across sites.

202. See, e.g., Jack M. Balkin, Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society, 79 N.Y.U. L. REV. 1 (2004) (discussing censorship powers of platform owners); James Grimmelmann, Saving Facebook, 94 IOWA L. REV. 1137, 1192-95 (2009) (discussing platform lock-in). There's also the larger question of the proper division of value between private owner and commons users. One could argue that the platform owner who becomes rich off of user contributions is engaged in a form of digital exploitation. See, e.g., Soren Mork Petersen, Loser Generated Content: From

The same is also true within large sites.
Wikipedia has many "WikiProjects": groups of pages on a similar topic.211 They tend to have common groups of editors who focus on them. Bingo: a small and close-knit community, loosely coupled to others within Wikipedia. Similarly, social network sites divide the world into small networks centered around every user, forming overlapping communities. These Internet institutions bridge the optimal scales for private and common uses.

C. Usenet and Boundary-Setting
Semicommons theory explains the Internet's failures as well as its successes. Only those online institutions that can cost-effectively deter strategic behavior at the interface between private and common will prosper at the planetary scale of the Internet. Those that can't will stagnate rather than grow, or even collapse entirely under the strain of a worldwide semicommons.
Line Commc'n Servs., Inc., 907 F. Supp. 1361, 1366 n.4, 1367-68 (N.D. Cal. 1995). Purists may insist that "Usenet" refers only to one particular set of newsgroups, and that "Network News" is the correct umbrella term that also includes local newsgroups and even a few alternative hierarchies. See, e.g., KROL, supra, at 151-57. In practice, though, users often also referred to these other hierarchies as "Usenet." See id. at 452. Similarly, one could technically distinguish between the higher-level protocol governing Usenet's messages and newsgroups, see M. HORTON

As an example of a failed online semicommons, consider Usenet, a distributed set of message boards whose servers communicate with each other via a peer-to-peer protocol.216 Each server talks only to a few others, but most Usenet servers are linked together so that any given message will eventually be propagated to all servers in the network.217 This technical structure coexists with Usenet's semantic structure: a hierarchy of topical "newsgroups."218 That hierarchy, established in 1987 in a coordinated event known as the "Great Renaming," divides the Usenet universe into newsgroups such as soc.culture.welsh (on Welsh culture) and sci.math (on mathematics).219 Individual messages are typically posted to a single newsgroup, but replicated across the whole network of servers.220 The administrators of individual servers decide which newsgroups to carry on their servers.221 Usenet also sported a higher level of governance (of the sort detailed by Ostrom)222 in its institutions for collective decision making about which newsgroups to support.
A centralized board coordinated the process: a proposed new newsgroup (or proposed deletion of an old one) would be publicly announced, discussed, and put to a vote. The certified results of this process were generally accepted as legitimate by server operators: a proposed newsgroup that won its vote would typically be added by enough servers that it would achieve critical density in the network and connect those users who wanted to join it.224 Once again, the semicommons structure is evident. Each server is a private use; each newsgroup is a common use. The two are inextricably intertwined. The worldwide network of servers gives each newsgroup a global reach,225 but each server remains locally owned and operated. By 2002, Usenet was carrying over 1000 gigabytes of messages a day.226 That's both a remarkable volume of shared content and a tremendous technical burden on each server.
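The flood-fill propagation just described (each server talks only to its direct peers, yet every message eventually reaches the whole network) can be sketched in a few lines of Python. This is an illustrative toy, not the actual news protocol: all class and field names are invented, and real servers negotiated which groups to exchange rather than forwarding everything.

```python
# Toy model of Usenet's peer-to-peer flooding; identifiers are invented.

class NewsServer:
    def __init__(self, name, carried_groups):
        self.name = name
        self.carried = set(carried_groups)  # each owner picks groups to carry
        self.peers = []                     # only a few direct links
        self.seen = set()                   # message IDs already processed
        self.spool = []                     # locally stored messages

    def connect(self, other):
        self.peers.append(other)
        other.peers.append(self)

    def receive(self, msg_id, newsgroup, body):
        if msg_id in self.seen:
            return                  # duplicate suppression stops loops
        self.seen.add(msg_id)
        if newsgroup in self.carried:
            self.spool.append((msg_id, newsgroup, body))
        # Forward to every peer; the seen-set check terminates the flood.
        for peer in self.peers:
            peer.receive(msg_id, newsgroup, body)
```

Even in this toy, the semicommons tension is visible: every server in the flood path pays storage and bandwidth costs for a message, whether or not its own users ever read it.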
Through the 1980s and into the mid-1990s, Usenet was a highly successful semicommons, reaching more than 2.5 million people by 1992.227 Both boundary-setting devices (between servers and between newsgroups) played a role. Dividing Usenet across servers (rather than centralizing it) made it technically feasible and enabled it to grow by accretion as individual server operators connected and joined the semicommons.228 Meanwhile, dividing Usenet into newsgroups supported the formation of smaller communities capable of exercising good internal governance. Strong social norms of netiquette discouraged off-topic posts, for example: a post about football in sci.math would draw a scolding.229 The norms of sci.math and the norms of soc.culture.welsh could be different, making both stronger.

How Usenet Failed
But past performance is no guarantee of future results, and Usenet didn't deal well with the Internet's massive surge in popularity during the 1990s. As the number of new Internet users increased exponentially year after year, so did the technical and social strains on the Usenet semicommons.230 The most visible form of abuse was spam: messages (usually, but not exclusively, commercial) posted to thousands of newsgroups at once. It placed enormous technical burdens on the private infrastructure and substantially degraded the readability of the common newsgroups.231 Both Usenet's property boundaries and its institutions proved incapable of dealing with this influx.
Architecturally, Usenet got the property boundaries wrong. Each meaningful community (a newsgroup) was split across many pieces of private infrastructure (servers), and vice-versa. This form of scattering inhibited opportunism by censorious private server operators: other servers could exchange a message even if one server deleted it,232 and its own users could switch Usenet providers.233 But it also meant that neither private server operators nor commons newsgroup communities were in a position to deal effectively with spam. The private infrastructure owners couldn't individually take effective action against heavily cross-posted spam and garbage; they each had to monitor all of Usenet, and a message deleted on one server would still crop up on the others.
Meanwhile, the community of readers of a particular newsgroup being overrun also had no good tools to stop the flood. Social norms collapsed under the first sustained assault from outsiders. In 1994, a pair of immigration lawyers advertised their services on over 6000 newsgroups.234 The outcry was remarkable: not just online condemnation, but also self-help denial-of-service e-mail attacks, threats to the lawyers' ISP, and "huge numbers of magazine[] subscriptions in [the lawyers'] names."235 Legal scholars have discussed the remarkable vehemence of this response,236 but it's more a sign of weakness than of strength. Effective social norms don't require such extensive enforcement, precisely because they're effective. The immigration lawyers were outsiders to the newsgroup communities they spammed, afraid of no threats the newsgroups could wield.237 In any event, later events established that social norms were essentially ineffective against spam.
Other spammers soon followed, in large numbers.238 The green-card lottery spam was the proof of concept: the lawyers behind it even published a book of advice for other would-be Usenet spammers, claiming they'd made over $100,000 at it.239 Efforts at educating new users also proved futile.240 Without clear boundaries to exclude outsiders and effective enforcement mechanisms, norm-based self-governance runs into exactly the barriers described by Ostrom.

232. See Giganews, Usenet Interview with John Gilmore, http://www.giganews.com/usenet-history/gilmore.html (last visited Apr. 10, 2010) ("For example, the quote I seem to be most famous for, 'The net treats censorship as damage and routes around it', came directly out of my Usenet experience. I was actually talking about the Usenet when I first said it. And that's how the Usenet works-if you have three news feeds coming in, and one of those feeds censors the material it handles, the censored info automatically comes in from the other two.").
In the late 1990s, as these failures were becoming obvious, Usenet citizens (and a few legal scholars241) celebrated instead the possibility of technical self-defense.
One older technique was moderation: as on a moderated listserv, messages would be sent to a newsgroup administrator, and only posted after he or she approved them.242 While effective in dealing with spam, moderation imposes substantial costs on the Comedic potential of a group: it slows down messages as they wait for the moderator's approval, inhibiting conversation;243 it depends on the volunteer labor of a moderator willing to perform this round-the-clock job;244 and it allows the moderator to behave opportunistically, shaping or censoring the flow of dialogue.245 Most Usenet groups were unmoderated,246 and it's not hard to think of a reason. Moderation doesn't scale.
Newer techniques of technological self-defense fared little better.
Consider the killfile: a personal list of users whose messages you don't want to see.247 Your personal newsreading program hides those messages; they remain on the server for others to read.248 The killfile sounds like a perfectly speech-friendly system: the killfiler's freedom not to read is respected, and so is the killfilee's freedom to speak and other users' freedom to hear from her.249 And it does work well enough in small communities for dealing with specific annoying users: a kind of virtual silent treatment.250 But it fails in a larger commons. Most of the spammers and trolls are new users you've never heard of (and will never hear from again).251 Nor does killfiling reduce the technical burdens felt by server operators: the servers still need to carry the messages, even though users ignore them.
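A killfile is simple enough to sketch directly. This is an illustrative toy (the data layout is invented, not any real newsreader's format); the point is that the filtering happens entirely on the reader's side, so the server and every other user are unaffected.

```python
# Toy killfile: purely client-side filtering. The server's copy of every
# post, including the killfiled ones, is untouched.

def visible_posts(posts, killfile):
    """Hide posts whose author is in the reader's personal killfile."""
    return [p for p in posts if p["author"] not in killfile]

posts = [
    {"author": "alice", "subject": "Re: proofs"},
    {"author": "troll", "subject": "FLAME"},
    {"author": "bob", "subject": "Re: proofs"},
]
killfile = {"troll"}

readable = visible_posts(posts, killfile)  # troll's post is hidden locally
```

The sketch also makes the limitation obvious: a spammer who posts once under a fresh name is never in anyone's killfile, and the hidden messages still consume server storage and bandwidth.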
(including killing, marking as read, and sorting messages) in Unison (a newsreader) and Mail.app (an e-mail client) for Mac OS X.

Cancelbots failed, too. They take their name from the fact that a Usenet post author can send a follow-up message to "cancel" her original post, thereby deleting it.252 These messages are easily forged, leading to a self-help mechanism for dealing with spam and abuse: just forge a cancel message for the offending post.253 As the spam problem grew, vigilante Usenet users started automating the cancels, using programs called "cancelbots."254 In practice, though, cancelbots didn't so much end the Usenet spam wars as escalate them.255 Spammers vied to send out ads faster than the vigilantes could cancel them; this competition was a wasteful arms race.256 Worse, spammers themselves could use cancels against their
249. See SHIRKY, supra note 212, at 82 ("Kill files perfectly illustrate the burden placed on the reader on Usenet where freedom of speech is as absolute as it gets anywhere. For all intents and purposes, anyone can say anything to anyone. If a certain kind of speech causes upset, it is usually up to the reader not to read posts about those subjects or mail from those people.").

250. This observation is based on the personal experience of the author. I would rather not, for reasons that should be obvious, name the specific newsgroups and mailing lists on which I have resorted to using a killfile.

251. See, e.g., James "Kibo" Parry, Killfiles and You, http://www.kibo.com/kibokill/ (last visited Apr. 10, 2010) (providing detailed suggestions for efficient use of a killfile). Note the assumption that filtering out specific unwanted users will not suffice to make a newsgroup readable; more detailed filtering is required.
enemies257 (leading to the use of pseudonymous anti-spam entities like the Cancelmoose[tm]).258 Worse still, griefers259 could use cancels against completely innocent Usenet posters,260 so that many server administrators simply ignored cancels entirely.261 Perhaps some larger institution could have developed coherent cancelation policies and consistently applied those policies across Usenet's mishmash of servers and newsgroups.262 But Usenet's existing institutions were too weak and too distrusted to develop and enforce such policies on the diverse and dispersed Usenet community.263

256. See Froomkin, supra note 212, at 829-31 (discussing "Usenet Death Penalty" in which a site considered to be too lax in stopping spam "has every single Usenet post originating from it immediately canceled or at least not forwarded. Thus, every person using that ISP loses the ability to post to Usenet regardless of his or her guilt or, in most cases,

258. (last visited Apr. 10, 2010)) ("Cancelmoose[tm] is, to misquote some wise poster, 'the greatest public servant the net has seen in quite some time.' Once upon a time, the 'Moose would send out spam-cancels and then post notice anonymously to news.admin.policy, news.admin.misc, and alt.current-events.net-abuse. The 'Moose stepped to the fore on its own initiative, at a time (mid 1994) when spam-cancels were irregular and disorganized, and behaved altogether admirably-fair, even-handed, and quick to respond to comments and criticism, all without self-aggrandizement or martyrdom. . . . Nobody knows who Cancelmoose[tm] really is, and there aren't even any good rumors.").
260. See Cancel FAQ, Part 2/4, supra note 252, at V.D (listing "rogue cancellers of various skill, competence, and intelligence"). Notable examples include Ellisd, who tried on moral grounds to cancel all messages posted to alt.sex, and the so-called CancelBunny, which tried to cancel posts containing the scriptures of Scientology. Id.

261. Froomkin, supra note 212, at 829.

262. See id. at 828-31 (discussing attempt by "Internet vigilantes" to coordinate their efforts through the news.admin.net-abuse newsgroup and impose collective punishments on servers deemed to be excessively spam-friendly and discussing debates over legitimacy and existence of consensus to act against particular spammers).

263. See id. at 823-25 (discussing difficulty of coordinating process of selecting which newsgroups to carry); Hardy, supra note 219 (discussing how dissatisfaction with decisions by administrators of "backbone cabal" systems not to carry newsgroups discussing sex or drugs led to creation of alternative hierarchy for distribution of news and abdication of previous coordinators of newsgroup-creation process); Lee S. Bumgarner, The Great Renaming FAQ: Part 4, http://www.linux.it/-md/usenet/gr4.htm (last visited Apr. 10, 2010) (discussing near "constitutional crisis" on Usenet, including forged votes, over whether to create a newsgroup devoted to discussion of aquaria); Giganews, 1987: The Great Renaming-Page 2, http://www.giganews.com/usenet-history/renaming-2.html (last visited Apr. 10, 2010) (discussing controversy over Great Renaming and suspicion of the administrators who pushed it through).
264. See Froomkin, supra note 212, at 823 (noting that "backbone cabal" systems carried disproportionate share of Usenet traffic); id. at 824-25 (observing that "a large number, perhaps a majority, of sites had effectively delegated administration of the newsgroup

Redesigning Usenet's technical protocols would have required widespread user and server-owner agreement, but that would also have meant giving up some of the private control and commons freedom that these groups prized about Usenet.
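The cancel mechanism at the heart of this arms race can be sketched directly. This is a deliberately simplified illustration: the Control header follows the general shape defined in RFC 1036, but the server logic and all identifiers here are invented, and real servers applied various ad hoc checks.

```python
# Why cancels were forgeable: a cancel is just another article whose
# Control header names the target's Message-ID, and the protocol gave
# servers no reliable way to verify the canceller's identity.

spool = {
    "<abc123@news.example>": {"from": "victim@example.org", "body": "hello"},
}

def handle_article(article):
    control = article.get("control", "")
    if control.startswith("cancel "):
        target = control.split(" ", 1)[1]
        # Nothing ties article["from"] to the original poster, so a
        # naive server honors any well-formed cancel.
        spool.pop(target, None)

# Anyone can send this, not just the original author:
handle_article({
    "from": "anyone@else.example",   # forged or not, nobody checks
    "control": "cancel <abc123@news.example>",
})
```

The same property that let vigilantes cancel spam let spammers and griefers cancel legitimate posts, which is why many administrators eventually ignored cancels altogether.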
In the end, Usenet's distributed openness left it vulnerable to exactly the pressures Smith identifies: griefers used the commons to strategically target private users, and spammers used the commons without heeding the effects on private infrastructure.265 Usenet itself is not dead. One can still go to Google Groups or Giganews and participate in ongoing conversations in groups with strong norms that allowed them to weather the storm. But it has nowhere near the relative importance that it once did to the life of the Internet.
ISPs are gradually dropping their support for Usenet newsgroups,266 and it seems unlikely that the system will ever meaningfully rise again.267

creation process to one person" who was willing to "take the time started to wonder why they should be reserving big chunks of their own disk space for pirated movies and repetitive porn.").

Why E-mail Succeeded Where Usenet Failed

This diagnosis (bad boundary-setting) is specific to Usenet. The entire Internet suffers from spam268: any sufficiently advanced technology is indistinguishable from a spam vector.269 Spam is the most characteristic form of strategic behavior in the Internet semicommons; a commons use that imposes serious costs both on commons and private use.270 But other applications have managed to cope with the spam problem because their boundaries are drawn in ways that permit more effective monitoring and enforcement. Contrast Usenet, for example, with the UGC sites described in the previous section. As detailed above, these sites align commons community with private server infrastructure, giving them governance and exclusion
advantages that newsgroups lacked.271 But a UGC site configured as a discussion board behaves, from a user perspective, almost exactly like a newsgroup.272 Together with blogs and other similar social software, these Web-based discussion boards have taken over many of the community-forum roles that Usenet newsgroups previously played. Now that dispersed servers are no longer technically necessary, as they were in the days before the Internet,273 the dedicated website is a superior institutional form from a semicommons perspective.
Or contrast Usenet with e-mail. While e-mail spam is certainly a serious and costly problem, e-mail has nonetheless been one of the Internet's great success stories. This success is all the more remarkable, given that a decade and a half ago, e-mail and Usenet looked very similar.274 They were started within a few years of each other, and they're both text-based, Internet-wide communications systems that allow users to communicate with each other through a peer-to-peer process of message exchange.275 And yet the e-mail of 2010 is essential to the Internet as we know it and is used by almost everyone; the Usenet of 2010 is an archaic survival used by small groups of enthusiasts.
What happened? E-mail got its property boundaries right. Usenet was created with the expectation that users throughout the network would share the same newsgroups; its design works to coordinate everyone's experiences.276 By contrast, e-mail cares only about delivery: getting a particular message to a particular recipient.277 There are no e-mail equivalents to newsgroups, coordinated entities that all users see in a substantially identical form. Each e-mail server is a dedicated piece of infrastructure designed to enable incoming and outgoing e-mail for its own users.278 This difference means that e-mail servers have defensible borders. I can install a spam filter without disrupting any e-mail except that to and from users on my piece of the network.279 That lets me experiment with local anti-spam policies without needing anyone else's permission or cooperation.280 Users, in turn, can choose e-mail providers based on the quality of their spam filtering.281 In a very important sense, e-mail is less ambitious than Usenet. E-mail may be a common protocol open to everyone, and most e-mail servers may be "common" in the sense that anyone can send a mail to users on them, but e-mail itself is deeply nonpublic. A great e-mail message can only be widely shared through successive forwarding. Some people who might have benefited from it won't ever be on the cc: lists. That's a loss to the commons, and it's a reason that e-mail coexists with all sorts of systems designed to offer more community, like mailing lists and the discussion boards we've already met. But the price we pay for needing to turn elsewhere for a fuller commons is that e-mail actually works.

271. See supra Part IV.B.

272. See Usenet, Wikipedia, http://en.wikipedia.org/wiki/Usenet (last visited Apr. 10, 2010) ("Usenet . . . is the precursor to the various Internet forums that are widely used today . . . .").

273. See Giganews, Usenet Newsgroups History, http://www.giganews.com/usenethistory/index.html (last visited Apr. 10, 2010) (describing switch from UUCP to NNTP as being designed to take advantage of "cutting-edge networking concepts" including the "always-on" Internet). Without the always-on Internet, unless most users are willing to pay long-distance phone charges to connect, they need to have servers located near them.

274. Krol, writing in 1994, thought that e-mail and Usenet each deserved a chapter. Indeed, he gave the Web roughly the same amount of space he gave to Usenet. KROL, supra note 212, at 101-48 (e-mail); id. at 151-87 (Usenet); id. at 287-322 (Web).

277. See, e.g., RFC 821, supra note 145, § 3.2 (discussing forwarding of message by intermediate relays, with no expectation that they will retain copies for themselves or transmit to other, unspecified recipients).
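The defensible-border point can be sketched in a few lines of Python. This is a hypothetical illustration only: the addresses, header fields, and policy rules are invented, not any real mail server's configuration. What matters is that the decision is purely local, so it needs no one else's cooperation.

```python
# A purely local spam policy: it binds only mail crossing this server's
# border, unlike a Usenet cancel, which had to be honored network-wide.
# All names and rules below are invented for illustration.

def my_server_policy(message):
    """Decide the fate of one inbound message at this server's boundary."""
    blocked_senders = {"spammer@example.com"}  # local blocklist
    if message["from"] in blocked_senders:
        return "reject"
    if "viagra" in message["subject"].lower():  # crude local content rule
        return "quarantine"
    return "deliver"

inbound = [
    {"from": "friend@example.org", "subject": "lunch?"},
    {"from": "spammer@example.com", "subject": "deal"},
    {"from": "unknown@example.net", "subject": "VIAGRA now"},
]
decisions = [my_server_policy(m) for m in inbound]
```

A neighboring server running a different (or no) policy is entirely unaffected, which is exactly the experimentation-without-coordination that the text credits for e-mail's resilience.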

V. CONCLUSION
In the scholarly debates over the significance of the Internet, the private-versus-common dichotomy looms large. Triumphalists proclaim that the Internet creates new forms of collaboration and that the commons is the way of the future. Skeptics respond that the stability and sustainability of the Internet depend on private ownership. These are the Comedic and Tragic stories, and they animate scholarly controversies in telecommunications, intellectual property, privacy, intermediary regulation, virtual worlds, and almost every other corner of Internet law.
In addition to its analytical virtues in explaining why some Internet systems thrive and others fail, semicommons theory also speaks to these debates. It reminds us not to take the seeming schism between "private" and "common" too seriously. The greatest commons the world has ever seen is built out of private property; the highest, best, and most profitable use of that property is to create a commons. Private and common need each other, and we need them both on the Internet. Our task is not to choose between them but to find ways to make them work well together.282
275. Usenet was born in 1979, WALDROP, supra note 178, at 427-28, modern SMTP-based e-mail in 1983, id. at 465.

276. See, e.g., MARK R. HORTON, RFC 850, STANDARD FOR INTERCHANGE OF USENET MESSAGES § 3.4 (1983), http://tools.ietf.org/html/rfc850 ("This message removes a newsgroup with the given name . . . .").
139. See Rackspace, Definitions and Technical Jargon of the Hosting Industry, http://www.rackspace.com/information/hosting101/definitions.php (last visited Apr. 10, 2010).
147. Cf. Henry E. Smith, Governing the Tele-Semicommons, 22 YALE J. ON REG. 289 (2005) (applying semicommons theory to argue against the use of common property treatment of individual physical network elements).
ZITTRAIN, supra note 2, at 159 (advocating "a simple dashboard that lets the users of PCs make quick judgments about the nature and quality of the code they are about to run"). Such sentiments assume that users have the sort of autonomy over their PCs that a private property owner would, a principle Zittrain strongly endorses. See id. at 108-09.
150. See Andrew Odlyzko & Benjamin Tilly, A Refutation of Metcalfe's Law and a Better Estimate for the Value of Networks and Network Interconnections 4 (Mar. 2, 2005) (unpublished manuscript), available at http://www.dtc.umn.edu/~odlyzko/doc/metcalfe.pdf (arguing that the value of an n-user network grows as n log n); see also Bob Briscoe et al., Metcalfe's Law Is Wrong, IEEE SPECTRUM, July 2006, at 35 (later version of the Odlyzko & Tilly article).
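The contrast in note 150 can be made concrete with a small computation. This is an illustrative sketch only (the function names and sample network sizes are my own); it compares the n² valuation associated with Metcalfe's Law against the n log n estimate that Odlyzko and Tilly propose:

```python
import math

def metcalfe_value(n):
    # Metcalfe's Law: an n-user network is worth on the order of n^2.
    return n * n

def odlyzko_tilly_value(n):
    # Odlyzko & Tilly's correction: value grows only as n log n.
    return n * math.log(n)

# As n grows, the ratio n^2 / (n log n) = n / log n diverges,
# i.e., Metcalfe's Law increasingly overstates relative network value.
for n in (10, 1_000, 100_000):
    ratio = metcalfe_value(n) / odlyzko_tilly_value(n)
    print(f"n = {n:>7}: n^2 exceeds n log n by a factor of {ratio:,.1f}")
```

The divergence of that ratio is the point of the refutation: doubling a network's size adds much less value under the n log n estimate than Metcalfe's quadratic rule predicts.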
See ZITTRAIN, supra note 2, at 36-51. Of course, we might classify these content-level costs as burdens on the private resources of users' attention, but saying that this is a cost imposed on the commons, even if less descriptively precise, is clearer in terms of pinpointing the problem.
183. See David Pogue, On the Job, a Spam Fighter Is Learning, N.Y. TIMES, Mar. 30, 2006, at C1 (describing the Spam Cube, a home device to filter spam). The Spam Cube, like other spam-fighting technologies, is costly in two different ways. It costs $150, and along with the spam it catches, it also blocks legitimate emails. Id.
184. See ReCAPTCHA, What Is a CAPTCHA?, http://recaptcha.net/captcha.html (last visited Apr. 10, 2010) ("A CAPTCHA is a program that can generate and grade tests that humans can pass but current computer programs cannot."). But CAPTCHAs are costly, too. See BaltTech, Towson U., National Federation of the Blind Re-Invent CAPTCHA, http://weblogs.baltimoresun.com/news/technology/2009/11/towsonu_nationalfederationo.html (Nov. 18, 2009, 8:18 EST) (quoting computer science professor Jonathan Lazar as saying, "[b]asically, computer viruses are twice as successful as blind people on the old captchas.").
185. See WILLIAM R. CHESWICK ET AL., FIREWALLS AND INTERNET SECURITY: REPELLING THE WILY HACKER.
"And yet it moves." STEPHEN HAWKING, ON THE SHOULDERS OF GIANTS: THE GREAT WORKS OF PHYSICS AND ASTRONOMY 393 (2002) (quoting Galileo Galilei). "[M]ost historians regard the story as a myth." Id.
Formal Complaint of Free Press & Pub. Knowledge Against Comcast Corp. for Secretly Degrading Peer-to-Peer Applications, 23 F.C.C.R.
207. See ROSENBERG, supra note 106, at 205-06; Posting of James Grimmelmann to LawMeme, http://lawmeme.research.yale.edu/modules.php?name=News&file=print&sid=1155 (June 18, 2003, 4:03 EDT).
208. See Posting of Deepak Thomas and Vineet Buch to Startup Review, http://www.startup-review.com/blog/youtube-case-study-widget-marketing-comes-of-age.php (Mar. 18, 2007).
209. See SIMON, supra note 168, at 197-205 (describing common pattern of tightly coupled modules themselves loosely coupled to each other); Mark S. Granovetter, The Strength of Weak Ties, 78 AM. J. SOC. 1360 (1973) (describing power of loose links to bridge different social groups).
210. This point may have implications for Zittrain's goal of stopping malware through "suasion" and "experimentation." ZITTRAIN, supra note 2, at 173. Zittrain's discussion of the challenges and goals of the StopBadware project clearly recognizes the dangers of both too much and not enough private control, at multiple scales. Id. at 168-73. Sites experimenting with security policies in an informed way are private and Tragic. Internet-wide monitoring and information-sharing are common and Comedic. See id.
Its use of interconnected servers to share worldwide "newsgroups" made it a thriving semicommons through the 1980s.213 But this same structure couldn't cope with the abuses caused by the Internet's exponential takeoff in the 1990s.214 A comparison of Usenet with e-mail and UGC sites shows that they did a better job of boundary-setting.215 Semicommons theory explains how different design choices can help one online institution succeed where another fails.

1. How Usenet Works

Individual Usenet users post and read messages on a local server on which they have an account; these servers then exchange new messages . . . .
212. See generally HARRISON, THE USENET HANDBOOK: A USER'S GUIDE TO NETNEWS (1995); ED KROL, THE WHOLE INTERNET: USER'S GUIDE & CATALOG (2d ed. 1994); TIM O'REILLY & GRACE TODINO, MANAGING UUCP AND USENET (10th ed. 1992); and BRYAN PFAFFENBERGER, THE USENET BOOK: FINDING, USING, AND SURVIVING NEWSGROUPS ON THE INTERNET (1994). For discussion of its culture and sociology, see generally MICHAEL HAUBEN & RHONDA HAUBEN, NETIZENS: ON THE HISTORY AND IMPACT OF USENET AND THE INTERNET (1997); HOWARD RHEINGOLD, THE VIRTUAL COMMUNITY: HOMESTEADING ON THE ELECTRONIC FRONTIER 117-31 (1993); and CLAY SHIRKY, VOICES FROM THE NET 80-89 (1995). Within the law review literature, see generally A. Michael Froomkin, Habermas@Discourse.net: Toward a Critical Theory of Cyberspace, 116 HARV. L. REV. 749, 821-31 (2003); Paul K. Ohm, On Regulating the Internet: Usenet, a Case Study, 46 UCLA L. REV. 1941 (1999); David G. Post, Pooling Intellectual Capital: Thoughts on Anonymity, Pseudonymity, and Limited Liability in Cyberspace, 1996 U. CHI. LEGAL F. 139, 163 n.54; and Charles D. Siegal, Rule Formation in Non-hierarchical Systems, 16 TEMP. ENVTL. L. & TECH. J. 173, 181-83. Usenet is defined by the message format specified in M. HORTON & R. ADAMS, RFC 1036, STANDARD FOR INTERCHANGE OF USENET MESSAGES (1987), http://tools.ietf.org/html/rfc1036 [hereinafter RFC 1036], and the lower-level protocols governing how those messages are transferred from one computer to another. See MARK R. HORTON, RFC 976, UUCP MAIL INTERCHANGE FORMAT STANDARD (1986), http://tools.ietf.org/html/rfc976; BRIAN KANTOR & PHIL LAPSLEY, RFC 977, NETWORK NEWS TRANSFER PROTOCOL (1986), http://tools.ietf.org/html/rfc977. But in practice, the same social conventions causing users and administrators to standardize on the one protocol also led them to standardize on the other. As Paul Ohm puts it, "Just as you can get from downtown to Westwood without a car, you can communicate via Usenet without NNTP [Network News Transfer Protocol]. But most people would not take this trip without a car, just as most people do not use USENET except over NNTP." Ohm, supra, at 1949-50 n.28 (citing RFC 1036, supra, § 4).
213. See infra Part IV.C.1.
214. See infra Part IV.C.2.
215. See infra Part IV.C.3.
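The propagation mechanism described above can be sketched in code. This is a deliberately simplified simulation, not the NNTP wire protocol: the hypothetical Server class floods each new article to its peer servers, and a Path-style list of hops (the Path header is a real feature of RFC 1036) plus deduplication by message ID keep an article from circulating forever:

```python
class Server:
    """Minimal model of a Usenet server flooding articles to its peers."""

    def __init__(self, name):
        self.name = name
        self.peers = []     # neighboring servers this one exchanges news with
        self.articles = {}  # message-id -> (path of hops, article body)

    def post(self, message_id, body):
        # A local user posts a new article; store it and flood it onward.
        self.receive(message_id, path=[self.name], body=body)

    def receive(self, message_id, path, body):
        if message_id in self.articles:
            return  # duplicate: this server already carries it, stop flooding
        self.articles[message_id] = (path, body)
        for peer in self.peers:
            if peer.name not in path:
                # Never offer an article to a server already on its path.
                peer.receive(message_id, path + [peer.name], body)

def link(a, b):
    """Make two servers peers of each other (news flows both ways)."""
    a.peers.append(b)
    b.peers.append(a)

# A tiny three-server network: a post on one server reaches all the others.
a, b, c = Server("a"), Server("b"), Server("c")
link(a, b)
link(b, c)
a.post("<1@a>", "hello, net.news")
assert "<1@a>" in c.articles  # propagated a -> b -> c
```

The design choice this illustrates is the one the essay's semicommons analysis turns on: every server voluntarily stores and forwards everyone else's traffic, which is what made Usenet a commons and also what made it exploitable.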
See KROL, supra note 212, at 132 ("If you are offended [by your server administrator's refusal to carry a newsgroup], you have two choices: find another server or beat up on your administrator.").
237. See Siegal, supra note 212, at 193 ("Moreover, Canter and Seigel, unbowed by their role as outcasts on the Internet, published a book telling other would-be cyber-entrepreneurs how to profit by following their example . . . .").
238. See id. ("[S]pamming is common . . . .").