IPcentral Weblog
  The DACA Blog

Friday, August 1, 2008

If Bandwidth Is Abundant, It Can't Be Scarce, So Why Can't We Have Net Neutrality?

Web Pro News' Jason Lee Miller seems to think he's hoisted my colleague Bret Swanson, and The Progress & Freedom Foundation in general, by our own collective petard. Bret had responded to Tim Wu's NYT op-ed by questioning Wu's argument for developing "alternative supplies of bandwidth" to free us from the tyranny of the OPEC-like broadband cartel:

Unlike natural resources such as oil, which, while abundant, are at some point finite, bandwidth is potentially infinite. The miraculous microcosmic spectrum reuse capabilities of optical fiber and even wireless radiation improve at a rate far faster than any of our macrocosmic machines and minerals. It is far more efficient to move electrons than atoms, and yet more efficient to move photons. Left unfettered, these technologies will continue delivering bandwidth abundance.

Miller suggests that this response to Wu destroys arguments Bret and others at PFF have made against net neutrality regulation--a crusade led by Wu (who taught me Internet law, as it happens):
So what [Swanson is] saying is bandwidth scarcity is a notion invented by internet service providers and wireless providers to jack up prices and provide excuses for interfering with competing services on their networks. Nice. In a weird way, Swanson focuses so hard on disproving Wu's analogy one way, he misses how the analogy is proved in another: a few organizations (government or not) controlling an important resource and forcing artificial scarcity in order to control the market for that resource is called a cartel.

Miller's "Gotcha!" rests on the seemingly undeniable premise that broadband can't be both abundant (as Bret argues) and scarce (such that ISPs must manage traffic on their networks, however non-neutral that may be). But in fact, this seeming contradiction is inherent in the very nature of the Internet--and in the way Internet access is currently priced.

On the one hand, Bret is right that broadband is "abundant" in a way that resources in the real world cannot be: Continued investments in broadband networks by network operators have dramatically increased the amount of bandwidth available--causing prices to plummet for both wireline and wireless broadband. Consumers today enjoy greater download speeds while paying constantly decreasing prices per bit. So much for Wu's OPEC analogy.

But contrary to those defenders of net neutrality regulation who think we can somehow grow our way out of the problem of network congestion merely by increasing the amount of bandwidth available, the demand for bandwidth is also infinitely elastic. Making more bandwidth available simply encourages the development of new services and content whose use requires still more bandwidth. The significant advances in bandwidth available to U.S. broadband consumers in recent years have made it possible for us all to share huge amounts of data through peer-to-peer file-sharing services, view essentially infinite amounts of video, back up hundreds of gigabytes on online storage services like Amazon's reasonably priced S3, and even begin moving our most basic computing tools like email and word processing into the "cloud." One has only to contemplate the kind of bandwidth that will be required when YouTube goes hi-def (something the less popular Vimeo has already done) to realize that, from the network operator's perspective, trying to solve network congestion problems simply by increasing the amount of bandwidth available is like a pie-eating contest where the prize is... more pie.

Thus, broadband can be increasingly "abundant" in the sense that there is always more of it available than ever before and, in the narrow and particular sense of Internet network congestion, "scarce" (i.e., limited) at the same time. This apparent contradiction stems from three facts:

  1. Internet content and services are increasingly free to the user, either supported by advertising revenues or by some "up-sell" of additional features beyond the basic, free version. This means that consumers have no economic reason not to gobble up the "new, new thing"--which usually consumes more bandwidth than whatever content or service it replaces.

  2. Similarly, and more importantly, data use is priced on an all-you-can-eat basis. Consumers pay a flat monthly fee for essentially "unlimited" broadband.

  3. The secret to the Internet's efficiency lies in its architecture as a packet-switching network of networks: Unlike the circuit-switched traditional telephone network, the Internet works precisely because only a small fraction of its users are sending or requesting bits at any particular moment. If everyone tried to watch a hi-def video at once (or watch any video, for that matter), the Internet would simply grind to a screeching halt. Thus, even "abundant" bandwidth is necessarily scarce in terms of how many people can try to use it at any particular moment for a particular application. Yes, it's conceivable that in some future world with orders of magnitude more bandwidth than exists today, every person could indeed watch a classic YouTube video from 2008, but if they all tried to host holographic conference calls... the same basic limitation would apply.
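The statistical multiplexing described in fact #3 is easy to illustrate with a back-of-the-envelope simulation. The numbers below (1,000 subscribers, each active 5% of the time) are purely illustrative assumptions, not real network data, but they show why operators can provision a link for typical concurrent demand rather than for every subscriber at once--and why the link saturates if everyone shows up simultaneously:

```python
import random

def peak_concurrent_users(n_users, p_active, trials=2_000, seed=42):
    """Estimate concurrent demand on a shared link, assuming each of
    n_users is independently active with probability p_active.
    (Illustrative model only -- real traffic is burstier than this.)"""
    rng = random.Random(seed)
    samples = [sum(rng.random() < p_active for _ in range(n_users))
               for _ in range(trials)]
    return max(samples), sum(samples) / trials

# With 1,000 subscribers each active 5% of the time, typical concurrent
# load is around 50 users -- a link sized for everyone at once would sit
# roughly 95% idle almost all the time.
peak, avg = peak_concurrent_users(1_000, 0.05)
print(f"average concurrent users: {avg:.0f}, observed peak: {peak}")
```

If every subscriber tried to stream at once, demand would jump to the full 1,000--an order of magnitude beyond what the link was built for, which is exactly the congestion scenario sketched above.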

Thus it is that a tiny number of network users can consume the vast majority of a network's bandwidth. Since fact #1 is essentially a law of the Internet universe, and since no amount of additional bandwidth will overcome the constraints inherent in fact #3, ISPs must find some way of dealing with the problem of network congestion if they are to satisfy the vast majority of their customers whose network use is degraded by those who use more bandwidth than they do. Proponents of network neutrality regulation like Wu would limit the ability of ISPs to deal with this problem through traffic management by putting government bureaucrats in charge of deciding which forms of management are benign and which are not. (On this very day, the FCC is about to hold Comcast in violation of a non-binding 2005 policy statement for throttling, but not blocking, certain bandwidth-hogging users of the peer-to-peer file-sharing system BitTorrent.)

While some amount of traffic management will always be necessary, the need for it can certainly be reduced by changing fact #2: moving to a different pricing structure for Internet access. As my PFF colleague Adam Thierer has explained,

a "Ramsey two-part tariff" ... would involve a flat fee for service up to a certain level and then a per-unit / metered fee over a certain level. I don't know where the demarcation should be in terms of where the flat rate ends and the metering begins; that's for market experimentation to sort out. But the clear advantage of this solution is that it preserves flat-rate, all-you-can-eat pricing for casual to moderate bandwidth users and only resorts to less popular metering pricing strategies when the usage is "excessive," however that is defined.

Experiments in this area are indeed underway (and see further discussion here). Their ultimate success will likely depend on setting the all-you-can-eat threshold high enough that ordinary users (say, 90-95% of all users) are not affected.
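The two-part tariff Thierer describes reduces to a simple formula: a flat fee up to an included allowance, then a metered rate on the overage. The specific prices and the 250 GB threshold below are hypothetical placeholders--as Thierer says, where the metering should begin is for market experimentation to sort out:

```python
def monthly_bill(gb_used, flat_fee=40.0, included_gb=250, per_gb=1.0):
    """Two-part tariff: flat_fee covers usage up to included_gb;
    usage beyond that is metered at per_gb dollars per gigabyte.
    All prices and the allowance here are hypothetical."""
    overage = max(0.0, gb_used - included_gb)
    return flat_fee + overage * per_gb

# A casual user pays only the flat rate; a heavy user pays for overage.
print(monthly_bill(60))    # 40.0  (well under the allowance)
print(monthly_bill(400))   # 190.0 (40.0 + 150 GB over * $1.00/GB)
```

Note that the casual user's bill is identical to today's all-you-can-eat pricing; only usage above the threshold triggers metering, which is the scheme's political and commercial appeal.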

In the meantime, those of us who defend the accelerating abundance of broadband as the result of ongoing investments by network operators while opposing net neutrality regulation as an impediment to such investments clearly have our work cut out for us in explaining the apparent contradiction of "scarcity"-in-abundance that is unique to Internet bandwidth.

posted by Berin Szoka @ 3:14 PM | Broadband , Internet , Net Neutrality
