[Not sure if someone else has mentioned this here yet, but... ] There's a terrific piece by Paul Korzeniowski in Forbes this week about the Comcast-BitTorrent debacle called, "Feds and Internet Service Providers Don't Mix." It's well worth reading the whole thing, but I particularly like this passage:
For whatever reason, some believe ISPs should not be able to put any restrictions on the volume of information that any user transmits. That's absurd. Per-bit and per-byte pricing models have long been used for data transmissions. In trying to build and sustain their businesses, carriers constantly balance their attractiveness and viability versus unlimited usage pricing models. By government decree, they no longer have that option. In effect, the FCC has decided to tell ISPs how to run their networks.
A related issue is Comcast's reluctance to disclose its network management processes. The reason seems obvious. Carriers spend literally billions of dollars installing and fine-tuning their networks each year. If they can move traffic more efficiently from one location to the next than their competitors, it translates to a more profitable bottom line. But network neutrality advocates maintain that Comcast has an obligation to open its network operation to the world. Why not have Kentucky Fried Chicken publish its original recipe or Coca-Cola tell us how it makes soft drinks?
Exactly. It gets back to a point I stressed in one of our podcasts on this issue about how "transparency" regulations are great in theory but in practice might have some rather profound implications. More generally, there's just the fact that it further puts the camel's nose in the Internet tent by inviting regulators in to meddle more in the name of "transparency."
As always, Richard Bennett has far more interesting things to say about the issue than me. Check out his essay about this same Forbes piece over at Circle ID.
Problems in Muni Wi-fi Paradise, Part 9 (amazing article about Philly failure)
In my nearly 17 years of public policy work, I have never felt so vindicated about something as I did this weekend when I read Dan P. Lee's Philadelphia magazine feature on "Whiffing on Wi-Fi." It is a spectacularly well-written piece about the spectacular failure of Philadelphia's short-lived experiment with municipally-subsidized wi-fi, which was called Wireless Philadelphia. You see, back in April 2005, I wrote a white paper entitled "Risky Business: Philadelphia's Plan for Providing Wi-Fi Service," and it began with the following question: "Should taxpayers finance government entry into an increasingly competitive, but technologically volatile, business market?" In the report, I highlighted the significant risks involved in light of how rapidly broadband technology and the marketplace were evolving. Moreover, I pointed to the dismal track record of previous municipal experiments in this field, which almost without exception ended in failure. I went on to argue:
Keeping these facts in mind, it hardly makes sense for municipal governments to assume the significant risks involved in becoming a player in the broadband marketplace. Even an investment in wi-fi along the lines of what Philadelphia is proposing is a risky roll of the dice. [...] the nagging "problem" of technological change is especially acute for municipal entities operating in a dynamic marketplace like broadband. Their unwillingness or inability to adapt to technological change could leave their communities with rapidly outmoded networks, and leave taxpayers footing the bill.
I got a stunning amount of hate mail and cranky calls from people after I released this paper. Everyone accused me of being a sock puppet for incumbent broadband providers or of just not understanding the importance of the endeavor. But as I told everyone at the time, I wasn't out to block Philadelphia from conducting this experiment; I just didn't think it had any chance of being successful. And, again, I tried to point out what a shame it would be if taxpayers were somehow stuck picking up the tab, or if other providers decided not to invest in the market because they were "crowded out" by government investment in the field.
But even I could have never imagined how quickly the whole house of cards would come crumbling down in Philadelphia. It really was an astonishing meltdown. Dan Lee's article makes that abundantly clear:
The Great 'Open v. Closed' Debate Continues: Google Phone v. Apple iPhone
"Hasn't Steve Jobs learned anything in the last 30 years?" asks Farhad Manjoo of Slate in an interesting piece about "The Cell Phone Wars" currently raging between Apple's iPhone and Google's new G1, Android-based phone. Manjoo wonders whether Steve Jobs remembers what happened the last time he closed up a platform: "because Apple closed its platform, it was IBM, Dell, HP, and especially Microsoft that reaped the benefits of Apple's innovations." If Jobs didn't learn his lesson then, will he now with the iPhone? Manjoo continues:
Well, maybe he has--and maybe he's betting that these days, "openness" is overrated. For one thing, an open platform is much more technically complex than a closed one. Your Windows computer crashes more often than your Mac computer because--among many other reasons--Windows has to accommodate a wider variety of hardware. Dell's machines use different hard drives and graphics cards and memory chips than Gateway's, and they're both different from Lenovo's. The Mac OS, meanwhile, has to work on just a small range of Apple's rigorously tested internal components--which is part of the reason it can run so smoothly. And why is your PC glutted with viruses and spyware? The same openness that makes a platform attractive to legitimate developers makes it a target for illegitimate ones.
I discussed these issues in greater detail in my essay on "Apple, Openness, and the Zittrain Thesis" and in a follow-up essay about how the Apple iPhone 2.0 was cracked in mere hours. My point in these and other essays is that the whole "open vs. closed" dichotomy is greatly overplayed. Each has its benefits and drawbacks, but there is no reason we need to make a false choice between the two for the sake of "the future of the Net" or anything like that.
In fact, the hybrid world we live in -- full of a wide variety of open and proprietary platforms, networks, and solutions -- presents us with the best of all worlds. As I argued in my original review of Jonathan Zittrain's book, "Hybrid solutions often make a great deal of sense. They offer creative opportunities within certain confines in an attempt to balance openness and stability." It's a sign of great progress that we now have different open vs. closed models that appeal to different types of users. It's a false choice to imagine that we need to choose between these various models.
The Chairman of the FCC appears poised to attempt yet another appropriation of private property from cable operators. This time the vehicle is a regulation that would require cable operators to carry hundreds of low-power television stations. The practical effect would be to force cable operators to devote already limited channel capacity to stations that have programming of such small appeal that there exists no market demand for it. Now, one can criticize this latest proposal on many levels - it almost certainly is unconstitutional, it is inconsistent with the Communications Act, it is inequitable, unfair, bad policy, and bad economics.
But, taking the proposal seriously, what it really suggests is that the time has come to take back the broadcast spectrum allocated to these stations and devote it to services that people actually want. The stations demanding new carriage rights can't, after all, apparently survive based on their over-the-air viewing audience, and their programming schedule is so weak that no cable operator would carry it voluntarily without a federal mandate. At some point, the federal government has to stop trying to prop up failed enterprises. In this case, the costs of doing so are measured in terms of inefficient spectrum usage and burdensome regulations on an industry that is providing a service that consumers demand in large numbers.
Although the ISTTF is looking at a wide variety of tools and methods associated with online child protection (ex: filters, monitoring tools, educational campaigns, etc.), many of the AGs who crafted the agreement with MySpace that led to the Task Force's formation have made it clear that they are most interested in having the ISTTF evaluate age verification / online verification technologies. In fact, at the start of this week's session at Harvard Law School, AGs Martha Coakley of Massachusetts and Richard Blumenthal of Connecticut both spoke and made it abundantly clear they expect the Task Force to develop age- and identity-verification tools for social networking sites (SNS). AG Blumenthal said we need to deal with "the dangers of anonymity" and repeated his standard line about online age verification: "If we can put a man on the moon, we can make the Internet safe." [Of course, putting a man on the moon took hundreds of billions of dollars and a decade to accomplish, but never mind that fact! Moreover, one could also argue that if we can put a man on the moon we can cure hunger, AIDS, and the common cold, but some things are obviously easier said than done. Finally, putting a man on the moon didn't require all Americans or their kids to give up their anonymity or privacy rights in order to accomplish the feat!]
On many occasions here before, I have outlined various questions and reservations about proposals to mandate online age verification. Last year, I also published a lengthy white paper on the issue and hosted a lively debate on Capitol Hill [transcript here] about this. I also have discussed age verification in my book on parental controls and online child safety. [Braden Cox also talked about his experiences up at Harvard this week here, and CNet's Chris Soghoian had a brutal assessment of this week's proposals on his "Surveillance State" blog.]
In this essay, I will discuss the new fault lines in the debate over online age verification and outline where I think we are heading next on this front. I will argue:
There is now widespread understanding that it is extraordinarily difficult to verify the ages and identities of minors online using the methods we typically use to verify adults. Because of this, age verification proponents are increasingly proposing two alternative models of verifying kids before they go online or visit SNS...
First, for those who continue to believe that we must do whatever we can to verify kids themselves, schools and school records are increasingly being viewed as the primary mechanism to facilitate that. This raises two serious questions: Do we want schools to serve as DMVs for our children? And, do we want more school records or information about our kids being accessed or put online?
Second, for those who are uncomfortable with the idea of verifying kids or using schools, or school records, to accomplish that task, parental permission-based forms of authentication are becoming the preferred regulatory approach. Under this scheme, which might build upon the regulatory model found in the Children's Online Privacy Protection Act of 1998 (COPPA), parents or guardians would be verified somehow and then would vouch for their children before they were allowed on a SNS, however defined. But how do we establish a clear link between parents and kids? And will parents be willing to surrender a great deal more information (about themselves and their kids) before their kids can go online? And, is it sensible to use a law that was meant to protect the privacy and personal information of children to potentially gather a great deal more information about them, and their parents?
It remains very unclear how either of those two verification methods would make children safer online. Indeed, these methods could actually make kids less safe by compromising their personal information and creating a false sense of security online for them and their parents.
It is highly unlikely the Internet Safety Technical Task Force will be able to reach consensus on this complicated, controversial issue. A small camp will likely flock to the sort of proposals mentioned above. Another, larger camp (including me) will flock to education-based approaches to child safety as well as increased reliance on other parental empowerment tools and strategies, industry self-regulatory efforts, social norms, and better intervention strategies for troubled youth. But the age verification debate will go on and, as was the case over the past two years, the legal battleground will be state capitals across America, with AGs likely pushing for age verification mandates regardless of what the Task Force concludes.
Continue reading if you are interested in the details.
The House Commerce Committee Ranking Member recently circulated a discussion draft meant to address perceived procedural failures at the FCC. In a short paper released this week by PFF, Barbara Esbin voices support for the modest reforms outlined in the draft but suggests more fundamental changes may be needed:
Our domestic regulatory policy debate is continually hobbled by the need to conduct it in the "terms of the past" rather than in accordance with the reality of the networks and services of today, let alone the needs of tomorrow's network, service and applications developers and providers. At the very least, the process of debating thorough-going reform of the Act and the agency would have the benefit of focusing attention on the functions the agency performs well, that should be left alone, and those it performs poorly, that should either be reformed or given to another agency or department of government better suited to the particular function.
Tom Sydnor released a short paper this week urging Congress to pass the Enforcement of Intellectual Property Rights Act. Tom lays to rest some of the concerns voiced about the bill, including the cost to the federal government:
In the case of ERIPA, the usually sound impulse to avoid further federal spending is misplaced. Dynamic analysis of ERIPA's costs and benefits shows that ERIPA is better than "revenue neutral"--it is "revenue enhancing."
The Coalition Against Counterfeiting and Piracy made this point by commissioning the Tyson Report, a conservative economic analysis of the probable costs and benefits of IPR-enforcement reform. The Tyson Report concluded that because counterfeiting and piracy annually drain about $225 billion from the U.S. economy, IPR-enforcement reforms that only slightly decreased counterfeiting and piracy over three years would increase U.S. output, earnings, and employment enough to increase federal tax revenues by $4.9 to $5.7 per dollar spent on reform, and generate another $1.25 billion in state and local tax revenues. For the American taxpayer, dollars spent on IPR-enforcement reform are investments that offer potential three-year returns of 490% to 570%, even when discounted to present value.
In response to Comcast's recent announcement regarding its new traffic management technique, which involves limited capping to prevent excessive bandwidth use, Verizon has countered that it does not employ bandwidth caps. Now I have no view as to whether Comcast's prior traffic management "throttling" or the newer "capping" (for lack of better shorthand terms) provides superior service from a consumer perspective. And I don't know what precise traffic management protocols Verizon uses or how those protocols will evolve over time as applications become more sophisticated and more bandwidth intensive.
It is clear, however, that the cable and telecommunications companies are engaged in ferocious competition to gain market share in the broadband market. Both industries are rapidly working to upgrade and improve their networks; both are trying to differentiate their service offerings to the public; both are striving to improve service quality and enhance the consumer experience. Most important from a policy perspective, the billions of dollars invested in this effort have not resulted from government mandates or intrusive regulation, but from vibrant marketplace competition. Want more and better broadband in the U.S.? The best thing the government can do is stay out of the way and allow competition to flourish.
Online Advertising & User Privacy: Principles to Guide the Debate
By Berin Szoka & Adam Thierer
Progress Snapshot 4.19 (PDF)
Over the last year, a debate has raged in Washington over "targeted online advertising," an ominous-sounding shorthand for the customization of Internet ads to match the interests of users. Not only are these ads more relevant and therefore less annoying to Internet users, they are more cost-effective to advertisers and more profitable to websites that sell ad space. While such "smarter" online advertising scares some--prompting comparisons to a corporate "Big Brother" spying on Internet users--it is also expected to fuel the rapid growth of Internet advertising revenues from $21.7 billion last year to $50.3 billion in 2011--an annual growth rate of more than 24%. Since this growing revenue stream ultimately funds the free content and services that Internet users increasingly take for granted, policymakers should think very carefully about what's really best for consumers before rushing to regulate an industry that has thrived for over a decade under a layered approach that combines technological "self-help" by privacy-wary consumers, consumer education, industry self-regulation, existing state privacy tort laws, and FTC enforcement of corporate privacy policies.
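The growth projection implies a compound annual growth rate that can be checked directly. A minimal sketch, assuming "last year" means 2007 so that the climb from $21.7 billion to $50.3 billion spans four years (the base year is my assumption, not stated in the passage):

```python
# Quick check of the implied compound annual growth rate (CAGR) for
# online ad revenue. The four-year span (2007 -> 2011) is an assumption
# based on the article's "last year" phrasing.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate as a fraction, e.g. 0.24 for 24%."""
    return (end / start) ** (1 / years) - 1

growth = cagr(21.7, 50.3, 4)
print(f"Implied CAGR: {growth:.1%}")  # roughly 23-24% per year
```

The exact figure depends on the base year assumed; over a three-year span (2008 to 2011), for instance, the implied rate is closer to 32% per year.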
In an upcoming PFF Special Report, we will address the many technical, economic, and legal aspects of this complicated policy issue--especially the possibility that regulation may unintentionally thwart market responses to the growing phenomenon of users blocking online ads. We will also issue a three-part challenge to those who call for regulation of online advertising practices:
1. What is the harm or market failure that requires government intervention?
2. Prove that there is no less restrictive alternative to regulation.
3. Explain how the benefits of regulation outweigh its costs.