In case you live under a digital rock (whaddyamean, you don't check TechMeme hourly?), you have probably heard that EPIC filed a complaint with the Federal Trade Commission Thursday, alleging that Facebook's revised privacy settings (and their implementation) constitute "unfair and deceptive trade practices" punishable under the FTC's Section 5 statutory consumer protection authority. Specifically, EPIC demands, in addition to "whatever other relief the Commission finds necessary and appropriate," that the FTC "compel Facebook to restore its previous privacy settings allowing users to:"
- "choose whether to publicly disclose personal information, including name, current city, and friends" and
- "fully opt out of revealing information to third-party developers"
In addition, EPIC wants the FTC to "Compel Facebook to make its data collection practices clearer and more comprehensible and to give Facebook users meaningful control over personal information provided by Facebook to advertisers and developers."
I'll have more to say about this very complicated issue in the days to come, but I wanted to share, and elaborate on, two press hits I got on this issue today. First, in the PC World story, I noted that "we're already seeing the marketplace pressures that Facebook faces move us toward a better balance between the benefits of sharing and granular control" and expressed my concern "about the idea that the government would be in the driver's seat about these issues." In particular, Facebook has made it easier for users to turn off the setting that includes their friends among their "publicly available information" that can be accessed on their profile by non-friends (unless the user opts to make their profile inaccessible through Facebook search and outside search engines).
In other words, this is an evolving process and Facebook faces enormous pressure to strike the right balance between openness/sharing and closedness/privacy. While Facebook's critics assume that it is simply placing its own financial interests above the interests of its users, the reality is more complicated: Facebook's greatest asset lies not in the sheer number of its users and not just in the information they share, but in the total degree of engagement on the site. The more time users spend on the site, the better, because Facebook is rewarded by advertisers for attracting and keeping the attention of users.
Reputational Incentives
The best part of EPIC's complaint is the seven (of 29) pages spent providing examples to prove that "Facebook Users Oppose the Changes to the Privacy Settings." Thanks, EPIC, for proving my point: users are not helpless sheep; they are actually capable of packing their bags and walking if they don't like the deal they're being offered. We can have a legitimate and fair conversation about whether that deal is being offered clearly enough; the FTC certainly does have a role to play there in avoiding truly unfair and deceptive offerings and changes. But if you really don't like Facebook that much, no one is forcing you to stay. The complaint quotes approvingly from an editorial in the Boston Globe: "Over time, privacy changes can only alienate users." The Globe argues that "Facebook should be helping its 350 million members keep more of their information private." (Ah, now I see... EPIC simply wants the FTC to make Facebook do what's really in its own best interests, because Facebook is no more capable of recognizing this subtle point of business strategy than most users can recognize just how dangerous it is to share more information than EPIC thinks they should. If users really want to be "protected" so badly, perhaps EPIC should get out of the advocacy business and start a social network of their own. "PrivateBook" has a nice ring to it: the site where nobody can friend or message you and information is locked down tighter than the Green Zone. That sure sounds like a fun and useful site! I can't imagine why no one has tried it.)
But Facebook has to fear not just driving away some users who will actually shut down their profiles and switch to the dozens of other social networking tools out there, but, more importantly, the possibility that many more users will simply be discouraged from using the site as actively as they might have. The very "chilling effects" that so concern Facebook's critics are also a serious problem for Facebook in the aggregate: Every minute a user doesn't spend on the site because of privacy worries, whether specific and articulated or vague and generalized, is simply lost revenue for Facebook.
Worse, Facebook knows that the time users spend on its site is only as valuable as Facebook's advertising inventory. The best way to drive down its ad prices is to make advertisers wary of associating their brands with Facebook's. So Facebook needs to make not just users, but also advertisers, feel comfortable. Bad headlines mean bad ad prices mean bad stock prices.
Comprende?
Moreover, Facebook must know that, just as it grew from a dorm room project into a rival for the likes of Google--another outgrowth of a dorm room project--the next Facebook-killer, like "The Truth..." in X-Files, is "...Out There." Perhaps it's Twitter, perhaps it's some array of Google products, or perhaps it's some service we haven't yet even conceived of. But Facebook knows that if it doesn't keep up with the growing trend towards information-sharing, it will eventually "jump the shark" and go the way of Friendster, the AOL walled garden, BBSs, etc.
Granularity of Privacy Controls
So, while "privacy advocacy" groups like EPIC have an important role to play in helping to focus and articulate the concerns of privacy-sensitive users, it's not obvious that calling the government in to call the shots is the best way to produce privacy controls that empower privacy-sensitive users within the context of an increasingly open system. In general, that means building more granular controls over privacy. Facebook has done a great job of that with the publishing controls, which allow users to decide who can see every post a user makes or photo user shares. But EPIC's 9,000+ word complaint doesn't even mention this radical increase in the granularity of control given to users over each new piece of information they publish.
Instead, EPIC--you know, the folks who've tried repeatedly to shut down Gmail and Google Docs because they just can't stand certain kinds of data sharing--focuses on three issues:
- The fact that some information (name, profile picture, gender, current city, networks, friend list, and Pages) is included in the "publicly available information" accessible by anyone--unless, again, the user chooses to make their profile and accessible through search engines or not to list their friends.
- The sharing of user information through third-party applications.
- The fact that, although Facebook presents each user with a truly "unmissable notice" about the new privacy policies and requires them to view their new settings, the "recommended" (default) settings are to share most information with "everyone."
On the first point, I can understand that some users might legitimately worry about having their Pages made publicly available if their Pages include potentially controversial subjects. So yes, I'd personally like to have the ability to opt out of having certain Pages listed on my profile. Why didn't Facebook give me this granular control (instead of the cruder control of simply de-indexing my profile)? Perhaps it's because groups like EPIC usually criticize privacy interfaces that give users lots of granular controls as "too hard," "too complex" and "unusable." As Adam Thierer recently noted in FCC comments on the subject of parental controls as "less restrictive" alternatives to government regulation of content deemed inappropriate for children:
There is a trade‐off between complexity and convenience...: Some critics argue parental control tools need to be more sophisticated; others claim parents can't understand the ones already at their disposal. But there is no magical "Goldilocks" formula for getting it "just right." There will always be a trade‐off between sophistication and simplicity; between intricacy and ease‐of‐use.
"Damned if you do, damned if you don't," it seems. My point here is not to excuse Facebook for falling short of the sort of radical user empowerment I would like to see, but to point out that "privacy advocates" have unintentionally created some disincentives to build more granular systems of control. Or perhaps it's not unintentional: If you really think information sharing is dangerous and that it's just too hard for "average" users to make decisions about this, increased granularity only worsens the problem of getting users to make the "right" choices according to the personal preferences of the folks at EPIC, which they want to impose on everyone else--at least by setting a restrictive default ("opt-in"). As I noted in the
eWeek story:
[Facebook is] trying to encourage users to share more information... Unlike EPIC I don't think that's a bad thing, as long as they do it correctly. If EPIC had their way, they would impose on everybody this mandate that 'Thou shall not share unless ... you've checked this box and you've gone through all these careful setting changes... I just think that's unwarranted because most users aren't that concerned about sharing this information, and [for] those that are, this solution is [a way] to empower them.
One critical clarification: What I actually said here (or meant to say) was not that "this solution" (the specific controls offered by Facebook in its latest iteration of its evolving privacy settings) is necessarily the best way to empower users, but that the general approach here should be to focus on user empowerment, rather than setting restrictive defaults--which is what EPIC really seems to want. Note, in particular, the subtle but important difference between the two demands I quoted from their complaint at the top of this piece. EPIC is demanding a comprehensive "opt-out" from the sharing of personal information with applications, but also that users "choose whether to publicly disclose personal information, including name, current city, and friends"--in other words, although EPIC is politically savvy enough not to use the term, an opt-in.
While I would certainly like to see Facebook implement more granular controls for sharing of information with applications (a very thorny issue because so many applications rely on the sharing of data to be useful to users--about which I'll have more to say in the future), I just don't see what the big deal is about sharing such generic information. For certain categories of information that is unlikely to be sensitive to many users, it's okay by me for Facebook to say:
Look, here is the basic bundle of information that we are going to make available about users in order to make basic profiles a useful and consistent feature across the site, such as for identifying new friends or being able to distinguish two people with similar names from each other before you message or friend them. If you don't like this, you can choose not to make your profile available through search. And if that's not good enough for you, maybe you shouldn't use our service.
For certain information, like Pages and friends, more granular control may well be merited. But Facebook is already moving in that direction for the reputational and other market reasons discussed above. For other information, like name, current city and photo, what's the big deal? I'm all in favor of empowering users to choose for themselves, because privacy is a profoundly subjective thing, but... name? Really? And what's the harm that requires government to start designing user interfaces? EPIC may claim that they are asking for nothing more than that Facebook revert to the old privacy policies, but of course that just means Facebook will have to play "Mother, may I?" with the FTC every time it goes back to the drawing board and has to figure out how to update its privacy settings in an ever-changing world.
That's the best way to subtly convert Facebook into what is essentially a "public utility," subject to ongoing regulatory review under formal consent decree or simply because EPIC and its allies are constantly hanging the regulatory "Sword of Damocles" over Facebook's head. Sounds like a sure-fire remedy for innovation to me! Maybe Facebook could dispense with this whole "anti-privacy" "advertising" business model and just start (along with PrivateBook, no doubt) filing tariffs for taxpayer subsidies or fixed subscription rates with whatever government agency is going to be responsible for funding all media under an expanded version of the "public option" concept being kicked around for traditional media by the radical Digital Left.