April 26, 2010

Like It or Hate It, Facebook Shoves Us Down Slippery Slope

Last week, Facebook made some big announcements at the company's F8 conference, including a few aggressive moves to extend the company's presence to third-party Web sites. The plan consisted largely of enabling people to "like" content from around the Web, transmitting their downstream activity back to Facebook, and, in a move that garnered an incredible amount of backlash from those opposed to the company's practice of automatically opting users in, introducing the opportunity for partners to personalize your Web experience based on your profile. Facebook's introductions, as close as they are to exactly what I have been asking for over the last few years, run contrary to what the company's large user base expected from the network, and the company's methods arguably violate the public trust. While I do not have plans to delete my profile out of protest, as many have, I do not plan on acting as if any of the content I place on the site is private. I expect my data, all of it, to be publicly searchable, shareable and indexable by the company and its partners, forever, and will act with that in mind.

In April of 2009, I practically begged for ad companies to "truly leverage our social profiles", specifically Facebook. At the time, I wrote:
" I think the real money is in Facebook offering to team up with all the major advertisers on the Web (Google/Doubleclick included) and letting said advertisers tap into our personal profiles, giving them a cut of the downstream revenue. Facebook has proven they know us, even as their ad team does not. Turn over the right data to ad people who do know what they're doing, and maybe I'll stop being annoyed by ads that have nothing to do with me."
Last week, Facebook opened up my profile to a select few partners, including Pandora, Yelp and Docs.com, a Microsoft front for their Web Office plans. While the ads are still terrible on Facebook (and other sites), presumably these three starter sites are going to be customized just for me. Sure enough, logging into each presents a "personalized" experience, featuring avatars of friends with whom I am connected. In theory, my recommendations on Yelp get better, my musical recommendations from Pandora get better and Docs... well, I have no idea. But two out of three's not bad, right?

Docs.com Uses Facebook to Personalize My Experience

I can't get too riled up about Facebook trying to make the Web a better place through leveraging my social profile. To be honest, I want this.

I want these downstream sites to tap into my personal profile and use the data they find. I also believe I am behaving online in a way that leaves no skeletons in the closet should all of this data become public some day. I have seen Facebook work to extend to a more "public" (not open) status, and expect all my content to go with it.

I also can't get too riled up about Facebook making this move unilaterally, especially given my stance on Buzz: I thought the company (Google) made some initial mistakes around privacy, but I stuck with it, waiting for them to get the formula right. To accept Google's missteps but throw Facebook to the wolves would rightly appear two-faced.

From what I have seen, the major concerns around Facebook's moves are two-fold. First, there is a concern that the "special agreements" made with the three partners mentioned above are only the tip of the iceberg, and that as Facebook adds more relationships, it will extend those partners' ability to see more and more of your content. Second, there is a concern, one I believe to be accurate, that the overwhelming majority of Facebook users will not only not understand what has changed, but also will not know how to opt out of the process. It is clear that getting out of the process requires many steps, many more than it took to get into it, which was effectively zero.

The flareup of response around Facebook's move has prompted more feedback on the company's blog, which aims to quell some of the concerns. In a post today, prefaced by "answers to your questions", Facebook said, "Our highest priority is to keep and build the trust of the more than 400 million people who use our service every month," adding, "None of your information—your name or profile information, what you like, who your friends are, what they have liked, what they recommend—is shared with the sites you visit with a plugin." The company also pointed to a help center, which tries to explain what is shared with whom and when. The response wasn't very reassuring, but it was better than nothing.

I am reminded of the phrase "first, do no harm", which is often incorrectly attributed to the Hippocratic Oath. If Facebook has changed the rules in the middle of the game, and is slowly exposing more customer information than before, after users had an assumption of privacy, then that is wrong. But the question is, has Facebook violated users' trust based on the letter of the law, or the intent of the law? It seems to me that there are those who feel no data from Facebook should ever be shared with third parties without your explicit permission. But there are others, no doubt some at Facebook, who think you should be able to share as much as possible, right up until the point where somebody could get hurt. So we ask, "where is the harm?" The benefit of a customized music station that borrows from your friends' likes seems fairly innocuous, as do personalized restaurant reviews. But then, what next?

One prominent tech geek I talked to on Friday asked me if I would be comfortable knowing my insurance carrier had shared my medical history with a third party, without giving me the opportunity to decline, suggesting Facebook's moves were the same: one "trusted" resource passing along personal, private data I had not intended to be public. While I understood his point, I did not share the outrage, as I never assumed my data on Facebook would be safe. In contrast to Blippy, Mint.com and others (which I discussed yesterday), I never had an explicit relationship with Facebook whereby I would give them information that, if it leaked, would amount to a breach of contract or even a legal violation.

As the tumult continues to rage, I think we are in an odd position in the industry, in serious need of independent third-party leaders who have our data protection and privacy in mind. Many of the biggest names on the open standards front or in the privacy game are now working hard at the very companies that have a vested interest in monetizing our data. It's both a great thing and a bad thing, but when a company like Facebook makes moves like these, it seems we don't have anyone trustworthy to stand up and say "this is wrong," if it is indeed wrong, and spell it out. That's why on Saturday I posted a quick note to Google Buzz asking the leaders of these companies to find a way to help users without taking potshots at each other. I posted that note on Buzz, rather than this blog, as it was a casual and personal note, and it reached the people I hoped it would. But it also led to some tremendous conversation in the comments about why we need independent voices that have impact, and right now, it might have to be government oversight that lays down the law.

We are already seeing U.S. Senators asking for a solution to problems like those posed by Facebook and other companies that hold our personally identifiable data and appear cavalier about sharing it. Charles Schumer is already pushing the FTC for guidelines. But the case, as with most cases involving the Web, innovation and changing culture, is not cut and dried. There is no clear "good guy" and "bad guy" here. Take, for example, David Recordon's position when he wrote "Why f8 was good for the open web". I believe David thinks what Facebook is doing is the right thing, just as much as I think some prominent Googlers and other techies think Facebook is doing absolutely the wrong thing. It comes down to whether you think Facebook is on the path to violating your trust, or whether they already have, and depending on your experience, upbringing or personal preference, your results may vary.

As far as I am concerned, Facebook is in a very difficult position in terms of being able to make change without upsetting the apple cart. The company enticed nearly half a billion users into sharing some of their most private activity with the network, and now it needs to grow in new ways that will require this data to be mined and shared. The company probably doesn't feel it has the luxury of asking its users for permission to share their data, because it knows very few would agree outright, without convincing. Opting them in automatically and making them work hard to get out is a much more lucrative and beneficial position for Facebook in terms of revenue and growth. It's very similar to what Dare Obasanjo of Microsoft said when we were at SXSW this March, when he suggested that users' and companies' goals were in conflict.

Facebook has such a strong position on the social Web that it can do practically anything and keep its users. My family uses Facebook for networking, photo sharing and commenting. Even if they are vaguely aware of the changes, they aren't going to delete their Facebook accounts because Facebook started being aggressive with its partners. And Facebook knows this. The company now has an opportunity to leverage its social capital and grow into more of the rest of the Web - even if its users would not have given it permission in advance. Whether you think we are approaching the point of no return, or have already crossed it, that is the direction we are headed, and it can't be undone. Just plan on your data online being public - all of it.