Learning the Intricacies of Privacy in the Facebook Blowup

Bruce Schneier, considered a foremost expert on security and privacy, and well known for his outspoken criticisms of the Bush Administration’s handling of post-9/11 security issues, writes in Lessons from the Facebook Riots:

“Welcome to the complicated and confusing world of privacy in the information age. Facebook didn’t think there would be any problem; all it did was take available data and aggregate it in a novel way for what it perceived was its customers’ benefit. Facebook members instinctively understood that making this information easier to display was an enormous difference, and that privacy is more about control than about secrecy.”

and

“But as the Facebook example illustrates, privacy is much more complex. It’s about who you choose to disclose information to, how, and for what purpose. And the key word there is “choose.” People are willing to share all sorts of information, as long as they are in control.”

And finally, he speaks explicitly to what we talk about here at Bokardo: social web design.

“We’re all still wrestling with the privacy implications of the internet, but the balance has tipped in favor of more openness. Digital data is just too easy to move, copy, aggregate and display. Companies like Facebook need to respect the social rules of their sites, to think carefully about their default settings (they have an enormous impact on the privacy mores of the online world) and to give users as much control over their personal information as they can.” (emphasis added)

Schneier’s right about all of this. It’s about control and, perhaps more important, the perception of control. Our social software needs to provide mechanisms for control, choices about how to use those mechanisms, and ongoing feedback about them. We’re only beginning to learn what having our personal information out there in the world really means.

We’re talking a lot about this at work, too. Jared wrote a nice Facebook piece last week on designing for what he calls embraceable change, meaning that we need to know how to change things so that users will embrace the change rather than flee from it. We talk with folks all the time who piss off their customers… trying to do something great but actually sullying the relationship a bit.

This all reminds me of my earliest experiences with social software: email. I was in college in 1996, and more often than not (it seemed) someone replied to “all” instead of just to the person who sent the mail. There were many cases of emails inadvertently sent to the wrong person. And that hurt not only the person who mistakenly received the message but also the person it was meant for, because now the conversation was more public than it should have been.

In general, it seems that letting users choose their own level of privacy/publicity for their content is vital to continued happiness and harmony.

Published: September 21st, 2006