Paul Rosania

Hi! I'm Paul. I work at Slack.
You should follow me at @ptr.
I can also be reached at paul@rosania.org.

On Privacy (or: What Buzz failed to learn from Newsfeed)

February 12, 2010

On Friday, September 8, 2006, Mark Zuckerberg published an apology for the botched launch of Facebook Newsfeed. The gist of his open letter was:

  • We did a poor job explaining the new features
  • We did not provide anywhere near enough privacy control
  • We are launching better privacy controls today, after a marathon 48 hour coding session

A public apology from the CEO of a major company is commendable. But contrast this with his post from Tuesday night, just three days earlier:

  • Newsfeed is great and you need to give it a chance
  • Your privacy settings have not changed
  • Things that were private are still private
  • Your friends can see the exact same things they could see before

What changed? Why the sudden shift in tone? Mark was coming to grips with a fundamental issue facing social software:

Companies don’t understand privacy!

... and as a result, they make the same mistakes, over and over again.

We live in public. And we always have. More so now, perhaps. But if you have ever had your picture taken, eaten at a restaurant, or had an argument in a public place, a little bit of yourself has been copied into the ether. Yet just because we do these things does not mean we want them to be public. That argument with your significant other would stop dead if a film crew showed up and pointed a camera in your direction.

In a declarative sense, one can think about privacy as action and context. What we do, and where and with whom we do it. Indeed, the Googles and Facebooks of the world have gotten pretty sophisticated about this kind of privacy. "Share my vacation photos with my close friends." "Invite my coworkers to my housewarming."
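The declarative model described here (an explicit action paired with an explicit audience) can be sketched roughly as follows. This is a toy illustration with hypothetical names, not any real Facebook or Google API:

```python
from dataclasses import dataclass

# Hypothetical sketch of a declarative privacy rule: the user names the
# action and the audience explicitly -- nothing is inferred.
@dataclass(frozen=True)
class Rule:
    action: str      # e.g. "share_vacation_photos"
    audience: str    # e.g. "close_friends"

def may_see(rules: list[Rule], action: str, viewer_group: str) -> bool:
    """A viewer group sees an action only if a rule explicitly allows it."""
    return any(r.action == action and r.audience == viewer_group for r in rules)

rules = [
    Rule("share_vacation_photos", "close_friends"),  # "Share my vacation photos with my close friends."
    Rule("housewarming_invite", "coworkers"),        # "Invite my coworkers to my housewarming."
]

print(may_see(rules, "share_vacation_photos", "close_friends"))  # True
print(may_see(rules, "share_vacation_photos", "coworkers"))      # False: no explicit rule, no access
```

The key property is that absence of a rule means absence of access: the software never has to guess.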

However (as is typical with computers), things break down when you start to make inferences. She wrote on someone’s public wall, therefore she intended for everyone to read it. She posted the pictures on her public blog, therefore she wants her coworkers to see them. It’s pretty easy to see where the wheels come off. That public argument we were discussing earlier? It happened in the middle of Faneuil Hall with thousands of people around. Let’s podcast it automatically!

Action: audio conversation; context: extraordinarily public place. Privacy setting: Everyone.

What’s missing is intent.

Just because something is public does not mean it is intended to be seen. We do things in public all the time that would be humiliating or destructive if broadcast. We are so used to doing these things that sometimes we don’t even notice the context in which we are acting! Relying on the context of an action to determine intent is a recipe for failure.
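To make the failure mode concrete, here is a toy sketch contrasting the naive inference with an intent-respecting default. All names are hypothetical; no real product works exactly this way:

```python
from typing import Optional

def inferred_audience(context_is_public: bool) -> str:
    # The naive inference the essay warns against:
    # "it happened in public, therefore everyone may see it."
    return "everyone" if context_is_public else "private"

def declared_audience(user_setting: Optional[str], context_is_public: bool) -> str:
    # Safer alternative: absent an explicit setting, assume the narrowest
    # audience, no matter how public the context happened to be.
    return user_setting if user_setting is not None else "private"

# The argument in the middle of Faneuil Hall:
print(inferred_audience(context_is_public=True))        # "everyone" -- the podcast nobody asked for
print(declared_audience(None, context_is_public=True))  # "private" -- intent was never declared
```

The second function still accepts an explicit choice to go public; it just refuses to manufacture that choice from context alone.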

Privacy and the human mind.

As we use software, we map its features to our intentions. We learn how to perform actions, and we learn to control context using privacy settings. But that map is not perfect. We can be confused or misled by the abundance of options or by our interpretation of instructions. We can be downright lazy. Or, like the public argument, we can simply forget (or ignore) context. When those things happen, we rely on software to understand our intent, and shield us from mistakes caused by our imperfect mental map. And unfortunately, when it comes to understanding intent, computers fail. (Sorry, Google.)

What about Buzz?

Back to Buzz. The controversy over privacy does not stem from the design of Buzz; it stems from its default settings. The Google Buzz team is attempting to make Buzz useful right out of the gate by guessing your preferred lists of followers and followees. But computers fail at intent, and despite thoughtful design of its privacy settings, Buzz is a privacy failure.

A path forward?

Failed launches like Newsfeed and Buzz can teach us something about how people think about privacy, and how to design software to be understood and accepted by its users. Facebook taught us that declarative privacy settings can work. However, both of these failures highlight two more key issues:

  • Computers suck at intent. Inferring privacy preferences for new software, based on prior actions in old software, is a recipe for failure, and a PR nightmare.
  • People assume computers are great at intent. We publish things to much wider contexts than we intend, and don’t notice or care until new products and features make incorrect inferences based on that.

The good news is that smart people are working on these problems. Let’s just hope they are learning from each other.

In summary

  • Shockingly, these mistakes have been made before
  • Companies like Facebook and Google are only just beginning to understand privacy
  • Explicit, declarative privacy settings are good
  • Users are loose with their settings, and trust software to “do the right thing”
  • When it comes to social software, computers suck at figuring out what the right thing is
  • Products can improve if competitors can learn from each other