Personal Privacy Impact Assessments for Facebook

I’m reading Canada’s Assistant Privacy Commissioner Elizabeth Denham’s recently released findings into complaints levied against Facebook. (Report of Findings into the Complaint Filed by the Canadian Internet Policy and Public Interest Clinic (CIPPIC) against Facebook Inc. Under the Personal Information Protection and Electronic Documents Act.) My first reaction to this is, frankly, one of jealousy. I wish we had a similar commissioner/czar/wonk here in the US. I suppose elements of the FTC work in this regard, but without the same charter, which is too bad.

Section 4 of the report is, for me, where the action is at. Section 4 is concerned with 3rd-party applications in Facebook and the use of personal data by those applications. As the Facebook platform grows with new additions like Facebook Connect, issues of third-party access to user information will continue to be a concern to those who pay attention to such things. There’s a challenge here: how 3rd-party applications use user information is hard to decipher, since from an end-user perspective it is a fairly black-box operation.

I wonder if Facebook could build a personal privacy impact assessment (PPIA) app. The PPIA would analyze the action you are about to take on Facebook, your privacy settings, the 3rd-party apps you’ve allowed access to your profile, and the privacy settings you have set for those apps. The PPIA could give you a quick read on which applications would be privy to the action you are about to take. It could indicate which groups of friends (based on your privacy settings) would see what you are about to do. Essentially, it would let you see across how much of your social graph a certain action (like posting a link or photo) will travel.
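To make the idea a bit more concrete, here’s a minimal sketch of the kind of reach calculation such a PPIA might perform. Everything in it is made up for illustration; the data structures, app names, and permissions model don’t correspond to anything Facebook actually exposes.

```python
# Hypothetical PPIA sketch: given a proposed action, your audience settings,
# and the 3rd-party apps you've granted access, report who would see it.
# All structures here are illustrative assumptions, not a real Facebook API.

from dataclasses import dataclass, field


@dataclass
class PrivacySettings:
    # Audience per action type, e.g. {"post_photo": {"friends", "friends_of_friends"}}
    audiences: dict[str, set[str]] = field(default_factory=dict)


@dataclass
class ThirdPartyApp:
    name: str
    # Action types this app has been granted access to read
    granted_actions: set[str] = field(default_factory=set)


def assess_action(action: str,
                  settings: PrivacySettings,
                  apps: list[ThirdPartyApp]) -> dict:
    """Return which friend groups and 3rd-party apps would be privy to this action."""
    visible_to_groups = settings.audiences.get(action, set())
    visible_to_apps = [app.name for app in apps if action in app.granted_actions]
    return {
        "action": action,
        "friend_groups": sorted(visible_to_groups),
        "third_party_apps": sorted(visible_to_apps),
    }


if __name__ == "__main__":
    settings = PrivacySettings(audiences={"post_photo": {"friends", "friends_of_friends"}})
    apps = [
        ThirdPartyApp("QuizApp", {"post_photo"}),       # hypothetical app names
        ThirdPartyApp("CalendarApp", {"post_link"}),
    ]
    print(assess_action("post_photo", settings, apps))
```

Even a toy readout like this (“posting this photo reaches friends, friends of friends, and QuizApp”) is the kind of quick warning I have in mind.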

We all have PPIAs built in – ones cultivated through social interaction and schooled by social norms. When it comes to dealing with large systems like Facebook, big business, or the government for that matter, we can all use a little help. I wonder if someone can get a PPIA prototype up ahead of Catalyst to at least give me a warning about potentially embarrassing photos being posted somewhere…

(Cross posted from Burton Group’s Identity Blog.)

Putting privacy controls in the hands of your users

I mentioned yesterday that Bob and I have just finished up some research on privacy.  In this upcoming report, we stress the importance of establishing privacy principles and then using those principles to guide privacy practices.  I happened to see this NY Times article (via Nishant’s Twitter stream) and had a bit of a Baader-Meinhof moment.  The article talks about how social networking sites are giving their end-users more and more control over how information is disclosed.  Giving users choice as to how their information is disclosed and used is important.  Giving users meaningful choice as to how their information is used is much better.

One of the privacy principles that Bob and I examine in our report is the principle of Meaningful Choice:

Robbing others of the ability to exercise their free will is an affront to dignity; therefore we allow people to decide how we will use information about them.  When presenting people with choices about how we will be allowed to use their information, we design easy-to-understand interfaces which reduce the possibility of confusing people, and we avoid creating “Hobson’s choice” situations in which people are forced to choose the lesser of a set of evils.

As an ex-interface and product designer, I am especially sensitive to usability, and the principle of Meaningful Choice directly addresses this.  Providing an end-user with a difficult-to-use privacy settings tool and then saying, “Well, we gave you choice as to how your information gets used” exploits the power imbalance between the service provider and the user.  As the interaction between the user and the service provider becomes more and more valuable (moving from social networking to, say, electronic health records), such exploitation becomes less and less acceptable.

In the course of our research we talked to one company that spent many months trying to get its privacy settings interface right.  They brought people (non-techies even!) into their usability lab and studied how these users set (or didn’t set) privacy settings.  The design team fully acknowledged that building a usable, meaningful interface for privacy settings was hard, but considering the context, the effort was required.

End-user privacy controls are mandatory.  But in the absence of a usable interface, end-user control is not control at all.

(Cross-posted from Burton Group’s Identity Blog.)

Trip report from the Privacy Symposium

This is a cross-post from Burton Group’s Identity Blog.

BTW, I am moderating a panel at Defrag. If you use “ig1” as a registration code, you get $200 off the registration fee. Hope to see you there!

A few weeks ago I was up in Cambridge at the Privacy Summer Symposium.  Gathered together on Harvard’s campus were a collection of lawyers, activists, government officials, and privacy officers discussing various aspects of privacy.  It was certainly a bit of a change for me to be at a non-tech-heavy conference.  Besides hearing people like the chairman of the FTC, William Kovacic, speak, I got to witness the launch of EPIC’s Privacy ’08 campaign.  Further, I got to hear Jeffrey Rosen share his thoughts on potential privacy “Chernobyls,” events and trends that will fundamentally alter our privacy in the next 3 to 10 years.

Privacy Chernobyl #1 – Targeted Ads

We’ve already seen enough concern over targeted ads to trigger Congressional hearings.  The Energy and Commerce Committee in the House has been asking ISPs and advertising providers to answer questions as to how they track and use clickstream data.  Not to be left out, the EU has notified the UK that it must respond to an inquiry into whether the Phorm system violated EU data privacy laws.  From NebuAd to Phorm to DoubleClick and beyond, targeted ads are getting more and more targeted.  The real concern for Rosen is the downstream use of the collected clickstream data.  For example, my ISP may not directly do anything overly odious with the information about which sites I visit, but if it sells that information, the 2nd- and 3rd-generation data users may be a bit more nefarious.

Privacy Chernobyl #2 – Personally identifiable search term leak

The knowledge of who I am and what I search for can be used in a variety of ways: from serving voyeuristic desires to putting me in a compromising situation.  Without being able to provide the context for the searches, I could be judged unfairly by people like a potential employer, dating service, or insurance provider, and these judgments can have a real impact on my life.  As YouTube has to disclose data for its court case, I have to imagine there are some people who really don’t want identifiable searches being disclosed into the public record.  As more and more of our web browsing is search-driven, the potential impact of this problem will only grow.

Privacy Chernobyl #3 – Unexpected data exposure on Facebook

There are two concerns with this issue.  The first is one that Rosen categorized as data exposure in unexpected ways.  I may have tailored my Facebook profile’s privacy settings to what I think strikes a decent balance between my desire to connect to people and maintaining some level of privacy.  But what is unclear is how Facebook application developers are using my data, including clickstream data.  I don’t expect to hear a friend tell me that I “friended” a product whose ad appeared in Facebook for her, and I certainly never want to hear that this has happened.
