Personal Privacy Impact Assessments for Facebook

I’m reading Canada’s Assistant Privacy Commissioner Elizabeth Denham’s recently released findings into complaints levied against Facebook. (Report of Findings into the Complaint Filed by the Canadian Internet Policy and Public Interest Clinic (CIPPIC) against Facebook Inc. Under the Personal Information Protection and Electronic Documents Act.) My first reaction to this is, frankly, one of jealousy. I wish we had a similar commissioner/czar/wonk here in the US. I suppose elements of the FTC work in this regard, but without the same charter, which is too bad.

Section 4 of the report is, for me, where the action is at. Section 4 is concerned with 3rd-party applications in Facebook and those applications’ use of personal data. As the Facebook platform grows with new additions like Facebook Connect, issues of third-party access to user information will continue to be a concern to those who pay attention to such things. There’s a challenge here: the ways in which 3rd-party applications use user information are hard to decipher, as this is, from an end-user perspective, a fairly black-box operation.

I wonder if Facebook could build a personal privacy impact assessment (PPIA) app. The PPIA would analyze the action you are about to take on Facebook, your privacy settings, the 3rd-party apps you’ve allowed access to your profile, and the privacy settings you have set for those apps. The PPIA could give you a quick read on which applications would be privy to the action you are about to take. It could indicate which groups of friends (based on your privacy settings) would see what you are about to do. Essentially, it would let you see across how much of your social graph a certain action (like posting a link or photo) will travel.
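To make the idea concrete, here’s a minimal sketch of what such an assessment might compute. Everything here is an illustrative assumption — the data structures (`Profile`, `ThirdPartyApp`), the field names, and the `assess` function are all hypothetical, not part of any real Facebook API: given your audience settings, friend groups, and installed apps, the check reports who (and what) would see a given action before you take it.

```python
# Hypothetical PPIA sketch. All names and structures are illustrative
# assumptions, not a real Facebook API.
from dataclasses import dataclass, field


@dataclass
class ThirdPartyApp:
    name: str
    can_read: set = field(default_factory=set)  # content types the app may read


@dataclass
class Profile:
    audience: dict  # content type -> list of friend groups allowed to see it
    groups: dict    # friend group name -> set of friend names
    apps: list      # installed ThirdPartyApp instances


def assess(profile, content_type):
    """Report which friends and apps would see an action of this type."""
    visible_groups = profile.audience.get(content_type, [])
    friends = set()
    for g in visible_groups:
        friends |= profile.groups.get(g, set())
    apps = [a.name for a in profile.apps if content_type in a.can_read]
    return {"friends": sorted(friends), "apps": apps}


profile = Profile(
    audience={"photo": ["close", "work"], "link": ["close"]},
    groups={"close": {"alice", "bob"}, "work": {"carol"}},
    apps=[ThirdPartyApp("QuizApp", {"photo"}), ThirdPartyApp("NewsApp", {"link"})],
)

report = assess(profile, "photo")
# report["friends"] -> ['alice', 'bob', 'carol']; report["apps"] -> ['QuizApp']
```

Posting a photo here would travel to both the “close” and “work” groups and be readable by QuizApp — exactly the kind of pre-flight warning the PPIA would surface.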

We all have PPIAs built in – ones cultivated through social interactions and schooled by social norms. When it comes to dealing with large systems, like Facebook, big business, or the government for that matter, we all can use a little help. I wonder if someone can get a PPIA prototype up ahead of Catalyst to at least give me a warning about potentially embarrassing photos being posted somewhere…

(Cross posted from Burton Group’s Identity Blog.)

Privacy Risks Get Real – California Privacy Laws, Octomom, and Kaiser Permanente

No organization wants to be the first  to be fined because of a new regulation. Unfortunately, that’s exactly where Kaiser Permanente finds itself.  After some high profile cases of unauthorized access to celebrities’ medical records, the California legislature adopted two new privacy laws (SB 541 and AB 211);  these regulations were so swiftly enacted that they contained spelling errors. Both regulations went into effect on January 1 of this year. Five months later, Kaiser Permanente has become the first enterprise to be fined under this new regime.

Regulators have levied the maximum fine, $250,000, for the recent incident involving Nadya “Octomom” Suleman.  (Kevin commented on this previously.)  All in all, 23 individuals looked at Ms. Suleman’s records without authorization. Of these, 15 have either been fired or resigned.  And although the state regulators have fined Kaiser, they have yet to penalize any of these 23 individuals – which they can do under state law.

As reported in the LA Times, Suleman’s lawyer said:

I think Kaiser handled it professionally. They found out, they terminated the employees, they brought it to our attention. They certainly didn’t try to hide it.

It’s important to note that even though Kaiser acted appropriately, laws like SB 541 are clear-cut: unauthorized access to medical information = fine. Do not pass Go; do not collect $200.

As we’ve said before, privacy risks are real. The fines are increasing. The number of regulations is increasing. Now more than ever is the time to register for this year’s Catalyst conference so you can attend our Privacy Risks Get Real track and learn how to reduce the chance your organization will become the next “first.”

(Cross posted from Burton Group’s Identity blog.)

The beginning of the beginning: our privacy report publishes

Over the last 6 or so months, Bob Blakley and I have been doing a lot of listening and thinking about privacy.  To successfully re-launch our privacy coverage, we needed to lay a wide foundation that would serve to support future research.  We needed to provide a meaningful starting point for our customers.  Since our customers’ jobs are not typically focused on privacy, we needed to start with a form of first principles and build outward. 

I’ve learned that it is generally frowned upon to use the second person in our reports – too informal, I am told. Use the blog if you want to address the audience directly. Normally, I don’t have a problem avoiding the second person, but this report proved to be a challenge. We had to work hard to write without using “you.” And why was that? Privacy discussions are and must be inclusive. They involve each of us on a far more personal level than a discussion of, say, account lifecycle management. Cognizant of privacy implications or not, the decisions you make on a daily basis affect the privacy of your customers and partners.

Because privacy is personal, because it requires concerted behavior throughout the enterprise, discussions about privacy must include everyone. You. Me. Everyone. To guide concerted behavior, in our recently released privacy report, we put forth a Golden Rule as a means of developing and evaluating privacy principles leading to practices and behaviors:

We protect privacy when we consider the dignity of individuals about whom we know things, and when we use what we know about them only in ways which preserve and enhance that dignity.

This report is by no means the end of our exploration of privacy – it is just the beginning. We will continue the conversation this July, at Catalyst North America, in the “Privacy Risks Get Real” track. We are working hard to ensure that these discussions reflect the inclusive nature of privacy. We’ll be exploring privacy concerns across multiple domains: from healthcare to higher education. Finally, to sweeten the deal, we have worked with the International Association of Privacy Professionals to get some of the tracks at Catalyst approved for Continuing Privacy Education credits. We are looking forward to continuing the privacy conversations with all of you this July!

Speaking of Catalyst, we have a special surprise for IdPS blog readers… Since it is Easter egg hunting season, we’ve placed a couple of eggs on the Catalyst web site. The prize inside is a super discount code to attend Catalyst. To find the eggs, go to the conference web site and do this:

  • Hover (but don’t click) over the “San Diego” icon for 20 seconds

-or-

  • Click and hold on the Catalyst logo and then drag your mouse off and release

Register right away – this discount is limited to 50 users and could disappear at any time!

(Cross posted from the Identity Blog @ Burton Group.)

Privacy risks get real

When you think of “the usual” privacy risks you think of things like brand and reputation damage, fines, and increased regulations. You don’t think of jail time for executives. But jail time is exactly what some Google executives face if an Italian prosecutor has his way.

The arrest of Peter Fleischer, Google’s Paris-based Global Privacy Counsel, in Milan on January 23 stems from video that was briefly available on Google’s site in Italy. The video showed high school students bullying a classmate with Down Syndrome. Google took down the video in less than 24 hours after receiving complaints about it. The view of Milan’s public prosecutor is that permitting posting of the video for any period of time was a criminal offense. Fleischer and three other Google employees have been charged with defamation and failure to control personal information.

In our forthcoming report, Bob and I explore the contextual nature of privacy. Google clearly operates in multiple geographic and legal contexts. In the US, Google enjoys protections similar to those afforded “common carriers”. However, in Italy, Google is being treated as a content provider and not a content distributor, and thus is not receiving any such protection.

The contextuality of privacy requires that you evaluate your business from all relevant contexts. In this case, Google may find that it should have looked at its video services from the perspective of an Italian user as well as an Italian regulator. This examination from all relevant contexts would highlight not only conflicts between contexts (someone’s desire to publish a video versus a state’s definition of what constitutes offensive or inappropriate content) but also conflicts between contexts and the organization’s business model. Google’s business of allowing anyone to post a video is in this case colliding with an Italian regulator’s desire to treat Google as a content provider, holding Google to an unanticipated set of requirements.

There’s no way that a small privacy team will be able to know everything about every context the company does business in. As a result, doing business in multiple contexts can carry a budgetary side effect: organizations may need to budget for external legal counsel – counsel that specializes in privacy for the contexts they are working in – to aid privacy teams in their evaluation of relevant contexts.

We don’t expect criminal penalties for privacy violations to become common, and it’s not at all clear that the action against Google’s executives will be sustained by the Italian courts. That said, we do expect privacy regulations to become stricter and subsequent penalties to become more severe. Privacy risks are getting real. Join us at Catalyst this summer and learn how to adapt, and thrive, in the face of this new reality.

(Cross-posted from Burton Group’s Identity Blog.)