I’ll keep my paper passport, thanks

Here is a short piece on how a researcher, Chris Paget, bought a $250 RFID reader on eBay and used it to clone ePassports while driving 30 miles an hour near Fisherman’s Wharf in San Francisco.  I fully recognize that this demonstration doesn’t represent a method for fabricating complete paper-in-hand cloned passports.  Cloning is just the first step, but it is a big step.  More importantly, it is a step that the State Department has claimed is somewhere between impossible and unlikely.  The following is a passage from the privacy impact assessment (PIA) of TDIS – the Travel Document Issuance System:

The Department of State has taken extensive measures to prevent a third-party from reading or accessing the information on the chip without the passport holder’s knowledge. This includes safeguards against such nefarious acts as “skimming” data from the chip, “eavesdropping” on communications between the chip and reader, “tracking” passport holders, and “cloning” the passport chip in order to facilitate identity theft crimes. These safeguards are described in detail on the Department of State website.

Apparently those safeguards aren’t very strong.  

I invite you to read the State Department’s FAQ on e-Passports.  Notice the incredibly defensive tone in the opening of the answer to the question, “Will someone be able to read or access the information on the chip without my knowledge (also known as skimming or eavesdropping)?”  Also notice the tacit acknowledgment that passport RFID chips can be cloned.

Mr. Paget intends to drive around DC this weekend to see what he can clone, and with a macabre sense of humor, I look forward to reading his results.

Until then, I’ll keep my paper passport.

Putting privacy controls in the hands of your users

I mentioned yesterday that Bob and I have just finished up some research on privacy.  In this upcoming report, we stress the importance of establishing privacy principles and then using those principles to guide privacy practices.  I happened to see this NY Times article (via Nishant’s Twitter stream) and had a bit of a Baader-Meinhof moment.  The article talks about how social networking sites are giving their end-users more and more control over how information is disclosed.  Giving users choice as to how their information is disclosed and used is important.  Giving users meaningful choice as to how their information is used is much better. 

 One of the privacy principles that Bob and I examine in our report is the principle of Meaningful Choice:

Robbing others of the ability to exercise their free will is an affront to dignity; therefore we allow people to decide how we will use information about them.  When presenting people with choices about how we will be allowed to use their information, we design easy-to-understand interfaces which reduce the possibility of confusing people, and we avoid creating “Hobson’s choice” situations in which people are forced to choose the lesser of a set of evils.

As an ex-interface and product designer, I am especially sensitive to usability, and the principle of Meaningful Choice directly addresses this.  Providing an end-user with a difficult-to-use privacy settings tool and then saying, “Well, we gave you choice as to how your information gets used” exploits the power imbalance between the service provider and the user.  As the interaction between the user and the service provider becomes more and more valuable (moving from social networking to, say, electronic health records), such an exploitation is less and less acceptable.

In the course of our research we talked to one company who spent many months trying to get their privacy settings interface right.  They brought people (non-techies even!) into their usability lab and studied how these users set (or didn’t set) privacy settings.  The design team fully acknowledged that building a usable, meaningful interface for privacy settings was hard, but given the context, the effort was required.

 End-user privacy controls are mandatory.  But in the absence of a usable interface, end-user control is not control at all. 

(Cross-posted from Burton Group’s Identity Blog.)

International Privacy Day: Synchronicity

Today is International Privacy Day (and also National Data Privacy Day here in the USA and maybe where you are too).  The day is set aside to celebrate the anniversary of the Council of Europe Convention on Data Protection.  Put on your reading list for today both the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data as well as the Organisation for Economic Co-operation and Development’s Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.

It’s also, felicitously, the end of the quarter for us here at Burton Group, which means that we are trying to wrap up the final edits of our reports and send them off for peer review.  This quarter Bob Blakley and I have been researching privacy.  We’ve talked to a variety of different kinds of companies of all sizes in many industries, and we’ve come away with a lot of lessons.

 Two of these lessons are that privacy is deeply contextual, and that this contextual nature prevents privacy from being easily defined.  Without a strict definition, though, how does an enterprise privacy team proceed?  Can you write policies concerning something which means one thing in one setting and something different in another?  It turns out, we think, that you can.

 Principles.

 I practice martial arts.  Every martial art has a set of principles.  Though these principles may differ, their use is the same.  Principles guide practice.  You practice your art in multiple contexts to prepare you for whatever may come.  In each of those contextualized situations, your principles guide your response.  (Synchronicity moment number one).

 My friend Julie is one of the most amazing corporate and brand marketers I have ever met.  She uses a simple approach in building overall market strategies and brands: identify true corporate values (principles), then let those values lead you to tangible market strategies.  Corporate values guide the formation of market strategies.  (Synchronicity moment number two).

 In our forthcoming report, Bob and I examine sets of privacy principles, but we also look at the ways in which these principles can drive real practice.  We discuss the characteristics and activities of effective privacy teams, too.  In building our report, Bob and I used (self-referentially) this method of letting principles drive practice. We built the report by starting with what we are referring to as Burton Group’s “Golden Rule of Privacy” and let the Golden Rule guide our writing.  You’ll have to wait a bit for the full report (unless you want to be a pre-publication reviewer, in which case please drop me a line!), but I’ll share the Golden Rule with you now:

We protect privacy when we consider the dignity of individuals about whom we know things, and when we use what we know about them only in ways which preserve and enhance that dignity.

 Happy International Privacy Day!  And for those of you attending the IAPP’s Privacy After Hours event tonight in Washington DC, I’ll see you there.

(Cross-posted from Burton Group’s Identity blog.)

Stripping Search

In response to regulatory pressure and to apply some pressure on their competition, Yahoo has announced that after 90 days it will anonymize search queries and remove personally identifiable information (PII) from them as well.  Specifically, Yahoo will delete the last eight bits from the IP address associated with a search.  Further, Yahoo will remove some PII, such as names, phone numbers, and Social Security numbers, from the searches.  The goal is to (eventually) destroy the ties between a person and what that person searches for, which could include embarrassing, compromising, or sensitive items such as information about medical conditions, political opposition materials, adult entertainment, etc.
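To make the mechanics concrete, here is a minimal sketch of the kind of anonymization Yahoo describes: zeroing the last eight bits (the final octet) of an IPv4 address and scrubbing recognizable PII patterns from the query text.  The function name and the particular regex patterns are my own illustrative assumptions, not Yahoo's actual rules.

```python
import re

def anonymize_query(ip: str, query: str) -> tuple[str, str]:
    """Sketch of search-log anonymization (hypothetical, not Yahoo's code):
    zero the last eight bits of the IPv4 address and redact obvious PII."""
    # Drop the last octet (eight bits) of the IP address
    octets = ip.split(".")
    octets[-1] = "0"
    anon_ip = ".".join(octets)

    # Redact PII patterns from the query: SSNs first, then phone numbers
    scrubbed = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", query)
    scrubbed = re.sub(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b", "[PHONE]", scrubbed)
    return anon_ip, scrubbed
```

Note that truncating one octet only folds an address into a pool of 256 neighbors, which is why regulators have pushed for shorter retention on top of this kind of masking.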

There are two points I want to draw your attention to.  The first point is related to the amount of time search providers, like Yahoo, hold identifiable search queries.  Regulators have recommended that search vendors reduce how long they hold identifiable searches; the EU has recommended 6 months, for example.  Yahoo, which previously retained identifiable searches for 13 months, has taken a laudable step by reducing that time to 90 days.

In the future, the time it takes a search provider to extract whatever goodness it wants to out of a search query (to feed its varied businesses) and anonymize that query will reach zero.  External pressures aside, the Googles and Yahoos of the world will achieve near-instantaneous goodness-extraction/anonymization of search queries simply because it reduces what they have to store, maintain, and worry about.  That being said, even though search providers will be able to achieve near-instantaneous extraction and anonymization, they will never be able to put it into practice.  Why?  Because there will always be a desire on the part of law enforcement to gain access to those identifiable searches.