Thursday 21 October 2010

Opting out of Street View....

Nearly 250,000 Germans have 'opted out' of having their homes visible on Google's Street View before the service goes live, which will be some time in the near future. Andreas Türk, Product Manager for Street View in Germany, has admitted that some of those homes will still be visible at launch, as the process is complex and not all of the instructions were clear. His blog here provides the explanations.

It's an interesting figure - is 250,000 (or, to be more precise, 244,237) a large number? As Andreas Türk says, it amounts to 2.89% of those who could have objected, and the argument can be made both ways. Google might argue that it means that the vast, vast majority don't object to Street View, so their service has some kind of overall 'acceptance' or even 'support' by the populace. Privacy advocates might say the converse - in absolute terms, 250,000 is a LOT of people. If you had 250,000 people marching on the streets with banners saying 'NO TO STREET VIEW' it would make headline news, certainly in Germany, and probably throughout Europe.
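As a rough sanity check on those figures (the eligible total isn't given in the post, so this is simply derived from Google's own two numbers and nothing more), 2.89% implies an eligible base of somewhere around 8.5 million households:

```python
# Back-of-the-envelope arithmetic using only the two figures Google gave:
# 244,237 objections, said to be 2.89% of those who could have objected.
objections = 244_237
share = 0.0289  # 2.89%

eligible = objections / share
print(f"Implied number of eligible households: {eligible:,.0f}")
# Roughly 8.45 million - so even a 'small percentage' sits on a very large base.
```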

Both sides have a point: 2.89% isn't a very large proportion, but 250,000 is a lot of people, and when you look more closely at the process I suspect that the privacy advocates have the stronger position. Given that the opt-out required an active process (and Google say that two-thirds of those who objected used their own online tool to do so), it does suggest that quite a lot of people care about this. If the reverse system had been in place - and you had to actively choose to HAVE your home 'unblurred' on Street View - what kind of figures would you get? Would more than 250,000 people have gone through a process to make their houses visible? I doubt it....

...and what of the rest of us? Germans got a choice because their government made a point about it, and demanded that Google give them the choice before the service went active. As the BBC reports, other governments have raised other kinds of objections, but none have been given the choice that the Germans have had. As I've blogged before, Germany has a pretty active privacy lobby, so it's not surprising that it is the country that has taken this step - what would the result have been if the option had been given in the UK? Or the US? Probably not as dramatic as the German result - which makes me wonder whether Google has missed a trick by not providing the option elsewhere. If it did so, and an even smaller fraction than the 2.89% in privacy-aware Germany objected, it could be even bolder about proclaiming that people love Street View.....

Thursday 7 October 2010

How personal is personal?

The Register is reporting that the ICO wants a clearer definition of what constitutes 'personal data' - and it is indeed a crucial question, particularly under the current data protection regime. The issue has come up in the ICO's response to the Government consultation on the review of the Data Protection Directive - and one of the key points is that there is a difference between how personal data is defined in the Directive and how it is defined in the UK Data Protection Act. That difference gives scope for lots of legal argument - and is one of many factors that help to turn the data protection regime from something that should be about rights and personal protection into something often hideously technical and legalistic. The ICO, fortunately, seems to recognise this. As quoted in The Register, ICO Deputy Director David Smith says:
"We need to ensure that people have real protection for their personal information, not just protection on paper and that we are not distracted by arguments over interpretations of the Data Protection Act,"
That's the crux of it - right now, people don't really have as much real protection as they should. Will any new version of the Directive (and then the DPA) be any better? It would be excellent if it were, but right now it's hard to imagine that it will be, unless there is a fundamental shift in attitudes.

There's another area, however, that may be even more important, and it just makes it into the end of the Register's article - the question of what constitutes 'sensitive personal data'. Here, again, the ICO is on the ball - this is again from the Register:
"The current distinction between sensitive and non-sensitive categories of personal data does not work well in practice," said the submission. "The Directive’s special categories of data may not match what individuals themselves consider to be ‘sensitive’ – for example their financial status or geo-location data about them."
The ICO go on to suggest not a broadening of the definition of sensitive personal data, but a more 'flexible and contextual approach' to it - and they're right. Data can be sensitive in one context, not sensitive in another. However, I would suggest that they're not going nearly far enough. The problem is that the idea of the 'context' of any particular data is so broad as to be unmanageable. What matters isn't just who has got the data and what they might do with it, but a whole lot of other things concerning the data subject, the data holder, any other potential data user and so on.

For instance, consider data about someone's membership of the Barbra Streisand fan club. Sensitive data? In most situations, people might consider it not to be sensitive at all - who cares what kind of music someone listens to? However, liking Barbra Streisand might mean a very different thing for a 22-year-old man than it does for a 56-year-old woman. Extra inferences might be drawn if the data gatherer has also learned that the data subject has been searching for holidays only in San Francisco and Sydney, or spends a lot of time looking at hairdressing websites. Add to that the real 'geo-tag' kind of information about where people actually go, and you can build up quite detailed profiles without ever touching what others might consider sensitive. When you have all that information, even supposedly trivial information like favourite colours or favourite items in your Tesco online shopping could end up being sensitive - as an extra item in a profile that 'confirms' or 'denies' (according to the kinds of probabilistic analyses that are used for behavioural profiling) that a person fits into a particular category.
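To make the 'confirms or denies' point concrete, here's a deliberately toy sketch - the category, the signals and every probability in it are invented purely for illustration, not taken from any real profiling system - of how a simple Bayesian update lets individually trivial data points shift a profiler's confidence that someone fits a particular category:

```python
# Toy illustration only: invented signals and made-up likelihoods, showing how
# individually 'non-sensitive' data points can combine into a sensitive inference
# through the kind of probabilistic updating behavioural profilers rely on.

def update(prior, p_signal_if_in_category, p_signal_if_not):
    """Bayes' rule: posterior probability of category membership after one signal."""
    numerator = prior * p_signal_if_in_category
    denominator = numerator + (1 - prior) * p_signal_if_not
    return numerator / denominator

# Start from a low base rate for membership of some 'sensitive' category.
p = 0.05

# Each observation is harmless on its own; the likelihood pairs are hypothetical.
observations = [
    ("Barbra Streisand fan club member", 0.30, 0.05),
    ("searches holidays only in San Francisco and Sydney", 0.40, 0.10),
    ("frequent visits to hairdressing websites", 0.30, 0.08),
]

for label, p_if_in, p_if_not in observations:
    p = update(p, p_if_in, p_if_not)
    print(f"after '{label}': estimated probability = {p:.2f}")

# None of these items is 'sensitive' under the Directive's categories, yet
# together they push the estimate from 5% to over 80%.
```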

What does all this mean? Essentially that ANY data that can be linked to a person can become sensitive - and that judging the context reliably is all but impossible. Ultimately, if we believe that sensitive data needs particular protection, then we should apply that kind of protection to ALL personal data, regardless of how apparently sensitive it is....