Tuesday, 26 April 2011

Dogs will be dogs...


The growing furore over the gathering and retention of location data by smartphones reminds me very strongly of a joke that I heard first in the school playground many years ago. ‘Why does a dog lick his balls? Because he can.’

The same is true about smartphone operators. Why do they gather location data? Because they can. Technically, they can, because of the very nature of smartphones. Legally they can, because our laws over this kind of thing are obtuse and opaque – and because they understand the way they can get ‘consent’ through the small print of terms and conditions that no-one ever reads, let alone understands.

A lot of the discussion about the current furore has centred around the individual companies concerned, and brought out all the usual views of the merits or otherwise of Apple, Google and Microsoft – but whether you consider each of them to be fancified show-bred French poodles, friendly and loveable Labradors or ageing but far from toothless Rottweilers, they’re all dogs, and dogs will be dogs. Even the best behaved and most presentable show dog will lick his balls if he’s allowed to.

Three questions arise for me. Firstly, why are people surprised? Many people seem to be genuinely shocked by what has been revealed – even people who know a great deal about the subject. Is it really such a surprise? We’ve known about the capabilities of smartphones since they first emerged, and about the behaviour of all the companies involved for even longer. Dogs will be dogs.

The second question is whether any of it matters – and for me the answer is clear. Of course it matters, and matters a lot. That doesn’t mean that we need to panic, or need to throw our iPhones, Blackberries and HTCs in the nearest river – just that we need to be aware of what is going on, and do what we can to ameliorate or manage the situation.

That brings me to the last question – what, if anything, can be done about it? Well, if we were talking about dogs, the answer would be simple: make sure they’re well trained, and well managed. If badly looked after, dogs behave badly. If they’re well trained, they can be very useful, helpful and excellent pets. They can help us in our personal lives, in our work and in many social situations – but you still need to train them and manage them. We need to do the same for the likes of Apple, Google and Microsoft. Show them who’s boss – using all the tools we can to do so. That means putting the right laws in place, but also using our powers as consumers, as advocates, and as lobbyists.

If dogs know what they can do and what they can’t, they’ll behave much better. It's very hard to train a dog not to lick his balls - and probably just as hard to train companies like Apple, Google and Microsoft not to push the limits of privacy - but it can be done. We need to tell them that this kind of thing is not acceptable – and back up what we say with the law and with our money. If we don’t want our location data gathered, we need to be clear about it.

My personal view is that we have the right not to have this kind of thing happen to us - and that we need to proclaim that right (and other rights) loud and clear.


Thursday, 21 April 2011

The real challenge for IT Lawyers: the law!

Sometimes it's tempting for an IT lawyer - or rather an academic IT lawyer - to feel that things are moving essentially in the right direction, that the subject is getting more mainstream, more understandable - and more importantly, more understood. In some ways, of course, that's true - but in others, we need to remember that things are far from positive, and that in many ways the 'establishment' - the legal system, the politicians, even the public - still don't really 'get it' at all. Perhaps the most important of these is the legal system. To a significant extent it seems as though the legal system - and the law - is just completely out of kilter with the reality of the IT world, and in particular the internet.

A couple of things in recent weeks have driven that home to me. Neither was surprising, but both were disappointing, particularly to those of us interested in privacy and autonomy. First of all, there was the announcement that there won't be any prosecutions arising from the Phorm secret trials, something which has been greeted with dismay by privacy advocates. Secondly, and most recently, was the failure of the judicial review to overturn the Digital Economy Act.

In both cases, it's easy to see how the results came about - and indeed to argue that from a precise legal standpoint the results might have been technically correct. In both - and in the case of the Digital Economy Act in particular - it shows that the legal system really doesn't understand what's going on in the internet, and how our online world functions. The Digital Economy Act - in its provisions concerning the policing of illegal downloading - is so clearly inappropriate that it's hard to find an academic lawyer in the field who believes it's appropriate or proportionate, or even who believes that it stands any real chance of being effective. Precisely the opposite. It won't work. It misses the point. It will victimise the innocent. It shows a fundamental misunderstanding of both the nature of the internet and the habits of most of those who use it. It's such a bad law it just makes many of us shake our heads in disbelief.

The Phorm story is a little less dramatic, but demonstrates some similar features. The CPS have decided not to prosecute – and they may be right that there might not be much chance of a result. That, however, just reveals that our legal system doesn't have the teeth or the capability to deal with the reality of the internet – for what Phorm and BT did was something that the law should have been able to deal with. It was a serious invasion of privacy on a very large scale – secretly tracking the entire internet activities of 30,000 people without their knowledge or consent – and yet the law seems incapable of dealing with it, incapable of providing people with the kind of protection they need. The kind of protection they have a right to expect. The law should do this – and in its current form it doesn't.

In the grand scheme of things, neither of these two incidents is likely to matter in the end. Despite the failures of the law, Phorm still failed, brought down by a combination of the privacy advocacy of such excellent groups as the Open Rights Group and the Foundation for Information Policy Research, interventions by the European Commission, and the belated intelligence of businesses like BT who withdrew their support as they began to understand how things really work. Similarly, the Digital Economy Act is likely to end up an irrelevance, as the people it is intended to catch find ways to sidestep it, as further legal challenges arise, and as embarrassing prosecutions fail – and something that gets closer to understanding the reality of the situation is brought in to replace it.

It feels, though, as if the legal system needs to be dragged kicking and screaming into the modern world. That's the challenge for IT lawyers. People are thinking and writing interesting, informative and insightful things about the nature of the internet - but right now, it isn't being sufficiently read or understood, and certainly isn't finding its way into the mindsets of those creating or enforcing the law. It needs to be - for though other forces will (and have, in the case of Phorm) stop many of the worst things from happening, without the law being 'fit for purpose' everything is a struggle, and many people suffer along the way.

Wednesday, 13 April 2011

Let's forget the right to be forgotten...

....and talk about the important part of that right in less emotive, less distracting, and more accurate terms. I'm not interested in rewriting or erasing history, I'm not interested in hiding my past - selectively or completely. I am, however, interested in cutting down the amount of data held about me for spurious purposes, and interested in having more control over what commercial enterprises do with the data they have on me. I don't want a right to be forgotten - I want a right to delete personal data!

I asked a question about this at the Westminster Media Forum a few weeks ago, and gave a presentation on the subject at BILETA in Manchester yesterday – but I think the subject needs more attention. At the Westminster Media Forum, there was a particularly acerbic attack on the right to be forgotten by journalist Tessa Mayes, who seemed to think that the right was all about restricting journalists' rights to report on past events. At BILETA, even though I made the point very directly that the right to delete data isn't the same as a right to be forgotten, one of the biggest questions I got was precisely about how such a right would effectively mess up the historical archive that the internet provides. For me, the right to delete data isn't like that at all – but calling it a right to be forgotten can easily mislead people into thinking it is.

The right to delete, as I set it out in my presentation (slides of which are available by email), is something that allows individuals control over what data is held about them - but it has a number of specific exceptions, situations where data should be allowed to be kept, regardless of the desire of the individual concerned. They include examples such as medical records, criminal records, electoral rolls and so forth - and appropriate historical archives. All that should be clear - and should prevent the problems that would be associated with a real 'right to be forgotten'.

The point is, though, that those holding data should need to justify that holding – rather than the individual having to justify why they would like data to be removed. If there ARE good reasons to hold data, then say so. If not, then the individual should have the right to delete it. The key thing to understand, though, is that BUSINESS reasons – and in particular the fact that you can make money out of holding someone's data – are not sufficiently strong to override someone's right to delete data. Privacy is more important than that.

Forgetting is important too – as Viktor Mayer-Schönberger has described so eloquently and compellingly in his excellent book Delete – but, as Mayer-Schönberger himself suggests, rights may not be the best way to bring it about. Talking about a bald 'right to be forgotten' doesn't really help either the understanding of what is a complex and important issue, or the handling of the important practical and ethical questions surrounding rights over personal data.

So let's forget about the right to be forgotten - but fight strongly for the right to delete personal data.