We’ll be watching you
The surveillance society is alive and well because we’ve given companies permission to do almost anything with our information.
As a specialist in surveillance, Professor William Webster watches the watchers. Mostly he’s curious and quizzical, inquiring and interested in the complex questions posed by the post-privacy age. Should you tape over your webcam? (FBI director James Comey thinks you should.) Pippa Middleton has been hacked. Yahoo too. Will you be next? Can we trust Facebook or even the Government with our personal data?
Webster is intrigued by all this, but he has a light touch without a trace of apocalyptic paranoia. That’s why it resonates when Webster, a director of the Centre for Research into Information Surveillance and Privacy (Crisp), talks about what really worries him: children.
“Children are normalised into the surveillance society as soon as they go to school,” he says. “Schools are massive surveillance zones. You have the register, the surveillance cameras, cameras in the hallways, the classrooms – even the toilets, and library cards recording what is read. Schools normalise the idea that you are under surveillance from a very early age.”
Much of that is, well, old school. Now we’re moving to a new level. In a craze sweeping the US and the UK, teachers use an app called Class-Dojo. Children are assigned an icon, a cartoon-like character, and their behaviour is rewarded and punished. The data is captured and builds into a digital portfolio of their school days. Parents even get real-time messages to let them know what’s happening at school. “It’s taking hold like wildfire in UK schools and I think there are some really serious issues here,” says Webster, who is visiting New Zealand from Stirling University in Scotland to research and give a series of lectures in conjunction with Victoria University.
“These children haven’t consented to their information being collected and processed. The parents haven’t given consent either, and there don’t seem to be any school policies for the use of this sort of software.” Class-Dojo is now being used by at least one teacher in half of all UK schools. “Over time, that teacher is creating a behavioural account of your child that is very sophisticated, and the Class-Dojo app is operated by a private company.”
Could the company sell the information on – say, to a reading recovery programme, armed with the knowledge of which children are struggling? “They could say, ‘This child is a bit behind the average’, and then cross-reference it with another data set showing which parents can afford to pay for tutors.”
Then, of course, there is the old-fashioned idea that you have the freedom to just be a kid and tell a white lie. How was school? Fine. You been behaving in maths class? Yep. So have ended millions of conversations between parents and children. Now, though, things aren’t so simple.
“Schools normalise the idea that you are under surveillance from a very early age.”
PSST! WANNA SELL YOUR PRIVACY?
Want a little extra money? Soon you may be able to sell your privacy, which is a better deal than it sounds, since now we give it away for free. In fact, we explicitly agree to give it away in the terms and conditions for many of the popular apps and social media sites.
That, of course, is where the value is for the owners of these platforms. They mine the personal information and interests of users and sell that on to companies that then use it to reach potential customers. But Webster says we’re reaching a tipping point. “The current model is coming to an end where we just give over everything and it is then exploited and used and mined and value is created to benefit others.
“We will see innovative online companies that will say, ‘When we process your data, we will give you back some money. Every time we process data and sell it, we make one penny per person, and so we’ll give you back one-tenth of that.’ Instead of making all the profit, they’ll give a small amount of it back.”
Following the basic rules of economics, though, if privacy is increasingly rare, won’t it become more commercially valuable? “Exactly,” says the professor, clearly excited about this development. “Companies are starting to see there is value in being privacy friendly. If you use Firefox as a browser and not Google Chrome, you’ll see it has far superior privacy settings.”
One of Webster’s favourites in this category is Ghostery, which allows users to see the hundreds of companies that track them every time they go online. With Ghostery, if you visit a website, it will show you all the other sites and companies that your information is being exchanged with. You can then find out about, or block, a tracker if you don’t want it sucking up your data. “It blows your mind,” Webster says. “You think going to one web page is just going to one page, but what goes on behind that is like a fireworks display – it’s like, wow, the internet is a lot more complicated than I thought.”
Complicated, yes, and potentially scary too. Getting unwanted ads is a nuisance, but getting hacked can be seriously damaging. Every week seems to bring a major international news story based on a hack or data breach. Just after I speak to Webster, news breaks that 500 million Yahoo accounts have been hacked.
This hack – which drags in some Spark customers in New Zealand – is particularly worrying because it happened two years ago and users are only just being alerted. Lawsuits for damages are already in train.
Webster says the near future will bring harsher penalties for companies that allow this to happen, forcing them to increase security. “We’re going to see more and more data breaches, and you’ll soon see the penalties for this become much more severe. At the moment the penalties are quite small, mostly reputational rather than financial.”
This has huge implications for state agencies too. A Guardian report counted 9000 data breaches among the 17 largest government departments in the UK last year. In New Zealand, there is a steady stream of stories about government agencies sending information to the wrong people under sensitive circumstances or leaving their systems open to abuse.
A DIRTY WORD?
The state is the traditional creator of records. “It has always been in the business of mapping land and creating a record of who owns what,” Webster says. “This is not a new business for them.” So there has long been information gathering, but when does that cross over into surveillance? “Surveillance is a concept that allows us to understand that information is not neutral. It has power embedded in it. It is created by a human for a purpose. Whereas the term ‘information’ is benign.
“‘Surveillance’ is a very powerful word. It is very emotive and people see it very negatively – especially here in New Zealand. If I go to a public agency here and I start talking about surveillance, they clam up.”
He says he called in on a public agency in Wellington responsible for running “quite a few public space surveillance cameras”, although he won’t say which one. “I was asking about the smart technology behind the cameras. I was using the term ‘surveillance’, and I was put in my place: they said, ‘We don’t do surveillance – we have community safety cameras for community safety, not for surveillance.’ Well, in my mind, they do the same thing.”
In the UK, people seem happy to call them surveillance cameras, but not here. The UK also has a Surveillance Camera Commissioner, but there is no New Zealand equivalent. Yes, we have the Privacy Commissioner, but Webster says the office lacks teeth.
“They don’t have many investigatory powers, so that stops them discovering when there is a data breach or discovering when personal data is misused. It has to be brought to their attention. That means that the re-individualisation of people happens much more than it should.”
Re-individualisation? Sounds painful. What is it? Well, take Statistics New Zealand, he says. “They collect a lot of data about public services and they mine that data to find trends.” They might find families in one area are susceptible to a health problem. Then the health agencies want to know who the families are to give them targeted services. “But often, to find out who the people are, you have to re-identify them, and the whole process wasn’t set up to allow that – it wasn’t meant to happen.”
That leaves the agencies in “a bit of a pickle”, because they don’t have the terms and conditions sorted when you first interact with them. “They create lots of our personal data and they want to use it for public policy, but we haven’t given them consent for information to be used in other ways. That causes real problems.”
It’s different for private companies. “With your smartphone and all your apps, you sign those terms and conditions and you say it’s okay for them to identify you, so they will sell on your information to advertisers because you have agreed to almost anything.”
“You think going to one web page is just going to one page, but what goes on behind that is like a fireworks display.”
SURVEILLANCE BY CHOICE
That’s the key element then, isn’t it: choice? We agree to it. We like the convenience and we shrug our shoulders at the consequences. I broke my GPS a couple of months back and was going to spend $500 replacing it. But I downloaded an app for free that does exactly the same job. The catch? I put my location in, so every time I use it, I’m tracked, or at least I can be. I know that. I’ve made that call.
Isn’t this, professor, surveillance by choice, rather than by stealth? “I absolutely agree,” he surprises me by saying. “I do the same. The value of the product or the service overrides the concerns about privacy or surveillance or data interchange.”
What happens, though, when the service is so ubiquitous you don’t have a real choice about whether you use it? It’s got to the stage now where some people actually have to be on Facebook. Personally, I like Twitter, and it’s very useful for work. I