Where is the line between online privacy and security?

The amount of data we’re producing is rapidly increasing, and governments are desperate to access it. But where should the line be drawn between security and privacy? We asked an expert to find out.

The amount of data the world produces is growing, and fast. More data has been created in the past two years than in all of previous history combined.

With all this data comes a tension – how much access should governments have to all our information? While we all want personal privacy, governments are increasingly keen to know what we’re up to, and we’re told it’s for national security. The Apple v FBI case that hit headlines in 2016 pulled this issue back into the spotlight. What this all means in practice, however, is pretty complex.

Tanya O’Carroll is an Adviser on Technology and Human Rights at Amnesty International, so we caught up with her to find out what the court case means, and where the line should be drawn between security and privacy.

Huck: What exactly happened in the FBI v Apple case?

Tanya: In February 2016, the FBI and U.S. investigators working on the San Bernardino shooting approached Apple with a U.S. court order, asking Apple to circumvent their own security and crack the iPhone of one of the shooters.

Apple responded by saying that while they’d complied with the investigation thus far, they weren’t willing to hack their own security and create a backdoor to the iPhone – compromising the security of millions of people worldwide. The thing about information security is that as soon as you create that vulnerability, it’s open to abuse by criminals, governments and hackers. So, they pushed back and said what they were being asked wasn’t proportionate.

Daniel Nanescu via Splitshare

The FBI then said they’d managed to hack into the iPhone regardless, so why did they even ask in the first place?

This was never really about this case – there are two important historical trends you have to understand here. Since 9/11 back in 2001, we’ve seen an increasing securitisation of society – the remit of what is permitted in the interest of national security keeps expanding.

Firstly, we were told that torture was acceptable. Despite being long considered illegal under international law, justifications for torture started to creep back in. “Torture is bad”, western governments would say, “except when we need to use it”.

Now privacy is that new frontline: “we want you to be secure, except from us”. If every government says this you can see it’s a race to the bottom and we end up much less secure.

Secondly, there’s what we learnt from Snowden – governments have been expanding their powers to take advantage of the increasing quantities of personal data online. There was a legal loophole as technology had developed so quickly, but as the public started giving heat to their governments over mass surveillance, there has been a scramble to legalise (and even expand) the powers Snowden revealed.

Then the Apple v FBI dispute comes up. Tech companies say it’s a slippery slope, and that it’s obvious where things are headed. The internet becomes less secure, but there’s also a dangerous precedent globally if Apple agree to this. What happens when more repressive governments – in China, in Egypt, in Saudi Arabia – ask them to do the same? It was never about the single phone; it was always about a whole lot more.

So the FBI could hack it all along?

Well, the FBI backed down just before the hearing was scheduled, saying they had potentially found a way to break into the phone with the help of a consultant. Lots of people believe that the FBI never really needed Apple’s help in the first place – Snowden said the FBI’s claim it couldn’t unlock the San Bernardino iPhone was “bullshit”. It comes down to a question of convenience: it’s easier for security agencies to set the precedent that companies like Apple must provide them a “backdoor”.

Daniel Nanescu via Splitshare

If we should be concerned about governments having access to our data, shouldn’t we be more concerned that for-profit corporations like Apple are sitting on all our information instead?

Well, there’ve been moves by Apple, and WhatsApp just this week, to make sure the data is out of their hands too. Most of the encryption on the Internet is transport encryption, which basically means your data is secure while it’s travelling from one place to the next, but once it’s on the company’s servers they have access to it – storing, mining and selling our data for advertising is part of their core business.

The move by Apple in its iMessage, and now WhatsApp, to roll out end-to-end encryption for their users means that the companies are no longer the middlemen in our communications. Even they cannot see the messages, images and files that we send with end-to-end encryption.

Having said that, the companies who have introduced strong encryption are still the minority. As consumers we need to put pressure on more companies to increase security and be more transparent about what they do with our data.
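To make that distinction concrete, here is a minimal sketch of the end-to-end idea in Python, using the PyNaCl library rather than anything resembling Apple’s or WhatsApp’s actual protocols: the sender encrypts to the recipient’s public key before the message leaves their device, so a relaying server only ever handles ciphertext.

```python
# A toy illustration of end-to-end encryption using PyNaCl (libsodium bindings).
# This is a simplified sketch – no sender authentication, no forward secrecy,
# no key verification – and not the actual protocol Apple or WhatsApp use.
from nacl.public import PrivateKey, SealedBox

# The recipient generates a key pair; only the public half is ever shared.
recipient_key = PrivateKey.generate()

# The sender encrypts directly to the recipient's public key on their own device.
ciphertext = SealedBox(recipient_key.public_key).encrypt(b"meet at noon")

# Any server relaying `ciphertext` sees only unintelligible bytes; without the
# recipient's private key the message cannot be recovered.
print(SealedBox(recipient_key).decrypt(ciphertext))  # b'meet at noon'
```

Transport encryption, by contrast, only protects the message on the wire: it is decrypted on the company’s servers, which is exactly the access that end-to-end encryption removes.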

If governments want access to this data, then can’t they just change the law?

Obama last year considered four approaches to regulating encryption but had to back off because in practice it’s just not feasible to regulate in a way that won’t have serious adverse consequences for Internet security. In the UK you’ve got the so-called Snoopers’ Charter, and last year sweeping new surveillance laws were introduced in Pakistan, Poland, Switzerland and more. In France right now, after the Paris attacks, the government is considering introducing prison sentences and fines for companies if they refuse to comply with decryption orders.

Campaigners and organisations like Amnesty are worried that governments will keep pushing to weaken and restrict strong encryption. We’re all sharing increasingly intimate parts of our lives online, and without adequate encryption everyone’s data is more vulnerable – not just to surveillance by their own governments but to cyber-crime, foreign intelligence and corporate misuse of their data.

But there’s bound to be a trade-off between privacy and security, right?

Of course nobody denies that sometimes governments need to access data for intelligence and law enforcement reasons, but we need to put a heavy onus on them to justify that – not only that it’s necessary to meet a legitimate need, but also that the measures proposed are proportionate. This wasn’t the case with Apple v FBI.

Measures which allow the government to spy on entire countries or populations are not a proportionate response to national security threats. Measures that undermine the security of the Internet as a whole by weakening – or compelling companies to weaken – the encryption that protects millions of people worldwide are not proportionate either.

On the other side, we also have to think about the public value of encryption. Just think about the Panama Papers – a year of journalists communicating and working together on history’s biggest leak. They couldn’t have done that without encrypted communications.

If we imagine end-to-end encryption is going to become the standard, and we don’t want governments to stop this, what happens when there’s a terrorist attack and access is needed to communications?

It’s funny that you frame it like that – the ticking bomb scenario. In essence, governments keep trying to convince us to set the bar for everyone based on the most extreme circumstances. With recent terror attacks, nobody has yet demonstrated that encryption was in any sense a barrier to detecting and stopping them.

The amount of data governments have access to has never been so high – but they keep saying it’s not enough. In the Apple v FBI case, all the iCloud data from the phone was available from several weeks earlier, as was all the metadata (who, where, how information that doesn’t include the content of the communications).

Terrorists will always find a way to communicate “off grid”, so it really doesn’t make much difference if governments weaken encrypted services like Apple’s and WhatsApp’s. It’s the security and privacy of the rest of us that we stand to lose.

Thanks Tanya.
