Grief over the Paris attacks will soon give way to demands for action.
Alongside increased military activity and controversial suggestions to close the door on refugees, the next battle in the “surely something can be done” arena will be aimed squarely, and angrily, at Silicon Valley.
Tech companies were already under pressure to make it easier for governments to access “private” communication apps and services. Those calls have intensified greatly since the attacks in Paris.
The “problem” is to do with encryption.
Without encryption, all of the things we do online, be it emailing, shopping or banking, would be insecure. They all rely on the principle that data encrypted using complex mathematics is nigh-on impossible to crack.
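To make the principle concrete, here is a minimal sketch of the idea, using a one-time pad, the simplest cipher that is provably unbreakable without the key. Real services use ciphers such as AES rather than this toy scheme, but the point is the same: with the right key the message comes back; without it, the ciphertext is just noise.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same operation decrypts.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at noon"
# A random key as long as the message, known only to sender and receiver.
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message        # right key: message recovered
wrong_key = secrets.token_bytes(len(message))
assert decrypt(ciphertext, wrong_key) != message  # wrong key: gibberish
```

Crucially, there is no shortcut: an eavesdropper who intercepts the ciphertext but not the key learns nothing about the message.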
If you’re using communication apps such as WhatsApp, Apple’s iMessage, WeChat and so on, your messages are encrypted by default.
It means that even if those companies wanted to hand over your messages to law enforcement, they couldn’t.
“There are a lot of technological capabilities that are available right now that make it exceptionally difficult, both technically as well as legally, for intelligence and security services to have the insight they need to uncover it,” said CIA director John Brennan at a security forum on Monday.
“And I do think this is a time for particularly Europe, as well as here in the United States, for us to take a look and see whether or not there have been some inadvertent or intentional gaps that have been created in the ability of intelligence and security services to protect the people that they are asked to serve.”
An opinion column in the New York Times, authored by Manhattan’s district attorney and the City of London Police commissioner, said “encryption blocks justice”.
In the piece, published back in August, they wrote about a murder near Chicago in which a father of six had been shot. At the scene, officers found two mobile phones. But they were passcode locked. Neither Google nor Apple (the phones ran their software) could unlock the phones, and therefore the data was inaccessible.
“On behalf of crime victims the world over,” the opinion piece read, “we are asking whether this encryption is truly worth the cost.”
It’s an argument that can be made with more vigour than ever after the Paris attacks.
With access to communications, the anti-encryption advocates say, we could perhaps stop these tragic events from occurring. That’s a claim worth scrutinising.
It’s early days in the investigation, and no evidence has yet been offered to show that encrypted communications were used to organise the atrocity.
But the technology industry is, on the whole, against the suggestion that law enforcement should have “backdoors” into popular services.
A backdoor, in the infosec world, is a hidden method by which a supposedly secure system can be accessed. It could be a quirk in some code, or a vulnerability in how a system communicates. Whatever the weakness, once a backdoor is made public, it is typically fixed.
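A contrived sketch shows how simple a backdoor can be, and why it protects no one once discovered. All names and strings here are hypothetical; no real service is implied. The login check below quietly accepts a hidden master passphrase alongside each user’s real one:

```python
import hmac

_MASTER_PASSPHRASE = "s3cret-master-key"  # the hidden backdoor (hypothetical)

USER_PASSWORDS = {"alice": "correct horse battery staple"}

def check_login(user: str, passphrase: str) -> bool:
    # hmac.compare_digest does a constant-time comparison, avoiding
    # timing side channels; it works on ASCII strings or bytes.
    if hmac.compare_digest(passphrase, _MASTER_PASSPHRASE):
        # Anyone who learns this string gets in, not just "those in authority".
        return True
    expected = USER_PASSWORDS.get(user)
    return expected is not None and hmac.compare_digest(passphrase, expected)
```

The moment the master string leaks, through stolen source code or a reverse-engineered binary, every account is open. That is the crux of the technologists’ objection: a backdoor built “just for the authorities” cannot be kept that way.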
Hackers make serious money by discovering backdoors and selling them on – often to government security services.
Many in law enforcement and government feel there should be a backdoor made just for those in authority to investigate and stop criminals and terrorists.
But some of tech’s most influential figures say that the notion of a secure, secret backdoor is dangerously misguided.
If any backdoor exists, hackers will find it eventually. It would mean data security for all of us, not just criminals, would evaporate.