‘Exceptional Access’: legal frameworks, technology & trust
The ‘exceptional access’ debate is an important one: how do authorised organisations (law enforcement, intelligence agencies and so on) legally and proportionately access encrypted conversations without breaking encryption models, without inadvertent mass collection or unintended use and abuse, and without creating an ‘Achilles heel’ for exploitation by those with less well-meaning intentions?
The UK National Cyber Security Centre (NCSC)’s Ian Levy and Government Communications Headquarters (GCHQ)’s Crispin Robinson have written a set of principles for the exceptional access debate as part of a (well worth reading) series of essays in the ‘Crypto 2018 Workshop on Encryption and Surveillance’ collection.
I have seen various “spies want your chat messages” nay-intelligence articles through to the almost-nay-privacy “so what if you have nothing to hide?” pieces, so it seemed about time to throw my two cents into the ring with the nth article on this contentious topic.
I’ll attempt to take more of a technologist’s view, in a ‘this isn’t a technology problem’ kind of way.
Legal frameworks are hard
One of the hardest parts (if not the hardest) of the exceptional access debate is the cross-jurisdictional legal capabilities required to make it possible and the nuanced application when actually trying to do it.
If WhatsApp were a UK company with UK staff and a UK technical base (all of its servers in the UK), would it still have to respond to warrants issued in France? With US citizens using it in the US, how about warrants from the US?
If the target subject 1 was in country A (but a citizen of country B) but was messaging a not-target subject 2 who was in country C and a citizen of country D: what approvals would be required? What limitations would there be?
What if not-target subject 2 was on a plane or in transit within country E? What if there was a third person 3 in the conversation who is a citizen of country F residing in country G?
How are the legal powers restricted and monitored so that legislation intended for criminal law enforcement is not used to monitor internal staff for vague HR reasons, used to check whether families really live in a school’s catchment area, or stretched through unsuitably vague provisions (Patriot Act Section 215’s “any tangible thing”)?
Technology problems are easy
In the exceptional access debate, the prominent technology ideas are fairly simple in concept (and likely in practice).
The leading and most technologically reasonable (in my view) has been informally dubbed ‘Ghost Protocol’: modify the already opaque client-side and server-side behaviours in order to suppress notifications and other user-facing content when adding an authorised ‘ghost’ party under warrant — leaving the cryptography (encryption) functions well alone.
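To make the idea concrete, here is a minimal, hypothetical sketch (all names and structures are mine, not from any proposal): the server still fans a message out and encrypts a copy per recipient exactly as before, while the client and server simply skip the user-facing membership notification for a warranted ‘ghost’ member.

```python
# Hypothetical sketch of the 'ghost' idea: the cryptography is untouched --
# each message is still fanned out and encrypted per recipient -- but the
# membership-change notification is suppressed for the warrant-added party.

class Conversation:
    def __init__(self, participants):
        self.participants = set(participants)
        self.notifications = []  # what ordinary members would see

    def add_participant(self, user, ghost=False):
        """Add a member; a warranted 'ghost' skips the user-facing notice."""
        self.participants.add(user)
        if not ghost:
            self.notifications.append(f"{user} joined the conversation")

    def send(self, sender, plaintext):
        # Fan out: one per-recipient ciphertext, exactly as in a normal group.
        return {p: f"enc[{p}]({plaintext})"
                for p in self.participants if p != sender}


convo = Conversation({"alice", "bob"})
convo.add_participant("ghost-agency", ghost=True)  # under warrant

copies = convo.send("alice", "hi")
assert "ghost-agency" in copies   # the ghost gets a normally encrypted copy
assert convo.notifications == []  # but no join notice was ever surfaced
```

The point of the sketch is where the change lives: in client/server behaviour around notifications and membership display, not in the encryption functions themselves.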
Earning (and keeping) trust is hard
We implicitly place a lot of trust in application developers & platform operators
Apps such as WhatsApp, Facebook Messenger and even Signal only tell you what they do — actually knowing what they are doing is very hard.
We have already seen that developers and operators are entirely capable of creating undesirable situations all by themselves, so company-mediated identities should already attract a level of distrust.
Binary equivalence is the concept of determining that the source code has been turned into a binary (in a repeatable way) and that nothing has been added, removed or otherwise changed en-route.
It is extraordinarily difficult to sustainably determine whether changes have been made between a source code repository and the version of the software provided to users.
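In principle the check itself is simple: rebuild the app from the published source and compare a cryptographic hash against the shipped artifact (a hypothetical sketch; the hard part is making the build deterministic in the first place):

```python
# Hypothetical sketch: with a fully reproducible build, anyone can rebuild
# the app from published source and compare hashes with the shipped binary.
import hashlib


def sha256_of(path):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def binaries_equivalent(shipped_path, rebuilt_path):
    """True only if the shipped artifact is byte-identical to our rebuild."""
    return sha256_of(shipped_path) == sha256_of(rebuilt_path)
```

In practice this demands deterministic compilers, pinned dependencies and stripped timestamps across the whole toolchain, which is why so few consumer apps achieve it.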
The purpose, functionality and security of the platform operator’s code is a black box: even if we could determine exactly what the client software does, we are still reliant on backend server-side functions which remain opaque even when the source code is published.
Governments & law enforcement
For a variety of reasons, the public at large may already distrust government and law enforcement on a number of topics.
There is a disproportionate relationship between citizen and State so ultimately the executive, government and law enforcement branches must carry the burden of demonstrating that they are committed to being accountable with ‘the right amount’ of transparency.
In a move that pleasantly surprised many, GCHQ and NCSC published the UK Equities Process in late 2018 (the process of determining when discovered vulnerabilities are disclosed or withheld, with disclosure as the default preference). I mention this as it is a positive indicator that such organisations can take transparency seriously and are making efforts to be more so — plus it is quite fun to follow GCHQ on Instagram and engage with NCSC via Twitter.
“They want to break encryption”
No, they do not.
The FVEY (United States, United Kingdom, Canada, Australia & New Zealand) Statement of Principles on Access to Evidence and Encryption can be read in various ways but it does not call for the crippling of encryption designs that keep data safe — including their own data and that of the countries they seek to defend.
They use and recommend commodity IPsec and TLS for in-transit encryption of non-national-security (and some national security) data.
“WhatsApp, Signal (etc) would be rendered insecure”
Highly unlikely, unless the mechanisms are implemented incredibly poorly (which would be the app developer’s or platform operator’s fault or choosing).
Law enforcement has a desire for ‘exceptional access’ in order to facilitate its mission, but this is neither disjoint from nor contradictory to the broader mission of “helping to make the UK the safest place to live and do business online” (NCSC everywhere, but in this case on Twitter).
Law enforcement agencies want and need application developers and platform operators to write secure code:
- “writing decent code so security isn’t undermined by trivial vulnerabilities
- making sure there’s appropriate independent vetting of critical code before it’s added to the product
- protecting development networks so they know what’s really in the product and that it hasn’t been covertly modified by some external malfeasant
- protecting critical security artefacts like code signing keys”
— Ian Levy & Crispin Robinson
“This is a backdoor for anyone else to abuse”
Only if poorly implemented in server/client software by the application developer and/or platform operator.
“This is mass surveillance”
Adding a party to a conversation requires identifying that conversation and/or at least one existing participant within — and one hopes the legal basis is existing material suspicion (in that it is more than a hunch or a ‘see what we get’ collection exercise).
The technical proposals do not include adding ‘ghost’ members to every conversation across a platform.
Ultimately this ties into the legal framework problem, which requires civilian oversight and rigorous, continual testing to avoid future regret.
So, where does this put us?
Within FVEY, Australia made the first move with TOLA (Telecommunications and Other Legislation Amendment (Assistance and Access) Bill 2018) and there has been significant backlash.
Other countries (and Australia) need an informed, attentive and motivated legislative arm to draft exhaustive prose. Such legislation must ensure adequate and proportional capabilities, underpinned by reasonable mandated transparency and meaningful civilian oversight, including meaningful dispute, arbitration and redress paths for both technology companies and targeted subjects, while preserving operational capabilities and without inappropriately disclosing ‘ways and means’.
In my view as a non-lawyer who likes to write things on the Internet: while mechanisms for rapid response by law enforcement may need to exist, broad, continuous or freely given warrants should be unlawful and their use detectable, to avoid the kind of critique the US Foreign Intelligence Surveillance Court (FISC, or FISA Court) has attracted.
This isn’t about busting encryption, so as far as the technology propositions go I’m happy to leave it there until I hear something that doesn’t make any sense.
Ultimately the question of whether you trust law enforcement, governments, application developers and platform operators to come up with, implement, stick with and monitor a ‘good idea’ is something only you can answer; my only ask is that you weigh both sides with a logical brain and a pragmatic lens.
Writing to your elected representative is a good way of asking what the legislative arm is up to and making your thoughts as a constituent heard.
“There are no easy answers here. We all naturally want perfect privacy and perfect safety but those two things cannot coexist … you have to accept reasonable restrictions on both of them.” — John Oliver
You might find other exciting posts in my Medium profile. I’m on Twitter as @JoelGSamuel.