Security back doors: Necessary evil or ‘vulnerability by design’?


Since the Snowden revelations brought to public attention the extent of mass online surveillance, there has been an understandable acceleration in the adoption of stronger, on-by-default cryptography. This has caused alarm in some circles, and security services across the globe have been lobbying their governments for action.

Essentially what they want is a solution that will allow data to be encrypted between A and B, but still allow inspection by a trusted party should the need arise. There are a couple of salient precedents:

After 9/11, the U.S. realized that passenger security was at odds with baggage security: locks on checked luggage were hampering the TSA’s ability to ensure nothing dangerous was contained within. What happened when a sniffer dog flagged a case and an X-ray proved inconclusive?

An early solution was plastic cable ties with serial numbers, so that owners could see whether their luggage had been tampered with. This had obvious shortcomings, however; for one, tamper-evident does not equal theft-proof.

What the world really needed was a solution that could secure contents between A and B, but that could still allow inspection by a trusted party if the need arose.

A solution finally came in the shape of TSA-approved Travel Sentry locks: essentially a lock with a built-in “back door” in the form of two keys, one held by the owner and one drawn from a set of master keys carried by TSA agents.

There were a few teething issues with the Travel Sentry locks. Firstly, a lot of cases were reported arriving at point B somewhat lighter (solved in part by installing video cameras to monitor baggage staff). Secondly, cases arrived at point B with the locks cut off, or worse, badly damaged by TSA agents who had presumably made up for the carelessness of misplacing a set of master keys with extra vigilance. (This one was “solved” with one-way correspondence, marked confidential, denying any responsibility.)

To add insult to injury, there was further furor recently when it emerged that the Washington Post had “accidentally” published a photo of the master keys online back in 2014, detailed enough for security researchers to recreate them with 3D printers. While this was both interesting and amusing, it was completely moot: as the owner of several TSA-approved locks myself (they come attached to most new luggage these days), I have known for a long time that it’s actually easier to pick them than it is to open them with a valid key.

Today, TSA-approved locks deserve the derision they attract. Luggage cling-wrap stalls are available in every major airport, and interestingly, several Web commenters have suggested that the best way to secure your luggage is to fly with a firearm in your case, which, tellingly, mandates a non-TSA-approved lock and requires all sorts of additional security controls by the airline.

A decade prior to the TSA lock debacle, there was the Clipper Chip, a short-lived key escrow system designed mainly with telephones in mind and based upon a secret, NSA-developed cipher called Skipjack. The purpose of the Clipper Chip was to allow data to be encrypted between A and B, but still allow inspection by a trusted party should the need arise.
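
To make the mechanics concrete, here is a minimal sketch of the key-escrow idea in Python, assuming the pyca/cryptography library: the session key that protects the traffic is wrapped once for the recipient and once more for an escrow authority, which can then recover the traffic on demand. It illustrates the concept only; it is not Skipjack or the actual LEAF format, and every name in it is hypothetical.

```python
# A minimal sketch of key escrow, assuming the pyca/cryptography library.
# NOT Skipjack or the real LEAF format; all names are illustrative only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

escrow_key = AESGCM.generate_key(bit_length=256)  # held by the escrow agent

def encrypt_with_escrow(plaintext: bytes, recipient_key: bytes):
    # A fresh session key protects the actual traffic between A and B.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
    # The session key is wrapped twice: once for the recipient...
    n1 = os.urandom(12)
    for_recipient = (n1, AESGCM(recipient_key).encrypt(n1, session_key, None))
    # ...and once for the escrow authority: this is the "back door".
    n2 = os.urandom(12)
    for_escrow = (n2, AESGCM(escrow_key).encrypt(n2, session_key, None))
    return nonce, ciphertext, for_recipient, for_escrow

def escrow_decrypt(nonce, ciphertext, for_escrow):
    # The trusted party recovers the session key, then the traffic.
    wrap_nonce, wrapped_key = for_escrow
    session_key = AESGCM(escrow_key).decrypt(wrap_nonce, wrapped_key, None)
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

recipient_key = AESGCM.generate_key(bit_length=256)
nonce, ct, _, for_escrow = encrypt_with_escrow(b"meet at dawn", recipient_key)
print(escrow_decrypt(nonce, ct, for_escrow))  # b'meet at dawn'
```

The catch, of course, is that whoever holds `escrow_key` can read everything, which is precisely why the whole scheme stands or falls on how well that one key is protected.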

The Clipper Chip had some teething problems. Firstly, it was technically flawed: most famously, Matt Blaze showed in 1994 that the 16-bit checksum protecting the chip’s Law Enforcement Access Field (LEAF) could be brute-forced, letting users defeat the escrow mechanism entirely. Such vulnerabilities were probably missed because the NSA kept the whole program secret, and as such it was not made available for inspection by trusted parties. Who says that Americans don’t get irony?
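
To give a sense of just how small a 16-bit check is, the toy sketch below (with a stand-in checksum, not the real LEAF algorithm) finds a bogus field that passes the check in roughly 65,536 attempts on average, which a modern machine does in well under a second:

```python
import hashlib

def checksum16(data: bytes) -> int:
    # A stand-in 16-bit check; the real LEAF checksum was different,
    # but any 16-bit value has the same tiny search space of 65,536.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

target = checksum16(b"a genuine LEAF")  # the value the chip will accept
counter = 0
while True:  # expect roughly 65,536 iterations on average
    candidate = counter.to_bytes(16, "big")
    if checksum16(candidate) == target:
        print(f"bogus field accepted after {counter + 1:,} tries")
        break
    counter += 1
```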


Next, there was the unfortunate reality that the U.S. government had no jurisdiction outside the U.S., so while it could insist that American manufacturers include the chip in their products, it could not do the same for overseas companies. Effectively, American companies ended up paying more to manufacture phones that no one wanted to use.

Announced in 1993, it was dead in the water by 1996. I was unable to find out what this exercise cost the American taxpayer, but the fact that this information is not easy to find via a Google search suggests to me that it probably wasn’t a bargain.

Which brings us to the present-day predicament: do we learn from history that built-in back doors are, in the words of the French deputy minister for digital affairs, “vulnerability by design,” or do we keep on doubling down?

Third time lucky perhaps?


Eric Pinkerton, a CSC Cybersecurity principal security consultant, has worked on numerous cloud assurance engagements, including complex control audits, detailed threat risk assessments and technical configuration reviews. Pinkerton is also proud to have contributed to both the NESAF Cloud Security Framework and the CSA Cloud Controls Matrix.
