Sorta Secure

The FBI is recommending that we stop using text messages.

After last week’s reports that Chinese hackers have infiltrated at least eight American telecommunications companies, the Cybersecurity and Infrastructure Security Agency issued recommendations to telecoms to help them secure their infrastructure. For the rest of us, the FBI “warned iPhone and Android users to stop texting and to use an encrypted messaging platform instead.” According to Forbes, they are urging us to “use a cell phone that automatically receives timely operating system updates, responsibly managed encryption[,] and phishing resistant MFA for email, social media[,] and collaboration tool accounts.”

This is a difficult needle to thread.

The Foreign Intelligence Surveillance Act gives the US government substantial authority for cyber surveillance. Specifically, Section 702 authorizes the National Security Agency to conduct searches of foreigners’ communications without a warrant. And because those foreigners also communicate with Americans, Americans’ communications get intercepted and monitored along the way. The roots of this program go back to Stellar Wind, a secret program from the early 2000s that forced telecommunications companies to monitor American communications and report that information to the government.

Image credit: Phil Roeder on Flickr

But strong encryption foils the government’s ability to conduct this kind of surveillance. There are ways to ensure that messages can’t be intercepted, redirected, or monitored. But those take away law enforcement’s ability to watch what’s going on. So the government strongly encourages developers to build security backdoors into their products that allow the government to monitor cyber communications and allow the companies to respond to FISA requests. That’s what they mean by “responsibly managed” encryption. That’s encryption that works well enough to make people mostly secure while being broken just enough that the “good guys” can get in if they need to.

Maybe an analogy would help. Our school has lockers with built-in combination locks. Our students are assigned lockers and are given the combinations. So they can put their stuff in their lockers, close the door, and be reasonably sure that no one else can open their locker. But there’s a list of all of the lockers and the combinations. If I wanted to, I could look up a locker’s combination on that list and go open the locker. I’ve never done that. I don’t have any reason to do that. But I could. I could also get my hands on one of the master keys that lets us open any locker without the combination. It wouldn’t be very hard.

We could make this more secure by using locks that allow the students to set the combination and that don’t support master keys. Then, they could be sure that the combination isn’t on some list someplace and that no one else could open their locker. But that would make it harder for us to help when they’ve forgotten the combination. It would make it harder to go get things out of their lockers that they need at home when they’re sick. It would make it harder to conduct locker searches if it’s suspected that they have drugs or weapons or rotting food from back in September in there.

So it’s sort of secure. It’s secure enough, unless someone *really* wants to get in. Like, say, a Chinese hacker.
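The list of locker combinations is the hardware version of what cryptographers call key escrow. Here’s a toy Python sketch of the idea; the XOR “encryption” is deliberately fake and the names are invented for illustration, not any real protocol or API:

```python
# Toy sketch of key escrow -- NOT real cryptography, just an illustration.
# Messages are encrypted with a per-user key, but a copy of each key is
# also kept on file ("escrowed") where a third party can look it up.
import secrets

escrow_database = {}  # the "list of combinations" someone else holds


def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for a real cipher: XOR each byte with the key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def send_message(user: str, message: bytes) -> bytes:
    key = secrets.token_bytes(16)   # the user's secret key
    escrow_database[user] = key     # ...but a copy goes on the list
    return xor_bytes(message, key)  # ciphertext goes over the wire


def lawful_access(user: str, ciphertext: bytes) -> bytes:
    # Anyone holding the escrow list can decrypt -- the "good guys,"
    # or whoever steals the list from them.
    return xor_bytes(ciphertext, escrow_database[user])


ct = send_message("alice", b"meet at noon")
print(lawful_access("alice", ct))  # b'meet at noon'
```

The weak point is the same as the locker list: the system is only as secure as the database of escrowed keys, which becomes a single, very attractive target.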

Years ago, we faced a similar contradiction in US government policy. In 2000, Congress passed the Children’s Internet Protection Act, requiring schools receiving federal funding to filter Internet content that was obscene, child pornography, or harmful to minors. It essentially ended the debate about Internet filtering in schools. We have to filter Internet content.

But there was this software program called UltraSurf. It was (is?) a free program that allows you to circumvent censorship filters, firewalls, and proxies. Students could easily download and use this program to get around any filtering measures we had in place. This software was initially funded by the US State Department. In an effort consistent with Voice of America and Radio Free Asia, it aimed to provide free and open access to news and information in places where the government was restricting that access. Essentially, it was designed to help circumvent the Great Firewall of China.

And, as it turns out, the software that China was using to oppress its citizens was the same software American schools were using to protect our children. Oppression and protection are two sides of the same coin.

We walk this line all the time. We want security, but we also want convenience. We want to protect our students, but we also want open access. We need to ensure privacy, but we also have to be able to defeat that privacy when the “good guys” need access.

I’m probably going to keep using text messages. I don’t have a lot of faith that the “secure” messaging apps are really all that secure. And I’ve come to terms with the fact that true privacy online doesn’t really exist.