Data security and the FBI’s attempts to screw it up

I can’t believe we’re even having this discussion in the USA.

A recent article in National Journal reports on a discussion hosted by the Christian Science Monitor with Amy Hess, the executive assistant director of the FBI’s science and technology branch. The crux of Ms. Hess’s position was that building “back doors” into encryption is an acceptable tradeoff for law enforcement access.

The problem with Ms. Hess’s (and, I would assume, also the FBI’s) view is that when it comes down to it, computers are stupid. Example: when I type my login ID “skquinn” and my password into a computer I have an account on, the computer gives me access based on that password. It will give access to anyone who supplies that password with that login ID; it doesn’t matter whether it’s really me, my mom, a friend of mine, or some bozo who just stole my computer (for all values of “stole,” whether it’s basic theft, burglary, or a cop with a warrant). There are ways around that password check, though, and this is why I keep my home directories encrypted (in my case, with eCryptfs).
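The point that “computers are stupid” can be made concrete with a toy sketch. (This is illustrative only, not real login code; the username and password here are made up, and real systems use salted hashes and rate limiting.) The check is a string comparison, nothing more:

```python
# Toy sketch of a password check: the computer only compares values.
# It has no concept of WHO is typing.
import hashlib

# Hypothetical stored credential: systems keep a hash, not the password itself.
STORED = {"skquinn": hashlib.sha256(b"correct horse").hexdigest()}

def login(user: str, password: str) -> bool:
    """Grant access if the hash of the typed password matches the stored hash."""
    expected = STORED.get(user)
    if expected is None:
        return False
    return hashlib.sha256(password.encode()).hexdigest() == expected

# The computer cannot tell these two calls apart:
login("skquinn", "correct horse")   # the rightful owner typing
login("skquinn", "correct horse")   # anyone else who learned the password
```

Both calls succeed identically; nothing in the logic can distinguish the owner from a thief, a cop, or anyone else who knows the secret.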

Ms. Hess’s proposal would ask encryption software developers (such as the developers of eCryptfs) to include alternate ways of accessing the keys that decode my home directory, presumably for law enforcement use pursuant to a valid search warrant. The problem is that, again, computers are stupid, and the computer won’t be able to tell whether a given use of that alternate key is legitimate. The key can still be used to compromise my privacy; it’s bad enough if it’s a legitimate law enforcement use, but what if it’s some rogue cops who would like to see this blog disappear from the Internet for good?

Real data security, whether the government likes it or not, means the data is secure even against law enforcement access without the consent of the owner. Perhaps it would be better to say especially against law enforcement access. While I would like to think the government acts in our best interests, there are quite a few instances from around the world, past and present, where this has not been the case. Present-day China, Nazi-era Germany, and recent governments in the Middle East (Iran, Iraq, Afghanistan) all come to mind. I’m certain that if computing technology like this had existed in the 1940s, Adolf Hitler would have loved to have backdoors like that for surveillance purposes.

It’s not our problem if the FBI or any other law enforcement agency can’t spy on us. I concur with the quote from John Basil Barnhill (often misattributed to Thomas Jefferson): “When government fears the people, there is liberty. When the people fear the government, there is tyranny.”

I don’t want tyranny. And last I checked, that’s not the Statue of Tyranny standing in New York Harbor, either.

The fox in the henhouse, cyberspace edition

Again, before I get into discussing exactly what this email is about, I need to lay down the background on who’s who and what’s what. Otherwise, it’s easy for one to gloss over all of this and assume it doesn’t affect oneself, when in reality this potentially affects or could affect a large chunk of the users of the Internet.

In the beginning, there was the original Unix, AT&T Unix. The University of California at Berkeley made its own version of Unix based on AT&T’s code and called it BSD. Several operating systems descended from the original BSD code exist today: FreeBSD, NetBSD, OpenBSD, DragonFly BSD, and others. Due to its liberal license, code from BSD was used in many places; instead of writing its own software for Internet connectivity (the TCP/IP stack, for those who know what that is), Microsoft adapted the one from BSD. Apple Mac OS X also uses software adapted from FreeBSD and NetBSD, both of which trace their lineage back to the original BSD. Many GNU/Linux distributions also use software which came from BSD. Put simply, your computer likely has some software on it somewhere which originally came from BSD.

Of particular note among the BSD-derived operating systems is OpenBSD. The OpenBSD project was started by Theo de Raadt in 1995 as a fork of NetBSD, originally due to conflicts with that project’s leadership. The focus of OpenBSD became security, and today many consider it the most secure operating system on the planet.

OpenBSD has software built into it to implement IPsec, work on which appears to have started in the latter half of 1997. Theo de Raadt recently received an email from Gregory Perry, who had worked with a company called NETSEC and helped arrange funding for the OpenBSD Crypto Framework, upon which the IPsec software is based. The email, which Theo forwarded to the mailing list, contains a rather direct accusation that developers accepted money from the FBI to weaken the IPsec software in OpenBSD (specifically, to add “backdoors” to it intended for FBI use).

The full email, which has been archived online, also implies that this sabotage of the IPsec software in OpenBSD is the reason the OpenBSD project lost its DARPA funding suddenly and unexpectedly. Back in 2003, sources such as ComputerWorld reported on Theo’s no-nonsense comments against the war in Iraq (such as the often-quoted “I try to convince myself that our grant means a half of a cruise missile doesn’t get built”), and it was suggested at the time that these comments were DARPA’s motivation.

First, Theo is to be commended for, as he states, “refus[ing] to become part of… a conspiracy.” It is not an easy decision for anyone, let alone someone of Theo’s stature, to decide to publish a private email. It involves a careful consideration of the consequences of violating a social norm for the greater good, and he acknowledges this:

Of course I don’t like it when my private mail is forwarded. However the “little ethic” of a private mail being forwarded is much smaller than the “big ethic” of government paying companies to pay open source developers (a member of a community-of-friends) to insert privacy-invading holes in software.

(I’ll get back to this decision Theo had to make in a bit.)

Gregory also deserves some recognition here, for blowing the whistle as soon as he was legally permitted to. This email serves as a prime example of the kind of damage a non-disclosure agreement (NDA) can do to the public good. I don’t think all NDAs are bad, and it’s way too easy to see why the FBI wouldn’t want the news of backdoors in OpenBSD’s IPsec software getting out. And, to be fair about it, I honestly think Gregory expected his email to become public; had he wanted this to truly remain a secret, he would have told no one. This almost certainly weighed into Theo’s decision as well.

This news has anywhere from annoying to disastrous consequences for users of OpenBSD’s IPsec software and of products derived from it. The latter is the more troubling part, as Theo wrote in his email:

Since we had the first IPSEC stack available for free, large parts of the code are now found in many other projects/products. Over 10 years, the IPSEC code has gone through many changes and fixes, so it is unclear what the true impact of these allegations are.

However inconvenient it may be for law enforcement agencies such as the FBI, back doors in security software are still weaknesses. It is easy to forget sometimes that computers are pretty stupid; they do what humans tell them to do. Exactly what humans tell them to do. A computer cannot, by itself, tell the difference between honest, largely law-abiding citizens such as me and the vast majority of you out there reading, someone acting with criminal intent, someone representing the FBI or another law enforcement agency, or someone working with a group like al Qaeda or the Taliban. As an example, anyone who knows my password on any of my computers can type in the username (which is usually not intended to be kept secret; mine is normally “skquinn”) followed by that password (which is intended to be kept secret) and will be logged in as me. It does not matter to the computer one bit whether it really is me, a police officer who wound up with one of my computers somehow, legally or not (and who, as a rule, I would not want just going through the stuff on my computer; I value my Fourth Amendment rights), or an al Qaeda operative who somehow has access to my computer. (Sidenote: biometric devices such as fingerprint scanners can be fooled as well, and in some cases are more dangerous than a password typed in via the keyboard, since once compromised, one’s fingerprints are impossible to change for all practical purposes.)

So it follows that the same “backdoors” the FBI put in will work for anyone who knows about them, regardless of their good or evil intent. Such “backdoors,” as well as unintentional security holes stemming from bugs (programming errors) in the software, get found without the help of the source code (a human-readable form of the computer’s instructions) all the time. It was, and is, incredibly naive and stupid of the FBI and like-minded law enforcement agencies to assume that these “backdoors” would never be found.

We may not know for several more years just how much damage has been done by developers bribed by the FBI. This is but one small example of why I tend not to trust law enforcement agencies. Shame on the FBI for weakening the security of computers worldwide, including those outside of US jurisdiction. I hope restitution is made that involves fixing the intentionally broken software made fraudulently by programmers on the take from the FBI. That, and a pledge never to violate our privacy and peace of mind in such a fashion again, would be the minimum needed for me personally to start trusting the FBI again. Sadly, I don’t see that coming.

A victim of his own honesty

What could our justice system possibly be thinking when they prosecute cases like this?

A recent Mashable post chronicles the tale of Matthew White of Sacramento, California. Matthew is 22 years old; about a year ago, he downloaded what was represented as a copy of College Girls Gone Wild. Let's just say it was mislabeled, and I don't doubt for a moment that Matthew would not have bothered downloading the file had it been truthfully labeled: it contained child pornography.

So Matthew deleted it and thought that was the end of that. About a year later, the FBI showed up at his family's house; his family let the agents inside and allowed them to examine the computer, presumably without a search warrant.

The investigators recovered the "deleted" copy of the mislabeled child pornography, and now Matthew faces 20 years in prison. The truly sad part is that, according to the story, Matthew plans to plead guilty and accept a 3.5-year sentence.

Hopefully someone out there knows Matthew and can relay a copy of this post to him. The last thing you want to do in a situation like this is plead guilty (I've already left a comment to this effect on the original Mashable post). Were I in Matthew's situation, I would like my chances in front of a jury.

Of course, the best way out of this situation is simply to deny the agents entry to the premises or access to the computer without a warrant. At that point their options are either to come back with a warrant or to cease pursuing the case.

I don't know what crime Matthew committed that is worth sending him to prison for 3.5 years and branding him a sex offender for life. If the FBI is looking for people to make an example out of, surely they could pick someone who actually intentionally downloaded child pornography and kept the files instead of deleting them? At the least, Matthew should get a pardon. There is no sense in ruining the life of someone that young who acted in good faith and in all likelihood, was ignorant of whatever law he may have technically violated.

Shame on you, FBI agents that worked this case. Here's hoping your time holding the badge is short-lived.