Homeland Security and you

Cryptography expert David Holtzman explains how the ambiguous boundaries of the Homeland Security Act guarantee that many things will get reported to the government that have nothing to do with terrorism.

The "killer app" remains the computer industry's holy grail. That's geek-speak for a feature so useful that people will buy the product just to have it.

It also carries the stronger marketing connotation of necessity, as in "we can't sell these gizmos without a killer app!" Without one, good technology often has to sit out the dance. Personal cryptography, a wallflower technology that has been waiting for over a decade, is now finally ready to rock. In this case, though, its killer app is not software, but the recently passed Homeland Security Act.

Part of this act legalizes, and actually encourages, ISPs to read their customers' e-mail and turn in anyone they deem suspicious. The company must make a "good faith" effort to determine whether there is an "immediate threat to a national security interest." This shields it from litigation in the unlikely event that customers ever find out they were ratted out. But the ambiguous boundaries of the bill guarantee that many things will get reported to the government that have nothing to do with terrorism.

Anyone with anything to hide should be seriously considering a little crypto in their lives--and so should those who have nothing to hide now but someday might. It doesn't have to be related to terrorism. This is also a good time to reflect on how often e-mail has been showing up as evidence in government cases, and on the fact that some of the damning Lewinsky notes published by the Starr investigation were deleted e-mails that she hadn't even sent.

Smart people are soon going to realize that sending a plain-text e-mail through a commercial ISP is like misplacing a signed confession. This growing awareness will stimulate plenty of demand for encryption.

Cryptographic tools are inexpensive and they work. There are commonly available utilities that, once installed, will sign or encrypt e-mail, chat and instant-messaging sessions. Others use checksums--computed fingerprints that verify data hasn't been altered--to make tamperproof files, or steganography to hide information inside pictures or music. The best of these, like PGP (Pretty Good Privacy), open their source code for peer review, giving users confidence that a "back door" hasn't been slipped in.
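
By way of illustration (a minimal sketch using Python's standard hashlib library, not any particular product's code), a checksum is just a computed fingerprint of the data: change a single character of the message and the fingerprint no longer matches.

import hashlib

def checksum(data: bytes) -> str:
    # A SHA-256 digest acts as a tamper-evident fingerprint of the data.
    return hashlib.sha256(data).hexdigest()

original = b"Meet me at the usual place at noon."
fingerprint = checksum(original)

# Anyone holding the fingerprint can later verify the message was not altered.
print(checksum(original) == fingerprint)                              # True: intact
print(checksum(b"Meet me at the usual place at one.") == fingerprint)  # False: tampered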

Those old enough to remember the 1991 release of PGP might recall that it agitated military types more than if they had been handed a bowl of Pho by Jane Fonda at a Veterans of Foreign Wars meeting.

The intelligence agencies claimed public access to strong encryption would make it impossible for them to do their jobs. This led to a brouhaha over the Clipper Chip, a compromise measure that would relax encryption controls, but with a price.

The gotcha was that the encryption gadget incorporated an algorithmic lockbox, a government skeleton key for rummaging around in digital closets.

But the Clipper Chip sailed away with no one onboard, and restrictions on crypto were loosened anyway. PGP and other personal encryption tools were relegated to the hope chests of the paranoid, along with tinfoil hats and Area 51 travel guides. After the novelty wore off, the complexity of encryption easily overcame curiosity: the tools were cumbersome to install, and there was no compelling need to bother. The diehards who still digitally signed their e-mail started to seem a little, well, odd.

But now, as Kurt Cobain said, "Being paranoid doesn't mean they're not out to get you." An added thought for passive resisters: encrypted strings stick like a bone in the throat of search engines. Data miners, bots and Carnivore clones choke when they try to digest them. In fact, the cost of sifting through that much encrypted traffic would be several orders of magnitude greater than anything currently budgeted for surveillance.

Cultural habits quickly change when accelerated by practical circumstances. Widespread acceptance of encryption will finally come about as a reaction to institutionalized data voyeurism. Some will revive the Clipper Chip debate and suggest that crypto be made illegal because it is antithetical to the spirit of Homeland Defense and perhaps a little bit subversive.

People who hide things must have something to hide, right? Damn straight.