
Perspective: Tech's answer to Big Brother

Even as the government looks to build a Total Information Awareness database, CNET News.com's Washington watcher Declan McCullagh says technology can still help you preserve your privacy rights.

WASHINGTON--Why is everyone so surprised that the U.S. government wants to create a Total Information Awareness database with details about everything you do?

This is an unsurprising result of having so much information about our lives archived on the computers of our credit card companies, our banks, our health insurance companies and government agencies.

Now a Defense Department agency is devising a way to link these different systems together to create a kind of digital alter ego of each of us. After the Sept. 11 terrorist attacks, this proposed centralization was inevitable--and it's only going to get worse.

Blame retired Admiral John Poindexter, national security adviser for former President Ronald Reagan, who returned to the Pentagon in February to run a creepy new agency that's trying to create this mammoth surveillance and information-analysis system. It's called Total Information Awareness, and it's funded by the Defense Advanced Research Projects Agency (DARPA).

Don't get me wrong. I'm not saying it's a good idea, or that it's consistent with the traditional American values of limited government and a sharp demarcation between the private and the public sector. I'm not even sure if Poindexter's brainchild could ever work.

What I am saying is that if our personal information--some of it extraordinarily sensitive--is archived in corporate or government databases and protected only by the weak shield of the law, it's vulnerable to federal snoops.

When a nation is responding to perilous threats, politicians tend to repeal privacy laws in a femtosecond. The current process started with overwhelming votes for the USA Patriot Act last year. (It cleared the Senate with only one "nay" vote, from the courageous Russ Feingold, D-Wisc.) And if another terrorist attack happens, all bets are off.

That's why simply enacting laws and trusting the government to protect our privacy can be a very dangerous thing. Just ask the Japanese-Americans forced into internment camps during World War II. New research says they were selected using Census Bureau data--data that had been handed over to the government in strict confidence. Or ask the people who were robbed by the former chief of detectives for the Chicago Police Department, who pleaded guilty last year to using law enforcement databases to plot crimes.

Technology offers a better way to preserve our rights against government overreaching. New crises may prompt Congress to vote unanimously to skewer the Bill of Rights. But technological protections don't vary with the whims of politicians or shifts in Supreme Court majorities.

The sad thing is that for years we've known about technology that can slow down this mass "databasification" of American society. We just haven't used it.

One approach is outlined in Peter Wayner's useful book, "Translucent Databases." It describes methods--complete with Java code that produces standard SQL (Structured Query Language)--to construct databases that use one-way functions to scramble data and shield it from prying eyes.

"The main goal I had with writing the book is to show it is possible to build a database that does useful work and solves problems without keeping personal information," Wayner said. "At first it seems counterintuitive. You figure that if you're going to arrange appointments and keep track of what customers bought in the past, you need the information there. But it turns out it's possible (to scramble it), and it can make the database smaller and faster, too."

A basic example is the venerable Unix password file, which doesn't store any actual passwords. Instead, the operating system scrambles a user's password using a one-way hash function and saves the scrambled version to the file. Because the function cannot be reversed, the file stays secure even if a malicious hacker gets a look at it, yet users can still log in.

More importantly, even if Poindexter obtained that file through a court order or some more surreptitious method, he wouldn't be able to extract anyone's actual passwords from it, assuming the hash algorithm works properly.
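Here's a minimal sketch of that idea in Java, the language Wayner's book works in. Classic Unix relied on crypt(); SHA-256 stands in for it here, and an in-memory map stands in for the password file.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.HashMap;
import java.util.Map;

// Sketch of a Unix-style password file: only a salted one-way hash is
// ever stored, never the password itself.
public class PasswordFile {
    private final Map<String, String> entries = new HashMap<>(); // user -> "salt:hash"
    private final SecureRandom random = new SecureRandom();

    private static String hash(String salt, String password) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest((salt + password).getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }

    public void addUser(String user, String password) throws Exception {
        String salt = Long.toHexString(random.nextLong());
        // Only the scrambled form reaches the "file."
        entries.put(user, salt + ":" + hash(salt, password));
    }

    public boolean login(String user, String password) throws Exception {
        String entry = entries.get(user);
        if (entry == null) return false;
        String[] parts = entry.split(":");
        // Re-hash the attempt and compare; the stored value never reveals the password.
        return parts[1].equals(hash(parts[0], password));
    }

    public static void main(String[] args) throws Exception {
        PasswordFile pw = new PasswordFile();
        pw.addUser("alice", "correct horse battery staple");
        System.out.println(pw.login("alice", "correct horse battery staple")); // true
        System.out.println(pw.login("alice", "wrong guess"));                  // false
    }
}

A snoop who copies the map gets salts and hashes, nothing more; the only way to learn a password is to guess it and re-hash.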

Wayner's book provides tips that more programmers should follow. He shows how to build a department store database, scrambled with a one-way function, that can't divulge personal information unless a customer's full name is supplied. Other examples include encrypted car rental databases and lotteries.
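In that spirit, a toy version of the department store case might look like the following. This is not code from the book, and the table and column names are invented: the purchase table is keyed by a one-way hash of the customer's name, so on its own it identifies nobody, yet a clerk who is handed the exact name can still pull up the history.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class TranslucentStore {
    // One-way hash of the customer's full name; this scrambled key is all
    // the database ever sees.
    private static String hashKey(String fullName) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(fullName.trim().toLowerCase()
                                          .getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }

    // Record a purchase. (A real program would use a PreparedStatement rather
    // than string formatting; plain SQL text keeps the sketch readable.)
    static String insertSql(String fullName, String item, int priceCents) throws Exception {
        return String.format(
            "INSERT INTO purchases (cust_key, item, price_cents) VALUES ('%s', '%s', %d);",
            hashKey(fullName), item, priceCents);
    }

    // Look up a history: useless to a snoop who doesn't already know the name.
    static String selectSql(String fullName) throws Exception {
        return String.format(
            "SELECT item, price_cents FROM purchases WHERE cust_key = '%s';",
            hashKey(fullName));
    }

    public static void main(String[] args) throws Exception {
        System.out.println(insertSql("Jane Q. Public", "toaster", 2499));
        System.out.println(selectSql("Jane Q. Public"));
    }
}

Subpoena the purchases table and all you get is merchandise attached to meaningless hex strings.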

A second approach was invented by Stefan Brands, previously a scientist at Zero Knowledge Systems, who outlined it in a book titled "Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy."

Brands describes a remarkable technology called limited disclosure certificates. It's a pre-emptive response to current trends in authentication, where you might end up using one digital ID certificate for everything from driving to shopping to health care--and all your information and transactions would instantly appear in Poindexter's database.

Limited disclosure certificates solve that centralization problem. They use a clever bit of mathematics to protect the identity of honest people, but reveal the identity of people who attempt to commit fraud. As soon as you try to cheat someone, the privacy protection evaporates.

Brands describes in his book how a limited disclosure certificate would work on a smart card: "Any data leakage from and to the smart card can be blocked. The cardholder can even prevent his or her smart card from developing information that would help the card issuer to trace the cardholder's transactions, should the card contents become available to the card issuer. Transactions can be completed within as little as 1/20th of a second, so that road-toll and public transport applications are entirely feasible."

In an interview, Brands added that "instead of all this information about you being managed in central databases, you could manage it yourself. In theory, all the data that organizations hold about you and need to make decisions about you could be distributed to you.

"If you use good cryptography, the organizations' information is protected: You can't modify the information. At the same time, you would then be able to disclose whatever you need for a particular purpose."

MIT professor Ron Rivest described Brands' work as giving people a way to remain anonymous and yet convince an Internet service provider that they are paid subscribers. The beauty is that the user's sessions are unlinkable--the ISP can't even tell whether a user currently logged in is the same one who used the service at a previous time.

It's true that Congress could outlaw Wayner's and Brands' techniques and force all information to be stored in a surveillance-enabled way. But until that happens, we don't have to make it any easier for Poindexter and his snoops.