
Snowden nicked NSA docs with common tool, raising more concern about agency -- report

Edward Snowden used a common Web crawler program to scrape the NSA's systems and grab secret documents, unnamed officials tell The New York Times.

[Photo: Edward Snowden. Credit: Laura Poitras/The Guardian/screenshot by CNET]

Edward Snowden used common "Web crawler" software to grab top secret NSA documents, according to unnamed intelligence officials cited in a New York Times report, a revelation that raises further questions about the effectiveness of the agency's internal security measures.

The software was not named by the officials, but it's apparently similar to Googlebot, the program the search giant created to index new Web pages, as well as a program called "wget," which Chelsea Manning used to download the batches of secret files that were published by WikiLeaks several years back.

The crawler can be programmed with various search phrases; it then travels automatically from Web page to Web page, following links and going ever deeper in search of relevant documents.

"One of the questions I have is, while people can access individual messages related to their specific job, shouldn't this system have caught someone downloading 500,000 messages and asked him, 'What are you doing?'" Senator-elect Mark Kirk (R-Ill.) said of the Manning leaks during the 2010 WikiLeaks episode.

Similar questions are currently being asked of the NSA's systems. And it's a weighty issue given that, as the Times notes, the agency is charged with maintaining US cybersecurity against foreign adversaries that are supposedly using far more sophisticated methods than Snowden apparently did.

A presidential directive issued in response to the 2010 Manning/WikiLeaks incident required US government facilities to install updated anti-leak software. But the Hawaii facility where Snowden worked as an NSA contractor reportedly hadn't installed the updated "insider threat" program because the outpost's network didn't yet have the capacity to run it properly.

Today's Times story says it's not known if Snowden got lucky in landing at the Hawaii facility, or if he sought it out. NSA officials told the paper that Snowden would've been caught if he'd been working at the agency's headquarters in Fort Meade, Md.

Agency culture was a factor as well, the Times reports.

"Once you are inside, the assumption is that you are supposed to be there, like in most organizations," Richard Bejtlich, chief security strategist for Silicon Valley computer security firm FireEye, told the paper. "But that doesn't explain why they weren't more vigilant about excessive activity in the system."

The Times said, "The NSA declined to comment on its investigation or the security changes it has made since the Snowden disclosures. Other intelligence officials familiar with the findings of the investigations under way -- there are at least four -- were granted anonymity to discuss the investigations."

And Snowden told the paper in a statement: "It's ironic that officials are giving classified information to journalists in an effort to discredit me for giving classified information to journalists. The difference is that I did so to inform the public about the government's actions, and they're doing so to misinform the public about mine."

The Times reported earlier that the CIA suspected Snowden of trying to get his hands on classified files when he worked for the outfit in 2009, but Snowden says that report was inaccurate.

You can read the Times' complete report on Snowden's use of Web crawler software here.