Snowden used low-cost common tool to find NSA data: report

AP Photo/The Guardian, Glenn Greenwald and Laura Poitras, File

TORONTO – Whistleblower Edward Snowden used common low-cost “web crawler” software to obtain top secret NSA documents, a new report alleges, raising new concerns about the U.S. agency’s security measures.

According to a new report by the New York Times, Snowden used software designed to search, index, and back up a website in order to gather data from the NSA’s system with little effort. At the time, Snowden was working as an NSA contractor in Hawaii, giving him “broad access” to the agency’s complete files.

READ MORE: What is PRISM? A cyber-surveillance explainer

The software reportedly allowed Snowden to set search parameters so the crawler could keep running on its own while he went about his day job.

A web crawler – sometimes called a spider – is an Internet bot that moves from website to website by following hyperlinks, and it can be programmed to copy everything in its path.


Search engines such as Google use web-crawling software to discover and index content from other sites, keeping their own search results up to date.

These programs are easy to come by and don’t cost very much to operate.
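The Times report does not describe the exact tool Snowden used, but the basic technique is simple enough to sketch. Below is a minimal, illustrative crawler written in Python using only the standard library: it starts from a seed page, follows the hyperlinks it finds, and saves a copy of each page it visits. The seed URL, output directory and page limit are placeholder assumptions for the example, not details from the report.

```python
# Minimal web-crawler sketch: fetch a page, save a copy, follow its links.
# Illustrative only; the seed URL and limits below are placeholders.
import os
import urllib.parse
import urllib.request
from collections import deque
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, out_dir="mirror", max_pages=50):
    os.makedirs(out_dir, exist_ok=True)
    queue = deque([seed_url])
    seen = set()

    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)

        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load

        # "Copy everything in its path": save the fetched page to disk.
        fname = urllib.parse.quote(url, safe="") + ".html"
        with open(os.path.join(out_dir, fname), "w", encoding="utf-8") as f:
            f.write(html)

        # Follow the hyperlinks found on this page.
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            queue.append(urllib.parse.urljoin(url, href))


if __name__ == "__main__":
    crawl("https://example.com")
```

Even this bare-bones version captures the core behaviour described in the report: given a starting point and broad access, it will keep fetching and copying linked pages with no further human effort.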

The report, which cites unnamed intelligence officials investigating the matter, revealed that Snowden accessed roughly 1.7 million NSA files in the process.

READ MORE: Obama orders changes to NSA programs

But the fact that such an unsophisticated approach went unnoticed has prompted further security concerns at the NSA, the agency responsible for protecting military intelligence from cyber attacks.

According to the Times report, Snowden was questioned about his activities while working as a contractor, but told investigators he was performing routine network maintenance. The Hawaii post where he was working when he accessed the documents had not received a security upgrade.
