Snowden used low-cost common tool to find NSA data: report


AP Photo/The Guardian, Glenn Greenwald and Laura Poitras, File

TORONTO – Whistleblower Edward Snowden used common, low-cost web crawler software to obtain top-secret NSA documents, a new report alleges, raising fresh concerns about the U.S. agency's security measures.

According to a new report by The New York Times, Snowden used software designed to search, index, and back up a website in order to gather data from the NSA's system with little effort. At the time, Snowden was working as an NSA contractor in Hawaii, giving him broad access to the agency's complete files.

READ MORE: What is PRISM? A cyber-surveillance explainer

The software reportedly allowed Snowden to set search parameters and let the crawler run on its own while he went about his day job.

A web crawler, sometimes called a spider, is an Internet bot that moves from website to website by following hyperlinks. The crawler can be programmed to copy everything in its path.

Search engines like Google use web crawling software to keep their results up to date, indexing other sites' content as they go.

These programs are easy to come by and don't cost very much to operate.
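To give a sense of how simple such a program can be, the following is a minimal sketch of a generic breadth-first web crawler written in Python using only the standard library. It is purely illustrative and is not the tool described in the report; the seed URL, page limit, and function names are assumptions made for the example.

```python
# Minimal illustrative sketch of a generic web crawler (standard library only).
# The seed URL and page limit below are hypothetical, for demonstration.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags found on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, save its contents, queue its links."""
    seen = {seed_url}
    queue = deque([seed_url])
    saved = {}

    while queue and len(saved) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load

        saved[url] = html  # copy everything in its path

        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return saved


if __name__ == "__main__":
    pages = crawl("https://example.com")  # hypothetical seed URL
    print(f"Fetched {len(pages)} pages")
```

A few dozen lines like these will follow every link they find and keep a copy of each page, which is the same basic pattern the report describes, only pointed at an internal network instead of the public web.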

The report, which cites unnamed intelligence officials investigating the matter, revealed that Snowden accessed roughly 1.7 million NSA files in the process.

