Edward Snowden Used Cheap, ‘Web Crawler’ Software To …

Investigators say the NSA should have easily detected former contractor's activity

NSA whistle-blower Edward Snowden in a still image taken from video during an interview by the Guardian in his hotel room in Hong Kong on June 6, 2013

Edward Snowden used widely available automated software to steal classified data from the National Security Agency's networks, intelligence officials have determined, raising questions about the security of other top-secret military and intelligence systems under the NSA's purview.

The New York Times, citing anonymous sources, reported that the former NSA contractor used a web crawler, cheap software designed to index and back up websites, to scour the NSA's data and return a trove of confidential documents. Snowden apparently programmed his search to find particular subjects and determine how deeply to follow links on the NSA's internal networks.
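
For readers unfamiliar with the term, a web crawler of the kind described is conceptually simple: start from a page, collect its links, and follow them breadth-first up to a chosen depth, keeping pages that match certain subjects. The sketch below is purely illustrative and assumes nothing about Snowden's actual tool; the start URL, keywords, and depth limit are hypothetical placeholders.

```python
# Illustrative sketch of a depth-limited, keyword-filtered crawler.
# All URLs and keywords are hypothetical placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, keywords, max_depth=2):
    """Breadth-first crawl from start_url, following links up to max_depth.

    Pages whose text contains any of the keywords are collected; every page's
    links are queued for the next level until the depth limit is reached.
    """
    seen = {start_url}
    queue = deque([(start_url, 0)])
    matches = []

    while queue:
        url, depth = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                page = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable pages

        # Keep pages that mention any of the target subjects.
        if any(kw.lower() in page.lower() for kw in keywords):
            matches.append(url)

        # Follow links one level deeper, if the depth limit allows.
        if depth < max_depth:
            parser = LinkExtractor()
            parser.feed(page)
            for href in parser.links:
                next_url = urljoin(url, href)
                if next_url not in seen:
                    seen.add(next_url)
                    queue.append((next_url, depth + 1))

    return matches


if __name__ == "__main__":
    # Hypothetical example: crawl a local test site for two subjects.
    found = crawl("http://localhost:8000/", ["budget", "report"], max_depth=2)
    print("\n".join(found))
```

The point of the sketch is how little machinery is involved: a queue, a visited set, a depth counter, and a keyword test are enough to sweep a large, lightly partitioned internal network.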

Investigators found that Snowden's method of obtaining the data was hardly sophisticated and should have been easily detected. Snowden accessed roughly 1.7 million files, intelligence officials said last week, partly because the NSA compartmented relatively little information, making it easier for a web crawler like the one Snowden used to access a large number of files.

[NYT]
