Cheap software helped Edward Snowden plunder NSA secrets …

NEW YORK: Intelligence officials investigating how Edward J Snowden gained access to a huge trove of the country's most highly classified documents say they have determined that he used inexpensive and widely available software to "scrape" the National Security Agency's networks, and he kept at it even after he was briefly challenged by agency officials.

Using "Web crawler" software designed to search, index and back up a website, Snowden "scraped data out of our systems" while he went about his day job, according to a senior intelligence official. "We do not believe this was an individual sitting at a machine and downloading this much material in sequence," the official said. The process, he added, was "quite automated".

The findings are striking because the NSA's mission includes protecting the nation's most sensitive military and intelligence computer systems from cyberattacks, especially the sophisticated attacks that emanate from Russia and China. Snowden's "insider attack," by contrast, was hardly sophisticated and should have been easily detected, investigators found.

Moreover, Snowden succeeded nearly three years after the WikiLeaks disclosures, in which military and State Department files, of far less sensitivity, were taken using similar techniques.

Snowden had broad access to the NSA's complete files because he was working as a technology contractor for the agency in Hawaii, helping to manage the agency's computer systems in an outpost that focuses on China and North Korea. A Web crawler, also called a spider, automatically moves from website to website, following links embedded in each document, and can be programmed to copy everything in its path.

Snowden appears to have set the parameters for the searches, including which subjects to look for and how deeply to follow links to documents and other data on the NSA's internal networks. Intelligence officials told a House hearing last week that he accessed roughly 1.7 million files.
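
The mechanics are simple to reproduce. Below is a minimal sketch, in Python, of the general kind of crawler described: it starts at one page, follows embedded links breadth-first to a set depth, and keeps a copy of any page mentioning a subject of interest. The host, keywords, and depth are illustrative placeholders; the specific tool Snowden used has not been publicly identified.

# A hedged illustration, not the actual tool: a depth-limited,
# keyword-filtered crawler built only from the Python standard library.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    # Collects href targets from <a> tags on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, keywords, max_depth=3):
    # Breadth-first walk: follow links up to max_depth, copying any
    # page whose text mentions one of the chosen subjects.
    seen = {start_url}
    queue = deque([(start_url, 0)])
    matches = {}
    while queue:
        url, depth = queue.popleft()
        try:
            page = urlopen(url).read().decode("utf-8", errors="replace")
        except (OSError, ValueError):
            continue  # unreachable page or unsupported link type: skip
        if any(k.lower() in page.lower() for k in keywords):
            matches[url] = page  # "copy everything in its path"
        if depth < max_depth:
            parser = LinkParser()
            parser.feed(page)
            for link in parser.links:
                target = urljoin(url, link)
                if target not in seen:
                    seen.add(target)
                    queue.append((target, depth + 1))
    return matches

# Hypothetical usage; the host and subjects are placeholders.
# pages = crawl("http://wiki.example.internal/", ["budget"], max_depth=2)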

Among the materials prominent in the Snowden files are the agency's shared "wikis," databases to which intelligence analysts, operatives and others contributed their knowledge. Some of that material indicates that Snowden "accessed" the documents. But experts say they may well have been downloaded not by him but by the program acting on his behalf.

Agency officials insist that if Snowden had been working from NSA headquarters at Fort Meade, Md, which was equipped with monitors designed to detect when a huge volume of data was being accessed and downloaded, he almost certainly would have been caught. But because he worked at an agency outpost that had not yet been upgraded with modern security measures, his copying of what the agency's newly appointed No. 2 officer, Rick Ledgett, recently called "the keys to the kingdom" raised few alarms.
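
The monitors the officials describe amount to volume-anomaly alarms. A minimal sketch of the idea, with all thresholds and figures invented for illustration: flag any account whose daily download volume jumps far above its own historical baseline.

from statistics import mean, stdev

def flag_download_spike(history_mb, today_mb, sigmas=3.0):
    # True when today's volume (MB) exceeds the account's daily average
    # by more than `sigmas` standard deviations; the 1 MB floor keeps a
    # perfectly flat history from muting the alarm.
    if len(history_mb) < 2:
        return False  # not enough history to establish a baseline
    return today_mb > mean(history_mb) + sigmas * max(stdev(history_mb), 1.0)

# A user who normally moves about 50 MB a day suddenly pulls 20 GB:
print(flag_download_spike([40, 55, 60, 45, 50], 20_000))  # True -> alarm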

"Some place had to be last" in getting the security upgrade, said one official familiar with Snowden's activities. But he added that Snowden's actions had been "challenged a few times."

In at least one instance when he was questioned, Snowden provided what were later described to investigators as legitimate-sounding explanations for his activities: As a systems administrator he was responsible for conducting routine network maintenance. That could include backing up the computer systems and moving information to local servers, investigators were told.


Snowden used low-cost common tool to find NSA data: report



TORONTO: Whistleblower Edward Snowden used common, low-cost web crawler software to obtain top secret NSA documents, a new report alleges, raising new concerns about the U.S. agency's security measures.

According to a new report by the New York Times, Snowden used software designed to search, index, and back up a website in order to gather data from the NSA's system with little effort. At the time, Snowden was working as an NSA contractor in Hawaii, giving him broad access to the agency's complete files.


The software reportedly allowed Snowden to set search parameters for the crawl, which then ran on its own while he went about his day job.

The web crawler, sometimes called a spider, is an Internet bot that moves from website to website by following hyperlinks. The crawler can be programmed to copy everything in its path.

Search engines like Google use web crawling software to index other sites' web content and keep their own results up to date.

These programs are easy to come by and don't cost very much to operate.
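
As one concrete illustration of how accessible such software is, a free, open-source framework like Scrapy can mirror a site in a couple of dozen lines. Scrapy is named here only as an example of the category; the report does not identify the tool used, and the start URL and depth limit below are placeholders.

import scrapy
from scrapy.crawler import CrawlerProcess
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule

class MirrorSpider(CrawlSpider):
    # Follows every hyperlink up to three levels deep and saves each
    # fetched page to disk, copying everything in its path.
    name = "mirror"
    start_urls = ["http://site.example/"]  # placeholder host
    custom_settings = {"DEPTH_LIMIT": 3}
    rules = (Rule(LinkExtractor(), callback="save_page", follow=True),)

    def save_page(self, response):
        filename = response.url.replace("://", "_").replace("/", "_") + ".html"
        with open(filename, "wb") as f:
            f.write(response.body)

if __name__ == "__main__":
    process = CrawlerProcess()
    process.crawl(MirrorSpider)
    process.start()  # blocks until the crawl finishes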

The report, which cites unnamed intelligence officials investigating the matter, revealed that Snowden accessed roughly 1.7 million NSA files in the process.


Edward Snowden Declines to Testify on Espionage Cases Before the European Parliament / Global, Paola Barquet – Video


Edward Snowden declines to testify on espionage cases before the European Parliament. 07/02/14. Program: Global. Host: Paola Barquet. Airtime...

By: ExcélsiorTv


Wladimir Putin and Edward Snowden: Two Towering Men of 2013 – Video


Wladimir Putin and Edward Snowden: two towering men of 2013. Original post: http://www.seewald.ru/wladimir-putin-und-edward-snowden/ The year 2013 was a very eventful year in which a great deal happened and many people...

By: seewaldru


Edward Snowden Used Cheap, ‘Web Crawler’ Software To …

Investigators say the NSA should have easily detected former contractor's activity

NSA whistle-blower Edward Snowden in a still image taken from video during an interview by the Guardian in his hotel room in Hong Kong on June 6, 2013

Edward Snowden used widely available automated software to steal classified data from the National Security Agency's networks, intelligence officials have determined, raising questions about the security of other top secret military and intelligence systems under the NSA's purview.

The New York Times, citing anonymous sources, reported that the former NSA contractor used a web crawler, cheap software designed to index and back up websites, to scour the NSA's data and return a trove of confidential documents. Snowden apparently programmed his search to find particular subjects and determine how deeply to follow links on the NSA's internal networks.

Investigators found that Snowden's method of obtaining the data was hardly sophisticated and should have been easily detected. Snowden accessed roughly 1.7 million files, intelligence officials said last week, partly because the NSA compartmented relatively little information, making it easier for a web crawler like the one Snowden used to access a large number of files.
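
The compartmentation point can be made concrete with a toy model: when one credential is honored across nearly every compartment, a crawler running under that credential can copy nearly every file it discovers. All compartments, accounts, and files below are invented for illustration.

# Toy access-control model (every name here is hypothetical).
COMPARTMENT_MEMBERS = {
    "shared_wiki": {"analyst_a", "admin_x"},
    "sigint_reports": {"analyst_a", "admin_x"},
    "personnel": {"hr_officer"},
}

def readable_by(user, discovered):
    # discovered: (compartment, filename) pairs a crawl has found.
    # With little compartmentation, one account reads almost all of them.
    return [f for f in discovered if user in COMPARTMENT_MEMBERS.get(f[0], set())]

found = [("shared_wiki", "notes"), ("sigint_reports", "report"), ("personnel", "roster")]
print(readable_by("admin_x", found))     # reaches 2 of 3 compartments
print(readable_by("hr_officer", found))  # reaches only 1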

[NYT]


Snowden used simple technology to mine NSA computer networks

Former National Security Agency systems analyst Edward Snowden speaks during a presentation. (AP)

The National Security Agency whistleblower Edward Snowden used inexpensive and widely available software to plunder the agency's networks, it has been reported, raising further questions about why he was not detected.

Intelligence officials investigating the former contractor, who leaked thousands of documents to media outlets including the Guardian last year, determined that he used "web crawler" software, designed to search, index and back up websites, to "scrape" highly classified files, the New York Times reported on Sunday.

The unusual activity triggered a brief challenge from agency officials but Snowden persuaded them it was legitimate and continued mining data.

"We do not believe this was an individual sitting at a machine and downloading this much material in sequence," an unnamed official told the Times. The process, the official said, was "quite automated".

Web crawlers, also known as spiders, move from website to website, following links embedded in each document, and can copy everything they encounter. Snowden is believed to have accessed about 1.7 million documents.

The NSA has a mandate to deter and rebuff cyberattacks against US computer systems, but Snowden's insider attack was relatively unsophisticated and should have been detected, investigators said, especially since it came three years after Chelsea Manning used a similar technique to access State Department and military data that was then passed to WikiLeaks.

Snowden was a technology contractor working at an agency outpost in Hawaii that had yet to be equipped with modern monitors that might have sounded the alarm. The NSA's headquarters at Fort Meade, Maryland, had such monitors, raising the question of whether Snowden was either very lucky or very strategic, said one intelligence official.

According to The Snowden Files, a new book by Guardian journalist Luke Harding, Snowden moved to a job in Honolulu with security company Booz Allen Hamilton because it afforded even greater privileges.

Some members of Congress have accused Snowden of being a spy for Russia, where he has been granted asylum. He has denied the allegation.
