US whistleblower Edward Snowden, who honed his hacking skills in India, used inexpensive and widely available software to ‘scrape’ the National Security Agency’s networks, according to American intelligence officials probing his high-profile case.
Using ‘web crawler’ software designed to search, index and back up a website, 30-year-old Snowden ‘scraped data out of our systems’ while he went about his day job, The New York Times quoted a senior intelligence official as saying.
‘We do not believe this was an individual sitting at a machine and downloading this much material in sequence,’ the official said.
The process by which Snowden gained access to a huge trove of the country’s most highly classified documents, he said, was ‘quite automated’ and the former CIA contractor kept at it even after he was briefly challenged by agency officials.
The findings are striking because the NSA’s mission includes protecting America’s most sensitive military and intelligence computer systems from cyber attacks, especially the sophisticated attacks that emanate from Russia and China, the report said.
Snowden’s ‘insider attack,’ by contrast, was hardly sophisticated and should have been easily detected, investigators found.
Snowden had broad access to the NSA’s complete files because he was working as a technology contractor for the agency in Hawaii, helping to manage the agency’s computer systems in an outpost that focuses on China and North Korea.
A web crawler, also called a spider, automatically moves from website to website, following links embedded in each document, and can be programmed to copy everything in its path.
Snowden appears to have set the parameters for the searches, including which subjects to look for and how deeply to follow links to documents and other data on the NSA’s internal networks. US intelligence officials told a House hearing last week that he accessed roughly 1.7 million files.
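The kind of tool described above is straightforward to build. The sketch below is a minimal, hypothetical illustration of a crawler with the two parameters the officials mention, the subjects to look for and how deeply to follow links; it is not the actual software Snowden used, and every name and address in it is an invented example.

```python
# Illustrative sketch only: a crawler that starts from a seed page, follows
# embedded links to a configurable depth, and keeps a copy of any page
# mentioning the chosen subjects. All URLs and keywords are hypothetical.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url, keywords, max_depth, seen=None, copies=None):
    """Fetch a page, copy it if it matches any keyword, then follow its links."""
    seen = seen if seen is not None else set()
    copies = copies if copies is not None else {}
    if max_depth < 0 or url in seen:
        return copies
    seen.add(url)
    try:
        with urllib.request.urlopen(url) as resp:
            page = resp.read().decode("utf-8", errors="replace")
    except OSError:
        return copies
    if any(k.lower() in page.lower() for k in keywords):
        copies[url] = page  # keep a copy of every matching page in its path
    parser = LinkExtractor()
    parser.feed(page)
    for link in parser.links:
        crawl(urljoin(url, link), keywords, max_depth - 1, seen, copies)
    return copies

# Example (hypothetical): crawl two link-levels deep from an internal wiki,
# copying pages on the chosen subjects.
# results = crawl("http://intranet.example/wiki", ["China", "North Korea"], max_depth=2)
```

A script of this size, left running against an internal network, copies documents automatically rather than requiring someone to sit at a machine and download each file in sequence, which is what the intelligence officials describe.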
According to media reports, Snowden had traveled to India in 2010. He spent six days in New Delhi taking courses in ‘ethical hacking,’ in which he learned advanced techniques for breaking into computer systems and exploiting flaws in software, the reports said.
Among the materials prominent in the Snowden files are the agency’s shared ‘wikis,’ databases to which intelligence analysts, operatives and others contributed their knowledge.
Some of that material indicates that Snowden ‘accessed’ the documents. But experts say they may well have been downloaded not by him but by the programme acting on his behalf.
NSA officials insist that if Snowden had been working from NSA headquarters at Fort Meade, Maryland, which was equipped with monitors designed to detect when a huge volume of data was being accessed and downloaded, he almost certainly would have been caught. But because he worked at an agency outpost that had not yet been upgraded with modern security measures, his copying raised few alarms.
One official familiar with Snowden’s activities said his actions had been ‘challenged a few times.’