The Schwartz Report

Web Crawler Helped Snowden Find Classified NSA Files in SharePoint

Though it's well documented that former NSA contract employee Edward Snowden accessed many classified files stored in SharePoint repositories, it was the unfettered access he had as a systems administrator that let him steal and later release those files.

Giving him that access turned out to be an epic failure, whether or not you believe he did the right thing by sharing what he knew with the world. From a security standpoint, this administrator was an insider who was able to steal a lot of classified data that the agency didn't want disclosed.

Now it has come to light how he found those files. A report this weekend in The New York Times says Snowden used a Web crawler to find the 1.7 million files he ultimately took. The report doesn't say whether he used SharePoint's built-in search engine (which effectively works like a Web crawler), a freely available crawler found on the Internet or a custom bot he created.

According to the report, NSA officials said only that it functioned like the Googlebot Web crawler, also known as a spider. What remains a mystery is why the crawler's scanning of classified files didn't set off any alarms, especially since the NSA rarely uses Web crawlers, according to the report. And because Snowden inserted passwords into it, "the Web crawler became especially powerful."
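The report doesn't describe Snowden's actual tool, but a crawler of this sort can be little more than a script that fetches a page, extracts its links, follows them and notes any documents it finds along the way. Here's a minimal sketch of that general technique; the start URL, document extensions and page limit are hypothetical placeholders, and a real SharePoint crawl would also need to handle authentication.

```python
# Minimal illustration of a breadth-first Web crawler ("spider") -- the general
# kind of tool the report describes, not Snowden's actual code.
# The start URL and document extensions below are hypothetical placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

START_URL = "http://intranet.example.com/sites/docs/"  # hypothetical
DOC_EXTENSIONS = (".pdf", ".docx", ".pptx", ".xlsx")   # hypothetical


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=100):
    """Breadth-first crawl: fetch a page, queue its links, record documents."""
    seen = {start_url}
    queue = deque([start_url])
    documents = []

    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load

        parser = LinkExtractor()
        parser.feed(body)

        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same host and avoid revisiting pages.
            if urlparse(absolute).netloc != urlparse(start_url).netloc:
                continue
            if absolute in seen:
                continue
            seen.add(absolute)
            if absolute.lower().endswith(DOC_EXTENSIONS):
                documents.append(absolute)  # a real tool would download it here
            else:
                queue.append(absolute)

    return documents


if __name__ == "__main__":
    for doc in crawl(START_URL):
        print(doc)
```

As Ragan's comment below suggests, nothing in that sketch requires specialized skills; it's the kind of script any network administrator could write.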

The NSA is still unclear about how Snowden came up with the search terms needed to seek out the files he was looking for, though the agency said it doesn't believe the searches were directed by a foreign power. Knowing the search terms apparently has helped the agency determine, among other things, what information Snowden took.

As Steve Ragan noted today in his CSO blog, The Times report missed an important point: "This isn't mastermind-level hacking, it's something any network administrator would know how to do," he wrote. "But because Snowden was an insider, not to mention a network administrator with legitimate access to the commands and portals he was mirroring, his explanation for the access and archiving was accepted at face value."

Ragan explained how this can affect you. "At the time the investigators were duped, the NSA had the same problem many organizations have; they were more worried about defending the network from threats that came from the outside, and didn't seriously consider the potential for threats from within."

So if a Web crawler could find the NSA's classified SharePoint documents, your own administrator could do the same with yours, and you could be just as susceptible to data loss as the NSA was. Insider threats are nothing new. But what Snowden pulled off last year, and the fact that he did so with a Web crawler, is a reminder that you need to keep an eye on threats from outside and within.
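One practical way to watch for crawler-like insider behavior is to review document-access audit logs for accounts that touch far more files than their peers. Here's a rough sketch of that idea, assuming a CSV export of access events with "user" and "document" columns; the file name and the threshold are hypothetical choices, not features of any particular product.

```python
# Rough sketch: flag accounts whose document-access volume looks crawler-like.
# Assumes a CSV audit export with "user" and "document" columns; the file name
# and the 10x-median threshold are hypothetical assumptions for illustration.
import csv
from collections import defaultdict
from statistics import median


def flag_heavy_readers(audit_csv="sharepoint_audit.csv", multiplier=10):
    """Return users who accessed far more distinct documents than the median user."""
    docs_per_user = defaultdict(set)
    with open(audit_csv, newline="") as f:
        for row in csv.DictReader(f):
            docs_per_user[row["user"]].add(row["document"])

    counts = {user: len(docs) for user, docs in docs_per_user.items()}
    if not counts:
        return []
    baseline = median(counts.values())
    return [user for user, n in counts.items() if n > baseline * multiplier]


if __name__ == "__main__":
    for user in flag_heavy_readers():
        print("Review activity for:", user)
```

A simple baseline like this won't catch a careful insider, but it would have surfaced the kind of bulk collection described in the Times report.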

Posted by Jeffrey Schwartz on 02/10/2014 at 2:23 PM

