Edward Snowden made a show of the U.S. government’s inability to control access to even the most sensitive and classified information. As everyone now knows, Snowden was working at the NSA as a contractor for Booz Allen Hamilton. In that role, despite not being an official NSA employee, he was still able to subvert a number of security controls and assemble a treasure trove of classified information on NSA and U.S. government spying.
We know how he did it: tricking (or ‘socially engineering’) a couple dozen fellow employees at the Hawaii spy base where he worked into coughing up their usernames and passwords – ostensibly to facilitate his job as a system administrator. We also know, from media reports, that Snowden had a relatively easy time moving the stolen intelligence, because the NSA was using outdated leak detection software at the time and failed to detect the movement of classified information off site.
The Snowden leaks were both embarrassing and damaging to the U.S.’s reputation and relations with allies. They were also a wake-up call for U.S. Government agencies of all stripes that current tools and processes to protect classified and sensitive data were woefully out of step with the current environment of small, capacious storage devices and powerful cloud-based secure communications and hosting platforms.
Even before the Snowden leak, however, the Obama Administration was trying to rein in the handling of sensitive information. A 2010 Executive Order on the treatment of Controlled Unclassified Information (“CUI”), Executive Order 13556, sought to centralize a decentralized and unwieldy bureaucracy that delegated the handling of such sensitive but unclassified information to individual federal agencies.
The 2010 order designated the National Archives and Records Administration (NARA) as the Executive Agent for Controlled Unclassified Information (CUI) and directed NARA to implement a government-wide CUI Program that would standardize the way the Executive branch handles unclassified information that requires protection.
But what about the volumes of sensitive government information that end up on the systems of nonfederal organizations working for the federal government? That list includes everything from private contractors to state and local governments to colleges and universities.
This week brings some clarification: a draft document from the National Institute of Standards and Technology (NIST), “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations” (Draft Special Publication 800-171).
The new NIST document outlines steps for protecting sensitive unclassified federal information that resides in nonfederal information systems and environments. Those include nonfederal information systems that lie outside the scope of existing laws like the Federal Information Security Management Act (FISMA), as well as any components of nonfederal systems that process, store, or transmit CUI.
The specifics of the guidance aren’t so remarkable. In fact, NIST largely reiterates guidance from an existing document, Special Publication 800-53, “Security and Privacy Controls for Federal Information Systems and Organizations.” What is notable is that the Executive branch is forcing the government to address bigger issues than just which security technologies to deploy and where. The CUI program addresses system-wide deficiencies in managing and protecting unclassified information, running the gamut from poor or inconsistent physical markings on CUI data to safeguards that are alternately too lax or overly restrictive.
To help make the whole process sane and consistent, the government has established a CUI Registry that identifies the types of unclassified information requiring safeguarding and dissemination controls, and that acts as a reference for the safeguards different categories and subcategories of CUI require – including citations to the specific law or regulation that is the basis for each CUI designation and safeguard.
For private sector firms, the guidance from NIST may seem overly bureaucratic – as befits one of the world’s largest bureaucracies. But the new guidance is important for two reasons. First, it makes clear the role that third parties, including contractors and downstream business partners, play in many data breach incidents. Second, the NIST document provides guidance on the kinds of data that should be protected and, roughly, the kinds of protections that should be applied. Those are efforts that many private sector firms would do well to study and – if possible – to emulate.
About Paul Roberts
Paul Roberts is the Editor in Chief of The Security Ledger, an independent publication that covers the intersection of information security and the Internet of Things. Follow him on Twitter @paulfroberts and @securityledger.