The Washington Post reported last month that Chinese military hackers had stolen over 600 GB of sensitive information from a contractor working for the US Navy’s Naval Undersea Warfare Center. The data in question was stored, apparently improperly, on an unclassified network. This storage method made the data more vulnerable to breach than it would have been had it been managed in accordance with the Defense Federal Acquisition Regulation Supplement (DFARS) cybersecurity standards that govern American defense contractors. DFARS is based on the National Institute of Standards and Technology (NIST) Special Publication 800-171 and its framework of controls.
The breach, which has serious implications for US national security, is only the latest in a long series of such compromises of American defense intellectual property and military secrets by the Chinese. It raises—or should raise—major concerns about the security of sensitive data in defense contractors’ networks. We asked a number of experts in the industry for their views on how well the DFARS standard was working and what could be done to avoid future episodes of this kind.
How could this happen in the first place?
What happened here? Of course, we don’t know the details, but a few elements of the hack seem evident. “Due to the fact that the data was not stored in a classified network, there were too many risks involved from the get-go,” says Eitan Bremler, VP of Product at Safe-T, a provider of software-defined access controls. “While this may have been done originally to simplify access and sharing, it left the data quite easy to steal. Instead, the contractor should have stored the data in a classified network and used secure data access and usage technologies.”
The asymmetry of the attack is also revealing. As Mike Fleck, VP of Security at automated data classification provider, Covata, puts it, “I think we’ll find that this breach was either the result of gross negligence or the contractor was doing everything right and they were simply ‘outgunned.’” Brian NeSmith, CEO and Co-Founder of Arctic Wolf Networks, which offers a Security Operations Center (SOC) “as-a-service,” offers a similar perspective, noting, “This sort of event provides a wake-up call across the contractor community: they’re in the crosshairs of nation-state actors. I expect everyone is re-evaluating and looking at how they might use new tools or processes to reduce risk and ensure compliance with DFARS and NIST 800-171.”
I think we’ll find that this breach was either the result of gross negligence or the contractor was doing everything right and they were simply ‘outgunned.’
According to Pravin Kothari, Founder and CEO of the cloud access security broker (CASB) company CipherCloud, “The sophisticated advanced persistent threats (APTs) driven by nation states are very tough challenges for the very best legacy cyber defense. These nation state, government-funded black hat hackers are facing off against standard off-the-shelf commercial products and perimeter defenses.” To Kothari, it is unlikely that any military contractor can keep such attackers out of its networks. He adds, “The assumption must be they will get into your networks, so you must answer the more relevant questions: How do you detect them rapidly inside of your networks? How can you stop them from using stolen data? How can you stop them and shut down their attack?”
Getting onto an access system isn’t hard. What the adversary does once on the system is the main point.
The volume of data stolen is itself revealing as to the methods of the attack. “To offload 600+ gigabytes of data would likely have tripped most DLP systems,” observes Stan Engelbrecht, Director of the Cybersecurity Practice at D3 Security. “Again conjecture, but likely the attackers were in the network for a long time and slowly offloaded it.” This is not a hard task to pull off, says Sherban Naum, SVP, Corporate Strategy and Technology at the threat prevention vendor, Bromium. He explains, “Getting onto an access system isn’t hard. What the adversary does once on the system is the main point.” Naum is also struck by the sheer size of the exfiltration, believing the hackers’ ability to walk off with so much data reveals extensive deficiencies in the contractor’s security controls—NIST or no NIST. He adds, “Data encryption, fine grained access controls, compartmentalizing both users on programs and their respective data, network segmentation, as well as protecting the applications and access to the High Value Assets may not have been implemented.”
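Engelbrecht’s point about DLP thresholds can be illustrated with a minimal sketch: a monitor that flags any host whose outbound transfer volume over a sliding time window exceeds a threshold. The window length, threshold, and event shape here are invented for illustration, not drawn from any particular DLP product; the sketch also shows why a slow, patient exfiltration stays under such a control.

```python
from collections import defaultdict, deque

class EgressMonitor:
    """Flag hosts whose outbound volume over a sliding time window
    exceeds a threshold -- the kind of simple volumetric control a
    600+ GB bulk exfiltration would trip, and a slow offload would not."""

    def __init__(self, window_seconds=3600, threshold_bytes=5 * 2**30):
        self.window = window_seconds
        self.threshold = threshold_bytes
        self.events = defaultdict(deque)  # host -> deque of (ts, bytes)
        self.totals = defaultdict(int)    # host -> bytes within window

    def record(self, host, timestamp, nbytes):
        """Record an outbound transfer; return True if the host's
        windowed total now exceeds the threshold."""
        q = self.events[host]
        q.append((timestamp, nbytes))
        self.totals[host] += nbytes
        # Evict transfers that have aged out of the window.
        while q and q[0][0] <= timestamp - self.window:
            _, old = q.popleft()
            self.totals[host] -= old
        return self.totals[host] > self.threshold

# A 600 GiB burst inside one hour trips the alert immediately...
burst = EgressMonitor(window_seconds=3600, threshold_bytes=5 * 2**30)
assert burst.record("host-a", 0, 600 * 2**30) is True

# ...while the same total trickled out at 1 GiB per hour never does.
slow = EgressMonitor(window_seconds=3600, threshold_bytes=5 * 2**30)
alerts = [slow.record("host-b", h * 3600, 2**30) for h in range(600)]
assert not any(alerts)
```

The second case is exactly the “in the network for a long time” scenario Engelbrecht describes: volumetric thresholds alone miss low-and-slow exfiltration, which is why he pairs the observation with questions about how long the attackers were inside.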
Does this breach reflect a deficiency in DFARS and NIST 800-171?
NIST 800-171 requires contractors to deploy adequate security and to report incidents when they occur. In this context, “adequate security” means “protective measures that are commensurate with the consequences and probability of loss, misuse, or unauthorized access to, or modification of information.”
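That phrase, “commensurate with the consequences and probability of loss,” is essentially an expected-loss calculation. A back-of-the-envelope sketch, using the classic annualized-loss-expectancy arithmetic with figures invented purely for illustration:

```python
def annualized_loss_expectancy(prob_per_year, impact_dollars):
    """Classic risk arithmetic: expected yearly loss from an event."""
    return prob_per_year * impact_dollars

# Invented figures: a 5% yearly chance of a $10M breach...
ale_without = annualized_loss_expectancy(0.05, 10_000_000)  # $500,000
# ...versus a 1% chance once a given control is in place.
ale_with = annualized_loss_expectancy(0.01, 10_000_000)     # $100,000

# By this rough measure, a control costing less than the $400,000 of
# expected loss it averts each year is "commensurate" with the risk.
assert ale_without - ale_with == 400_000
```

The trouble, as the experts below point out, is that each contractor estimates those probabilities and impacts for itself, which is where generalized baselines and self-attestation start to strain.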
Experts point to both the nature of the standard itself and the quality of its implementation as factors in this type of breach. According to Salvatore Stolfo, co-founder and CTO of data loss prevention vendor Allure Security, “Obviously, the contractor in this case failed to provide adequate security. Indeed, the NIST 800-171 basic security requirements are too weak.”
Aaron Turner, CEO and Founder of HotShot, a maker of compliance messaging tools, remarks, “As a general rule of thumb, when information security leadership within an organization ask me if NIST standards will help them, I can only tell them that compliance with NIST standards is an excellent way to protect an organization from yesterday’s threats and vulnerabilities. Due to the very nature of the way that NIST standards are created, edited and published, there will always be a gap between attackers’ capabilities and the protections that NIST promotes through their publications and standards.”
When information security leadership within an organization ask me if NIST standards will help them, I can only tell them that compliance with NIST standards is an excellent way to protect an organization from yesterday’s threats and vulnerabilities.
Steven Sprague, CEO of the private key storage company, Rivetz, believes the network security models relied upon by defense contractors (and everyone else) are failing. He comments, “Our practice has been built around known users on unknown computers and a reliance on network security to observe any leakage from the network.” In Sprague’s view, this approach is limited in its effectiveness.
Not everyone is so quick to criticize the standard, however. “This isn’t about how NIST’s recommendations are deficient, quite the contrary. This is about the contractor and the lack of controls and application of the NIST recommendations themselves. NIST can only lead the horse to water,” says Sherban Naum. “I do not believe there is a deficiency within the framework itself,” states Stan Engelbrecht. “The controls within are very clear, and if they are implemented correctly, CUI data should be protected.” Pravin Kothari further notes, “There is no deficiency in NIST 800-171 as it stands. NIST 800-171 defines 14 sets of controls that provide good guidance for contractors.”
Brian NeSmith believes, “The breach doesn’t reflect a deficiency in the security standards, but it shows that a reliance on meeting security standards does not keep you safe from the bad guys.” He underscores this point of view by explaining, “Unfortunately the good guys have to get their defense right every time, while the bad guys need to get their attack right just once.”
The breach doesn’t reflect a deficiency in the security standards, but it shows that a reliance on meeting security standards does not keep you safe from the bad guys.
Scott Petry, CEO and Co-Founder of Authentic8, the company which makes the Silo cloud browser, concurs, adding, “Not at all. In fact, it reflects a deficiency of the security practices – whether technical or human – in the contractor’s organization. It might reflect a wrong perspective in the organization when embracing the standards. The intention behind compliance frameworks like NIST and acquisition guidelines like DFARS is to get organizations to internalize and implement security practices as they apply to their business. Instead, they are used as often-mindless checklists in order to achieve a particular status: ‘we are XY compliant.’ Guidelines can’t deliver security. Organizational implementation of guidelines can help them trend toward security, but the approach needs to be different.”
Ultimately, as Mike Fleck explains, “No security program is 100% – in fact, far from it. Compliance regimes are for enforcing a baseline level of security due diligence based on a generalized level of risk tolerance.”
How big a risk is NIST 800-171’s “Self-attestation” policy?
The standard relies on defense contractors to “self-attest” to their compliance. Given the national security importance of the data these companies handle, this policy may appear naïve. Experts in the industry differ on how much of an issue it really is, however. Scott Petry says, “If you’re looking at NIST 800-171 with the objective of achieving security compliance, then yes, there is a connection [between the recent attacks and the ‘self-attestation policy’]. If you’re looking at it as a framework to improve your security posture, with a set of criteria to abide by, then less so. Just because you’ve implemented the letter of the spec doesn’t make you secure. As we see here.”
Pravin Kothari believes there is some connection between self-attestation and the risk exposure seen in this case. He observes, “It still makes no sense to allow anyone to self-attest on the issue of files, either classified or unclassified.” Aaron Turner warns, “The ephemeral nature of information security makes it difficult to achieve objectives even when controls are successfully audited and tested by objective third parties. Whenever an information security leader asks me if a ‘self-compliance’ program will be effective, I caution them that without outside input into the compliance process, it will be nearly impossible to assure that controls are actually deployed in ways that successfully reduce the impact of cyberattacks.”
Aligning with this view, Mike Fleck argues, “Compliance assessments are subjective but I believe that third party audits result in a higher burden of proof than self-attestation.” He then notes, “Relative to risk exposure, maybe the biggest issue is that NIST 800-171 compliance is based on a medium risk baseline and it needs to assume the high baseline for some information.”
Stan Engelbrecht asks questions that put the problem in perspective. “Any time ‘self-attestation’ is involved it opens the door for customized results, like numbers used to fit any form of statistics wanted,” he says. “Again, I don’t think this is a failure of the framework but rather the controls around the attestation.” He then asks, “How much follow up did the government agency do? Were the results of the attestation vetted to in fact hold true? These are questions we currently don’t have answers to.”
Steven Sprague does not find fault with the standard, commenting, “This is not the issue – the fundamental framework of 1990s LAN architecture trying to secure ports and links is at the heart of the problem.”
Do defense contractors have influence over the details of NIST 800-171?
The NIST standards are developed through an open process that Sherban Naum describes as “hugely collaborative.” Some experts express concern about the influence industry exerts through this process, but generally don’t feel that the result is bad for national security. “As with any government standard, there are intense lobbying pressures that come to bear with the creation of the NIST cyber security publications and standards and the DISA STIGs. While independent security researchers generally provide input into the early stages of these standards, by the time they are about mid-flight in their development process, the big vendors are the only groups with the resources to dedicate technical expertise,” explains Aaron Turner.
Mike Fleck finds security standards to be intellectually honest, in contrast to standards involving interoperability protocols, which are, in his view, “much more likely to be influenced by large vendors.” He shares, “In the case of NIST 800-171, it’s in the best interests of the large Defense Industrial Base companies for their subcontractors to be held to a standard. It doesn’t make sense for them to weaken the standard since they are ultimately responsible for the security of the supply chain associated with their prime contracts.”
There are influences on many sides, not just corporate ones.
Fleck’s view may not be accurate, however. It’s far from clear that defense contractors are held responsible for security lapses that occur on their watches. Defense contractors do influence the standards, though some experts, like Stan Engelbrecht, are not overly concerned. He points out, “There are influences on many sides, not just corporate ones. Like open source software, from what I see, these standards would be difficult to ‘weaken’ or make easier to comply with due to the public nature of the process.”
One problem with the standards, according to several experts, is their tendency to be backward-looking. “They’re written more in the context of what happened previously (to prevent it from happening again). But we know that security vulnerabilities are always evolving, so by definition a static spec is a rear view mirror,” says Scott Petry. He adds, “That’s why these processes should be less of a compliance check list and more focused on socialization of security issues and intent of the security practices.”
How can this be avoided in the future?
Assuming it is not an isolated incident, this breach reveals serious problems with military contractors’ cybersecurity. What can be done about this? Some simply advocate for more rigorous adherence to existing standards, e.g., protecting data at rest. This would be a good start, but there are other steps contractors could take to improve their cyber defense, according to industry experts.
Steven Sprague remarks, “It’s not a contractor problem or a laziness problem. It is, in the end, an architectural problem. We need known devices in a known condition running known services to enable secure information-sharing, with provable controls in place.” He adds, “It is time to move to a data security model where the data is secured in transit and at rest and advanced rights management is used to assure controlled access to the information. The challenge is that this requires a foundation of known devices to enable information sharing – from strong, tamper-resistant identity, to BIOS integrity, to trusted execution of policies for rights management.” Why is this not happening, in his view? He answers, “The current NIST framework touches on these decades-old technologies but does not require or incent a shift to a new architecture of data security.”
It’s not a contractor problem or a laziness problem. It is, in the end, an architectural problem.
Salvatore Stolfo advises that contractors be required to deploy new technologies that track data and documents. “You want to raise the bar against the level of ‘protective security measures’ that obviously failed in this case,” he says, adding, “Detection is far more important, and likely would have noticed the failure to protect far earlier than the entirety of the lost 613 gigabytes of data.”
Isaac Kohen, Founder of user behavior analysis company Teramind, feels that organizations need to take the steps necessary to conduct a thorough investigation of where their sensitive data resides and who has access to it. As he shares, “This includes third-parties. Once that’s understood, organizations can place progressive mitigation technologies like user analytics, DLP and security forensics into their security infrastructure to detect breach quicker and stop data from falling into the wrong hands.”