Building Trust Into Hardware: A Conversation with Steve Sprague

Hugh Taylor:                       Tell us about yourself.

Steven Sprague:               I’ve been at the forefront of the trusted computing space for almost my whole career. At Wave Systems, I was involved in building hardware security for entertainment and video game content, actually building security chips that did micro-transactions, encryption, stored value, and a whole variety of things. We became one of the founding board members of the Trusted Computing Group. I ran a team that produced over 160 million copies of software. I left that company in 2013 to form Rivetz, focused on how we apply the principles of trusted computing to mobile platforms.

Hugh Taylor:                       How concerned should we be about firmware-based threats?

Steven Sprague:               Very. Around 2010, we built an implementation of BIOS integrity, or firmware integrity, for PCs for the NSA’s Testnet, with more interesting results than you’d like to believe. The challenge is that the data changes, and it shouldn’t. At the heart of this whole thing, if you’re the government, the number one thing you should understand is that you should have a very simple policy, which is not a policy they have today: only known devices should be connected to sensitive networks and data. Today, while we’ve spent billions of dollars on knowing who the users are, typically we have a known user on an unknown computer. What I mean by that is that when you turn the power on in your computer and the firmware of your computer fires up, it’s not measured. Well, actually, the measurements aren’t checked. The Trusted Computing Group was responsible for shipping 1.5 billion PCs with an industry-standard capability to check the firmware of every PC in the world. And the vast majority of infrastructure out there doesn’t turn it on.
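The measured-boot check Sprague describes can be sketched in a few lines. Assuming a Linux host with a TPM 2.0 and the tpm2-tools utility `tpm2_pcrread` installed, the following illustrative Python fragment reads a few boot-time PCR values and compares them to a previously recorded baseline; the PCR selection, baseline file name, and output parsing are assumptions made for the example, not a description of any specific product.

```python
#!/usr/bin/env python3
"""Illustrative sketch: compare boot-time firmware measurements to a baseline.

Assumes a Linux host with a TPM 2.0 and tpm2-tools installed. The PCR
selection, baseline file, and output parsing below are example assumptions.
"""
import json
import subprocess
import sys

# PCRs 0, 2, and 4 typically hold firmware, option-ROM, and boot-loader
# measurements under the TCG PC Client spec (illustrative selection).
PCRS = "sha256:0,2,4"
BASELINE_FILE = "pcr_baseline.json"  # hypothetical golden values captured earlier


def read_pcrs() -> dict:
    """Run tpm2_pcrread and parse lines of the form '  0 : 0x<hex>'."""
    out = subprocess.run(
        ["tpm2_pcrread", PCRS], capture_output=True, text=True, check=True
    ).stdout
    values = {}
    for line in out.splitlines():
        parts = line.strip().split(":")
        if len(parts) == 2 and parts[1].strip().startswith("0x"):
            values[parts[0].strip()] = parts[1].strip().lower()
    return values


def main() -> int:
    with open(BASELINE_FILE) as f:
        baseline = json.load(f)

    current = read_pcrs()
    drifted = [pcr for pcr, value in baseline.items() if current.get(pcr) != value]

    if drifted:
        print(f"Firmware measurements changed for PCRs: {drifted}")
        return 1  # the data changed, and it shouldn't have
    print("Boot measurements match the recorded baseline.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

The point of the sketch is simply that the measurements already exist on most PCs; the missing step is recording a baseline and actually checking against it.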

Hugh Taylor:                       Do you feel like those devices are potentially a security risk, based on, say, the influence of foreign intelligence services on the manufacturers and that sort of thing?

Steven Sprague:               I think there are two problems. One is supply chain security. When a device arrives, whether it’s a phone, a PC, a controller for an industrial pump, or the firmware for your car, the vast majority of compute devices are now built in Asia. How do I actually know that the data file that was sent to the manufacturer, “please make this chip,” is actually the chip that I got back? It’s one of the really open and interesting science problems out there right now: determining that somebody hasn’t tinkered with that product in some way, shape, or form and introduced a weakness. Those weaknesses can be exploited, and they can be exploited globally. It hasn’t really happened yet at scale in a way that has done a lot of damage; the networks have done a good job of routing around it, as it were. But there’s systemic risk in everything we use. We’ve relied on the network to try and catch the problem, whether it’s a computer sending bad transactions, so we watch the corporate network and say, “Look, there’s a whole bunch of data going to Moscow or China,” or pick your country of choice. “We don’t do any business with Moscow. Wonder where that data is going?” And so we’re trying to watch everything all the time. The problem, of course, is that that works well if all the humans come to work and sit in a little cubicle. The more mobile we are and the more our services move to the cloud, the harder watching the network becomes, because the humans have left the building and the applications have left the building and we’re all running around mobile. And the only place we’re securing is the building, right?

Hugh Taylor:                       Let’s say, for the sake of argument, that the Chinese Ministry of State Security, their CIA, was interfering with the firmware supply chain. What kind of malware do you think they would install?

Steven Sprague:               I think the simplest one is an on/off switch. There are two things you can do. You can steal data. While that’s interesting, the question then is what do I do with all the data? And by the way, the data can be detected. But let’s assume we put a controller into every car that turns out to be an industry-standard part. We’ve learned this over the last few years with Takata airbags. Who knew that Takata had 80% of the airbag market? Every time you turn around, the car you own has a Takata airbag in it, right?

Steven Sprague:               I don’t know what the number is; I’m sure it’s not 80%. But imagine they put a chip out there that could be signaled, and all of a sudden that compute device turns off.

Steven Sprague:               Okay, so pick a few things, whole classes of things that might have the same chip in them, because it turns out there just aren’t very many manufacturers in the world. You get up in the morning, and we could pick on the PC again: let’s assume we got up in the morning and all the PCs didn’t work on Tuesday.

Steven Sprague:               That would get really annoying. And it could actually, potentially, kill a lot of people, right? Computers are general purpose, but there are a lot of other systems that we’ve come to rely on, whether they be heating systems or cooling or water or electricity or others, where if all of a sudden they stopped working we’d be pretty unhappy. And if they could make it stop working by design, so it became part of a weapon, then that’s a whole different problem. So what I’m trying to say is that on/off, which is an incredibly simple function, it’s just a bit, can be enormously dangerous. You can get more sophisticated from there, but the simple attack is just to turn it off. So why wouldn’t we care? Now, you might say, “Well, it hasn’t really been hacked.” That’s not true. In the 2012 Saudi Aramco hack, where the Iranians deployed altered firmware onto Saudi Aramco’s machines, they had to throw away 35,000 or 40,000 computers. And there was a demonstration around that time at Black Hat, where the MITRE Corporation showed a firmware attack on a computer in which, if the firmware had been compromised, the compromise survived reflashing the firmware on the machine. The only remedy was to throw the computer away.

Hugh Taylor:                       My perception is that the information security field is dominated by a software and network-based threat mentality. They’re looking for intrusion, malware, denial of service, and they’re not really looking at the device so much. That’s just my impression.

Steven Sprague:               It’s a very 1980s approach, because the assumption is that we have all the devices hooked to our networks. And contrary to popular opinion, there’s been the rise of this new capability called mobile. For a while now, most of us have been running around with one of these things in our pocket. Now, whose network are they on? And who’s watching? You could say, “Oh, well, they’re on AT&T’s network.” Well, that’s not true. Ever connect to Wi-Fi at the hotel? Again, whose network are they on? “Oh, they’re on who knows whose network.” So how do I watch? The answer is you don’t.

Hugh Taylor:                       Let’s say the President called you up and said, “Okay Steve, I’m giving you unlimited money and all the people you need to cure this problem.” What would you do?

Steven Sprague:               I would start by enforcing a policy, something a government could enact. It’s technology neutral; it doesn’t specify any particular technology or requirement: only known devices will be connected to sensitive networks and data in US government systems. What that would force, if properly enforced, is a requirement for varying grades and qualities of device identity to be turned on in all things. And so, for the first time ever, our network would switch from being a network of ports and passwords to a network that’s based on the identity of the device for the delivery of service. One of the byproducts of that is that the entire US federal government would become mobile. You would become naturally more resistant to hacking, because you could pick the whole government up, move it to Las Vegas, plug it in, and it would work, because it’s not based on the wires, it’s based on the identity of the device.
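A “known devices only” policy of the kind Sprague describes can be sketched as a simple challenge-response check: the service keeps a registry of enrolled device public keys, issues a nonce, and grants access only if the device signs the nonce with a registered key. The minimal Python sketch below uses Ed25519 from the `cryptography` package; the registry, the enrollment step, and all names are assumptions for the example, not a description of any particular government or Rivetz system.

```python
"""Minimal sketch of a 'known devices only' access check (illustrative only).

Uses Ed25519 signatures from the `cryptography` package. Device enrollment,
the registry, and every name here are assumptions made for the example.
"""
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Registry of enrolled ("known") devices: device_id -> public key.
known_devices: dict = {}


def enroll_device(device_id: str) -> ed25519.Ed25519PrivateKey:
    """Provision a device key pair and register the public half."""
    private_key = ed25519.Ed25519PrivateKey.generate()
    known_devices[device_id] = private_key.public_key()
    return private_key  # in practice this would stay in the device's secure hardware


def grant_access(device_id: str, sign_challenge) -> bool:
    """Challenge-response: only a registered device key gets service."""
    public_key = known_devices.get(device_id)
    if public_key is None:
        return False  # unknown device, no connection
    nonce = os.urandom(32)
    signature = sign_challenge(nonce)
    try:
        public_key.verify(signature, nonce)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # Enroll one device, then show that only it can authenticate.
    laptop_key = enroll_device("laptop-001")
    print(grant_access("laptop-001", laptop_key.sign))   # True: known device
    rogue_key = ed25519.Ed25519PrivateKey.generate()
    print(grant_access("laptop-001", rogue_key.sign))    # False: wrong key
    print(grant_access("unknown-42", rogue_key.sign))    # False: not enrolled
```

The sketch is deliberately technology neutral in the sense Sprague means: the service cares only that the device can prove possession of an enrolled identity key, not which port or wire it arrives on.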