When Joe Weiss goes to cybersecurity conferences, he rubs elbows with world dignitaries, law enforcement officials, and large corporations, but usually he's the lone representative from the industrial critical infrastructures.
He's been beating the security drumbeat for the utility industry and the others for at least 10 years, as previously isolated control systems at electrical and nuclear plants, electric substations, oil refineries, and water distribution centers are being modernized with direct connections to other systems and to the public Internet. The transformation is pushing old-school industrial control managers off a technological cliff and increasing the chances of problems. This is similar to the system glitch and subsequent stock plunge last week that was made possible by Wall Street's move to high-frequency trading and electronic exchanges.
Weiss was technical manager for 15 years at the Electric Power Research Institute (EPRI) before founding consultancy Applied Control Solutions, where he works with government agencies, utilities, and progressive vendors; testifies before Congress; and pushes for stronger cybersecurity measures in the industry.
His book, "Protecting Industrial Control Systems from Electronic Threats," is due out later this month and is sure to cause a stir as the first comprehensive look at cybersecurity issues that arise from practices, including allowing workers to access smart grid control system devices using a Bluetooth connection. He provides examples of intentional and unintentional incidents that have caused power outages and even led to deaths in the U.S., as well as internal industry documents that show that for some critical infrastructure operators security means at most having just a password.
In an interview with CNET, Weiss explains how the average corporate network is more secure than any power plant or substation, why applying IT security solutions won't necessarily work in the critical infrastructure world, and how he works out his excess frustration and drive on the racquetball court.
Q: So are there lessons for the critical infrastructure industry to be learned from the recent stock plunge and sell-off on Wall Street given they are both going through a similar transformation?
Weiss: It's not much different from the McAfee problem (in which a faulty antivirus update recently caused tens of thousands of Windows XP computers to crash or repeatedly reboot). The question is how reliant are we getting on automated systems? This wasn't a control system, but we've had control system issues like this before.
Then what is the state of the security of our critical infrastructure and electrical grid?
Weiss: This is more than just critical infrastructure. This is the industrial infrastructure. You've got things, for example, where they make dog food or housewares or consumables or auto manufacturing. They may not be considered critical infrastructure, but they are part of industrial infrastructure. This is every place where you essentially do or make things or transmit or distribute things.
But people are concerned mostly with the electricity and other industries that provide lights, gas, water, etc.
Weiss: Almost all the industrial infrastructures use either the same or similar types of equipment, they connect it in similar ways, and they use similar, though not the same, types of protocols. That means two things. One is that all of these infrastructures are effectively vulnerable. The impacts are obviously different depending on what industry you're talking about, and the potential impact of a problem with the electric infrastructure is obviously very, very significant to everybody. The impact of a problem in the chemical industry also can be very important. The impacts are what are different. The vulnerabilities are the same.
What is the vulnerability then?
Weiss: Let me give some history. All of these control systems were originally analog, standalone systems. With the advent of the microprocessor, it enabled adding intelligence into these systems. With the advent of modern communications, not just the Internet, this has enabled improved productivity of these systems and their facilities. The other thing that's happened is when these systems weren't intelligent, people weren't interested in finding out what was in them. But the ability to get information from them increased. These are the most important systems that an industry could have because this is where they make their money. Once you can get this information, many people from many internal and external organizations want access to it. So systems that were never designed to be in this type of connected environment are now connected.
What is the real concern? What is the true risk?
Weiss: We were pretty naive in assuming that the only people who would ever want access to these systems would be people who should have access to them. So the idea of trying to protect them wasn't really a necessity. Now, with the change in the economy where it's creating many disgruntled people, and with the political situation where you have either nation-state enemies, crime families, or corporate enemies, there are more and more people who have malicious intent when they can access these systems. Also, because these systems are being used in ways they really weren't designed or intended to be used, there are more and more unintentional impacts. Like with the recent McAfee or the Wall Street situations, it doesn't take much to cause huge unintended consequences.
These may have been theoretical problems 10 years ago. How real are they now?
Weiss: I have an incident database of control system cyberincidents. I started collecting them in the '98-'99 time frame. Nobody was really looking at the time, but that's when they started to occur. The electric system actually started to upgrade itself and effectively become a "smart grid" then when industry first started putting in intelligent electronic devices. These are the smart relays and breakers that are in electric substations. These started getting installed in the late '90s, early 2000s.
One of the major recommendations after the Northeast power outage of 2003 was to replace the cyber-dumb electromechanical switches with the smarter cyber-alive electronic devices. So industry has had parts and pieces of a smarter grid for a while. All "smart" means is there is two-way communication. Industry has been doing that for a long time. It was simply never called "smart grid" until recently. With the stimulus money there is a huge push to implement modern IT communications using things like TCP/IP (Transmission Control Protocol/Internet Protocol) in the electric system. A lot of people, particularly from IT, see the smart grid as the Internet for electrons.
Is that a good idea?
Weiss: Conceptually it is, because it makes the grid more productive. I'm very adamant about using the word "conceptually." When I was at EPRI and involved in advanced controls and instrumentation, we viewed adding intelligence to the critical infrastructure as being a single-edged sword--nothing but productivity improvements could accrue. We never realized that for all the positives there's a negative--"cyber." It's a double-edged sword and you have to address that. If you don't, the consequences can be devastating. If you do, the benefits can be phenomenally valuable. It is a trade-off between productivity and security.
Then how do we address cyber, which poses such high risks?
Weiss: I've been asked many times why we can't just put the genie back in the bottle and go back to analog. There are way too many benefits that modern digital technology provides that older analog can't. What needs to happen now is to figure out how best to control that genie. This is where I've been pushing. Industry hasn't created an educated control system cybersecurity workforce to address this new genie. Industry needs to develop and train the workforce, so not only can they work with it but develop the appropriate technologies to deal with it.
What about the companies that say they can solve the problem using Internet security technologies?
Weiss: You have to understand what the technical, administrative, and policy differences are between IT (information technology) and industrial control systems from a cybersecurity perspective. There have been control system problems when certain IT technologies or testing have been used. For instance, there have been problems when IT tried to do penetration testing or to apply traditional IT policies, such as mandating strong passwords or locking out a system after a certain number of failed password attempts. These practices can and have caused grievous harm to control systems.
Do you have examples?
Weiss: People have used penetration testing on control systems that has either shut down the control system or, in one case, actually killed the system firmware. Policies as simple as requiring that default passwords be changed can be problematic. If you're in a very stressful situation, like the grid is going down or a power plant is in an upset condition, it's been proven time and again that if people don't do what they're trained to do, they're going to do the wrong thing. If you force them to have a password they're not used to, they're not going to be able to respond in a timely fashion. Antivirus software is very resource-intensive.
There's been testing done by NIST (National Institute of Standards and Technology) that showed that simply performing a virus definition update on older control system processors can cause anywhere from a two- to a six-minute denial of service. That's just doing your daily virus definition update. There have been cases where installing antivirus software has shut down certain system control workstations. Block encryption has been shown to slow down or shut down control systems. Industry has been taking technology, policies, procedures, and testing that were rigorously developed for the IT community and trying to apply them to the control system community without being in a position to test them and be sure they don't cause a problem.
A water company patched its system and tested the patch offline to make sure it worked. When it connected the patched system to the rest of the network, operators were able to turn the water pumps on but could not turn them off. These are all situations that have already occurred, and there have been times when control system suppliers have had to send emergency notices to customers that under no circumstances should they apply the Microsoft patch because it would shut down or impact their "vendor-modified" Windows system.
Just don't do it?
Weiss: You can't patch without having tested it and knowing that what you do won't cause more harm than good. Some systems simply cannot be patched with the plant in operation, meaning they cannot be patched for 12 to 24 months. The cure can't be worse than the disease.
We've talked about unintended incidents. What about intended incidents such as attacks?
Weiss: There have been a number of malicious cyberattacks against control systems. To date, I am not aware of any nation-states launching malicious attacks against control systems, though there have obviously been a number of attacks against the Internet.
Malicious attacks by their very nature would cause much more harm than the unintentional incidents that have occurred to date. But the unintentional incidents have already resulted in two situations in the United States that have killed people. They have resulted in three reasonably major, though short-term (a couple of days), power outages, and in two nuclear plants shutting down from full power, requiring the use of their safety systems.
An unintentional impact doesn't mean an insignificant impact. In the case of one cyber-related pipeline rupture, there was a spill of almost 250,000 gallons, 3 deaths, $45 million in damages in '99 dollars, the loss of a water treatment plant, and the bankruptcy of the Olympic Pipeline Company. There have been cyber incidents in almost all of the industrial infrastructures--electric distribution systems, transmission systems, hydro plants, fossil plants, nuclear plants, combustion-turbine plants, oil and gas pipelines, water and water treatment systems, manufacturing facilities, and transportation. These are worldwide--not just in the U.S. None of this is hypothetical. It's all already occurred.
How does this compare to the security level of corporate networks?
Weiss: PG&E's or any utility's human resources network or their customer information networks are more cybersecure than any power plant, including nuclear, any substation, or any control center in the U.S.
How can that be?
Weiss: Because the utilities got together and came up with a set of criteria, called the NERC (North American Electric Reliability Corp.) critical infrastructure protection (CIP) standards. In those standards they put in a number of exclusions and allowed utilities to self-define what would be "critical." NERC has put out emergency warnings on some of the areas that have been excluded, like telecommunications, but the NERC CIPs specifically exclude them. Can you imagine doing a cyber assessment of your IT systems and being told "do not address telecom"? Because of the Energy Policy Act of 2005, electric distribution, which is the heart of the smart grid, is specifically excluded, even though the electrons move from distribution to transmission and back. It simply doesn't make any sense.
Aren't the industries regulated?
Weiss: Right now the only industries that actually have (regulatory) requirements are electric and nuclear power. Water has no requirements yet. None. You have industry organizations putting out guidelines or best practices but when you talk about any type of regulation the first is electric. That is why there is so much focus on the NERC CIPs.
Should there be more regulation?
Weiss: I've been arguing for the utilities to simply do a prudent engineering job of assuring that their facilities are secure. To date, the utility industry, with few exceptions, is simply not willing to do that.
Is this because they are choosing productivity and efficiency over security?
Weiss: It's because the utility engineering side, or operations organization, is the one responsible for the generation, transmission, distribution, and control of an electric system. They are not responsible for cybersecurity. The organization responsible for cybersecurity is IT. But IT does not own the operational equipment. The operational organization generally has no budget for security and generally has not been trained to address cybersecurity. So what happens is that the operational organization isn't going to address security, and management is looking to the NERC CIPs to dictate what needs to be done to be compliant. What most utilities are trying to do is minimize the number of assets deemed critical. If an asset is not identified as critical, the utility does not have to do anything further.
As of April 2009, the vice president and chief security officer of NERC put out a letter to the industry that said almost 70 percent of power plants in the U.S. were not considered critical. Almost 30 percent of transmission assets were not considered critical. And it turns out 100 percent of the distribution assets, which are the heart of the smart grid, are not considered critical because distribution is explicitly excluded from the NERC CIPs. No public utility commission, including in California, has cybersecurity standards for distribution.
Do you think that the focus on cybersecurity detracts from the debate?
Weiss: IT is concerned about malicious cyberattacks, not nonmalicious incidents. The industrial community is just as concerned about a nonintentional incident as an intentional attack. Conceptually, IT has technology to identify a cyberattack. In industrial controls, the Windows layer has forensics for cyber. But at the control system layer, the non-Windows layer, there is very little if any cyber forensics. People say there will be a cyber 9/11. There may be, but we will not know it was cyber. You can't hide lights going out or plants shutting down. Unless you have the forensics, you can't say it was cyber and not something else. The best example of this is what happened in Maroochy, Australia, with a disgruntled ex-contractor. The first 20 times he hacked wirelessly into the SCADA (supervisory control and data acquisition) system and sewage discharge valves, they thought it had to be a mechanical or electrical failure. They finally caught him after the 46th time he hacked in.
How do you define a "cyber" incident?
Weiss: The NIST definition is electronic communication between systems that affects either confidentiality, integrity, or availability. It does not have to be intentional. It does not have to affect confidentiality, which is what the consumer world is most worried about. Control systems worry about the integrity and availability of the signal. Is the signal showing that the valve is 80 percent open when it's only 8 percent open? The difference can be a major explosion. But availability is also an issue. One of the big differences between IT and operations is that a control system must operate within a fixed time frame, generally milliseconds to low seconds. It's like loading a tanker truck with gasoline. You don't want gasoline to start filling the truck until the truck pulls up and you open the top lid to the tank. You also don't want the filling to stop prematurely, so that the truck leaves without a full load.
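The integrity and availability concerns Weiss describes can be made concrete with a small sketch. This is purely illustrative, not from any real control system: the function names, the 50-millisecond deadline, and the 2-percent tolerance are all hypothetical values chosen for the example, which checks one control cycle for the two fault types he names (a reported valve position that disagrees with the command, and a reading that arrives too late).

```python
import time

DEADLINE_S = 0.050          # hypothetical 50 ms control-loop deadline
MAX_POSITION_ERROR = 2.0    # hypothetical tolerance, in percent open

def check_cycle(commanded_pct, reported_pct, cycle_start, now=None):
    """Illustrative integrity/availability check for one control cycle.

    Integrity: does the reported valve position match what was commanded?
    Availability: did the reading arrive within the loop's fixed deadline?
    """
    now = time.monotonic() if now is None else now
    faults = []
    # Integrity: an 80%-reported vs. 8%-actual mismatch would trip this.
    if abs(commanded_pct - reported_pct) > MAX_POSITION_ERROR:
        faults.append("integrity: reported position disagrees with command")
    # Availability: in a control loop, a late signal is as bad as a wrong one.
    if now - cycle_start > DEADLINE_S:
        faults.append("availability: reading missed the cycle deadline")
    return faults
```

A call like `check_cycle(80.0, 8.0, cycle_start)` flags the integrity fault in Weiss's valve example; a reading that shows up after the deadline flags an availability fault even when the value itself is correct.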
So are there vulnerabilities at every junction, from the grid level, power plant, substation, and on down into the home?
Weiss: Absolutely. There are vulnerabilities also in the communications systems. Utilities use microwave, radio, power line carrier, cell phone, and satellite communications, much of which is cyber-vulnerable.
You mentioned Bluetooth.
Weiss: One major electrical equipment manufacturer for the smart grid is providing Bluetooth communications for its electric reclosers--the devices on the pole top or in a substation that enable the electricity to start flowing again. It added Bluetooth because the utility companies were asking for it, so that an engineering technician on a miserable rainy or snowy night would be able to sit in his truck and open or close that recloser as necessary without ever having to get out of the truck. But the recloser resides on the SCADA highway, and you've just allowed communications onto the SCADA highway behind the firewalls. And there were no criteria in the design of this system saying that security should be included in that implementation. They have to include appropriate security in the procurement and implementation specifications.
Is the Obama administration taking this issue seriously enough?
Weiss: The industrial control systems community has essentially no seat at the table. When government gets together and drafts a plan they bring in heavyweights from the IT community. And what you see in the plan are the requirements from the IT community. At several recent major government-sponsored cybersecurity conferences, I was the only one from the industrial control systems community and in prior years there was nobody there from the industrial control systems community.
Are representatives from the industrial controls area not invited, or are they not interested?
Weiss: Generally they're not invited. The IT community views control systems as just another computer. A control system is two pieces--one is the human machine interface, the screens people see in the control room, which are now moving toward Windows, Unix, and Linux. These systems also use TCP/IP. So people look at this and say "Aha! That's IT. I know this." What they don't see are all of the devices in the field that sense, measure, control, and monitor physical processes. These devices don't look like a computer and don't use Windows. They use either proprietary real-time operating systems or fully embedded systems. There is no security at this level, even though this is where you go "boom in the night." What IT sees is the engineer sitting in front of a Windows workstation, and they say "I know that."
Do you ever feel like the Cassandra in the industry?
Weiss: There are people who are starting to believe control system cyber is real, but it is still a small fraction. All I have are facts and physics. I've got an incident database with over 170 control system cyber incidents worldwide. Unfortunately, most incidents are not public. CERT (Computer Emergency Response Teams) and other IT security monitoring organizations are not designed to collect information for control systems. There is an unfortunate tendency to simply want to declare victory. The Department of Energy and the Department of Homeland Security both have work ongoing in this field. But neither has connected the dots on incidents that have actually happened to identify relevant R&D.
Can we ever really keep attackers out of these systems?
Weiss: Any nation-state that wants to get in will get in. These systems must work in an efficient, reliable, and safe manner. "Rocket science" is needed to secure these systems without impacting their performance. Anything less and the attacker wins without even getting in. There is an expression: the only secure control system is one that is unplugged and at the bottom of the sea.
Your passion about the topic indicates that you eat, drink, and sleep control industry security. What do you do when you're not working?
Weiss: I play racquetball. It gives me a chance to bang my head against the wall for something I actually enjoy.