Erwin van der Zwan, CISA, CISM, CISSP
Most industrial processes and critical infrastructures such as energy (electricity, gas and oil), water treatment and distribution, telecommunications, transportation, or chemical plants depend heavily on information and communication technology (ICT) and industrial control systems (ICS), such as Supervisory Control and Data Acquisition (SCADA) or distributed control systems.1 Many countries are undertaking initiatives to protect against ICT disruptions, malicious (cyber)activity and terrorism by enhancing the resistance and resilience of critical infrastructure to prevent or mitigate the risk and impact of major incidents.2
The purpose of this article is to summarize the major cybersecurity issues of ICS to help improve awareness among owners, (security) professionals, auditors and policy makers and to help organizations recognize threats and vulnerabilities.
Nowadays, telecommunications, IT and ICS are intertwined with, and essential for, many aspects of modern society. The proliferation of ICT in industrial processes is increasing, and (inter)dependencies are becoming numerous. Failure or manipulation of ICT systems results in diverse and unpredictable effects. Resilience depends, among other things, on the readiness of the organization and the design of the systems. Various critical infrastructures appear to be able to sustain an ICT failure of a few hours and sometimes even days. The first likely consequence of a cyberincident is loss of the ability to view and/or control the particular processes, causing an increased reliance on emergency and safety systems. Most incidents have only a local impact, although they may still have major personal or financial consequences. Even minor ICS errors or incidents may have a devastating effect and result in loss of life or significant environmental damage.
Well-known examples are the shutdowns of the Hatch nuclear power station (Georgia, USA) in June 2008, due to a software update in the enterprise network; of the Browns Ferry nuclear power station (Alabama, USA) in August 2006, as a result of a drastic increase in network traffic; and of the Ohio Davis-Besse nuclear power station (USA) in January 2003, due to the Slammer worm. That SCADA errors may cause physical damage was demonstrated by the Taum Sauk Water Storage Dam (St. Louis, Missouri, USA) failure in 20053 and the Bellingham (Washington, USA) gas pipeline rupture in June 1999.4 Malicious intent is also not uncommon, as was the case with Mario Azar, who was indicted in 2009 for temporarily disabling a computer system detecting pipeline leaks;5 former contractor Vitek Boden, who took control of the SCADA system of the sewage and water treatment system at Queensland’s Maroochy Shire (Australia) in 2000;6 and the disruption of services at Worcester Airport (Massachusetts, USA) by a teenager in 1997. Also, the incidents of November 2009 in the Brazilian electricity grid, causing a blackout that hit 60 million people, were speculated to have originated from cyberattacks rather than heavy rain and strong winds.
Industrial control environments are exposed mainly to technical, human or external threats, such as acts of nature, fire and flooding. Power disruptions, accidents, and technical or human failures occur frequently. In general, about 75 percent of incidents and disasters worldwide are believed to be caused by human error.
Fortunately, malicious threats against critical infrastructures are less common. However, vandalism, theft, fraud and cybercrime occur on a regular basis. Damage to, or theft of, material such as copper is hard to deter at remote, unmanned locations. Insiders often play a significant role, for example, in providing valuable information or granting access; sometimes they are even the perpetrators.
Cyberincidents in the US reported to the CERT Coordination Center (CERT/CC) by third parties show an almost exponential increase since 1988 (see figure 1). After 2003, the number of incidents is no longer published because this provides little information given the widespread use of automated attack tools and the commonality of attacks against Internet-connected systems.7
Additionally, the nature of critical infrastructures may be attractive to attackers. Espionage by competitors, criminals or foreign nations is a serious threat. Extremists are developing an interest in computer hacking techniques and SCADA security. Information technology is not only used in traditional fraud and crimes or to support (terrorist) activities, such as information gathering, communications, money laundering or propaganda, but new activities such as disrupting ICT systems out of protest (“hacktivism”) are emerging. Large-scale disruptions or compromise of critical ICT systems may have a crippling effect on modern societies. Also, the US military recognizes its dependency on commercially operated critical infrastructures, such as the electricity grid.8 However, terrorist attacks against ICS systems, or using ICT as a main attack vector, are not considered to be likely at this point. Nevertheless, cyberthreats to ICS are real and increasing.
Machines and computers are developed, programmed and operated by humans. Humans are often the weakest spot when it comes to security. Lack of awareness, laziness, misjudgment and misconduct are just a few of the aspects of potential human weakness. People are inclined to have limited imagination when it comes to security. Asset owners and operators are normally not security specialists. More commonly, they are technically skilled and, understandably, more concerned about business risks, continuity and safety than about security and malicious cyberattacks.
Still, a general way of thinking is based on a false sense of security: that the process control environment is an isolated island and that security is provided by the complexity and unfamiliarity of the technology (that is, security through obscurity). Nothing could be less true, however. ICS are, in more than one way, connected to business networks and extranets. The industrial control environment also interacts with the outside world in many other ways, such as through memory sticks and temporarily connected laptops. Also, SCADA technology is not secret or classified. A handful of SCADA vendors supply all major systems in the world. Language is not a barrier either, since all vendors publish documentation in English. Thus, knowledge about industrial processes; SCADA; and electricity grid, gas or fluid distribution techniques is widespread around the globe and relatively easy to obtain if one knows where to look. Besides, an intelligent understanding of ICS is not necessary when the objective of the attacker is just to cause havoc.
People like to talk about their job, problems and accomplishments. Sensitive information may unintentionally be disclosed, for example, during meetings, chats with peers, phone calls in public areas or publication on blogs. Intentional disclosure may occur when obtaining permits, adhering to legislation or informing citizens.
The inability to comprehend the security issues and design principles of specific technologies may increase the dependence on a few systems and services. For example, the intended flexibility of the Internet Protocol (IP) is diminished when human-readable names are required. This translation service is provided by just a handful of domain name system (DNS) servers. Thus, new dependencies and vulnerabilities may be introduced into an otherwise robust and resilient (Internet, enterprise or industrial) network architecture.
Lack of awareness, a false sense of security, unintentional disclosure and a limited grasp of the underlying technologies are just some of these human-related liabilities.
In classic ICS environments, security does not come into play. Naturally, the focus is put on safety, availability and integrity. Confidentiality is not an issue. In the past, ICS tended to be purpose-built, stand-alone systems. Nowadays, more and more standardized equipment is being introduced. Industrial control environments tend to encompass a mix of various (legacy) systems and standard platforms based on, for example, Intel, Windows, Linux and Oracle. Also, on the networking side, common off-the-shelf (COTS) technology (e.g., Ethernet and TCP/IP) is finding its way inside ICS without management dealing sufficiently with its security limitations. Inside the process control environment, network topologies are usually flat and straightforward. Firewalls and network segmentation are implemented at the edges. Radio technology is not new to ICS either; however, a different trend is the tremendous growth of wireless devices. Everything is getting an antenna; some are just printed internally on the circuit board.
These developments make the industrial control environment vulnerable not only to existing attacks against the older legacy technology, but also to new attack vectors commonly known in the general ICT domain. Further harmonization of platforms and equipment to reduce costs creates a monoculture. This may expose systems to malicious activity without their ever having specifically attracted the attention of attackers. For example, a huge number of random attempts are targeted against Microsoft Windows systems simply because they form the biggest target population. The spreading of computer viruses relies on this concept and, consequently, may now also find its way into ICS.
SCADA systems, as opposed to distributed control systems, are normally deployed in a centralized architecture, spread over two data centers. This is considered efficient and cost-effective. A central physical location is easier to protect, maintain and operate. It is also often more convenient when implementing emergency power facilities, redundancy and fail-over between the systems, as opposed to achieving fail-over procedures across numerous critical locations. However, these measures do not protect against software flaws, data corruption or attacks exploiting vulnerabilities that exist across all data centers. Thus, a single point of failure may still exist despite duplicated data centers and parallel server platforms.
Authentication in ICS is often weak or absent. Operators are normally identified based only on a username and password. Logon and password policies are generally relaxed, since operators fear that they might be locked out by the system. Hardware tokens are seldom used, or are deployed only for remote access solutions. Some (older) communication protocols even allow intrinsic access just by establishing a connection with the device.
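As an illustration (not drawn from any ICS product), a minimal Python sketch of one baseline improvement over cleartext passwords: storing credentials as salted, slow hashes and comparing them in constant time. All names here are hypothetical.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash so credentials are never stored in cleartext."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, expected):
    """Compare in constant time to avoid timing side channels."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, expected)

salt, digest = hash_password("operator-passphrase")
print(verify_password("operator-passphrase", salt, digest))  # True
print(verify_password("wrong-guess", salt, digest))          # False
```

Even this modest measure contrasts sharply with protocols and HMIs that transmit or store operator credentials in the clear.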
In addition, authorization is a neglected aspect. Operators are inclined to use accounts with the highest (administrative) privileges, even though such access to advanced functionality is not necessary during normal operations. Additionally, computer services tend to violate the principle of least privilege and run with root-level access.
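The principle of least privilege can be sketched in a few lines. The roles and actions below are hypothetical, not taken from any real SCADA product; the point is simply that access is denied by default and each role receives only the actions it needs:

```python
# Minimal least-privilege sketch: hypothetical roles mapped to permitted actions.
ROLE_PERMISSIONS = {
    "operator": {"view_process", "acknowledge_alarm"},
    "engineer": {"view_process", "acknowledge_alarm", "change_setpoint"},
    "admin":    {"view_process", "acknowledge_alarm", "change_setpoint", "modify_logic"},
}

def is_allowed(role, action):
    """Deny by default: grant only actions explicitly assigned to the role."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("operator", "view_process"))  # True
print(is_allowed("operator", "modify_logic"))  # False
```

An operator logged in with an "admin" account bypasses exactly this kind of check, which is why routine use of administrative accounts is a liability.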
The long life cycle of ICS, generally 15 to 20 years, poses another challenge. It is simply not economically feasible to replace equipment with modern and better protected versions every three or four years. Inevitably, administrators must deal with security shortcomings and newly discovered vulnerabilities.
Hardware and software will always have shortcomings. This is inherent in their development by humans. The growing complexity has an additional negative effect on security. In large and complex environments, flaws may go undetected for a considerable period of time.
A frequently overlooked aspect is the logistics of procured hardware and software.9 How is security controlled and maintained in the supply chain? How does one verify that equipment that arrives onsite is not already infected and compromised? What security controls are in place at the factory or in transit? Smaller companies may not have the resources to implement high-end security solutions. Therefore, their security may be weak. Large companies buy from small companies and may consequently get infected.10
Normal business operations are becoming more and more dependent on data aggregated within the industrial processes. There is a requirement to have a permanent open link from the SCADA environment to the outside world. Systems that collect and archive process control data (data historian services) are, therefore, commonly placed so that they can be reached from the business network. Examples can be found where the outside connection is also used to pull data or software inside the ICS environment. External connections exponentially increase the exposure of the ICS. Such connections link nonprocess control employees on the business network to the ICS. They also open potential attack vectors from networks connected to the business network, such as the Internet and remote (home) users. Nevertheless, frequently heard comments state that outside communications do not exist (“Well, except maybe that one, and that one…”) or that the communications are only one-way. Regrettably, industrial control environments can seldom match this claim since none appear to be completely isolated.
Industrial communication protocols, especially older ones, are often without encryption, authentication and integrity controls.11 Universally deployed protocols such as Distributed Network Protocol (DNP3), OPC, Modbus or ICCP all have deficiencies. Proprietary protocols, sometimes believed to be secure through obscurity, are no better. Also, protocols from the TCP/IP suite, such as TELNET, SMTP, SNMP or FTP, send user credentials and data in cleartext over the communication network.
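The absence of authentication in such protocols can be made concrete. The sketch below builds a standard Modbus/TCP "read holding registers" request (function code 0x03) in Python; it is illustrative, not vendor code. Note that the frame contains no field at all for credentials, a session key or a signature: any host that can reach the device's TCP port can issue it.

```python
import struct

def modbus_read_holding_registers(transaction_id, unit_id, start_addr, count):
    """Build a Modbus/TCP request: MBAP header + PDU.

    MBAP header: transaction id, protocol id (always 0), remaining length,
    unit id. PDU: function code 0x03, starting address, register count.
    There is no authentication or integrity field anywhere in the frame.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = modbus_read_holding_registers(1, 17, 0, 10)
print(frame.hex())  # 00010000000611030000000a
```

The entire request is 12 bytes of plaintext, which is precisely why network segmentation and compensating controls around such protocols matter so much.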
Almost all ICS environments are provided with some sort of remote access facility. Remote access uses public telecommunication networks such as the Internet and phone systems. The routes that data travel over these networks are not known and might be manipulated or monitored. Virtual private network (VPN) technology might guarantee the confidentiality of the communication. However, this might not prevent malicious attacks that can occur through compromised remote workstations. As with the telecommunication networks, these workstations reside outside the controlled environment of the company. Therefore, they may not be secured to the required company standard or may not be available during a time of crisis. Thus, attacks may take place hidden from view, using the VPN tunnel straight through the corporate firewall. Similarly, trusted connections from vendors direct to the production environment may still be in place today.
New trends push smart equipment to the end points—the users, for example—as with the smart power meters (smart grid). The underlying advanced metering infrastructure (AMI) is becoming a highway for various data instead of just metering. However, the system was never designed for general-purpose data transfer. The smart meters themselves commonly have integrated wireless, (mobile) telephone or broadband connections to transfer data. Furthermore, the AMI comprises head-end servers for command and control, collectors, telecommunication, and repeaters. All of them have their deficiencies.
Other entry paths into the process control environment are often neglected by assuming that all network traffic will flow through one security choke point: the firewall. Nonetheless, the systems remain exposed to attack when mobile devices, such as PDAs, flash drives, USB memory sticks, CD/DVDs, (rogue) laptops and (forgotten) modems, and wireless or VPN connections are allowed.
Security is frequently considered last in the corporate budget, although critical infrastructure owners are well aware of their responsibilities and role in the business. Hence, security controls are inadequate or out of date or simply do not exist.
Out of respect for business continuity and system robustness, ICS environments are not frequently or preventively patched. The collective culture says, “if it is not broken, do not fix it.” Also, configuration and change management, common practice in the normal ICT operations of companies, is less widespread in industrial environments. Especially in smaller organizations, informal procedures are common.
Industrial companies find it difficult to employ the right, qualified technical staff. Skilled and experienced operators are usually older than their counterparts in other industries. New employees might not always have the time to learn all the tricks from their older, experienced colleagues. Specific knowledge about legacy systems or ICS behavior gets lost (de-skilling).
Another question is: Who is in charge? Usually, ICS are homegrown deployments that came about over a period of years. IT responsibilities are not always clear in these large, organically grown heterogeneous environments. Also, ICS operators are normally not accustomed to dealing with cybersecurity threats like those being faced today.
Outsourcing might blur the borders between internal and external responsibilities, facilities, and systems used to provide a specific part of the ICS functionality. The dependencies on other services or equipment may not always be clear and, thus, may be forgotten when drafting security controls. A different aspect is the outsource partner not understanding the core business and processes of its customer. This may, for example, result in a slow or inadequate response during incidents. Outsourced parties may also have conflicting loyalties or hidden agendas.
ICS are frequently about information gathering and sharing; it should not be forgotten that normal information security aspects also have to be covered. One does not want all the ICS configuration details, internal procedures or security controls to be widely known.
Commonly, mobile equipment and data storage are still insufficiently protected. Ninety percent of users do not encrypt their PDA or BlackBerry. Around 65 percent do not encrypt their USB memory sticks or laptops.12 Users regularly use private devices, such as smartphones and USB memory sticks, for business purposes. Obsolete equipment is generally not cleaned of sensitive data.13 Recycled equipment frequently starts a new life or is being scrapped in other countries without the company’s knowledge.
ICS require attention, but are not that special in terms of security. Nevertheless, seeing the overview of issues illustrates the huge challenge that owners face. Operators can hardly be expected to oversee all aspects and keep everything running and secure at the same time. Every day requires juggling to keep all the balls aloft. Nonetheless, security must be established and maintained in a structured manner.
Bruce Schneier14 once wrote that one thing about security is certain: the uncertainty.15 Therefore, it is necessary to think sensibly about security in an uncertain world. Due care principles must be followed, and a good ICS security baseline should be defined and met.
However, one must be aware that compliance is an instrument for the security professional and management; by itself, it is not going to keep the organization secure. Another pitfall is expecting technology to solve problems. Technology is no solution for human error or ignorance.
Think of the ICS and the critical infrastructure controlled by it as the organization’s most valuable possessions. They belong in the inner circle of a multilayered, defense-in-depth architecture. Security controls should be spread out and varied, but the organization must still maintain availability and integrity requirements. Hence, antivirus programs should not be needed on ICS; if one thinks they are still needed deep inside the organization’s most valuable environment, it is already too late.
If the organization must connect the industrial control network to the enterprise network, management should think twice. Were all (electronically connectionless) alternatives considered? Can a (hardware-based) one-way network device be used? Does the connection always need to be up, or is there a time frame during which the connection is needed? Has the least privilege principle been followed?
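If a link is genuinely needed only during a defined time frame, even a simple software gate makes "disconnected" the default state. The sketch below is a hypothetical illustration of that idea (the window and function names are invented, and a real deployment would enforce this in the network layer, not in application code):

```python
from datetime import datetime, time

# Hypothetical maintenance window during which the business-network link may be up.
MAINTENANCE_WINDOW = (time(2, 0), time(4, 0))

def connection_permitted(now=None):
    """Default deny: allow the link only inside the agreed time window."""
    current = (now or datetime.now()).time()
    start, end = MAINTENANCE_WINDOW
    return start <= current < end

print(connection_permitted(datetime(2010, 1, 1, 3, 0)))   # True
print(connection_permitted(datetime(2010, 1, 1, 12, 0)))  # False
```

The design choice mirrors the least privilege question above: the connection exists only when there is a demonstrated need for it.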
When finished, it is time to start practicing. The organization must be prepared, develop and test emergency response plans, evaluate cyberincidents, and try malicious attack scenarios. Red team/blue team exercises might prove to be a valuable training method.
This article summarizes some of the security aspects considered most liable to affect, or most specific to, industrial control systems. Most of these issues are well known, but they are still a reason for concern and must be taken into account in risk analyses and audits. ICS suffer from the same vulnerabilities as any ICT environment, and more. Especially in older industrial equipment, communication protocols and systems are not secure.
Unfortunately, ICS security is still several years behind general cybersecurity standards. ICS remain prone to technical and human errors. It is just a matter of time until serious (targeted) malicious cyberattacks will start to occur. Fortunately, many good initiatives are underway to improve the security of ICS, and a positive shift in the mind-set of operators, owners, vendors and policy makers can be observed.
The goal of securing ICS environments is to reduce the overall vulnerability exposure. It is not realistic to pretend that the solution is to close each vulnerability individually. There is neither a magic box nor a firewall that will make all these security issues go away. Dealing with security is a continuous process that should be firmly embedded throughout all levels of the organization and in all procedures, procurement and life-cycle management. A holistic approach to security is required. The aim is to move toward a mature security posture, in which the organization runs ICT and process control environments with security built in. The major objective is to achieve sufficient resistance and resilience to withstand failures, incidents and (cyber)attacks and to mitigate risks to a level acceptable to society.
The article is based on research conducted by the National Advisory Center for the Critical Infrastructure (NAVI) in The Netherlands. The contents of this article do not necessarily express the views of the NAVI or other parties.
Endnotes

1 This article uses the terms ICS, process control systems (PCS), SCADA or distributed control systems (DCS) and associated equipment, such as programmable logic controllers (PLCs), remote terminal units (RTUs) or database, application or communication servers, to indicate ICT systems that in some way are used to monitor and control (physical) processes and equipment in industrial, infrastructural or facility environments.
2 The European Programme for Critical Infrastructure Protection (EPCIP) identifies the telecom/ICT sector as a priority. Commission of the European Communities, “Protecting Europe From Large-scale Cyber-attacks and Disruptions: Enhancing Preparedness, Security and Resilience,” 30 March 2009, http://ec.europa.eu/information_society/policy/nis/docs/comm_ciip/comm_pdf_2009_0149_f_en.pdf
3 Federal Energy Regulatory Commission (FERC), No. P-2277, “Technical Reasons for the Breach of December 14, 2005,” USA, www.ferc.gov/industries/hydropower/safety/projects/taum-sauk/ipoc-rpt/full-rpt.pdf
4 National Transportation Safety Board (NTSB), Accident Report PB2002-916502, USA, www.ntsb.gov/publictn/2002/PAR0202.pdf
5 US District Court for the Central District of California, Indictment CR No. 09, US vs. Mario Azar, USA, www.wired.com/images_blogs/threatlevel/files/azar.pdf
6 Abrams, Marshall; Joe Weiss; “Malicious Control System Cyber Security Attack Case Study—Maroochy Water Services, Australia,” MITRE Corp., USA, www.mitre.org/work/tech_papers/tech_papers_08/08_1145/08_1145.pdf
7 Turk, Robert J.; “Cyber Incidents Involving Control Systems,” Idaho National Laboratory, USA, October 2005, www.inl.gov/technicalpublications/Documents/3480144.pdf
8 Kleber, Drexel; “The US Department of Defense: Valuing Energy Security,” Journal of Energy Security, 18 June 2009, www.ensec.org/index.php?option=com_content&view=article&id=196:theus-department-of-defense-valuing-energysecurity&catid=96:content&Itemid=345
9 Sachs, M. H.; “Security From the Supply Chain Perspective,” GOVCERT.NL Symposium, The Netherlands, 2008
10 Some examples of (accidental and malicious) compromises occurring in the supply chain of equipment include the following. In 2007, new hard disks produced in China were contaminated with viruses; the infection originated from an infected system performing quality assurance tests on the produced disks (Leyden, J.; “Chinese Trojan on Maxtor HDDs Spooks Taiwan,” The Register, 12 November 2007, www.theregister.co.uk/2007/11/12/maxtor_infected_hdd_updated/). In October 2006, Linux-based car navigation systems were shipped with various (Windows-based) malware and computer viruses (Gray, T.; “TomTom Steers Some Users Straight to Virus,” LinuxInsider, 30 January 2007, www.linuxinsider.com/story/55464.html?wlc=1269246153). In 1982, the CIA inserted a Trojan into pipeline control software that the Soviets bought covertly, which caused the pumps, turbines and valves to go haywire and resulted in a large explosion (Weiss, G. W.; “The Farewell Dossier,” Central Intelligence Agency, www.cia.gov/library/center-for-the-study-ofintelligence/csi-publications/csi-studies/studies/96unclass/farewell.htm).
11 Graham, Robert; David Maynor; “SCADA Security and Terrorism: We’re Not Crying Wolf,” X-Force, Internet Security Systems, 2006, www.blackhat.com/presentations/bh-federal-06/BH-Fed-06-Maynor-Graham-up.pdf
12 European Network and Information Security Agency (ENISA), “Secure USB Flash Drives,” June 2008, www.enisa.europa.eu/act/ar/deliverables/2008/secure-usb-flash-drives-en
13 University of Glamorgan, “One in Five Second Hand Mobiles Contain Sensitive Data,” UK, 26 September 2008, http://news.glam.ac.uk/news/en/2008/sep/26/one-fivesecond-hand-mobiles-contain-sensitive-dat/
14 Bruce Schneier is considered one of the world’s foremost security experts.
15 Schneier, Bruce; Beyond Fear, Copernicus Books, USA, 2006
16 International Society of Automation, ISA99, “Industrial Automation and Control System Security,” 2007-2009, www.isa.org/MSTemplate.cfm?MicrositeID=988&CommitteeID=6821
17 National Institute of Standards and Technology (NIST), SP 800-82, “Guide to Industrial Control Systems (ICS) Security,” USA, 2008, http://csrc.nist.gov/publications/drafts/800-82/draft_sp800-82-fpd.pdf
18 International Organization for Standardization, ISO 27000, Information Security Management Systems, www.iso.org/iso/specific-applications_it-security
Erwin van der Zwan, CISA, CISM, CISSP is a management consultant/auditor with Ordina and the (cyber)security advisor with the former National Advisory Center for the Critical Infrastructure (NAVI) in The Netherlands. He can be reached at [email protected].
The ISACA Journal is published by ISACA. Membership in the association, a voluntary organization serving IT governance professionals, entitles one to receive an annual subscription to the ISACA Journal.
Opinions expressed in the ISACA Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors’ employers, or the editors of this Journal. ISACA Journal does not attest to the originality of authors’ content.
© 2010 ISACA. All rights reserved.
Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, MA 01970, to photocopy articles owned by ISACA, for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume, and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the association or the copyright owner is expressly prohibited.