Introduction

I.1. What is industrial cybersecurity?

Nowadays, more and more man-made physical systems are controlled by computer systems. This is the case for autonomous systems such as vehicles, for everyday devices, and for industrial production systems or water and energy distribution systems. Most of these systems are also connected, in some way, to the Internet.

The computer security of this equipment is becoming a major issue for the industrial world. This is particularly true today, in the context of the factory of the future, also known as Industry 4.0, which is presented as the fourth industrial revolution, and which is characterized by increasingly connected systems and by the increasingly strong integration of digital technologies into manufacturing processes.

There are many spectacular cyber-attacks in the news: they aim to steal credentials, render systems or websites inoperative, or lock workstations by encrypting their data in order to extort a ransom.

The control systems of industrial installations are also subject to attacks, either as a collateral effect of a computer attack, as in the case of WannaCry (Symantec 2017; May et al. 2018), which can lead to plant shutdowns and significant operating losses, or through attacks targeting industrial systems specifically. This was the case with the Stuxnet attack (Falliere et al. 2011), which aimed to destroy uranium enrichment capacities in Iran, and with the Triton attack (White 2017), which aimed to render safety systems inoperative. Other recent attacks are presented in Chapter 4.

The potential damage that can be caused is great and ranges from simple yield losses to material and human damage, as well as information losses, which can be very serious. Given the magnitude of the impact of this potential damage, and given the frequency of attacks, the risk associated with the cybersecurity of industrial systems has become significant.

For a long time, this risk was neglected: industrial installations were loosely connected to company networks or the Internet, and industrial control systems (ICSs) seemed to be protected. The evolution of technology, uses and needs has led to the connection of these systems to other networks, whether for the transfer of production data to the company’s IT systems, for remote maintenance or for automatic download of updates. At the same time, the convergence of protocols toward common standards has increased the vulnerability of control systems. The idea that industrial systems could be considered isolated from the rest of the world, sometimes referred to as the air gap myth, is unrealistic nowadays.

The aggravating factor is that, since most technologies and protocols were developed at a time when cyber-attacks did not exist, they are poorly secured and highly vulnerable, and many installations still use them. As the renewal rate of systems and equipment used in ICSs is very low, relatively old equipment is still in service. The lifetime of the installations, which is much longer for ICSs than for traditional computer systems, is an additional vulnerability. For the most recent installations, built around the industrial Internet of Things, a new risk factor is emerging, related to the complexity and flexibility of these installations.

The risk is therefore very real and cannot be ignored. It must be controlled. As in other areas, there is no such thing as zero risk, so we must ensure that it is contained at an acceptable level. The risk must therefore be assessed and dealt with appropriately, that is, the relevant actions to be taken must be selected. Depending on the stakes and the context, the answers will of course differ. Measures cannot be limited to technical actions; they must be part of a risk management plan (Chapter 3) that can remain simple, but which must be global and take into account human and organizational aspects. Moreover, since, as we said above, there is no such thing as zero risk, it is useful to set up a recovery and business continuity plan, or even a crisis management plan. This will be based on a detection system and an alert chain. The whole approach must of course take into account the cost–benefit ratio.

I.2. From information security to cybersecurity

The security of an Information System (IS) concerns all aspects related to the control of IS risks and aims to guarantee:

  • – the optimal functioning of the IS to obtain the best quality of service;
  • – that no unacceptable damage may affect the various elements of the IS (beyond a fixed level);
  • – that no undesired operation may lead, directly or indirectly, to unacceptable damage to the rest of the company or partners (beyond a fixed level).

The term “cyber” is a prefix from the Greek word kubernêtikê, meaning “to steer, to govern”. In 1948, Norbert Wiener introduced the term “cybernetics” to refer to the science of control and communication in the living being and the machine. “Cyber” has come to designate anything related to computers, and we speak of cyberspace to refer to the extension of our natural space created by the Internet.

Cybersecurity concerns the computer security of systems connected to the Internet and belonging to cyberspace. Cyber-attacks are computer attacks carried out through cyberspace; they add to the pre-existing threats to ISs.

By an abuse of language, we often speak of cybersecurity for everything related to computer security (Niekerk and Solms 2016).

I.3. Is there really a risk for industrial systems?

Cyber-attacks are not very relevant to industrial or cyber-physical systems.

It is true that most attacks concern traditional computer systems. These attacks are countless, and the tools to create them are becoming increasingly accessible. The means available to organized cybercrime have grown considerably.

As far as industrial systems are concerned, attacks are limited in number and often show that the attacker has a very specific knowledge of the systems under attack and has implemented a tailor-made attack.

Does this mean that the risk associated with the cybersecurity of industrial systems is low? The answer is of course no. The level of risk depends on the severity of the damage and the likelihood of its occurrence. For an industrial or nuclear installation, the damage can be catastrophic and impact the population. The possibility of damage being caused is at least at the same level as for IT management systems. The level of risk is therefore very high. To be convinced, it is sufficient to observe the evolution of regulatory obligations, such as the LPM (Military Programming Act) in France, the NIS (Network and Information Security) directive in Europe or the Critical Infrastructures Protection Act in the United States.
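The combination of severity and likelihood described above can be sketched as a simple qualitative risk matrix. The scales, weights and thresholds below are purely illustrative (they do not come from any particular standard), but they show why a catastrophic impact yields a high risk level even for a moderately likely attack:

```python
# Hypothetical qualitative risk matrix: risk = f(severity, likelihood).
# Scales and thresholds are illustrative, not taken from a standard.
SEVERITY = {"minor": 1, "major": 2, "critical": 3, "catastrophic": 4}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

def risk_level(severity: str, likelihood: str) -> str:
    """Combine severity and likelihood into a qualitative risk level."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A catastrophic impact gives a high risk even if the attack is
# merely "possible" (score 4 * 2 = 8).
print(risk_level("catastrophic", "possible"))  # high
```

This is the reasoning behind the text: for an industrial or nuclear installation, the severity axis alone is enough to push the overall risk to the highest band.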

The system is isolated from the Internet, so it is safe.

For a long time, it was believed that not being connected to the Internet was enough to avoid any risk of computer piracy. We sometimes talk about the myth of the air gap. In fact, the situation is more complex:

  • – first of all, even if a system is not connected, it can be the victim of malicious acts: Stuxnet is a demonstrative example, whose attack vector was a USB key;
  • – often, the industrial network is connected to the IT corporate network: this IT system can be the victim of attacks and host malicious programs that subsequently attempt to corrupt the industrial network, or it can even allow attacks to pass directly through;
  • – in an industrial network, there are sometimes direct connections to the Internet, more or less official and sometimes temporary, for maintenance or configuration, and these represent a real vulnerability.

In addition, with the growing need to upload data to the IS or to the Cloud, with update systems from a manufacturer’s site and with remote maintenance, the isolation of industrial systems is increasingly illusory.

The stakes are low.

A widespread idea is that the risk is low when the production equipment does not use dangerous machines or processes.

For this type of installation, it is clear that the damage to the environment and people will be limited. However, for the company, the impact can be enormous since an attack can result in a shutdown of production for a long period of time, a substandard quality of the products manufactured, or even a destruction of the production equipment. The economic consequences must be analyzed and a cost–benefit analysis carried out to determine the level of cybersecurity measures to be taken.

The workstations are equipped with antivirus software and there is a firewall, and therefore we are protected.

Using an antivirus is a basic step to take. It allows the protection of computer workstations, running under Windows or macOS. However, in an industrial system, there are many devices running on a real-time operating system or an embedded Linux system, or even a proprietary system. For these systems, there is not necessarily an antivirus and they are therefore vulnerable.

Let us add that one of the problems with antivirus software on industrial workstations is that it is not always updated.

The limits of firewalls are also well known: the first is that filtering rules are not always well configured; the second is that even if data flows are limited, this does not prevent all attacks from passing through. As detailed in Chapter 4, the electrical energy management system that was attacked in Ukraine in 2015 included firewalls, and they did not prevent anything from happening.
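The second limit can be made concrete with a minimal default-deny allowlist. Even a correctly configured rule set like the hypothetical one below only restricts which flows may exist; it says nothing about what the permitted flows actually carry, so a malicious command sent over an allowed connection still passes. All addresses and ports here are illustrative:

```python
# Hypothetical allowlist at an IT/OT boundary: only the historian may
# reach the two PLCs, and only on their industrial protocol ports.
# Addresses and ports are illustrative examples.
ALLOW = {
    ("10.0.1.20", "10.0.2.5", 502),    # historian -> PLC 1, Modbus/TCP
    ("10.0.1.20", "10.0.2.6", 44818),  # historian -> PLC 2, EtherNet/IP
}

def permitted(src: str, dst: str, dport: int) -> bool:
    """Default-deny: a flow is allowed only if explicitly listed."""
    return (src, dst, dport) in ALLOW

print(permitted("10.0.1.20", "10.0.2.5", 502))  # True: allowed flow
print(permitted("10.0.9.9", "10.0.2.5", 502))   # False: unknown source
```

Note that the first check returns True regardless of whether the packet carries a legitimate read request or a malicious write: filtering flows is necessary but not sufficient.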

We use a Virtual Private Network (VPN), so no problem.

Another widespread idea is that the use of a VPN provides effective protection. This idea is also wrong for two main reasons:

  • – the first is that a significant number of VPNs use technologies considered outdated and are therefore vulnerable. A 2016 study (High-Tech Bridge Security Research 2016) showed that 77% of the VPNs tested still used an SSLv3-based protocol, created in 1996, or even SSLv2, while most standards such as PCI DSS or NIST SP 800-52 (see Chapter 6) prohibit their use;
  • – the second is that even with a well-configured VPN, if one of the workstations connected to these VPNs has been compromised, it can compromise the rest of the network, all the more easily because the transported data are encrypted and therefore difficult to filter.
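Modern TLS stacks allow a client to refuse the obsolete protocol versions mentioned above. A minimal sketch with Python's standard `ssl` module, requiring at least TLS 1.2 as NIST SP 800-52 recommends:

```python
import ssl

# Build a client-side TLS context. PROTOCOL_TLS_CLIENT already refuses
# SSLv2 and SSLv3; we additionally require TLS 1.2 or newer, in line
# with NIST SP 800-52 recommendations.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Any handshake attempted through this context with an SSLv3-only or
# TLS 1.0/1.1-only peer will now fail instead of silently downgrading.
print(context.minimum_version)
```

This addresses only the first weakness; as the second point notes, no protocol setting protects against an already compromised endpoint inside the VPN.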

Information System Security (ISS) is expensive and generates many constraints that limit efficiency.

A common idea is that ISS is expensive and imposes a large number of operational constraints that are incompatible with those of industrial control systems.

These constraints appear all the more important as an ICS uses very heterogeneous equipment, and its users appreciate a certain flexibility for operation. For example, the use of mobile terminals is becoming more and more common, as it is very useful for system control close to the process.

In reality, the ISS of industrial systems must be adapted to the challenges, and it is important to carry out a risk analysis and to compare the importance of these risks with the cost of measures to reduce them and the constraints they impose. However, security is often considered a source of expenditure that is difficult to justify by a return on investment. It is more relevant to measure it against a potential loss, for example on the quantity of production, if the system is unavailable, or on the cost of reconstruction, if it is destroyed.

As for operating constraints, it is important to implement solutions in consultation with users and taking into account the reality on the ground. We must support new operating modes and not try to limit users’ possibilities excessively.

I.4. Vulnerability of ICS

Computer systems are vulnerable to cyber-attacks and physical attacks. ICSs, which, as we have seen, are directly or indirectly connected to the Internet, are also vulnerable.

For historical reasons, and because of cultural differences, ICSs are even more vulnerable than traditional computer systems.

In general, an industrial installation is operated for a relatively long period of time: systems more than 10 years old are commonly found. In addition, the main objective is to keep the production system running continuously, and anything that requires or may cause a shutdown is avoided. Finally, for legacy reasons, the protocols used are of a fairly old design and are not very secure.

But over time, ICSs have become highly connected. First, production and supply chain management processes are integrated with enterprise management software, and production data are used in business computing applications. In addition, to allow better reactivity, maintenance and monitoring are carried out remotely. Finally, some installations are isolated and supervised remotely (water treatment plants, energy distribution, etc.).

Industrial systems also have the specificity of being very heterogeneous and built from generic elements (Commercial Off-The-Shelf, COTS), chosen above all for their functionality and not for their security. The products used are not very secure and not always fully tested. Updating software, operating systems or firmware is difficult and is not always done regularly. Finally, at the equipment level, authentication management is difficult, and default passwords are not always changed.

Users are not always aware of the vulnerability of ICS: many physical ports (USB, RJ45) are poorly protected, and it may be easy to connect to them. There are even sometimes unofficial connections to the Internet, which have been created for reasons related to maintenance or remote access.

In addition, ICSs are used in the world of production, where the culture is often based on principles such as “produce first” and “do not risk changing something that works”. This means, for example, that updates are not always made regularly and that authentication management is not as rigorous as in the IT world, with passwords that may be shared.

Finally, in many cases, there is no ICS security management policy: the management of subcontractors and stakeholders is not subject to specific measures, and there is no user access rights management policy defining their possibilities of action and prohibiting access by employees no longer belonging to the company.

I.5. Cybersecurity and functional safety

Industrial installations and cyber-physical systems can be subject to failures or operating errors that may lead to dangerous behavior or breakdowns. The study of these risks is the subject of what is called “functional safety” or “operational safety”.

The sources of dysfunction considered are numerous: they can be technical, human or organizational. Among the technical causes, one category receives particular attention, namely electrical, electronic or programmable electronic components. Standard IEC 61508 (Chapter 8) describes the approach for analyzing the risks associated with their potential failures. The idea is to characterize the installation by a probability of failure over a given period of time. By coupling this with an analysis of the impact of malfunctions, it is possible to assess the level of risk of an installation or system and to verify that given objectives are met.
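For a safety function with a constant dangerous failure rate λ that is proof-tested every T hours, a classical low-demand approximation is PFDavg ≈ λT/2, and the result is then mapped to a Safety Integrity Level (SIL) band. The sketch below uses illustrative values for λ and T; the band boundaries are those of IEC 61508 for low-demand mode:

```python
# Average probability of failure on demand for a periodically
# proof-tested component (low-demand mode, constant failure rate).
def pfd_avg(lambda_d: float, test_interval_h: float) -> float:
    """Classical approximation: PFDavg ~= lambda_D * T / 2."""
    return lambda_d * test_interval_h / 2

# IEC 61508 low-demand SIL bands: SIL n covers [1e-(n+1), 1e-n).
def sil_band(pfd: float) -> int:
    for sil in (4, 3, 2, 1):
        if 10 ** -(sil + 1) <= pfd < 10 ** -sil:
            return sil
    return 0  # outside the SIL 1-4 range

# Illustrative values: lambda_D = 1e-6 /h, proof test once a year.
pfd = pfd_avg(1e-6, 8760)
print(pfd, sil_band(pfd))  # ~= 0.00438, i.e. within the SIL 2 band
```

The example makes the coupling mentioned above concrete: the quantitative failure measure on one side, and a discrete integrity objective (the SIL) to be satisfied on the other.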

Functional safety, as defined by IEC 61508, does not include computer security, which was not a risk identified as such in 2002, at the time the standard was written. Even the latest updates are only beginning to mention it. It should be noted that Information System Security (ISS) is also not explicitly included in risk analyses of hazardous installations called “hazard studies”.


Figure I.1. Relationship between functional safety and information system security

Information security and the functional safety of physical systems are managed by different approaches:

  • – on the one hand, the ISS risk management process is described in the 27000 family of standards (Chapter 6). The risk analysis methods used (Chapter 9) are, for example, EBIOS or OCTAVE methods;
  • – on the other hand, there are risk analysis methods for industrial processes such as PHA, HAZOP, FMEA or LOPA (Chapter 8) and standard 61508.

For ISS, the impact of the physical world on the IS, such as a power failure or overheating, is taken into account. On the other hand, the opposite influence (red arrow in figure I.1) is not considered for these systems, since they only handle information.

In the case of cyber-physical systems such as automated industrial systems or critical infrastructures, this influence exists and can have very significant damaging consequences. For these systems, it appears that the two aspects are not independent and must be addressed in a coordinated way.

Indeed, even if the causes of the feared events are different, the consequences and damage produced are for the most part common: damage to property, to people and to the environment through a process that is out of control. For example, a malfunction of a safety programmable logic controller (PLC) may be caused both by problems related to operational safety and by problems related to ISS. In both cases, the consequences will be damage to the installation.

IEC 62443 (Chapter 7) aims to transcribe a number of functional safety concepts, including SIL levels, for cybersecurity in order to align approaches. For risk analysis, unified approaches are beginning to be proposed (Chapter 9).

I.6. Evolution of the perception of cybersecurity

The technological evolution that enabled the implementation of large-scale automated systems was the invention of the microprocessor by Intel in 1971. This led to the development of personal computing and, a few years later, industrial control systems. These systems were first connected by point-to-point wireline links, and then in the 1990s Ethernet communication with Transmission Control Protocol (TCP)/Internet Protocol (IP) became widespread. The main industrial protocols have been “encapsulated” and have allowed the development of industrial control systems as we know them.

At the time, the main problem was performance, and no one imagined that computer security would take on the proportions it has today.

The first computer viruses date back to the 1980s. The Brain virus, which infected the PC boot sector, was written in 1986 by the brothers Basit and Amjad Farooq Alvi (Brain virus n.d.). It is recognized as the first virus for MS-DOS. In 1988, the Morris worm, written by R.T. Morris, spread across the Internet. One of the first antivirus programs (Norton Antivirus) appeared in 1990. Progress was rapid: it was also in 1990 that one of the first polymorphic viruses appeared (Chameleon, derived from R. Burger’s work).

As early as the mid-2000s, the question of the vulnerability of ICSs was raised (Abshier 2004; Wooldridge 2005). In 2007, the Aurora project (Meserve 2007), an experiment supervised by the Idaho National Laboratory, demonstrated the possibility of destroying an energy generator through a cyber-attack.

In 2010, a study on power generation systems conducted by Red Tiger, at the request of the U.S. Department of Homeland Security (Pollet 2010), showed that the IT security of these systems was not on par with that of the IT world. For example, many publicly disclosed vulnerabilities were not fixed and left these systems exposed.

Since 2010, various regulations and bills have appeared in several countries, including:

  • – the National Cybersecurity and Critical Infrastructure Protection Act of 2013, USA;
  • – Article 22 of the Military Programming Act in 2013, France;
  • – the IT Security Act in 2016, Germany;
  • – the NIS Directive in Europe in 2016.

In addition, since 2010, a number of guides and books have been published on this subject (Macaulay and Singer 2012; Knapp and Thomas 2015), and methodological tools have been proposed to improve the ISS of industrial facilities (ANSSI 2012a; Stouffer et al. 2015).

Nevertheless, risk perception remained limited and steps were slow to be taken. However, with increasingly common and high-profile attacks, the consideration of cybersecurity in new projects is gradually emerging, and is becoming essential with the 2018 regulatory requirements (Chapter 6).

I.7. Structure of the book

This book is organized as follows:

  • Chapter 1 presents the different elements of an industrial control system, Supervisory Control And Data Acquisition (SCADA) and Industrial Internet of Things (IIoT);
  • Chapter 2 describes the architecture of these systems and the different characteristics of the networks used in SCADA or IIoT systems;
  • Chapter 3 presents the basic concepts of information security and risk management;
  • Chapters 4 and 5 detail the principle of attacks and the analysis of ICS vulnerabilities;
  • Chapters 6 and 7 present the standards and regulations, with one chapter dedicated to IEC 62443, which is the reference standard;
  • Chapter 8 presents the useful concepts of operational safety;
  • Chapter 9 focuses on risk analysis methods: the EBIOS method and some more specific methods for industrial systems: cyber APR, cyber HAZOP and cyber-bowtie;
  • Chapter 10 presents methods and tools for securing an ICS: installation inventory, architecture security, and technical devices such as intrusion detection systems, data diodes and secure IIoT components. Cryptographic concepts are explained in Appendix 1 and the blockchain for IoT is introduced in Appendix 2;
  • Chapter 11 proposes a comprehensive approach for ICS security. It is based on the standards and methods described above, and is presented in a simplified and detailed version.