Covert cognizance: self-aware algorithm to ward off hacking attempts


It sounds like a scene from a spy thriller. An attacker gets through the IT defenses of a nuclear power plant and feeds it fake, realistic data, tricking its computer systems and personnel into thinking operations are normal.

The attacker then disrupts the function of key plant machinery, causing it to misperform or break down. By the time system operators realize they’ve been duped, it’s too late, with catastrophic results.

The scenario isn’t fictional; it happened in 2010, when the Stuxnet virus was used to damage nuclear centrifuges in Iran. And as ransomware and other cyberattacks around the world increase, system operators worry more about these sophisticated “false data injection” strikes.

In the wrong hands, the computer models and data analytics – based on artificial intelligence – that ensure smooth operation of today’s electric grids, manufacturing facilities, and power plants could be turned against themselves.

Purdue University’s Hany Abdel-Khalik has come up with a powerful response: To make the computer models that run these cyberphysical systems both self-aware and self-healing. Using the background noise within these systems’ data streams, Abdel-Khalik and his students embed invisible, ever-changing, one-time-use signals that turn passive components into active watchers.

Even if an attacker is armed with a perfect duplicate of a system’s model, any attempt to introduce falsified data will be immediately detected and rejected by the system itself, requiring no human response.

“We call it covert cognizance,” said Abdel-Khalik, an associate professor of nuclear engineering and researcher with Purdue’s Center for Education and Research in Information Assurance and Security (CERIAS). “Imagine having a bunch of bees hovering around you. Once you move a little bit, the whole network of bees responds, so it has that butterfly effect. Here, if someone sticks their finger in the data, the whole system will know that there was an intrusion, and it will be able to correct the modified data.”

Trust through self-awareness

Abdel-Khalik will be the first to say that he is a nuclear engineer, not a computer scientist. But today, critical infrastructure systems in energy, water, and manufacturing all use advanced computational techniques, including machine learning, predictive analytics, and artificial intelligence.

Employees use these models to monitor readings from their machinery and verify that they are within normal ranges.

From studying the efficiency of reactor systems and how they respond to equipment failures and other disruptions, Abdel-Khalik grew familiar with the “digital twins” employed by these facilities: Duplicate simulations of data-monitoring models that help system operators determine when true errors arise.

But gradually he became interested in intentional rather than accidental failures, particularly what could happen when a malicious attacker has a digital twin of their own to work with. It’s not a far-fetched situation, as the simulators used to control nuclear reactors and other critical infrastructure can be easily acquired.

There’s also the perennial risk that someone inside a system, with access to the control model and its digital twin, could attempt a sneak attack.

“Traditionally, your defense is as good as your knowledge of the model. If they know your model pretty well, then your defense can be breached,” said Yeni Li, a recent graduate from the group, whose Ph.D. research focused on the detection of such attacks using model-based methods.

Abdel-Khalik said, “Any type of system right now that is based on the control looking at information and making a decision is vulnerable to these types of attacks. If you have access to the data, and then you change the information, then whoever’s making the decision is going to be basing their decision on fake data.”

To thwart this strategy, Abdel-Khalik and Arvind Sundaram, a third-year graduate student in nuclear engineering, found a way to hide signals in the unobservable “noise space” of the system. Control models juggle thousands of different data variables, but only a fraction of them are actually used in the core calculations that affect the model’s outputs and predictions.

By slightly altering these nonessential variables, their algorithm produces a signal so that individual components of a system can verify the authenticity of the data coming in and react accordingly.
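A minimal Python sketch may make this concrete. Everything here is an assumption for illustration: the shared secret, the ±1 perturbation pattern, and the SHA-256 derivation are stand-ins, not the team's published construction. A one-time pattern derived from a secret and the time step is added to the nonessential variables, and a receiving component checks the incoming data against its own model prediction plus that pattern:

```python
import hashlib

def one_time_pattern(secret, step, n):
    """Derive n pseudo-random +/-1 values from a shared secret and time step."""
    digest = hashlib.sha256(f"{secret}:{step}".encode()).digest()
    return [1 if b % 2 else -1 for b in digest[:n]]

def embed_signature(state, noise_idx, secret, step, eps=1e-6):
    """Nudge only the nonessential ('noise space') variables by a tiny,
    one-time amount; the core variables are left untouched."""
    pattern = one_time_pattern(secret, step, len(noise_idx))
    signed = list(state)
    for i, p in zip(noise_idx, pattern):
        signed[i] += eps * p
    return signed

def verify_signature(signed, predicted, noise_idx, secret, step, eps=1e-6):
    """A receiving component compares the noise-space values against its own
    model prediction plus the expected one-time pattern; tampering breaks it."""
    pattern = one_time_pattern(secret, step, len(noise_idx))
    return all(abs(signed[i] - (predicted[i] + eps * p)) < eps / 10
               for i, p in zip(noise_idx, pattern))
```

Because the pattern changes at every time step, replaying old data or forging new data without the secret fails verification.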

“When you have components that are loosely coupled with each other, the system really isn’t aware of the other components or even of itself,” Sundaram said. “It just responds to its inputs. When you’re making it self-aware, you build an anomaly detection model within itself. If something is wrong, it needs to not just detect that, but also operate in a way that doesn’t respect the malicious input that’s come in.”

For added security, these signals are generated by the random noise of the system hardware, for example, fluctuations in temperature or power consumption. An attacker holding a digital twin of a facility’s model could not anticipate or re-create these perpetually shifting data signatures, and even someone with internal access would not be able to crack the code.

“Anytime you develop a security solution, you can trust it, but you still have to give somebody the keys,” Abdel-Khalik said. “If that person turns on you, then all bets are off. Here, we’re saying that the added perturbations are based on the noise of the system itself. So there’s no way I would know what the noise of the system is, even as an insider. It’s being recorded automatically and added to the signal.”
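A hedged illustration of the noise-keying idea: here a window of recorded fluctuation samples is simply hashed into a one-time byte string. The sampling format and the use of SHA-256 are illustrative assumptions, not the team's actual construction.

```python
import hashlib

def noise_keystream(noise_samples, length=16):
    """Fold a window of recorded hardware noise (e.g. temperature or
    power-consumption fluctuations) into a one-time byte string.
    Because fresh samples are consumed each time, the resulting key
    is unpredictable even to an insider who knows the algorithm."""
    raw = ",".join(f"{s:.9f}" for s in noise_samples).encode()
    return hashlib.sha256(raw).digest()[:length]
```

Any change in the underlying noise window yields an unrelated key, which is what makes a recorded-noise key useless for predicting the next one.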

Though the papers published by the team members so far have focused on using their paradigm in nuclear reactors, the researchers see potential for applications across industries—any system that uses a control loop and sensors, Sundaram said. The same methods could also be used for objectives beyond cybersecurity, such as self-healing anomaly detection that could prevent costly shutdowns, and a new form of cryptography that would enable the secure sharing of data from critical systems with outside researchers.

Cyber gets physical

As nuclear engineers, Abdel-Khalik and Sundaram benefit from the expertise and resources of CERIAS to find entry points into the worlds of cybersecurity and computer science. Abdel-Khalik credits Elisa Bertino, the Samuel D. Conte Professor of Computer Science and CERIAS research director, with the original spark that led to creating the covert cognizance algorithm, and thanks the center for exposing him to new partnerships and opportunities.

Founded in 1998, CERIAS is one of the oldest and largest research centers in the world concentrating on cybersecurity. Its mission, says managing director Joel Rasmus, has always been interdisciplinary, and today the center works with researchers from 18 departments and eight colleges at Purdue. Abdel-Khalik’s research is a perfect example of this diverse network.

“When most people think about cybersecurity, they only think about computer science,” Rasmus said. “Here’s a nuclear engineering faculty member who’s doing unbelievably great cyber and cyberphysical security work. We’ve been able to link him with computer scientists at Purdue who understand this problem, but yet don’t understand anything about nuclear engineering or the power grid, so they’re able to collaborate with him.”

Abdel-Khalik and Sundaram have begun to explore the commercial possibilities of covert cognizance through a startup company. That startup, Covert Defenses LLC, has recently engaged with Entanglement Inc., an early-stage deep tech company, to develop a go-to-market strategy.

In parallel, the team will be working to develop a software toolkit that can be integrated with the cyberphysical test beds at CERIAS and the Pacific Northwest National Laboratory, where sensors and actuators coupled to software provide a simulation of large-scale industrial systems.

“We can provide additional applications for the technologies that he’s developing, since this is an idea that can help nearly every cyberphysical domain, such as advanced manufacturing or transportation,” Rasmus said. “We want to make sure that the research that we’re doing actually helps move the world forward, that it helps solve actual real-world problems.”

Cyber Incidents

The use of SCADA and industrial control systems in nuclear power plants brings cybersecurity problems and computer incidents to the attention of researchers. Not only the plants themselves but also all relevant information in this category is highly critical, and attacks against platforms that hold rich information on nuclear power plants have been observed.18

The seven cyber incidents outlined below offer insight into the scale and severity of cyber malfunctions and attacks.

Supervisory Control and Data Acquisition (SCADA) and Human Interaction

In the 21st century, national security is tied to the economy, which is highly dependent on energy and critical infrastructures. High electricity production as well as consumption forces states to focus on energy security. Most states use different sources of energy to fulfill their electricity needs. The electric grid and its components are almost always controlled by information technology. National security in the modern age relies on hardware, software, and human-machine interaction more than ever before. For this reason, it is possible to paralyze a nation with sophisticated cyber attacks.

With the realization of what devastating cyber attacks can lead to, states have begun to develop national strategies defining their cyber positions and capabilities in the event of an attack.

Through defining major threats, these national cyber strategies determine how agencies and institutions should prepare themselves. States must harmonize their efforts to address structural and technological challenges resulting from changes in mentality, data, and the Internet.

Human-Machine Interaction

Before 1957, computer technology had limited capabilities, executing tasks one at a time in a process known as batch processing. Researchers had no direct access to computers. In addition to insufficient processing capabilities, computers were physically big, requiring huge rooms equipped with coolers. Before the advent of more advanced, modern technology, using computers was a long and time-consuming process.

The direct connection to servers that researchers achieved in 1957 was seen as a major milestone in computing technology, even though remote connection to servers had its limitations. High demand led to the time-sharing concept, which permitted different researchers to directly connect to servers over a limited period of time.

This concept first emerged so that multiple users could share the processing power of a single computer. It also introduced user accounts and a management strategy for accessing the server. Computer technology in the 1960s was far from user-friendly, usable, and accessible. The necessity of connecting scholars pushed researchers to create a network that permitted users to share files.40 The space race between the U.S. and U.S.S.R. facilitated the improvement of computing technology.

In the 1960s, universities were reluctant to share their computer resources with other users on ARPANET, pushing them to use a small computer called the Interface Message Processor (IMP) in front of the mainframe to control network processes. The mainframe was only responsible for the initialization of programs and data files. The interaction of networks thus led to the Network Control Protocol (NCP), in which the Transfer Control Protocol verified the various computers on the network.

The rising number of participants introduced new technological improvements to the net. The introduction of e-mail, Internet Relay Chat (IRC) systems, and the Bulletin Board System (BBS) boosted the number of network users.41 These platforms also paved the way for computer-mediated communication and initiated the sharing of information among different groups. Hacker groups and technology fans mostly used these earliest forms of computer-mediated communication platforms. After the 1990s, the growing number of Internet users drastically changed human-machine interaction. This development quickly evolved into intensive computer-mediated communication. Hacker and cracker groups42 in different parts of the world shared their technological expertise. These groups also played an important role in cultivating hacker culture and capabilities. Unauthorized access to computers increased swiftly in places where networks were available. For example, the 414s, a group of teenagers from Milwaukee, launched attacks against Los Alamos National Laboratory, Sloan-Kettering Cancer Center, and Security Pacific Bank. Attacks instigated by the hacker group Legion of Doom pushed the government to take steps toward the Computer Security Act.

As computer technology continued to develop, automation became more common, requiring less human intervention in routine processes. The major process-control computing technology is the supervisory control and data acquisition (SCADA) system. In the early years of computing technology, SCADA systems were monolithic structures, which generally held all operations on a mainframe but limited the capabilities of monitoring systems. After improvements in the time-sharing capabilities of the mainframe's central processing unit (CPU), industry started using distributed SCADA systems.

Distributed SCADA systems often share control functions and real-time information with other computers in the local area network. These types of SCADA systems also perform limited control tasks better than monolithic SCADA systems. In most nuclear power plants, the following three components comprise SCADA systems:

  • Sensors that measure the condition in specific locations;
  • Operation equipment such as pumps and valves;
  • Local processors that communicate between sensors and operation equipment.43
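The three components listed above can be sketched as a toy control loop; the class names, the tank-level reading, and the setpoint are invented for illustration:

```python
class Sensor:
    """Measures the condition at a specific location."""
    def __init__(self, read_fn):
        self.read = read_fn

class Pump:
    """Operation equipment toggled by the local processor."""
    def __init__(self):
        self.running = False
    def start(self): self.running = True
    def stop(self): self.running = False

class LocalProcessor:
    """Communicates between the sensor and the operation equipment:
    reads a measurement and actuates the pump on simple logic."""
    def __init__(self, sensor, pump, high=80.0):
        self.sensor, self.pump, self.high = sensor, pump, high
    def scan(self):
        level = self.sensor.read()
        # Start the pump only when the tank level exceeds the setpoint.
        if level > self.high:
            self.pump.start()
        else:
            self.pump.stop()
        return level
```

In a real plant each of these roles is a physical device; the point of the sketch is only the direction of data flow: sensor to processor to equipment.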

There are four different types of local processors: the Programmable Logic Controller (PLC), the Remote Terminal Unit (RTU), the Intelligent Electronic Device (IED), and the Process Automation Controller (PAC). The main goals of these processors are to collect sensor data; to turn operating equipment on and off based on internal programmed logic or remote commands; to translate protocols so that sensors and operation equipment can communicate; to identify alarm conditions; and to handle short-range communication between local processors, operation equipment, and sensors. This type of communication mostly flows through short cables or wireless connections.
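These processor goals can be illustrated with a toy remote terminal unit; the threshold, the command format, and the alarm tag are invented for the sketch:

```python
from collections import deque

class RTU:
    """Minimal remote terminal unit cycle: collect a sensor reading,
    honor remote commands, and flag alarm conditions for the host."""
    def __init__(self, alarm_threshold):
        self.alarm_threshold = alarm_threshold
        self.commands = deque()   # remote on/off commands from the host
        self.alarms = []
        self.output_on = False

    def cycle(self, reading):
        # Remote commands take precedence over internal logic.
        if self.commands:
            self.output_on = self.commands.popleft() == "on"
        else:
            self.output_on = reading < self.alarm_threshold
        # Identify alarm conditions and queue them for the host computer.
        if reading >= self.alarm_threshold:
            self.alarms.append(("HIGH_READING", reading))
        return self.output_on
```

The same cycle structure (inputs, logic, outputs, alarms) underlies PLC, IED, and PAC firmware as well, at very different levels of sophistication.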

Host computers act as the central point of monitoring and control. Human operators monitor activity from host computers and take supervisory action when necessary. It is possible to change the rights and privileges of host computers by accessing the Master Terminal Unit (MTU). Long-range communication travels between the local and host computers using methods such as leased lines, satellite, microwave, cellular packet data, and frame relay. These types of SCADA systems can also communicate through Wide Area Networks using Ethernet or fiber optic connections.

SCADA systems use several programmable logic controllers (PLCs) to monitor the different processes and to make necessary adjustments for the regular flow of operation. These PLCs also alert the operator when human intervention is required. The rising connectivity of SCADA systems permits human operators to monitor the process with real-time data on a monitor. Yet connectivity makes the system more vulnerable to network attacks. Networked SCADA systems carry human-machine interaction to another level, underlining the importance of human operators and their role in monitoring alarms for the survival of critical infrastructure.

Human operators form the vital nodes for the functioning of critical facilities like nuclear power plants. In nuclear power plants, human operators are the first level of protection in preventing an accident or noticing a problem. In the control room, the operator has to check the designated indicators of his or her station and make the necessary adjustments to sustain the continuity of the process. Human-machine interaction faces two major problems: one centered on the human and the other on the host computer's interface.

The software that controls and communicates with SCADA systems is designed to provide required information and initiate alarms to alert human operators when a problem arises. Early SCADA interface designs were primitive and not focused on the cognitive and psychological awareness of the operators. The biggest problem with these interfaces comes from static design, characterized by a lack of movement and animation. Poor graphics accompanied the interface and changed only when triggered by alarms. The alarms themselves had no varying types according to threat level, and in some cases the size of the alarm messages prevented the operator from seeing other information on the screen. Peripheral equipment, such as monitors and keyboards, was also not designed to permit the operator to easily comprehend the information and respond quickly with as little effort as possible.

In the old interface designs, information was dispersed across three to four monitors, and insufficient screen space was one of the problems reported by the operators. In a modern nuclear power plant, the interface has to be designed at a higher resolution, permitting operators to follow the entire process on one large monitor no smaller than 40 inches. During the acquisition process, hardware experts specializing in displays should select the monitor.44 The large screen promotes teamwork in noticing errors and increases the situational awareness of the operators. The host computer's interface is critical to catching anomalies that might be the result of a cyber attack.45
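One remedy the passage points toward, alarms that vary by threat level, can be sketched as follows; the severity classes and the visible-alarm cap are assumptions for illustration:

```python
from dataclasses import dataclass, field
from enum import IntEnum

class Severity(IntEnum):
    INFO = 1
    WARNING = 2
    CRITICAL = 3

@dataclass(order=True)
class Alarm:
    severity: Severity
    message: str = field(compare=False)  # compared by severity only

def display_queue(alarms, max_visible=5):
    """Surface the highest-severity alarms first and cap how many occupy
    the screen, leaving room for ordinary process data."""
    return sorted(alarms, reverse=True)[:max_visible]
```

Tiered severity plus a display cap addresses both complaints in the passage: uniform alarm types and messages that crowd out the rest of the screen.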

Problems Induced by the Human Factor

Following such a static monitoring process requires a high level of alertness and attentiveness, and it is not easy for an operator to sustain this mode throughout his or her shift. This is not a personal failing but an issue of human cognitive and physical capabilities. As different SCADA systems use different interfaces, human operators need time to adapt to new interfaces. In the early months of training, the interfaces confuse operators with the multitude of alarms, messages, and information. After the adaptation period ends, tunnel vision emerges as a risk as human operators acclimate to static interface designs and tedious repetitions.46 In the beginning, being a human operator seems like a dynamic post, but as time goes by, the alarms become routine and daily tasks extend response time. According to one report on this topic, “the maximum manageable alarms per hour per operator are around 12, and around 300 alarms per day and most of the required operator actions during an upset (unstable plant and required intervention of the human) are time critical. Information overflow and alarm flooding often confuse the operator, and important alarms may be missed because they are obscured by hundreds of other alarms.”47
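A common mitigation for alarm flooding is to shelve repeats of the same alarm within a time window, so that new and distinct alarms stay visible. A minimal sketch, with an assumed five-minute window and invented alarm tags:

```python
import time

class AlarmShelf:
    """Suppress repeats of the same alarm within a time window so an
    alarm flood cannot bury new, distinct alarms."""
    def __init__(self, window_s=300.0, clock=time.monotonic):
        self.window_s = window_s
        self.clock = clock            # injectable clock for testing
        self._last_seen = {}

    def admit(self, tag):
        """Return True if this alarm should be shown to the operator."""
        now = self.clock()
        last = self._last_seen.get(tag)
        if last is not None and now - last < self.window_s:
            return False              # shelved: shown too recently
        self._last_seen[tag] = now
        return True
```

Keeping the admitted rate near the report's figure of roughly 12 alarms per operator per hour is the tuning goal such a shelf would serve.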

Operators complain of many distractions in the control room, including human interruptions and phone calls. Peace and quiet in the control room is critical to allowing operators to give their full attention to the screens they are monitoring. Consequently, unauthorized personnel in the control room would further jeopardize the security of the facility.

Since the human-machine interface is the only window for monitoring nuclear energy plants, the human operator and his or her host computer are critical in preventing an accident or security breach. However, most human-machine interfaces bring their own set of security concerns due to problems in design. Most Human Machine Interfaces (HMIs) are designed to provide relevant information to human operators in 2D graphics. The main focus of HMI design is functionality, usability, and visibility. Neat, interactive designs are crucial to support the attention of the operator. Thus, the human-machine interface is transforming into a front for cyber defense, functioning as the defender of a system against abnormal activities.

The basic principle of a sustainable security system is to implement a precise and clear security policy, whose major points have to be defined by state regulations and whose institutional details must be written by organizations. Formulating a security policy helps managers build measurable and self-perpetuating systems where the division of labor is clear-cut. Computers and electronic devices connected to local networks maintain the physical security of power plants. Their network connectivity, however, makes them especially prone to cyber attacks.

Therefore, strong communication and cooperation among the managers of the physical and cyber security fields is a must. Each manager has to know the other's field to grasp the details and prepare for possible threats.

Security has to be understood as a continuously evolving cycle that must be assessed regularly according to the changing nature of threats. In nuclear power plants, the conventional security approach draws fixed limits for physical and cyber security sectors. In the age of hybrid entities, the international community must implement smart security policies that provide flexibility, adaptability, and cooperation. For the new facility in Turkey, the physical and cyber security managers of the nuclear power plant (or critical infrastructures) have to follow these major points:

  • Understand legal and regulatory requirements in Turkey and internationally;
  • Integrate security into the organizational culture and ensure it is embraced by all stakeholders;
  • Develop effective risk assessment programs;
  • Develop holistic governance programs for managing information risk;
  • Assess the impact of human factors on security strategies and potential breaches of security;
  • Develop emergency management policies;
  • Develop and ensure quality control in information assurance and security management;
  • Improve alternative communication technologies for emergency cases;
  • Follow new technologies to upgrade the security level of the facility.

On the first day of operation, the nuclear facility is equipped with the latest technology to work smoothly and securely. However, the emergence of new technology raises the question of how frequently a power plant should update its technology. Various academic assumptions focus on the market competition of a facility. Facility managers and government officials must periodically discuss emerging technology and assess the current condition of plants from a security perspective. Maintaining and updating the security system is as critical as writing the security policy of the plant.48

The technological protections tailored to specific nuclear power plants create over-reliance on these tools at the expense of human capacity. However, the capabilities of a plant's personnel are critical to the planning, updating, and maintenance of the facility. Safe security systems can be breached due to poor training, inattentiveness, and lack of necessary maintenance by staff. Continuous training and coordination of the disparate security systems in the nuclear power plant are vital to sustaining nuclear safety. Attacks on nuclear facilities can require the coordination of perimeter security officials, cyber security managers, and SCADA engineers. In such an environment, the division of labor must be clearly defined and implemented by managers to prevent chaos in the case of an emergency.

Another critical security aspect is dissemination. It is well known that facility employees rarely read security policies and amendments to security regulations. Motivating employees to follow this technical information and these policy documents, and to take the necessary caution when disseminating information, presents a challenge. An administrator has to find ways to motivate employees to abide by the security culture once it is established.

In the Turkish case, the language barrier presents another issue. Operator companies (Russians in Akkuyu and the French and Japanese in Sinop) have to ensure that technical and policy documents are available in Turkish in order to overcome any misunderstandings and prepare for contingencies.

Security Levels and Security Clearance

Cyber protection of nuclear power plants requires commensurate attention to perimeter security. Physical security comprises an indispensable part of cyber security, since nuclear power plants run their firewalls and intrusion detectors on physical servers; accessing those servers would be the first step in an attack. Fiber optic cables and other exposed connections must be protected from malicious attack. In some cases, scissors can be more harmful than Trojan viruses.

Therefore, computer systems, cables, and connections to the electrical grid should be categorized as high-risk assets and protected accordingly. Inside the power plant, computers should be categorized according to their security clearance level. Lower-level computers' access to high-security computers should be banned. These security protocols should be checked periodically, on the assumption that security rules are not being followed.
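The clearance rule described above, banning lower-level computers from reaching high-security ones, reduces to a simple level comparison; the machine names and levels here are hypothetical:

```python
# Hypothetical clearance levels: higher number = more sensitive machine.
CLEARANCE = {"corporate-pc": 1, "engineering-ws": 2, "safety-plc": 3}

def may_connect(source, target):
    """Deny connections from lower-clearance machines to higher-clearance
    ones; a periodic audit can replay logged connections through this check."""
    return CLEARANCE[source] >= CLEARANCE[target]
```

The periodic checks the passage calls for amount to replaying observed network connections through such a rule and flagging every violation.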

All these security measures are tied to controlling, during the screening process at the entrance of a protected area, any equipment with electromagnetic capability. Since the range of such devices is so large, site management must decide how to limit them. Stuxnet showed that mobile devices, cellular phones, USB devices, NFC devices, RF devices, external hard disks, laptops, CPU-operated devices, and any device with Bluetooth or wireless connectivity could be used to transfer malware. Admitting these devices onto facility grounds must be limited and kept under strict control. There are examples of facility employees using their relationships with screening officers to bring their magnetic devices into protected or vital areas. All visitors have to follow the facility's screening process and deposit (and lock) their electromagnetic devices in the boxes reserved for that purpose. To prevent tailgating, the use of mobile phones at the entrance checkpoint has to be restricted.49 Electromagnetic devices collected from visitors must be kept in a Faraday cage in the protected area of the nuclear power plant to prevent possible intrusion into the network's systems. The screening process should be repeated upon exiting the facility to ensure no magnetic devices are taken out of the site.

The computer and network systems of a nuclear facility are another major security concern.

Nuclear power plant systems require hardware replacements and maintenance from time to time. The regulator has to define how the operator will design the hardware support system. All new hardware should be tested and observed at the national authority's test bed.

Since these processes take time, the regulator has to encourage the operator to create a hardware management system and stock spare parts before the facility begins operation. This way, in the event of a breakdown, facility management can quickly replace the required parts without delay.

Also, third-party contractors should go through background checks. Since Heating, Ventilation, and Air Conditioning (HVAC) management systems are designed for functionality and robustness rather than security, they are considered among the less secure components of nuclear power plants. Today's HVAC systems, however, are IP-enabled appliances connected to local networks. To upgrade and patch the systems, contractors access the HVAC servers from outside the facility. The vulnerabilities of these servers are quickly turning into systemic risks: any intrusion into these HVAC systems could easily be used for a hybrid attack. The regulators and operators of nuclear power plants must be sensitive to the HVAC systems at all levels of security.50

References:

Arvind Sundaram et al., Covert Cognizance: A Novel Predictive Modeling Paradigm, Nuclear Technology (2021). DOI: 10.1080/00295450.2020.1812349

Matthias Eckhart et al., Digital Twins for Cyber-Physical Systems Security: State of the Art and Outlook, Security and Quality in Cyber-Physical Systems Engineering (2019). DOI: 10.1007/978-3-030-25312-7_14

Yeni Li et al., Data trustworthiness signatures for nuclear reactor dynamics simulation, Progress in Nuclear Energy (2021). DOI: 10.1016/j.pnucene.2020.103612

Arvind Sundaram et al., Validation of Covert Cognizance Active Defenses, Nuclear Science and Engineering (2021). DOI: 10.1080/00295639.2021.1897731

