Fog computing versus cloud computing

Nov. 27, 2017

A look at the future of SCADA and BMS servers in the age of cloud and fog computing

As the technology used in Microsoft Windows-based Supervisory Control and Data Acquisition (SCADA), Human Machine Interface (HMI) and Building Management Systems (BMS) has evolved, the preferred architecture for their deployment has also evolved. Twenty-five years ago, architecture discussions centered on the pros and cons of migrating from a stand-alone PC to a client-server architecture. At the time, it was the subject of much debate. The conversation was happening because Microsoft had released the NT operating system, which, when combined with improved local area network technology, enabled a cost-effective and reliable client-server deployment on low-cost PCs. Until then, that had been realistic only on much more expensive VMS or midrange UNIX platforms.

As client-server architectures became the norm, the next architecture debate focused on thin clients versus thick clients. It is a little hard to imagine these days, but the viability of using a web browser for client access was not widely accepted. Thick clients, often referred to as rich clients, were generally preferred because the user experience was “richer” on a full Windows operating system than what the new HTML browsers of the day, such as Netscape or Internet Explorer, could deliver.

At the same time, another new capability was becoming available: remote access to the PC. Software packages like pcAnywhere (now part of Symantec) were widely used to access a PC over a telephone line and a dial-up modem. Setting up this type of remote access was relatively straightforward. This was significant to the people responsible for supervisory systems because they generally did not have access to corporate IT resources.

Fast forward to today, and we are still grappling with the right architecture for supervisory systems as we work to further drive down costs, improve reliability and increase efficiency. Because these systems are far more connected than they once were, improving cybersecurity has become an equally important goal.

For SCADA, one of the most important technology developments was the trend away from serial communications toward IP (Internet Protocol)-based industrial network communications. This allowed much more flexibility in where the supervisory server could be physically located relative to the programmable logic controller (PLC). Remote terminal units (RTUs) could be geographically distributed away from the SCADA host, which was valuable for distributed control and remote monitoring, but the limited-bandwidth radio communications of the time made this architecture unsuitable for many SCADA applications.
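
To make the shift from serial to IP concrete, the sketch below polls a block of holding registers from a PLC over Modbus TCP, one common IP-based industrial protocol; the article does not specify which protocols the systems described here used, and the IP address, unit ID and register range are hypothetical.

```python
# Minimal sketch: reading holding registers from a PLC over Modbus TCP,
# one common IP-based industrial protocol. Address and register values
# below are hypothetical.
import socket
import struct

PLC_IP = "192.168.1.10"   # hypothetical PLC address
PLC_PORT = 502            # standard Modbus TCP port
UNIT_ID = 1
START_REGISTER = 0
REGISTER_COUNT = 4

def read_holding_registers():
    """Send a single Modbus TCP 'read holding registers' request."""
    request = struct.pack(
        ">HHHBBHH",
        1,               # transaction id
        0,               # protocol id (always 0 for Modbus TCP)
        6,               # remaining byte count: unit id + 5-byte PDU
        UNIT_ID,
        3,               # function code 3: read holding registers
        START_REGISTER,
        REGISTER_COUNT,
    )
    with socket.create_connection((PLC_IP, PLC_PORT), timeout=5) as sock:
        sock.sendall(request)
        response = sock.recv(256)
    # Skip the 7-byte MBAP header, function code and byte count, then
    # unpack the 16-bit register values.
    return struct.unpack(">" + "H" * REGISTER_COUNT,
                         response[9:9 + 2 * REGISTER_COUNT])

if __name__ == "__main__":
    print(read_holding_registers())
```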

By combining the advances in client-server and network technology, it has for several years been possible to migrate the SCADA or BMS host server out of the control room, off the shop floor and into a physically secure, environmentally controlled data center. In the lexicon of cloud computing, when the supervisory host is on a dedicated machine running in a data center, we have created a private cloud.

As an illustration, a large grocery store chain has a warehouse where it stores cheese products. The company purchased an air quality control system to ensure the “cave” was kept at optimal conditions to maintain product quality. This environmental system consisted of PLCs supervised by PcVue HMI software installed on a PC in the warehouse. The warehouse operations management team was concerned that physical damage to the computer (for example, a forklift accidentally running into the PC) would cause them to lose control.

Since the PLCs used IP connections and the HMI was easily converted to a SCADA host with HMI clients, it was straightforward to move the SCADA/HMI to the IT data center at company headquarters, where it was installed on a dedicated server. In the process of migration, the single-user HMI station was simply converted to a multiple-user SCADA system so that the maintenance team had access to the same information as operations.

With virtualization technology, it is not necessary to have a dedicated machine in the data center. This is becoming increasingly common as supervisory vendors make the adjustments necessary to allow both the client and server to be deployed on virtual machines (VMs). The next step in the warehouse migration project was to move off the dedicated server and onto a VM. The rationale for the second move was that a VM can be restored from a backed-up version in approximately 15 minutes. In the data center, IT already had a standard process for backing up the VMs frequently. For this customer, this was preferable to a redundant backup system because it was a familiar process for IT to manage.
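
The article does not say which virtualization platform the data center used; as an illustration only, the sketch below uses the libvirt Python bindings (which manage hypervisors such as KVM) to snapshot a SCADA host VM and later revert to that snapshot. The connection URI, domain name and snapshot name are hypothetical.

```python
# Minimal sketch, assuming a libvirt-managed hypervisor and the libvirt
# Python bindings; the URI, domain name and snapshot name are hypothetical.
import libvirt

SNAPSHOT_XML = """
<domainsnapshot>
  <name>scada-nightly</name>
  <description>Routine backup of the SCADA host VM</description>
</domainsnapshot>
"""

def snapshot_scada_vm(domain_name="scada-host"):
    """Take a snapshot of the SCADA VM as part of the routine backup process."""
    conn = libvirt.open("qemu:///system")
    try:
        dom = conn.lookupByName(domain_name)
        dom.snapshotCreateXML(SNAPSHOT_XML)
    finally:
        conn.close()

def restore_scada_vm(domain_name="scada-host"):
    """Revert the SCADA VM to the saved snapshot after damage or corruption."""
    conn = libvirt.open("qemu:///system")
    try:
        dom = conn.lookupByName(domain_name)
        snap = dom.snapshotLookupByName("scada-nightly")
        dom.revertToSnapshot(snap)
    finally:
        conn.close()
```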

This illustrates the growing role of IT in operational technology (OT) deployment. While IT and OT management have different primary objectives in some respects, there are other areas where IT best practices will benefit OT managers working in collaboration with IT, including the development of a robust cybersecurity culture.

For IT, business risks are mainly related to the confidentiality and integrity of the data processed and hosted by IT systems, and the consequences are largely intangible, such as loss of knowledge and loss of reputation for the enterprise. For OT systems, business risks are related to the availability, integrity, reliability and safety of the industrial control system (ICS) itself. These risks carry operational consequences in the physical world, such as production shutdowns and financial losses, environmental damage and the inability to control the process or to obtain accurate information about its state.

Further collaboration and the development of appropriate common practices and shared principles are the hallmark of top-performing companies and have been enabled by advances in OT architectural thinking.

What about the public cloud? While the underlying technology is the same, in the grocery warehouse example the company did not want to risk losing the system if the connection through its internet service provider (ISP) went down. It was also concerned about an architecture that required the internet to connect its PLCs to the SCADA host, both because of latency and because of reliability. In some cases, data privacy is also a concern when operational data is hosted by a third party; in this case, it was not.

In a second example, the public cloud is the ideal place to host the SCADA server. We have worked with a service provider who deployed SCADA software on Amazon Web Services (AWS). In this case, the architecture consists of a private Windows server running on a VM, connected via the Distributed Network Protocol (DNP3) to devices that are strategically, and in some cases remotely, located to provide real-time views of the voltage across a network of high-voltage transmission lines. The DNP3 protocol is used to monitor and control electrical devices and is widely used in North American substation automation.

In this case, the connection was made by creating a virtual private network (VPN) running over satellite links to some of the remote locations, far from the substation. The ability to monitor the transmission line voltage in real time is of great value to the utility. The satellite connection may have latencies that aren't appropriate for substation automation, but compared to putting a data logger on the transmission line in the field and collecting the data a week later, it is fast enough.
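
The sketch below is conceptual only: the poll_voltage helper is a hypothetical placeholder for a real DNP3 client call, not a library API. It simply illustrates the relaxed scan rate that makes a high-latency satellite VPN acceptable, since a second or two of link delay is negligible when points are read every few minutes rather than collected from a field data logger a week later.

```python
# Conceptual sketch: a slow polling loop over a high-latency satellite VPN.
# poll_voltage() is a hypothetical placeholder, not a real DNP3 library call.
import random
import time

SCAN_INTERVAL_S = 300   # hypothetical: poll each remote device every 5 minutes

def poll_voltage(device_address: str) -> float:
    """Placeholder for a DNP3 analog-input read; here it returns a simulated value."""
    return 345.0 + random.uniform(-2.0, 2.0)   # simulated kV reading

def scan_loop(device_addresses):
    while True:
        for addr in device_addresses:
            try:
                voltage = poll_voltage(addr)
                print(f"device {addr}: {voltage:.1f} kV")
            except OSError as exc:
                # A missed poll over the satellite link is tolerable at this scan rate.
                print(f"device {addr}: poll failed ({exc})")
        time.sleep(SCAN_INTERVAL_S)

if __name__ == "__main__":
    scan_loop(["line-12-north", "line-12-south"])
```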

The utility is able to access the data with an HTML browser and a connection to a public server providing remote desktop services (RDS) for each user. The RDS server is in turn running on a VM connected to the data acquisition server via another VPN.

What is significant about these examples is that public and private cloud architectures can both be the right choice for SCADA, but one size does not fit all.

In 2017, much of the discussion has focused on the Industrial Internet of Things (IIoT) and Industry 4.0. While Industry 4.0 extends beyond the scope of operations, within the OT space both concepts are driving toward cloud-based solutions. During the past few years, the IIoT discussion has centered on expanded data acquisition and big data analytics rather than supervisory control.

In 2014, it was difficult to find a consensus on what IIoT was and what its architecture would be. Over the past three years, a common architecture has emerged, one in which field sensors are connected to gateways that move the data to a public cloud, where it is analyzed and where individuals or software packages may access it through standard application program interfaces (APIs), though not yet standard protocols. New low-bandwidth networks are emerging from companies like Sigfox, alliances like the LoRa Alliance with LoRaWAN, and LTE and 5G cellular providers such as Verizon. What they have in common is the expectation that data rates will be slow, updating once a day or once an hour, and that data packets will be very small, on the order of bytes. Most are priced on the amount of data transmitted to the cloud.
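
As a rough illustration of this gateway-to-cloud pattern, the sketch below publishes a single small sensor reading to a cloud broker over MQTT, one common IIoT transport. It assumes the paho-mqtt package (1.x constructor); the broker host, topic and 8-byte payload layout are hypothetical.

```python
# Minimal sketch: an IIoT gateway pushing a small reading to a cloud broker
# over MQTT. Assumes the paho-mqtt 1.x API; broker, topic and payload
# layout are hypothetical.
import struct
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"       # hypothetical cloud endpoint
TOPIC = "site/area1/temperature"         # hypothetical topic

def publish_reading(temperature_c: float) -> None:
    # Pack the timestamp and value into 8 bytes, in keeping with networks
    # that price by the amount of data transmitted.
    payload = struct.pack(">If", int(time.time()), temperature_c)
    client = mqtt.Client()
    client.connect(BROKER_HOST, 1883)
    client.loop_start()
    info = client.publish(TOPIC, payload, qos=1)
    info.wait_for_publish()
    client.loop_stop()
    client.disconnect()

if __name__ == "__main__":
    publish_reading(21.7)   # e.g., one reading per hour
```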

For many SCADA and BMS applications, this allows for sensor integration that was not possible in the past. For example, a private LoRaWAN network can be constructed to integrate IIoT sensors without going to the cloud, instead directing the data from the gateway directly into the supervisory system.
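
One possible way to feed such a gateway's readings straight into the supervisory system, rather than into a cloud service, is shown below: a write to an OPC UA server exposed by the SCADA/BMS host. The article does not specify the integration method used; the python-opcua package, endpoint URL and node ID here are assumptions for illustration.

```python
# Minimal sketch: a gateway writing a LoRaWAN sensor reading directly into
# the supervisory system via OPC UA, bypassing the cloud. Assumes the
# python-opcua package; the endpoint and node ID are hypothetical.
from opcua import Client

SCADA_ENDPOINT = "opc.tcp://scada-host.local:4840"          # hypothetical
TEMPERATURE_NODE = "ns=2;s=Warehouse.Zone1.Temperature"     # hypothetical

def forward_reading(temperature_c: float) -> None:
    client = Client(SCADA_ENDPOINT)
    client.connect()
    try:
        node = client.get_node(TEMPERATURE_NODE)
        node.set_value(temperature_c)
    finally:
        client.disconnect()

if __name__ == "__main__":
    # In practice this value would come from the private LoRaWAN gateway.
    forward_reading(11.5)
```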

In the parlance of the IIoT, this is known as edge computing, as in the edge of the cloud. Cisco coined the term “fog computing” to recognize that some data processing and response needs to happen near the process rather than in the cloud. In the examples presented here, the control of the warehouse was essentially fog computing, even though it didn't make use of IIoT sensor data acquisition. The second example is clearly cloud computing.

Fog computing is known for providing location awareness and quality of service for real-time and streaming applications. It also accommodates greater heterogeneity because it connects directly to end-user devices and routers. Applications can include industrial automation, transportation and sensor networks. To most control engineers and SCADA vendors, this sounds very familiar.

As we develop the language of IIoT and incorporate it into the supervision of buildings, utilities, infrastructure, industrial processes, water and wastewater and other common SCADA verticals, it is clear that, as it has been throughout the history of our industry, one size does not fit all. Software vendors that supply platforms for SCADA, HMI and BMS will need to remain flexible and scalable, supporting deployment architectures that range from fog to cloud computing and incorporating IIoT data accessible from the cloud as well as from private IIoT networks that collect sensor data directly from a gateway device at the edge.

We have found that the location-services aspect of fog computing is essential to delivering the rapidly growing volume of data in a meaningful and actionable way to the users of the supervisory system.

A secure contextual mobility server, whether fog- or cloud-hosted, uses location services and user profiles to deliver relevant data to mobile devices based on each user's job responsibilities at their current location. This capability is essential if operators and maintenance teams are to drive down costs and increase the efficiency and reliability of supervisory control for smart buildings, the smart grid and emerging Industry 4.0 applications.
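
To make the idea concrete, the sketch below shows, in simplified form, the kind of filtering such a contextual mobility server might apply. The point names, locations and roles are hypothetical; a mobile user receives only the points relevant to their role at their current location.

```python
# Illustrative sketch: filtering supervisory points by user role and current
# location. All point names, locations and roles are hypothetical.
from dataclasses import dataclass

@dataclass
class Point:
    name: str
    value: float
    location: str
    roles: frozenset

POINTS = [
    Point("AHU1.SupplyTemp", 12.4, "cheese-cave", frozenset({"operations", "maintenance"})),
    Point("AHU1.FilterDeltaP", 310.0, "cheese-cave", frozenset({"maintenance"})),
    Point("Chiller2.Status", 1.0, "plant-room", frozenset({"maintenance"})),
]

def points_for_user(role: str, location: str, points=POINTS):
    """Return only the points matching the user's role and current location."""
    return [p for p in points if p.location == location and role in p.roles]

if __name__ == "__main__":
    for p in points_for_user("maintenance", "cheese-cave"):
        print(f"{p.name}: {p.value}")
```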

Ed Nugent has 24 years of experience with SCADA development and implementation and is currently the chief operating officer for PcVue Inc., a global, independent SCADA/HMI provider in Woburn, Massachusetts. He has a Bachelor of Science in engineering mechanics from the University of Wisconsin and a master’s in business administration from the University of Puget Sound. He is former president of the International Society of Automation’s Aloha Section and a member of the Western States Petroleum Association. Nugent is an author and editor for the University of Hawaii’s Pacific Center for Advanced Technology Training SMART Grid Curriculum Development project; an American Recovery and Reinvestment Act program of the U.S. Department of Energy. He is an industry advisor and instructor for the Process Technology (PTEC) Program within the University of Hawaii Office of Continuing Education & Workforce Development (OCEWD) program and was an associate professor of Operations Research at the University of Puget Sound.
