Augmented reality: Seeing HMI in a new way

June 12, 2023
Augmented reality user interfaces are now available to expand traditional HMI/SCADA so users have better options for obtaining operational information and controlling/interacting with their automation systems.

Available data — both the types and quantities — in today’s digitally connected facilities are expanding rapidly and are driving significant change to industrial automation and control systems environments. Visualization, automation and information platforms must progress to effectively manage and handle this data, and to put it into context for best usability.

Despite progress in human-centric design practices, traditional human-machine interface (HMI) and supervisory control and data acquisition (SCADA) systems are designed primarily to “control the process” rather than to present all relevant data in context, as would benefit a person in the physical or virtual plant environment. Augmented reality (AR), when applied to HMI/SCADA functionality, represents a user interface (UI) advancement, providing new capabilities to collect, contextualize and present data in meaningful new ways. AR does not necessarily replace HMI/SCADA; instead, it provides a new experience complementary to traditional methods.

AR as an HMI/SCADA approach is already useful for operators today, and it continues to improve. This article discusses what a “future-proof” AR UI looks like and which capabilities are available now.

An HMI that informs

One of the key HMI/SCADA requirements is to keep operators instantly informed about issues or needs in the facilities for which they are responsible, whether or not they are physically located there. Information must be provided in context so operators can readily understand it and take action.

In the same way that a heads-up display in some modern automobiles projects important information onto the windshield, augmented reality can superimpose a virtual interface on top of real-world assets. Industrial AR interfaces can run on industrial-grade tablets that operators carry, or on goggles they wear for hands-free operation.

This virtual interface offers several ways to change the way work is accomplished:

  1. AR can prompt operators or technicians to carry out step-by-step instructions, guiding them through the maintenance activity with support such as access to any required manufacturer’s manuals, diagrams, facility installation drawings and the associated work permits/stop points to verify that any necessary safety interlocks have been properly set.
  2. As employees change roles, or new employees join, AR can help train and guide them through work practices and procedures, teaching them the approved best practices, rather than on-the-job “word of mouth” with associated biases, shortcuts and loss of information.
  3. By providing access to view data originating behind safety barriers or inaccessible locations, without requiring an outage, AR increases safety by lowering the exposure risk.

To achieve these capabilities and realize its full potential, the AR UI requires an updated approach to automation system design.

Delivering AR

Hardware for an AR interface needs sufficient processing capability to process inputs, so it always knows where it is physically, along with sufficient bandwidth to connect and exchange data with other systems and personnel. These connections need to be versatile and simple, based on a suitable integrated development environment (IDE), and ready to integrate with existing operational technology (OT) control systems and IT architectures. This typically involves an architecture with a central project configuration and data server, working with AR applications running as clients (Figure 2).
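The central-server/client pattern described above can be sketched in a few lines. This is a minimal illustration only; the class and tag names (`DataServer`, `ARClient`, `pump-101`) are hypothetical and do not correspond to any specific vendor API.

```python
# Hypothetical sketch: a central project configuration and data server,
# with AR applications acting as clients that pull configuration and
# live tag values for the asset in front of the user.

class DataServer:
    """Central store for project configuration and real-time tag values."""
    def __init__(self):
        self._tags = {}      # tag id -> latest value
        self._config = {}    # asset id -> display configuration

    def publish(self, tag, value):
        self._tags[tag] = value

    def register_asset(self, asset_id, config):
        self._config[asset_id] = config

    def read(self, tag):
        return self._tags.get(tag)

    def asset_config(self, asset_id):
        return self._config.get(asset_id, {})


class ARClient:
    """AR tablet/headset app: builds an overlay from server data."""
    def __init__(self, server):
        self.server = server

    def overlay_for(self, asset_id):
        # Look up which tags the asset's overlay should show,
        # then fetch the current value for each one.
        cfg = self.server.asset_config(asset_id)
        return {tag: self.server.read(tag) for tag in cfg.get("tags", [])}


server = DataServer()
server.register_asset("pump-101", {"tags": ["pump-101.flow", "pump-101.pressure"]})
server.publish("pump-101.flow", 42.0)
server.publish("pump-101.pressure", 3.1)

client = ARClient(server)
print(client.overlay_for("pump-101"))
# {'pump-101.flow': 42.0, 'pump-101.pressure': 3.1}
```

In a real deployment the client would reach the server over standard OT connectivity (for example OPC UA or MQTT) rather than an in-process call, but the division of responsibility is the same: configuration and data live centrally, and the AR device stays a thin client.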

As implied earlier, the AR interface itself is hardware independent, relying instead on the IDE and standards to provide the required connectivity with the target data sources. The interface also requires some form of user input. This is normally a combination of gestures, including the familiar tap/swipe/pinch used with touchscreen tablets and phones, and voice commands, which may require some training to adjust for accents and pronunciation. In general, however, good industrial AR implementations are easy to use with little or no training required.

The need to keep the AR components small and lightweight for portability, yet able to run for extended periods, drives many of the decisions on how data is processed. Because the AR UI also needs to be spatially aware, it requires sufficient edge processing capability to support cognitive recognition, including artificial intelligence (AI) services that use image analysis to identify objects and establish the context for their supporting data (Figure 3). That context is then used to pull real-time data for the identified equipment from the control system and, depending on the application, other useful information such as the manuals and drawings relevant to the activity being performed.
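The recognize-then-contextualize flow can be illustrated as a small pipeline: an on-device recognition step maps a camera frame to an equipment identifier, which keys a lookup of live values and supporting documents. Everything here is a stub for illustration; the asset ids, tag names and document names are invented, and `recognize` stands in for a real image-analysis model.

```python
# Hypothetical sketch of the AR context pipeline: image -> asset id ->
# live control-system values plus supporting documents for that asset.

EQUIPMENT_DB = {
    "actuator-7": {
        "tags": ["actuator-7.position", "actuator-7.torque"],
        "documents": ["actuator-7-manual.pdf", "loop-drawing-A12.pdf"],
    }
}

# Stand-in for the control system's real-time values.
LIVE_DATA = {"actuator-7.position": "open", "actuator-7.torque": 18.5}


def recognize(image_bytes):
    """Placeholder for an edge AI model that maps a camera frame to an asset id."""
    return "actuator-7"


def contextualize(image_bytes):
    """Assemble the contextual payload the AR overlay would render."""
    asset_id = recognize(image_bytes)
    meta = EQUIPMENT_DB[asset_id]
    return {
        "asset": asset_id,
        "live": {tag: LIVE_DATA.get(tag) for tag in meta["tags"]},
        "documents": meta["documents"],
    }


print(contextualize(b"...camera frame..."))
```

The design point is that recognition runs at the edge (on the tablet or headset) while the data it keys into remains on the central server, keeping the wearable device light.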

Depending on the application or user and role of the individual using the AR system, the amount of data to be accessed, analyzed and displayed can vary considerably. This wide range of demands means all the system components need to be scalable, not just for the range of applications but also to provide additional capabilities as AR technology evolves and grows.

AR use cases

Knowing what a system should be able to do, and how it can actually be applied, are two different things. The following use cases demonstrate how AR is being used with different interfaces today.

Using a tablet as the AR interface presents a UI most operators are already familiar with, so the learning curve and incremental investment are reduced. Figure 3 shows an electric actuator with its operating state, as well as information on important attributes associated with the device’s ability to act when called upon to do so.

Another option is for operators to wear a heads-up display, such as the Microsoft HoloLens or another smart-glasses device. Using heads-up technology, a technician can be prompted and guided through an inspection procedure, with each step highlighting and illustrating the required work.

Upon acknowledgement that each step is complete, the next step follows, and so on through the complete approved procedure, which can include the associated maintenance records so they can be completed in real time rather than back in the shop later. The procedure also includes the proper steps to return the unit to its operating state, and it can include specific details such as replacing a gasket and the required torque on the bolts.
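The acknowledge-and-advance behavior described above is essentially a step sequencer that logs a maintenance record as it goes. The following minimal sketch shows that logic; the step names and record format are illustrative assumptions, not a real procedure library.

```python
# Hypothetical sketch of a guided-procedure sequencer: each acknowledged
# step is logged in real time, and the next step is surfaced to the
# technician's AR display.

class GuidedProcedure:
    def __init__(self, steps):
        self.steps = steps
        self.index = 0
        self.record = []   # maintenance record, built as work proceeds

    def current_step(self):
        """Step to display now, or None when the procedure is complete."""
        return self.steps[self.index] if self.index < len(self.steps) else None

    def acknowledge(self, technician):
        """Log completion of the current step and advance to the next."""
        step = self.current_step()
        if step is None:
            raise RuntimeError("procedure already complete")
        self.record.append({"step": step, "by": technician})
        self.index += 1
        return self.current_step()


proc = GuidedProcedure([
    "Verify lockout/tagout applied",
    "Replace gasket",
    "Torque bolts to specified value",
    "Return unit to operating state",
])
proc.acknowledge("tech-42")   # logs step 1; display now shows "Replace gasket"
```

Because the record is appended at the moment of acknowledgement, the completed maintenance log exists as soon as the last step is confirmed, with no after-the-fact paperwork.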

Because the AR interface is a client, it reflects the data it receives and can also support localized languages and other regional details. Users can access the data from a control room or right next to the equipment of interest (Figure 3). In addition to clearly identifying the unit and providing process information such as pump flow rate and pressure, an AR display can show motor details such as bearing temperatures, giving the operator the status of the process as well as the key variables affecting the health of the complete system.

AR UI as the new HMI/SCADA

AR can present more information to a user than ever before, with a high level of integrity and unprecedented contextualization. Users of an AR UI therefore become more effective in their jobs and can make better informed decisions, with greater safety and at lower risk to themselves and the process. AR can be deployed to support alarm signaling with instructions, HMI interactivity, guided maintenance, barcode/QR scanning with feedback, connectivity with other personnel and much more.

With the workforce transition underway — especially as workers are more likely to be remote/roving — combined with the increasing complexity of the systems being controlled and number of points being monitored, traditional HMI/SCADA can benefit from new ways of making actionable information fully available for users. The next generation worker requires a next generation HMI/SCADA. Augmented reality user interfaces represent this next generation HMI.

Silvia Gonzalez is the director of product management, software, for Emerson's controls and software business. Silvia is responsible for developing IIoT, industrial automation and controls technologies that bring increased value to customer operations. Silvia holds a bachelor’s degree in Electrical/Electronic Engineering from Universidad La Salle, Mexico, has received a Digital Business Strategy certificate from MIT, and is based in Houston, Texas.

Emerson

www.emerson.com
