Companies wanting to create a true digital transformation (DX) must determine the best ways to interconnect their intelligent assets and growing numbers of Internet of Things (IoT) devices. They often encounter challenges as they merge information technology (IT) and operational technology (OT) systems using varying forms of communications technology (CT). Greg Hookings, Director of Business Development – EMEA, Stratus Technologies, discusses.
While IT networks come in many shapes and sizes, they share many consistent elements. Such is not the case when a network must reach into the industrial edge where OT systems live and work. This industrial world can include anything from an auto assembly plant to a brewery, and typically consists of both legacy systems and cutting-edge devices. Office and commercial operations also often have field edges supporting building control and other mechanical systems.
Trying to integrate these edge elements with the core IT systems is challenging because of the nature of the equipment involved. Motors, conveyor belts, packaging lines, compressors, pumps, distillation towers – these things are controlled and monitored using systems often unfamiliar to IT personnel. Without understanding the nuances of OT networks, interconnection often results in a sub-optimal system. Yes, there’s connectivity, but network availability may be reduced, data storage can be inadequate and cyber security vulnerabilities might emerge. Eventually, these problems can bleed into the core networks, causing slowdowns and congestion.
Such situations stem from pursuing the wrong approach to OT asset management. All the specialised hardware and software supporting these functions can’t be strung together using a semiautomated provisioning and management approach. Success instead depends on an automated software-defined infrastructure overlay to handle management and orchestration of hardware and software assets.
Using best practices
IT architects approaching the OT world to establish this infrastructure, especially the edge computing elements, should follow three critical design best practices.
First, stick with a use-case driven approach. While conventional IT network designs are shifting more to mixed workload models, edge computing architecture is still driven by concepts as basic as the location of devices in a facility and the hardware form factor. Placing a hardware device in an area of a plant or building that is hot and dusty can have a major effect on its survivability. This may require asking whether a device should be embedded or stand-alone given the environment, and what duties it must perform. Is it interfacing directly with IT-, OT- or CT-related data? Does it communicate via wireless, ‘standard’ Ethernet or some industrial variant, USB or another interface? Answers to these questions will direct the selection of the proper compute platform and operating system.
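As an illustration only, the environmental and interface questions above can be captured as a simple selection rule. The field names, thresholds and form-factor labels below are illustrative assumptions, not drawn from any particular vendor's catalogue:

```python
from dataclasses import dataclass

# Hypothetical profile of an edge use case; every field here is an
# assumption made for illustration, mirroring the questions in the text.
@dataclass
class EdgeUseCase:
    ambient_temp_c: int   # expected ambient temperature at the install site
    dusty: bool           # is the location dusty or otherwise harsh?
    data_domain: str      # "IT", "OT" or "CT"
    interface: str        # "ethernet", "industrial-ethernet", "wifi", "usb"

def recommend_form_factor(uc: EdgeUseCase) -> str:
    """Map a use case to a form factor: harsh sites favour fanless,
    embedded hardware; benign sites can use stand-alone servers.
    The 40 degree cut-off is an arbitrary illustrative threshold."""
    if uc.dusty or uc.ambient_temp_c > 40:
        return "embedded-fanless"
    if uc.data_domain == "OT" and uc.interface == "industrial-ethernet":
        return "din-rail-industrial-pc"
    return "stand-alone-server"

# A hot, dusty plant floor points to an embedded, fanless device.
print(recommend_form_factor(EdgeUseCase(55, True, "OT", "industrial-ethernet")))
```

In practice such a decision would weigh many more factors, but even this sketch shows how use-case answers translate directly into platform choices.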
Second, consider the applications hosted on the edge platform. Working hand-in-hand with the concept of use-case centricity, it is important to consider which types of applications (IT, OT or CT) will be hosted in a particular part of the infrastructure and how they will be deployed. Virtualisation, through virtual machines and containers, can bring core-like functionality to the edge while supporting concurrency and sharing. Additionally, the infrastructure must support the types of communication capabilities – wired, Wi-Fi, cellular or industrial wireless – needed to provide connectivity between the edge and the core.
Third, designers must meet service-level objectives. Designing infrastructure capable of delivering exactly the right mix of computing and data persistence resources from start-up is difficult at best. Nonetheless, the infrastructure must support the use-case-specific applications identified earlier. Success in this situation depends on building elasticity and scalability into the infrastructure so computing resources and applications can be added or removed as necessary. This calls for a mix of core/cloud, fog and edge-only application deployments, which may need to be adjusted as operational experience is gained.
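One way to picture the core/cloud, fog and edge split is as a placement rule driven by an application's service-level needs. The following sketch is a simplified assumption: the thresholds are illustrative, not drawn from any standard, and a real orchestrator would also consider cost, data gravity and availability targets:

```python
def place_application(max_latency_ms: float, data_rate_mbps: float) -> str:
    """Illustrative placement rule: latency-critical workloads stay at
    the edge, moderate ones in a fog (on-premises) tier, and the rest
    in the core or cloud. Both thresholds are assumptions."""
    if max_latency_ms < 10:
        return "edge"          # e.g. machine control loops
    if max_latency_ms < 100 or data_rate_mbps > 500:
        return "fog"           # e.g. local analytics on high-rate sensor data
    return "core/cloud"        # e.g. historical reporting, model training

# A control loop needing sub-10 ms response must run at the edge.
print(place_application(max_latency_ms=5, data_rate_mbps=1))
```

Because placement is a function of measurable requirements rather than a fixed wiring diagram, deployments can be re-evaluated and moved between tiers as operational experience accumulates.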
Navigate the data
Initial forays into OT can leave IT architects swimming in data from hundreds or even thousands of sensors and endpoints. If the practices suggested so far have been implemented, management and analysis will be a much easier task. A hybrid data management framework incorporating a data lake platform for combining structured and unstructured data can avoid a flood. It can support effective management of raw sensor and device data, with initial analysis directing it to the appropriate part of the core.
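The 'initial analysis directing it to the appropriate part of the core' step can be sketched as a simple triage function. This is a minimal sketch under assumed destination names: well-formed telemetry goes to a structured store, while anything unparseable lands in the data lake untouched for later analysis:

```python
import json

def route_record(raw: str) -> str:
    """Triage an incoming edge record. Destination names are
    illustrative placeholders, not real system paths."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        # Unstructured or corrupt payloads are preserved as-is.
        return "data-lake/raw"
    if {"sensor_id", "timestamp", "value"} <= record.keys():
        # Complete telemetry can go straight to a structured store.
        return "warehouse/telemetry"
    # Parseable but incomplete records wait in the lake for enrichment.
    return "data-lake/semi-structured"

print(route_record('{"sensor_id": "pump-7", "timestamp": 1700000000, "value": 3.2}'))
```

Routing at ingest keeps the structured core lean while ensuring no raw data is discarded, which is the essence of the hybrid data lake approach described above.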
Extending connectivity to the edge can introduce a host of security issues since typical industrial equipment was designed with little protection in mind. Industrial equipment also has a long lifespan, making the problem worse. Anyone designing edge systems must understand the need for a multipronged approach to remediate these deficiencies, using security appliances where needed combined with secure applications and operating systems. Network activity must be monitored continuously for any kind of physical, network, application or data breach.
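Continuous monitoring of this kind often starts from a rolling baseline of normal activity per category. The toy monitor below is an assumption-laden sketch, not a production intrusion-detection system: the window size and the 3x deviation factor are arbitrary illustrative choices:

```python
from collections import deque

class ActivityMonitor:
    """Toy continuous monitor: keeps a rolling window of event counts
    per category (physical, network, application, data) and flags a
    category whose latest count far exceeds its recent average.
    The 3x factor and window size are illustrative assumptions."""

    def __init__(self, window: int = 20, factor: float = 3.0):
        self.window = window
        self.factor = factor
        self.history = {}  # category -> deque of recent counts

    def observe(self, category: str, count: int) -> bool:
        """Record a new count; return True if it should raise an alert."""
        hist = self.history.setdefault(category, deque(maxlen=self.window))
        alert = bool(hist) and count > self.factor * (sum(hist) / len(hist))
        hist.append(count)
        return alert

mon = ActivityMonitor(window=5)
for _ in range(5):
    mon.observe("network", 10)   # establish a quiet baseline
print(mon.observe("network", 100))  # a sudden spike trips the alert
```

Real deployments would layer far richer detection on top, but the principle of baselining each breach category separately carries over directly.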
Any solution for communicating with and securing the edge requires more than bolting on accessories. Benefitting from the data and information at the edge calls for a systematic approach following proven best practices to deploy edge platforms and manage the resulting data flows.