Development of Edge Computing-Based Products in the AI Era – Challenges and Trends

July 8, 2024

The computing industry is changing constantly and at an accelerating pace. Technologies that have emerged in recent years continue to reshape how technological products are developed, deployed, and managed. Among these changes, the impact of the cloud computing revolution and the meteoric advancement of AI technologies stands out, and the two revolutions are interconnected. The cloud revolution has changed the way new software products are developed in every aspect: architecture, development infrastructure, data engines and connectivity products, information security management, and more. The AI revolution is completely changing the approach to solving complex technological problems such as computer vision (image and video understanding), natural language processing, signal analysis, and analytics.

Companies developing technology-rich products seek to harness these changes to reduce risk, shorten development times, and enable competitive innovation in their products. However, when a product is performance-sensitive and must interact with physical assets such as cameras, network and security products, machine controllers, or existing local computing systems, the architecture can no longer be purely cloud-based. The need to perform significant computing tasks close to those physical assets has driven the trend of integrating substantial edge computing into such products.

In recent years, edge computing has become an accepted technological approach: data is processed close to its source, at the network edge, rather than sending everything to the cloud or a remote data centre. In an era where data volumes are growing rapidly and AI both demands and enables real-time responses, edge computing becomes the ideal solution. Local processing allows immediate interaction with physical assets and faster response times, reduces infrastructure load, improves information security, and enables offline operation. However, edge computing presents new challenges, such as the complex management of numerous devices, cybersecurity threats, and adapting hardware and software solutions to distributed environments.

Despite these challenges, edge computing has become a central technology in the era of AI and cloud computing thanks to its numerous advantages. A major challenge in operating edge computing is fleet management: how to manage hardware, firmware, and software configurations across a large, distributed installation of edge computers. This management is more complex than it first appears. At first glance, one might treat such systems like network appliances. In those products, such as switches and routers, the software is a single, unified unit that is updated rarely, and each update replaces the entire software stack: the operating system, management components, and any other required software. The common pattern is a dual-slot (A/B) scheme with two storage areas: if slot A is active, the update writes the new software image into slot B, and on the next boot slot B becomes the active slot.
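As a rough illustration, the following Python sketch shows the device-side logic of such a dual-slot update. The slot paths, state file, and image format are illustrative assumptions, not any particular vendor's implementation.

```python
import json
import shutil
from pathlib import Path

# Hypothetical locations: a state file recording which slot boots next,
# and two storage areas each holding a complete software image.
STATE_FILE = Path("/data/boot_state.json")
SLOTS = {"A": Path("/slots/A"), "B": Path("/slots/B")}

def active_slot() -> str:
    return json.loads(STATE_FILE.read_text())["active"]

def install_full_image(image_archive: Path) -> None:
    """Write a complete new software stack into the inactive slot and
    mark that slot to become active on the next boot."""
    current = active_slot()
    target = "B" if current == "A" else "A"

    # The running system in the active slot is never touched, so a failed
    # update still leaves the device bootable from the old slot.
    shutil.unpack_archive(str(image_archive), extract_dir=str(SLOTS[target]))

    STATE_FILE.write_text(json.dumps({"active": target, "previous": current}))
    # A reboot (not shown) then brings up the new slot; the previous slot
    # is kept as a fallback in case the new image fails to boot.
```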

Managing edge computers that run modern software with such a technique is painful and expensive. Several usage patterns of modern software and AI products make atomic updates (those that replace the entire system) unreasonable. AI computing is model-based, and models change far more frequently than the rest of the system, so replacing everything just to upgrade a model makes little sense.
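To make the contrast concrete, here is a hedged Python sketch of a model-only update on an edge device: just the model artifact is downloaded, verified, and switched in, while the operating system and the rest of the stack stay untouched. The paths, the ONNX file name, and the checksum handling are assumptions for illustration.

```python
import hashlib
import urllib.request
from pathlib import Path

# Hypothetical layout: versioned model directories plus an "active" symlink
# that the inference service reads when it (re)loads its model.
MODELS_DIR = Path("/opt/edge/models")
ACTIVE_LINK = MODELS_DIR / "active"

def fetch_model(url: str, expected_sha256: str) -> Path:
    """Download a new model version and verify its integrity."""
    version_dir = MODELS_DIR / expected_sha256[:12]
    version_dir.mkdir(parents=True, exist_ok=True)
    blob = version_dir / "model.onnx"
    urllib.request.urlretrieve(url, blob)
    if hashlib.sha256(blob.read_bytes()).hexdigest() != expected_sha256:
        raise ValueError("model checksum mismatch; refusing to activate")
    return version_dir

def activate(version_dir: Path) -> None:
    """Atomically repoint the 'active' symlink. Only the inference service
    reloads its model; nothing else on the device is restarted."""
    tmp = ACTIVE_LINK.with_suffix(".tmp")
    if tmp.is_symlink() or tmp.exists():
        tmp.unlink()
    tmp.symlink_to(version_dir, target_is_directory=True)
    tmp.replace(ACTIVE_LINK)  # atomic rename on POSIX filesystems
```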

Additionally, modern software products rely on technologies such as Kubernetes, container-based computing, and microservices. In these products a complete upgrade is never performed; instead, the container image is replaced and the microservice it backs is restarted. Modern architectures encourage frequent small improvements rather than rare, sweeping changes, so replacing the entire system for each micro-upgrade makes no sense. Furthermore, a full upgrade is lengthy and requires restarting the entire system, which can be operationally costly and carries higher risk.
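In a Kubernetes-based edge stack, such a container-level micro-update can be as small as patching one Deployment's image; Kubernetes then replaces only that microservice's pods. The sketch below uses the official Kubernetes Python client; the deployment, container, and image names are made-up examples.

```python
from kubernetes import client, config

def update_microservice_image(deployment: str, container: str,
                              image: str, namespace: str = "default") -> None:
    """Patch only the image of one container in one Deployment."""
    config.load_kube_config()  # or config.load_incluster_config() on the device
    apps = client.AppsV1Api()
    patch = {
        "spec": {
            "template": {
                "spec": {
                    "containers": [{"name": container, "image": image}]
                }
            }
        }
    }
    # The strategic-merge patch touches only this container's image field;
    # the Deployment's rolling update then restarts its pods one by one,
    # while every other workload on the device keeps running.
    apps.patch_namespaced_deployment(name=deployment, namespace=namespace, body=patch)

# Example invocation (hypothetical names):
update_microservice_image("video-analytics", "inference",
                          "registry.local/inference:1.4.2")
```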

The solution comes from the field of fleet-management software, marketed either as dedicated products or as part of broader IoT solutions. To address these challenges, modern products for managing large, distributed fleets of edge computing units add the ability to update and control individual components such as containers and AI models, on top of the classic capabilities of managing distributed edge devices: monitoring, initial deployment, and log collection. Since micro-updates require supporting infrastructure at the operating system level, such products are no longer mere management tools but complete software platforms for the edge. These platforms are offered either by edge computing hardware manufacturers, supporting only their own devices but including orderly management of the firmware versions they control, or as generic products that manage everything from the operating system level upwards.
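The device-side half of such a platform is typically a small agent that reconciles the device against a desired-state manifest and updates only the components that have drifted. The following Python sketch is an illustrative assumption of how that reconciliation loop might look; the manifest endpoint, its format, and the apply helpers are hypothetical, not a specific product's API.

```python
import json
import urllib.request

# Hypothetical manifest endpoint served by the fleet-management backend.
MANIFEST_URL = "https://fleet.example.com/devices/edge-0042/desired-state"

def fetch_desired_state() -> dict:
    with urllib.request.urlopen(MANIFEST_URL) as resp:
        return json.load(resp)

def current_state() -> dict:
    # A real agent would query the local runtimes (container engine,
    # model store, firmware info); hard-coded here for illustration.
    return {"containers": {"inference": "1.4.1"},
            "models": {"detector": "2024-06-01"}}

def reconcile() -> None:
    desired, installed = fetch_desired_state(), current_state()
    for kind in ("containers", "models"):
        for name, version in desired.get(kind, {}).items():
            if installed.get(kind, {}).get(name) != version:
                # Only the drifted component is updated; everything else keeps
                # running, so no full-system reflash or reboot is needed.
                print(f"updating {kind[:-1]} {name} -> {version}")
                # apply_container(...) / apply_model(...) would be called here.

if __name__ == "__main__":
    reconcile()
```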

Original source: https://www.pc.co.il/editorial/411796/