MX Grid Complements MXIE
NEWS
At the end of April 2024, Nokia launched MX Grid, a distributed edge computing platform that provides the orchestration and execution of embedded Machine Learning (ML) models and applications on “far edge” devices. The far edge usually refers to the place where data are generated, namely the sensor node or machine. In this case, Nokia’s MX Grid appears to target the sensor aggregation point, one remove from where the data are generated, running on “micro-edges” (gateways or industrial PCs) that collect data from machines, sensors, and video systems and provide local analysis close to the source. Suggested use cases include process monitoring and predictive maintenance, which ABI Research has identified in recent research as a fast-growing application for embedded ML, with a forecast 116 million ML-enabled hardware shipments targeting Condition-Based Monitoring (CBM) applications by 2029.
MX Grid is designed to complement and extend Nokia’s Mission-Critical Industrial Edge (MXIE), a platform launched in 2021 that provides industrial connectors, edge node management, application orchestration and execution, and a third-party application marketplace for “thick edge” hardware, such as Dell PowerEdge servers. MXIE targets industrial markets such as manufacturing, energy, and logistics, and its application partners include Crosser, Litmus Automation, and Microsoft.
This being Nokia, there is a strong networking flavor to these products. Like other networking vendors such as Dell, Advantech, and Cisco, Nokia designs its platform to provide an out-of-the-box, fully integrated ecosystem that simplifies industrial deployments; Nokia’s addition is to use MXIE to support the deployment of its private Long Term Evolution (LTE) and 5G networks. Similarly, Nokia’s micro-edges will be connected “by private wireless networks and/or reliable Wi-Fi using MX Boost.”
Growth of Edge Application Platforms
IMPACT
Edge processing in industrial markets usually refers to deployments on servers or on-premises data centers. Increasingly, as ML compression becomes more advanced, edge-first platforms are emerging that specialize in deploying and managing ML models and applications across distributed edge devices. These platforms frequently target not only the thick edge, but also the thin and far edge, providing the orchestration, execution, and supervision of applications and ML models running across distributed infrastructure. One of the key benefits of a distributed approach is that it creates a data processing hierarchy, feeding data back to a higher-level system on a need-to-know basis: where a higher-level system does not “need to know” what is happening at lower levels, action is taken locally or data are not transmitted upstream, which reduces both the amount of information these systems must process and the processing latency.
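To make the need-to-know pattern concrete, the following is a minimal, generic sketch of a micro-edge loop that acts locally and forwards only compact anomaly events upstream. It is not based on Nokia’s APIs; every function and field name here (read_sensor, local_model, vibration_rms, the 0.7 threshold) is a hypothetical placeholder for illustration.

```python
# Illustrative sketch of "need-to-know" filtering at a micro-edge node.
# All names below are hypothetical placeholders, not Nokia MX Grid APIs.
from dataclasses import dataclass
import random
import time


@dataclass
class Reading:
    sensor_id: str
    vibration_rms: float  # example signal used for condition-based monitoring


def read_sensor() -> Reading:
    # Placeholder for a fieldbus/OPC UA read performed on the gateway.
    return Reading(sensor_id="pump-07", vibration_rms=random.gauss(2.0, 0.6))


def local_model(reading: Reading) -> float:
    # Placeholder for an embedded ML model; returns an anomaly score in [0, 1].
    return min(1.0, max(0.0, (reading.vibration_rms - 2.0) / 2.0))


def run_micro_edge_loop(threshold: float = 0.7) -> None:
    """Act locally; only forward events a higher tier needs to know about."""
    for _ in range(10):
        reading = read_sensor()
        score = local_model(reading)
        if score >= threshold:
            # Higher tier "needs to know": send a compact event, not the raw stream.
            print(f"UPSTREAM event: {reading.sensor_id} anomaly score {score:.2f}")
        # Otherwise the raw data stay at the edge, reducing backhaul and latency.
        time.sleep(0.1)


if __name__ == "__main__":
    run_micro_edge_loop()
```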
It is within this context that Nokia’s MX Grid is important. Building models and compressing them to fit on different types of edge hardware is increasingly straightforward; it is the deployment and maintenance of models in a production setting that is the challenge vendors across the Industrial Internet of Things (IIoT) stack are looking to solve today. This is where orchestration platforms and end-to-end solutions come into play, partnering with embedded ML model builders to close the Machine Learning Operations (MLOps) pipeline by providing an integrated management environment for ML deployments across various levels of the edge. In Nokia’s case, MX Grid Manager provides this functionality, acting “as a central hub for device, software, application, and model management.” MX Grid Manager coordinates ML deployments across MXIE, as well as across the micro-edges deployed closer to where the data are collected.
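As a rough illustration of what an orchestration layer spanning multiple edge tiers has to track, the sketch below defines a hypothetical deployment record covering thick-edge servers and micro-edges. The Deployment and Target classes and their fields are assumptions made for illustration only; they do not represent MX Grid Manager’s actual data model or interfaces.

```python
# Illustrative sketch of a multi-tier ML deployment record; names are hypothetical
# and do not reflect Nokia MX Grid Manager's data structures.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Target:
    node_id: str
    tier: str  # "thick-edge" (on-premises server) or "micro-edge" (gateway/IPC)


@dataclass
class Deployment:
    model_name: str
    model_version: str
    targets: List[Target] = field(default_factory=list)

    def rollout_plan(self) -> List[str]:
        # A central manager would push the model artifact and runtime config to
        # each node, then track health and version drift as part of the MLOps loop.
        return [
            f"push {self.model_name}:{self.model_version} -> {t.node_id} ({t.tier})"
            for t in self.targets
        ]


if __name__ == "__main__":
    cbm = Deployment(
        model_name="bearing-anomaly",
        model_version="1.3.0",
        targets=[
            Target(node_id="site-a-server-01", tier="thick-edge"),
            Target(node_id="line-3-gateway-07", tier="micro-edge"),
        ],
    )
    print("\n".join(cbm.rollout_plan()))
```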
As the Ecosystem Develops, Develop Your Ecosystem
RECOMMENDATIONS
Embedded ML is one of the key developing technologies of interest to IIoT suppliers and customers, and technology suppliers targeting industrial customers are all looking to understand the role they can play in this market. As a result, the market is becoming increasingly competitive; in addition to traditional industrial application platforms, edge-first ML application platforms like MicroAI and Stream Analyze, orchestration specialists like Barbara and ZEDEDA, and networking hardware vendors like Advantech and Dell are all launching tools to support customers as they move ML deployments to production. In particular, given the higher data demands of modern industrial environments, industrial technology suppliers should consider how they can offer a multi-edge environment that gives their customers both a deep processing hierarchy and broad access to specialist tools and applications.
While the market has previously been dominated by incumbents like Amazon Web Services (AWS), new technology capabilities and new vendors are creating a patchwork of specialist tools focused on automating the ML pipeline, rather than relying on an expensive, consultative process. Some of these vendors will succeed and some will not, but the coming months and years will be challenging for technology suppliers and customers alike to navigate as they look to understand who provides best-in-class capabilities at different levels of the ML value chain. Interestingly, no new partnerships were highlighted in Nokia’s MX Grid announcement. Such partnerships will be critical for both Nokia and the broader embedded ML ecosystem in the coming year if they are to demonstrate their competitiveness.