Data Gravity and New-Generation Workloads Force Shift from Traditional IT Infrastructure
NEWS
Large amounts of data tend to attract more data, a phenomenon commonly known as data gravity. Data gravity shapes the design and architecture of a data center, as well as how data is stored, managed, and accessed. It also influences where businesses choose to place their data and how that data is shared with external parties. Banks, for example, face data gravity challenges when data from ATMs and mobile banking must be consolidated from many different locations.
According to ABI Research’s Network Technology and Market Tracker, worldwide 5G mobile data traffic is expected to reach 1,676 Exabytes (EB) in 2026, growing at a Compound Annual Growth Rate (CAGR) of 63%. 5G is designed to connect any device, person, or platform. This increase in connectivity generates large amounts of data, which in turn increases the demand for computing capability and infrastructure capacity.
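To put the growth rate in perspective, the following minimal Python sketch back-calculates the traffic implied for earlier years by the report's 2026 figure and 63% CAGR. The 2021 starting year shown in the loop is an assumption chosen for illustration, not an ABI Research figure.

```python
# Illustrative only: back-calculates the prior-year 5G traffic implied by the
# 2026 figure (1,676 EB) and a 63% CAGR. The 2021 start year is an assumption.

TRAFFIC_2026_EB = 1676.0   # worldwide 5G mobile data traffic in 2026 (Exabytes)
CAGR = 0.63                # compound annual growth rate

def implied_traffic(year: int, end_year: int = 2026,
                    end_value: float = TRAFFIC_2026_EB, cagr: float = CAGR) -> float:
    """Traffic implied for `year` if growth compounds at `cagr` through `end_year`."""
    return end_value / (1 + cagr) ** (end_year - year)

for year in range(2021, 2027):
    print(f"{year}: {implied_traffic(year):7.1f} EB")
```

Under these assumptions, traffic roughly doubles every one to two years, which is the compounding effect that drives the infrastructure demand described above.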
Digital transformation has become the foundation for private enterprises seeking to deliver excellent customer experiences. Many of these enterprises have built new digital applications and services. These solutions bring new-generation workloads in the form of Artificial Intelligence (AI) and Machine Learning (ML), algorithm-based recommendation systems, the Internet of Things (IoT), blockchain, and more, all of which have different requirements from traditional IT workloads.
DPUs Enable Better Performance for Digital Transformation, 5G Deployments, and AI/ML Workloads
IMPACT
The growth of data driven by digital transformation, the proliferation of 5G services, and the increasing integration of AI/ML workloads puts a huge strain on the data centers processing these data-intensive, new-generation workloads. The introduction of intelligent accelerators such as DPUs gives data center providers and private enterprises an alternative solution for managing and optimizing these new workloads.
Here are some use cases that can accelerate DPU adoption in the data center:
1. Offloading the UPF from the 5G packet core.
Within the 5G packet core, the User Plane Function (UPF) carries the heaviest workload, being responsible for packet inspection, routing, and forwarding. Communications Service Providers (CSPs) can use DPUs to offload UPF processing from server Central Processing Units (CPUs), thus improving UPF performance. From an energy consumption perspective, the server CPU consumes less energy because of the offload, which translates into significant power savings across a large data center; a rough model of this saving is sketched after this list.
2. Accelerating cloud-native AI development.
The training process for AI and ML algorithms involves huge datasets and can be very compute-intensive. Red Hat’s partnership with Nvidia, combining OpenShift with the BlueField-2 DPU, enables enterprises to accelerate the development and deployment of cloud-native AI applications by using the DPU to offload container management functions, leaving the server CPU free to run AI-specific workloads.
3. Enhancing virtual networking and security functions.
DPUs are also used to offload network security functions, accelerating virtual networking and security features such as load balancing, firewalls, network isolation, and remote management. DPUs also provide network visibility and tracing, which is important in the event of a network security breach: security engineers can zoom in on the point of attack and isolate the incident for a better response and future planning.
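The power-saving argument for offload (referenced in use case 1 above) can be made concrete with a simple back-of-envelope model. The sketch below is illustrative only; every parameter value is an assumption chosen for the example, not a measured figure from ABI Research or any vendor, and the sign of the net saving depends entirely on those assumptions.

```python
# Rough, illustrative model of the net power saving from moving a share of
# CPU work (e.g., UPF or virtual networking functions) onto DPUs.
# All parameter values are assumptions for illustration.

SERVERS = 1000                 # servers in the data center (assumed)
CPU_WATTS_PER_SERVER = 300     # CPU power draw per server under load (assumed)
OFFLOADED_CPU_SHARE = 0.25     # share of CPU work moved to the DPU (assumed)
DPU_WATTS = 60                 # additional power drawn by each DPU (assumed)

def net_power_saving_kw(servers: int = SERVERS) -> float:
    """Net facility-level saving in kW: CPU power removed minus DPU power added."""
    cpu_saving = servers * CPU_WATTS_PER_SERVER * OFFLOADED_CPU_SHARE
    dpu_cost = servers * DPU_WATTS
    return (cpu_saving - dpu_cost) / 1000.0

if __name__ == "__main__":
    print(f"Estimated net saving: {net_power_saving_kw():.1f} kW")
```

With these assumed numbers, the 1,000-server facility saves roughly 15 kW net; the point of the sketch is simply that the saving scales with fleet size and with the share of work that can be offloaded.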
Data Centers Lead the Way but Further Technological Advancements Expected for DPUs
RECOMMENDATIONS
The current use cases for DPUs revolve around large data centers, with cloud hyperscalers and CSPs seeing the most compelling benefits from using DPUs. Because DPUs are designed to accelerate new-generation workloads that create and consume large amounts of data, cloud hyperscalers and CSPs are actively looking for solutions that can optimize and improve the performance of these data- and compute-intensive workloads.
However, this does not mean that private enterprises cannot benefit from using DPUs. Enterprises typically run servers in an on-premises data center or private cloud that hosts business-critical applications, and these applications require basic storage, network, and security functions. DPUs can be used to increase the performance of these enterprise servers by offloading such auxiliary functions, leaving the server CPU to compute application-specific workloads.
DPUs are still evolving rapidly. The industry is in its infancy, with vendors such as AMD, Broadcom, Intel, Marvell, and Nvidia all offering products and solutions that fall under the DPU category but vary in architecture, functionality, and target customer market. As the DPU market continues to grow, ABI Research expects a further leap in the technological capability of DPUs beyond the current storage, network, and security functions.