
On-Device Generative AI Expected to Drive Heterogeneous AI Chipset Shipments to Over 1.8 Billion by 2030

Distributing workloads across heterogeneous processor architectures proves more effective for generative, multimodal, productivity-focused on-device AI applications

21 Feb 2024

Generative Artificial Intelligence (AI) workloads have moved beyond the bounds of cloud environments and can now run on-device, supported by heterogeneous AI chipsets. Combined with an abstraction layer that efficiently distributes AI workloads across processing architectures, and with compressed Large Language Models (LLMs) of under 15 billion parameters, these chipsets enable enterprises and consumers to run generative AI inferencing locally. Consequently, ABI Research, a global technology intelligence firm, estimates that worldwide shipments of heterogeneous AI chipsets will exceed 1.8 billion by 2030 as laptops, smartphones, and other form factors increasingly ship with on-device AI capabilities.

“Cloud deployment will act as a bottleneck for generative AI to scale due to concerns about data privacy, latency, and networking costs. Solving these challenges requires moving AI inferencing closer to the end user – this is where on-device AI has a clear value proposition, as it eliminates these risks and can more effectively scale productivity-enhancing AI applications,” says Paul Schell, Industry Analyst at ABI Research. “What’s new is the generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between the CPU, GPU, and NPU. Qualcomm, MediaTek, and Google were the first movers in this space, as all three are producing chipsets that run LLMs on-device. Intel and AMD lead in the PC space.”

Hardware alone will not be enough. Building a solid on-device AI value proposition requires strong partnerships between hardware and software players to create unified offerings. These collaborations will nurture the development of productivity-focused applications deployed on-device. ABI Research expects this to spur demand and shorten replacement cycles for end devices like smartphones and PCs, accelerating shipment growth between 2025 and 2028 as the software ecosystem matures and breathing new life into markets that have been stagnating. Automotive and edge server markets will also be affected, but to a lesser extent.

The productivity AI applications running on-device, powered by heterogeneous AI chipsets, will drive significant market growth in personal and work devices. This is reflected in the increasing penetration of heterogeneous AI chipsets, which will eventually encompass most systems toward the end of the decade. “Chip vendors and OEMs should look to expand the productivity AI application ecosystem to tempt more customers and mature the offering. This will create opportunities analogous to the growth previously spurred by the expansion of Android and web-based applications in their respective markets, and it will require reaching a critical mass of applications that appeal to a broad range of customers in consumer and enterprise markets. Success in creating popular and useful applications could make or break the transition to on-device AI,” Schell concludes.

These findings are from ABI Research’s Opportunities for Heterogeneous Computing: General & Generative AI at the Edge application analysis report. This report is part of the company’s AI & Machine Learning research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Application Analysis reports present an in-depth analysis of key market trends and factors for a specific technology.

Contact ABI Research

Media Contacts

Americas: +1.516.624.2542
Europe: +44.(0).203.326.0142
Asia: +65 6950.5670
