How Chipset Choices May Break a 5G Vendor Strategy
NEWS
As Communication Service Providers (CSPs) start to rapidly roll out their 5G networks, Original Equipment Manufacturers (OEMs) are developing aggressive strategies to grab 5G equipment market share. In comparison to LTE, 5G promised a 1,000x increase in capacity, more than 100 billion connections, and less than 1 millisecond of latency. One of the key factors in successfully achieving these targets is developing new baseband modem architectures and advanced radio propagation technologies that can handle such stringent requirements. Although processing capabilities continue to advance, 5G New Radio (NR) places an additional burden on infrastructure that previous-generation equipment cannot handle. For example, the 5G radio network now requires massive signal processing capability in base station chipsets. The Field-Programmable Gate Array (FPGA), with its fast processing speed and reconfigurability, has been deemed a promising solution and is used by certain companies in their 5G Radio Access Network (RAN) infrastructure to handle complex signal processing algorithms.
These inherent merits of the FPGA are the main motivation for companies to design their RAN infrastructure around such chipsets, allowing them to enjoy rapid 5G deployment and early-stage algorithm design changes. However, according to its Q3 2019 earnings call, Nokia’s 5G profit margins were dampened by the high cost of its 5G FPGA chipsets, which feature heavily in its first-generation 5G products. Rajeev Suri, Nokia’s CEO, attributed the FPGA decision to several factors. He said that, at the time of the decision, Nokia was dealing with the integration of Alcatel-Lucent, and the FPGA seemed like the best choice for time-to-market. He also mentioned that one supplier let Nokia down. To reverse this negative trend, Suri shared that Nokia is increasing investment in System on Chip (SoC) capabilities and moving aggressively to strengthen and diversify its supplier base; the shift began in 2018, and he expects Nokia’s position to be much better by the end of 2020. This is perhaps the most prominent example of how the wrong chipset choice can heavily impact a vendor’s position and competitive strength.
How 5G Chipset and Base Station Design Differs
IMPACT
Is the FPGA-based 5G hardware decision made by Nokia really “short-sighted”? Before answering this question, it is necessary to reiterate 5G requirements and RAN architectures. As mentioned previously, 5G not only provides high data rates, it also embraces massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC). However, at the time of this writing, only 5G enhanced Mobile Broadband (eMBB) has been frozen in Release 15, which means that the full specifications for the other two scenarios are not yet frozen and certainly not ready for commercial deployment.
In any case, baseband processing plays the key role in 5G RAN development. To support the 1,000x increase in capacity, exploiting additional radio channel resources through multiple antenna techniques, e.g., massive Multiple Input, Multiple Output (MIMO), will be crucial for future 5G networks. In this case, complex beamforming designs and digital predistortion algorithms require baseband units equipped with high processing capability. Compared with general-purpose processors, e.g., the Central Processing Unit (CPU) and the Graphics Processing Unit (GPU), the FPGA, with its direct hardware configuration, can provide high computing performance at lower latency and energy cost. Such a solution is well suited to baseband signal processing.
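To make the scale of this workload concrete, the following is a minimal sketch, in Python, of the per-subcarrier matrix arithmetic behind a zero-forcing precoder, one common massive MIMO beamforming approach. The antenna count, user count, carrier configuration, and slot duration are illustrative assumptions, not figures from any vendor’s design.

```python
# Illustrative sketch only: the per-slot matrix math behind zero-forcing
# precoding for massive MIMO. Dimensions and numerology are assumptions.
import numpy as np

M, K = 64, 16              # assumed: 64 base station antennas, 16 single-antenna users
N_SUBCARRIERS = 3276       # assumed: 100 MHz NR carrier at 30 kHz subcarrier spacing
SLOT_SECONDS = 0.0005      # assumed: 0.5 ms slot for the 30 kHz numerology

rng = np.random.default_rng(0)

def zero_forcing_precoder(H: np.ndarray) -> np.ndarray:
    """Return the zero-forcing precoding matrix W (M x K) for a channel H (K x M)."""
    # W = H^H (H H^H)^-1: a K x K complex matrix inversion per subcarrier, every slot.
    return H.conj().T @ np.linalg.inv(H @ H.conj().T)

# One Rayleigh-fading channel realization per subcarrier.
H_all = (rng.standard_normal((N_SUBCARRIERS, K, M)) +
         1j * rng.standard_normal((N_SUBCARRIERS, K, M))) / np.sqrt(2)

precoders = np.stack([zero_forcing_precoder(H) for H in H_all])
print(precoders.shape)                                   # (3276, 64, 16)
print(f"~{N_SUBCARRIERS / SLOT_SECONDS:,.0f} precoder updates per second")
```

Even this simplified example implies millions of small complex matrix inversions per second, the kind of fixed, highly parallel arithmetic that maps naturally onto FPGA or ASIC fabric rather than a general-purpose processor.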
An alternative solution for baseband signal processing is what is called an Application-Specific Integrated Circuit (ASIC): a customized chipset, optimally designed to improve energy efficiency and reduce cost per unit in comparison to the FPGA. However, it also requires very large upfront non-recurring engineering costs, and the product development cycle is quite long, e.g., more than two years in many cases. In addition, once the chipsets have been designed and produced, adding new features or changing their structure is not possible, unlike with the inherently reprogrammable FPGA. From this point of view, Nokia’s decision to select the FPGA was a natural one: designing ASICs into first-generation 5G equipment is not an obvious choice given the immaturity of the new generation and the fine-tuning changes that are almost always needed. All other vendors had the same strategy, but migrated to SoCs more swiftly after the first year of 5G deployment.
Future Choices of Chipset for 5G RAN
RECOMMENDATIONS
Nokia’s two chief rivals, Ericsson and Huawei, had a different strategy and different assets in this transition. In 2017, Ericsson opened its ASIC Design Center in Austin, Texas to focus on the core microelectronics of 5G radio base stations, and recruitment is still ongoing. HiSilicon, Huawei’s wholly owned semiconductor design company since its founding in 2004, has helped Huawei design its own processors, i.e., the Kirin and Balong lines. Because of these assets, Ericsson and Huawei can shorten their product development cycles and reduce product costs. By contrast, two to three years ago, Nokia was busy absorbing Alcatel-Lucent and, in order to gain 5G market share, the FPGA was naturally its only option.
Based on ABI Research’s recently published Mobile Infrastructure data (MD-INFR-104) and further developing studies, the break-even point in lifetime cost between developing an ASIC and an FPGA for 5G RAN infrastructure occurs within the first year. In other words, a company that can anticipate high shipment demand and control its product life cycle and costs should go for the ASIC instead of the FPGA. Of course, it will lose product upgrade flexibility, since the standardized techniques in 3rd Generation Partnership Project (3GPP) Releases 16 and 17 have not been decided yet. On the other hand, Releases 16 and 17 are expected to activate a huge amount of vertical transformation, in which most industries will rely on private wireless networks. Ultimately, the ideal option is to diminish the FPGA’s hardware cost penalty for reprogrammability and enjoy its flexibility. In this case, mobile operators that decide to go for FPGA-based equipment will be able to upgrade to new air interfaces based on specific requirements by reprogramming the existing hardware.
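The logic behind that break-even point can be illustrated with a toy lifetime-cost model. The sketch below is a minimal illustration only: the non-recurring engineering costs, unit prices, and shipment volume are hypothetical placeholders, not ABI Research figures, and real-world costs vary widely by vendor and process node.

```python
# Toy break-even model for ASIC versus FPGA lifetime cost.
# All numbers are hypothetical placeholders, not ABI Research data.
def cumulative_cost(upfront: float, unit_cost: float, units_shipped: int) -> float:
    """Total lifetime cost: one-off engineering spend plus per-unit silicon cost."""
    return upfront + unit_cost * units_shipped

def break_even_volume(asic_nre: float, asic_unit: float,
                      fpga_nre: float, fpga_unit: float) -> float:
    """Shipment volume at which the ASIC's lower unit cost offsets its higher upfront cost."""
    return (asic_nre - fpga_nre) / (fpga_unit - asic_unit)

# Hypothetical inputs: heavy ASIC non-recurring engineering spend but cheap units,
# modest FPGA tooling cost but expensive per-unit devices.
ASIC_NRE, ASIC_UNIT = 40e6, 300
FPGA_NRE, FPGA_UNIT = 2e6, 1500

volume = break_even_volume(ASIC_NRE, ASIC_UNIT, FPGA_NRE, FPGA_UNIT)
print(f"ASIC pays off beyond ~{volume:,.0f} shipped units")

first_year_units = 50_000   # hypothetical first-year base station shipments
for name, nre, unit in [("ASIC", ASIC_NRE, ASIC_UNIT), ("FPGA", FPGA_NRE, FPGA_UNIT)]:
    print(f"{name}: US${cumulative_cost(nre, unit, first_year_units) / 1e6:.0f} million over year one")
```

Under these placeholder assumptions, the break-even volume is reached well within a first year of high-volume shipments, which points in the same direction as the finding above; a vendor expecting lower or uncertain volumes would see the balance tilt back toward the FPGA.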
In summary, this may not be a straightforward choice, as most 5G equipment currently being deployed targets consumer use cases and will certainly remain consumer-focused. Moreover, FPGA-based equipment will, in most cases, be more expensive than cost-optimized ASIC or SoC counterparts. Finally, enterprise 5G equipment may be deployed as an overlay to existing networks, meaning that new equipment will be necessary. All these arguments make for a complex decision process, forcing mobile operators to choose the best-performing and most cost-effective vendor for the time being. The Nokia example, though, illustrates how a complex corporate environment and reliance on a third party for a vital component can break a vendor’s strategy at a crucial market stage. For now, Nokia has taken immediate measures to counter this, but smaller vendors may not have that option and may well perish.