The introduction of generative AI, particularly ChatGPT, has sharply increased global demand for high-end AI servers, with shipments projected to grow from 61,000 units in 2021 to 639,000 units in 2024. With hyperscale cloud providers continuing to invest in AI capabilities, and with chip-on-wafer-on-substrate (CoWoS) packaging and ODM production capacity expanding, demand for AI servers is expected to remain strong.
DIGITIMES categorizes AI accelerators into two types: GPUs/ASSPs and ASICs. This research report analyzes the production and sales trends of AI servers based on these two types, covering the overall AI server shipment market and providing a detailed breakdown of market conditions by accelerator platform and customer type. With the emergence of ChatGPT and Nvidia’s H100, the commercial AI training server market is expected to grow rapidly, driving the market share of GPU/ASSP AI servers to 71% in 2024 and making Nvidia GPUs the mainstream accelerators in the market. Other key players, including AMD, AWS, Google, and Huawei, are also discussed in this report.
From the customer perspective, major US hyperscale data center operators have remained the primary recipients of high-end AI servers in recent years. Notably, even before 2023, these operators had strategically invested in AI application development, which positioned them as the first to invest heavily in building high-speed AI accelerator clusters during the generative AI boom. Their large-scale procurement further strengthened their position when competing for AI server supply. DIGITIMES predicts that even as supply chain capacity expands to meet demand from other customers, large US data center operators will continue to account for over 60% of market demand over the next two years, with Google and Microsoft expected to be the main customers.
This research report focuses on two critical assembly dimensions: L6 and L10-L12 (i.e., board/barebone-level assembly and complete-system-to-rack-level integration, respectively). It examines the allocation of AI server orders among OEMs, with individual analyses based on the aforementioned accelerator types and customer categories, and explores the shipment and development trends of L6 and L10-L12 assembly plants.
Looking ahead, the competitive landscape of the global high-end AI server market is expected to become more diversified. While major manufacturers will continue to dominate market supply and demand, growing demand for AI infrastructure from enterprise users and advances in AI chip technology will lead to a more diverse distribution of market share. This will in turn make the AI server supply chain more complex and variable, which is also a core focus of this research report.
CHAPTER 1 RESEARCH BACKGROUND AND SCOPE
1.1. DEFINITION OF HIGH-END AI SERVERS
1.2. AI SERVER ASSEMBLY BREAKDOWN
CHAPTER 2 GLOBAL HIGH-END AI SERVER SHIPMENT FORECAST FROM 2021 TO 2026
2.1. GLOBAL HIGH-END AI SERVER SHIPMENT TREND FORECAST
2.2. MAJOR ACCELERATOR PLATFORM SHIPMENT FORECAST
2.3. ESTIMATED SHIPMENT VOLUMES FOR MAJOR CUSTOMERS
CHAPTER 3 ESTIMATED SHIPMENT VOLUME AND MARKET SHARE OF KEY L6 AND L10-L12 ASSEMBLERS
3.1. SHIPMENT ESTIMATES FOR PRIMARY TAIWANESE MAKERS
3.2. DISTRIBUTION OF SHIPMENTS AMONG VARIOUS ACCELERATOR VENDORS
3.3. DISTRIBUTION OF OEM ORDERS FOR PRIMARY CUSTOMERS
CHAPTER 4 CONCLUSION
OUR TEAM
ABOUT DIGITIMES RESEARCH
CONTACT US
DISCLAIMER
COPYRIGHT STATEMENT