TrendForce projects a remarkable 105% increase in annual bit shipments of high-bandwidth memory (HBM) this year. This boost comes in response to soaring demand from developers of AI and high-performance computing processors, notably Nvidia, and from cloud service providers (CSPs). To meet demand, Micron, Samsung, and SK Hynix are reportedly increasing their HBM capacities, but new production lines will likely begin operations only in Q2 2024.
More HBM Is Needed
Memory makers managed to more or less match the supply and demand of HBM in 2022, a rare occurrence in the DRAM market. However, an unprecedented demand spike for AI servers in 2023 compelled developers of appropriate processors (most notably Nvidia) and CSPs to place additional orders for HBM2E and HBM3 memory. This made DRAM makers use all of their available capacity and start placing orders for additional tools to expand their HBM production lines to meet the demand for HBM2E, HBM3, and HBM3E memory going forward.
However, meeting this HBM demand is not straightforward. In addition to making more DRAM devices in their cleanrooms, DRAM producers must assemble those memory devices into intricate 8-Hi or 12-Hi stacks, and here they appear to have a bottleneck, since they do not have enough TSV production tools, according to TrendForce. To produce enough HBM2, HBM2E, and HBM3 memory, leading DRAM producers need to procure new equipment, which takes 9 to 12 months to be made and installed in their fabs. As a result, a substantial hike in HBM production is expected around Q2 2024, the analysts claim.
A noteworthy trend pinpointed by TrendForce analysts is the shifting preference from HBM2E (used by AMD's Instinct MI210/MI250/MI250X, Intel's Sapphire Rapids HBM and Ponte Vecchio, and Nvidia's H100/H800 cards) to HBM3 (incorporated in Nvidia's H100 SXM and GH200 supercomputer platform, as well as AMD's forthcoming Instinct MI300-series APUs and GPUs). TrendForce believes that HBM3 will account for 50% of all HBM memory shipped in 2023, while HBM2E will account for 39%. In 2024, HBM3 is poised to account for 60% of all HBM shipments. This rising demand, combined with HBM3's higher price point, promises to boost HBM revenue in the near future.
Just yesterday, Nvidia introduced a new version of its GH200 Grace Hopper platform for AI and HPC that uses HBM3E memory instead of HBM3. The new platform, consisting of a 72-core Grace CPU and a GH100 compute GPU, boasts higher memory bandwidth for the GPU and carries 144 GB of HBM3E memory, up from 96 GB of HBM3 in the case of the original GH200. Considering the immense demand for Nvidia's AI offerings, Micron, which will be the only supplier of HBM3E in 1H 2024, stands a high chance to benefit significantly from the freshly launched hardware that HBM3E powers.
HBM Is Getting Cheaper, Sort Of
TrendForce also noted a consistent year-over-year decline in HBM average selling prices (ASPs). To invigorate interest and offset decreasing demand for older HBM models, prices for HBM2E and HBM2 are set to drop in 2023, according to the market tracking firm. With 2024 pricing still undecided, further price reductions for HBM2 and HBM2E are expected due to increased HBM production and manufacturers' growth ambitions.
In contrast, HBM3 prices are predicted to remain stable, perhaps because, at present, it is exclusively available from SK Hynix, and it will take some time for Samsung to catch up. Given its higher price compared to HBM2E and HBM2, HBM3 could push HBM revenue to an impressive $8.9 billion by 2024, marking a 127% year-over-year increase, according to TrendForce.
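As a quick sanity check on those figures, the 2024 projection and the stated growth rate imply a 2023 revenue baseline. The sketch below derives it; only the $8.9 billion and 127% values come from TrendForce, and the 2023 figure is a back-calculation, not a reported number:

```python
# Back-calculate the implied 2023 HBM revenue from TrendForce's projection.
revenue_2024_bn = 8.9   # projected 2024 HBM revenue, in $ billions (from the report)
yoy_growth = 1.27       # 127% year-over-year increase (from the report)

# revenue_2024 = revenue_2023 * (1 + growth), so invert:
implied_2023_bn = revenue_2024_bn / (1 + yoy_growth)
print(f"Implied 2023 HBM revenue: ${implied_2023_bn:.1f} billion")  # ≈ $3.9 billion
```

In other words, the projection assumes the HBM market more than doubles from roughly $3.9 billion in 2023.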
SK Hynix Leads the Pack
SK Hynix commanded 50% of the HBM memory market in 2022, followed by Samsung with 40% and Micron with a 10% share. Between 2023 and 2024, Samsung and SK Hynix will continue to dominate the market, holding nearly identical stakes that add up to about 95%, TrendForce projects. Meanwhile, Micron's market share is expected to hover between 3% and 6%.
Meanwhile, SK Hynix appears, for now, to have an edge over its rivals. SK Hynix is the primary producer of HBM3 and the only company supplying that memory for Nvidia's H100 and GH200 products. By comparison, Samsung predominantly manufactures HBM2E, catering to other chipmakers and CSPs, and is gearing up to start making HBM3. Micron, which does not have HBM3 on its roadmap, produces HBM2E (which Intel reportedly uses for its Sapphire Rapids HBM CPU) and is preparing to ramp up production of HBM3E in 1H 2024, which may give it a significant competitive advantage over its rivals, which are expected to start making HBM3E only in 2H 2024.