Editor’s note: Sign up for CNN’s Meanwhile in China newsletter, which explores what you need to know about the country’s rise and how it impacts the world.
The U.S. government has imposed new export controls on sales to China of high-tech memory chips used in artificial intelligence (AI) applications.
The rules apply to U.S.-made high-bandwidth memory (HBM) technology as well as foreign-produced technology.
Here’s everything you need to know about these cutting-edge semiconductors, demand for which is skyrocketing amid the global craze for artificial intelligence.
High-bandwidth memory (HBM) is essentially a set of memory chips, small components that store data, stacked on top of one another. HBM can hold more information and move data faster than conventional DRAM (dynamic random access memory), the older technology from which it is built.
HBM chips are commonly used in graphics cards, high-performance computing systems, data centers and autonomous vehicles.
Most importantly, they are essential for running increasingly popular AI applications, including generative AI, which is powered by graphics processing units (GPUs) such as those made by Nvidia (NVDA) and Advanced Micro Devices (AMD).
“Processors and memory are two important components of artificial intelligence,” G. Dan Hutcheson, vice chair of chip research firm TechInsights, told CNN. Without memory, he said, it would be like a brain that has logic but no memory.
The latest export curbs, announced on Dec. 2, follow two previous rounds of restrictions on advanced chips imposed by the Biden administration over the past three years, all aimed at denying China access to key technologies that could give its military an advantage.
Beijing retaliated by imposing new export restrictions on germanium, gallium and other materials needed to make semiconductors and other high-tech equipment.
Experts say the latest export restrictions will slow China’s development of artificial intelligence chips by hindering its access to HBM, at least in the short term. China’s ability to produce HBM currently lags behind South Korea’s SK Hynix and Samsung and the United States’ Micron (MU), but the country is developing its own capabilities in the field.
“U.S. export restrictions will limit China’s access to higher-quality HBM in the short term,” Jeffery Chiu, CEO of Ansforce, an expert network consulting firm specializing in the technology field, told CNN. “But in the long run, China will still be able to produce them independently, albeit with less advanced technology.”
In China, Yangtze Memory Technologies and ChangXin Memory Technologies are the leading memory chipmakers. They are reportedly building out HBM production lines in pursuit of the country’s strategic goal of technological self-sufficiency.
The main reason why HBM chips are so powerful is that they have larger storage space and faster data transfer speeds than traditional memory chips.
Since AI applications require a huge number of complex calculations, these features help them run smoothly, without delays or glitches.

More storage means more data can be held, transferred and processed at once, which boosts the performance of AI applications: large language models (LLMs) can be trained with more parameters.
Think of data transfer speed, or bandwidth in chip parlance, as a highway. The more lanes a highway has, the less likely it is to develop a bottleneck, and the more cars it can carry.
“It’s like the difference between a two-lane highway and a 100-lane highway. You don’t get traffic jams,” Hutcheson said.
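The highway analogy maps directly onto how memory bandwidth is calculated: peak bandwidth is roughly the width of the interface (the number of lanes) multiplied by the speed of each lane. The sketch below illustrates this with commonly published figures for an HBM3 stack (1024-bit interface, about 6.4 Gb/s per pin) versus a single 64-bit DDR5 channel; the exact numbers vary by product generation and are used here only for illustration.

```python
def bandwidth_gb_per_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth = interface width (lanes) x per-pin data rate.

    Divide by 8 to convert gigabits to gigabytes.
    """
    return bus_width_bits * gbps_per_pin / 8

# One HBM3 stack: a very wide 1024-bit interface (illustrative figures)
hbm3 = bandwidth_gb_per_s(1024, 6.4)

# One conventional DDR5 channel: a narrow 64-bit interface
ddr5 = bandwidth_gb_per_s(64, 4.8)

print(f"HBM3 stack:   {hbm3:.1f} GB/s")  # 819.2 GB/s
print(f"DDR5 channel: {ddr5:.1f} GB/s")  # 38.4 GB/s
```

The wide interface, not a faster per-pin speed, is what gives HBM its edge, which is exactly Hutcheson’s point about lanes rather than speed limits.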
Currently, just three companies dominate the global HBM market.
According to a research report by Taipei-based market research firm TrendForce, SK Hynix held 50% of the HBM market in 2022, followed by Samsung (40%) and Micron (10%). The two South Korean companies were expected to hold a similar share of the HBM market in 2023 and 2024, around 95% combined.
According to Taiwan’s official Central News Agency, citing Micron senior executive Praveen Vaidyanathan, Micron aims to grow its HBM market share to 20% to 25% by 2025.
HBM’s high value has led manufacturers to devote the bulk of their capacity to these more advanced memory wafers. Avril Wu, senior research vice president at TrendForce, said HBM was expected to account for more than 20% of the total value of the memory chip market starting in 2024, possibly exceeding 30% in 2025.
Think of multiple standard memory chips stacked on top of each other like a burger. This is essentially the structure of HBM.
On the surface, that sounds simple, but it is so hard to pull off that the difficulty shows up in the price: HBM sells for several times the unit price of conventional memory chips.
That’s because a finished HBM stack is only about the height of six human hairs, which means each layer of memory wafer stacked inside it must be ground extremely thin, a feat that requires state-of-the-art manufacturing expertise known as advanced packaging.
“Each memory chip needs to be ground very thin before being stacked together, just half the height of a human hair, which is very difficult to achieve,” Chiu said.
Additionally, holes known as through-silicon vias must be made in the memory wafers so that vertical connections can pass through them before the layers are stacked, and the position and size of those holes must be extremely precise.
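The back-of-the-envelope arithmetic behind the “six hairs” and “half a hair per layer” figures can be checked with a quick sketch. The die thickness, layer count and hair diameter below are illustrative assumptions, not measured values: thinned HBM dies are commonly described as being on the order of tens of micrometers, and stacks typically hold 8 or 12 dies.

```python
die_um = 50     # thickness of one thinned memory die (assumption, ~half a hair)
layers = 8      # dies per stack (8-high is a common HBM configuration)
hair_um = 70    # rough diameter of a human hair (assumption)

stack_um = die_um * layers
print(f"Stack height: {stack_um} um, about {stack_um / hair_um:.1f} hairs")
```

With these assumptions the stack comes out to roughly six hair-widths tall, consistent with the figures quoted above; real stacks also include bonding layers and a base die, so actual heights differ.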
“When you try to build these devices, you’re going to have more points of failure. It’s almost like building a house of cards,” Hutcheson said.