Exercise 2: Article

Read the article aloud on your own or repeat each paragraph after your tutor.

SK Hynix Readies Next-Gen HBM4 Chips for AI Systems

South Korea's SK Hynix has prepared its first next-generation HBM4 chips for mass production.
The chips are expected to significantly improve AI performance as the boom in artificial intelligence pushes the world's existing memory chips to their limits and raises concerns about the huge amount of energy needed to keep these systems running.
HBM stands for "high bandwidth memory," and the new HBM4 chips arrive hot on the heels of HBM3E, which was first introduced last year. The earliest HBM chips were released in 2013.
HBM chips vertically connect multiple dynamic random access memory chips — better known as DRAM chips — to significantly increase their processing speed. DRAM temporarily stores the files a computer needs while it is running programs and switching between tasks, so this memory must be quickly accessible for the computer to run smoothly.
The new HBM4 chips can process twice as much data per second as HBM3E, a gain achieved by doubling the chips' input/output connectors to 2,048 and bringing their per-pin bandwidth up to 10 gigabits per second.
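The doubling described above can be checked with a short back-of-the-envelope calculation. The sketch below assumes the quoted 10 gigabits per second is a per-pin transfer rate and that a stack's total bandwidth is simply pins multiplied by the per-pin rate; the 1,024-pin figure for HBM3E is inferred from "doubling ... to 2,048", and the equal per-pin rate for both generations is a simplification to isolate the effect of the wider interface:

```python
# Hypothetical bandwidth estimate for an HBM stack.
# Assumption: total bandwidth = I/O pins x per-pin rate (Gb/s),
# converted to terabytes per second.

def stack_bandwidth_tbps(io_pins: int, gbps_per_pin: float) -> float:
    """Total stack bandwidth in TB/s."""
    total_gigabits = io_pins * gbps_per_pin   # aggregate Gb/s across all pins
    return total_gigabits / 8 / 1000          # Gb/s -> GB/s -> TB/s

hbm3e = stack_bandwidth_tbps(1024, 10)  # pin count inferred from the article
hbm4 = stack_bandwidth_tbps(2048, 10)   # doubled interface width

print(hbm4 / hbm3e)  # 2.0 -- doubling the pins doubles the throughput
```

With these assumptions the HBM4 stack works out to roughly 2.5 TB/s, which is consistent with the article's claim that the new chips move twice as much data per second as HBM3E.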
Compared to HBM3E, the new chips' power efficiency has also been improved by 40%, SK Hynix says.
The company announced that the new chips were ready for mass production in September, several months after delivering samples to key customers, including Nvidia, for testing.
California-based Nvidia makes the vast majority of the world's high-end graphics processing units (GPUs), which are used for things like video games and 3D rendering.
Nvidia's high-end GPUs are also used to run large-scale AI systems such as OpenAI's ChatGPT, and SK Hynix is the main memory supplier for Nvidia's AI processors.
While SK Hynix's HBM4 chips had initially been scheduled for a later launch, the schedule was moved up by six months at the request of Nvidia CEO and founder Jensen Huang, giving an indication of just how big a fire AI has lit under chip development.