
SK Hynix Readies Next-Gen HBM4 Chips for AI Systems

Level 8 (Advanced)
Exercise 1

Vocabulary

Repeat each word, definition, and example sentence after your tutor.
generation (Noun) /ˌdʒenəˈreɪʃn/
one stage or step in the development of a particular product or technology
The company said its next-generation console will be released next November.

boom (Noun) /buːm/
a period of great wealth or fast growth
The US went through a huge economic boom in the 1950s.

bandwidth (Noun) /ˈbændwɪdθ/
the amount of information that can be sent through a network at a time
They're offering a free upgrade to a higher bandwidth if we sign up this month.

hot on the heels of (Phrase) /hɑːt ɑːn ðə hiːlz əv/
following someone or something very closely
Hot on the heels of the huge success of their first single, the band announced a world tour.

indication (Noun) /ˌɪndəˈkeɪʃən/
a sign or piece of information that shows something
The election results are a clear indication that people think the country is heading in the right direction.

light a fire under (Phrase, American) /laɪt ə faɪr ˈʌndər/
to make someone act more quickly or enthusiastically
The success of the project lit a fire under the team to set even more ambitious goals.
Exercise 2

Article

Read the article aloud on your own or repeat each paragraph after your tutor.

SK Hynix Readies Next-Gen HBM4 Chips for AI Systems
South Korea's SK Hynix has prepared its first next-generation HBM4 chips for mass production. The chips are expected to significantly improve AI performance as the boom in artificial intelligence pushes the world's existing memory chips to their limits and prompts concerns about the huge amount of energy needed to keep these systems running.

HBM stands for "high bandwidth memory," and the new HBM4 chips arrive hot on the heels of HBM3E, which was first introduced last year. The earliest HBM chips were released in 2013.

HBM chips vertically connect multiple dynamic random access memory chips, better known as DRAM chips, to significantly increase their processing speed. DRAM is used to temporarily store the files that are needed when a computer is running programs and switching between tasks, so this memory must be quickly accessible for the computer to run smoothly.

The new HBM4 chips can process twice as much data per second as HBM3E. This was achieved by doubling the chips' input/output connectors to 2,048 and raising their data speed to 10 gigabits per second. Compared to HBM3E, the new chips' power efficiency has also been improved by 40%, SK Hynix says.

The company announced that the new chips were ready for mass production in September, several months after delivering samples to key customers, including Nvidia, for testing.

California-based Nvidia makes the vast majority of the world's high-end graphics processing units (GPUs), which are used for things like video games and 3D rendering. Nvidia's high-end GPUs are also used to run large-scale AI systems such as OpenAI's ChatGPT, and SK Hynix is the main memory supplier for Nvidia's AI processors.

SK Hynix had initially planned a later launch for its HBM4 chips, but the schedule was moved up by six months at the request of Nvidia CEO and founder Jensen Huang, giving an indication of just how big a fire AI has lit under chip development.
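Note: Here is a rough calculation of where the "twice as much data per second" figure comes from. It assumes the 10-gigabits-per-second speed describes each individual input/output connector, which the article does not state explicitly.

2,048 connectors × 10 gigabits per second = 20,480 gigabits per second
20,480 gigabits per second ÷ 8 bits per byte ≈ 2.5 terabytes per second for each chip stack

HBM3E uses 1,024 connectors at a similar per-connector speed, which works out to roughly half of that, matching the claim that HBM4 processes about twice as much data per second.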
Exercise 3

Discussion

Have a discussion based on the following questions.
  1. What are your thoughts on SK Hynix's next-generation HBM4 chips?
  2. How do you expect AI performance to improve over the next few years?
  3. How long do you expect the AI boom to continue?
  4. How important is the chip industry to your country?
  5. What have been the biggest tech breakthroughs you've seen in your lifetime?
Exercise 4

Further Discussion

Have a discussion based on the following questions.
  1. What AI tools do you use the most?
  2. Is AI developing faster than you expected?
  3. How is AI being used in your industry?
  4. What impact do you expect AI to have on your industry over the next decade?
  5. "The tools and technologies we've developed are really the first few drops of water in the vast ocean of what AI can do." (Fei-Fei Li) What are your thoughts on this quote?
Source: This article is based on an article by Stefan Stojković.