GDDR6 Finds Its Way into AI and Networking


Wednesday, August 1, 2018

The “G” still stands for “graphics,” but new use cases driving the need for GDDR memory technology have nothing to do with pixels.

In fact, applications such as artificial intelligence (AI) and machine learning, which need ultra-fast memories, have left gamers short of their GDDR supply, so it's probably a good idea that makers of the technology are ramping up delivery. Micron Technology recently began volume production of its 8-Gb GDDR6 memory, which, of course, is aimed at the graphics market but also at the automotive and networking segments.

Some of the emerging use cases for GDDR memory are still graphics-driven. In the growing automotive memory market, it supports increasingly visual dashboards and advanced driver assistance systems (ADAS) that must respond to a driver's actions immediately, while autonomous vehicles need high-performance memory to process vast amounts of real-time data. Other emerging applications include augmented reality (AR) and virtual reality (VR). Finally, video is always hungry for memory as 4K becomes more widely adopted and 8K nips at its heels.
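
To put the video point in perspective, here is a minimal back-of-the-envelope sketch in Python; the 60 fps refresh rate and 32-bit-per-pixel frame format are illustrative assumptions, not figures from the article.

    # Rough scan-out bandwidth for uncompressed 4K and 8K frames.
    # The 60 fps and 32-bit RGBA assumptions are illustrative only.
    RESOLUTIONS = {
        "4K UHD": (3840, 2160),
        "8K UHD": (7680, 4320),
    }
    BYTES_PER_PIXEL = 4   # assume a 32-bit RGBA frame buffer
    FPS = 60              # assume a 60 Hz refresh rate

    for name, (width, height) in RESOLUTIONS.items():
        frame_mb = width * height * BYTES_PER_PIXEL / 1e6
        bandwidth_gb_s = frame_mb * FPS / 1e3
        print(f"{name}: {frame_mb:.0f} MB per frame, "
              f"~{bandwidth_gb_s:.1f} GB/s uncompressed at {FPS} fps")

Uncompressed 8K at those assumed settings works out to roughly 8 GB/s just to move frames, several times the 4K figure, which is why each jump in resolution pulls memory bandwidth up with it.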

Andreas Schlapka, director of Micron’s compute networking business unit, said that in the last three years, traditional GDDR applications such as graphics cards and game consoles have been joined by more networking applications and autonomous driving. In modern vehicles, high-performance memory must deal with large amounts of data generated by sensors and cameras. Similarly, advanced networking technologies use GDDR to power network interface cards (NICs), he said. “Five years ago, the biggest traffic in data centers was in-and-out traffic. Something was done with your information on a specific node, and it came back to your terminal.” Now there’s much more traffic within the data center from one node to another, which drives enormous needs in terms of throughput.

Speaking of throughput, there are cryptocurrency applications, said Schlapka, as cryptocurrencies and crypto-mining have become more mainstream. A lot of data must be run through quickly, and memory has to keep up with either an ASIC or a GPU. Similarly, high-performance computing, especially AI, benefits from the bandwidth of GDDR, he said. “If you train a neural network, you run through as much data as you can — ideally, pin to byte. And you need to load them into a memory and then compute them pretty fast. I think that’s when GDDR becomes a big deal and a very good solution.”
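
A quick sketch of the arithmetic behind that bandwidth argument follows; the per-pin data rate, 256-bit bus width, and 32-GB working set are illustrative assumptions rather than figures from Micron or the article. A single GDDR6 device exposes a 32-bit interface, so per-pin speed multiplies directly into device and board bandwidth.

    # Peak-bandwidth arithmetic for a hypothetical GDDR6-based accelerator.
    # Per-pin rate, bus width, and working-set size are assumptions.
    PIN_RATE_GBPS = 14.0      # assumed GDDR6 data rate per pin (Gb/s)
    DEVICE_WIDTH_BITS = 32    # one GDDR6 device has a 32-bit interface
    BUS_WIDTH_BITS = 256      # assume a 256-bit board-level bus (8 devices)

    device_bw_gb_s = PIN_RATE_GBPS * DEVICE_WIDTH_BITS / 8   # GB/s per device
    board_bw_gb_s = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8       # GB/s aggregate

    working_set_gb = 32       # hypothetical training working set
    sweep_ms = working_set_gb / board_bw_gb_s * 1000

    print(f"Per device: {device_bw_gb_s:.0f} GB/s; board: {board_bw_gb_s:.0f} GB/s")
    print(f"One pass over {working_set_gb} GB at peak: ~{sweep_ms:.0f} ms")

Halving the per-pin rate doubles the time to sweep the same working set, which is the sense in which memory has to keep up with the compute engine.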

GDDR has seen a steady evolution with faster speed per pin, said Tien Shiah, specialty DRAM marketing manager for Samsung Semiconductor, Inc., and is traditionally used alongside GPUs for graphics applications. “Now the GPUs are finding their way, especially the high-end ones, into machine-learning-type applications because those types of AI algorithms are very well-suited for the parallel architecture of a GPU.” However, he said, some applications need to go further and use high-bandwidth memory (HBM). Samsung began production of 16-Gigabit GDDR6 for advanced graphics systems at the beginning of the year.
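
The two approaches Shiah contrasts trade interface width against per-pin speed. Here is a hedged sketch using typical published figures of the era, included purely for illustration rather than taken from the article.

    # Narrow-and-fast (GDDR6) versus wide-and-slow (HBM2) interfaces.
    # Per-pin rates and widths are typical published figures, used
    # here only to illustrate the trade-off.
    def peak_bandwidth_gb_s(data_pins: int, gbps_per_pin: float) -> float:
        """Peak interface bandwidth in GB/s."""
        return data_pins * gbps_per_pin / 8

    gddr6_device = peak_bandwidth_gb_s(data_pins=32, gbps_per_pin=16.0)
    hbm2_stack = peak_bandwidth_gb_s(data_pins=1024, gbps_per_pin=2.0)

    print(f"One GDDR6 device (32 pins @ 16 Gb/s): {gddr6_device:.0f} GB/s")
    print(f"One HBM2 stack (1024 pins @ 2 Gb/s): {hbm2_stack:.0f} GB/s")

The much wider interface is why a single HBM stack delivers several times the bandwidth of one GDDR6 device, at the cost of more complex packaging, and why some applications "need to go further" than GDDR.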

While many of the emerging adopters of GDDR technology have been in automotive and high-performance computing, new consumer applications have presented themselves too, such as 8K video processing, augmented reality, and virtual reality, although AR and VR haven’t moved as fast as expected, said Shiah. “There was a lot of hype and excitement around the technology, and then, as people started using it, some people got nauseous and dizzy, but that relates to how powerful your computer system is.” He said that early products hitting the market lowered the specifications so that more people could use the technology, but the tradeoff was that some users got dizzy. “As the computer systems get more powerful, which they [do] every year, and as the graphics capability goes up with technologies like GDDR6, the experience will be much better. More people will be able to enjoy it because of the improvements in GPUs and CPUs, and the battle is in memory.”

And the good news is that, unlike planar NAND, which hit a scaling wall in recent years and needed to go 3D to remain cost-effective, there’s plenty of runway left for GDDR. “We’re always looking into what’s coming ahead,” said Shiah. “Right now, we’re not seeing any sort of a wall. We continue to work on improvements to the technology.”

Schlapka also sees room to grow and believes there could be something beyond GDDR6, whether it’s called GDDR7 or something else. “We think there is room to further develop a point-to-point technology like GDDR6, but that’s a question that we need to answer in four or five years.”

The current iteration of GDDR has hit a tipping point. As a JEDEC standard, said Jon Peddie, principal of Jon Peddie Research, it has been fully tested by companies incorporating it into their products, such as AMD, Nvidia, and Qualcomm. “All of these guys have had their finger in the pie, making sure that this is a reliable thing.” Micron is the first to go into mass production, he said. “It’s good news for the gamers. It’s good news for the AI developers. It’s good news for the workstation people. It’s all around pretty much good news, and it will just get better as they learn how to squeeze more yield out of the process.”

Moore’s Law, however, creates challenges for GDDR scalability, as it does for other technologies, said Peddie. “GDDR6 is built on Moore’s latest, greatest process node, and as with any new process node, there is an issue with regards to yield. As semiconductor manufacturers get more familiar and more experienced in building these things, the yield gets up and up and up.” Good yields mean that manufacturers are “minting money,” he said, but as processes creep into single-digit nanometers, manufacturers are tinkering with “atomic-level stuff” that is hard to produce in volume.

But for now, GDDR memory is in a good place, said Peddie, having consistently improved from generation to generation. “What you’ll see is this beautiful — and I mean beautiful — Moore’s Law pattern where more memory is getting into the same package and it’s running faster whilst, at the same time, using less power. That’s absolutely what Moore’s Law is all about. The GDDR family of parts is a prime example of how Moore’s Law works.”

By: DocMemory