Wednesday, June 21, 2023
Nvidia is reportedly interested in evaluating SK Hynix's HBM3E samples, according to "industry sources" via a DigiTimes report. If the information is accurate, then Nvidia's next-generation compute GPU for artificial intelligence and high-performance computing applications could use HBM3E memory instead of HBM3.
According to the industry sources cited by Korea's Money Today and Seoul Economic Daily, Nvidia has requested samples of HBM3E from SK Hynix with a view to evaluating the new memory's impact on GPU performance.
SK Hynix's upcoming HBM3E memory will increase the data transfer rate from the current 6.40 GT/s to 8.0 GT/s, which in turn elevates per-stack bandwidth from 819.2 GB/s to a whopping 1 TB/s. However, there are uncertainties surrounding HBM3E's compatibility with existing HBM3 controllers and interfaces, as SK Hynix has not yet disclosed information on this aspect of the new technology. In any case, Nvidia and other developers of AI and HPC compute GPUs will need to evaluate the technology.
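As a rough sanity check, those per-stack figures follow directly from multiplying the data rate by the interface width: HBM3 stacks use a 1,024-bit interface, and the sketch below assumes HBM3E keeps that width, which SK Hynix has not confirmed. The short Python calculation reproduces both bandwidth numbers cited above.

    # Per-stack bandwidth = data rate (GT/s) * bus width (bits) / 8 bits per byte.
    # Assumes HBM3E retains the 1,024-bit interface of HBM3 (not yet confirmed by SK Hynix).
    BUS_WIDTH_BITS = 1024

    def stack_bandwidth_gbs(data_rate_gts: float) -> float:
        return data_rate_gts * BUS_WIDTH_BITS / 8

    print(stack_bandwidth_gbs(6.4))  # 819.2 GB/s  -- current HBM3
    print(stack_bandwidth_gbs(8.0))  # 1024.0 GB/s, i.e. ~1 TB/s -- HBM3E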
SK Hynix seemingly intends to begin sampling its HBM3E memory in the latter half of 2023, with large-scale production planned for late 2023 or 2024. The company will build HBM3E memory on its 1b-nanometer fabrication process, its fifth-generation 10nm-class node for DRAM, which is currently used to produce DDR5-6400 chips. The same node will also be used to produce LPDDR5T memory chips for high-performance, low-power applications.
It remains to be seen which of Nvidia's compute GPUs will use HBM3E memory, though the company will likely adopt the new memory for its next generation of processors due in 2024. Whether that will be a revamped Hopper GH100 compute GPU or an all-new design is not yet known.
SK Hynix currently controls over 50% of the HBM memory market and is the only company supplying HBM3. It will also be the exclusive maker of HBM3E, at least initially.
Yole Development, a market research company, projects significant expansion of the HBM memory market, as HBM offers a unique bandwidth advantage over other types of DRAM. The firm estimates that the market, valued at $705 million in 2023, will nearly double to $1.324 billion by 2027.
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved