HP Enterprise promotes "Memory Driven Computing"


Wednesday, June 5, 2019

Purpose-built computing accelerators are being developed to deliver higher performance, but memory-driven computing can be used to accelerate the accelerators themselves.

Moore's Law, the doubling of transistors in integrated circuits roughly every two years, is coming to an end. This is inevitable: further miniaturization of components will eventually be blocked, whether by manufacturing constraints or by the physical limits of shrinking features toward atomic scale. With Moore's Law predicted to end around 2025, research into the future of computing is being conducted in earnest to find new ways to accelerate computing performance.

Various companies are developing such accelerators for specialized use cases: General-purpose computing on graphics processing units (GPGPU) is at the forefront of the accelerator trend, with NVIDIA touting its capabilities for machine learning, and quantum computers can arguably be considered accelerators for medical research. However, not all workloads benefit from these types of accelerators. Hewlett Packard Enterprise announced The Machine in 2017, a computer equipped with 160 TB of RAM, as part of a push into what it defines as "memory-driven computing," an effort to process large quantities of data in memory.

The difficulty with this is that traditional DRAM is fast but not dense: less data can be stored in DRAM than in Flash memory, in terms of bits per square centimeter. Likewise, Flash memory, as a solid-state storage medium, has higher access speeds and lower latencies than traditional platter hard drives, though hard drives offer higher storage densities. The problem is not just raw speed, however: these technologies also sit at different distances from the processor. RAM is the most directly connected, while SSDs and HDDs are further away, so data stored on them must first be copied into RAM and then moved from RAM into the CPU cache before it can be processed.
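
To make that data path concrete, the toy C program below times a pass over a buffer that is already resident in RAM against reading the same bytes back through the storage stack. The file name and buffer size are arbitrary choices for illustration, and on a real system the file read may be partly served from the operating system's page cache, so the measured gap understates true cold-storage latency.

/* toy_hierarchy.c - rough illustration of in-RAM access vs. going through
 * the storage stack. Build with: cc -O2 toy_hierarchy.c -o toy_hierarchy */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#include <fcntl.h>
#include <unistd.h>

#define SIZE (64 * 1024 * 1024)  /* 64 MB working set (arbitrary) */

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    unsigned char *buf = malloc(SIZE);
    memset(buf, 1, SIZE);                  /* fault pages in: data now lives in DRAM */

    /* Pass 1: walk the buffer while it sits in RAM. */
    double t0 = now_sec();
    unsigned long sum = 0;
    for (size_t i = 0; i < SIZE; i++) sum += buf[i];
    double ram_time = now_sec() - t0;

    /* Write the same bytes out, then read them back through the storage stack. */
    int fd = open("scratch.bin", O_CREAT | O_TRUNC | O_RDWR, 0600);
    size_t woff = 0;
    while (woff < SIZE) {
        ssize_t n = write(fd, buf + woff, SIZE - woff);
        if (n <= 0) break;
        woff += (size_t)n;
    }
    fsync(fd);
    lseek(fd, 0, SEEK_SET);

    double t1 = now_sec();
    unsigned char *copy = malloc(SIZE);
    size_t roff = 0;
    while (roff < SIZE) {
        ssize_t n = read(fd, copy + roff, SIZE - roff);
        if (n <= 0) break;
        roff += (size_t)n;
    }
    double disk_time = now_sec() - t1;

    printf("in-RAM pass: %.4f s (checksum %lu)\n", ram_time, sum);
    printf("file read:   %.4f s\n", disk_time);

    close(fd);
    unlink("scratch.bin");
    free(buf);
    free(copy);
    return 0;
}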

For memory-driven computing, "what we are not assuming is that there is only one kind of memory," Kirk Bresniker, chief architect at HPE Labs, told TechRepublic. "What if I had large pools of memory that is of different kinds? Balancing out price, performance and persistence. But have it all be uniform in how it is addressed. Uniform address spaces, a uniform way to access it. A way to physically accumulate memory of different capabilities, but have it be much more uniform... a memory fabric is what stitches all those kinds of memories together."
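
Bresniker's description amounts to a single address space stitched over pools with different cost, speed, and persistence characteristics. The toy C sketch below mimics that idea in software only: one pool is ordinary malloc'd memory standing in for DRAM, the other is a file-backed mmap standing in for persistent memory, and a single load/store helper routes by address. The pool sizes, the file name pool.img, and the routing scheme are illustrative assumptions, not HPE's design; a real memory fabric would do this routing in hardware.

/* fabric_sketch.c - a toy "uniform address space" over two simulated memory pools. */
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

#define POOL_SIZE (1u << 20)     /* 1 MB per pool (arbitrary) */
#define SLOW_BASE POOL_SIZE      /* addresses >= POOL_SIZE route to the "persistent" pool */

typedef struct {
    uint8_t *fast;   /* volatile pool, simulated with malloc */
    uint8_t *slow;   /* persistent pool, simulated with a file-backed mmap */
} fabric_t;

static fabric_t fabric_open(const char *backing_file) {
    fabric_t f;
    f.fast = malloc(POOL_SIZE);

    int fd = open(backing_file, O_CREAT | O_RDWR, 0600);
    ftruncate(fd, POOL_SIZE);
    f.slow = mmap(NULL, POOL_SIZE, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);                   /* the mapping stays valid after close */
    return f;
}

/* One store routine for the whole address space: the caller never cares
 * which kind of memory backs a given address. */
static void fabric_store(fabric_t *f, uint32_t addr, uint8_t value) {
    if (addr < POOL_SIZE)
        f->fast[addr] = value;
    else
        f->slow[addr - POOL_SIZE] = value;
}

static uint8_t fabric_load(fabric_t *f, uint32_t addr) {
    return addr < POOL_SIZE ? f->fast[addr] : f->slow[addr - POOL_SIZE];
}

int main(void) {
    fabric_t f = fabric_open("pool.img");
    fabric_store(&f, 42, 0xAB);              /* lands in the fast pool */
    fabric_store(&f, SLOW_BASE + 42, 0xCD);  /* lands in the persistent pool */
    printf("fast[42]=0x%02X  slow[42]=0x%02X\n",
           fabric_load(&f, 42), fabric_load(&f, SLOW_BASE + 42));
    msync(f.slow, POOL_SIZE, MS_SYNC);       /* flush the persistent pool to its backing file */
    return 0;
}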

Last year, Intel announced Optane DC Persistent Memory, with sizes up to 512 GB per module. The product is pin-compatible with DDR4 DIMMs but uses 3D XPoint, a technology positioned by Intel as sitting somewhere between DRAM and NAND. Optane DIMMs have higher capacities than DRAM and greater endurance (in terms of write/erase cycles) than NAND, but are slower than DRAM when being written to. Notably, Optane DIMMs retain data when powered down. For memory-driven computing, new kinds of memory such as this, along with phase-change and spin-torque memory, are vital to creating memory fabrics.
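
On Linux, persistent-memory modules like these are commonly exposed to applications as a DAX-capable filesystem and accessed with ordinary memory mapping. The minimal sketch below assumes a hypothetical mount point /mnt/pmem and uses plain POSIX mmap/msync rather than a vendor library, so it only approximates how production code (for example, code built on PMDK) would map and flush persistent data; any writable path will do for trying it out.

/* pmem_note.c - minimal sketch: map a file on a (hypothetically) DAX-mounted
 * persistent-memory filesystem and write a record that outlives a power cycle. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

#define REGION_SIZE 4096

int main(void) {
    /* /mnt/pmem is an assumed mount point for a DAX filesystem backed by
     * persistent-memory DIMMs. */
    int fd = open("/mnt/pmem/note.bin", O_CREAT | O_RDWR, 0600);
    if (fd < 0) { perror("open"); return 1; }
    ftruncate(fd, REGION_SIZE);

    char *region = mmap(NULL, REGION_SIZE, PROT_READ | PROT_WRITE,
                        MAP_SHARED, fd, 0);
    if (region == MAP_FAILED) { perror("mmap"); return 1; }

    /* Stores go straight to the mapped region; unlike DRAM, the data is
     * still there after a reboot once it has been flushed. */
    strcpy(region, "state that outlives a reboot");
    msync(region, REGION_SIZE, MS_SYNC);   /* force the update to persistence */

    printf("wrote: %s\n", region);
    munmap(region, REGION_SIZE);
    close(fd);
    return 0;
}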

Additionally, an important function of memory fabrics is to reduce access latencies as much as possible, which can also benefit other accelerators, such as GPUs.

"When the cores in the main CPU talk to each other—talk to memory—we measure that time in nanoseconds. When [talking to a] GPU, we're taking microseconds. A thousand times slower," Bresniker said. "On a memory fabric where we are measuring all of those latencies in nanoseconds, I can take that accelerator or that memory device, it's worth is actually increased dramatically because it's on that memory fabric."

For more, check out "4 reasons why your company should consider in-memory big data processing," and "3 reasons why your company dislikes big data, and 4 things you can do about it."

By: DocMemory
Copyright © 2023 CST, Inc. All Rights Reserved
