Industry News

New cache management technique speeds system by 9%


Friday, September 30, 2016

By improving the efficiency with which computer processors find and retrieve the data they need from memory, researchers from Samsung Electronics and North Carolina State University (NC State) have given computer applications a speed boost of over nine percent, while reducing energy use by over four percent.

Though a computer stores all of the data it works on off-chip in main memory (aka RAM), data the processor needs regularly is also held temporarily in a die-stacked DRAM (dynamic random access memory) cache, where it can be retrieved more quickly. This data is stored in large blocks, or macroblocks, which helps the processor locate the data it needs but means additional, unwanted data contained in those macroblocks is retrieved along with it, wasting time and energy.
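To make the trade-off concrete, here is a rough Python sketch (not the researchers' simulator; the macroblock and block sizes are illustrative assumptions) of a conventional macroblock-grained cache, counting how many bytes each miss pulls in from main memory versus how many the processor actually touches:

MACROBLOCK_BLOCKS = 32      # assumption: 32 blocks of 64 B = 2 KB macroblocks
BLOCK_BYTES = 64

class MacroblockCache:
    def __init__(self):
        self.cached = set()          # macroblock IDs currently resident
        self.bytes_fetched = 0       # traffic pulled from off-chip memory
        self.bytes_used = 0          # bytes the processor actually touched

    def access(self, address):
        mb_id = address // (MACROBLOCK_BLOCKS * BLOCK_BYTES)
        if mb_id not in self.cached:
            # Miss: bring in the whole macroblock, useful or not.
            self.cached.add(mb_id)
            self.bytes_fetched += MACROBLOCK_BLOCKS * BLOCK_BYTES
        self.bytes_used += BLOCK_BYTES

cache = MacroblockCache()
# The processor touches only the first block of 1,000 different macroblocks.
for mb in range(1000):
    cache.access(mb * MACROBLOCK_BLOCKS * BLOCK_BYTES)

print(f"fetched {cache.bytes_fetched} B, used {cache.bytes_used} B "
      f"({100 * cache.bytes_used / cache.bytes_fetched:.1f}% useful)")

In this contrived access pattern only about 3 percent of the fetched bytes are ever used; the rest is exactly the wasted traffic the researchers set out to avoid.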

By having the cache learn over time which data the processor actually uses from each macroblock, the researchers from Samsung and NC State were able to improve the efficiency of data retrieval in two ways. First, the cache can compress each macroblock so that it contains only the relevant data, which speeds up retrieval; second, the compressed macroblocks free up space in the cache for other data the processor is more likely to need.
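The toy Python sketch below is a loose illustration of that idea, not the published DFC design: the cache records which blocks of each macroblock were actually touched (its "footprint") and, on later misses, fetches and stores only those blocks, leaving room for other data.

from collections import defaultdict

class FootprintCache:
    def __init__(self):
        self.footprints = defaultdict(set)   # macroblock ID -> blocks seen in use
        self.resident = {}                   # macroblock ID -> blocks currently cached
        self.blocks_fetched = 0

    def access(self, mb_id, block_off):
        self.footprints[mb_id].add(block_off)       # keep learning the footprint
        if mb_id not in self.resident:
            # Miss: fetch only the blocks this macroblock is known to need.
            needed = set(self.footprints[mb_id])
            self.resident[mb_id] = needed
            self.blocks_fetched += len(needed)
        elif block_off not in self.resident[mb_id]:
            # The learned footprint was incomplete: fetch the one missing block.
            self.resident[mb_id].add(block_off)
            self.blocks_fetched += 1

cache = FootprintCache()
for _ in range(2):                   # the second pass reuses the learned footprints
    for mb in range(100):
        cache.access(mb, 0)          # only 2 of a macroblock's 32 blocks
        cache.access(mb, 1)          # are ever touched in this workload
    cache.resident.clear()           # crude stand-in for eviction between passes

print("blocks fetched:", cache.blocks_fetched,
      "vs", 2 * 100 * 32, "for whole-macroblock fetches")

Here the footprint-aware cache moves 400 blocks instead of 6,400, the kind of saving that translates into both faster retrieval and more usable cache capacity.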

The technique, called Dense Footprint Cache (DFC), was compared against current state-of-the-art die-stacked DRAM management methods using a processor and memory simulator. After running 3 billion instructions for each of the applications tested, the researchers measured a speed boost of 9.5 percent and a reduction in energy use of 4.3 percent.

The researchers also found the Dense Footprint Cache approach significantly reduced the incidence of last-level cache (LLC) misses. This is when the processor attempts to retrieve data from the cache that isn't there, meaning the data needs to be retrieved from off-chip main memory, which wastes time and energy. In testing, Dense Footprint Cache reduced LLC miss ratios by 43 percent.
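As a back-of-the-envelope reading of that figure, the 43 percent is a relative reduction; the baseline miss ratio in the short Python snippet below is an assumed number, not one taken from the paper:

baseline_miss_ratio = 0.20      # assumed baseline LLC miss ratio, not from the paper
reduction = 0.43                # reduction reported for Dense Footprint Cache
dfc_miss_ratio = baseline_miss_ratio * (1 - reduction)
print(f"DFC miss ratio: {dfc_miss_ratio:.1%}")   # prints 11.4%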

The team will present its paper on DFC at the International Symposium on Memory Systems, held in Washington, DC, from Oct. 3-6.

By: DocMemory
Copyright © 2023 CST, Inc. All Rights Reserved
