CHOP: Adaptive filter-based DRAM caching for CMP server platforms

CHOP: Adaptive filter-based DRAM caching for CMP server platforms. doi:10.1109/HPCA.2010.5416642. Xiaowei Jiang, Niti Madan, Li Zhao, Mike Upton, Ravishankar Iyer, et al.

Abstract—As manycore architectures enable a large number of cores on the die, a key challenge that emerges is the availability of memory bandwidth with conventional DRAM solutions. To address this challenge, integration of large DRAM caches that provide as much as 5× higher bandwidth and as low as 1/3rd of the latency (as compared to conventional DRAM) is very promising. However, organizing and implementing a large DRAM cache is challenging because of two primary tradeoffs: (a) DRAM caches at cache line granularity require too large an on-chip tag area that makes it undesirable, and (b) DRAM caches with larger page granularity require too much bandwidth because the miss rate does not reduce enough to overcome the bandwidth increase. In this paper, we propose CHOP (Caching HOt Pages) in DRAM caches to address these challenges. We study several filter-based DRAM caching techniques: (a) a filter cache (CHOP-FC) that profiles pages and determines the hot subset of pages to allocate into the DRAM cache, (b) a memory-based filter cache (CHOP-MFC) that spills and fills filter state to improve the accuracy and reduce the size of the filter cache, and (c) an adaptive DRAM caching technique (CHOP-AFC) to determine when the filter cache should be enabled and disabled for DRAM caching. We conduct detailed simulations with server workloads to show that our filter-based DRAM caching techniques achieve the following: (a) on average over 30% performance improvement over previous …
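The core idea behind the filter cache (CHOP-FC) — profiling pages and promoting only the hot subset into the DRAM cache — can be illustrated with a minimal sketch. This is not the paper's implementation: the counter table, LRU replacement, 4 KiB page size, and the `hot_threshold` and `capacity` values are all illustrative assumptions.

```python
from collections import OrderedDict

class FilterCache:
    """Sketch of a CHOP-FC-style filter cache: a small table of per-page
    access counters. A page is classified as hot once its counter crosses
    a threshold; only hot pages would be allocated into the DRAM cache.
    Counter entries are evicted in LRU order when the table overflows.
    Parameter values are illustrative, not taken from the paper."""

    def __init__(self, capacity=1024, hot_threshold=32, page_shift=12):
        self.capacity = capacity          # max tracked (non-hot) pages
        self.hot_threshold = hot_threshold
        self.page_shift = page_shift      # 12 -> 4 KiB pages
        self.counters = OrderedDict()     # page number -> access count
        self.hot_pages = set()            # pages promoted to the DRAM cache

    def access(self, addr):
        """Record a memory access; return True if the page is hot
        (i.e., it would be cached in DRAM at page granularity)."""
        page = addr >> self.page_shift
        if page in self.hot_pages:
            return True
        count = self.counters.pop(page, 0) + 1
        if count >= self.hot_threshold:
            self.hot_pages.add(page)      # promote: allocate into DRAM cache
            return True
        self.counters[page] = count       # reinsert as most-recently-used
        if len(self.counters) > self.capacity:
            self.counters.popitem(last=False)  # evict LRU counter entry
        return False
```

For example, with `hot_threshold=4`, the fourth access to any address within the same 4 KiB page promotes that page, and all subsequent accesses to it hit the hot set. An adaptive variant in the spirit of CHOP-AFC would additionally toggle this filtering on or off based on observed miss rate and memory-bandwidth utilization.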