GREEN Cache: Exploiting the Disciplined Memory Model of OpenCL on GPUs
- Authors
- Dong Hyuk Woo, Jaekyu Lee, Mani Azimi, and Hyesoon Kim
- Subjects
Computer science, Cache coloring, CPU cache, Pipeline burst cache, Parallel computing, Cache pollution, Cache-oblivious algorithm, Theoretical Computer Science, Non-uniform memory access, CUDA, Pinned memory, Cache invalidation, Write-once, Cache hierarchy, Cache algorithms, Direct memory access, Snoopy cache, Hardware_MEMORYSTRUCTURES, Cache-only memory architecture, Uniform memory access, Memory bandwidth, Smart Cache, Memory management, Computational Theory and Mathematics, Computer architecture, Hardware and Architecture, Page cache, Memory model, Cache, Software
- Abstract
As graphics processing unit architectures are deployed across a broad computing spectrum, from hand-held and embedded devices to high-performance computing servers, OpenCL has become the de facto standard programming environment for general-purpose computing on GPUs. Unlike its CPU counterparts, OpenCL has several distinct features, such as its disciplined memory model, which is partially inherited from conventional 3D graphics programming models. At the same time, due to ever-increasing memory bandwidth pressure and low-power requirements, the capacity of on-chip caches in GPUs keeps growing. Given these trends, we believe there are interesting programming-model/architecture co-optimization opportunities, in particular in how to utilize large on-chip GPU caches energy-efficiently. In this paper, as a showcase, we study the characteristics of the OpenCL memory model and propose a technique called the GPU Region-aware Energy-Efficient Non-inclusive cache hierarchy, or GREEN cache hierarchy. Our simulation results show that the GREEN cache saves 56 percent of dynamic energy in the L1 cache, 39 percent of dynamic energy in the L2 cache, and 50 percent of leakage energy in the L2 cache, with practically no performance degradation or increase in off-chip accesses.
- Published
- 2015
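As a brief illustration of the disciplined memory model the abstract refers to (this sketch is not taken from the paper): in OpenCL C, every pointer argument carries an explicit address-space qualifier, and work-items see each other's local-memory writes only at explicit barriers, so the hardware knows statically which memory region each access targets. The kernel below is a minimal sketch; its name and parameters are purely illustrative.

```c
/* Minimal OpenCL C kernel sketch (illustrative, not from the paper) showing
 * the explicit address-space qualifiers of the OpenCL memory model. */
__kernel void scale_and_copy(__global const float *in,   /* global region, read-only here   */
                             __global float *out,        /* global region, written by kernel */
                             __constant float *coeff,    /* constant region: immutable during launch */
                             __local float *scratch)     /* local region: shared within one work-group */
{
    size_t gid = get_global_id(0);
    size_t lid = get_local_id(0);

    /* 'v' lives in the private region of this work-item. */
    float v = in[gid] * coeff[0];

    scratch[lid] = v;
    barrier(CLK_LOCAL_MEM_FENCE);      /* local memory is made consistent only at explicit barriers */

    out[gid] = scratch[lid];
}
```

Because constant data and read-only global buffers cannot change during a kernel launch, and local data never leaves its compute unit, a cache hierarchy can in principle treat these regions differently; this is the kind of programming model/architecture co-optimization opportunity the abstract describes.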