An architecture for the efficient implementation of compressive sampling reconstruction algorithms in reconfigurable hardware
- Source :
- Visual Information Processing
- Publication Year :
- 2007
- Publisher :
- SPIE, 2007.
Abstract
- According to Shannon-Nyquist sampling theory, the number of samples required to reconstruct a signal is proportional to its bandwidth. Recently, it has been shown that acceptable reconstructions are possible from a greatly reduced number of random samples, a process known as compressive sampling. Taking advantage of this result has a radical impact on power consumption and communication bandwidth, both crucial in applications based on small, mobile, or unattended platforms such as UAVs and distributed sensor networks. Although the benefits of these compression techniques are self-evident, the reconstruction process requires solving nonlinear signal processing problems, which limits applicability in portable and real-time systems. In particular, (1) the power consumed by these demanding computations offsets the power savings afforded by compressive sampling, and (2) limited computational power prevents the algorithms from keeping pace with the data-capturing sensors, resulting in undesirable data loss. FPGA-based computers offer low power consumption and high computational capacity, providing a solution to both problems simultaneously. In this paper, we present an architecture that implements the algorithms central to compressive sampling in an FPGA environment. We begin by studying the computational profile of the convex optimization algorithms used in compressive sampling, and then present the design of a pixel pipeline suitable for FPGA implementation that is able to compute these algorithms.
Details
- ISSN :
- 0277-786X
- Database :
- OpenAIRE
- Journal :
- Visual Information Processing XVI
- Accession number :
- edsair.doi...........bda702d3bec78f5e302fae7087bb9ade
- Full Text :
- https://doi.org/10.1117/12.719264