Scientific Reports (Feb 2021)
Lowering latency and processing burden in computational imaging through dimensionality reduction of the sensing matrix
Abstract
Recent demonstrations have shown that frequency-diverse computational imaging systems can greatly simplify conventional imaging architectures by transferring constraints into the digital layer. Here, in order to limit the latency and processing burden involved in image reconstruction, we propose to truncate the insignificant principal components of the sensing matrix that links the measurements to the scene to be imaged. In contrast to recent work using principal component analysis to synthesize scene illuminations, our generic approach is fully unsupervised and is applied directly to the sensing matrix. We impose no restrictions on the type of imageable scene, require no training data, and employ no actively reconfigurable radiating apertures. This paper paves the way for a new degree of freedom in image reconstruction, allowing one to place the performance emphasis either on image quality or on latency and computational burden. Such relaxations will be essential for the widespread deployment of computational microwave and millimeter-wave imagers in scenarios such as security screening. We show in this specific context that it is possible to reduce both the processing time and the memory consumption with only a minor impact on the quality of the reconstructed images.
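To make the core idea concrete, the sketch below illustrates one way such a truncation could be performed: the principal components of a sensing matrix are obtained via an SVD, the weakest components are discarded, and the reconstruction is carried out in the reduced subspace. This is a minimal illustration under assumed names and sizes (H, g, f_true, k are hypothetical), not the paper's exact reconstruction pipeline.

```python
import numpy as np

# Hedged sketch: dimensionality reduction of a sensing matrix H by
# truncating its insignificant principal components before reconstruction.
# All names and dimensions here are illustrative assumptions.

rng = np.random.default_rng(0)
M, N = 200, 1000                    # measurements x scene pixels (arbitrary)
H = rng.standard_normal((M, N))     # stand-in frequency-diverse sensing matrix
f_true = rng.standard_normal(N)     # stand-in scene reflectivity
g = H @ f_true                      # simulated measurement vector

# Principal components of H via SVD; keep only the k strongest.
U, s, Vh = np.linalg.svd(H, full_matrices=False)
k = 50                              # retained components: quality vs. latency trade-off
Uk, sk, Vhk = U[:, :k], s[:k], Vh[:k, :]

# Reconstruction in the truncated subspace (truncated pseudoinverse);
# the smaller matrices reduce both memory consumption and processing time.
g_reduced = Uk.T @ g
f_est = Vhk.T @ (g_reduced / sk)
```

Varying k moves the operating point along the trade-off between image quality on the one hand and latency and computational burden on the other.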