In recent years, deep learning has exhibited remarkable performance in image classification. Nevertheless, traditional deep-learning-based techniques depend heavily on the availability of high-quality images for conveying information. This reliance results in inefficient utilization of hardware and software resources across various stages, including image acquisition, storage, and processing. Additionally, these techniques often require substantial amounts of data to effectively learn the underlying mapping, posing challenges in practical scenarios where acquiring a sufficient volume of paired data proves difficult. In this paper, we introduce a novel approach for image-free few-shot recognition using a single-pixel detector. Our method comprises two stages. First, we design a neural network that integrates encoding and decoding modules and learns optimized encoding masks from statistical priors. Second, we employ these optimized masks to generate compressed 1D measurements, which are fed into a classification network preceded by the decoding module trained in the first stage; the decoding module's parameters initialize the second stage of training. Furthermore, we incorporate a meta-training strategy, commonly used in few-shot classification, to mitigate data requirements during the second stage. Simulation results demonstrate the effectiveness of our approach in image-free classification directly from 1D measurements, bypassing the time-consuming image reconstruction process. Our technique achieves a reduction in data volume of two orders of magnitude while relying on only a limited number of paired data samples.
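To make the acquisition step concrete, the following is a minimal NumPy sketch of how a single-pixel detector compresses a 2D scene into a short 1D measurement vector. The random binary masks here are placeholders standing in for the optimized masks the paper's encoding module would learn; the scene, resolution, and mask count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 32, 32   # assumed scene resolution (illustrative)
K = 10          # number of encoding masks, i.e. number of scalar measurements

# Placeholder scene; in the single-pixel setting this is the optical input
# and is never stored as an image on the detector side.
scene = rng.random((H, W))

# Placeholder binary masks. In the paper's method these are *learned*
# jointly with the decoding module; random masks stand in for them here.
masks = rng.integers(0, 2, size=(K, H, W)).astype(float)

# Single-pixel acquisition: each mask modulates the scene and the detector
# records one scalar -- the total transmitted intensity.
measurements = masks.reshape(K, -1) @ scene.reshape(-1)

compression = (H * W) / K
print(measurements.shape)  # (10,)  -- the 1D input to the classifier
print(compression)         # 102.4  -- roughly two orders of magnitude
```

With 10 measurements for a 1024-pixel scene, the data volume drops by a factor of about 100, which is the regime the abstract refers to; classification then operates on this 1D vector directly, with no image reconstruction.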