
Time‐Lapse Image Classification Using a Diffractive Neural Network

Authors :
Md Sadman Sakib Rahman
Aydogan Ozcan
Source :
Advanced Intelligent Systems, Vol 5, Iss 5 (2023)
Publication Year :
2023
Publisher :
Wiley

Abstract

Diffractive deep neural networks (D2NNs), composed of spatially engineered passive surfaces, collectively process optical input information at the speed of light propagation through a thin diffractive volume, without any external computing power. Diffractive networks have been demonstrated to achieve all‐optical object classification and to perform universal linear transformations. Herein, a “time‐lapse” image classification scheme using a diffractive network is demonstrated for the first time, significantly advancing its classification accuracy and generalization performance on complex input objects by using the lateral movements of the input objects and/or the diffractive network relative to each other. In a different context, such relative movements of the objects and/or the camera are routinely used for image super‐resolution applications; inspired by their success, a time‐lapse diffractive network is designed to benefit from the complementary information content created by controlled or random lateral shifts. The design space and performance limits of time‐lapse diffractive networks are numerically explored, revealing a blind testing accuracy of 62.03% on the optical classification of objects from the CIFAR‐10 dataset. This constitutes the highest inference accuracy achieved so far using a single diffractive network on the CIFAR‐10 dataset. Time‐lapse diffractive networks will be broadly useful for the spatiotemporal analysis of input signals using all‐optical processors.
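The core idea in the abstract — accumulating the outputs of a fixed optical linear transform over laterally shifted copies of the input — can be sketched numerically. The following is a minimal toy illustration, not the paper's actual method: a random complex matrix stands in for the trained diffractive network, the shift set and image size are arbitrary choices, and detector readings are modeled as output intensities summed across shifts before the class decision.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16                      # toy input field of N x N pixels (assumption)
num_classes = 10            # e.g. CIFAR-10 has 10 classes
# Example lateral displacements of the object relative to the network
shifts = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1)]

# Stand-in for a trained diffractive network: one complex linear map from
# the flattened input field to num_classes detector amplitudes.
W = rng.normal(size=(num_classes, N * N)) + 1j * rng.normal(size=(num_classes, N * N))

def classify_time_lapse(image):
    """Accumulate detector intensities over lateral shifts, then argmax."""
    scores = np.zeros(num_classes)
    for dy, dx in shifts:
        # Laterally shift the input object (circular shift for simplicity)
        shifted = np.roll(image, shift=(dy, dx), axis=(0, 1))
        field = W @ shifted.ravel()      # optical linear transform
        scores += np.abs(field) ** 2     # detectors measure intensity
    return int(np.argmax(scores))

image = rng.random((N, N))
print(classify_time_lapse(image))        # predicted class index in [0, 10)
```

Each shift presents the network with complementary spatial information about the same object; summing the detector intensities across the time-lapse sequence is one simple way to fuse that information, consistent with the scheme the abstract describes at a high level.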

Details

Language :
English
ISSN :
26404567
Volume :
5
Issue :
5
Database :
Directory of Open Access Journals
Journal :
Advanced Intelligent Systems
Publication Type :
Academic Journal
Accession number :
edsdoj.733670b28f124d64a6c6e0f320090ab3
Document Type :
article
Full Text :
https://doi.org/10.1002/aisy.202200387