
User Identification: A Key Enabler for Multi-User Vision-Aided Communications

Authors:
Charan, Gouranga
Alkhateeb, Ahmed
Publication Year:
2022

Abstract

Vision-aided wireless communication is attracting increasing interest and finding new use cases in various wireless communication applications. These frameworks leverage visual data, captured for example by cameras installed at the infrastructure or on mobile devices, to build a perception of the communication environment using deep learning and advances in computer vision and visual scene understanding. Prior work has investigated problems such as vision-aided beam, blockage, and hand-off prediction in millimeter wave (mmWave) systems and vision-aided covariance prediction in massive MIMO systems. This prior work, however, has focused on scenarios with a single object (user) in front of the camera. In this paper, we define the user identification task as a key enabler for realistic vision-aided communication systems that can operate in crowded scenarios and support multi-user applications. The objective of the user identification task is to distinguish the target communication user from the other candidate objects (distractors) in the visual scene. We develop machine learning models that process either a single frame or a sequence of frames of visual and wireless data to efficiently identify the target user in the visual/communication environment. Using the large-scale multi-modal sensing and communication dataset DeepSense 6G, which is based on real-world measurements, we show that the developed approaches identify the target users with more than 97% accuracy in realistic settings. This paves the way for scaling vision-aided wireless communication applications to real-world scenarios and practical deployments.

Comment: Datasets and code files are available on the DeepSense website: https://deepsense6g.net/
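The abstract does not specify the model architecture, so the sketch below is only a rough illustration of one plausible setup for the described task: summarizing a sequence of per-candidate visual features, fusing them with a wireless feature vector, and scoring each candidate object as the target user. All names, dimensions, and the fusion strategy are assumptions for illustration, not the authors' actual design; the released code is on the DeepSense 6G website linked above.

# Hypothetical sketch (not the paper's model): score each candidate object
# in the scene against the wireless observation to pick the target user.
import torch
import torch.nn as nn

class UserIdSketch(nn.Module):
    def __init__(self, vis_dim=256, rf_dim=64, hidden=128):
        super().__init__()
        self.gru = nn.GRU(vis_dim, hidden, batch_first=True)  # temporal model over frames
        self.rf_proj = nn.Linear(rf_dim, hidden)              # embed wireless features
        self.score = nn.Linear(2 * hidden, 1)                 # per-candidate logit

    def forward(self, vis_seq, rf_feat):
        # vis_seq: (batch, num_candidates, num_frames, vis_dim) visual features
        # rf_feat: (batch, rf_dim) wireless features (e.g., beam power vector)
        b, c, t, d = vis_seq.shape
        _, h = self.gru(vis_seq.reshape(b * c, t, d))          # summarize each candidate's track
        h = h[-1].reshape(b, c, -1)                            # (batch, num_candidates, hidden)
        r = self.rf_proj(rf_feat).unsqueeze(1).expand(-1, c, -1)
        logits = self.score(torch.cat([h, r], dim=-1)).squeeze(-1)
        return logits                                          # argmax/softmax over candidates

# Usage: logits = UserIdSketch()(torch.randn(2, 5, 8, 256), torch.randn(2, 64))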

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2210.15652
Document Type:
Working Paper