Towards Real-World 6G Drone Communication: Position and Camera Aided Beam Prediction

IEEE Global Communications Conference (GLOBECOM) 2022

Gouranga Charan, Andrew Hredzak, Christian Stoddard, Benjamin Berrey, Madhav Seth, Hector Nunez, Ahmed Alkhateeb

Wireless Intelligence Lab, Arizona State University

Figure: An illustration of a mmWave basestation serving a drone in a real wireless environment. The basestation utilizes additional sensing data, such as RGB images and the GPS location of the drone, to predict the optimal beam.

Abstract

Millimeter-wave (mmWave) and terahertz (THz) communication systems typically deploy large antenna arrays to guarantee sufficient receive signal power. The beam training overhead associated with these arrays, however, makes it hard for these systems to support highly-mobile applications such as drone communication. To overcome this challenge, this paper proposes a machine learning-based approach that leverages additional sensory data, such as visual and positional data, for fast and accurate mmWave/THz beam prediction. The developed framework is evaluated on a real-world multi-modal mmWave drone communication dataset comprising co-existing camera, practical GPS, and mmWave beam training data. The proposed sensing-aided solution achieves a top-1 beam prediction accuracy of 86.32% and close to 100% top-3 and top-5 accuracies, while considerably reducing the beam training overhead. This highlights a promising solution for enabling highly-mobile 6G drone communications.
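For reference, the top-1/top-3/top-5 metrics reported above measure how often the ground-truth optimal beam appears among the k highest-scoring beams predicted by the model. The following is a minimal NumPy sketch of that metric; the 64-beam codebook size and the random inputs are illustrative assumptions, not values from the paper.

```python
import numpy as np

def topk_accuracy(logits, true_beams, k):
    """Fraction of samples whose ground-truth beam index is among the
    k highest-scoring predicted beams."""
    # Indices of the k largest logits per sample (order within the top-k is irrelevant).
    topk = np.argsort(logits, axis=1)[:, -k:]
    hits = np.any(topk == true_beams[:, None], axis=1)
    return hits.mean()

# Example with dummy predictions: 4 samples, a 64-beam codebook (hypothetical sizes).
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 64))
true_beams = rng.integers(0, 64, size=4)
for k in (1, 3, 5):
    print(f"top-{k} accuracy: {topk_accuracy(logits, true_beams, k):.2f}")
```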

Proposed Solution

A block diagram showing the proposed solution for both the vision-aided and the position-aided beam prediction tasks. As shown in the figure, the camera installed at the basestation captures real-time images of the drone in the wireless environment, and a CNN is then utilized to predict the optimal beam index. For the position-aided solution, the basestation receives the real-time GPS position data reported by the drone, which is then provided to a fully-connected neural network to predict the beam. A minimal sketch of both models is given below.
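The sketch below illustrates the two-branch idea in PyTorch: an image-classification CNN repurposed to score beams from the basestation camera, and a small fully-connected network that maps the drone's position to beam scores. The specific backbone (ResNet-18), layer widths, codebook size, and input shapes are assumptions for illustration, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

num_beams = 64  # hypothetical codebook size

# Vision-aided model: a CNN backbone with its classifier head replaced to
# output one score per beam. In practice, ImageNet-pretrained weights
# (e.g., ResNet18_Weights.DEFAULT) would typically be loaded.
vision_model = models.resnet18(weights=None)
vision_model.fc = nn.Linear(vision_model.fc.in_features, num_beams)

# Position-aided model: a fully-connected network mapping the drone's
# (normalized) GPS coordinates to beam scores. Layer widths are assumptions.
position_model = nn.Sequential(
    nn.Linear(2, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, num_beams),
)

# Both branches are trained as standard classifiers over beam indices.
criterion = nn.CrossEntropyLoss()

# Forward pass with dummy inputs, just to show the tensor shapes involved.
images = torch.randn(8, 3, 224, 224)            # batch of RGB frames from the basestation camera
positions = torch.rand(8, 2)                    # normalized (latitude, longitude) pairs
beam_labels = torch.randint(0, num_beams, (8,)) # ground-truth optimal beam indices

loss_vision = criterion(vision_model(images), beam_labels)
loss_position = criterion(position_model(positions), beam_labels)
print(loss_vision.item(), loss_position.item())
```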

Real-World Demonstration

DeepSense 6G Dataset

DeepSense 6G is a real-world multi-modal dataset that comprises co-existing sensing and communication data, such as mmWave wireless measurements, camera images, GPS positions, LiDAR, and radar, collected in realistic wireless environments. A link to the DeepSense 6G website is provided below.
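As a rough sketch of how a DeepSense-style scenario can be consumed, the snippet below pairs each sample's camera frame, drone GPS position, and ground-truth beam index from a CSV index file. The file name and column names ("unit1_rgb", "unit2_lat", "unit2_lon", "beam_index") are hypothetical placeholders; consult the DeepSense 6G website for the actual layout of each scenario.

```python
import pandas as pd

# Hypothetical scenario index file pairing sensing data with beam labels.
samples = pd.read_csv("scenario23_train.csv")

for _, row in samples.head(3).iterrows():
    image_path = row["unit1_rgb"]                    # RGB frame captured at the basestation
    position = (row["unit2_lat"], row["unit2_lon"])  # drone GPS coordinates
    beam = int(row["beam_index"])                    # ground-truth optimal beam index
    print(image_path, position, beam)
```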

DeepSense 6G Scenario

For this mmWave drone beam prediction task, we build the development/challenge datasets based on the DeepSense data from Scenario 23. For further details regarding the scenario, follow the links provided below.

Citation

If you use the dataset or scripts on this page, please cite the following two papers:

A. Alkhateeb, G. Charan, T. Osman, A. Hredzak, and N. Srinivas, “DeepSense 6G: large-scale real-world multi-modal sensing and communication datasets,” to be available on arXiv, 2022. [Online]. Available: https://www.DeepSense6G.net

@article{DeepSense,
  author  = {Alkhateeb, A. and Charan, G. and Osman, T. and Hredzak, A. and Srinivas, N.},
  title   = {{DeepSense 6G}: Large-Scale Real-World Multi-Modal Sensing and Communication Datasets},
  journal = {to be available on arXiv},
  year    = {2022},
  url     = {https://www.DeepSense6G.net},
}

G. Charan, A. Hredzak, C. Stoddard, B. Berrey, M. Seth, H. Nunez, and A. Alkhateeb, “Towards Real-World 6G Drone Communication: Position and Camera Aided Beam Prediction,” in Proc. of IEEE Global Communications Conference (GLOBECOM), 2022.

@inproceedings{Charan2022,
  author    = {Charan, G. and Hredzak, A. and Stoddard, C. and Berrey, B. and Seth, M. and Nunez, H. and Alkhateeb, A.},
  title     = {Towards Real-World {6G} Drone Communication: Position and Camera Aided Beam Prediction},
  booktitle = {2022 IEEE Global Communications Conference (GLOBECOM)},
  year      = {2022},
}