RIS-Assisted UAV for Timely Data Collection in IoT Networks
Publisher
Institute of Electrical and Electronics Engineers Inc.
Abstract
Intelligent transportation systems are thriving thanks to a wide range of technological advances, namely 5G communications, the Internet of Things (IoT), artificial intelligence, and edge computing. Meanwhile, in environments where direct communication can be impaired, for instance by blockages in urban areas, unmanned aerial vehicles (UAVs) are an attractive alternative for providing and enhancing connectivity. In this article, we consider a time-constrained data-gathering problem over a network of sensing devices, assisted by a UAV. A reconfigurable intelligent surface (RIS) is deployed to further improve the connectivity and the energy efficiency of the UAV. This integrated problem raises challenges related to the configuration of the RIS phase-shift elements, the scheduling of the IoT devices' transmissions, and the trajectory of the UAV. First, the problem is formulated with the objective of maximizing the total number of devices served, each within its activation period. Owing to its complexity, we leverage deep reinforcement learning in our solution: the UAV trajectory planning is modeled as a Markov decision process, and proximal policy optimization is invoked to solve it. The RIS configuration is then handled via block coordinate descent. Finally, extensive simulations demonstrate the efficiency of our approach, which in many cases outperforms other methods by more than 50%. We also show that integrating an RIS with a UAV in IoT networks can improve the UAV's energy efficiency.
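To illustrate the RIS-configuration step mentioned in the abstract, the sketch below runs a block coordinate descent over the RIS phase shifts for a single device-UAV link. The channel values, the number of elements `N`, and the unit-modulus reflection model `diag(e^{jθ_n})` are illustrative assumptions, not the paper's exact system model; each coordinate update uses the standard closed form that co-phases one element's reflected path with the rest of the combined channel.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32  # number of RIS elements (illustrative)

# Illustrative flat-fading channels: direct device-UAV link,
# device-RIS link, and RIS-UAV link.
h_d = rng.normal() + 1j * rng.normal()
h_r = rng.normal(size=N) + 1j * rng.normal(size=N)
g = rng.normal(size=N) + 1j * rng.normal(size=N)

def gain(theta):
    # Effective channel magnitude: direct path plus RIS-reflected path,
    # with the RIS modeled as diag(exp(j*theta)).
    return np.abs(h_d + (g * np.exp(1j * theta)) @ h_r)

theta = np.zeros(N)
for _ in range(3):  # a few BCD sweeps over the phase-shift coordinates
    for n in range(N):
        contrib = g * np.exp(1j * theta) * h_r
        # Combined channel excluding element n.
        rest = h_d + contrib.sum() - contrib[n]
        # Closed-form coordinate update: align element n's reflected
        # path with the phase of the remaining combined channel.
        theta[n] = np.angle(rest) - np.angle(g[n] * h_r[n])
```

At convergence all reflected paths are co-phased with the direct path, so the gain approaches the upper bound `|h_d| + Σ_n |g_n h_r,n|`; in the full problem this inner step would alternate with the learned UAV trajectory and scheduling decisions.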
Keywords
Deep reinforcement learning (DRL), Energy efficiency, Internet of Things (IoT), Reconfigurable intelligent surface (RIS), Unmanned aerial vehicles (UAVs), 5G mobile communication systems, Antennas, Deep learning, Information management, Intelligent systems, Intelligent transportation systems, Markov processes, Reinforcement learning, Scheduling, Trajectories, Vehicle-to-vehicle communications, Optimization, Resource management, Trajectory planning, Wireless communications