AUB ScholarWorks

Vision-based Autonomous Navigation of UAVs

dc.contributor.advisor Elhajj, Imad
dc.contributor.author Wardeh, Majd
dc.date.accessioned 2023-02-10T11:40:04Z
dc.date.available 2023-02-10T11:40:04Z
dc.date.issued 2023-02-10
dc.date.submitted 2023-02-07
dc.identifier.uri http://hdl.handle.net/10938/23958
dc.description.abstract Autonomous Drone Racing (ADR) has recently become a moonshot milestone for many roboticists. It is a challenging problem that drives the development of vision-based perception and navigation algorithms capable of fast, agile maneuvers while operating on constrained onboard computing resources and coping with the imperfect sensing of Unmanned Aerial Vehicles (UAVs). Under such constraints, the traditional map-localize-plan navigation pipeline is infeasible, whereas mapless navigation algorithms based on machine learning are showing promising results. In this thesis, we present an approach for vision-based navigation of quadrotors in an autonomous drone racing setting. We propose using short trajectory segments as control commands, inferred directly from a deep-learning model and tracked by a high-level controller in a receding-horizon fashion. Using short trajectory segments directly eliminates the path-planning module, reducing the overall latency of the system and thereby allowing higher flight speeds. Furthermore, short trajectory segments permit the use of deeper neural network models thanks to their relaxed update rate, compared with low-level commands such as thrust and angular body rates, which require high and fixed update rates. In addition, we train our policy network to predict short trajectory segments that jointly traverse a racing gate and keep it in the camera's field of view. Keeping the racing gate in the field of view increases the accuracy of future predictions and permits more accurate and robust state estimation. We compare the performance of our proposed system against a state-of-the-art method in simulation. Our system flies at nearly double the average speed, reaching speeds of up to about 4 m/s, while achieving a comparable gate-traversal success rate (91% versus 92% for the baseline).
dc.language.iso en_US
dc.subject Autonomous Navigation
dc.subject UAVs
dc.subject Artificial Intelligence
dc.subject Neural Networks
dc.subject Vision-based Navigation
dc.subject Computer Vision
dc.subject Robotics
dc.title Vision-based Autonomous Navigation of UAVs
dc.type Thesis
dc.contributor.department Department of Electrical and Computer Engineering
dc.contributor.faculty Maroun Semaan Faculty of Engineering and Architecture
dc.contributor.institution American University of Beirut
dc.contributor.commembers Asmar, Daniel
dc.contributor.commembers Daher, Naseem
dc.contributor.commembers Shammas, Elie
dc.contributor.degree MS
dc.contributor.AUBidnumber 201921349
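
The receding-horizon scheme described in the abstract can be summarized with a short sketch. The following Python code is purely illustrative and is not taken from the thesis; the class and function names (ShortSegmentPolicy, receding_horizon_loop, camera.read, estimator.current_state, controller.track) are hypothetical placeholders assumed for the example. It only shows the control flow: the policy predicts a short trajectory segment from the current camera image and state estimate, the controller tracks the beginning of that segment, and the loop repeats at a relaxed rate.

import numpy as np


class ShortSegmentPolicy:
    """Stand-in for the deep network that predicts a short trajectory segment."""

    def predict(self, image: np.ndarray, state: np.ndarray) -> np.ndarray:
        # Returns an (N, 3) array of waypoints; a real model would run a CNN
        # on the image and regress the segment here.
        return np.zeros((10, 3))


def receding_horizon_loop(camera, estimator, controller, policy, rate_hz=10.0):
    """Re-plan at a relaxed rate; low-level tracking runs in between."""
    dt = 1.0 / rate_hz
    while True:
        image = camera.read()                   # latest onboard camera frame
        state = estimator.current_state()       # position, velocity, attitude estimate
        segment = policy.predict(image, state)  # short trajectory segment prediction
        controller.track(segment, duration=dt)  # follow only the first dt seconds,
                                                # then discard the rest and re-plan

Because a new segment is predicted roughly every dt seconds rather than at the rate of low-level thrust and body-rate commands, the policy network can be deeper without violating real-time constraints; this is the relaxed update rate the abstract refers to.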

