sensors
Article
Towards Autonomous Drone Racing without GPU Using an
OAK-D Smart Camera
Leticia Oyuki Rojas-Perez and Jose Martinez-Carranza *

 
Citation: Rojas-Perez, L.O.; Martinez-Carranza, J. Towards Autonomous Drone Racing without GPU Using an OAK-D Smart Camera. Sensors 2021, 21, 7436. https://doi.org/10.3390/s21227436
Academic Editors: Yangquan Chen,
Nunzio Cennamo, M. Jamal Deen,
Simone Morais, Subhas
Mukhopadhyay and Junseop Lee
Received: 26 September 2021
Accepted: 2 November 2021
Published: 9 November 2021
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Copyright: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Department of Computational Science, Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE),
Puebla 72840, Mexico; oyukirojas@inaoep.mx
* Correspondence: carranza@inaoep.mx
Abstract:
Recent advances have shown for the first time that it is possible to beat a human with
an autonomous drone in a drone race. However, this solution relies heavily on external sensors,
specifically on the use of a motion capture system. Thus, a truly autonomous solution demands
performing computationally intensive tasks such as gate detection, drone localisation, and state
estimation. To this end, other solutions rely on specialised hardware such as graphics processing
units (GPUs) whose onboard hardware versions are not as powerful as those available for desktop
and server computers. An alternative is to combine specialised hardware with smart sensors capable
of processing specific tasks on the chip, alleviating the need for the onboard processor to perform
these computations. Motivated by this, we present the initial results of adapting a novel smart
camera, known as the OpenCV AI Kit or OAK-D, as part of a solution for autonomous drone
racing (ADR) running entirely on board. This smart camera performs on-chip neural inference without using a GPU. It can
also perform depth estimation with a stereo rig and run neural network models using images from
a 4K colour camera as the input. Additionally, seeking to limit the payload to 200 g, we present a
new 3D-printed design of the camera's back case, reducing the original weight by 40%, thus enabling
the drone to carry it in tandem with a host onboard computer, the Intel Compute Stick, where we
run a controller based on gate detection. The latter is performed with a neural model running on the
OAK-D at an operating frequency of 40 Hz, enabling the drone to fly at a speed of 2 m/s. We deem
these initial results promising toward the development of a truly autonomous solution that will run
intensive computational tasks fully on board.
Keywords: autonomous drone racing; OAK-D; CNN; deep learning; smart camera
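[Editor's note] The abstract describes a controller driven by on-chip gate detections delivered at 40 Hz, but this excerpt does not give the controller details. The following is a minimal illustrative sketch, under the assumption that the on-chip network outputs a normalised bounding box for the gate; the function name, bounding-box format, and gains are hypothetical, not the authors' implementation:

```python
# Illustrative sketch (not the paper's code): convert a normalised gate
# bounding box predicted on-chip into lateral and vertical offsets that a
# simple proportional controller could act on, once per detection frame.

def gate_offsets(bbox, k_lat=1.0, k_vert=1.0):
    """bbox = (xmin, ymin, xmax, ymax), all in [0, 1] image coordinates.

    Returns (lateral_error, vertical_error) relative to the image centre;
    positive values mean the gate lies to the right of / below the centre.
    """
    xmin, ymin, xmax, ymax = bbox
    cx = (xmin + xmax) / 2.0  # gate centre, x
    cy = (ymin + ymax) / 2.0  # gate centre, y
    return (k_lat * (cx - 0.5), k_vert * (cy - 0.5))

# A gate centred in the image yields zero error on both axes:
print(gate_offsets((0.25, 0.25, 0.75, 0.75)))  # → (0.0, 0.0)
```

At a 40 Hz detection rate, such offsets would be recomputed every 25 ms, which bounds the latency of any correction the controller can apply.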
1. Introduction
Since its creation at IROS 2016, the Autonomous Drone Racing (ADR) competition
has posed the challenge of developing an autonomous drone capable of beating a human
in a drone race. The first editions of this competition gathered research groups whose
first solutions broke the problem down into three main sub-problems: (1) gate
detection; (2) drone localisation on the race track for control and navigation; (3) suitable
hardware for onboard processing. As reported in [1,2], gate detection was first attempted
with colour-based segmentation algorithms, alongside some initial efforts using convolutional
neural networks for robust gate detection [3].
For drone localisation, visual simultaneous localisation and mapping (SLAM) and
visual odometry techniques were employed, seeking to provide global localisation on
the race track, which was exploited by teams to implement a waypoint-based navigation
system [2]. Further improvements proposed adding local pose estimation with respect to
the gate in a top-down navigation scheme where the controller drives the drone to follow a
global trajectory, which is refined once the drone flies toward the next gate [4,5]. The Game
of Drones competition at NeurIPS [6] called upon researchers to ignore hardware and
efficient performance to focus on high-level navigation strategies while seeking to push for