Article
Towards Autonomous Drone Racing without GPU Using an
OAK-D Smart Camera
Leticia Oyuki Rojas-Perez and Jose Martinez-Carranza *
Citation: Rojas-Perez, L.O.; Martinez-Carranza, J. Towards Autonomous Drone Racing without GPU Using an OAK-D Smart Camera. Sensors 2021, 21, 7436. https://doi.org/10.3390/s21227436
Academic Editors: Yangquan Chen, Nunzio Cennamo, M. Jamal Deen, Simone Morais, Subhas Mukhopadhyay and Junseop Lee
Received: 26 September 2021; Accepted: 2 November 2021; Published: 9 November 2021
Department of Computational Science, Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE),
Puebla 72840, Mexico; oyukirojas@inaoep.mx
* Correspondence: carranza@inaoep.mx
Abstract:
Recent advances have shown for the first time that it is possible to beat a human with
an autonomous drone in a drone race. However, this solution relies heavily on external sensors,
specifically on the use of a motion capture system. Thus, a truly autonomous solution demands
performing computationally intensive tasks such as gate detection, drone localisation, and state
estimation. To this end, other solutions rely on specialised hardware such as graphics processing
units (GPUs), whose onboard versions are not as powerful as those available for desktop
and server computers. An alternative is to combine specialised hardware with smart sensors capable
of processing specific tasks on the chip, alleviating the need for the onboard processor to perform
these computations. Motivated by this, we present the initial results of adapting a novel smart
camera, known as the OpenCV AI Kit (OAK-D), as part of a solution for autonomous drone racing (ADR) that runs entirely
on board. This smart camera performs neural inference on chip without using a GPU. It can
also perform depth estimation with a stereo rig and run neural network models using images from
a 4K colour camera as the input. Additionally, seeking to limit the payload to 200 g, we present a
new 3D-printed design of the camera’s back case, reducing the original weight by 40% and thus enabling
the drone to carry it in tandem with a host onboard computer, the Intel Compute Stick, on which we
run a controller based on gate detection. The latter is performed with a neural model running on the
OAK-D at an operating frequency of 40 Hz, enabling the drone to fly at a speed of 2 m/s. We deem
these initial results promising toward the development of a truly autonomous solution that will run
intensive computational tasks fully on board.
Keywords: autonomous drone racing; OAK-D; CNN; deep learning; smart camera
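To make the on-board setup concrete, the following is a minimal sketch, in Python with the DepthAI library, of the kind of OAK-D pipeline described in the abstract: the 4K colour camera feeds a neural network that runs on the camera's own chip, while the stereo pair produces a depth stream for the host. The model blob path, the 416x416 preview size, and the queue settings are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal DepthAI sketch of an on-device OAK-D pipeline: on-chip neural
# inference from the colour camera plus stereo depth, streamed to the host.
# The blob path and input size are placeholders, not the authors' model.
import depthai as dai

pipeline = dai.Pipeline()

# Colour camera: full 4K sensor, with a downscaled preview sized to the network input.
cam = pipeline.create(dai.node.ColorCamera)
cam.setResolution(dai.ColorCameraProperties.SensorResolution.THE_4_K)
cam.setPreviewSize(416, 416)
cam.setInterleaved(False)

# Neural network node: inference runs on the OAK-D itself, not on a GPU.
nn = pipeline.create(dai.node.NeuralNetwork)
nn.setBlobPath("gate_detector.blob")  # hypothetical compiled model
cam.preview.link(nn.input)

# Stereo pair for on-device depth estimation.
left = pipeline.create(dai.node.MonoCamera)
right = pipeline.create(dai.node.MonoCamera)
left.setBoardSocket(dai.CameraBoardSocket.LEFT)
right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
stereo = pipeline.create(dai.node.StereoDepth)
left.out.link(stereo.left)
right.out.link(stereo.right)

# Stream network output and depth back to the host computer.
nn_out = pipeline.create(dai.node.XLinkOut)
nn_out.setStreamName("nn")
nn.out.link(nn_out.input)
depth_out = pipeline.create(dai.node.XLinkOut)
depth_out.setStreamName("depth")
stereo.depth.link(depth_out.input)

with dai.Device(pipeline) as device:
    nn_q = device.getOutputQueue("nn", maxSize=4, blocking=False)
    depth_q = device.getOutputQueue("depth", maxSize=4, blocking=False)
    while True:
        detections = nn_q.get()      # raw network output computed on the camera
        depth_frame = depth_q.get()  # depth map from the stereo rig
        # ...feed both into the host-side gate-following controller...
```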
1. Introduction
Since its inception at IROS 2016, the Autonomous Drone Racing (ADR) competition
has posed the challenge of developing an autonomous drone capable of beating a human
in a drone race. The first editions of this competition gathered research groups whose
first solutions broke the problem down into three main sub-problems: (1) gate
detection; (2) drone localisation on the race track for control and navigation; (3) suitable
hardware for onboard processing. As reported in [1,2], gate detection was first attempted
with colour-based segmentation algorithms, alongside some initial efforts using convolutional
neural networks for robust gate detection [3].
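As an illustration of the colour-based segmentation approach mentioned above (not the specific algorithm of any cited team), a minimal OpenCV sketch might threshold the image in HSV space and keep the largest gate-coloured blob; the HSV range below is an assumed placeholder for a brightly coloured (e.g., orange) gate and would need tuning for a real track.

```python
# Minimal sketch of colour-based gate segmentation, in the spirit of early
# ADR solutions. HSV thresholds are illustrative placeholders.
import cv2
import numpy as np

def detect_gate(frame_bgr):
    """Return the bounding box (x, y, w, h) of the largest gate-coloured blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([5, 120, 120]), np.array([20, 255, 255]))
    # Remove speckle noise before looking for connected regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)
```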
For drone localisation, visual simultaneous localisation and mapping (SLAM) and
visual odometry techniques were employed, seeking to provide global localisation on
the race track, which was exploited by teams to implement a waypoint-based navigation
system [2]. Further improvements proposed adding local pose estimation with respect to
the gate in a top-down navigation scheme where the controller drives the drone to follow a
global trajectory, which is refined once the drone flies toward the next gate [4,5].
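As a rough illustration of such local refinement (a sketch under assumed conventions, not the method of [4,5]), the detected gate centre can be back-projected using the depth value and pinhole intrinsics, and the resulting local observation blended with the globally planned waypoint:

```python
# Illustrative sketch of refining a global waypoint with a local gate
# observation. fx, fy, cx, cy are assumed pinhole intrinsics in pixels;
# depth_m is the metric depth at the detected gate centre; global_wp is
# assumed to be expressed in the same camera frame as the observation.
import numpy as np

def refine_waypoint(global_wp, gate_centre_px, depth_m, fx, fy, cx, cy, alpha=0.7):
    u, v = gate_centre_px
    # Back-project the pixel to a 3D point in the camera frame (pinhole model).
    local = np.array([(u - cx) * depth_m / fx,
                      (v - cy) * depth_m / fy,
                      depth_m])
    # Blend global plan and local observation; alpha -> 1 trusts the gate more.
    return alpha * local + (1.0 - alpha) * np.asarray(global_wp)
```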
The Game of Drones competition at NeurIPS [6] called upon researchers to set aside hardware and
runtime-efficiency concerns and focus on high-level navigation strategies, while seeking to push for