Wearable Drone Controller: Machine Learning-Based Hand
Gesture Recognition and Vibrotactile Feedback
Ji-Won Lee 1 and Kee-Ho Yu 2,3,*

1 KEPCO Research Institute, Daejeon 34056, Republic of Korea
2 Department of Aerospace Engineering, Jeonbuk National University, Jeonju 54896, Republic of Korea
3 Future Air Mobility Research Center, Jeonbuk National University, Jeonju 54896, Republic of Korea
* Correspondence: yu@jbnu.ac.kr
Abstract: We propose a wearable drone controller with hand gesture recognition and vibrotactile feedback. The intended hand motions of the user are sensed by an inertial measurement unit (IMU) placed on the back of the hand, and the signals are analyzed and classified using machine learning models. The recognized hand gestures control the drone, and obstacle information in the heading direction of the drone is fed back to the user by activating a vibration motor attached to the wrist. Simulation experiments for drone operation were performed, and the participants' subjective evaluations of the controller's convenience and effectiveness were investigated. Finally, experiments with a real drone were conducted and discussed to validate the proposed controller.
Keywords: human–drone interface; wearable device; hand gesture recognition; machine learning; vibrotactile feedback
1. Introduction
Multicopter drones are now widely used because of their simple mechanism, ease of control, and hovering capability [1]. Drones play important roles in surveillance and reconnaissance, aerial photography and measurement, search and rescue missions, communication relay, and environmental monitoring [2,3]. Completing such applications and missions requires highly sophisticated and dexterous drone control. Because of limited autonomy, autonomous control has been applied only partially, as in waypoint following and pre-programmed flights and missions [4–6]. However, during autonomous flight, the autopilot is sometimes switched to manual control by a human operator, depending on the flight phase (e.g., landing) or in unexpected circumstances. A human role in the control loop remains necessary as long as the system cannot operate fully autonomously.
Therefore, natural user interfaces for human operators have been studied extensively. An early study reported a novel user interface for manual control based on gestures, haptics, and a PDA [7]. Multimodal natural user interfaces for human–drone interaction, employing speech, gestures, and vision, were subsequently introduced [8,9]. More recently, hand gesture-based interfaces and interactions using machine learning models have been proposed [10–15]. Hand gestures are a natural way to express human intent, and their effectiveness in human–machine and human–computer interface/interaction applications has been reported in previous works. Some of these applications focused on controlling drones with deep learning models [16–19], using vision, optical sensors with infrared light, or an inertial measurement unit (IMU) to capture hand motion. An IMU attached to the user's hand senses hand motion more robustly than conventional vision systems, which are easily affected by lighting conditions and require tedious calibration.
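As a concrete illustration of this class of IMU-based gesture recognition (a minimal sketch, not the pipeline proposed in this paper), windowed accelerometer/gyroscope signals can be reduced to summary statistics and fed to an off-the-shelf classifier. The window length, feature choice, gesture label set, and the SVM model below are illustrative assumptions only; the placeholder random data stands in for recorded IMU windows.

```python
# Minimal sketch of IMU-window gesture classification (assumptions, not the authors' method).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

GESTURES = ["hover", "forward", "backward", "left", "right"]  # hypothetical label set

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize a (samples x 6) accel/gyro window with per-axis mean and std."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Placeholder data: 200 windows of 50 samples x 6 IMU axes with random labels.
rng = np.random.default_rng(0)
X_raw = [rng.standard_normal((50, 6)) for _ in range(200)]
y = rng.integers(0, len(GESTURES), size=200)

# Build the feature matrix and evaluate a generic RBF-kernel SVM classifier.
X = np.stack([extract_features(w) for w in X_raw])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

With real labeled IMU recordings in place of the random arrays, the predicted label for each incoming window would be mapped to a drone command; the statistical features and classifier here are only one of many plausible design choices.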
Conversely, tactile stimulation has long been adopted as a means of feedback from machines/computers to humans [20–22]. Tactile stimulation offers an additional channel, complementary to vision, for conveying necessary information to the user. In