Article
Acoustic SLAM Based on the Direction-of-Arrival and the
Direct-to-Reverberant Energy Ratio
Wenhao Qiu, Gang Wang * and Wenjing Zhang
State Key Laboratory of Advanced Design and Manufacturing for Vehicle Body, Hunan University,
Changsha 410082, China
* Correspondence: wangg@hnu.edu.cn
Abstract: This paper proposes a new method that fuses acoustic measurements in the reverberation
field and low-accuracy inertial measurement unit (IMU) motion reports for simultaneous localization
and mapping (SLAM). Different from existing studies that only use acoustic data for direction-of-
arrival (DoA) estimates, the source’s distance from sensors is calculated with the direct-to-reverberant
energy ratio (DRR) and applied to eliminate the nonlinear noise from motion reports. A particle filter
is applied to estimate the critical distance, which is key to associating the source’s distance with the
DRR. A keyframe method is used to eliminate the bias of the source position estimates toward
the robot. The proposed DoA-DRR acoustic SLAM (D-D SLAM) is designed for three-dimensional
motion and is suitable for drones. The method is the first acoustic SLAM algorithm that has been
validated on a real-world drone dataset that contains only acoustic data and IMU measurements.
Compared with previous methods, D-D SLAM has acceptable performance in locating the drone
and building a source map from a real-world drone dataset. The average location accuracy is 0.48 m,
while the source position error converges to less than 0.25 m within 2.8 s. These results prove the
effectiveness of D-D SLAM in real-world scenes.
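The pivotal quantity behind the DRR-to-distance mapping is the critical distance r_c, the range at which direct and reverberant energies are equal. The paper's estimator is developed in later sections; as a minimal sketch, assuming the textbook room-acoustics model DRR ≈ (r_c/r)^2 (so DRR in dB equals 20 log10(r_c/r)), the source distance follows directly from a DRR measurement:

```python
def distance_from_drr(drr_db: float, critical_distance_m: float) -> float:
    """Estimate source-to-array distance from a DRR measurement.

    Assumes the textbook model: direct energy decays as 1/r^2 while
    reverberant energy is roughly uniform, so DRR_dB = 20*log10(r_c / r).
    critical_distance_m (r_c) is the quantity the paper estimates with a
    particle filter; this closed form is an illustrative sketch only.
    """
    return critical_distance_m * 10.0 ** (-drr_db / 20.0)

# Example: with r_c = 1.2 m, a measured DRR of -6 dB implies r ≈ 2.4 m.
print(distance_from_drr(-6.0, 1.2))
```

In practice the DRR itself must be estimated from reverberant recordings and r_c depends on the room, which is why the paper tracks the critical distance with a particle filter rather than assuming it known.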
Keywords: simultaneous localization and mapping; robot audition; direct-to-reverberant energy ratio; mobile robots
1. Introduction
Recently, there has been renewed interest in simultaneous localization and mapping
(SLAM). Many notable SLAM systems have been built on optical and visual sensors, such as VINS [1]. In contrast to visual SLAM, other work has focused on acoustic SLAM, which relies on acoustic sensors. Most work on acoustic SLAM has been conducted in underwater environments [2–4], while indoor acoustic SLAM [5–7], by contrast, has received scant attention. Conventional SLAM techniques based on optical and visual sensors are unsuitable for some special indoor environments, for example, foggy rooms that light and lasers cannot penetrate. Conversely, indoor acoustic SLAM can use continuous environmental sound sources as landmarks to assist robot mapping in such foggy environments. Acoustic SLAM is therefore preferable in indoor environments where light and lasers cannot penetrate and continuous sound sources exist.
Based on the sensor type used, indoor acoustic SLAM can be classified as active or passive. Active indoor acoustic SLAM is usually based on active sonar, in which a sonar beam measures the positions of landmarks; a motion sensor is also required to generate motion reports that assist localization. Passive indoor acoustic SLAM is usually based on microphone arrays for direction-of-arrival (DoA) estimates and motion sensors (such as an odometer) for motion reports; a generic form of this measurement model is sketched below.
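As an illustration only (the function name and frame conventions here are assumptions, not the paper's notation), a bearing-only DoA measurement model for such a passive setup can be sketched as follows. It makes explicit why DoA alone is insufficient: each observation constrains only the direction to the source, not its range.

```python
import numpy as np

def expected_doa(robot_pos, robot_yaw, source_pos):
    """Predicted azimuth/elevation (rad) of a source in the robot frame.

    Illustrative bearing-only model: a DoA observation constrains only
    the direction to the source; range must come from elsewhere (here,
    the DRR), which is the gap D-D SLAM fills.
    """
    d = np.asarray(source_pos, float) - np.asarray(robot_pos, float)
    # Wrap azimuth into [-pi, pi) after removing the robot's heading.
    azimuth = (np.arctan2(d[1], d[0]) - robot_yaw + np.pi) % (2 * np.pi) - np.pi
    elevation = np.arctan2(d[2], np.hypot(d[0], d[1]))
    return azimuth, elevation
```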
In 2009, Hu et al. [5] proposed an acoustic SLAM method based on a cross-shaped microphone array and odometry, and in 2013, Kallakuri et al. [6] developed a method based