Citation: Qiu, W.; Wang, G.; Zhang, W. Acoustic SLAM Based on the Direction-of-Arrival and the Direct-to-Reverberant Energy Ratio. Drones 2023, 7, 120. https://doi.org/10.3390/drones7020120

Academic Editors: Andrzej Łukaszewicz, Wojciech Giernacki, Zbigniew Kulesza, Jaroslaw Pytka and Andriy Holovatyy

Received: 10 January 2023
Revised: 6 February 2023
Accepted: 8 February 2023
Published: 9 February 2023

Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Acoustic SLAM Based on the Direction-of-Arrival and the
Direct-to-Reverberant Energy Ratio
Wenhao Qiu, Gang Wang * and Wenjing Zhang
State Key Laboratory of Advanced Design and Manufacturing for Vehicle Body, Hunan University,
Changsha 410082, China
* Correspondence: wangg@hnu.edu.cn
Abstract: This paper proposes a new method that fuses acoustic measurements in the reverberation field with low-accuracy inertial measurement unit (IMU) motion reports for simultaneous localization and mapping (SLAM). Unlike existing studies that use acoustic data only for direction-of-arrival (DoA) estimates, the proposed method also computes the source's distance from the sensors using the direct-to-reverberant energy ratio (DRR) and applies it to eliminate the nonlinear noise in the motion reports. A particle filter is applied to estimate the critical distance, which is key to associating the source's distance with the DRR. A keyframe method is used to eliminate the deviation of the source position estimate toward the robot. The proposed DoA-DRR acoustic SLAM (D-D SLAM) is designed for three-dimensional motion and is suitable for drones. It is the first acoustic SLAM algorithm that has been validated on a real-world drone dataset containing only acoustic data and IMU measurements. Compared with previous methods, D-D SLAM achieves acceptable performance in locating the drone and building a source map from a real-world drone dataset: the average localization accuracy is 0.48 m, while the source position error converges to less than 0.25 m within 2.8 s. These results demonstrate the effectiveness of D-D SLAM in real-world scenes.
Keywords: simultaneous localization and mapping; robot audition; direct-to-reverberant energy ratio; mobile robots
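As a rough illustration of how the DRR can be tied to source distance through the critical distance (the paper estimates the critical distance with a particle filter; its exact observation model is not reproduced here), the following Python sketch uses the standard diffuse-field assumption that direct energy decays as 1/d² while reverberant energy is approximately constant, so DRR ≈ (d_c/d)². The function names and the numeric critical distance are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): invert the diffuse-field
# relation DRR = (d_c / d)^2 to obtain a source-distance estimate from a DRR
# measurement, once the room's critical distance d_c is known. In the paper,
# d_c is estimated online with a particle filter; here it is simply assumed.

def drr_db_to_linear(drr_db: float) -> float:
    """Convert a DRR given in dB to a linear energy ratio."""
    return 10.0 ** (drr_db / 10.0)

def distance_from_drr(drr_db: float, critical_distance_m: float) -> float:
    """Estimate the source distance (m) from a DRR measurement (dB),
    assuming the diffuse-field model DRR = (d_c / d)^2."""
    drr_lin = drr_db_to_linear(drr_db)
    return critical_distance_m / np.sqrt(drr_lin)

if __name__ == "__main__":
    d_c = 1.5  # assumed (hypothetical) critical distance in metres
    for drr_db in (6.0, 0.0, -6.0):
        d = distance_from_drr(drr_db, d_c)
        print(f"DRR = {drr_db:+.1f} dB  ->  estimated distance = {d:.2f} m")
```

Under this model, a DRR of 0 dB corresponds exactly to the critical distance, and every 6 dB decrease in DRR roughly doubles the estimated distance; this is only a sketch of the underlying acoustics, not the paper's full sensor model.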
1. Introduction
Recently, there has been renewed interest in simultaneous localization and mapping (SLAM). Many meaningful and excellent works on SLAM have been based on optical and visual sensors, such as VINS [1]. Different from visual SLAM, some works have focused on acoustic SLAM, in which acoustic sensors are involved. Most work on acoustic SLAM has been conducted in underwater environments [2–4], while indoor acoustic SLAM [5–7], by contrast, has received scant attention. Conventional SLAM techniques based on optical and visual sensors are unsuitable for some special indoor environments, for example, foggy rooms where light and lasers cannot penetrate. Conversely, the acoustic sensors used in indoor acoustic SLAM can exploit continuous environmental sound sources as landmarks to assist robot mapping in such foggy indoor environments. Acoustic SLAM is therefore preferable in indoor environments where light and lasers cannot penetrate and continuous environmental sources exist.
Based on the sensor type used, indoor acoustic SLAM can be classified as active or passive. Active indoor acoustic SLAM is usually based on active sonar, in which a sonar beam is used in an active sonar sensor model to measure the positions of landmarks; a motion sensor is also required to generate motion reports that assist localization. Passive indoor acoustic SLAM is usually based on microphone arrays for direction-of-arrival (DoA) estimates and on motion sensors (such as an odometer) for motion reports. In 2009, Hu et al. [5] proposed an acoustic SLAM method based on a cross-shaped microphone array and odometry, and in 2013, Kallakuri et al. [6] developed a method based