Article
A Decentralized Sensor Fusion Scheme for Multi Sensorial
Fault Resilient Pose Estimation
Moumita Mukherjee *, Avijit Banerjee, Andreas Papadimitriou, Sina Sharif Mansouri
and George Nikolakopoulos
Robotics and AI Group, Department of Computer, Electrical and Space Engineering, Luleå University of
Technology, SE-97187 Luleå, Sweden; avijit.banerjee@ltu.se (A.B.); andreas.papadimitriou@ltu.se (A.P.);
sina.sharif.mansouri@ltu.se (S.S.M.); george.nikolakopoulos@ltu.se (G.N.)
* Correspondence: moumita.mukherjee@ltu.se
Abstract:
This article proposes a novel decentralized, two-layered, multi-sensorial fusion architecture for resilient pose estimation. The first layer of the fusion architecture consists of a set of distributed nodes, in which all possible combinations of pose information from the different sensors are integrated through multiple extended Kalman filters, yielding a set of candidate pose estimates. Based on the estimates obtained from the first layer, a Fault Resilient Optimal Information Fusion (FR-OIF) paradigm is introduced in the second layer to provide a trusted pose estimate. The second layer combines the output of each first-layer node in a weighted linear combination, while explicitly accounting for the maximum likelihood fusion criterion. Moreover, in the case of inaccurate measurements, the proposed FR-OIF formulation provides self-resiliency through a built-in fault isolation mechanism, and it is thus able to maintain accurate localization in the presence of sensor failures or erroneous measurements. To demonstrate the effectiveness of the proposed fusion architecture, extensive experimental studies were conducted with a micro aerial vehicle equipped with various onboard pose sensors, such as a 3D lidar, a RealSense camera, an ultra-wideband node, and an IMU. The efficiency of the proposed framework is extensively evaluated through multiple experimental results, while its superiority is also demonstrated through a comparison with a classical centralized multi-sensorial fusion approach.
Keywords:
multi-sensor fusion; decentralized fusion; linear minimum variance; maximum likelihood function; optimal information filter; fault resilient optimal information fusion
1. Introduction
State estimation is a challenging problem that has been explored extensively in recent years across different scientific and technological communities, such as robotics [1], aerospace [2], automatic control [3], artificial intelligence [4], and computer vision [5]. In this framework, one of the most interesting problems is the estimation of the pose of a robot, especially when multiple sensors are utilized for pose determination together with related sensor fusion schemes, in order to increase the overall accuracy of the estimation while, at the same time, introducing the proper resiliency.
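As a minimal illustration of this principle, the following Python sketch fuses two pose measurements of the same quantity through inverse-covariance (linear minimum variance) weighting. The function name, sensor values, and covariances are illustrative assumptions only; this is not the FR-OIF scheme proposed in this article.

```python
import numpy as np

def fuse_estimates(poses, covariances):
    """Fuse several estimates of the same state by inverse-covariance
    (linear minimum variance) weighting. Illustrative sketch only.

    poses       : list of (n,) state vectors (e.g., [x, y, z]) from different sensors
    covariances : list of (n, n) covariance matrices, one per estimate
    """
    infos = [np.linalg.inv(P) for P in covariances]   # information matrices
    fused_info = sum(infos)                           # combined information
    fused_cov = np.linalg.inv(fused_info)             # fused covariance
    fused_pose = fused_cov @ sum(I @ x for I, x in zip(infos, poses))
    return fused_pose, fused_cov

# Hypothetical lidar-based and camera-based position estimates of the same point
lidar_pose = np.array([1.02, 0.98, 0.50])
camera_pose = np.array([0.95, 1.05, 0.47])
lidar_cov = np.diag([0.01, 0.01, 0.04])
camera_cov = np.diag([0.04, 0.04, 0.01])

pose, cov = fuse_estimates([lidar_pose, camera_pose], [lidar_cov, camera_cov])
print(pose, np.diag(cov))
```

The fused estimate leans toward whichever sensor reports the smaller variance along each axis, and the fused covariance is smaller than either individual one, which is the basic mechanism by which multi-sensor fusion reduces measurement noise.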
To this end, research on sensor fusion has lately been evolving rapidly, since multi-sensor fusion is able to integrate or combine data streams from different sources, increasing the quality of the measurements while decreasing the corrupting measurement noise. Thus, in multi-sensorial fusion architectures, as for example in the case of pose estimation, it is very common to have multiple sensors providing, e.g., full pose estimates or partial pose estimates (translation or orientation), together with an overall sensor fusion scheme that is commonly realized in a centralized [6] or distributed [6,7] manner. In