Academic Editors: Shuai Li, Dechao Chen, Mohammed Aquil Mirza, Vasilios N. Katsikis, Dunhui Xiao and Predrag Stanimirović
Received: 5 May 2022
Accepted: 6 June 2022
Published: 8 June 2022
Article
Robust Image Matching Based on Image Feature and Depth
Information Fusion
Zhiqiang Yan, Hongyuan Wang *, Qianhao Ning and Yinxi Lu
Space Optical Engineering Research Center, Harbin Institute of Technology, Harbin 150001, China;
18b921006@stu.hit.edu.cn (Z.Y.); 19b921012@stu.hit.edu.cn (Q.N.); 20s121066@stu.hit.edu.cn (Y.L.)
* Correspondence: fountainhy@hit.edu.cn
Abstract:
In this paper, we propose a robust image feature extraction and fusion method to effectively
fuse image feature and depth information and improve the registration accuracy of RGB-D images.
The proposed method directly splices the image feature point descriptors with the corresponding
point cloud feature descriptors to obtain the fusion descriptor of the feature points. The fusion feature
descriptor is constructed based on the SIFT, SURF, and ORB feature descriptors and the PFH and
FPFH point cloud feature descriptors. Furthermore, the registration performance based on the fusion
features is evaluated on the YCB and KITTI RGB-D datasets. ORBPFH reduces the false-matching
rate by 4.66–16.66%, and ORBFPFH reduces it by 9–20%. The experimental results show that the
robust RGB-D feature extraction and fusion method proposed in this paper is well suited to fusing
ORB with PFH and FPFH, improving feature representation and registration, and offering a novel
approach to RGB-D image matching.
Keywords: feature fusion; feature extraction; feature descriptor; RGB-D
1. Introduction
Since the advent of the Microsoft Kinect camera, various new RGB-D cameras have
been launched. RGB-D cameras can simultaneously provide color images and dense depth
images. Owing to this data acquisition advantage, RGB-D cameras are widely used in
robotics and computer vision. The extraction and matching of image features are the
basis for realizing these applications. Significant progress has been made in the feature
extraction, representation, and matching of images and depth maps (or point clouds).
However, there is room for further improvement of these processes. For example, the depth
image includes information not contained in the original color image. Further research is
required to effectively and comprehensively utilize the color image information and depth
information to improve feature-matching accuracy. To this end, a robust RGB-D image
feature extraction and fusion method is proposed in this paper. The main idea of the
proposed method is to directly splice the image feature
point descriptor and the corresponding point cloud feature descriptor to obtain the fusion
descriptor of feature points to be used as the basis of feature matching. The methodology
framework comprises image feature extraction and representation, point cloud feature
extraction and representation, and feature fusion, as shown in Figure 1.
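To make the splicing step concrete, the following minimal sketch fuses ORB image descriptors with point cloud descriptors by simple concatenation and matches the fused vectors. It is not the paper's implementation: the library choices, file names, per-part normalization, and the L2 brute-force matcher are assumptions, and random arrays stand in for the PFH/FPFH descriptors that would be computed at the 3D points corresponding to each keypoint.

```python
import cv2
import numpy as np

def fuse_descriptors(img_desc, cloud_desc):
    """Splice an image descriptor with the point cloud descriptor of the
    corresponding 3D point via concatenation. The per-part L2 normalization
    is an assumption, not a step prescribed by the paper."""
    img_desc = img_desc.astype(np.float32)
    cloud_desc = cloud_desc.astype(np.float32)
    img_desc /= np.linalg.norm(img_desc, axis=1, keepdims=True) + 1e-8
    cloud_desc /= np.linalg.norm(cloud_desc, axis=1, keepdims=True) + 1e-8
    return np.hstack([img_desc, cloud_desc])

# --- Image features (ORB, via OpenCV); file names are hypothetical ---
img1 = cv2.imread("rgb1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("rgb2.png", cv2.IMREAD_GRAYSCALE)
orb = cv2.ORB_create(nfeatures=500)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# --- Point cloud features (placeholders) ---
# In practice these would be PFH/FPFH descriptors (e.g., 33-D FPFH) computed
# at the 3D points back-projected from kp1/kp2 using the depth image; random
# arrays stand in here so the sketch runs on its own.
fpfh1 = np.random.rand(len(kp1), 33).astype(np.float32)
fpfh2 = np.random.rand(len(kp2), 33).astype(np.float32)

# --- Fusion and matching ---
fused1 = fuse_descriptors(des1, fpfh1)
fused2 = fuse_descriptors(des2, fpfh2)
matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)  # float descriptors -> L2
matches = sorted(matcher.match(fused1, fused2), key=lambda m: m.distance)
print(f"{len(matches)} cross-checked matches on fused descriptors")
```

Converting the binary ORB descriptors to float lets both parts be compared under a single L2 metric; the distance measure and any weighting between the image and depth parts used in the paper's experiments may differ.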
The main contributions of this paper are as follows:
1. A feature point description method that fuses image feature and depth information is proposed, which has the potential to improve the accuracy of feature matching.
2. The feature-matching performance of different fusion features constructed based on the proposed method is verified on public RGB-D datasets.