Citation: Yan, Z.; Wang, H.; Ning, Q.; Lu, Y. Robust Image Matching Based on Image Feature and Depth Information Fusion. Machines 2022, 10, 456. https://doi.org/10.3390/machines10060456

Academic Editors: Shuai Li, Dechao Chen, Mohammed Aquil Mirza, Vasilios N. Katsikis, Dunhui Xiao and Predrag Stanimirović

Received: 5 May 2022
Accepted: 6 June 2022
Published: 8 June 2022
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Robust Image Matching Based on Image Feature and Depth
Information Fusion
Zhiqiang Yan, Hongyuan Wang *, Qianhao Ning and Yinxi Lu
Space Optical Engineering Research Center, Harbin Institute of Technology, Harbin 150001, China;
18b921006@stu.hit.edu.cn (Z.Y.); 19b921012@stu.hit.edu.cn (Q.N.); 20s121066@stu.hit.edu.cn (Y.L.)
* Correspondence: fountainhy@hit.edu.cn
Abstract:
In this paper, we propose a robust image feature extraction and fusion method to effectively
fuse image feature and depth information and improve the registration accuracy of RGB-D images.
The proposed method directly splices the image feature point descriptors with the corresponding
point cloud feature descriptors to obtain the fusion descriptor of the feature points. The fusion feature
descriptor is constructed based on the SIFT, SURF, and ORB feature descriptors and the PFH and
FPFH point cloud feature descriptors. Furthermore, the registration performance based on fusion
features is tested through the RGB-D datasets of YCB and KITTI. ORBPFH reduces the false-matching
rate by 4.66–16.66%, and ORBFPFH reduces the false-matching rate by 9–20%. The experimental
results show that the RGB-D robust feature extraction and fusion method proposed in this paper is
suitable for the fusion of ORB with PFH and FPFH, which can improve feature representation and
registration, representing a novel approach for RGB-D image matching.
Keywords: feature fusion; feature extraction; feature descriptor; RGB-D
1. Introduction
Since the advent of the Microsoft Kinect camera, various new RGB-D cameras have
been launched. RGB-D cameras can simultaneously provide color images and dense depth
images. Owing to this data acquisition advantage, RGB-D cameras are widely used in
robotics and computer vision. The extraction and matching of image features are the
basis for realizing these applications. Significant progress has been made in the feature
extraction, representation, and matching of images and depth maps (or point clouds).
However, there is room for further improvement of these processes. For example, the depth
image includes information not contained in the original color image. Further research is
required to effectively and comprehensively utilize the color image information and depth
information to improve feature-matching accuracy. Therefore, to effectively fuse image
and depth information and improve feature-matching accuracy, a robust RGB-D image
feature extraction and fusion method based on image and depth feature fusion is proposed
in this paper. The main idea of the proposed method is to directly splice the image feature
point descriptor and the corresponding point cloud feature descriptor to obtain the fusion
descriptor of feature points to be used as the basis of feature matching. The methodology
framework comprises image feature extraction and representation, point cloud feature
extraction and representation, and feature fusion, as shown in Figure 1.
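The descriptor-splicing step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and the per-modality min–max normalization are assumptions added here, since this page only specifies that an image descriptor (e.g., a 32-element ORB descriptor) is directly concatenated with the corresponding point cloud descriptor (e.g., a 33-bin FPFH histogram).

```python
def normalize(vec):
    """Scale a descriptor to [0, 1] so that neither modality dominates
    the distance computation (an illustrative choice; the paper only
    specifies direct concatenation)."""
    lo, hi = min(vec), max(vec)
    if hi == lo:
        return [0.0] * len(vec)
    return [(v - lo) / (hi - lo) for v in vec]

def fuse_descriptors(image_desc, cloud_desc):
    """Splice an image feature descriptor with the point cloud descriptor
    of the same feature point into a single fusion descriptor, as in the
    ORBPFH / ORBFPFH combinations tested in the paper."""
    return normalize(image_desc) + normalize(cloud_desc)

# Hypothetical toy example: a 4-d "image" descriptor spliced with a
# 3-d "cloud" descriptor yields a 7-d fusion descriptor.
fused = fuse_descriptors([10, 20, 30, 40], [0.1, 0.2, 0.3])
print(len(fused))  # 7
```

In practice the image descriptors would come from a detector such as ORB and the cloud descriptors from PFH/FPFH estimation on the depth-derived point cloud; the splice itself is the same concatenation.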
The main contributions of this paper are as follows:
1. A feature point description method that fuses image feature and depth information is proposed, which has the potential to improve the accuracy of feature matching.
2. The feature-matching performance of different fusion features constructed based on the proposed method is verified on public RGB-D datasets.
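Once fusion descriptors are built, matching proceeds as with any descriptor. The sketch below uses nearest-neighbour matching with Lowe's ratio test, a common criterion for rejecting ambiguous matches; the paper's exact matching rule and distance metric are not given on this page, so treat this as an assumed setup.

```python
def l2(a, b):
    """Euclidean distance between two descriptors of equal length."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_features(descs_a, descs_b, ratio=0.8):
    """Nearest-neighbour matching with Lowe's ratio test: accept a match
    only if the best candidate is clearly closer than the second best.
    Returns (index_in_a, index_in_b) pairs that pass the test."""
    matches = []
    for i, da in enumerate(descs_a):
        dists = sorted((l2(da, db), j) for j, db in enumerate(descs_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches

# Hypothetical toy descriptors: each point in `a` has one close match in `b`.
a = [[0.0, 0.0], [1.0, 1.0]]
b = [[0.05, 0.0], [1.0, 0.9], [5.0, 5.0]]
print(match_features(a, b))  # [(0, 0), (1, 1)]
```

The false-matching rate reported in the experiments can then be estimated as the fraction of accepted matches whose correspondence is wrong under the dataset's ground truth.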