Citation: Luo, J.; Zhu, L.; Wu, N.; Chen, M.; Liu, D.; Zhang, Z.; Liu, J. Adaptive Neural-PID Visual Servoing Tracking Control via Extreme Learning Machine. Machines 2022, 10, 782. https://doi.org/10.3390/machines10090782

Academic Editors: Shuai Li, Dechao Chen, Mohammed Aquil Mirza, Vasilios N. Katsikis, Dunhui Xiao and Predrag Stanimirović

Received: 31 July 2022
Accepted: 5 September 2022
Published: 7 September 2022
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
Adaptive Neural-PID Visual Servoing Tracking Control via
Extreme Learning Machine
Junqi Luo 1,2, Liucun Zhu 1,3,*, Ning Wu 2,*, Mingyou Chen 3, Daopeng Liu 4, Zhenyu Zhang 3 and Jiyuan Liu 3

1 College of Mechanical Engineering, Guangxi University, Nanning 530004, China
2 Key Laboratory of Beibu Gulf Offshore Engineering Equipment and Technology, Beibu Gulf University, Qinzhou 535000, China
3 Advanced Science and Technology Research Institute, Beibu Gulf University, Qinzhou 535000, China
4 School of Mechanical Engineering, Jiangsu University, Zhenjiang 212000, China
* Correspondence: lczhu@bbgu.edu.cn (L.Z.); n.wu@bbgu.edu.cn (N.W.)
Abstract: Vision-guided robots are widely used in modern industry, but accurately tracking moving objects in real time remains a challenge. In this paper, a hybrid adaptive control scheme combining an Extreme Learning Machine (ELM) with proportional–integral–derivative (PID) control is proposed for dynamic visual tracking with a manipulator. The scheme extracts line features on the image plane from a laser-camera system and determines an optimal control input to guide the robot so that the image features are aligned with their desired positions. The observation and state-space equations are first derived by analyzing the motion features of the camera and the object. The system is then represented as an autoregressive moving average with extra input (ARMAX) model, and a valid estimation model is established. The adaptive predictor estimates online the relevant 3D parameters between the camera and the object, which are subsequently used to calculate the system sensitivity of the neural network. An ELM–PID controller is designed to adaptively adjust the control parameters, and the scheme was validated on a physical robot platform. The experimental results show that the proposed method delivers better visual-tracking performance than pure P and PID controllers.
Keywords: adaptive visual tracking; visual servoing; laser-camera system; ELM–PID control
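To make the controller structure described in the abstract more concrete, the following is a minimal sketch of one common way to couple an ELM with an incremental PID law. It is illustrative only, not the authors' implementation: the network size, tanh/sigmoid activations, learning rate, and the gradient-based update of the ELM output weights are assumptions, and the supplied plant sensitivity dy_du stands in for the system sensitivity that the paper computes from the estimated 3D camera-object parameters.

```python
import numpy as np

class ELMPID:
    """Sketch of an ELM-tuned incremental PID controller (illustrative assumptions).

    A single-hidden-layer ELM with random, fixed input weights maps the error
    state [e(k), e(k)-e(k-1), e(k)-2e(k-1)+e(k-2)] to the three PID gains.
    Only the output weights are adapted online, by gradient descent driven by
    the tracking error and an externally supplied plant sensitivity dy/du.
    """

    def __init__(self, n_hidden=20, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.uniform(-1.0, 1.0, size=(n_hidden, 3))   # fixed random input weights
        self.b = rng.uniform(-1.0, 1.0, size=n_hidden)         # fixed random biases
        self.beta = rng.uniform(0.0, 0.1, size=(3, n_hidden))  # trainable output weights
        self.lr = lr
        self.e1, self.e2 = 0.0, 0.0   # e(k-1), e(k-2)
        self.u = 0.0                  # accumulated control input

    def step(self, e, dy_du):
        """One control update: returns u(k) given the error e(k) and sensitivity dy/du."""
        x = np.array([e, e - self.e1, e - 2.0 * self.e1 + self.e2])
        h = np.tanh(self.W @ x + self.b)
        # Sigmoid output keeps the gains positive and bounded (an illustrative choice).
        gains = 1.0 / (1.0 + np.exp(-(self.beta @ h)))
        kp, ki, kd = gains

        # Incremental PID law: u(k) = u(k-1) + kp*de + ki*e + kd*d2e.
        dterms = np.array([e - self.e1, e, e - 2.0 * self.e1 + self.e2])
        self.u += kp * dterms[0] + ki * dterms[1] + kd * dterms[2]

        # Gradient of J = 0.5*e^2 w.r.t. the output weights, propagated through
        # the plant sensitivity dy/du, the PID increment, and the sigmoid.
        grad = (-e * dy_du) * (dterms * gains * (1.0 - gains))[:, None] * h[None, :]
        self.beta -= self.lr * grad

        self.e2, self.e1 = self.e1, e
        return self.u

# Toy usage on a first-order plant y(k+1) = 0.9*y(k) + 0.1*u(k), so dy/du is roughly 0.1.
ctrl, y, r = ELMPID(), 0.0, 1.0
for _ in range(300):
    u = ctrl.step(r - y, dy_du=0.1)
    y = 0.9 * y + 0.1 * u
```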
1. Introduction
Robotic vision has important commercial and domestic applications, such as assembly and welding, fruit picking, and household services. Most robots, however, follow a fixed program to complete repetitive tasks. When discrepancies arise in the target or the robot itself, they often cannot adjust to the environment in time, largely because they lack adequate perception capability [1]. Visual servoing enables dexterous control of robots through continuous visual perception and has drawn consistent attention.
Since 1996, Hutchinson's three classic surveys [2–4] have provided a systematic understanding of visual servoing. According to the representation of the control signal, visual servoing can be categorized as position-based (PBVS), image-based (IBVS), or hybrid (HVS). In particular, IBVS has attracted widespread interest for its simple structure and insensitivity to calibration accuracy. Common IBVS control methods include adaptive [5], sliding-mode [6], fuzzy [7], and learning-based [8] approaches. Saleem et al. [9] proposed an adaptive fuzzy-tuned proportional-derivative (AFT-PD) control scheme to improve the visual tracking control of a wheeled mobile robot. Yang et al. [10] used radial basis function (RBF) neural networks to estimate the dynamic parameters of the robot and compensate for its torque, thereby improving the tracking performance of the controller.
Most IBVS studies have been carried out under the assumption that the target is stationary, so visual tracking in dynamic scenes has rarely been considered. Some researchers have estimated the image Jacobian matrix of IBVS by developing adaptive algorithms.
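For reference, such adaptive Jacobian estimation sits on top of the standard IBVS control law. Below is a minimal sketch of that textbook law for point features (not the line features used in this paper): the camera velocity is computed as v = -λ L⁺ (s - s*), where the feature depths Z, which the adaptive schemes try to estimate online, are assumed to be supplied; the function names and gain value are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Classical 2x6 interaction matrix for a normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, lam=0.5):
    """Camera velocity twist v = -lam * L^+ * (s - s*), stacking one 2x6 block per point."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(desired)).ravel()
    return -lam * np.linalg.pinv(L) @ e

# Two tracked points, their desired positions, and estimated depths in meters.
v = ibvs_velocity([(0.10, 0.05), (-0.08, 0.02)],
                  [(0.00, 0.00), (-0.05, 0.00)],
                  depths=[1.2, 1.2])
```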