Citation: Gao, F.; Hsieh, J.-G.; Kuo, Y.-S.; Jeng, J.-H. Study on Resistant Hierarchical Fuzzy Neural Networks. Electronics 2022, 11, 598. https://doi.org/10.3390/electronics11040598
Academic Editors: Slawomir Nowaczyk, Rita P. Ribeiro and Grzegorz Nalepa
Received: 30 January 2022
Accepted: 13 February 2022
Published: 15 February 2022
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
Study on Resistant Hierarchical Fuzzy Neural Networks
Fengyu Gao 1,2, Jer-Guang Hsieh 1, Ying-Sheng Kuo 3 and Jyh-Horng Jeng 4,*

1 Department of Electrical Engineering, I-Shou University, Kaohsiung 84001, Taiwan; isu10702050d@cloud.isu.edu.tw (F.G.); jghsieh@isu.edu.tw (J.-G.H.)
2 School of Electronic and Mechanical Engineering, Fujian Polytechnic Normal University, Fuqing 350300, China
3 Department of Technology Management, Open University of Kaohsiung, Kaohsiung 81208, Taiwan; ysk@ouk.edu.tw
4 Department of Information Engineering, I-Shou University, Kaohsiung 84001, Taiwan
* Correspondence: jjeng@isu.edu.tw
Abstract: Novel resistant hierarchical fuzzy neural networks are proposed in this study and their deep learning problems are investigated. These fuzzy neural networks can be used to model complex controlled plants and can also serve as fuzzy controllers. In practice, real-world data are usually contaminated by outliers, which may have undesirable or unpredictable influences on the final learning machines. The correlations between the target and each of the predictors are used to partition the input variables into groups, so that each group becomes the set of input variables of a fuzzy system at one level of the hierarchical fuzzy neural network. To enhance the resistance of the learning machines, the least trimmed squared error is used as the cost function. To test the resistance of the learning machines to the adverse effects of outliers, noise drawn from three different types of distributions, namely normal, Laplace, and uniform distributions, is added at the output node. Real-world datasets are used to compare the performance of the proposed resistant hierarchical fuzzy neural networks, resistant densely connected artificial neural networks, and densely connected artificial neural networks without noise.
Keywords: fuzzy neural network; hierarchical fuzzy neural network; outlier; resistant learning machine; deep learning
1. Introduction
No matter how the data are collected, the data at hand usually contain outliers. These data points are well separated from the bulk of the data or deviate from the general pattern of the data in some fashion. The outliers may have adverse or unpredictable influences on the final discriminant or predictive functions. In the past, many methods were proposed in statistical regression to address the problems caused by outliers [1–6]. Regression is one of the major tasks in machine learning [7–10], and it has been extensively studied using various models. Regression is applied in science education, agriculture, and signal processing [11–14]. The purpose of regression is to find the relationship between the input variables and the output variables in a dataset. However, the presence of noise and outliers distorts this relationship. The main spirit of resistant regression is not to completely discard the outliers in the dataset, but to reduce the influence of these outliers on the final estimator. These robust regression problems have also been investigated in the machine learning field [15–19]. The resistant regressor based on the least trimmed squares (LTS) approach is particularly notable because of its simplicity and ease of use.
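As a concrete illustration of the LTS idea, the short Python sketch below trims the largest squared residuals before summing, so a few gross errors cannot dominate the cost; the trimming fraction and function name are illustrative assumptions and are not taken from this paper.

import numpy as np

def lts_loss(residuals, trim_fraction=0.2):
    # Least trimmed squares: sum only the smallest squared residuals
    # and discard the largest ones (assumed trimming fraction of 20%).
    sq = np.sort(residuals ** 2)
    h = int(np.ceil((1.0 - trim_fraction) * sq.size))  # number of residuals kept
    return sq[:h].sum()

# Toy comparison: one gross outlier inflates the ordinary sum of squared
# errors but barely changes the trimmed cost.
r = np.array([0.1, -0.2, 0.05, 0.15, 25.0])
print(lts_loss(r), (r ** 2).sum())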
Fuzzy neural networks (FNNs) possess the advantages of both fuzzy systems [20,21] and neural networks [22,23]: they do not require accurate mathematical models and have good learning ability, which enables them to approximate a wide range of nonlinear functions. FNNs have been widely used as machine learning models to deal with regression problems [24,25].
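For readers unfamiliar with such models, a minimal sketch of a zero-order Takagi-Sugeno-type fuzzy inference computation is given below; it assumes Gaussian membership functions and product rule firing, and it is a generic illustration rather than the specific hierarchical architecture proposed in this study.

import numpy as np

def fuzzy_forward(x, centers, widths, consequents):
    # Gaussian membership of each input to each rule, combined by product
    # into rule firing strengths; a normalized weighted sum of the
    # (constant) rule consequents gives the output.
    diff = (x[None, :] - centers) / widths              # shape: (rules, inputs)
    firing = np.exp(-0.5 * np.sum(diff ** 2, axis=1))   # shape: (rules,)
    weights = firing / (firing.sum() + 1e-12)           # normalized firing strengths
    return np.dot(weights, consequents)                 # scalar prediction

# Example with two inputs and three rules (all parameter values are arbitrary).
rng = np.random.default_rng(0)
x = np.array([0.3, -0.7])
centers = rng.normal(size=(3, 2))
widths = np.ones((3, 2))
consequents = rng.normal(size=3)
print(fuzzy_forward(x, centers, widths, consequents))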