Citation: Ding, Y.; Zhu, H.; Chen, R.; Li, R. An Efficient AdaBoost Algorithm with the Multiple Thresholds Classification. Appl. Sci. 2022, 12, 5872. https://doi.org/10.3390/app12125872

Academic Editors: Sławomir Nowaczyk, Rita P. Ribeiro and Grzegorz Nalepa

Received: 22 April 2022; Accepted: 1 June 2022; Published: 9 June 2022
Article
An Efficient AdaBoost Algorithm with the Multiple
Thresholds Classification
Yi Ding 1, Hongyang Zhu 2,*, Ruyun Chen 2 and Ronghui Li 1

1 Maritime College, Guangdong Ocean University, Zhanjiang 524091, China; dingyi@gdou.edu.cn (Y.D.); lirh@gdou.edu.cn (R.L.)
2 College of Mathematics and Computer, Guangdong Ocean University, Zhanjiang 524091, China; chenry@gdou.edu.cn
* Correspondence: zhuhongyang@gdou.edu.cn
Featured Application: A new Weak Learn algorithm that classifies examples based on multiple thresholds is proposed. The weight-assigning scheme of the Weak Learn algorithm is changed correspondingly for the AdaBoost algorithm in this paper. A theoretical justification is provided to show its superiority, and experimental studies are presented to verify the effectiveness of the method.
Abstract: Adaptive boosting (AdaBoost) is a prominent example of an ensemble learning algorithm that combines weak classifiers into strong classifiers through weighted majority voting. AdaBoost's weak classifier, with threshold classification, tries to find the best threshold in one of the data dimensions, dividing the data into two categories, −1 and 1. However, in some cases this Weak Learning algorithm is not accurate enough, showing poor generalization performance and a tendency to over-fit. To address these challenges, we first propose a new Weak Learning algorithm that classifies examples based on multiple thresholds, rather than only one, to improve its accuracy. Second, we change the weight-allocation scheme of the Weak Learning algorithm within the AdaBoost algorithm so that the potential value of other dimensions is used in the classification process, and we provide a theoretical justification of its generality. Finally, comparative experiments between the two algorithms on 18 UCI datasets show that our improved AdaBoost algorithm achieves better generalization on the test set during the training iterations.
Keywords: AdaBoost; multiple thresholds classification; accuracy; generalization
1. Introduction
The rapid growth of the Internet has led to a dramatic increase in the rate of data generation, and data mining is one of the most important means of extracting value from such large amounts of data. Classification is a fundamental step in processing the digitized information of data mining. Accurate classification clearly saves considerable time and economic cost in subsequent work, such as analysis, forecasting, and fitting.
Ensemble methods are well suited to regression and classification: by combining multiple models into a single, more reliable model, they reduce bias and variance and thus boost prediction accuracy [1,2]. Two common techniques for constructing ensemble classifiers are Boosting [3–7] and Bagging [8–10]. Boosting generally outperforms Bagging when there is less noise in the data [11].
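As a concrete illustration of these two ensemble styles, the following sketch (our illustration, not part of the paper) compares Bagging and Boosting using scikit-learn's stock implementations; the synthetic dataset and hyperparameters are arbitrary choices for demonstration only.

```python
# Illustrative Bagging-vs-Boosting comparison with scikit-learn;
# dataset and hyperparameters are arbitrary, assumed values.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: independent stumps fitted on bootstrap resamples, combined by voting.
bagging = BaggingClassifier(DecisionTreeClassifier(max_depth=1),
                            n_estimators=50, random_state=0)
# Boosting: stumps fitted sequentially, each round reweighting the examples
# its predecessors misclassified, combined by weighted majority voting.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("Bagging", bagging), ("AdaBoost", boosting)]:
    model.fit(X_train, y_train)
    print(name, model.score(X_test, y_test))
```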
In the field of machine learning, Boosting is a classic general-purpose learning algorithm based on the "probably approximately correct" (PAC) learning model proposed by Valiant. Freund and Schapire improved the Boosting algorithm in 1995 and named the result the AdaBoost algorithm.
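To make this baseline concrete, below is a minimal sketch (our illustration, not the paper's code) of classic AdaBoost with the single-threshold weak learner described in the abstract: each round picks one feature and one threshold, labels the two sides −1 and +1, and the resulting stumps are combined by weighted majority voting. Labels are assumed to be in {−1, +1}.

```python
import numpy as np

def fit_stump(X, y, w):
    """Single-threshold weak learner: the best (feature, threshold, polarity)
    under the current example weights w; predictions are -1/+1."""
    best = (np.inf, 0, 0.0, 1)  # (weighted error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(X[:, j] <= t, -polarity, polarity)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, t, polarity)
    return best

def adaboost(X, y, rounds=20):
    """Classic AdaBoost: reweight examples each round so later stumps
    focus on previous mistakes; combine by weighted majority voting."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial example weights
    ensemble = []
    for _ in range(rounds):
        err, j, t, polarity = fit_stump(X, y, w)
        err = max(err, 1e-12)        # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, j] <= t, -polarity, polarity)
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, t, polarity))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(X[:, j] <= t, -p, p) for a, j, t, p in ensemble)
    return np.sign(score)
```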