Citation: Ding, Y.; Zhu, H.; Chen, R.; Li, R. An Efficient AdaBoost Algorithm with the Multiple Thresholds Classification. Appl. Sci. 2022, 12, 5872. https://doi.org/10.3390/app12125872

Academic Editors: Sławomir Nowaczyk, Rita P. Ribeiro and Grzegorz Nalepa

Received: 22 April 2022
Accepted: 1 June 2022
Published: 9 June 2022
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article

An Efficient AdaBoost Algorithm with the Multiple Thresholds Classification

Yi Ding ¹, Hongyang Zhu ²,*, Ruyun Chen ² and Ronghui Li ¹

¹ Maritime College, Guangdong Ocean University, Zhanjiang 524091, China; dingyi@gdou.edu.cn (Y.D.); lirh@gdou.edu.cn (R.L.)
² College of Mathematics and Computer, Guangdong Ocean University, Zhanjiang 524091, China; chenry@gdou.edu.cn
* Correspondence: zhuhongyang@gdou.edu.cn
Featured Application: A new Weak Learn algorithm that classifies examples based on multiple thresholds is proposed. The weight-assigning scheme of the Weak Learn algorithm is changed correspondingly for the AdaBoost algorithm in this paper. A theoretical analysis is provided to show the method's superiority, and experimental studies are also presented to verify its effectiveness.
Abstract: Adaptive boosting (AdaBoost) is a prominent ensemble learning algorithm that combines weak classifiers into a strong classifier through weighted majority voting. AdaBoost's weak classifier, with threshold classification, tries to find the best threshold in one of the data dimensions, dividing the data into two categories, −1 and +1. However, in some cases this Weak Learning algorithm is not accurate enough, showing poor generalization performance and a tendency to over-fit. To address these challenges, we first propose a new Weak Learning algorithm that classifies examples based on multiple thresholds, rather than only one, to improve its accuracy. Second, we change the weight-allocation scheme of the Weak Learning algorithm within the AdaBoost framework so that potential values of other dimensions are used in the classification process, and a theoretical analysis is provided to show its generality. Finally, comparative experiments between the two algorithms on 18 UCI datasets show that our improved AdaBoost algorithm achieves better generalization on the test set during the training iterations.
Keywords: AdaBoost; Multiple Thresholds Classification; accuracy; generalization
1. Introduction

The rapid growth of the Internet has led to a dramatic increase in the rate of data generation. Data mining technology is one of the most important means of extracting value from such large amounts of data, and classification is the initial operation that processes its digitized information. Accurate classification saves considerable time and economic cost in subsequent work, such as analysis, forecasting, and fitting.
Ensemble methods are ideal for regression and classification: by combining multiple models into a single, more reliable model, they can reduce bias and variance and thus boost prediction accuracy [1,2]. Two common techniques for constructing ensemble classifiers are Boosting [3–7] and Bagging [8–10]. Boosting outperforms Bagging when there is less noise in the data [11].
In the field of machine learning, Boosting is a classic general-purpose learning algorithm based on the "probably approximately correct" (PAC) learning model proposed by Valiant. Freund and Schapire improved the Boosting algorithm in 1995 and named the result the AdaBoost algorithm.
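The baseline that this paper improves upon, classic AdaBoost with single-threshold weak classifiers (decision stumps), can be sketched as follows. This is a minimal illustration of the standard algorithm, not the authors' multiple-thresholds variant; all function names are our own, and the stump search simply scans every training value in every dimension as a candidate threshold.

```python
import math

def stump_predict(x, dim, threshold, polarity):
    """Classify one example with a single-dimension threshold rule (labels are +1/-1)."""
    return polarity if x[dim] > threshold else -polarity

def best_stump(X, y, w):
    """Find the (error, dimension, threshold, polarity) stump with lowest weighted error."""
    best = None
    for dim in range(len(X[0])):
        for threshold in sorted({x[dim] for x in X}):
            for polarity in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, dim, threshold, polarity) != yi)
                if best is None or err < best[0]:
                    best = (err, dim, threshold, polarity)
    return best

def adaboost(X, y, rounds=10):
    """Train AdaBoost on stumps; returns a list of (alpha, stump) pairs."""
    n = len(X)
    w = [1.0 / n] * n                       # uniform initial weights
    ensemble = []
    for _ in range(rounds):
        err, dim, threshold, polarity = best_stump(X, y, w)
        err = max(err, 1e-10)               # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, (dim, threshold, polarity)))
        # Re-weight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, dim, threshold, polarity))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote over the trained stumps."""
    score = sum(alpha * stump_predict(x, *stump) for alpha, stump in ensemble)
    return 1 if score >= 0 else -1
```

Note that each stump consults only one dimension at a time; the weight-allocation change proposed in this paper is motivated by letting the other dimensions' potential values also inform the classification.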