Citation: Han, D.; Tohti, T.; Hamdulla, A. Attention-Based Transformer-BiGRU for Question Classification. Information 2022, 13, 214. https://doi.org/10.3390/info13050214
Academic Editor: Kostas Vergidis
Received: 11 March 2022
Accepted: 15 April 2022
Published: 20 April 2022
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Attention-Based Transformer-BiGRU for
Question Classification
Dongfang Han, Turdi Tohti * and Askar Hamdulla
College of Information Science and Engineering, Xinjiang University, Urumqi 830017, China;
easth@stu.xju.edu.cn (D.H.); askar@xju.edu.cn (A.H.)
* Correspondence: turdy@xju.edu.cn; Tel.: +86-139-9999-4696
Abstract:
A question answering (QA) system is a research direction in the field of artificial intelligence and natural language processing (NLP) that has attracted much attention and has broad development prospects. Question classification is one of the main components of a QA system, and its accuracy plays a key role in the entire QA task. Therefore, both traditional machine learning methods and today's deep learning methods are widely used and deeply studied in question classification tasks. This paper presents our work on two aspects of Chinese question classification. First, we use an answer-driven method to build a richer Chinese question classification dataset, addressing the small scale of existing experimental datasets; this approach provides a useful reference for dataset expansion, especially for constructing datasets in low-resource languages. Second, we propose a deep learning model for question classification with a Transformer + Bi-GRU + Attention structure. The Transformer has strong learning and encoding ability, but it adopts a fixed encoding length: long text is divided into multiple segments, each segment is encoded separately, and no interaction occurs between segments. Here, we achieve information interaction between segments through the Bi-GRU so as to improve the encoding of long sentences. We add the Attention mechanism to highlight the key semantics in questions that contain answers. The experimental results show that the model proposed in this paper significantly improves the accuracy of question classification.
Keywords: QA system; question classification; deep learning; Transformer; Bi-GRU; Attention
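The Transformer + Bi-GRU + Attention stack described in the abstract can be sketched roughly as follows. This is a minimal illustration in PyTorch, not the authors' implementation: all hyperparameters (vocabulary size, embedding dimension, hidden size, number of classes) are placeholder assumptions, and the additive attention layer is one common formulation rather than the paper's exact design.

```python
import torch
import torch.nn as nn

class TransformerBiGRUAttention(nn.Module):
    """Sketch of a Transformer -> Bi-GRU -> Attention question classifier.

    Hyperparameters below are illustrative, not the paper's settings.
    """
    def __init__(self, vocab_size=1000, d_model=128, n_heads=4,
                 hidden=64, n_classes=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Bi-GRU passes information across positions/segments that the
        # fixed-length Transformer encoding treats independently.
        self.bigru = nn.GRU(d_model, hidden, bidirectional=True,
                            batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)   # per-token attention score
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        h = self.encoder(self.embed(x))          # (batch, seq, d_model)
        h, _ = self.bigru(h)                     # (batch, seq, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)   # attention weights over tokens
        ctx = (w * h).sum(dim=1)                 # weighted sentence vector
        return self.fc(ctx)                      # class logits

model = TransformerBiGRUAttention()
tokens = torch.randint(0, 1000, (2, 12))  # batch of 2 questions, 12 token ids each
logits = model(tokens)
print(logits.shape)  # torch.Size([2, 6]): one score per question class
```

The attention layer here scores each Bi-GRU output vector and forms a weighted sum, so tokens carrying the key semantics contribute more to the final sentence representation before classification.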
1. Introduction
According to the Data Age 2025 white paper released by IDC, the amount of global data will reach an unprecedented 163 ZB by 2025 [1]. All walks of life generate data constantly: Mobike generates 25 million orders per day, Twitter produces 50 million messages per day, YouTube receives more than 400 h of video uploads per minute, Taobao generates 20 TB of data every day, Facebook generates 300 TB of data every day, and Google processes 24 PB of data every day. In this age of information explosion, people are often dissatisfied with search engines that simply return a list of related pages, especially in specific areas such as law and health care. As traditional search engines return ever more web pages, users find it harder to locate the key information they need. A QA system, however, can better identify users' intentions and meet their need to obtain information quickly and accurately, which has made it one of the current research hotspots.
A question answering system is an information retrieval system that accepts questions from users in natural language (e.g., What is the longest river in the world?) and finds accurate, concise answers to those questions (e.g., the Nile) in a large amount of heterogeneous data. This is fundamentally different from traditional search engines: the goal of a question answering system is to accurately answer the questions that
users ask in natural language. Compared with traditional search engines that search
based on keywords and return a collection of relevant documents, question answering