Citation: Xie, L.; Huang, H.; Du, Q. A Co-Embedding Model with Variational Auto-Encoder for Knowledge Graphs. Appl. Sci. 2022, 12, 715. https://doi.org/10.3390/app12020715
Academic Editors: Nikos D. Lagaros and Vagelis Plevris
Received: 29 November 2021
Accepted: 6 January 2022
Published: 12 January 2022
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
A Co-Embedding Model with Variational Auto-Encoder for
Knowledge Graphs
Luodi Xie 1, Huimin Huang 2,* and Qing Du 3
1 School of Computer Science, Sun Yat-Sen University, Guangzhou 510000, China; xield@mail2.sysu.edu.cn
2 School of Data Science and Artificial Intelligence, Wenzhou University of Technology, Wenzhou 325000, China
3 School of Software, South China University of Technology, Guangzhou 510000, China; duqing@scut.edu.cn
* Correspondence: huanghm45@gmail.com
Abstract: Knowledge graph (KG) embedding has been widely studied to obtain low-dimensional
representations for entities and relations. It serves as the basis for downstream tasks, such as
KG completion and relation extraction. Traditional KG embedding techniques usually represent
entities/relations as vectors or tensors, mapping them in different semantic spaces and ignoring
the uncertainties. The affinities between entities and relations are ambiguous when they are not
embedded in the same latent spaces. In this paper, we incorporate a co-embedding model for KG
embedding, which learns low-dimensional representations of both entities and relations in the same
semantic space. To address the issue of neglecting uncertainty for KG components, we propose
a variational auto-encoder that represents KG components as Gaussian distributions. In addition,
compared with previous methods, our method has the advantages of high quality and interpretability.
Our experimental results on several benchmark datasets demonstrate our model’s superiority over
the state-of-the-art baselines.
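The abstract's core idea, representing KG components as Gaussian distributions rather than point vectors, can be illustrated with the standard VAE machinery: the reparameterization trick for differentiable sampling and the KL regularizer toward a standard-normal prior. The function names and dimensions below are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    # z = mu + sigma * eps with eps ~ N(0, I): a sample from
    # N(mu, diag(exp(log_var))) through which gradients can flow
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # KL(N(mu, sigma^2) || N(0, I)), the VAE regularizer that keeps
    # each component's Gaussian embedding close to the prior
    return float(0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var))

# Each KG component (entity or relation) is a distribution, not a point:
mu = rng.normal(size=4)           # mean embedding
log_var = rng.normal(size=4)      # log-variance, capturing uncertainty
z = reparameterize(mu, log_var)   # one stochastic embedding sample
```

Modeling the variance explicitly is what lets such a model express uncertainty about a component, which point-vector embeddings discard.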
Keywords: knowledge graph; embedding; variational auto-encoder
1. Introduction
Knowledge graph (KG) embeddings are low-dimensional representations for entities and relations. This approach can benefit a range of downstream tasks, such as semantic parsing [1,2], knowledge reasoning [3], and question answering [4,5]. Embeddings are supposed to contain semantic information and should be able to deal with multiple linguistic relations.
At present, research on knowledge graph embedding proceeds along three main lines. The first comprises translation-based approaches. TransE [6] was the first model to introduce translation-based embedding: it represents entities and relations in the same space and regards the relation vector r as the translation between the head entity vector h and the tail entity vector t, that is, h + r ≈ t. Since TransE cannot handle one-to-many, many-to-one, and many-to-many relationships (1-to-N, N-to-1, N-to-N), TransH [7] was proposed to enable an entity to have different representations when involved in different relations. In the TransR model [8], an entity is a complex of multiple attributes, and different relations focus on different attributes of the entity. The second line comprises semantic-matching approaches. RESCAL [9] captures latent semantics by representing each entity as a vector and each relation as a matrix that models the interactions among latent factors; it defines the scoring function of a triple (h, r, t) as a bilinear function. DistMult [10] simplifies RESCAL by restricting the relation matrix to a diagonal matrix, which greatly improves training efficiency. ComplEx [11] extends DistMult by introducing complex-valued embeddings to better model asymmetric relations; in ComplEx, the embeddings of entities and relations no longer lie in real space but in complex space. The third line