
A multi-attention RNN-based relation linking approach for question answering over knowledge base

Li Huiying, Zhao Man, Yu Wenqi

Journal of Southeast University (English Edition), 2020, Vol. 36, Issue 4: 385-392. DOI: 10.3969/j.issn.1003-7985.2020.04.003


A multi-attention RNN-based relation linking approach for question answering over knowledge base

Li Huiying 1, Zhao Man 1, Yu Wenqi 1

Author information

  • 1. School of Computer Science and Engineering, Southeast University, Nanjing 211189


Abstract

Aiming at the relation linking task for question answering over knowledge base, especially the multi-relation linking task for complex questions, a relation linking approach based on a multi-attention recurrent neural network (RNN) model is proposed, which works for both simple and complex questions. First, vector representations of questions are learned by a bidirectional long short-term memory (Bi-LSTM) model at the word and character levels, and named entities in questions are labeled by a conditional random field (CRF) model. Candidate entities are generated from a dictionary, the disambiguation of candidate entities is realized by predefined rules, and the named entity mentions in questions are linked to entities in the knowledge base. Next, questions are classified as simple or complex by a machine learning method. Starting from the identified entities, one-hop relations are collected in the knowledge base as candidate relations for simple questions, and two-hop relations are collected for complex questions. Finally, the multi-attention Bi-LSTM model encodes questions and candidate relations, compares their similarity, and returns the candidate relation with the highest similarity as the result of relation linking. Notably, a Bi-LSTM model with one attention is adopted for simple questions, and a Bi-LSTM model with two attentions is adopted for complex questions. The experimental results show that, based on the effective entity linking method, the Bi-LSTM model with the attention mechanism improves the relation linking effectiveness for both simple and complex questions, outperforming existing relation linking methods based on graph algorithms or linguistic understanding.
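The final step of the pipeline, scoring candidate relations by similarity against an attention-pooled question representation, can be sketched as follows. This is a minimal toy illustration with plain NumPy, not the paper's implementation: the embeddings, relation names, and single dot-product attention here are all invented for demonstration, whereas the paper uses learned Bi-LSTM encoders with one or two attention layers.

```python
import numpy as np

def attention_pool(question_vecs, relation_vec):
    """Softmax-weight question token vectors by their dot-product
    relevance to a candidate relation, then sum into one vector."""
    scores = question_vecs @ relation_vec
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ question_vecs

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def link_relation(question_vecs, candidate_relations):
    """Return the candidate relation whose embedding is most similar
    to the attention-pooled question representation."""
    best_name, best_score = None, -1.0
    for name, rel_vec in candidate_relations.items():
        pooled = attention_pool(question_vecs, rel_vec)
        score = cosine(pooled, rel_vec)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy example: two question tokens, two hypothetical candidate relations.
question_vecs = np.array([[1.0, 0.0], [0.9, 0.1]])
candidates = {"birthPlace": np.array([1.0, 0.0]),
              "spouse": np.array([0.0, 1.0])}
name, score = link_relation(question_vecs, candidates)  # picks "birthPlace"
```

For complex questions, the paper applies a second attention over two-hop candidate relation paths; the same pooling-and-scoring idea would simply be applied with an additional attention layer.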


Key words

question answering over knowledge base (KBQA) / entity linking / relation linking / multi-attention bidirectional long short-term memory (Bi-LSTM) / large-scale complex question answering dataset (LC-QuAD)

Classification

Information technology and security science

Cite this article

Li Huiying, Zhao Man, Yu Wenqi. A multi-attention RNN-based relation linking approach for question answering over knowledge base[J]. Journal of Southeast University (English Edition), 2020, 36(4): 385-392.

Foundation item

The National Natural Science Foundation of China (No. 61502095).

Journal of Southeast University (English Edition)

ISSN 1003-7985
