Journal of Xidian University ›› 2020, Vol. 47 ›› Issue (3): 97-104. doi: 10.19665/j.issn1001-2400.2020.03.014


Model of abstractive text summarization for topic-aware communicating agents

ZHANG Zheming, REN Shuxia, GUO Kaijie

  1. School of Computer Science and Technology, Tiangong University, Tianjin 300387, China
  • Received: 2019-11-26  Online: 2020-06-20  Published: 2020-06-19
  • Corresponding author: REN Shuxia
  • About the first author: ZHANG Zheming (1996-), male, M.S. candidate at Tiangong University. E-mail: 1831125496@stu.tjpu.edu.com

Abstract:

To address the problem that traditional automatic text summarization models cannot generate high-quality summaries of long texts, owing to the length limitation of recurrent neural networks (RNNs), a text summarization model combining topic awareness with communicating agents is proposed. First, the encoder is divided into multiple agents that communicate with one another, which solves the problem that a long short-term memory (LSTM) network with a long input sequence cannot incorporate prior information when generating a summary. Then, a joint attention mechanism is used to introduce topic information, improving the relevance of the generated summary to the source text. Finally, the model is trained with a hybrid method incorporating reinforcement learning, which mitigates the exposure-bias problem and optimizes the evaluation metric directly. Experimental results show that the model not only generates long-text summaries with prominent topics but also scores higher than current state-of-the-art models, indicating that, with the help of topic information, the communicating-agent model can generate better summaries of long texts.

Key words: abstractive text summarization, communicating agents, topic awareness, joint attention mechanism, reinforcement learning
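The abstract does not spell out the joint attention mechanism in detail; one plausible formulation, following the common additive (Bahdanau-style) scheme, scores each encoder state against both the decoder state and a topic vector. The following is a minimal sketch under that assumption — all weights, dimensions, and the function name `joint_attention` are illustrative, not the paper's exact design:

```python
import math
import random

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(W, x):
    return [dot(row, x) for row in W]

def joint_attention(H, s, q, Wh, Ws, Wq, v):
    """Additive attention over encoder states H, jointly conditioned on
    the decoder state s and a topic vector q:
        e_i = v . tanh(Wh h_i + Ws s + Wq q)
    Returns the attention distribution and the topic-aware context vector."""
    scores = []
    for h in H:
        pre = [a + b + c for a, b, c in
               zip(matvec(Wh, h), matvec(Ws, s), matvec(Wq, q))]
        scores.append(dot(v, [math.tanh(x) for x in pre]))
    alpha = softmax(scores)  # one weight per source token
    context = [sum(a * h[j] for a, h in zip(alpha, H))
               for j in range(len(H[0]))]
    return alpha, context

# Toy example: 3 source tokens, hidden size 4.
random.seed(0)
rand_vec = lambda d: [random.uniform(-1, 1) for _ in range(d)]
rand_mat = lambda d: [rand_vec(d) for _ in range(d)]
d = 4
H = [rand_vec(d) for _ in range(3)]
s, q, v = rand_vec(d), rand_vec(d), rand_vec(d)
Wh, Ws, Wq = rand_mat(d), rand_mat(d), rand_mat(d)
alpha, context = joint_attention(H, s, q, Wh, Ws, Wq, v)
```

Because the topic term `Wq q` enters every score, source tokens related to the document's topics receive systematically higher attention weights, which is one way the mechanism can raise the relevance of the generated summary.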

CLC number:

  • TP183
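The hybrid training objective mentioned in the abstract — mixing a maximum-likelihood (teacher-forced) loss with a reinforcement-learning term that optimizes the evaluation metric directly — is commonly realized as a self-critical mixed loss. A minimal sketch under that assumption (the mixing weight `gamma` and the reward values are illustrative, not the paper's settings):

```python
def mixed_loss(ref_log_probs, sample_log_probs,
               sample_reward, baseline_reward, gamma=0.98):
    """Hybrid ML + RL objective of the form
        L = gamma * L_rl + (1 - gamma) * L_ml
    L_ml: negative log-likelihood of the reference summary (teacher forcing).
    L_rl: self-critical policy-gradient term -- a sampled summary whose
          reward (e.g. a ROUGE score) beats the greedy-decoding baseline
          is reinforced; one that falls short is penalized.
    """
    l_ml = -sum(ref_log_probs)
    l_rl = (baseline_reward - sample_reward) * sum(sample_log_probs)
    return gamma * l_rl + (1 - gamma) * l_ml

# Toy numbers: the sampled summary beats the baseline (0.4 > 0.3),
# so gradient descent on this loss would raise the probability
# of the sampled tokens.
loss = mixed_loss(
    ref_log_probs=[-0.1, -0.2],
    sample_log_probs=[-0.3, -0.1],
    sample_reward=0.4,
    baseline_reward=0.3,
)
```

Because the RL term scores whole sampled sequences rather than individual teacher-forced tokens, training on it exposes the model to its own outputs, which is how this family of objectives mitigates exposure bias.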