Event co-occurrences for prompt-based generative event argument extraction


Abstract

Recent works have introduced prompt learning for Event Argument Extraction (EAE), since prompt-based approaches reformulate downstream tasks into a format more consistent with the pre-training task of Pre-trained Language Models (PLMs), helping bridge the gap between downstream tasks and model pre-training. However, previous works overlook the multiple events that can co-occur in a sentence and the complex relationships among them. To address this issue, we propose Event Co-occurrences Prefix Event Argument Extraction (ECPEAE). ECPEAE uses a co-occurring event prefixes module to incorporate, as prefixes, the template information for all events present in the current input; this co-occurring event knowledge helps the model handle complex event relationships. Additionally, to emphasize the template of the event currently being extracted and strengthen its constraint on the output format, we employ a present event bias module that integrates the template information into the attention computation at each layer of the model. Furthermore, we introduce an adjustable copy mechanism to overcome the potential noise that this additional information introduces into the per-layer attention computation. We validate our model on two widely used EAE datasets, ACE2005-EN and ERE-EN. Experimental results demonstrate that ECPEAE achieves state-of-the-art performance on both datasets. The results also show that our model adapts effectively to low-resource settings with different training sizes.
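The abstract's "adjustable copy mechanism" blends generating tokens from the vocabulary with copying tokens from the source. The paper's exact formulation is not given here, so the following is only a minimal pointer-generator-style sketch: a learned scalar gate decides how much probability mass goes to copying, and the copy attention over source positions is scattered onto vocabulary indices. All class, argument, and shape choices below are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class AdjustableCopyGate(nn.Module):
    """Illustrative gated copy mechanism (pointer-generator style).

    Mixes the decoder's vocabulary distribution with a copy distribution
    over source tokens; a sigmoid gate on the decoder state controls the
    mixture. This is a hypothetical sketch, not the ECPEAE implementation.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(hidden_size, 1)

    def forward(self, vocab_probs, copy_attn, src_token_ids, dec_state):
        # p_copy in (0, 1): fraction of probability mass assigned to copying.
        p_copy = torch.sigmoid(self.gate(dec_state))          # (batch, 1)
        # Scatter copy-attention mass onto the vocabulary positions of the
        # corresponding source tokens.
        copy_probs = torch.zeros_like(vocab_probs)            # (batch, vocab)
        copy_probs.scatter_add_(1, src_token_ids, copy_attn)
        # Convex combination: still a valid probability distribution.
        return (1 - p_copy) * vocab_probs + p_copy * copy_probs
```

Because both input distributions sum to one per example, the gated mixture remains a valid distribution, which is what lets the gate "adjust" copy strength without renormalization.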
