D-BERT: Incorporating dependency-based attention into BERT for relation extraction
[Abstract] Relation extraction between entity pairs is an increasingly important area in natural language processing. Recently, the pre-trained bidirectional encoder representations from transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, high-level syntactic features that capture the dependency between each word and the target entities are incorporated into the pre-trained language model. Our model also utilizes the intermediate layers of BERT to acquire different levels of semantic information and designs multi-granularity features for the final relation classification. Our model offers a marked improvement over published methods for relation extraction on widely used data sets.
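The two ideas in the abstract can be illustrated with a minimal numpy sketch: attention weights over tokens are biased toward words that are close to the target entities on the dependency tree, and the pooled vectors from several intermediate layers are concatenated into a multi-granularity feature. The decay-style bias, the mean-based content score, and all function names below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dependency_biased_pool(layer_states, dep_dist, alpha=0.5):
    """Pool each layer's token states with attention weights that decay
    with dependency-path distance to the nearest target entity.

    layer_states: list of (seq_len, hidden) arrays, standing in for
                  intermediate BERT layer outputs (assumption).
    dep_dist:     (seq_len,) shortest dependency-path length from each
                  token to the nearest entity token.
    Returns a concatenated multi-granularity feature vector.
    """
    bias = -alpha * dep_dist                 # syntactically closer => higher score
    pooled = []
    for h in layer_states:
        scores = h.mean(axis=1) + bias       # toy content score + dependency bias
        w = softmax(scores)                  # attention weights over tokens
        pooled.append(w @ h)                 # weighted sum -> (hidden,)
    return np.concatenate(pooled)            # one vector per layer, concatenated

# Example: 3 intermediate layers, 5 tokens, hidden size 8
rng = np.random.default_rng(0)
layers = [rng.standard_normal((5, 8)) for _ in range(3)]
dist = np.array([0.0, 1.0, 2.0, 1.0, 3.0])  # token 0 is an entity head
feature = dependency_biased_pool(layers, dist)
print(feature.shape)  # (24,) — ready for a relation classifier
```

In a real setting the `layer_states` would come from a BERT encoder run with hidden states exposed, and `dep_dist` from a dependency parser; the concatenated vector would then feed a softmax classifier over relation labels.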
[Subject Classification] Mathematics (general)
[Keywords] pattern classification; feature extraction; text analysis; natural language processing