Position-aware Attention and Supervised Data Improve Slot Filling

Victor Zhong, Gabor Angeli, Danqi Chen, Yuhao Zhang, Christopher D. Manning

Abstract

Organized relational knowledge in the form of "knowledge graphs" is important for many applications. However, the ability to populate knowledge bases with facts automatically extracted from documents has improved frustratingly slowly. This paper simultaneously addresses two issues that have held back prior work. We first propose an effective new model, which combines an LSTM sequence model with a form of entity position-aware attention that is better suited to relation extraction. Then we build TACRED, a large (119,474 examples) supervised relation extraction dataset obtained via crowdsourcing and targeted towards TAC KBP relations. The combination of better supervised data and a more appropriate high-capacity model enables much better relation extraction performance. When the model trained on this new dataset replaces the previous relation extraction component of the best TAC KBP 2015 slot filling system, its F1 score increases markedly from 22.2% to 26.7%.
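
The abstract describes the model only at a high level: an LSTM encodes the sentence, and an attention layer scores each hidden state using both a summary query vector and embeddings of each token's position relative to the subject and object entities. The sketch below is a hypothetical PyTorch illustration of such a layer; the class name, dimensions, and tensor layout are assumptions for clarity, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PositionAwareAttention(nn.Module):
    """Illustrative entity position-aware attention layer (assumed shapes).

    Scores each LSTM hidden state from the hidden state itself, a query
    vector (e.g. the final LSTM state), and embeddings of the token's
    relative position to the subject and object entities, then returns
    the attention-weighted sentence representation.
    """

    def __init__(self, hidden_dim, pos_dim, attn_dim):
        super().__init__()
        self.w_h = nn.Linear(hidden_dim, attn_dim, bias=False)   # hidden-state term
        self.w_q = nn.Linear(hidden_dim, attn_dim, bias=False)   # query term
        self.w_p = nn.Linear(2 * pos_dim, attn_dim, bias=False)  # subject+object position term
        self.v = nn.Linear(attn_dim, 1, bias=False)              # scoring vector

    def forward(self, hidden, query, subj_pos_emb, obj_pos_emb, mask):
        # hidden:       (batch, seq_len, hidden_dim)  LSTM outputs
        # query:        (batch, hidden_dim)           summary vector (e.g. final state)
        # *_pos_emb:    (batch, seq_len, pos_dim)     relative-position embeddings
        # mask:         (batch, seq_len)              1 for real tokens, 0 for padding
        pos = torch.cat([subj_pos_emb, obj_pos_emb], dim=-1)
        scores = self.v(torch.tanh(
            self.w_h(hidden) + self.w_q(query).unsqueeze(1) + self.w_p(pos)
        )).squeeze(-1)                                  # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = F.softmax(scores, dim=-1)                # attention weights over tokens
        return torch.bmm(attn.unsqueeze(1), hidden).squeeze(1)  # (batch, hidden_dim)
```

In a full model, the returned vector would be passed to a linear layer and softmax over the relation labels; the position embeddings are typically looked up from the (clipped) signed distance of each token to the entity spans.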

Benchmarks

Benchmark                           Methodology   Metrics
relation-extraction-on-re-tacred    PA-LSTM       F1: 79.4
relation-extraction-on-tacred       PA-LSTM       F1: 65.1
