Coreference for Learning to Extract Relations: Yes Virginia, Coreference Matters

Ryan Gabbard,  Marjorie Freedman,  Ralph Weischedel
Raytheon BBN Technologies


Abstract

As an alternative to requiring substantial supervised relation training data, many have explored bootstrapping relation extraction from a few seed examples. Most techniques assume that the examples are based on easily spotted anchors, e.g., names or dates. Sentences in a corpus that contain the anchors are then used to induce alternative ways of expressing the relation. We explore whether coreference can improve the learning process. That is, if the algorithm also considered examples such as "his sister", would accuracy be improved? With coreference, we see on average a 2-fold increase in F-score. Despite using potentially errorful machine coreference, we see a significant increase in recall on all relations. Precision increases in four cases and decreases in six.
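The abstract only sketches the bootstrapping setup, so the following toy example (not the authors' code) illustrates the general idea it describes: seed entity pairs anchor a search for sentences expressing the relation, and a coreference layer lets pronominal or nominal mentions such as "his sister" count as anchors as well. The seed pairs, sentence representation, and coreference-chain input are all hypothetical placeholders.

    from collections import defaultdict

    # Hypothetical seed examples: entity pairs known to hold the target
    # relation (e.g., a sibling-of relation).
    SEED_PAIRS = {("Barack Obama", "Maya Soetoro-Ng")}


    def mentions_of(entity, sentence, coref_chains):
        """Return spans in `sentence` that refer to `entity`.

        Without coreference this is only exact-name matches; with
        coreference it also includes pronouns and descriptions that a
        coreference system has linked to the entity.
        """
        spans = {(m["start"], m["end"]) for m in sentence["mentions"]
                 if m["text"] == entity}
        for chain in coref_chains:
            if entity in {m["text"] for m in chain}:
                spans.update((m["start"], m["end"]) for m in chain
                             if m["sentence_id"] == sentence["id"])
        return sorted(spans)


    def induce_patterns(corpus, coref_chains):
        """Collect the token sequences linking two seed entities.

        Each sequence is a candidate pattern for expressing the relation,
        counted by how often it occurs.
        """
        patterns = defaultdict(int)
        for sentence in corpus:
            for e1, e2 in SEED_PAIRS:
                for s1, e1_end in mentions_of(e1, sentence, coref_chains):
                    for s2, _ in mentions_of(e2, sentence, coref_chains):
                        if e1_end < s2:
                            between = sentence["tokens"][e1_end:s2]
                            patterns[" ".join(between)] += 1
        return patterns

In this sketch, supplying non-empty coreference chains increases the number of anchor matches per seed pair, which is the mechanism the paper credits for the recall gains reported above.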




Full paper: http://www.aclweb.org/anthology/P/P11/P11-2050.pdf