Property / Value
?:abstract
  • Relation extraction (RE) is one of the most important tasks in information extraction, as it provides essential information for many NLP applications. In this paper, we propose a cross-lingual RE approach that does not require any human annotation in a target language or any cross-lingual resources. Building upon unsupervised cross-lingual representation learning frameworks, we develop several deep Transformer-based RE models with a novel encoding scheme that can effectively encode both entity location and entity type information. Our RE models, when trained with English data, outperform several deep neural network-based English RE models. More importantly, our models can be applied to perform zero-shot cross-lingual RE, achieving state-of-the-art cross-lingual RE performance on two datasets (68-89% of the accuracy of the supervised target-language RE model). The high cross-lingual transfer efficiency without requiring additional training data or cross-lingual resources shows that our RE models are especially useful for low-resource languages.
is ?:annotates of
?:arxiv_id
  • 2010.08652
?:creator
?:externalLink
?:license
  • arxiv
?:pdf_json_files
  • document_parses/pdf_json/f9254d5e7980d1d8dd766fa88baff55b9c3464b1.json
?:publication_isRelatedTo_Disease
?:sha_id
?:source
  • ArXiv
?:title
  • Cross-Lingual Relation Extraction with Transformers
?:type
?:year
  • 2020-10-16
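
The abstract above mentions an encoding scheme that captures both entity location and entity type for Transformer-based relation extraction. The sketch below is a generic illustration of that idea, not the paper's exact scheme: it wraps the two entities in typed marker tokens and feeds the sentence to a multilingual encoder. The xlm-roberta-base checkpoint, the marker token names, and the choice of classifier input are assumptions made for this example.

```python
# Minimal sketch of encoding entity location and entity type for relation
# extraction with a multilingual Transformer. This is an illustrative setup,
# not the scheme proposed in arXiv 2010.08652.
from transformers import AutoTokenizer, AutoModel
import torch

MODEL_NAME = "xlm-roberta-base"  # any multilingual encoder checkpoint would do

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

# Typed entity markers: one opening/closing pair per entity slot, carrying the
# entity type so the encoder sees both where each entity is and what it is.
entity_markers = ["<e1:PER>", "</e1:PER>", "<e2:ORG>", "</e2:ORG>"]
tokenizer.add_tokens(entity_markers, special_tokens=True)
model.resize_token_embeddings(len(tokenizer))

# Example sentence with the head entity (PER) and the tail entity (ORG)
# wrapped in their typed markers.
text = "<e1:PER> Marie Curie </e1:PER> worked at the <e2:ORG> University of Paris </e2:ORG> ."

enc = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    hidden = model(**enc).last_hidden_state  # (1, seq_len, hidden_size)

# One common choice of relation representation: concatenate the hidden states
# of the two opening marker tokens and feed them to a classification head.
ids = enc["input_ids"][0].tolist()
e1_pos = ids.index(tokenizer.convert_tokens_to_ids("<e1:PER>"))
e2_pos = ids.index(tokenizer.convert_tokens_to_ids("<e2:ORG>"))
relation_repr = torch.cat([hidden[0, e1_pos], hidden[0, e2_pos]], dim=-1)
print(relation_repr.shape)  # torch.Size([1536]) for xlm-roberta-base
```

Because the encoder is multilingual, the same marker-based input format can be reused unchanged for a target-language sentence, which is what makes zero-shot cross-lingual transfer possible in this kind of setup.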
