Type-enhanced Ensemble Triple Representation via Triple-aware Attention for Cross-lingual Entity Alignment

Entity alignment (EA) is a crucial task for integrating cross-lingual and cross-domain knowledge graphs (KGs), aiming to discover entities that refer to the same real-world object across different KGs. Most existing methods generate aligned entity representations by mining the relevance of triple elements via embedding-based methods, paying little attention to triple indivisibility and entity role diversity. In this paper, we propose a novel framework named TTEA -- Type-enhanced Ensemble Triple Representation via Triple-aware Attention for Cross-lingual Entity Alignment -- to address these issues by considering ensemble triple specificity and entity role features. Specifically, the ensemble triple representation is derived by regarding the relation as an information carrier between the semantic space and the type space, so that the noise introduced during spatial transformation and information propagation can be smoothly controlled via specificity-aware triple attention. Moreover, our framework uses triple-aware entity enhancement to model the role diversity of triple elements. Extensive experiments on three real-world cross-lingual datasets demonstrate that our framework outperforms state-of-the-art methods.
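The following is a minimal sketch, not the authors' implementation, of the kind of triple-aware attention the abstract describes: triples are scored as indivisible units, and the relation embedding carries information from a semantic space into a separate type space. All names (`TripleAwareAttention`, `sem_dim`, `type_dim`, the projection and scoring choices) are assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TripleAwareAttention(nn.Module):
    """Illustrative sketch: score (head, relation, tail) triples as wholes and
    use the relation as the carrier from semantic space into type space."""

    def __init__(self, sem_dim: int, type_dim: int):
        super().__init__()
        # Projection from semantic space to type space (assumed form).
        self.sem_to_type = nn.Linear(sem_dim, type_dim, bias=False)
        # Attention scorer over the concatenated (ensemble) triple representation.
        self.attn = nn.Linear(3 * sem_dim, 1, bias=False)

    def forward(self, h, r, t):
        # h, r, t: (num_triples, sem_dim) embeddings of heads, relations, tails
        # belonging to one entity's neighbourhood.
        triple = torch.cat([h, r, t], dim=-1)      # ensemble triple representation
        score = F.leaky_relu(self.attn(triple))    # specificity-aware attention logit
        alpha = torch.softmax(score, dim=0)        # normalize over the neighbourhood
        # Relation-mediated transfer into the type space; attention down-weights
        # noisy triples during propagation.
        type_msg = self.sem_to_type(r * t)
        return (alpha * type_msg).sum(dim=0)       # aggregated type-space message


if __name__ == "__main__":
    torch.manual_seed(0)
    num_triples, sem_dim, type_dim = 5, 8, 4
    h = torch.randn(num_triples, sem_dim)
    r = torch.randn(num_triples, sem_dim)
    t = torch.randn(num_triples, sem_dim)
    out = TripleAwareAttention(sem_dim, type_dim)(h, r, t)
    print(out.shape)  # torch.Size([4])
```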
