CoNLL (EMNLP) 2021 • Junxing Wang, Xinyi Li, Zhen Tan, Xiang Zhao, Weidong Xiao
A bidirectional attention mechanism is applied between the question sequence and the paths connecting entities, which makes the model transparently interpretable.
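The core idea can be illustrated with a minimal sketch of bidirectional attention between a question and an entity path. This is a simplified, hypothetical illustration (the function name, dot-product similarity, and NumPy formulation are assumptions, not the paper's implementation): each question token attends over path tokens and vice versa, and the resulting attention weights are what makes the match between question and path inspectable.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_attention(Q, P):
    """Sketch of bidirectional attention (an assumption, not the paper's code).

    Q: question token embeddings, shape (m, d)
    P: entity-path token embeddings, shape (n, d)

    Returns question-aware path summaries, path-aware question summaries,
    and the two attention-weight matrices, which can be inspected to see
    which path steps each question token focuses on (and vice versa).
    """
    S = Q @ P.T                       # (m, n) similarity matrix
    q2p = softmax(S, axis=1)          # question-to-path attention weights
    p2q = softmax(S.T, axis=1)        # path-to-question attention weights
    attended_path = q2p @ P           # (m, d): path context per question token
    attended_question = p2q @ Q       # (n, d): question context per path token
    return attended_path, attended_question, q2p, p2q
```

Each row of `q2p` and `p2q` is a probability distribution over the other sequence's tokens, so plotting these matrices gives a direct, human-readable view of which path segments support the answer.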