EMNLP 2020 • Chen Jia, Yuefeng Shi, Qinrong Yang, Yue Zhang
We then integrate the entity information into BERT using a Char-Entity-Transformer, which augments self-attention with a combination of character and entity representations.
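The idea of mixing character and entity representations inside self-attention can be sketched as follows. This is a minimal, hypothetical single-head illustration (not the paper's exact formulation): queries stay character-based, while keys and values at positions covered by a tagged entity are swapped for that entity's embedding; the names `char_entity_attention`, `entity_ids`, and the use of index 0 for "no entity" are assumptions for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def char_entity_attention(char_h, entity_emb, entity_ids):
    """Single-head self-attention over a character sequence.

    char_h:     (seq_len, d) character hidden states
    entity_emb: (num_entities, d) entity embedding table; row 0 unused
    entity_ids: (seq_len,) entity index per position, 0 = no entity

    Keys/values at entity-covered positions use the entity embedding
    instead of the character state (simplified combination scheme).
    """
    d = char_h.shape[-1]
    # Hybrid key/value input: entity embedding where an entity is tagged,
    # otherwise the plain character representation.
    kv = np.where((entity_ids > 0)[:, None], entity_emb[entity_ids], char_h)
    q, k, v = char_h, kv, kv  # queries remain character-based
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
char_h = rng.normal(size=(5, 8))        # 5 characters, hidden size 8
entity_emb = rng.normal(size=(3, 8))    # 2 entities + padding row 0
entity_ids = np.array([0, 1, 1, 0, 2])  # chars 2-3 share entity 1
out = char_entity_attention(char_h, entity_emb, entity_ids)
```

The output keeps the character sequence shape `(5, 8)`, so the layer can slot into a Transformer stack in place of standard self-attention.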