no code implementations • 7 Feb 2024 • Zhepei Wei, Chuanhao Li, Tianze Ren, Haifeng Xu, Hongning Wang
To enhance the efficiency and practicality of federated bandit learning, recent advances have introduced incentives to motivate communication among clients, where a client participates only when the incentive offered by the server outweighs its participation cost.
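The participation rule described in the abstract can be sketched minimally as follows; the function name and the numeric values are illustrative assumptions, not details from the paper:

```python
def participates(incentive: float, cost: float) -> bool:
    """A client joins a communication round only when the server's
    offered incentive outweighs its participation cost (an
    illustrative sketch of the rule described above)."""
    return incentive > cost

# Hypothetical values: an incentive of 1.5 exceeds a cost of 1.0,
# so this client would choose to participate.
print(participates(1.5, 1.0))  # True
print(participates(0.5, 1.0))  # False
```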
1 code implementation • 7 Nov 2022 • Erxin Yu, Lan Du, Yuan Jin, Zhepei Wei, Yi Chang
Recently, discrete latent variable models have received a surge of interest in both Natural Language Processing (NLP) and Computer Vision (CV), owing to performance comparable to that of their continuous counterparts in representation learning while being more interpretable in their predictions.
1 code implementation • 24 Nov 2021 • Zhining Liu, Pengfei Wei, Zhepei Wei, Boyang Yu, Jing Jiang, Wei Cao, Jiang Bian, Yi Chang
Class imbalance is a common problem in machine learning practice.
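To make the problem concrete, here is a generic illustration of class imbalance and the simplest possible mitigation, random oversampling of the minority class; this is a textbook baseline, not the method proposed in the paper:

```python
import random
from collections import Counter

def random_oversample(samples, labels, seed=0):
    """Naively rebalance a binary dataset by duplicating minority-class
    samples until both classes are equally represented. A generic
    illustration of handling class imbalance, not the paper's method."""
    rng = random.Random(seed)
    counts = Counter(labels)
    majority, minority = sorted(counts, key=counts.get, reverse=True)
    minority_idx = [i for i, y in enumerate(labels) if y == minority]
    deficit = counts[majority] - counts[minority]
    extra = [rng.choice(minority_idx) for _ in range(deficit)]
    new_samples = samples + [samples[i] for i in extra]
    new_labels = labels + [labels[i] for i in extra]
    return new_samples, new_labels

# Toy imbalanced dataset: four samples of class 0, one of class 1.
X = [[0.1], [0.2], [0.3], [0.4], [0.9]]
y = [0, 0, 0, 0, 1]
X2, y2 = random_oversample(X, y)
print(Counter(y2))  # both classes now have 4 samples
```

Naive duplication like this risks overfitting to the repeated minority samples, which is one motivation for the more careful resampling and ensemble strategies studied in this line of work.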
5 code implementations • ACL 2020 • Zhepei Wei, Jianlin Su, Yue Wang, Yuan Tian, Yi Chang
Extracting relational triples from unstructured text is crucial for large-scale knowledge graph construction.
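A relational triple is a (subject, relation, object) tuple recovered from free text. The toy pattern-based extractor below only illustrates what such a triple looks like; the sentence, relation name, and regex are illustrative assumptions, and real systems (including the one in this paper) learn extraction from data rather than using hand-written patterns:

```python
import re

def extract_born_in(sentence):
    """Toy single-pattern extractor for a hypothetical 'born_in'
    relation, purely to show the (subject, relation, object) format."""
    m = re.match(r"(.+?) was born in (.+?)\.", sentence)
    if m:
        return (m.group(1), "born_in", m.group(2))
    return None

print(extract_born_in("Barack Obama was born in Honolulu."))
# ('Barack Obama', 'born_in', 'Honolulu')
```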
Ranked #5 on Relation Extraction on NYT11-HRL
no code implementations • 23 Aug 2019 • Zhepei Wei, Yantao Jia, Yuan Tian, Mohammad Javad Hosseini, Sujian Li, Mark Steedman, Yi Chang
In this work, we first introduce the hierarchical dependency and horizontal commonality between the entity and triple levels, and then propose an entity-enhanced dual tagging framework that enables the triple extraction (TE) task to utilize such interactions with self-learned entity features through an auxiliary entity extraction (EE) task, without breaking the joint decoding of relational triples.