no code implementations • EMNLP 2020 • Wenqing Chen, Jidong Tian, Liqiang Xiao, Hao He, Yaohui Jin
In the field of causal inference, GS in our model is essentially a counterfactual reasoning process that estimates the causal effect between tasks and leverages it to improve multi-task learning (MTL).
no code implementations • EMNLP 2021 • Jidong Tian, Yitian Li, Wenqing Chen, Liqiang Xiao, Hao He, Yaohui Jin
Recently, language models (LMs) have achieved strong performance on many NLU tasks, which has spurred widespread interest in their possible applications in scientific and social domains.
no code implementations • EMNLP 2020 • Liqiang Xiao, Lu Wang, Hao He, Yaohui Jin
Previous work is mostly based on statistical methods that estimate word-level salience, which do not consider semantics or the larger context when quantifying importance.
no code implementations • EMNLP 2021 • Liqiang Xiao, Jun Ma, Xin Luna Dong, Pascual Martinez-Gomez, Nasser Zalmout, Wei Chen, Tong Zhao, Hao He, Yaohui Jin
Successful conversational search systems can provide a natural, adaptive, and interactive shopping experience for online shopping customers.
no code implementations • COLING 2020 • Wenqing Chen, Jidong Tian, Liqiang Xiao, Hao He, Yaohui Jin
In this paper, we propose a semantically consistent and syntactically variational encoder-decoder framework, which uses adversarial learning to ensure that the syntactic latent variable is semantic-free.
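The "semantic-free" constraint above is typically enforced adversarially: a discriminator tries to recover semantic information from the syntactic latent, while the encoder is updated with the reversed gradient so that latent carries no semantic signal. Below is a minimal NumPy sketch of that adversarial loop under assumed toy shapes; the linear encoder, logistic discriminator, and all names are illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Encoder: linear map from input x to a 2-d "syntactic" latent z (toy stand-in).
W_enc = rng.normal(scale=0.1, size=(4, 2))
# Discriminator: logistic classifier on z that tries to predict a semantic label.
w_dis = rng.normal(scale=0.1, size=2)

def adversarial_step(x, y, lr=0.1, lam=1.0):
    """One update: discriminator descends on BCE; encoder ascends (gradient reversal)."""
    global W_enc, w_dis
    z = x @ W_enc                       # (batch, 2) syntactic latent
    p = sigmoid(z @ w_dis)              # discriminator's semantic prediction
    g = (p - y) / len(y)                # gradient of BCE w.r.t. the logits
    grad_w_dis = z.T @ g                # discriminator gradient
    grad_z = np.outer(g, w_dis)         # gradient flowing back into the latent
    grad_W_enc = x.T @ grad_z           # encoder gradient
    w_dis -= lr * grad_w_dis            # discriminator: minimize its loss
    W_enc += lr * lam * grad_W_enc      # encoder: maximize it (reversed gradient)
    eps = 1e-9
    return float(-np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

x = rng.normal(size=(8, 4))
y = rng.integers(0, 2, size=8).astype(float)
losses = [adversarial_step(x, y) for _ in range(5)]
```

At convergence of such a game, the discriminator does no better than chance on the syntactic latent, which is what "semantic-free" means operationally.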
no code implementations • EMNLP 2018 • Liqiang Xiao, Honglun Zhang, Wenqing Chen, Yongkun Wang, Yaohui Jin
Multi-task learning can share knowledge among related tasks and implicitly increase the training data.
no code implementations • COLING 2018 • Liqiang Xiao, Honglun Zhang, Wenqing Chen, Yongkun Wang, Yaohui Jin
Neural network based multi-task learning has achieved great success on many NLP problems by sharing knowledge among tasks, linking some layers to enhance performance.
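The layer-linking idea described above is commonly realized as hard parameter sharing: a shared lower layer feeds task-specific output heads. A minimal NumPy sketch of that structure, with illustrative dimensions and names (not the paper's actual model):

```python
import numpy as np

rng = np.random.default_rng(42)

class SharedMTL:
    """Hard parameter sharing: one linked hidden layer, one private head per task."""

    def __init__(self, in_dim, hid_dim, task_classes):
        # Shared layer whose parameters are updated by every task's loss.
        self.W_shared = rng.normal(scale=0.1, size=(in_dim, hid_dim))
        # Task-specific output heads, one per task, with per-task class counts.
        self.heads = [rng.normal(scale=0.1, size=(hid_dim, c)) for c in task_classes]

    def forward(self, x, task_id):
        h = np.tanh(x @ self.W_shared)   # features shared among all tasks
        return h @ self.heads[task_id]   # task-specific logits

# Two hypothetical tasks: binary sentiment (2 classes) and topic (5 classes).
model = SharedMTL(in_dim=16, hid_dim=8, task_classes=[2, 5])
x = rng.normal(size=(3, 16))
logits_t0 = model.forward(x, 0)   # shape (3, 2)
logits_t1 = model.forward(x, 1)   # shape (3, 5)
```

Because `W_shared` receives gradients from every task, related tasks regularize each other, which is the mechanism behind the implicit data increase mentioned above.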
no code implementations • NAACL 2018 • Liqiang Xiao, Honglun Zhang, Wenqing Chen
This success can be largely attributed to feature sharing achieved by fusing some layers across tasks.
no code implementations • EMNLP 2018 • Honglun Zhang, Liqiang Xiao, Wenqing Chen, Yongkun Wang, Yaohui Jin
Multi-task learning in text classification leverages implicit correlations among related tasks to extract common features and yield performance gains.
no code implementations • 10 Jul 2017 • Honglun Zhang, Liqiang Xiao, Yongkun Wang, Yaohui Jin
Multi-task learning leverages potential correlations among related tasks to extract common features and yield performance gains.