no code implementations • EMNLP 2020 • Sungjoon Park, Kiwoong Park, Jaimeen Ahn, Alice Oh
We analyze social media to detect the suicidal risk of military personnel, which is especially crucial for countries with compulsory military service such as the Republic of Korea.
no code implementations • NAACL (GeBNLP) 2022 • Jaimeen Ahn, Hwaran Lee, JinHwa Kim, Alice Oh
Knowledge distillation is widely used to transfer the language understanding of a large model to a smaller model. However, after knowledge distillation, the smaller model has been found to exhibit greater gender bias than the source large model. This paper studies what causes gender bias to increase during the knowledge distillation process. Moreover, we suggest applying a variant of mixup during knowledge distillation, used to increase generalizability throughout the distillation process rather than for data augmentation. By doing so, we can significantly reduce the amplification of gender bias after knowledge distillation. We also conduct experiments on the GLUE benchmark to demonstrate that applying mixup does not have a significant adverse effect on the model’s performance.
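A minimal sketch of what mixup applied during distillation could look like, assuming logit-level distillation with a temperature-scaled KL loss and interpolation at the embedding level (discrete token ids cannot be mixed directly). The names `student`, `teacher`, `alpha`, and `temperature` are illustrative assumptions, not the paper's actual implementation:

```python
import torch
import torch.nn.functional as F

def mixup_distillation_loss(student, teacher, emb, labels,
                            alpha=0.4, temperature=2.0):
    """Hypothetical mixup-on-distillation step; not the paper's exact method.

    `student` and `teacher` are assumed to map input embeddings to logits.
    """
    # Sample a mixing coefficient and a random pairing of the batch.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(emb.size(0))

    # Mixup at the embedding level, since tokens are discrete.
    mixed = lam * emb + (1 - lam) * emb[perm]

    with torch.no_grad():
        t_logits = teacher(mixed)   # teacher provides soft targets
    s_logits = student(mixed)

    # Temperature-scaled soft-label distillation loss.
    kd = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Supervised loss interpolated over both label sets, as in standard mixup.
    ce = (lam * F.cross_entropy(s_logits, labels)
          + (1 - lam) * F.cross_entropy(s_logits, labels[perm]))
    return kd + ce
```

Interpolating hidden embeddings rather than raw tokens is the usual way to adapt mixup to text; the interpolated cross-entropy term follows the standard mixup objective.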
1 code implementation • 1 Sep 2022 • Dongkwan Kim, Jiho Jin, Jaimeen Ahn, Alice Oh
Subgraphs are rich substructures in graphs, and their nodes and edges may be only partially observed in real-world tasks.
1 code implementation • 23 May 2022 • Younghoon Jeong, Juhyun Oh, Jaimeen Ahn, Jongwon Lee, Jihyung Moon, Sungjoon Park, Alice Oh
Recent directions for offensive language detection include hierarchical modeling, identifying the type and target of offensive language, and interpretability with offensive span annotation and prediction.
no code implementations • 29 Sep 2021 • Dongkwan Kim, Jiho Jin, Jaimeen Ahn, Alice Oh
Subgraphs are important substructures of graphs, but learning their representations remains understudied.
1 code implementation • EMNLP 2021 • Jaimeen Ahn, Alice Oh
Which of the two methods works better depends on the amount of NLP resources available for that language.