1 code implementation • 31 May 2024 • Byeonghu Na, Suhyeon Jo, Yeongmin Kim, Il-Chul Moon
Relation extraction (RE) is a fundamental task in natural language processing, aiming to identify relations between target entities in text.
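To make the task concrete, a minimal illustration (a hypothetical example, not this paper's method): given a sentence and a marked entity pair, an RE model predicts the relation holding between them.

```python
# Minimal illustration of the relation extraction task.
# Sentence, entity pair, and label set are all hypothetical.
sentence = "Marie Curie was born in Warsaw."
head, tail = "Marie Curie", "Warsaw"
candidate_relations = ["born_in", "works_for", "no_relation"]

# An RE model scores each candidate relation for the (head, tail) pair;
# here the expected prediction would be:
prediction = "born_in"
print(f"({head}, {prediction}, {tail})")
```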
1 code implementation • 28 May 2024 • Byeonghu Na, Yeongmin Kim, Minsang Park, DongHyeok Shin, Wanmo Kang, Il-Chul Moon
Recent advances in powerful pre-trained diffusion models encourage the development of methods that improve sampling performance given a well-trained diffusion model.
no code implementations • 27 May 2024 • Yeongmin Kim, Kwanghyeon Lee, Minsang Park, Byeonghu Na, Il-Chul Moon
Recent studies have employed an auxiliary encoder to infer the representation corresponding to a sample and to adjust the dimensionality of the latent variable z.
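A minimal sketch of what such an auxiliary encoder might look like, assuming a simple MLP over flattened inputs (the module and parameter names are illustrative, not taken from the paper):

```python
import torch
import torch.nn as nn

class AuxiliaryEncoder(nn.Module):
    """Hypothetical auxiliary encoder in the spirit of the abstract:
    it maps a data sample x to a representation and lets us choose
    the dimensionality of the latent variable z."""
    def __init__(self, data_dim: int, latent_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(data_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, latent_dim),  # latent_dim sets dim(z)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Usage: encode a batch of flattened samples into a 32-dimensional z.
enc = AuxiliaryEncoder(data_dim=784, latent_dim=32)
z = enc(torch.randn(8, 784))
print(z.shape)  # torch.Size([8, 32])
```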
1 code implementation • 2 Mar 2024 • Yeongmin Kim, Byeonghu Na, Minsang Park, JoonHo Jang, Dongjun Kim, Wanmo Kang, Il-Chul Moon
While directly applying it to score-matching is intractable, we discover that using the time-dependent density ratio both for reweighting and score correction can lead to a tractable form of the objective function to regenerate the unbiased data density.
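A hedged sketch of the score-correction half of this idea, assuming access to a learned time-dependent log density ratio log r(x_t, t) ≈ log p_data(x_t) − log p_biased(x_t) (the function names and signatures are assumptions, not the paper's API):

```python
import torch

def unbiased_score(score_biased, log_ratio, x_t, t):
    """Sketch: if r(x_t, t) estimates the time-dependent density ratio
    p_data(x_t) / p_biased(x_t), the corrected score is the biased score
    plus the gradient of log r with respect to x_t."""
    x_t = x_t.detach().requires_grad_(True)
    log_r = log_ratio(x_t, t).sum()
    grad_log_r = torch.autograd.grad(log_r, x_t)[0]
    # s_data(x_t, t) = s_biased(x_t, t) + ∇_x log r(x_t, t)
    return score_biased(x_t, t) + grad_log_r
```

The same ratio r(x_t, t) can serve as an importance weight on the per-sample denoising loss, which is the reweighting half of the objective described above.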
1 code implementation • 27 Feb 2024 • Byeonghu Na, Yeongmin Kim, HeeSun Bae, Jung Hyun Lee, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
This paper proposes Transition-aware weighted Denoising Score Matching (TDSM) for training conditional diffusion models with noisy labels, the first such study in the diffusion model literature.
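A hedged sketch of the transition-aware weighting ingredient, assuming a known label transition matrix T[clean, noisy] (this is illustrative, not the paper's exact objective):

```python
import torch

def tdsm_weights(transition_matrix, noisy_labels):
    """Sketch: weight per-class score-matching terms by the probability
    that each observed noisy label arose from each clean class, read off
    a label transition matrix T[clean, noisy]."""
    w = transition_matrix[:, noisy_labels].T          # (batch, num_classes)
    return w / w.sum(dim=1, keepdim=True)

# Usage with a symmetric-noise transition matrix over 3 classes.
T = torch.full((3, 3), 0.1)
T.fill_diagonal_(0.8)
print(tdsm_weights(T, torch.tensor([0, 2])))
```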
1 code implementation • Proceedings of the 40th International Conference on Machine Learning 2023 • Yoon-Yeong Kim, Youngjae Cho, JoonHo Jang, Byeonghu Na, Yeongmin Kim, Kyungwoo Song, Wanmo Kang, Il-Chul Moon
Specifically, our proposed method, Sharpness-Aware Active Learning (SAAL), constructs its acquisition function by selecting the unlabeled instances whose perturbed loss is maximal.
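A hedged, SAM-style sketch of such an acquisition score, assuming pseudo-labels for the unlabeled pool (the details are assumptions, not the released SAAL code):

```python
import torch

def saal_score(model, loss_fn, x, y_pseudo, rho=0.05):
    """Sketch: ascend the loss along the normalized parameter gradient
    (sharpness-aware perturbation), record the perturbed loss as the
    acquisition score, then restore the weights."""
    params = [p for p in model.parameters() if p.requires_grad]
    loss = loss_fn(model(x), y_pseudo)
    grads = torch.autograd.grad(loss, params)
    norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
    with torch.no_grad():
        for p, g in zip(params, grads):
            p.add_(rho * g / (norm + 1e-12))   # step toward the sharp point
        perturbed = loss_fn(model(x), y_pseudo).item()
        for p, g in zip(params, grads):
            p.sub_(rho * g / (norm + 1e-12))   # restore original weights
    return perturbed  # larger = sharper = more informative to label
```

Instances with the largest scores would then be sent for labeling.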
2 code implementations • 28 Nov 2022 • Dongjun Kim, Yeongmin Kim, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
In sample generation, we add an auxiliary term to the pre-trained score to deceive the discriminator.
Ranked #1 on Conditional Image Generation on CIFAR-10
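A hedged sketch of the sampling-time correction this entry describes, assuming a discriminator d(x_t, t) that outputs the probability that x_t comes from real data (the names are illustrative):

```python
import torch

def guided_score(pretrained_score, discriminator, x_t, t):
    """Sketch of discriminator guidance: add the gradient of the
    discriminator's log density-ratio estimate (its logit) to the
    pre-trained score during sampling."""
    x_t = x_t.detach().requires_grad_(True)
    d = discriminator(x_t, t)                   # d ≈ P(x_t is real)
    log_ratio = torch.log(d / (1.0 - d)).sum()  # discriminator logit
    correction = torch.autograd.grad(log_ratio, x_t)[0]
    return pretrained_score(x_t, t) + correction
```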
no code implementations • 26 Oct 2022 • Yeongmin Kim, Huiwon Jang, DongKeon Lee, Ho-Jin Choi
To address these observations, we propose a simple solution, AltUB, which introduces alternating training to update the base distribution of a normalizing flow for anomaly detection.
Ranked #2 on Anomaly Detection on BTAD (using extra training data)
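A hedged, self-contained sketch of the alternating-update idea, with a toy affine flow standing in for a real normalizing flow (all components here are illustrative, not AltUB's architecture):

```python
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Toy invertible map x -> z = (x - b) * exp(-s); a stand-in for a
    real normalizing flow purely to make the sketch runnable."""
    def __init__(self, dim):
        super().__init__()
        self.s = nn.Parameter(torch.zeros(dim))
        self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        z = (x - self.b) * torch.exp(-self.s)
        log_det = -self.s.sum().expand(x.size(0))  # log|det dz/dx|
        return z, log_det

dim = 4
flow = AffineFlow(dim)
base_mu = torch.zeros(dim, requires_grad=True)         # learnable base mean
base_log_sigma = torch.zeros(dim, requires_grad=True)  # learnable base scale

flow_opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
base_opt = torch.optim.Adam([base_mu, base_log_sigma], lr=1e-2)

for step in range(100):
    x = torch.randn(32, dim) * 2.0 + 1.0               # stand-in "normal" data
    z, log_det = flow(x)
    base = torch.distributions.Normal(base_mu, base_log_sigma.exp())
    nll = -(base.log_prob(z).sum(dim=1) + log_det).mean()
    opt = base_opt if step % 2 else flow_opt           # alternate updates
    opt.zero_grad()
    nll.backward()
    opt.step()
```

At test time, the negative log-likelihood under the (updated) base distribution would serve as the anomaly score.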