no code implementations • 2 Apr 2024 • Chaerin Kong, Seungyong Lee, Soohyeok Im, Wonsuk Yang
Image editing has been a long-standing challenge in the research community with its far-reaching impact on numerous applications.
no code implementations • 22 Aug 2023 • Donghoon Han, Seunghyeon Seo, Donghyeon Jeon, Jiho Jang, Chaerin Kong, Nojun Kwak
Transformers have demonstrated tremendous success not only in the natural language processing (NLP) domain but also in the field of computer vision, igniting various creative approaches and applications.
no code implementations • 6 May 2023 • Seungwoo Lee, Chaerin Kong, Donghyeon Jeon, Nojun Kwak
Recent advances in diffusion models have showcased promising results in the text-to-video (T2V) synthesis task.
no code implementations • 10 Feb 2023 • Chaerin Kong, Nojun Kwak
Recent years have witnessed astonishing advances in the field of multimodal representation learning, with contrastive learning being the cornerstone for major breakthroughs.
no code implementations • 21 Nov 2022 • Jiho Jang, Chaerin Kong, Donghyeon Jeon, Seonhoon Kim, Nojun Kwak
Contrastive learning is a form of distance-based (metric) learning that aims to learn invariant features from two related representations.
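The idea of pulling two related representations together while pushing unrelated ones apart can be sketched with an InfoNCE-style objective. This is a minimal illustrative implementation, not the paper's exact formulation; the function name and temperature value are assumptions.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """Minimal InfoNCE-style contrastive loss.

    z1, z2: (batch, dim) arrays of two related representations
    (e.g. two augmented views of the same inputs); matching rows
    are positive pairs, all other rows serve as negatives.
    """
    # L2-normalize so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature  # (batch, batch) similarity matrix
    # Row i's positive is column i; maximize its log-softmax probability.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

With perfectly aligned views the positive pairs dominate each row's softmax, so the loss is lower than when the pairing is scrambled.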
no code implementations • 12 Oct 2022 • Chaerin Kong, Donghyeon Jeon, Ohjoon Kwon, Nojun Kwak
Fashion attribute editing is a task that aims to convert the semantic attributes of a given fashion image while preserving the irrelevant regions.
no code implementations • 9 Oct 2022 • Yeji Song, Chaerin Kong, Seoyoung Lee, Nojun Kwak, Joonseok Lee
Neural Radiance Fields (NeRF) achieve photo-realistic image rendering from novel views, and Neural Scene Graphs (NSG) (Ost et al., 2021) extend this to dynamic scenes (video) with multiple objects.
no code implementations • 29 Jul 2022 • Chaerin Kong, Nojun Kwak
The capacity to learn incrementally from an online stream of data is an envied trait of human learners, as deep neural networks typically suffer from catastrophic forgetting and the stability-plasticity dilemma.
1 code implementation • 25 Nov 2021 • Jiho Jang, Seonhoon Kim, KiYoon Yoo, Chaerin Kong, Jangho Kim, Nojun Kwak
Through self-distillation, the intermediate layers become better suited for instance discrimination, so the performance of an early-exited sub-network degrades only slightly from that of the full network.
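The self-distillation idea above can be sketched as regressing each intermediate layer's representation toward the final layer's, so that a sub-network truncated at any of those layers produces features close to the full network's. This is a hypothetical illustration under that assumption, not the paper's exact objective; the function name is invented.

```python
import numpy as np

def early_exit_distillation_loss(intermediate_feats, final_feat):
    """Average MSE between each intermediate-layer representation and
    the final layer's representation.  Minimizing this pulls early
    exits toward the full network's feature space."""
    losses = [np.mean((f - final_feat) ** 2) for f in intermediate_feats]
    return float(np.mean(losses))
```

The loss is zero exactly when every intermediate representation already matches the final one, i.e. when early exiting costs nothing.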
3 code implementations • 23 Nov 2021 • Chaerin Kong, Jeesoo Kim, Donghoon Han, Nojun Kwak
Producing diverse and realistic images with generative models such as GANs typically requires large-scale training with vast amounts of images.