1 code implementation • 7 Apr 2024 • Saeyoon Oh, Shin Yoo
When applying the Transformer architecture to source code, designing a good self-attention mechanism is critical, as it determines how node relationships are extracted from the Abstract Syntax Trees (ASTs) of the source code.
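A minimal sketch of the general idea, not the paper's method: one common way to make self-attention AST-aware is to add a bias derived from pairwise node relations (e.g. tree distances) to the attention logits. The function name, shapes, and bias values below are illustrative assumptions.

```python
import numpy as np

def ast_self_attention(x, rel_bias):
    """Toy self-attention over AST node embeddings.

    x:        (n, d) node embeddings
    rel_bias: (n, n) bias encoding pairwise node relations
              (e.g. favoring parent-child pairs); added to the
              attention logits before the softmax.
    Projections are left as identities for brevity.
    """
    d = x.shape[-1]
    logits = x @ x.T / np.sqrt(d) + rel_bias          # (n, n)
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ x                                # (n, d)

# 3 hypothetical AST nodes with 4-dim embeddings; the bias boosts
# attention between node 0 and its two children (nodes 1 and 2).
x = np.random.randn(3, 4)
bias = np.array([[0.0, 1.0, 1.0],
                 [1.0, 0.0, 0.0],
                 [1.0, 0.0, 0.0]])
out = ast_self_attention(x, bias)
print(out.shape)  # (3, 4)
```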
1 code implementation • 22 Aug 2022 • Jinwoo Kim, Saeyoon Oh, Sungjun Cho, Seunghoon Hong
Many problems in computer vision and machine learning can be cast as learning on hypergraphs that represent higher-order relations.
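As a concrete illustration of "higher-order relations" (not code from the paper): a hypergraph is often represented by a node-by-hyperedge incidence matrix, where a hyperedge may join more than two nodes at once. The example hypergraph below is a made-up one.

```python
import numpy as np

# A hypergraph on 4 nodes with 2 hyperedges: {0, 1, 2} and {1, 3}.
# H[v, e] = 1 iff node v belongs to hyperedge e; the first hyperedge
# is a higher-order relation joining three nodes simultaneously.
H = np.array([
    [1, 0],
    [1, 1],
    [1, 0],
    [0, 1],
])
node_degrees = H.sum(axis=1)  # number of hyperedges each node joins
edge_sizes = H.sum(axis=0)    # order (size) of each relation
print(node_degrees.tolist(), edge_sizes.tolist())  # [1, 2, 1, 1] [3, 2]
```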
2 code implementations • NeurIPS 2021 • Jinwoo Kim, Saeyoon Oh, Seunghoon Hong
We present a generalization of Transformers to any-order permutation invariant data (sets, graphs, and hypergraphs).
Ranked #5 on Graph Regression on PCQM4M-LSC (Validation MAE metric)
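A hedged sketch of the property in question, using a first-order (set) case rather than the paper's any-order construction: self-attention followed by mean pooling yields an output that is invariant to permutations of the input elements. All names and shapes below are illustrative.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_pool(x):
    """Self-attention plus mean pooling: a permutation-invariant
    set encoder (the first-order special case)."""
    d = x.shape[-1]
    a = softmax(x @ x.T / np.sqrt(d))  # (n, n) attention weights
    return (a @ x).mean(axis=0)        # pooling removes element order

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))
perm = rng.permutation(5)
# Shuffling the set elements leaves the output unchanged.
print(np.allclose(attention_pool(x), attention_pool(x[perm])))  # True
```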