Search Results for author: Saeyoon Oh

Found 4 papers, 4 papers with code

CSA-Trans: Code Structure Aware Transformer for AST

1 code implementation • 7 Apr 2024 • Saeyoon Oh, Shin Yoo

When applying the Transformer architecture to source code, designing a good self-attention mechanism is critical, as it determines how node relationships are extracted from the Abstract Syntax Trees (ASTs) of the source code.

Code Summarization • Stochastic Block Model • +1
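As a toy illustration of the structural signal the abstract refers to (not the paper's method), parent-child relationships between AST nodes can be extracted with Python's standard `ast` module; a structure-aware attention mechanism could condition on relations like these:

```python
import ast

def ast_edges(source):
    """Return (parent, child) node-type pairs from the AST of `source`."""
    tree = ast.parse(source)
    edges = []
    for parent in ast.walk(tree):
        for child in ast.iter_child_nodes(parent):
            edges.append((type(parent).__name__, type(child).__name__))
    return edges

# A function definition yields edges such as (FunctionDef, Return).
edges = ast_edges("def f(x):\n    return x + 1")
```

This only recovers the tree topology; how such relationships are turned into attention biases is the design question the paper addresses.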

Equivariant Hypergraph Neural Networks

1 code implementation • 22 Aug 2022 • Jinwoo Kim, Saeyoon Oh, Sungjun Cho, Seunghoon Hong

Many problems in computer vision and machine learning can be cast as learning on hypergraphs that represent higher-order relations.
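To make "higher-order relations" concrete, here is a minimal sketch (illustrative only, not the paper's code) of a hypergraph encoded as a node-by-hyperedge incidence matrix, where a single hyperedge can join more than two nodes at once:

```python
import numpy as np

# Hypergraph on 4 nodes with 2 hyperedges:
# e0 joins nodes {0, 1, 2} -- a 3-way (higher-order) relation;
# e1 joins nodes {2, 3}    -- an ordinary pairwise edge.
H = np.zeros((4, 2))
H[[0, 1, 2], 0] = 1
H[[2, 3], 1] = 1

# Node degree = number of hyperedges each node belongs to.
deg = H.sum(axis=1)
```

Learning on such data requires layers that respect this incidence structure, which is what equivariant hypergraph networks are designed for.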

Transformers Generalize DeepSets and Can be Extended to Graphs and Hypergraphs

2 code implementations • NeurIPS 2021 • Jinwoo Kim, Saeyoon Oh, Seunghoon Hong

We present a generalization of Transformers to any-order permutation invariant data (sets, graphs, and hypergraphs).

Ranked #5 on Graph Regression on PCQM4M-LSC (Validation MAE metric)

Graph Regression • +2
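A quick numerical check of the property the abstract builds on (a generic sketch, not the paper's implementation): plain self-attention is permutation equivariant on sets, so permuting the input rows permutes the output rows identically, and a symmetric readout such as sum pooling becomes permutation invariant:

```python
import numpy as np

def self_attention(X):
    """Single-head self-attention with identity projections."""
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))        # a set of 5 elements
perm = rng.permutation(5)

# Equivariance: attention(PX) == P * attention(X).
equivariant = np.allclose(self_attention(X[perm]), self_attention(X)[perm])
# Invariance of the pooled output: sum over elements ignores order.
invariant = np.allclose(self_attention(X[perm]).sum(0),
                        self_attention(X).sum(0))
```

The paper's contribution is extending this kind of symmetry argument from sets to graphs and hypergraphs of any order.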

Transformers Generalize DeepSets and Can be Extended to Graphs & Hypergraphs

1 code implementation • NeurIPS 2021 • Jinwoo Kim, Saeyoon Oh, Seunghoon Hong

We present a generalization of Transformers to any-order permutation invariant data (sets, graphs, and hypergraphs).

Graph Regression
