2 code implementations • 7 Apr 2024 • Shurui Gui, Xiner Li, Shuiwang Ji
Extensive experimental results confirm consistency with our theoretical analyses and show that the proposed ATTA method yields substantial performance improvements over TTA methods while maintaining efficiency, and achieves effectiveness comparable to that of the more demanding active domain adaptation (ADA) methods.
1 code implementation • 28 Mar 2024 • Xuan Zhang, Jacob Helwig, Yuchao Lin, Yaochen Xie, Cong Fu, Stephan Wojtowytsch, Shuiwang Ji
While the U-Net architecture with skip connections is commonly used by prior studies to enable multi-scale processing, our analysis shows that the need for features to evolve across layers results in temporally misaligned features in skip connections, which limits the model's performance.
1 code implementation • 18 Mar 2024 • Keqiang Yan, Cong Fu, Xiaofeng Qian, Xiaoning Qian, Shuiwang Ji
Crystal structures are characterized by atomic bases within a primitive unit cell that repeats along a regular lattice throughout 3D space.
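As a rough illustration of this repeating-basis construction, the sketch below tiles a fractional-coordinate basis over a block of unit cells. The two-atom basis and cubic lattice are hypothetical placeholders, not taken from the paper.

```python
# Sketch: generating atom positions of a crystal by repeating an atomic
# basis along a regular lattice. Basis and lattice values are illustrative.

def crystal_positions(basis, lattice, n_cells):
    """Repeat fractional `basis` atoms over an n_cells^3 block of unit cells.

    basis   : list of (element, (fx, fy, fz)) fractional coordinates
    lattice : three lattice vectors [(ax, ay, az), (bx, by, bz), (cx, cy, cz)]
    """
    atoms = []
    for i in range(n_cells):
        for j in range(n_cells):
            for k in range(n_cells):
                for elem, (fx, fy, fz) in basis:
                    # Cartesian position = (fractional coords + cell index)
                    # expressed in the lattice-vector frame.
                    u, v, w = fx + i, fy + j, fz + k
                    x = u * lattice[0][0] + v * lattice[1][0] + w * lattice[2][0]
                    y = u * lattice[0][1] + v * lattice[1][1] + w * lattice[2][1]
                    z = u * lattice[0][2] + v * lattice[1][2] + w * lattice[2][2]
                    atoms.append((elem, (x, y, z)))
    return atoms

# Example: a cubic cell with a hypothetical two-atom basis (CsCl-like).
basis = [("A", (0.0, 0.0, 0.0)), ("B", (0.5, 0.5, 0.5))]
lattice = [(4.0, 0.0, 0.0), (0.0, 4.0, 0.0), (0.0, 0.0, 4.0)]
atoms = crystal_positions(basis, lattice, n_cells=2)
print(len(atoms))  # 2 atoms per cell x 8 cells = 16
```

The key property the paper builds on is that the infinite crystal is fully determined by the basis and lattice, so any finite tiling like the one above is only a truncated view of the same structure.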
no code implementations • 7 Mar 2024 • Montgomery Bohde, Meng Liu, Alexandra Saxton, Shuiwang Ji
To address challenges in training ForgetNet at early stages, we further introduce G-ForgetNet, which uses a gating mechanism to allow for the selective integration of historical embeddings.
1 code implementation • 10 Jan 2024 • Lichao Sun, Yue Huang, Haoran Wang, Siyuan Wu, Qihui Zhang, Yuan Li, Chujie Gao, Yixin Huang, Wenhan Lyu, Yixuan Zhang, Xiner Li, Zhengliang Liu, Yixin Liu, Yijue Wang, Zhikun Zhang, Bertie Vidgen, Bhavya Kailkhura, Caiming Xiong, Chaowei Xiao, Chunyuan Li, Eric Xing, Furong Huang, Hao Liu, Heng Ji, Hongyi Wang, Huan Zhang, Huaxiu Yao, Manolis Kellis, Marinka Zitnik, Meng Jiang, Mohit Bansal, James Zou, Jian Pei, Jian Liu, Jianfeng Gao, Jiawei Han, Jieyu Zhao, Jiliang Tang, Jindong Wang, Joaquin Vanschoren, John Mitchell, Kai Shu, Kaidi Xu, Kai-Wei Chang, Lifang He, Lifu Huang, Michael Backes, Neil Zhenqiang Gong, Philip S. Yu, Pin-Yu Chen, Quanquan Gu, Ran Xu, Rex Ying, Shuiwang Ji, Suman Jana, Tianlong Chen, Tianming Liu, Tianyi Zhou, William Wang, Xiang Li, Xiangliang Zhang, Xiao Wang, Xing Xie, Xun Chen, Xuyu Wang, Yan Liu, Yanfang Ye, Yinzhi Cao, Yong Chen, Yue Zhao
This paper introduces TrustLLM, a comprehensive study of trustworthiness in LLMs, including principles for different dimensions of trustworthiness, an established benchmark, an evaluation and analysis of trustworthiness for mainstream LLMs, and a discussion of open challenges and future directions.
no code implementations • 26 Sep 2023 • Yaochen Xie, Ziqian Xie, Sheikh Muhammad Saiful Islam, Degui Zhi, Shuiwang Ji
Genome-wide association studies (GWAS) are used to identify relationships between genetic variations and specific traits.
1 code implementation • NeurIPS 2023 • Meng Liu, Mingda Zhang, Jialu Liu, Hanjun Dai, Ming-Hsuan Yang, Shuiwang Ji, Zheyun Feng, Boqing Gong
In this paper, we present a novel problem, namely video timeline modeling.
1 code implementation • 17 Jul 2023 • Xuan Zhang, Limei Wang, Jacob Helwig, Youzhi Luo, Cong Fu, Yaochen Xie, Meng Liu, Yuchao Lin, Zhao Xu, Keqiang Yan, Keir Adams, Maurice Weiler, Xiner Li, Tianfan Fu, Yucheng Wang, Haiyang Yu, Yuqing Xie, Xiang Fu, Alex Strasser, Shenglong Xu, Yi Liu, Yuanqi Du, Alexandra Saxton, Hongyi Ling, Hannah Lawrence, Hannes Stärk, Shurui Gui, Carl Edwards, Nicholas Gao, Adriana Ladera, Tailin Wu, Elyssa F. Hofgard, Aria Mansouri Tehrani, Rui Wang, Ameya Daigavane, Montgomery Bohde, Jerry Kurtin, Qian Huang, Tuong Phung, Minkai Xu, Chaitanya K. Joshi, Simon V. Mathis, Kamyar Azizzadenesheli, Ada Fang, Alán Aspuru-Guzik, Erik Bekkers, Michael Bronstein, Marinka Zitnik, Anima Anandkumar, Stefano Ermon, Pietro Liò, Rose Yu, Stephan Günnemann, Jure Leskovec, Heng Ji, Jimeng Sun, Regina Barzilay, Tommi Jaakkola, Connor W. Coley, Xiaoning Qian, Xiaofeng Qian, Tess Smidt, Shuiwang Ji
Advances in artificial intelligence (AI) are fueling a new paradigm of discoveries in natural sciences.
1 code implementation • NeurIPS 2023 • Haiyang Yu, Meng Liu, Youzhi Luo, Alex Strasser, Xiaofeng Qian, Xiaoning Qian, Shuiwang Ji
Supervised machine learning approaches have been increasingly used to accelerate electronic structure prediction as surrogates of first-principles computational methods, such as density functional theory (DFT).
no code implementations • 13 Jun 2023 • Xiner Li, Shurui Gui, Youzhi Luo, Shuiwang Ji
Out-of-distribution (OOD) generalization deals with the prevalent learning scenario where the test distribution shifts from the training distribution.
1 code implementation • 12 Jun 2023 • Yuchao Lin, Keqiang Yan, Youzhi Luo, Yi Liu, Xiaoning Qian, Shuiwang Ji
This is enabled by our approximations of infinite potential summations, where we extend the Ewald summation for several potential series approximations with provable error bounds.
Ranked #1 on Band Gap on Materials Project
1 code implementation • 11 Jun 2023 • Hongyi Ling, Zhimeng Jiang, Meng Liu, Shuiwang Ji, Na Zou
We conduct systematic experiments to show that S-Mixup can improve the performance and generalization of graph neural networks (GNNs) on various graph classification tasks.
1 code implementation • 9 Jun 2023 • Jacob Helwig, Xuan Zhang, Cong Fu, Jerry Kurtin, Stephan Wojtowytsch, Shuiwang Ji
We consider solving partial differential equations (PDEs) with Fourier neural operators (FNOs), which operate in the frequency domain.
1 code implementation • 8 Jun 2023 • Haiyang Yu, Zhao Xu, Xiaofeng Qian, Xiaoning Qian, Shuiwang Ji
We consider the prediction of the Hamiltonian matrix, which finds use in quantum chemistry and condensed matter physics.
2 code implementations • NeurIPS 2023 • Shurui Gui, Meng Liu, Xiner Li, Youzhi Luo, Shuiwang Ji
In this work, we propose to simultaneously incorporate label and environment causal independence (LECI) to make full use of label and environment information, thereby addressing the challenges prior methods face in identifying causal and invariant subgraphs.
no code implementations • 25 May 2023 • Xuan Zhang, Shenglong Xu, Shuiwang Ji
Existing optimization approaches compute the energy by sampling local energy from an explicit probability distribution given by the wavefunction.
1 code implementation • 6 May 2023 • Cong Fu, Keqiang Yan, Limei Wang, Wing Yee Au, Michael McThrow, Tao Komikado, Koji Maruhashi, Kanji Uchino, Xiaoning Qian, Shuiwang Ji
Proteins are complex biomolecules that perform a variety of crucial functions within living organisms.
no code implementations • 1 May 2023 • Zhao Xu, Yaochen Xie, Youzhi Luo, Xuan Zhang, Xinyi Xu, Meng Liu, Kaleb Dickerson, Cheng Deng, Maho Nakata, Shuiwang Ji
Here, we propose a novel deep learning framework to predict 3D geometries from molecular graphs.
1 code implementation • NeurIPS 2023 • Weitao Du, Yuanqi Du, Limei Wang, Dieqiao Feng, Guifeng Wang, Shuiwang Ji, Carla Gomes, Zhi-Ming Ma
Geometric deep learning enables the encoding of physical symmetries in modeling 3D objects.
no code implementations • 17 Mar 2023 • Jie Wang, Zhihao Shi, Xize Liang, Shuiwang Ji, Bin Li, Feng Wu
During the message passing (MP) in GNNs, subgraph-wise sampling methods discard messages outside the mini-batches in backward passes to avoid the well-known neighbor explosion problem, i.e., the exponentially increasing dependencies of nodes with the number of MP iterations.
1 code implementation • 19 Feb 2023 • Jie Wang, Rui Yang, Zijie Geng, Zhihao Shi, Mingxuan Ye, Qi Zhou, Shuiwang Ji, Bin Li, Yongdong Zhang, Feng Wu
The appealing features of RSD-OA include the following: (1) RSD-OA is invariant to visual distractions, as it is conditioned on the predefined subsequent action sequence without task-irrelevant information from the transition dynamics; and (2) the reward sequence captures long-term task-relevant information in both rewards and transition dynamics.
no code implementations • 21 Nov 2022 • Haitao Lin, Yufei Huang, Meng Liu, Xuanjing Li, Shuiwang Ji, Stan Z. Li
Previous works usually generate atoms in an auto-regressive way, where element types and 3D coordinates of atoms are generated one by one.
1 code implementation • 11 Oct 2022 • Meng Liu, Haoran Liu, Shuiwang Ji
Our method leverages gradients over the discrete data space to approximately construct the provably optimal proposal distribution, which is subsequently used by importance sampling to efficiently estimate the original ratio matching objective.
2 code implementations • 23 Sep 2022 • Keqiang Yan, Yi Liu, Yuchao Lin, Shuiwang Ji
Our Matformer is designed to be invariant to periodicity and can capture repeating patterns explicitly.
Ranked #2 on Band Gap on JARVIS-DFT
1 code implementation • 26 Jul 2022 • Limei Wang, Haoran Liu, Yi Liu, Jerry Kurtin, Shuiwang Ji
In this work, we develop a novel hierarchical graph network, known as ProNet, to capture these relations.
2 code implementations • 26 Jun 2022 • Shurui Gui, Hao Yuan, Jie Wang, Qicheng Lao, Kang Li, Shuiwang Ji
We investigate the explainability of graph neural networks (GNNs) as a step toward elucidating their working mechanisms.
1 code implementation • 17 Jun 2022 • Limei Wang, Yi Liu, Yuchao Lin, Haoran Liu, Shuiwang Ji
To incorporate 3D information completely and efficiently, we propose a novel message passing scheme that operates within the 1-hop neighborhood.
Ranked #4 on Drug Discovery on QM9
no code implementations • 16 Jun 2022 • Xueliang Wang, Jianyu Cai, Shuiwang Ji, Houqiang Li, Feng Wu, Jie Wang
A major novelty of SALA is its task-adaptive metric, which is learned adaptively for different tasks in an end-to-end fashion.
1 code implementation • 16 Jun 2022 • Shurui Gui, Xiner Li, Limei Wang, Shuiwang Ji
Our GOOD benchmark is a growing project that we expect to expand in both the quantity and variety of its resources as the area develops.
no code implementations • 15 Jun 2022 • Cong Fu, Xuan Zhang, Huixin Zhang, Hongyi Ling, Shenglong Xu, Shuiwang Ji
Based on the proposed lattice convolutions, we design lattice convolutional networks (LCN) that use self-gating and attention mechanisms.
1 code implementation • 14 Jun 2022 • Haiyang Yu, Limei Wang, Bokun Wang, Meng Liu, Tianbao Yang, Shuiwang Ji
GraphFM-IB applies FM to in-batch sampled data, while GraphFM-OB applies FM to out-of-batch data that are 1-hop neighbors of in-batch data.
no code implementations • 4 Jun 2022 • Meng Liu, Haiyang Yu, Shuiwang Ji
Message passing graph neural networks (GNNs) are known to have their expressiveness upper-bounded by the 1-dimensional Weisfeiler-Leman (1-WL) algorithm.
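To make this bound concrete, here is a minimal sketch of 1-WL color refinement: nodes iteratively recolor themselves by their own color together with the multiset of neighbor colors, mirroring how message-passing GNNs aggregate neighborhoods. (Illustrative only; not code from the paper.)

```python
# Sketch of 1-dimensional Weisfeiler-Leman (1-WL) color refinement,
# the test that upper-bounds message-passing GNN expressiveness.

def wl_colors(adj, rounds=3):
    """adj: {node: set(neighbors)}. Returns the final color of each node."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        # New signature = (own color, sorted multiset of neighbor colors).
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures to small integer color ids.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return colors

# On a path a-b-c, 1-WL separates the middle node from the endpoints.
path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
print(wl_colors(path))
```

A classic failure case: every node of a 6-cycle and of two disjoint triangles gets the same color history, so 1-WL (and hence standard message-passing GNNs) cannot distinguish those two graphs.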
1 code implementation • 20 May 2022 • Rui Yang, Jie Wang, Zijie Geng, Mingxuan Ye, Shuiwang Ji, Bin Li, Feng Wu
Generalization across different environments with the same tasks is critical for successful applications of visual reinforcement learning (RL) in real scenarios.
1 code implementation • 19 Apr 2022 • Meng Liu, Youzhi Luo, Kanji Uchino, Koji Maruhashi, Shuiwang Ji
Second, to preserve the desirable equivariance property, we select a local reference atom according to the designed auxiliary classifiers and then construct a local spherical coordinate system.
no code implementations • 24 Mar 2022 • Jie Wang, Zhanqiu Zhang, Zhihao Shi, Jianyu Cai, Shuiwang Ji, Feng Wu
Semantic matching models -- which assume that entities with similar semantics have similar embeddings -- have shown great power in knowledge graph embeddings (KGE).
no code implementations • 26 Feb 2022 • Youzhi Luo, Michael McThrow, Wing Yee Au, Tao Komikado, Kanji Uchino, Koji Maruhashi, Shuiwang Ji
In this work, we propose GraphAug, a novel automated data augmentation method aiming at computing label-invariant augmentations for graph classification.
no code implementations • 16 Feb 2022 • Yaochen Xie, Zhao Xu, Shuiwang Ji
Self-supervised learning (SSL) of graph neural networks is emerging as a promising way of leveraging unlabeled data.
1 code implementation • 16 Feb 2022 • Yaochen Xie, Sumeet Katariya, Xianfeng Tang, Edward Huang, Nikhil Rao, Karthik Subbian, Shuiwang Ji
They are also unable to provide explanations in cases where the GNN is trained in a self-supervised manner, and the resulting representations are used in future downstream tasks.
1 code implementation • 7 Feb 2022 • Meng Liu, Shuiwang Ji
Therefore, our Neighbor2Seq naturally endows GNNs with the efficiency and advantages of deep learning operations on grid-like data by precomputing the Neighbor2Seq transformations.
1 code implementation • NeurIPS 2021 • Zhanqiu Zhang, Jie Wang, Jiajun Chen, Shuiwang Ji, Feng Wu
To address this challenge, we propose a novel query embedding model, namely Cone Embeddings (ConE), which is the first geometry-based QE model that can handle all the FOL operations, including conjunction, disjunction, and negation.
3 code implementations • 30 Sep 2021 • Zhao Xu, Youzhi Luo, Xuan Zhang, Xinyi Xu, Yaochen Xie, Meng Liu, Kaleb Dickerson, Cheng Deng, Maho Nakata, Shuiwang Ji
Here, we propose to predict the ground-state 3D geometries from molecular graphs using machine learning methods.
Ranked #1 on 3D Geometry Prediction on Molecule3D val
no code implementations • 29 Sep 2021 • Yaochen Xie, Sumeet Katariya, Xianfeng Tang, Edward W Huang, Nikhil Rao, Karthik Subbian, Shuiwang Ji
TAGE enables the explanation of GNN embedding models without downstream tasks and allows efficient explanation of multitask models.
no code implementations • 29 Sep 2021 • Meng Liu, Keqiang Yan, Bora Oztekin, Shuiwang Ji
In this work, we propose GraphEBM, a molecular graph generation method via energy-based models (EBMs), as an exploratory work to perform permutation invariant and multi-objective molecule generation.
1 code implementation • 29 Sep 2021 • Meng Liu, Haoran Liu, Shuiwang Ji
In this study, we propose ratio matching with gradient-guided importance sampling (RMwGGIS) to alleviate the above limitations.
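As a generic illustration of the importance-sampling ingredient, the sketch below estimates an expectation under a target distribution p using samples from a proposal q, reweighting by p(x)/q(x). The toy uniform proposal and distributions are placeholders; the paper's contribution is a gradient-guided proposal, which is not reproduced here.

```python
import random

# Generic importance sampling over a small discrete space:
# estimate E_p[f(x)] from samples x ~ q, weighted by p(x)/q(x).

def importance_estimate(f, p, q, n=50000, seed=0):
    rng = random.Random(seed)
    xs = list(q)
    weights = [q[x] for x in xs]
    total = 0.0
    for _ in range(n):
        x = rng.choices(xs, weights=weights)[0]  # sample x ~ q
        total += (p[x] / q[x]) * f(x)            # reweight by p/q
    return total / n

p = {0: 0.1, 1: 0.2, 2: 0.7}   # target distribution
q = {0: 1/3, 1: 1/3, 2: 1/3}   # uniform proposal (toy; not the paper's)
est = importance_estimate(lambda x: x, p, q)
# True E_p[x] = 0*0.1 + 1*0.2 + 2*0.7 = 1.6
print(round(est, 2))
```

The variance of such an estimator depends heavily on how close q is to the optimal proposal, which is why constructing a good proposal (gradient-guided, in the paper) matters.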
no code implementations • ICLR 2022 • Youzhi Luo, Shuiwang Ji
We consider the problem of generating 3D molecular geometries from scratch.
no code implementations • 20 Jul 2021 • Xinyi Xu, Cheng Deng, Yaochen Xie, Shuiwang Ji
Our framework embeds the given graph into multiple subspaces, of which each representation is prompted to encode specific characteristics of graphs.
1 code implementation • NeurIPS Workshop AI4Scien 2021 • Meng Liu, Cong Fu, Xuan Zhang, Limei Wang, Yaochen Xie, Hao Yuan, Youzhi Luo, Zhao Xu, Shenglong Xu, Shuiwang Ji
We employ our methods to participate in the 2021 KDD Cup on OGB Large-Scale Challenge (OGB-LSC), which aims to predict the HOMO-LUMO energy gap of molecules.
1 code implementation • 26 Apr 2021 • Yushun Dong, Kaize Ding, Brian Jalaian, Shuiwang Ji, Jundong Li
Existing efforts can be mainly categorized as spectral-based and spatial-based methods.
1 code implementation • NeurIPS 2021 • Qi Qi, Youzhi Luo, Zhao Xu, Shuiwang Ji, Tianbao Yang
Compared with AUROC, AUPRC is a more appropriate metric for highly imbalanced datasets.
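For reference, AUPRC can be computed as average precision: the mean of the precision values at the ranks where positives occur. The pure-Python sketch below is illustrative only; the paper optimizes a surrogate AUPRC loss rather than this ranking computation.

```python
# Sketch: AUPRC as average precision (AP) from predicted scores,
# illustrating why it is informative under heavy class imbalance.

def average_precision(labels, scores):
    """AP = mean over positives of precision at each positive's rank."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    n_pos = sum(labels)
    tp, ap = 0, 0.0
    for rank, i in enumerate(order, start=1):
        if labels[i] == 1:
            tp += 1
            ap += tp / rank  # precision at this recall point
    return ap / n_pos

# Imbalanced toy data: 2 positives among 8 examples, both ranked on top.
labels = [0, 0, 1, 0, 0, 0, 1, 0]
scores = [0.1, 0.4, 0.9, 0.2, 0.3, 0.5, 0.8, 0.6]
print(round(average_precision(labels, scores), 3))  # 1.0
```

Unlike AUROC, AP degrades sharply when a rare positive is ranked below many negatives, which is exactly the behavior one wants to penalize on highly imbalanced data.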
1 code implementation • 23 Mar 2021 • Meng Liu, Youzhi Luo, Limei Wang, Yaochen Xie, Hao Yuan, Shurui Gui, Haiyang Yu, Zhao Xu, Jingtun Zhang, Yi Liu, Keqiang Yan, Haoran Liu, Cong Fu, Bora Oztekin, Xuan Zhang, Shuiwang Ji
Although several libraries for deep learning on graphs exist, they aim at implementing basic operations for graph deep learning.
no code implementations • 15 Mar 2021 • Hongyang Gao, Yi Liu, Xuan Zhang, Shuiwang Ji
We study text representation methods using deep models.
no code implementations • 22 Feb 2021 • Yaochen Xie, Zhao Xu, Jingtun Zhang, Zhengyang Wang, Shuiwang Ji
Our unified treatment of SSL methods for GNNs sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms.
1 code implementation • ICLR 2022 • Yi Liu, Limei Wang, Meng Liu, Xuan Zhang, Bora Oztekin, Shuiwang Ji
Based on such observations, we propose the spherical message passing (SMP) as a novel and powerful scheme for 3D molecular learning.
Ranked #3 on Drug Discovery on QM9
1 code implementation • 9 Feb 2021 • Hao Yuan, Haiyang Yu, Jie Wang, Kang Li, Shuiwang Ji
To make the tree search more effective, we propose to use Shapley values as a measure of subgraph importance, which can also capture the interactions among different subgraphs.
no code implementations • 1 Feb 2021 • Youzhi Luo, Keqiang Yan, Shuiwang Ji
We consider the problem of molecular graph generation using deep models.
1 code implementation • ICLR Workshop EBM 2021 • Meng Liu, Keqiang Yan, Bora Oztekin, Shuiwang Ji
We note that most existing approaches for molecular graph generation fail to guarantee the intrinsic property of permutation invariance, resulting in unexpected bias in generative models.
no code implementations • 14 Jan 2021 • Yi Liu, Shuiwang Ji
The method is then integrated into the last stage of the proposed transfer learning framework to reuse the complex patterns learned from the same CT images.
no code implementations • 12 Jan 2021 • Yi Liu, Shuiwang Ji
The effectiveness of our methods is evaluated on both online and offline tasks.
no code implementations • 6 Jan 2021 • Hao Yuan, Shuiwang Ji
Several graph neural network approaches have been proposed for node feature learning, and they generally follow a neighboring-information aggregation scheme to learn node features.
no code implementations • 1 Jan 2021 • Hongyang Gao, Shuiwang Ji
To address these limitations, we propose a teleport graph convolution layer (TeleGCL) that uses teleport functions to enable each node to aggregate information from a much larger neighborhood.
no code implementations • 1 Jan 2021 • Hongyang Gao, Shuiwang Ji
Line graphs have been shown to be effective in improving feature learning in graph neural networks.
no code implementations • 31 Dec 2020 • Hao Yuan, Haiyang Yu, Shurui Gui, Shuiwang Ji
To facilitate evaluations, we generate a set of benchmark graph datasets specifically for GNN explainability.
1 code implementation • 2 Dec 2020 • Zhengyang Wang, Meng Liu, Youzhi Luo, Zhao Xu, Yaochen Xie, Limei Wang, Lei Cai, Qi Qi, Zhuoning Yuan, Tianbao Yang, Shuiwang Ji
Here we develop a suite of comprehensive machine learning methods and tools spanning different computational models, molecular representations, and loss functions for molecular property prediction and drug discovery.
1 code implementation • 17 Nov 2020 • Xinyi Xu, Zhengyang Wang, Cheng Deng, Hao Yuan, Shuiwang Ji
Grouping has been commonly used in deep metric learning for computing diverse features.
no code implementations • 6 Nov 2020 • Yaochen Xie, Yu Ding, Shuiwang Ji
Advances in deep learning enable us to perform image-to-image transformation tasks for various types of microscopy image reconstruction, computationally producing high-quality images from the physically acquired low-quality ones.
1 code implementation • NeurIPS 2020 • Yaochen Xie, Zhengyang Wang, Shuiwang Ji
Self-supervised frameworks that learn denoising models with merely individual noisy images have shown strong capability and promising performance in various image denoising tasks.
2 code implementations • 20 Oct 2020 • Lei Cai, Jundong Li, Jie Wang, Shuiwang Ji
In this formalism, a link prediction problem is converted to a graph classification task.
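A minimal sketch of this formalism: for a candidate node pair, extract the enclosing subgraph around the two endpoints and hand it to a graph classifier that predicts link existence. The graph format and the 1-hop choice below are illustrative assumptions, not the paper's code.

```python
# Sketch: link prediction as graph classification by extracting the
# h-hop enclosing subgraph around a candidate node pair.

def enclosing_subgraph(adj, u, v, hops=1):
    """Return the set of nodes within `hops` of either endpoint u or v."""
    frontier, nodes = {u, v}, {u, v}
    for _ in range(hops):
        frontier = {w for x in frontier for w in adj[x]} - nodes
        nodes |= frontier
    return nodes

adj = {0: {1, 2}, 1: {0, 3}, 2: {0}, 3: {1, 4}, 4: {3}}
# Candidate link (0, 3): its 1-hop enclosing subgraph would then be
# labeled and fed to a graph classifier.
print(sorted(enclosing_subgraph(adj, 0, 3)))  # [0, 1, 2, 3, 4]
```

The classifier then learns, from the local structure around each pair, whether a link is likely, which is what turns link prediction into a graph classification task.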
1 code implementation • 20 Oct 2020 • Lei Cai, Zhengyang Wang, Rob Kulathinal, Sudhir Kumar, Shuiwang Ji
In our task, saliency maps are used to assist the identification and visualization of developmental landmarks.
no code implementations • 19 Oct 2020 • Hongyang Gao, Yi Liu, Shuiwang Ji
In addition, graph topology is incorporated in global voting to compute the importance score of each node globally in the entire graph.
no code implementations • 15 Sep 2020 • Zhengyang Wang, Bunyamin Sisman, Hao Wei, Xin Luna Dong, Shuiwang Ji
We evaluate CorDEL with extensive experiments conducted on both public benchmark datasets and a real-world dataset.
Ranked #7 on Entity Resolution on Amazon-Google
1 code implementation • 5 Aug 2020 • Zhengyang Wang, Yaochen Xie, Shuiwang Ji
In this work, we introduce global voxel transformer networks (GVTNets), an advanced deep learning tool for augmented microscopy that overcomes intrinsic limitations of the current U-Net based models and achieves improved performance.
no code implementations • 24 Jul 2020 • Sina Mohseni, Fan Yang, Shiva Pentyala, Mengnan Du, Yi Liu, Nic Lupfer, Xia Hu, Shuiwang Ji, Eric Ragan
Combating fake news and misinformation propagation is a challenging task in the post-truth era.
1 code implementation • 20 Jul 2020 • Zhengyang Wang, Shuiwang Ji
In addition, compared to existing graph pooling methods, second-order pooling is able to use information from all nodes and collect second-order statistics, making it more powerful.
3 code implementations • 18 Jul 2020 • Meng Liu, Hongyang Gao, Shuiwang Ji
Based on our theoretical and empirical analysis, we propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
Ranked #2 on Node Classification on AMZ Computers
no code implementations • 18 Jul 2020 • Yi Liu, Hao Yuan, Lei Cai, Shuiwang Ji
However, these methods do not incorporate the important sequential information from amino acid chains and the high-order pairwise interactions.
1 code implementation • ICLR 2020 • Hongyang Gao, Zhengyang Wang, Shuiwang Ji
Use of attention operators on high-order data requires flattening of the spatial or spatial-temporal dimensions into a vector, which is assumed to follow a multivariate normal distribution.
no code implementations • 3 Jun 2020 • Hao Yuan, Jiliang Tang, Xia Hu, Shuiwang Ji
Furthermore, our experimental results indicate that the generated graphs can provide guidance on how to improve the trained GNNs.
1 code implementation • 29 May 2020 • Meng Liu, Zhengyang Wang, Shuiwang Ji
Modern graph neural networks (GNNs) learn node embeddings through multilayer local aggregation and achieve great success in applications on assortative graphs.
no code implementations • 16 May 2020 • Zhengyang Wang, Xia Hu, Shuiwang Ji
On the other hand, iCapsNets explore a novel way to explain the model's general behavior, achieving global interpretability.
1 code implementation • ICLR 2020 • Hao Yuan, Shuiwang Ji
Learning high-level representations for graphs is of great importance for graph analysis tasks.
3 code implementations • 2 Mar 2020 • Wei Jin, Ya-Xin Li, Han Xu, Yiqi Wang, Shuiwang Ji, Charu Aggarwal, Jiliang Tang
As the extensions of DNNs to graphs, Graph Neural Networks (GNNs) have been demonstrated to inherit this vulnerability.
no code implementations • 25 Sep 2019 • Hongyang Gao, Yaochen Xie, Shuiwang Ji
This results in the Siamese attention operator (SAO).
no code implementations • 25 Sep 2019 • Hongyang Gao, Shuiwang Ji
Previous studies used global ranking methods to sample some of the important nodes, but most of them are not able to incorporate graph topology information in computing ranking scores.
no code implementations • Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19) • Jun Li, Yongjun Chen, Lei Cai, Ian Davidson, Shuiwang Ji
The proposed dense transformer modules are differentiable, thus the entire network can be trained.
Ranked #1 on Electron Microscopy Image Segmentation on SNEMI3D
no code implementations • 8 Jul 2019 • Fan Yang, Shiva K. Pentyala, Sina Mohseni, Mengnan Du, Hao Yuan, Rhema Linder, Eric D. Ragan, Shuiwang Ji, Xia Hu
In this demo paper, we present the XFake system, an explainable fake news detector that assists end-users to identify news credibility.
1 code implementation • 5 Jul 2019 • Hongyang Gao, Shuiwang Ji
To further reduce the requirements on computational resources, we propose the cGAO that performs attention operations along channels.
Ranked #8 on Graph Classification on D&D (using extra training data)
no code implementations • 1 Jul 2019 • Yi Liu, Hao Yuan, Zhengyang Wang, Shuiwang Ji
It is also shown that our proposed global pixel transformer layer is useful to improve the fluorescence image prediction results.
3 code implementations • 11 May 2019 • Hongyang Gao, Shuiwang Ji
We further propose the gUnpool layer as the inverse operation of the gPool layer.
Ranked #4 on Graph Classification on D&D
no code implementations • 27 Mar 2019 • Mengnan Du, Ninghao Liu, Fan Yang, Shuiwang Ji, Xia Hu
REAT decomposes the final prediction of an RNN into additive contributions of each word in the input text.
1 code implementation • 21 Jan 2019 • Hongyang Gao, Yongjun Chen, Shuiwang Ji
Another limitation of GCNs when used on graph-based text representation tasks is that they do not consider the order information of nodes in a graph.
3 code implementations • 10 Dec 2018 • Zhengyang Wang, Na Zou, Dinggang Shen, Shuiwang Ji
In this work, we propose the non-local U-Nets, which are equipped with flexible global aggregation blocks, for biomedical image segmentation.
no code implementations • 27 Sep 2018 • Hongyang Gao, Shuiwang Ji
We further propose the gUnpool layer as the inverse operation of the gPool layer.
2 code implementations • NeurIPS 2018 • Hongyang Gao, Zhengyang Wang, Shuiwang Ji
Compared to prior CNNs designed for mobile devices, ChannelNets achieve a significant reduction in terms of the number of parameters and computational cost without loss in accuracy.
1 code implementation • 27 Aug 2018 • Zhengyang Wang, Shuiwang Ji
Unlike existing models, which explore solutions by focusing on a block of cascaded dilated convolutional layers, our methods address the gridding artifacts by smoothing the dilated convolution itself.
1 code implementation • 12 Aug 2018 • Hongyang Gao, Zhengyang Wang, Shuiwang Ji
However, in generic graphs the number of neighboring units is neither fixed nor are the units ordered, thereby hindering the application of convolutional operations.
Ranked #2 on Document Classification on Cora
no code implementations • 24 Nov 2017 • Hongyang Gao, Shuiwang Ji
In this paper, we propose a set of methods based on kernel rotation and flip to enable rotation and flip invariance in convolutional neural networks.
1 code implementation • 24 May 2017 • Jun Li, Yongjun Chen, Lei Cai, Ian Davidson, Shuiwang Ji
The proposed dense transformer modules are differentiable, thus the entire network can be trained.
1 code implementation • 19 May 2017 • Lei Cai, Hongyang Gao, Shuiwang Ji
In the simplest case, the proposed multi-stage VAE divides the decoder into two components, in which the second component generates refined images based on the coarse images generated by the first component.
1 code implementation • 18 May 2017 • Zhengyang Wang, Shuiwang Ji
We also show that the text representation requirement in visual question answering is more complicated and comprehensive than that in conventional natural language processing tasks, making it a better task to evaluate textual representation methods.
4 code implementations • ICLR 2018 • Hongyang Gao, Hao Yuan, Zhengyang Wang, Shuiwang Ji
When used in image generation tasks, our PixelDCL can largely overcome the checkerboard problem suffered by regular deconvolution operations.
1 code implementation • 18 May 2017 • Zhengyang Wang, Hao Yuan, Shuiwang Ji
In this work, we propose spatial VAEs that use feature maps of larger size as latent variables to explicitly capture spatial information.
no code implementations • NeurIPS 2008 • Shuiwang Ji, Liang Sun, Rong Jin, Jieping Ye
We present a multi-label multiple kernel learning (MKL) formulation, in which the data are embedded into a low-dimensional space directed by the instance-label correlations encoded into a hypergraph.