1 code implementation • 2 May 2024 • Nima Hosseini Dashtbayaz, Ghazal Farhani, Boyu Wang, Charles X. Ling
Specifically, we first show that under certain conditions, the residual loss of PINNs can be globally minimized by a wide neural network.
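The residual loss in question can be sketched concretely. The following is an assumed toy setup, not the paper's formulation: for the ODE u'(x) = u(x) with u(0) = 1, the residual loss measures how far a candidate function is from satisfying the equation at a set of collocation points, and it vanishes at the true solution exp(x).

```python
import math

def net(x, w1, b1, w2, b2):
    """One-hidden-layer tanh network: a toy stand-in for a (wide) PINN."""
    return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(len(w1))) + b2

def residual_loss(u, xs, h=1e-5):
    """Mean squared ODE residual u'(x) - u(x) over collocation points xs,
    plus the boundary penalty (u(0) - 1)^2; u' via central differences."""
    res = [((u(x + h) - u(x - h)) / (2 * h) - u(x)) ** 2 for x in xs]
    return sum(res) / len(res) + (u(0.0) - 1.0) ** 2

xs = [i / 10 for i in range(11)]                  # collocation points on [0, 1]
loss_exact = residual_loss(math.exp, xs)          # true solution exp(x)
loss_net = residual_loss(                         # arbitrary (untrained) network
    lambda x: net(x, [1.0, -0.5], [0.0, 0.2], [0.8, 0.3], 0.5), xs)
print(loss_exact < 1e-6, loss_net > loss_exact)   # exact solution minimizes the loss
```

The paper's result concerns when gradient-based training of a sufficiently wide network can drive such a loss to its global minimum; the sketch only shows what the loss itself computes.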
no code implementations • 12 Feb 2024 • Qiuhao Zeng, Wei Wang, Fan Zhou, Gezheng Xu, Ruizhi Pu, Changjian Shui, Christian Gagné, Shichun Yang, Boyu Wang, Charles X. Ling
By employing Koopman operators, we address the time-evolving distributions encountered in temporal domain generalization (TDG): measurement functions are sought that establish linear transition relations between evolving domains.
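As a hedged illustration of "measurement functions that establish linear transition relations" (a textbook Koopman-lifting toy of my choosing, not the paper's method): a system that is nonlinear in its original state becomes exactly linear once lifted by the measurements g(x) = (x1, x2, x1²).

```python
# Dynamics: x1 <- a*x1 ; x2 <- b*x2 + c*x1^2   (nonlinear in the state)
a, b, c = 0.9, 0.5, 0.2
def step(x1, x2): return a * x1, b * x2 + c * x1 * x1

def lift(x1, x2):
    """Measurement functions g(x) = (x1, x2, x1^2)."""
    return [x1, x2, x1 * x1]

# In the lifted (measurement) space the transition is exactly linear: z' = K z
K = [[a, 0, 0],
     [0, b, c],
     [0, 0, a * a]]

def matvec(M, v): return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

x = (1.3, -0.7)
z_next_pred = matvec(K, lift(*x))     # linear transition in lifted space
z_next_true = lift(*step(*x))         # lift of the true nonlinear step
linear_ok = all(abs(p - t) < 1e-12 for p, t in zip(z_next_pred, z_next_true))
print(linear_ok)
```

In practice the operator is finite-dimensional only approximately and is typically estimated from data (e.g. by least squares over lifted snapshots); this example picks measurements for which the lifting is exact.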
1 code implementation • 26 Nov 2023 • Jiaqi Li, Rui Wang, Yuanhao Lai, Changjian Shui, Sabyasachi Sahoo, Charles X. Ling, Shichun Yang, Boyu Wang, Christian Gagné, Fan Zhou
We conduct extensive experiments on various benchmarks, including a dataset with large-scale tasks, and compare our method against recent state-of-the-art methods to demonstrate its effectiveness and scalability.
1 code implementation • 6 May 2021 • Tanner Bohn, Charles X. Ling
We present HARE, a new task where reader feedback is used to optimize document summaries for personal interest during the normal flow of reading.
no code implementations • 1 May 2021 • Charles X. Ling, Tanner Bohn
Thus, while our framework is still conceptual and our experimental results are certainly not state-of-the-art, we hope that this unified lifelong learning framework inspires new work on large-scale experiments and on understanding human learning in general.
no code implementations • ICML Workshop LifelongML 2020 • Xinyu Yun, Tanner A Bohn, Charles X. Ling
Humans are the best example of agents that can learn a variety of skills incrementally over the course of their lives, and imbuing machines with this ability is the goal of lifelong machine learning.
1 code implementation • COLING 2020 • Tanner Bohn, Charles X. Ling
To advance understanding on how to engage readers, we advocate the novel task of automatic pull quote selection.
no code implementations • 21 Nov 2019 • Charles X. Ling, Tanner Bohn
Humans can learn a variety of concepts and skills incrementally over the course of their lives while exhibiting many desirable properties, such as continual learning without forgetting, forward transfer and backward transfer of knowledge, and learning a new concept or task with only a few examples.
no code implementations • 4 Oct 2019 • Tanner Bohn, Yining Hu, Charles X. Ling
We present an image preprocessing technique capable of improving the performance of few-shot classifiers on abstract visual reasoning tasks.
no code implementations • RANLP 2019 • Tanner Bohn, Yining Hu, Jinhang Zhang, Charles X. Ling
We present a novel and effective technique for performing text coherence tasks while facilitating deeper insights into the data.
9 code implementations • NeurIPS 2018 • Robert J. Wang, Xiang Li, Charles X. Ling
In this study, we propose an efficient architecture named PeleeNet, which is built with conventional convolution instead of depthwise separable convolution.
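The design choice here trades against the parameter savings of depthwise separable convolutions. A quick parameter count (toy layer sizes of my choosing) shows the size of that gap:

```python
def conv_params(k, cin, cout):
    """Parameters of a conventional k x k convolution (bias omitted)."""
    return k * k * cin * cout

def depthwise_separable_params(k, cin, cout):
    """Depthwise k x k conv (one filter per input channel) + 1x1 pointwise conv."""
    return k * k * cin + cin * cout

k, cin, cout = 3, 64, 128
conv = conv_params(k, cin, cout)                 # 3*3*64*128 = 73728
dws = depthwise_separable_params(k, cin, cout)   # 3*3*64 + 64*128 = 8768
print(conv, dws, round(conv / dws, 1))           # roughly 8.4x fewer parameters
```

Despite this gap, conventional convolutions are often better supported by hardware and inference libraries, which is the kind of efficiency argument an architecture like PeleeNet can exploit.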
no code implementations • 28 Jan 2013 • Nima Mirbakhsh, Charles X. Ling
In this paper, we propose an extension of matrix factorization that adds general neighborhood information to the recommendation model.
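A neighborhood-augmented matrix factorization prediction can be sketched as follows. This is a generic illustrative form (global mean plus latent-factor score plus an item-neighborhood correction), not necessarily the paper's exact model; all names and values are made up for the example.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def predict(user, item, P, Q, sim, ratings, mu, k=2):
    """Blend the latent-factor score mu + q_i . p_u with a neighborhood term:
    a similarity-weighted average of the user's (mean-centered) ratings on the
    k items most similar to `item`."""
    mf_score = mu + dot(P[user], Q[item])
    rated = [j for j in ratings.get(user, {}) if j != item and (item, j) in sim]
    neigh = sorted(rated, key=lambda j: -sim[(item, j)])[:k]
    if not neigh:
        return mf_score
    num = sum(sim[(item, j)] * (ratings[user][j] - mu) for j in neigh)
    den = sum(sim[(item, j)] for j in neigh)
    return mf_score + num / den

P = {"u1": [0.5, 0.1]}                       # user latent factors
Q = {"i1": [0.4, -0.2]}                      # item latent factors
sim = {("i1", "i2"): 0.8, ("i1", "i3"): 0.3}  # item-item similarities
ratings = {"u1": {"i2": 4.0, "i3": 2.0}}     # observed ratings
mu = 3.0                                     # global rating mean
pred = predict("u1", "i1", P, Q, sim, ratings, mu)
print(round(pred, 3))
```

The neighborhood term pulls the pure factorization score toward what similar, already-rated items suggest, which is the intuition behind combining the two families of models.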