no code implementations • 16 Apr 2024 • Chanwook Park, Sourav Saha, Jiachen Guo, Xiaoyu Xie, Satyajit Mojumder, Miguel A. Bessa, Dong Qian, Wei Chen, Gregory J. Wagner, Jian Cao, Wing Kam Liu
The evolution of artificial intelligence (AI) and neural network theories has revolutionized the way software is programmed, shifting from hand-written, hard-coded instructions to vast trained neural networks.
no code implementations • 7 Mar 2024 • Gawel Kus, Miguel A. Bessa
Gradient-free optimizers can tackle problems regardless of the smoothness or differentiability of their objective function, but they typically require many more iterations to converge than gradient-based algorithms.
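To make the trade-off concrete, here is a minimal sketch of one of the simplest gradient-free strategies, a (1+1)-style random search on a non-smooth objective. This is an illustration of the general idea only, not the optimizer proposed in the paper; the objective and all parameter values are made up for the example.

```python
import random

def objective(x):
    # Non-smooth and discontinuous at x = 0: gradient-based methods
    # struggle here, but a gradient-free search does not care.
    return abs(x - 2.0) + (1.0 if x < 0 else 0.0)

def random_search(f, x0, iters=2000, step=0.5, seed=0):
    """(1+1)-style search: perturb the incumbent, keep improvements."""
    rng = random.Random(seed)
    best_x, best_f = x0, f(x0)
    for _ in range(iters):
        cand = best_x + rng.gauss(0.0, step)
        fc = f(cand)
        if fc < best_f:          # accept only strict improvements
            best_x, best_f = cand, fc
    return best_x, best_f

x_opt, f_opt = random_search(objective, x0=-5.0)
```

Note that the search needs thousands of objective evaluations for a one-dimensional problem, which is exactly the iteration cost the paper highlights relative to gradient-based methods.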
no code implementations • 8 Dec 2023 • Aleksandr Dekhovich, Miguel A. Bessa
We introduce a new continual (or lifelong) learning algorithm called LDA-CP&S that performs segmentation tasks without undergoing catastrophic forgetting.
no code implementations • 10 Apr 2023 • Aleksandr Dekhovich, Marcel H. F. Sluiter, David M. J. Tax, Miguel A. Bessa
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).
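The core idea behind PINNs, minimizing the residual of a differential equation at collocation points, can be conveyed with a much simpler analogue: a polynomial ansatz fitted by least squares. Real PINNs replace the polynomial with a neural network and use automatic differentiation; the toy problem below (u' + u = 0, u(0) = 1, exact solution e^{-t}) is an assumption made for illustration, not an example from the paper.

```python
import numpy as np

# Fit u(t) = sum_k c_k t^k so that the residual u'(t) + u(t) vanishes
# at collocation points, subject to the initial condition u(0) = 1.
deg = 8
t = np.linspace(0.0, 2.0, 50)            # collocation points

# Design matrices for u(t) and u'(t) in the monomial basis.
U  = np.vander(t, deg + 1, increasing=True)              # columns t^k
dU = np.hstack([np.zeros((t.size, 1)),
                U[:, :-1] * np.arange(1, deg + 1)])      # columns k t^(k-1)

# Stack residual rows (u' + u = 0) and a heavily weighted initial condition.
A = np.vstack([dU + U, U[:1] * 100.0])
b = np.concatenate([np.zeros(t.size), [100.0]])

coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
u = U @ coeffs                            # approximate solution
error = np.max(np.abs(u - np.exp(-t)))    # compare to the exact e^{-t}
```

Because the ansatz here is linear in its coefficients, the residual minimization reduces to a single least-squares solve; with a neural network ansatz the same residual loss must be minimized by gradient descent, which is where PINN training difficulties arise.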
1 code implementation • 23 Nov 2022 • Aleksandr Dekhovich, O. Taylan Turan, Jiaxiang Yi, Miguel A. Bessa
However, artificial neural networks suffer from catastrophic forgetting, i.e., they forget how to perform an old task when trained on a new one.
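Catastrophic forgetting is easy to reproduce with even the smallest trainable model. The sketch below, a hypothetical toy setup not taken from the paper, trains a logistic classifier on task A, then naively continues training on a conflicting task B, and shows that accuracy on task A collapses.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(sign):
    # 1-D binary task: label = 1 iff sign * x > 0. Task B flips the
    # rule, so sequential training on B overwrites what was learned on A.
    x = rng.normal(size=(200, 1))
    y = (sign * x[:, 0] > 0).astype(float)
    return x, y

def train(w, b, x, y, lr=0.5, epochs=200):
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(x[:, 0] * w + b)))   # sigmoid output
        g = p - y                                      # logistic-loss gradient
        w -= lr * np.mean(g * x[:, 0])
        b -= lr * np.mean(g)
    return w, b

def accuracy(w, b, x, y):
    p = 1.0 / (1.0 + np.exp(-(x[:, 0] * w + b)))
    return np.mean((p > 0.5) == (y == 1.0))

xa, ya = make_task(+1)      # task A
xb, yb = make_task(-1)      # task B: conflicting decision rule

w, b = 0.0, 0.0
w, b = train(w, b, xa, ya)
acc_A_before = accuracy(w, b, xa, ya)   # high after training on A
w, b = train(w, b, xb, yb)              # naive sequential training on B
acc_A_after = accuracy(w, b, xa, ya)    # collapses: task A is "forgotten"
```

Continual-learning methods such as those in these papers aim to keep `acc_A_after` high while still learning task B, rather than letting the new gradients overwrite the old solution.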
1 code implementation • 9 Aug 2022 • Aleksandr Dekhovich, David M. J. Tax, Marcel H. F. Sluiter, Miguel A. Bessa
In particular, CP&S is capable of sequentially learning 10 tasks from ImageNet-1000 while maintaining an accuracy of around 94% with negligible forgetting, a first-of-its-kind result in class-incremental learning.
no code implementations • 24 Sep 2021 • Bernardo P. Ferreira, F. M. Andrade Pires, Miguel A. Bessa
This paper proposes a novel Adaptive Clustering-based Reduced-Order Modeling (ACROM) framework to significantly improve and extend the recent family of clustering-based reduced-order models (CROMs).
1 code implementation • 22 Sep 2021 • Aleksandr Dekhovich, David M. J. Tax, Marcel H. F. Sluiter, Miguel A. Bessa
Current deep neural networks (DNNs) are overparameterized and use most of their neuronal connections during inference for each task.
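Overparameterization is what makes pruning-based methods viable: most connections can be removed with little effect. As a minimal sketch of the idea, and not the iterative pruning procedure the paper actually develops, global magnitude pruning zeroes out the smallest-magnitude fraction of a weight matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(64, 64))            # a dense weight matrix

def magnitude_prune(W, sparsity):
    # Zero out the smallest-magnitude `sparsity` fraction of weights,
    # the simplest way to exploit overparameterization.
    k = int(sparsity * W.size)
    thresh = np.sort(np.abs(W).ravel())[k]
    mask = np.abs(W) >= thresh
    return W * mask, mask

W_pruned, mask = magnitude_prune(W, 0.9)
remaining = mask.mean()                   # fraction of weights kept
```

Methods in the continual-learning setting go further: instead of discarding the pruned connections, they can be reserved and reused for later tasks, which is the basis of pruning-and-selection approaches like CP&S.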
no code implementations • 10 Aug 2021 • Dongil Shin, Andrea Cupertino, Matthijs H. J. de Jong, Peter G. Steeneken, Miguel A. Bessa, Richard A. Norte
From ultra-sensitive detectors of fundamental forces to quantum networks and sensors, mechanical resonators are enabling next-generation technologies to operate in room-temperature environments.