Search Results for author: Aleksandr Dekhovich

Found 5 papers, 3 papers with code

Continual learning for surface defect segmentation by subnetwork creation and selection

no code implementations • 8 Dec 2023 • Aleksandr Dekhovich, Miguel A. Bessa

We introduce a new continual (or lifelong) learning algorithm called LDA-CP&S that performs segmentation tasks without undergoing catastrophic forgetting.

Continual Learning • Segmentation

iPINNs: Incremental learning for Physics-informed neural networks

no code implementations • 10 Apr 2023 • Aleksandr Dekhovich, Marcel H. F. Sluiter, David M. J. Tax, Miguel A. Bessa

Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).

Incremental Learning • Multi-Task Learning
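The core PINN idea mentioned in the abstract above can be illustrated with a minimal sketch: minimize the residual of a differential equation at collocation points. The toy problem below (the ODE u' + u = 0 with an exponential trial ansatz and a finite-difference gradient) is an assumption chosen for brevity; it is not the iPINNs algorithm itself.

```python
import numpy as np

# Toy illustration of the PINN idea (not the iPINNs method): pick a trial
# solution u(x; a) = exp(a * x) for the ODE u'(x) + u(x) = 0, u(0) = 1,
# and fit the parameter a by minimizing the squared PDE residual at
# collocation points. The exact solution is u(x) = exp(-x), i.e. a = -1.

x = np.linspace(0.0, 1.0, 50)          # collocation points

def residual_loss(a):
    u = np.exp(a * x)                  # trial solution
    du = a * np.exp(a * x)             # its exact derivative
    return np.mean((du + u) ** 2)      # mean squared PDE residual

a, lr, eps = 0.0, 0.05, 1e-6
for _ in range(500):
    grad = (residual_loss(a + eps) - residual_loss(a - eps)) / (2 * eps)
    a -= lr * grad                     # gradient descent on the residual

print(a)                               # converges toward -1
```

A real PINN replaces the hand-picked ansatz with a neural network and obtains the derivative by automatic differentiation, but the loss structure is the same.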

Cooperative data-driven modeling

1 code implementation • 23 Nov 2022 • Aleksandr Dekhovich, O. Taylan Turan, Jiaxiang Yi, Miguel A. Bessa

However, artificial neural networks suffer from catastrophic forgetting, i.e., they forget how to perform an old task when trained on a new one.

Continual Learning
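Catastrophic forgetting, as described in the snippet above, can be reproduced even with a one-parameter model. The data and tasks below are hypothetical, purely for illustration; this is not the paper's cooperative modeling setup.

```python
import numpy as np

# Minimal illustration of catastrophic forgetting (hypothetical toy data):
# a one-parameter linear model y = w * x is trained on task A, then on
# task B. After training on task B, its error on task A has grown sharply.

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y_a = 2.0 * x                          # task A: slope  2
y_b = -3.0 * x                         # task B: slope -3

def mse(w, x, y):
    return np.mean((w * x - y) ** 2)

def sgd(w, x, y, lr=0.1, steps=200):
    for _ in range(steps):
        w -= lr * np.mean(2 * (w * x - y) * x)   # gradient of the MSE
    return w

w = sgd(0.0, x, y_a)                   # learn task A (w ends near 2)
loss_a_before = mse(w, x, y_a)
w = sgd(w, x, y_b)                     # learn task B (w ends near -3)
loss_a_after = mse(w, x, y_a)

print(loss_a_before, loss_a_after)     # task-A error increases
```

Because both tasks compete for the same parameter, optimizing for task B necessarily destroys the task-A solution, which is the failure mode continual learning methods are designed to avoid.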

Continual Prune-and-Select: Class-incremental learning with specialized subnetworks

1 code implementation • 9 Aug 2022 • Aleksandr Dekhovich, David M. J. Tax, Marcel H. F. Sluiter, Miguel A. Bessa

In particular, CP&S is capable of sequentially learning 10 tasks from ImageNet-1000 while keeping accuracy around 94% with negligible forgetting, a first-of-its-kind result in class-incremental learning.

Class Incremental Learning • Incremental Learning • +1
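The "prune-and-select" idea above, per-task binary masks over a shared network, can be sketched in a few lines. The shapes, random masks, and the stand-in "training" update below are assumptions for illustration; the sketch also omits the paper's selection step, which infers the right mask at test time.

```python
import numpy as np

# Sketch of the subnetwork idea behind CP&S (hypothetical shapes/masks):
# each task claims a binary mask over shared weights, and weights owned by
# earlier tasks are frozen, so learning a new task cannot overwrite them.

rng = np.random.default_rng(1)
W = rng.standard_normal((4, 4))            # shared weight matrix

mask_t1 = rng.random((4, 4)) < 0.5         # subnetwork for task 1
frozen = mask_t1.copy()                    # task-1 weights are now fixed

x = rng.standard_normal(4)
out_t1_before = (W * mask_t1) @ x          # task-1 prediction

# "Train" task 2: update only weights NOT owned by task 1.
update = rng.standard_normal((4, 4))
W = np.where(frozen, W, W + 0.1 * update)

out_t1_after = (W * mask_t1) @ x
print(np.allclose(out_t1_before, out_t1_after))  # True: no forgetting
```

Since task 1's subnetwork touches only frozen weights, its outputs are bit-for-bit identical after task 2 is learned, which is how mask-based methods sidestep catastrophic forgetting by construction.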

Neural network relief: a pruning algorithm based on neural activity

1 code implementation • 22 Sep 2021 • Aleksandr Dekhovich, David M. J. Tax, Marcel H. F. Sluiter, Miguel A. Bessa

Current deep neural networks (DNNs) are overparameterized and use most of their neuronal connections during inference for each task.
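The overparameterization observation above motivates pruning by neural activity. The sketch below uses a simplified importance score (the average magnitude of each connection's contribution over a batch); this is an assumed stand-in, not the paper's exact NNrelief criterion.

```python
import numpy as np

# Simplified sketch of activity-based pruning (not the exact NNrelief
# criterion): score each connection by the average magnitude of its
# contribution w_ij * x_j over a batch, then drop the weakest connections.

rng = np.random.default_rng(2)
W = rng.standard_normal((8, 16))           # layer weights (out x in)
X = rng.standard_normal((100, 16))         # batch of layer inputs

importance = np.abs(W) * np.mean(np.abs(X), axis=0)   # |w_ij| * E|x_j|

prune_frac = 0.5
threshold = np.quantile(importance, prune_frac)
mask = importance > threshold              # keep the strongest half
W_pruned = W * mask

sparsity = 1.0 - mask.mean()
print(sparsity)                            # ~0.5 of connections removed
```

Scoring by contribution rather than raw weight magnitude lets the data decide which connections matter: a large weight fed by a near-zero activation is still a candidate for removal.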
