1 code implementation • 4 Apr 2024 • Adam Pardyl, Michał Wronka, Maciej Wołczyk, Kamil Adamczewski, Tomasz Trzciński, Bartosz Zieliński
Active Visual Exploration (AVE) is the task of dynamically selecting observations (glimpses), a capability critical for comprehending and navigating an environment.
1 code implementation • 12 Feb 2024 • Jakub Krajewski, Jan Ludziejewski, Kamil Adamczewski, Maciej Pióro, Michał Krutul, Szymon Antoniak, Kamil Ciebiera, Krystian Król, Tomasz Odrzygóźdź, Piotr Sankowski, Marek Cygan, Sebastian Jaszczur
Our findings not only show that MoE models consistently outperform dense Transformers but also highlight that the efficiency gap between dense and MoE models widens as we scale up the model size and training budget.
no code implementations • 6 Oct 2023 • Filip Szatkowski, Bartosz Wójcik, Mikołaj Piórczyński, Kamil Adamczewski
Transformer models, despite their impressive performance, often face practical limitations due to their high computational requirements.
no code implementations • 19 Jun 2023 • Kamil Adamczewski, Yingchen He, Mijung Park
To tackle this challenge, we take advantage of the fact that neural networks are overparameterized, which allows us to improve neural network training with differential privacy.
no code implementations • 21 Mar 2023 • Kamil Adamczewski, Christos Sakaridis, Vaishakh Patil, Luc van Gool
Lidar is a vital sensor for estimating the depth of a scene.
no code implementations • 8 Mar 2023 • Kamil Adamczewski, Mijung Park
We study the interplay between neural network pruning and differential privacy through two modes of parameter updates.
no code implementations • 3 Mar 2023 • Yilin Yang, Kamil Adamczewski, Danica J. Sutherland, Xiaoxiao Li, Mijung Park
Maximum mean discrepancy (MMD) is a particularly useful distance metric for differentially private data generation: when used with finite-dimensional features it allows us to summarize and privatize the data distribution once, which we can repeatedly use during generator training without further privacy loss.
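A minimal sketch of this "summarize and privatize once, reuse without further privacy loss" idea (not the paper's exact construction): approximate a Gaussian kernel with random Fourier features, release the true data's mean embedding once through the Gaussian mechanism, and score synthetic batches against that single private summary. The bandwidth, noise multiplier, and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff(x, W, b):
    # Random Fourier features approximating a Gaussian (RBF) kernel.
    return np.sqrt(2.0 / W.shape[1]) * np.cos(x @ W + b)

d, D = 2, 100
W = rng.normal(size=(d, D))           # frequencies for a unit-bandwidth RBF kernel
b = rng.uniform(0, 2 * np.pi, size=D)

true_data = rng.normal(size=(500, d))
n = true_data.shape[0]

# Each feature vector has L2 norm <= sqrt(2), so replacing one record changes
# the mean embedding by at most 2*sqrt(2)/n (its sensitivity).
sensitivity = 2 * np.sqrt(2) / n
sigma = sensitivity * 5.0             # noise scale; calibrate from (eps, delta) in practice
private_embedding = rff(true_data, W, b).mean(axis=0) + rng.normal(0, sigma, size=D)

def mmd2_to_private(synthetic):
    # Squared MMD estimate against the SAME private summary --
    # reusable every training step at no extra privacy cost.
    diff = private_embedding - rff(synthetic, W, b).mean(axis=0)
    return float(diff @ diff)

close = rng.normal(size=(500, d))           # same distribution as the true data
far = rng.normal(loc=3.0, size=(500, d))    # shifted distribution
print(mmd2_to_private(close), mmd2_to_private(far))
```

A generator would minimize `mmd2_to_private` over its synthetic samples; the distance to the matching distribution should come out much smaller than to the shifted one.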
1 code implementation • CVPR 2022 • Yawei Li, Kamil Adamczewski, Wen Li, Shuhang Gu, Radu Timofte, Luc van Gool
The proposed approach provides a new way to compare pruning methods: how well they perform relative to random pruning.
1 code implementation • 9 Jun 2021 • Margarita Vinaroz, Mohammad-Amin Charusaie, Frederik Harder, Kamil Adamczewski, Mijung Park
Hence, a relatively low order of Hermite polynomial features can more accurately approximate the mean embedding of the data distribution compared to a significantly higher number of random features.
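The paper's construction uses Hermite polynomial features; as a simplified stand-in for why a low order of deterministic features can approximate a kernel accurately, the sketch below truncates the power-series feature map of the exponential kernel, φ_n(x) = xⁿ/√(n!), for which ⟨φ(x), φ(y)⟩ converges to exp(x·y), and measures the truncation error. This is a toy under stated assumptions, not the paper's feature map.

```python
import numpy as np
from math import factorial

def power_series_features(x, order):
    # Deterministic features of the exponential kernel:
    # phi_n(x) = x^n / sqrt(n!), so <phi(x), phi(y)> -> exp(x * y).
    return np.array([x**n / np.sqrt(factorial(n)) for n in range(order + 1)])

x, y = 0.7, -0.3
exact = np.exp(x * y)
for order in (2, 5, 10):
    approx = power_series_features(x, order) @ power_series_features(y, order)
    print(order, abs(exact - approx))  # error shrinks rapidly with the order
```

Even order 5 is accurate to several decimal places here, whereas a Monte Carlo (random-feature) approximation of comparable accuracy typically needs far more features.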
1 code implementation • 10 Nov 2020 • Kamil Adamczewski, Mijung Park
We introduce Dirichlet pruning, a novel post-processing technique to transform a large neural network model into a compressed one.
no code implementations • 26 Oct 2020 • Kamil Adamczewski, Frederik Harder, Mijung Park
We introduce a simple and intuitive framework that provides quantitative explanations of statistical models through the probabilistic assessment of input feature importance.
1 code implementation • 26 Feb 2020 • Frederik Harder, Kamil Adamczewski, Mijung Park
We propose a differentially private data generation paradigm using random feature representations of kernel mean embeddings when comparing the distribution of true data with that of synthetic data.
no code implementations • 3 Jul 2019 • Kamil Adamczewski, Mijung Park
Convolutional neural networks (CNNs) in recent years have made a dramatic impact in science, technology and industry, yet the theoretical mechanism of CNN architecture design remains surprisingly vague.
2 code implementations • 7 Feb 2019 • Changyong Oh, Kamil Adamczewski, Mijung Park
We propose a new variational family for Bayesian neural networks.
no code implementations • ICCV 2015 • Kamil Adamczewski, Yumin Suh, Kyoung Mu Lee
Graph matching is a fundamental problem in computer vision.
no code implementations • CVPR 2015 • Yumin Suh, Kamil Adamczewski, Kyoung Mu Lee
By constructing Markov chain on the restricted search space instead of the original solution space, our method approximates the solution effectively.
no code implementations • 27 Sep 2014 • Kamil Adamczewski, Szymon Matejczyk, Tomasz P. Michalak
Intuitively, since the Shapley value evaluates the average marginal contribution of a player to the coalitional game, it can be used in the network context to evaluate the marginal contribution of a node in the process of information diffusion given various groups of already 'infected' nodes.
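The average-marginal-contribution view above lends itself to a Monte Carlo estimate over random orderings of the nodes. The sketch below uses a toy one-hop "infection" coverage as the coalition value; the paper's actual diffusion model may differ, and the graph and all names are hypothetical.

```python
import random

# Toy undirected graph as adjacency lists (hypothetical example):
# node 0 is a hub connected to 1, 2, 3; node 3 also reaches 4.
graph = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}

def coverage(coalition):
    # Value of a coalition: nodes it 'infects' directly (itself + neighbors).
    infected = set(coalition)
    for node in coalition:
        infected.update(graph[node])
    return len(infected)

def shapley_values(nodes, value, num_samples=2000, seed=0):
    # Monte Carlo Shapley estimate: average each node's marginal
    # contribution over random orderings of all nodes.
    rng = random.Random(seed)
    shapley = {v: 0.0 for v in nodes}
    for _ in range(num_samples):
        order = nodes[:]
        rng.shuffle(order)
        coalition, prev = [], 0
        for node in order:
            coalition.append(node)
            cur = value(coalition)
            shapley[node] += cur - prev
            prev = cur
    return {v: s / num_samples for v, s in shapley.items()}

sv = shapley_values(list(graph), coverage)
print(max(sv, key=sv.get))  # the hub, node 0, gets the largest share
```

Because the marginal contributions along any ordering telescope, the estimates always sum exactly to the value of the grand coalition, matching the efficiency axiom of the Shapley value.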