no code implementations • NeurIPS 2023 • Sebastian Ament, Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy
Expected Improvement (EI) is arguably the most popular acquisition function in Bayesian optimization and has found countless successful applications, but its performance is often exceeded by that of more recent methods.
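For reference, under a Gaussian posterior $f(x) \sim \mathcal{N}(\mu(x), \sigma^2(x))$ and a minimization convention, EI has the closed form $\mathrm{EI}(x) = \sigma(x)\,[z\,\Phi(z) + \phi(z)]$ with $z = (f_{\mathrm{best}} - \mu(x))/\sigma(x)$. The minimal sketch below computes this standard quantity; the paper's contribution concerns numerically stable log-space reformulations of EI, which are not reproduced here.

```python
# Closed-form Expected Improvement for a Gaussian posterior (minimization
# convention): EI(x) = E[max(f_best - f(x), 0)] with f(x) ~ N(mu, sigma^2).
# A minimal reference implementation, not the paper's LogEI variants.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    sigma = np.maximum(sigma, 1e-12)     # guard against zero posterior variance
    z = (f_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

print(expected_improvement(mu=0.0, sigma=1.0, f_best=0.5))  # ~0.6978
```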
no code implementations • 28 Apr 2023 • Kevin Luxem, David Eriksson
We review the possibility of adding area-specific environmental enrichment and automated behavioral tasks to identify neurons in specific brain areas.
1 code implementation • 3 Mar 2023 • Aryan Deshwal, Sebastian Ament, Maximilian Balandat, Eytan Bakshy, Janardhan Rao Doppa, David Eriksson
We use Bayesian Optimization (BO) and propose a novel surrogate modeling approach for efficiently handling a large number of binary and categorical parameters.
1 code implementation • 20 Oct 2022 • Natalie Maus, Kaiwen Wu, David Eriksson, Jacob Gardner
Bayesian optimization (BO) is a popular approach for sample-efficient optimization of black-box objective functions.
2 code implementations • 18 Oct 2022 • Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, Eytan Bakshy
We prove that under suitable reparameterizations, the BO policy that maximizes the probabilistic objective is the same as the one that maximizes the acquisition function (AF); therefore, probabilistic reparameterization (PR) enjoys the same regret bounds as the original BO policy using the underlying AF.
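A minimal sketch of the PR idea under illustrative assumptions: a single binary parameter $z$ is relaxed to a Bernoulli probability $p$, the toy acquisition function `af` stands in for a real AF, and the probabilistic objective is optimized by brute force. All names here are hypothetical; this is not the paper's BoTorch implementation.

```python
# Probabilistic reparameterization (PR) sketch for one binary parameter.
import numpy as np

def af(z, x):
    # Toy acquisition value for a binary z in {0, 1} and continuous x in [-1, 1].
    return -(x - 0.3) ** 2 + (0.5 if z == 1 else 0.0) * np.cos(3 * x)

def probabilistic_objective(p, x):
    # For a single Bernoulli(p) parameter the expectation is analytic:
    # E_{z ~ Bernoulli(p)}[AF(z, x)] = (1 - p) * AF(0, x) + p * AF(1, x).
    return (1 - p) * af(0, x) + p * af(1, x)

# Optimize the continuous relaxation (p, x); grid search for illustration only.
# The probabilistic objective is differentiable in p, so gradient-based
# optimizers apply in higher dimensions.
grid_p = np.linspace(0.0, 1.0, 101)
grid_x = np.linspace(-1.0, 1.0, 201)
vals = np.array([[probabilistic_objective(p, x) for x in grid_x] for p in grid_p])
i, j = np.unravel_index(vals.argmax(), vals.shape)
p_star, x_star = grid_p[i], grid_x[j]
z_star = int(p_star >= 0.5)  # recover a discrete candidate from p*
print(f"p*={p_star:.2f}, x*={x_star:.2f}, z*={z_star}")
```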
1 code implementation • 3 Mar 2022 • Sulin Liu, Qing Feng, David Eriksson, Benjamin Letham, Eytan Bakshy
Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box objective functions.
no code implementations • 22 Sep 2021 • Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy
Many real world scientific and industrial applications require optimizing multiple competing black-box objectives.
no code implementations • ICML Workshop AutoML 2021 • David Eriksson, Pierce I-Jen Chuang, Samuel Daulton, Peng Xia, Akshat Shrivastava, Arun Babu, Shicong Zhao, Ahmed Aly, Ganesh Venkatesh, Maximilian Balandat
When tuning the architecture and hyperparameters of large machine learning models for on-device deployment, it is desirable to understand the optimal trade-offs between on-device latency and model accuracy.
1 code implementation • 10 Jun 2021 • Eric Hans Lee, David Eriksson, Valerio Perrone, Matthias Seeger
Bayesian optimization (BO) is a popular method for optimizing expensive-to-evaluate black-box functions.
1 code implementation • 20 Apr 2021 • Ryan Turner, David Eriksson, Michael McCourt, Juha Kiili, Eero Laaksonen, Zhen Xu, Isabelle Guyon
The black-box optimization challenge was based on tuning (validation set) performance of standard machine learning models on real datasets.
2 code implementations • 27 Feb 2021 • David Eriksson, Martin Jankowiak
Bayesian optimization (BO) is a powerful paradigm for efficient optimization of black-box objective functions.
1 code implementation • NeurIPS 2020 • Geoff Pleiss, Martin Jankowiak, David Eriksson, Anil Damle, Jacob R. Gardner
Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians $\mathcal{N}(\mathbf 0, \mathbf K)$ or whitening a vector $\mathbf b$ against covariance matrix $\mathbf K$.
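As a concrete illustration, a Cholesky factor $\mathbf L$ with $\mathbf K = \mathbf L \mathbf L^\top$ serves as a matrix square root: $\mathbf L \boldsymbol\epsilon \sim \mathcal{N}(\mathbf 0, \mathbf K)$ for $\boldsymbol\epsilon \sim \mathcal{N}(\mathbf 0, \mathbf I)$, and $\mathbf L^{-1} \mathbf b$ whitens $\mathbf b$. The sketch below uses this dense $\mathcal{O}(n^3)$ baseline; the paper targets iterative, matrix-free alternatives.

```python
# Sampling from N(0, K) and whitening via a Cholesky square root.
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.standard_normal((n, 3))
# RBF kernel matrix with a small jitter to ensure positive definiteness.
K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
K += 1e-6 * np.eye(n)

L = np.linalg.cholesky(K)      # dense O(n^3) square root: K = L @ L.T
eps = rng.standard_normal(n)
sample = L @ eps               # Cov[L eps] = L L^T = K, so sample ~ N(0, K)

b = rng.standard_normal(n)
white = np.linalg.solve(L, b)  # if b ~ N(0, K), then L^{-1} b ~ N(0, I)
print(sample[:3], white[:3])
```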
1 code implementation • 24 Feb 2020 • Eric Hans Lee, David Eriksson, Bolong Cheng, Michael McCourt, David Bindel
Non-myopic acquisition functions consider the impact of the next $h$ function evaluations and are typically computed through rollout, in which $h$ steps of BO are simulated.
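A Monte Carlo rollout can be sketched as follows: fantasize an observation at the candidate from the GP posterior, simulate further greedy one-step-EI queries, and average the resulting improvement. The toy 1D implementation below (using scikit-learn, with hypothetical helper names) only illustrates the mechanism, not the paper's estimator.

```python
# Monte Carlo rollout sketch for a non-myopic acquisition value in 1D.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def ei(mu, sigma, best):
    # One-step Expected Improvement under a minimization convention.
    sigma = np.maximum(sigma, 1e-12)
    z = (best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def rollout_value(x0, X, y, h=3, n_mc=8, seed=0):
    # Score candidate x0 by simulating h steps of greedy-EI BO on fantasized
    # observations drawn from the GP posterior; return the mean improvement.
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
    improvements = []
    for _ in range(n_mc):
        Xs, ys, x = X.copy(), y.copy(), np.array([[x0]])
        for step in range(h):
            gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6,
                                          optimizer=None).fit(Xs, ys)
            if step > 0:  # after the first step, pick the next query by EI
                mu_g, sd_g = gp.predict(grid, return_std=True)
                x = grid[[np.argmax(ei(mu_g, sd_g, ys.min()))]]
            mu, sd = gp.predict(x, return_std=True)
            Xs = np.vstack([Xs, x])
            ys = np.append(ys, rng.normal(mu[0], sd[0]))  # fantasized value
        improvements.append(max(y.min() - ys.min(), 0.0))
    return float(np.mean(improvements))

rng = np.random.default_rng(1)
X = rng.uniform(size=(5, 1))
y = np.sin(6.0 * X[:, 0])
print(rollout_value(0.5, X, y))
```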
no code implementations • 20 Feb 2020 • David Eriksson, Matthias Poloczek
The global optimization of a high-dimensional black-box function under black-box constraints is a pervasive task in machine learning, control, and engineering.
2 code implementations • NeurIPS 2019 • David Eriksson, Michael Pearce, Jacob R. Gardner, Ryan Turner, Matthias Poloczek
This motivates the design of a local probabilistic approach for global optimization of large-scale high-dimensional problems.
3 code implementations • 30 Jul 2019 • David Eriksson, David Bindel, Christine A. Shoemaker
This paper describes Plumbing for Optimization with Asynchronous Parallelism (POAP) and the Python Surrogate Optimization Toolbox (pySOT).
1 code implementation • NeurIPS 2018 • David Eriksson, Kun Dong, Eric Hans Lee, David Bindel, Andrew Gordon Wilson
Gaussian processes (GPs) with derivatives are useful in many applications, including Bayesian optimization, implicit surface reconstruction, and terrain reconstruction.
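For a kernel $k$, derivative observations are jointly Gaussian with function values via $\mathrm{Cov}[f(x), f'(x')] = \partial k/\partial x'$ and $\mathrm{Cov}[f'(x), f'(x')] = \partial^2 k/\partial x\,\partial x'$. Below is a minimal 1D RBF sketch of the resulting block covariance, an assumption-level illustration rather than the paper's structured, scalable approximations.

```python
# Joint covariance of function values and derivatives for a 1D RBF kernel.
import numpy as np

def rbf(x, y, ell=1.0):
    return np.exp(-0.5 * (x - y) ** 2 / ell**2)

def d_rbf_dy(x, y, ell=1.0):
    # Cov[f(x), f'(y)] = dk/dy.
    return rbf(x, y, ell) * (x - y) / ell**2

def d2_rbf_dxdy(x, y, ell=1.0):
    # Cov[f'(x), f'(y)] = d^2 k / dx dy.
    return rbf(x, y, ell) * (1.0 - (x - y) ** 2 / ell**2) / ell**2

x = np.linspace(0.0, 1.0, 4)
Kff = rbf(x[:, None], x[None, :])
Kfg = d_rbf_dy(x[:, None], x[None, :])
Kgg = d2_rbf_dxdy(x[:, None], x[None, :])
K_joint = np.block([[Kff, Kfg], [Kfg.T, Kgg]])   # 2n x 2n joint covariance
print(np.linalg.eigvalsh(K_joint).min() > -1e-10)  # sanity check: PSD
```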
3 code implementations • NeurIPS 2017 • Kun Dong, David Eriksson, Hannes Nickisch, David Bindel, Andrew Gordon Wilson
For applications as varied as Bayesian neural networks, determinantal point processes, elliptical graphical models, and kernel learning for Gaussian processes (GPs), one must compute the log determinant of an $n \times n$ positive definite matrix and its derivatives, leading to prohibitive $\mathcal{O}(n^3)$ computations.
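Two identities underpin this computation: $\log\det \mathbf K = 2\sum_i \log L_{ii}$ for a Cholesky factor $\mathbf L$, and $\partial_\theta \log\det \mathbf K = \mathrm{tr}(\mathbf K^{-1} \partial_\theta \mathbf K)$, whose trace admits Hutchinson-style stochastic estimation. The sketch below only illustrates the identities; the paper's Lanczos and Chebyshev estimators avoid the dense $\mathcal{O}(n^3)$ factorization entirely.

```python
# Exact log-determinant via Cholesky vs. a Hutchinson estimate of its
# derivative with respect to an RBF lengthscale. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 400
X = rng.standard_normal((n, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
ell = 1.0
K = np.exp(-0.5 * sq / ell**2) + 1e-4 * np.eye(n)   # RBF kernel + jitter
dK = np.exp(-0.5 * sq / ell**2) * sq / ell**3        # dK/d(ell)

# Exact: log det K = 2 * sum(log(diag(L))) with K = L L^T.
L = np.linalg.cholesky(K)
logdet = 2.0 * np.log(np.diag(L)).sum()

# Exact derivative: tr(K^{-1} dK).
grad_exact = np.trace(np.linalg.solve(K, dK))

# Hutchinson estimate: E[z^T K^{-1} dK z] = tr(K^{-1} dK) when E[z z^T] = I.
m = 30
Z = rng.choice([-1.0, 1.0], size=(n, m))             # Rademacher probes
grad_est = np.mean(np.sum(Z * np.linalg.solve(K, dK @ Z), axis=0))
print(logdet, grad_exact, grad_est)
```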
1 code implementation • 5 Feb 2009 • David Eriksson, Johan Rathsman, Oscar Stål
This manual describes the public code 2HDMC which can be used to perform calculations in a general, CP-conserving, two-Higgs-doublet model (2HDM).
High Energy Physics - Phenomenology