Variational multiscale nonparametric regression: Algorithms
Many modern statistically efficient methods come with tremendous computational challenges and often lead to large-scale optimization problems. In this work we examine such computational issues for recent estimation methods in nonparametric regression, with a specific focus on image denoising. In particular, we consider variational multiscale methods, which are statistically optimal in the minimax sense yet computationally demanding. Computing such a multiscale Nemirovski-Dantzig estimator (MIND) requires solving a high-dimensional convex optimization problem whose constraints have a specific structure induced by a multiple statistical multiscale testing criterion. To this end, we discuss three algorithmic approaches: the Chambolle-Pock, ADMM, and semismooth Newton algorithms. Explicit implementations are presented, and their performance is compared numerically in a simulation study and on various test images. Based on these results, we recommend the Chambolle-Pock algorithm in most cases for its fast convergence.
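To illustrate the recommended primal-dual approach, the sketch below applies the Chambolle-Pock iteration to a much simpler model problem, one-dimensional total-variation denoising, rather than to the MIND estimator itself (whose multiscale constraint set is considerably more involved). The function name, step sizes, and the TV instance are illustrative assumptions, not the paper's implementation; only the iteration scheme (dual prox, primal prox, over-relaxation) follows the generic Chambolle-Pock template.

```python
import numpy as np

def chambolle_pock_tv(f, lam=0.1, n_iter=2000):
    """Chambolle-Pock primal-dual iteration, sketched on the model problem
        min_x 0.5*||x - f||^2 + lam*||D x||_1,
    with D the 1D forward-difference operator. This is an illustrative
    stand-in: MIND replaces the TV term by multiscale test constraints.
    """
    n = len(f)
    # ||D||^2 <= 4 for forward differences, so sigma*tau*||D||^2 < 1
    sigma = tau = 0.49
    x = f.copy()
    x_bar = x.copy()
    y = np.zeros(n - 1)          # dual variable, one entry per difference
    for _ in range(n_iter):
        # dual step: prox of F* is projection onto {||y||_inf <= lam}
        y = np.clip(y + sigma * np.diff(x_bar), -lam, lam)
        # adjoint application: (D^T y)_i = y_{i-1} - y_i (zero boundary)
        dty = np.zeros(n)
        dty[:-1] -= y
        dty[1:] += y
        # primal step: prox of G(x) = 0.5*||x - f||^2
        x_new = (x - tau * dty + tau * f) / (1 + tau)
        # over-relaxation with theta = 1
        x_bar = 2 * x_new - x
        x = x_new
    return x
```

The same skeleton carries over to the constrained multiscale problem by replacing the clipping step with the projection onto the dual of the multiscale constraint set.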