Variational multiscale nonparametric regression: Algorithms

20 Oct 2020  ·  Miguel del Alamo, Housen Li, Axel Munk, Frank Werner

Many modern statistically efficient estimation methods come with tremendous computational challenges, often in the form of large-scale optimization problems. In this work we examine such computational issues for modern estimation methods in nonparametric regression, with a specific view on image denoising. In particular, we consider variational multiscale methods that are statistically optimal in a minimax sense yet computationally intensive. Computing such a multiscale Nemirovski-Dantzig estimator (MIND) requires solving a high-dimensional convex optimization problem whose constraints have a specific structure induced by a multiple statistical multiscale testing criterion. To solve this, we discuss three algorithmic approaches: the Chambolle-Pock, ADMM, and semismooth Newton algorithms. Explicit implementations are presented, and the resulting solutions are compared numerically in a simulation study and on various test images. Based on these comparisons, we recommend the Chambolle-Pock algorithm in most cases for its fast convergence.
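The paper's MIND estimator imposes multiscale testing constraints that are more involved than can be shown compactly here; as a minimal illustrative sketch of the recommended Chambolle-Pock primal-dual scheme, the following applies it to a simpler stand-in problem, total-variation denoising min_u lam*TV(u) + 0.5*||u-f||^2 (the problem choice, step sizes, and all function names are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def grad(u):
    # Forward differences with Neumann boundary; returns array of shape (2, H, W).
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return np.stack([gx, gy])

def div(p):
    # Negative adjoint of grad, so that <grad u, p> = -<u, div p>.
    px, py = p
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[0, :] = px[0, :]; dx[1:-1, :] = px[1:-1, :] - px[:-2, :]; dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]; dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]; dy[:, -1] = -py[:, -2]
    return dx + dy

def chambolle_pock_tv(f, lam=0.1, n_iter=200):
    # Primal-dual iteration for min_u lam*TV(u) + 0.5*||u - f||^2.
    # ||grad||^2 <= 8 for this discretization, so sigma*tau*8 <= 1 is admissible.
    tau = sigma = 1.0 / np.sqrt(8.0)
    u = f.copy(); u_bar = u.copy()
    p = np.zeros((2,) + f.shape)
    for _ in range(n_iter):
        # Dual step: prox of the conjugate of lam*||.||_{2,1} is a
        # pointwise projection onto the ball of radius lam.
        p = p + sigma * grad(u_bar)
        scale = np.maximum(1.0, np.sqrt(p[0]**2 + p[1]**2) / lam)
        p = p / scale
        # Primal step: closed-form prox of the quadratic data term.
        u_old = u
        u = (u + tau * div(p) + tau * f) / (1.0 + tau)
        # Over-relaxation with theta = 1.
        u_bar = 2.0 * u - u_old
    return u
```

The MIND problem replaces the TV penalty with constraints from a dictionary of multiscale test statistics, but the primal-dual splitting pattern (a cheap dual projection plus a cheap primal prox per iteration) is the same structural reason the algorithm scales to high-dimensional image problems.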


Categories


Computation · Optimization and Control · MSC 62G05, 68U10
