Moreau-Yosida approximation

2018-10-23 14:53:16

Let $f\colon H\to\mathbb{R}\cup \{+\infty\}$ be a convex, lower semicontinuous, proper, and coercive function, where $H$ is a Hilbert space.

The Moreau-Yosida approximation $f_{\lambda}\colon H\to\mathbb{R}\cup \{+\infty\}$ with parameter $\lambda>0$ is $$f_\lambda(x)=\inf_{y\in H} \left\{ f(y)+\frac{1}{2\lambda}||x-y||^2\right\}.$$
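For intuition, here is a numerical sketch (my own illustration, not part of the question): for $f(y)=|y|$ on $H=\mathbb{R}$, the envelope $f_\lambda$ has a known closed form, the Huber function, which we can compare against a brute-force minimization of the objective over a fine grid.

```python
import numpy as np

def moreau_envelope_bruteforce(f, x, lam, grid):
    # f_lambda(x) = inf_y { f(y) + ||x - y||^2 / (2*lam) }, approximated on a grid
    return np.min(f(grid) + (x - grid) ** 2 / (2 * lam))

def huber(x, lam):
    # Closed-form Moreau envelope of f(y) = |y| (the Huber function)
    return np.where(np.abs(x) <= lam, x ** 2 / (2 * lam), np.abs(x) - lam / 2)

lam = 0.5
grid = np.linspace(-10, 10, 2_000_001)
for x in [-2.0, 0.1, 3.0]:
    approx = moreau_envelope_bruteforce(np.abs, x, lam, grid)
    exact = huber(x, lam)
    assert abs(approx - exact) < 1e-4
```

The two agree up to the grid resolution, which is a useful sanity check before working with $f_\lambda$ abstractly.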

Define $J_{\lambda}(x)=y$, where $y$ is the point at which the infimum is attained.

Determine $\partial f_{\lambda}(x)$ (the subdifferential of a convex function). I don't know how to do this. Any help would be greatly appreciated!


This can for example be found in "Convex Analysis and Monotone Operator Theory in Hilbert Spaces" by Bauschke and Combettes, Proposition 12.29.

The answer is

$$

\nabla f_\lambda = \lambda^{-1}(Id - J_\lambda),

$$

i.e. a gradient step with respect to the Moreau-Yosida regularization corresponds to a proximal step of the original function.
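A quick numerical check of this identity (my own sketch; it assumes the standard fact that for $f(y)=|y|$ the prox $J_\lambda$ is soft-thresholding): compare $\lambda^{-1}(x - J_\lambda(x))$ against a central finite difference of the envelope.

```python
import numpy as np

def soft_threshold(x, lam):
    # J_lambda for f(y) = |y|: the prox of |.| is soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def envelope(x, lam):
    # Moreau envelope of |.|, evaluated via the minimizer J_lambda(x)
    y = soft_threshold(x, lam)
    return np.abs(y) + (x - y) ** 2 / (2 * lam)

lam, h = 0.7, 1e-6
for x in [-3.0, -0.2, 0.0, 0.4, 2.5]:
    grad_formula = (x - soft_threshold(x, lam)) / lam   # lambda^{-1}(Id - J_lambda)
    grad_fd = (envelope(x + h, lam) - envelope(x - h, lam)) / (2 * h)
    assert abs(grad_formula - grad_fd) < 1e-5
```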

Here is a sketch of the proof. First you need the result that

$$

p = J_\lambda (x) \quad \Leftrightarrow \quad (\forall y \in H) \quad \langle y-p,x-p \rangle + \lambda f(p) \le \lambda f(y).

$$

This can, e.g., be derived from the optimality conditions (if you know that $Id+\lambda\partial f$ has a single-valued inverse).
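This characterization can also be spot-checked numerically (my own sketch, again using soft-thresholding as $J_\lambda$ for $f=|\cdot|$; the factor $\lambda$ appears because $J_\lambda = (Id + \lambda\partial f)^{-1}$):

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(x, lam):
    # J_lambda for f(y) = |y|
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

lam = 0.3
f = np.abs
for _ in range(1000):
    x = rng.uniform(-5, 5)
    p = soft_threshold(x, lam)
    y = rng.uniform(-5, 5)
    # variational inequality: <y - p, x - p> + lam*f(p) <= lam*f(y)
    assert (y - p) * (x - p) + lam * f(p) <= lam * f(y) + 1e-12
```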

Next you derive

$$

f_\lambda (y) - f_\lambda (x) \ge \langle y-x, x - J_\lambda (x) \rangle \lambda^{-1}

$$

and

$$

f_\lambda (y) - f_\lambda (x) \le \langle y-x, y - J_\lambda (y) \rangle \lambda^{-1}.

$$

Combining these two inequalities with the firm nonexpansiveness of the prox, you get that

$$

0 \le f_\lambda (y) - f_\lambda (x) - \langle y-x, x - J_\lambda (x) \rangle \lambda^{-1} \le \lambda^{-1} ||y-x||^2,

$$

so the middle term is $O(||y-x||^2)$ as $y\to x$. This shows that $f_\lambda$ is (Fréchet) differentiable with $\nabla f_\lambda(x) = \lambda^{-1}(x - J_\lambda(x))$, and in particular $\partial f_\lambda(x) = \{\lambda^{-1}(x - J_\lambda(x))\}$.
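The firm nonexpansiveness used above, $\langle J_\lambda x - J_\lambda y,\, x-y\rangle \ge ||J_\lambda x - J_\lambda y||^2$, can likewise be checked on the soft-thresholding example (my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_threshold(x, lam):
    # J_lambda for f(y) = |y|
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

lam = 0.8
for _ in range(1000):
    x, y = rng.uniform(-5, 5, size=2)
    jx, jy = soft_threshold(x, lam), soft_threshold(y, lam)
    # firm nonexpansiveness: <Jx - Jy, x - y> >= ||Jx - Jy||^2
    assert (jx - jy) * (x - y) >= (jx - jy) ** 2 - 1e-12
```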

    2018-10-23 15:06:58