Moreau-Yosida approximation
Let $f\colon H\to\mathbb{R}\cup \{+\infty\}$ be a convex, lower semicontinuous, proper and coercive function, where $H$ is a Hilbert space.
The Moreau-Yosida approximation $f_{\lambda}\colon H\to\mathbb{R}\cup \{+\infty\}$ with $\lambda>0$ is $$f_\lambda(x)=\inf_{y\in H} \left\{ f(y)+\frac{1}{2\lambda}\|x-y\|^2\right\}$$
Define $J_{\lambda}(x)=y$, where $y$ is the point at which the infimum is attained.
Determine $\partial f_{\lambda}(x)$ (the subdifferential of a convex function). I don't know how to do this. Any help would be greatly appreciated!
This can for example be found in "Convex Analysis and Monotone Operator Theory in Hilbert Spaces" by Bauschke and Combettes, Proposition 12.29.
The answer is
$$
\nabla f_\lambda = \lambda^{-1}(\mathrm{Id} - J_\lambda),
$$
i.e. a gradient step w.r.t. the Moreau-Yosida regularization corresponds to a proximal step of the original function.
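As a quick numerical sanity check of this identity (not part of the proof), take $f(x)=|x|$ on $H=\mathbb{R}$: then $J_\lambda$ is soft-thresholding and $f_\lambda$ is the Huber function, and a finite-difference derivative of $f_\lambda$ matches $\lambda^{-1}(x - J_\lambda(x))$:

```python
# Sanity check of grad f_lambda = lambda^{-1}(Id - J_lambda) for f(x) = |x| on R.
# Here J_lambda is soft-thresholding and f_lambda is the Huber function.

def prox(x, lam):
    # J_lambda(x) = argmin_y |y| + (1/(2*lam)) * (x - y)^2  (soft-thresholding)
    return max(x - lam, 0.0) + min(x + lam, 0.0)

def envelope(x, lam):
    # f_lambda(x) = |J_lambda(x)| + (1/(2*lam)) * (x - J_lambda(x))^2
    p = prox(x, lam)
    return abs(p) + (x - p) ** 2 / (2 * lam)

lam = 0.5
for x in (-2.0, -0.3, 0.1, 1.7):
    h = 1e-6
    fd = (envelope(x + h, lam) - envelope(x - h, lam)) / (2 * h)  # central difference
    grad = (x - prox(x, lam)) / lam  # lambda^{-1} * (x - J_lambda(x))
    assert abs(fd - grad) < 1e-4, (x, fd, grad)
print("gradient formula verified")
```

(The test points cover both the quadratic region $|x|\le\lambda$ and the linear region of the Huber function.)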
Here is a sketch of the proof: First you need the result that
$$
p = J_\lambda (x) \quad \Leftrightarrow \quad (\forall y \in H) \quad \langle y-p,\,x-p \rangle + \lambda f(p) \le \lambda f(y).
$$
This can e.g. be derived from the optimality conditions (if you know that $\mathrm{Id}+\lambda\partial f$ has a single-valued inverse).
Next you derive
$$
f_\lambda (y) - f_\lambda (x) \ge \lambda^{-1}\langle y-x,\, x - J_\lambda (x) \rangle
$$
and
$$
f_\lambda (y) - f_\lambda (x) \le \lambda^{-1}\langle y-x,\, y - J_\lambda (y) \rangle.
$$
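These two inequalities can also be checked numerically in the same toy setting $f(x)=|x|$, $H=\mathbb{R}$ (a sketch, with soft-thresholding as the prox and the Huber function as $f_\lambda$):

```python
# Grid check of the two envelope inequalities for f(x) = |x| on R:
#   f_lam(y) - f_lam(x) >= lam^{-1} * (y - x) * (x - J_lam(x))
#   f_lam(y) - f_lam(x) <= lam^{-1} * (y - x) * (y - J_lam(y))

def prox(x, lam):
    # soft-thresholding: the prox of lam * |.|
    return max(x - lam, 0.0) + min(x + lam, 0.0)

def envelope(x, lam):
    # f_lambda(x) = |J_lambda(x)| + (1/(2*lam)) * (x - J_lambda(x))^2
    p = prox(x, lam)
    return abs(p) + (x - p) ** 2 / (2 * lam)

lam = 0.7
grid = [i / 10 for i in range(-30, 31)]  # points in [-3, 3]
tol = 1e-12                              # guard against float rounding
for x in grid:
    for y in grid:
        diff = envelope(y, lam) - envelope(x, lam)
        assert diff >= (y - x) * (x - prox(x, lam)) / lam - tol
        assert diff <= (y - x) * (y - prox(y, lam)) / lam + tol
print("both inequalities hold on the grid")
```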
Combining these two inequalities with the firm nonexpansiveness of the prox, you get that
$$
0 \le f_\lambda (y) - f_\lambda (x) - \lambda^{-1}\langle y-x,\, x - J_\lambda (x) \rangle \le \lambda^{-1}\|y-x\|^2,
$$
which shows that $f_\lambda$ is (Fréchet) differentiable with $\nabla f_\lambda(x) = \lambda^{-1}(x - J_\lambda(x))$.