# Convex Analysis and Nonlinear Optimization: Theory and Examples, by Jonathan Borwein and Adrian S. Lewis

$\mathbf{R}$ by $d(x, y) = \phi(x) - \phi(y) - \phi'(y)(x - y)$. (a) Prove $d(x, y) \ge 0$, with equality if and only if $x = y$. (b) Compute $d$ when $\phi(t) = t^2/2$ and when $\phi$ is the function $p$ defined in Exercise 27.
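As a quick numerical check of part (b), the following sketch (not from the text) evaluates the distance $d$ for $\phi(t) = t^2/2$, where a short calculation gives $d(x, y) = (x - y)^2/2$:

```python
# Bregman-style distance d(x, y) = phi(x) - phi(y) - phi'(y)(x - y)
# for a differentiable convex function phi, passed in with its derivative.
def bregman(phi, dphi, x, y):
    """Distance induced by phi; nonnegative, zero iff x == y when phi is strictly convex."""
    return phi(x) - phi(y) - dphi(y) * (x - y)

phi = lambda t: t * t / 2.0   # phi(t) = t^2 / 2
dphi = lambda t: t            # phi'(t) = t

# For this phi, d(x, y) = (x - y)**2 / 2:
assert bregman(phi, dphi, 3.0, 1.0) == (3.0 - 1.0) ** 2 / 2.0  # 2.0
assert bregman(phi, dphi, 1.0, 1.0) == 0.0                     # equality iff x == y
```

The nonnegativity in part (a) is just the gradient inequality for the convex function $\phi$ at the point $y$.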

is called the *Lagrangian*. A *feasible solution* is a point $x$ in $\operatorname{dom} f$ satisfying the constraints. We should emphasize that the term "Lagrange multiplier" has different meanings in different contexts. In the present context we say a vector $\bar\lambda \in \mathbf{R}^m_+$ is a *Lagrange multiplier vector* for a feasible solution $\bar x$ if $\bar x$ minimizes the function $L(\,\cdot\,;\bar\lambda)$ over $\mathbf{E}$ and $\bar\lambda$ satisfies the complementary slackness conditions: $\bar\lambda_i = 0$ whenever $g_i(\bar x) < 0$. We can often use the following principle to solve simple optimization problems.
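A small worked instance (my own illustration, not from the text) makes the two conditions concrete. Take the toy problem of minimizing $f(x) = (x-2)^2$ subject to $g_1(x) = x - 1 \le 0$ and $g_2(x) = -x \le 0$; the feasible minimizer is $\bar x = 1$ with multiplier vector $\bar\lambda = (2, 0)$:

```python
# Toy problem: minimize (x - 2)^2 subject to x - 1 <= 0 and -x <= 0.
f = lambda x: (x - 2.0) ** 2
g = [lambda x: x - 1.0, lambda x: -x]

xbar, lam = 1.0, [2.0, 0.0]  # candidate solution and multiplier vector

# Lagrangian L(x; lam) = f(x) + sum_i lam_i * g_i(x)
L = lambda x: f(x) + sum(li * gi(x) for li, gi in zip(lam, g))

# (1) xbar minimizes L over the whole space (checked here on a grid):
assert all(L(xbar) <= L(x) + 1e-12 for x in [i / 10.0 for i in range(-30, 50)])

# (2) complementary slackness: lam_i = 0 whenever g_i(xbar) < 0.
assert all(li == 0.0 for li, gi in zip(lam, g) if gi(xbar) < 0)
```

Here $L(x;\bar\lambda) = (x-2)^2 + 2(x-1) = (x-1)^2 + 1$, so $\bar x = 1$ is indeed the unconstrained minimizer of the Lagrangian, and the inactive constraint $g_2$ carries a zero multiplier.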

8. Prove the nearest point in $\mathbf{R}^n_+$ to a vector $y$ in $\mathbf{R}^n$ is $y^+$, where $y^+_i = \max\{y_i, 0\}$ for each $i$. For a matrix $U$ in $\mathbf{O}^n$ and a vector $y$ in $\mathbf{R}^n$, prove that the nearest positive semidefinite matrix to $U^T(\operatorname{Diag} y)U$ is $U^T(\operatorname{Diag} y^+)U$.
9. \* **(Coercivity)** Suppose that the function $f : \mathbf{E} \to \mathbf{R}$ is differentiable and satisfies the growth condition $\lim_{\|x\| \to \infty} f(x)/\|x\| = +\infty$. Prove that the gradient map $\nabla f$ has range $\mathbf{E}$.
10. Prove the function $f : \mathbf{S}^n_{++} \to \mathbf{R}$ defined by $f(X) = \operatorname{tr} X^{-1}$ is differentiable on $\mathbf{S}^n_{++}$.
11. Define $f : \mathbf{S}^n_{++} \to \mathbf{R}$ by $f(X) = \log\det X$.
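The nearest-PSD claim can be checked numerically. The sketch below (an illustration under my own naming, not the book's) projects a symmetric matrix onto the positive semidefinite cone by diagonalizing and clipping the negative eigenvalues, which is exactly the $U^T(\operatorname{Diag} y^+)U$ formula with $U$ the orthogonal eigenvector matrix:

```python
import numpy as np

def nearest_psd(A):
    """Project a symmetric matrix onto the PSD cone (Frobenius norm):
    diagonalize A = V Diag(w) V^T and replace w by w+ = max(w, 0)."""
    w, V = np.linalg.eigh(A)                  # orthonormal eigenvectors in columns of V
    return V @ np.diag(np.maximum(w, 0.0)) @ V.T

A = np.array([[1.0, 0.0],
              [0.0, -2.0]])                   # eigenvalues 1 and -2
P = nearest_psd(A)

assert np.allclose(P, np.diag([1.0, 0.0]))   # the -2 eigenvalue is clipped to 0
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)  # result is PSD
```

This mirrors the vector case: projecting onto $\mathbf{R}^n_+$ clips negative coordinates, and conjugating by an orthogonal $U$ carries that projection to the matrix setting.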