Minimizers – 01, a variational inequality

To minimize a differentiable function, we usually work with its gradient. Consider a function f \in C^1(K), where K is a closed convex set, and let F(x) denote the gradient of f at x. We have:

Prop: Suppose there exists an x \in K such that f(x) = \min\limits_{y \in K} f(y).

Then x is a solution of the variational inequality x \in K: (F(x), y - x) \ge 0 for all y \in K.
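To see why an inequality, rather than the equation F(x) = 0, is the natural condition, here is a small worked example of our own (not from the original text): take K = [0, 1] and f(y) = (y - 2)^2. The unconstrained minimizer y = 2 lies outside K, so the minimum over K is attained at the boundary point x = 1, where F(x) = f'(1) = -2 \ne 0. Nevertheless (F(x), y - x) = -2(y - 1) \ge 0 for all y \in [0, 1], since y - 1 \le 0 there.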

Proof.

Fix y \in K. Since K is convex, the segment x + t(y - x) lies in K for 0 \le t \le 1, so we may define \varphi(t) = f(x + t(y - x)), 0 \le t \le 1. Because f attains its minimum over K at x, \varphi attains its minimum on [0, 1] at the endpoint t = 0, and hence \varphi'(0) \ge 0 (at an endpoint only this one-sided condition holds, not \varphi'(0) = 0). By the chain rule, \varphi'(0) = (F(x), y - x), and therefore (F(x), y - x) \ge 0. \square
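As a sanity check, here is a minimal numerical sketch of our own, using the hypothetical example above (K = [0, 1], f(y) = (y - 2)^2); it is not code from the text. It samples y in K and verifies (F(x), y - x) \ge 0 at the constrained minimizer x = 1.

```python
import numpy as np

# Hypothetical example (not from the text): f(y) = (y - 2)**2 on the
# closed convex set K = [0, 1]. The unconstrained minimizer y = 2 lies
# outside K, so the minimum over K is attained at the boundary point x = 1.

def grad_f(x):
    """F(x) = f'(x) = 2 * (x - 2)."""
    return 2.0 * (x - 2.0)

x = 1.0  # minimizer of f over K = [0, 1]

# Check the variational inequality (F(x), y - x) >= 0 on a grid of y in K.
ys = np.linspace(0.0, 1.0, 101)
values = grad_f(x) * (ys - x)
print("min over sampled y of (F(x), y - x):", values.min())  # prints 0.0

# Note that F(x) = -2 != 0: the minimizer satisfies the variational
# inequality, not the equation F(x) = 0, because x lies on the boundary of K.
```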