A convex inequality

gugo82
Again! :-D
That's a nice one I learnt studying CoV.
The proof is hard for beginners, but funny and kinda instructive, for it teaches how to handle convexity.

***

Let's recall that a function $f:RR^N\to RR$ is said to be convex iff the following holds:

(C) $\quad AA x,y\in RR^N ,\ AA lambda \in [0,1], \quad f(lambda x+(1-lambda) y)<= lambda f(x)+(1-lambda)f(y)$.

For example, the $p$-power of the euclidean norm, i.e. the function $|x|^p$ (here $|x|=\sqrt{\sum_(n=1)^N x_n^2}$), is convex iff $p>=1$.*

It's worth mentioning that induction can be used to prove the "general convexity inequality":

"If $f:RR^N \to RR$ is convex, then for all $M\in NN$, for all $x^1,\ldots ,x^M \in RR^N$, for all $lambda_1,\ldots ,lambda_M \in [0,1]$ s.t. $\sum_(m=1)^M lambda_m =1$ we have:

(gC) $\quad f(\sum_(m=1)^M lambda_mx^m) <= \sum_(m=1)^M lambda_m f(x^m)$."
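As a quick numerical sanity check (a sketch on random samples, not a proof), one can test (gC) for the convex function $f(x)=|x|^p$ with random points and random convex weights:

```python
import math
import random

# Sanity check of (gC) for the convex function f(x) = |x|^p, p >= 1:
# random points x^1, ..., x^M in R^N and random weights lambda_m with sum 1.
random.seed(0)
p, M, N = 2.5, 4, 3

def norm(v):
    return math.sqrt(sum(c * c for c in v))

for _ in range(1000):
    xs = [[random.uniform(-5.0, 5.0) for _ in range(N)] for _ in range(M)]
    raw = [random.random() for _ in range(M)]
    total = sum(raw)
    lam = [r / total for r in raw]  # lambda_m in [0,1], sum = 1
    combo = [sum(lam[m] * xs[m][i] for m in range(M)) for i in range(N)]
    lhs = norm(combo) ** p
    rhs = sum(lam[m] * norm(xs[m]) ** p for m in range(M))
    assert lhs <= rhs + 1e-9
```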

For example, convexity of $|x|^p$ allows us to write inequalities like:

$AA x,y,z\in RR^N,\ |x+y+z|^p <= 3^(p-1) [|x|^p+|y|^p+|z|^p]$

or (in general):

$AA M\in NN,\ AA x^1,\ldots ,x^M \in RR^N, \quad |\sum_(m=1)^M x^m|^p <= M^(p-1) \sum_(m=1)^M |x^m|^p$.**
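Footnote ** says to take $lambda_m=1/M$ in (gC); the resulting inequality can also be checked numerically (a sketch on random data, not a proof):

```python
import math
import random

# Check |sum_m x^m|^p <= M^(p-1) * sum_m |x^m|^p on random samples.
random.seed(1)
p, M, N = 3.0, 5, 2

def norm(v):
    return math.sqrt(sum(c * c for c in v))

for _ in range(1000):
    xs = [[random.uniform(-3.0, 3.0) for _ in range(N)] for _ in range(M)]
    s = [sum(xs[m][i] for m in range(M)) for i in range(N)]
    lhs = norm(s) ** p
    rhs = M ** (p - 1) * sum(norm(x) ** p for x in xs)
    assert lhs <= rhs + 1e-9
```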

A noticeable property of convex functions on $RR$ (one variable!) is the following:

"Let $g:RR\to RR$ be convex, $x\in RR$ and $0 < mu < lambda$; then:

(D) $\quad (g(xpm mu)-g(x))/mu <= (g(xpm lambda)-g(x))/lambda$."

Inequality (D) shows that the difference quotient [read: rapporto incrementale] of $g$ in $x$, i.e. the ratio $(g(x+h)-g(x))/h$, is a monotone non-decreasing function of the increment $h$.
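This monotonicity can be illustrated numerically, e.g. with the convex function $g(t)=|t|^3$ (a sanity check at a few sample base points, not a proof):

```python
# For convex g, the difference quotient h |-> (g(x+h) - g(x))/h is
# non-decreasing in h.  We sample it for g(t) = |t|^3 at a few base points.
def g(t):
    return abs(t) ** 3

def quotient(x, h):
    return (g(x + h) - g(x)) / h

for x in [-2.0, -0.5, 0.0, 1.3]:
    hs = sorted(h / 100.0 for h in range(-300, 301) if h != 0)
    qs = [quotient(x, h) for h in hs]
    assert all(a <= b + 1e-9 for a, b in zip(qs, qs[1:]))
```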

***

Problem:

Let $f:RR^N\to RR$ be convex and satisfy the following estimate:

(E) $exists alpha >0,\ exists p>=1 :\ AA x \in RR^N,\ |f(x)|<= alpha* (1+|x|^p)$.

1. Prove that there exists $beta >0$ s.t.:

(B) $AA x,y\in RR^N,\ |f(x)-f(y)|<=beta* (1+|x|^(p-1)+|y|^(p-1))*|x-y|$.

2. If $f \in C^1(RR^N)$, then there exists $gamma >0$ s.t.:

(B') $AA x\in RR^N,\ |D_i f(x)|<=gamma* (1+|x|^(p-1)) \quad$ for every first-order partial derivative $D_i f$, $i=1,\ldots ,N$.



Example:
Take $f(x):=sqrt(1+|x|^4)$ ($x\in RR$) and show that $f$ is of class $C^1$ and satisfies the hypothesis of the theorem; then verify that the estimate for $f'$ given by (B') is correct with a direct computation.
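A sketch of the direct computation (our choice of constants, not the only one): here $p=4$, $f'(x)=2x^3/\sqrt{1+x^4}$, (E) holds with $alpha =1$, and (B') can be tested with $gamma =2$:

```python
import math

# f(x) = sqrt(1 + x^4) is C^1 with f'(x) = 2x^3 / sqrt(1 + x^4).
# We test (E) with alpha = 1, p = 4, and (B') with gamma = 2
# (both constants are our choices for this sketch).
def f(x):
    return math.sqrt(1 + x ** 4)

def fprime(x):
    return 2 * x ** 3 / math.sqrt(1 + x ** 4)

for k in range(-400, 401):
    x = k / 10.0
    assert abs(f(x)) <= 1 * (1 + abs(x) ** 4)       # hypothesis (E)
    assert abs(fprime(x)) <= 2 * (1 + abs(x) ** 3)  # conclusion (B')
```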

__________
* Actually $|x|^p$ is the composition of $phi(x):=|x|$ and $psi(t):=t^p$; $phi$ is convex by the triangle inequality and the homogeneity of the norm, while $psi$ is non-decreasing on $[0,+oo[$ and convex there iff $p>=1$ (by elementary Calculus); hence $|x|^p=psi(phi(x))$ is convex iff $p>=1$, as claimed.
** Take $lambda_m=1/M$ in (gC) and use homogeneity.

Replies
gugo82
:smt022 :smt089

Fioravante Patrone1
:smt090 boooooooooooring :smt118

gugo82
As usual, I'm going to post my solution, since no one has tried in the last month.

"Gugo82":
Problem:

Let $f:RR^N\to RR$ be convex and satisfy the following estimate:

(E) $exists alpha >0,\ exists p>=1 :\ AA x \in RR^N,\ |f(x)|<= alpha* (1+|x|^p)$.

1. Prove that exists $beta >0$ s.t.:

(B) $AA x,y\in RR^N,\ |f(x)-f(y)|<=beta* (1+|x|^(p-1)+|y|^(p-1))*|x-y|$.

For the sake of simplicity, we work out the case [tex]N=2[/tex].

Let's put [tex]x=(t,\eta ), y=(s,\eta )[/tex], with [tex]t,s,\eta \in \mathbb{R}[/tex] and [tex]g(\cdot ):=f(\cdot ,\eta )[/tex].
Since [tex]f[/tex] is convex, [tex]g[/tex] is convex as well and by (D) we have:

[tex]\frac{g(t\pm \mu)-g(t)}{\mu} \leq \frac{g(t\pm \lambda )-g(t)}{\lambda }[/tex]

for [tex]0<\mu <\lambda[/tex].
As Hint 1 says, let's choose [tex]t<s[/tex], [tex]\mu :=s-t=|x-y|[/tex] and [tex]\lambda :=1+|x|+|y|[/tex]; since [tex]\mu <\lambda[/tex], inequality (D) (with the plus sign) gives:

(1) [tex]\frac{g(s)-g(t)}{|x-y|} \leq \frac{g(t+1+|x|+|y|)-g(t)}{1+|x|+|y|}[/tex].

Now we focus our attention on [tex]g(t+1+|x|+|y|)=f(t+1+|x|+|y|,\eta)[/tex]: since [tex](t+1+|x|+|y|,\eta)=(t,0)+(1+|x|+|y|,\eta )[/tex], the estimate (E), together with [tex]|t|^p\leq |x|^p[/tex] and the triangle inequality, gives:

[tex]g(t+1+|x|+|y|) \leq \alpha \left\{ 1+|(t+1+|x|+|y|,\eta )|^p\right\}[/tex]
[tex]\qquad \leq \alpha \left\{ 1+|(t,0)|^p+|(1+|x|+|y|, \eta )|^p\right\}[/tex]
[tex]\qquad \leq \alpha \left\{ 1+|x|^p+|(1+|x|+|y|, \eta )|^p\right\}[/tex]
[tex]\qquad =\alpha \left\{ 1+|x|^p+[(1+|x|+|y|)^2+|\eta |^2]^\frac{p}{2}\right\}[/tex]

Since [tex]|\eta |^2\leq (1+|x|+|y|)^2[/tex], the previous inequalities imply:

[tex]g(t+1+|x|+|y|) \leq \alpha \left\{ 1+|x|^p+2^\frac{p}{2}(1+|x|+|y|)^p\right\}[/tex].

From the convexity of [tex][0,+\infty[ \ni \tau \mapsto \tau^p[/tex] (which holds for [tex]p\geq 1[/tex]) we infer [tex]g(t+1+|x|+|y|) \leq \alpha \left\{ 1+|x|^p+2^\frac{p}{2} 3^{p-1} (1+|x|^p+|y|^p)\right\}[/tex], hence

(2) [tex]g(t+1+|x|+|y|)\leq \alpha \left\{ (2^\frac{p}{2} 3^{p-1} +1)+(2^\frac{p}{2} 3^{p-1} +1)|x|^p + 2^\frac{p}{2} 3^{p-1} |y|^p\right\}[/tex].

Analogously, from the estimate (E) we get [tex]-g(t)=-f(x)\leq \alpha (1+|x|^p)[/tex] and:

(3) [tex]-g(t) \leq \alpha (1+|x|^p+2|y|^p)[/tex]

because [tex]0\leq 2|y|^p[/tex].
Finally we put (2) and (3) together in (1) to have:

[tex]\frac{g(s)-g(t)}{|x-y|} \leq \alpha (2^\frac{p}{2} 3^{p-1}+2) \frac{1+|x|^p+|y|^p}{1+|x|+|y|}[/tex];

but [tex]1+|x|^p+|y|^p\leq (1+|x|+|y|)\cdot(1+|x|^{p-1}+|y|^{p-1})[/tex] and then:

(4) [tex]g(s)-g(t) \leq \beta^\prime (1+|x|^{p-1}+|y|^{p-1}) |x-y|[/tex]

with [tex]\beta^\prime := \alpha (2^\frac{p}{2} 3^{p-1}+2)[/tex].
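The elementary inequality [tex]1+|x|^p+|y|^p\leq (1+|x|+|y|)(1+|x|^{p-1}+|y|^{p-1})[/tex] used in the last step follows by expanding the right-hand side; a quick numerical check (not a proof):

```python
# Check 1 + a^p + b^p <= (1 + a + b)(1 + a^(p-1) + b^(p-1)) for a, b >= 0
# and p >= 1: expanding the RHS already yields the terms 1, a * a^(p-1) = a^p
# and b * b^(p-1) = b^p, plus non-negative leftovers.
for p in [1.0, 1.5, 2.0, 3.7]:
    for i in range(50):
        for j in range(50):
            a, b = i / 5.0, j / 5.0
            lhs = 1 + a ** p + b ** p
            rhs = (1 + a + b) * (1 + a ** (p - 1) + b ** (p - 1))
            assert lhs <= rhs + 1e-9
```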

Repeating the same argument with the roles of [tex]t[/tex] and [tex]s[/tex] exchanged, we get the inequality:

(5) [tex]g(t)-g(s) \leq \beta^\prime (1+|x|^{p-1}+|y|^{p-1}) |x-y|[/tex]

and from a comparison between (4) and (5) we finally have:

[tex]|g(t)-g(s)|\leq \beta^\prime (1+|x|^{p-1}+|y|^{p-1}) |x-y|[/tex],

which trivially holds also for [tex]t=s[/tex]; note that the constant [tex]\beta^\prime[/tex] doesn't actually depend on [tex]t[/tex], [tex]s[/tex] or [tex]\eta[/tex].

In the same manner, if we put [tex]h(\cdot ):=f(\xi ,\cdot)[/tex] and choose [tex]t, s \in \mathbb{R}[/tex], we find:

[tex]|h(t)-h(s)|\leq \beta^\prime (1+|x|^{p-1}+|y|^{p-1}) |x-y|[/tex].

Finally, if we take [tex]x=(x_1,x_2),y=(y_1,y_2)\in \mathbb{R}^2[/tex], we have:

[tex]|f(x)-f(y)|\leq |f(x_1,x_2)-f(y_1,x_2)|+|f(y_1,x_2)-f(y_1,y_2)|[/tex]
[tex]\quad = |g(x_1)-g(y_1)|+|h(x_2)-h(y_2)|[/tex]
[tex]\quad \leq \beta^\prime \left[ (1+|x|^{p-1}+|(y_1,x_2)|^{p-1}) |x_1-y_1|+(1+|(y_1,x_2)|^{p-1}+|y|^{p-1})|x_2-y_2|\right][/tex]
[tex]\quad \leq \beta^\prime \left[ 2+|x|^{p-1}+2|(y_1,x_2)|^{p-1}+|y|^{p-1} \right] |x-y|[/tex]

because [tex]|x_1-y_1|,|x_2-y_2|\leq |x-y|[/tex].
Now [tex]|(y_1,x_2)|\leq |y_1|+|x_2|\leq |x|+|y|[/tex] by the triangle inequality, and [tex]|(y_1,x_2)|^{p-1} \leq (|x|+|y|)^{p-1}\leq 2^{p-1} (|x|^{p-1}+|y|^{p-1})[/tex] (bound [tex]|x|+|y|[/tex] by twice the larger of the two summands, then use the monotonicity of [tex]\tau \mapsto \tau^{p-1}[/tex]; this works for every [tex]p\geq 1[/tex]); therefore:

[tex]|f(x)-f(y)|\leq \beta^\prime \left[ 2+|x|^{p-1}+2^p (|x|^{p-1}+|y|^{p-1})+|y|^{p-1} \right]|x-y|[/tex]
[tex]\quad = \beta^\prime \left[ 2+(2^p+1)|x|^{p-1}+(2^p+1)|y|^{p-1}\right] |x-y|\Rightarrow[/tex]

(6) [tex]\Rightarrow \quad |f(x)-f(y)|\leq \beta (1+|x|^{p-1}+|y|^{p-1})|x-y|[/tex],

with [tex]\beta :=(2^p+1) \cdot \beta^\prime > 0[/tex], which is our claim in the case [tex]N=2[/tex].

The general case [tex]N>2[/tex] can be handled in a similar way.
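To see the final estimate (B) in action, here is a numerical check for the model function [tex]f(x)=|x|^p[/tex] on [tex]\mathbb{R}^2[/tex], using the (sufficient, non-optimal) constant [tex]\beta =p[/tex] coming from the mean value bound [tex]||x|^p-|y|^p|\leq p\max (|x|,|y|)^{p-1}\, ||x|-|y||[/tex] (our choice for this sketch):

```python
import math
import random

# Check estimate (B) for f(x) = |x|^p on R^2 with beta = p, a non-optimal
# constant valid for this particular f (random samples, not a proof).
random.seed(2)
p = 2.5

def norm(v):
    return math.hypot(v[0], v[1])

def f(v):
    return norm(v) ** p

for _ in range(2000):
    x = (random.uniform(-10, 10), random.uniform(-10, 10))
    y = (random.uniform(-10, 10), random.uniform(-10, 10))
    dist = norm((x[0] - y[0], x[1] - y[1]))
    bound = p * (1 + norm(x) ** (p - 1) + norm(y) ** (p - 1)) * dist
    assert abs(f(x) - f(y)) <= bound + 1e-9
```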

Thomas16
it looks like the usual exercise that I leave for Christmas time... I hope somebody tries before :)......

gugo82
No one wants to even try? :cry:
Oh, come on... I know it's hard (this inequality made me sweat my guts out for a couple of days a month ago, really!), but some of you have all the background to succeed in giving the proof.

Perhaps it could be easier to put $N=2$ and to work out this particular case.

Anyway, if anyone finds a way to give the proof, I swear that I'll post only easy exercises for three months. :wink:
