What is the Cauchy-Schwarz Inequality?
The Cauchy-Schwarz Inequality is useful for bounding expected values that are difficult to calculate. The formula is:
(E(XY))² ≤ E(X²)E(Y²),
given that X and Y have finite variances.
What this is basically saying is that for two random variables X and Y, the square of the expected value of their product, (E(XY))², will always be less than or equal to the product of the expected values of their squares, E(X²)E(Y²).
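As a quick sanity check, the inequality can be verified numerically. The sketch below uses two small made-up samples (purely illustrative) and plain sample means as stand-ins for expectations:

```python
# A minimal numeric check of (E(XY))^2 <= E(X^2) E(Y^2), using
# sample means as stand-ins for expectations. The data below are
# made up purely for illustration.

def mean(values):
    return sum(values) / len(values)

x = [1.0, -2.0, 3.0, 0.5]
y = [2.0, 1.0, -1.0, 4.0]

lhs = mean([xi * yi for xi, yi in zip(x, y)]) ** 2               # (E(XY))^2
rhs = mean([xi ** 2 for xi in x]) * mean([yi ** 2 for yi in y])  # E(X^2)E(Y^2)

print(lhs <= rhs)  # -> True; the bound holds for any such samples
```

Any pair of samples will satisfy the bound, since the inequality holds for the empirical distribution as well.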
The inequality can be written, equivalently, as:
|E(XY)| ≤ √E(X²) √E(Y²).
The Cauchy-Schwarz inequality is arguably the inequality with the widest range of applications. Beyond probability and statistics, it is used in many other branches of mathematics, including:
- Classical real and complex analysis,
- Hilbert space theory,
- Numerical analysis,
- Qualitative theory of differential equations.
Example question: Use the Cauchy-Schwarz inequality to find the maximum of x + 2y + 3z,
given that x² + y² + z² = 1.
We know that: (x + 2y + 3z)² ≤ (1² + 2² + 3²)(x² + y² + z²) = 14.
Therefore: x + 2y + 3z ≤ √14.
The equality holds when: x/1 = y/2 = z/3.
We are given that x² + y² + z² = 1, so:
x = 1/√14,
y = 2/√14,
z = 3/√14.
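The worked answer above can be checked numerically. The sketch below confirms that the claimed point lies on the unit sphere and attains √14, and that randomly chosen unit vectors never exceed the bound:

```python
import math
import random

# Check that (1, 2, 3)/sqrt(14) lies on the unit sphere and attains
# the claimed maximum x + 2y + 3z = sqrt(14), then confirm that
# random unit vectors never exceed sqrt(14).

s = math.sqrt(14)
x, y, z = 1 / s, 2 / s, 3 / s
assert abs(x**2 + y**2 + z**2 - 1) < 1e-12   # the point is on the unit sphere
assert abs(x + 2*y + 3*z - s) < 1e-12        # it attains sqrt(14)

random.seed(0)
for _ in range(1000):
    u, v, w = (random.gauss(0, 1) for _ in range(3))
    n = math.sqrt(u*u + v*v + w*w)
    u, v, w = u / n, v / n, w / n            # a random unit vector
    assert u + 2*v + 3*w <= s + 1e-12        # never beats the bound

print("maximum:", s)
```

The random-search loop is only a sanity check, of course; the inequality itself is what guarantees √14 is the true maximum.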
Many proofs of this inequality exist. Here's one of the simpler ones:
Let U = X/√E(X²) and V = Y/√E(Y²), so that E(U²) = E(V²) = 1.
Since (|U| − |V|)² ≥ 0, it can be shown that: 2|UV| ≤ U² + V².
Therefore: 2|E(UV)| ≤ 2E(|UV|) ≤ E(U²) + E(V²) = 2,
which gives: (E(UV))² ≤ (E(|UV|))² ≤ 1.
Substituting back U = X/√E(X²) and V = Y/√E(Y²), this implies that: (E(XY))² ≤ E(X²)E(Y²).
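The normalization at the heart of the proof can also be illustrated numerically. The sketch below builds U = X/√E(X²) and V = Y/√E(Y²) from made-up sample data (sample means standing in for expectations) and checks each step of the argument:

```python
import math

# Illustrate the proof's steps on made-up sample data: after
# normalizing, E(U^2) = E(V^2) = 1, the pointwise bound
# 2|UV| <= U^2 + V^2 holds, and therefore (E(UV))^2 <= 1.

def mean(values):
    return sum(values) / len(values)

x = [1.0, -2.0, 3.0, 0.5]
y = [2.0, 1.0, -1.0, 4.0]

u = [xi / math.sqrt(mean([t**2 for t in x])) for xi in x]
v = [yi / math.sqrt(mean([t**2 for t in y])) for yi in y]

assert abs(mean([ui**2 for ui in u]) - 1) < 1e-12     # E(U^2) = 1
assert abs(mean([vi**2 for vi in v]) - 1) < 1e-12     # E(V^2) = 1
for ui, vi in zip(u, v):
    assert 2 * abs(ui * vi) <= ui**2 + vi**2 + 1e-12  # 2|UV| <= U^2 + V^2
assert mean([ui * vi for ui, vi in zip(u, v)]) ** 2 <= 1  # (E(UV))^2 <= 1
```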
Many other proofs of the Cauchy-Schwarz inequality can be found in standard texts on analysis and probability.