Variance and Covariance
Moments
For \(k\in\mathbb Z^+\), the kth moment of \(X\) is defined as \(\mathbb E(X^k)\) (when finite), and the kth central moment as \(\mathbb E\big((X-\mathbb E(X))^k\big)\) (when finite).
Mean: \(\mu := \mathbb E(X)\), the expectation of \(X\); it is the first moment.
Variance is defined as the 2nd central moment, a.k.a. \(\sigma^2\).
Skewness is the standardized third moment, \(\mathbb E\big((X-\mu)^3\big)/\sigma^3\).
Kurtosis is the standardized fourth moment, \(\mathbb E\big((X-\mu)^4\big)/\sigma^4\).
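As a quick numeric illustration of these definitions (a made-up example, not from the notes), the moments of a fair six-sided die can be computed exactly with rational arithmetic:

```python
from fractions import Fraction

# Toy example: X uniform on {1,...,6} (a fair die).
outcomes = [Fraction(k) for k in range(1, 7)]

def E(f):
    # Expectation E(f(X)) under the uniform distribution.
    return sum(f(x) for x in outcomes) / len(outcomes)

mu = E(lambda x: x)                          # mean = first moment
var = E(lambda x: (x - mu) ** 2)             # variance = 2nd central moment
third = E(lambda x: (x - mu) ** 3)           # 3rd central moment (0 by symmetry)
kurt = E(lambda x: (x - mu) ** 4) / var**2   # standardized 4th moment (sigma^4 = var^2)

print(mu, var, third, kurt)  # 7/2 35/12 0 303/175
```

Note the skewness is 0 because the distribution is symmetric about its mean, so the third central moment vanishes before standardization.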
Theorem 1
Claim. If the kth moment of \(X\) exists, i.e. \(\mathbb E(|X|^k) < \infty\), then the jth moment exists for every \(1 \le j \le k\).
proof. For \(1 \le j \le k\) and every \(x\), \(|x|^j \le \max(1, |x|^k) \le 1 + |x|^k\),
so that \(\mathbb E(|X|^j) \le 1 + \mathbb E(|X|^k) < \infty\).
Variance
Variance is defined as the 2nd central moment, a.k.a. \(\sigma^2\): \(var(X) := \mathbb E\big((X - \mathbb E(X))^2\big)\).
Covariance between \(X,Y\) is \(cov(X, Y) := \mathbb E\big((X - \mathbb E(X))(Y - \mathbb E(Y))\big)\).
\(X, Y\) are uncorrelated if \(cov(X, Y) = \mathbb E(XY)-\mathbb E(X)\mathbb E(Y) = 0\)
Correlation of \(X,Y\) (defined when \(X,Y\) have finite second moments) is \(\rho(X, Y) := \dfrac{cov(X,Y)}{\sqrt{var(X)\,var(Y)}}\).
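These definitions can be checked on a small joint pmf (a hypothetical example): suppose \((X,Y)\) takes each of the four pairs below with probability 1/4.

```python
import math

# Toy joint distribution: each (x, y) pair occurs with probability 1/4.
pairs = [(0, 0), (1, 1), (1, 0), (2, 1)]

def E(f):
    # Expectation E(f(X, Y)) under the uniform joint pmf.
    return sum(f(x, y) for x, y in pairs) / len(pairs)

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_x = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
corr = cov / math.sqrt(var_x * var_y)

print(cov, corr)  # cov = 0.25, corr ~ 0.7071 (= 1/sqrt(2))
```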
Alternative Form of Variance
Claim \(var(X) = \mathbb E(X^2)- \mathbb E(X)^2\)
proof. By linearity of expectation, \(var(X) = \mathbb E\big((X - \mathbb E(X))^2\big) = \mathbb E\big(X^2 - 2X\,\mathbb E(X) + \mathbb E(X)^2\big) = \mathbb E(X^2) - 2\mathbb E(X)^2 + \mathbb E(X)^2 = \mathbb E(X^2) - \mathbb E(X)^2\).
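A quick numerical check of the identity on a fair die (hypothetical example): both forms of the variance agree exactly.

```python
from fractions import Fraction

# Verify var(X) = E(X^2) - E(X)^2 for X uniform on {1,...,6}.
outcomes = [Fraction(k) for k in range(1, 7)]
E = lambda f: sum(f(x) for x in outcomes) / len(outcomes)

mu = E(lambda x: x)
lhs = E(lambda x: (x - mu) ** 2)     # definitional form
rhs = E(lambda x: x ** 2) - mu ** 2  # alternative form
print(lhs, rhs)  # 35/12 35/12
```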
Alternative Form of Covariance
Claim \(cov(X,Y) = \mathbb E(XY)-\mathbb E(X)\mathbb E(Y)\)
proof. By linearity, \(cov(X,Y) = \mathbb E\big((X - \mathbb E(X))(Y - \mathbb E(Y))\big) = \mathbb E\big(XY - X\,\mathbb E(Y) - Y\,\mathbb E(X) + \mathbb E(X)\mathbb E(Y)\big) = \mathbb E(XY) - \mathbb E(X)\mathbb E(Y)\).
Variance under Linear Transformation
Claim \(var(aX+b) = a^2var(X)\)
proof. Since \(\mathbb E(aX+b) = a\,\mathbb E(X) + b\), we have \(var(aX+b) = \mathbb E\big((aX + b - a\,\mathbb E(X) - b)^2\big) = \mathbb E\big(a^2 (X - \mathbb E(X))^2\big) = a^2\,var(X)\).
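Checking the claim numerically for a fair die with (arbitrarily chosen) \(a = 3\), \(b = -7\):

```python
from fractions import Fraction

# Verify var(aX + b) = a^2 var(X) for X uniform on {1,...,6}.
outcomes = [Fraction(k) for k in range(1, 7)]
E = lambda f: sum(f(x) for x in outcomes) / len(outcomes)

def var(g):
    # Variance of g(X) via the definitional form.
    m = E(g)
    return E(lambda x: (g(x) - m) ** 2)

a, b = 3, -7
lhs = var(lambda x: a * x + b)
rhs = a ** 2 * var(lambda x: x)
print(lhs, rhs)  # 105/4 105/4
```

Note the shift \(b\) drops out entirely, as the proof predicts.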
Variance of Sums
Claim \(var(X+Y) = var(X)+ var(Y) + 2cov(X,Y)\)
proof. Write \(\mu_X = \mathbb E(X)\), \(\mu_Y = \mathbb E(Y)\). Then \(var(X+Y) = \mathbb E\big(((X-\mu_X) + (Y-\mu_Y))^2\big) = \mathbb E\big((X-\mu_X)^2\big) + \mathbb E\big((Y-\mu_Y)^2\big) + 2\,\mathbb E\big((X-\mu_X)(Y-\mu_Y)\big) = var(X) + var(Y) + 2\,cov(X,Y)\).
Corollary \(var(X+Y) = var(X) + var(Y)\) iff \(X, Y\) are uncorrelated.
Corollary \(var\big(\sum_{i=1}^n X_i\big) = \sum_{i=1}^n var(X_i)\) if the \(X_i\) are pairwise uncorrelated.
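The decomposition can be verified on a toy joint pmf (hypothetical example), where each pair below has probability 1/4:

```python
# Verify var(X + Y) = var(X) + var(Y) + 2 cov(X, Y).
pairs = [(0, 0), (1, 1), (1, 0), (2, 1)]
E = lambda f: sum(f(x, y) for x, y in pairs) / len(pairs)

var_x = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2

print(var_sum, var_x + var_y + 2 * cov)  # 1.25 1.25
```

Here \(cov(X,Y) = 0.25 \ne 0\), so the cross term genuinely contributes; \(var(X)+var(Y) = 0.75\) alone would undercount.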
Bounded random variable
Claim If \(X\) is bounded, then its variance is finite.
proof. If \(|X| \le M\), then \(|\mathbb E(X)| \le M\) and \(\mathbb E(X^2) \le M^2\), so that \(\sigma^2 = \mathbb E(X^2) - \mathbb E(X)^2 \le \mathbb E(X^2) \le M^2 < \infty\).
Zero Variance
Claim \(var(X) = 0\) iff \(P(X = c) = 1\) for some constant \(c\).
proof. \(var(X) = \mathbb E\big((X - \mathbb E(X))^2\big) = 0\) iff the nonnegative variable \((X - \mathbb E(X))^2\) equals \(0\) with probability \(1\), i.e. \(P(X = \mathbb E(X)) = 1\); take \(c = \mathbb E(X)\). Conversely, if \(P(X = c) = 1\) then \(\mathbb E(X) = c\) and \(var(X) = \mathbb E\big((c - c)^2\big) = 0\).