
Functions of a Random Variable

Change of Variable

Theorem Let $X$ be a r.v. and $Y = g(X)$ where $g:\mathbb R\rightarrow\mathbb R$ is a function. Then

  • if $X$ is discrete (see the sketch after this list),

$$\text{pmf}_Y(y) = \sum_{x: g(x) = y} \text{pmf}_X(x)$$

  • if $X$ is continuous and $g$ is an appropriate transformation,

$$\text{cdf}_Y(y) = \int_{x: g(x)\leq y}\text{pdf}_X(x)\,dx = P(\{x: g(x) \leq y\})$$

$$\text{pdf}_Y(y) = \frac{d}{dy} \text{cdf}_Y(y)$$
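A minimal sketch of the discrete case, assuming a toy pmf ($X$ uniform on $\{-2,...,2\}$) and $g(x) = x^2$, neither of which is from the notes:

```python
from collections import defaultdict

# assumed toy example: X uniform on {-2, -1, 0, 1, 2}, Y = g(X) = X^2
pmf_X = {x: 1 / 5 for x in (-2, -1, 0, 1, 2)}

pmf_Y = defaultdict(float)
for x, p in pmf_X.items():
    pmf_Y[x ** 2] += p      # sum pmf_X over the preimage {x : g(x) = y}

print(dict(pmf_Y))          # {4: 0.4, 1: 0.4, 0: 0.2}
```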

Theorem (probability integral transform) Let $F(x) = \text{cdf}_X(x)$ be continuous and strictly increasing. Then $F(X)\sim \text{uniform}(0, 1)$.

proof. Let $Y = F(X)$. For $y \in (0, 1)$,

$$\text{cdf}_Y(y) = P(F(X) \leq y) = P(X \leq F^{-1}(y)) = F(F^{-1}(y)) = y$$

which is exactly the cdf of $\text{uniform}(0, 1)$.
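A quick numerical check of the theorem, assuming $X \sim \text{Exponential}(1)$ so that $F(x) = 1 - e^{-x}$ (any continuous strictly increasing cdf would do): the transformed samples should have uniform deciles.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)   # X ~ Exponential(1)
u = 1 - np.exp(-x)                             # U = F(X)

# deciles of a uniform(0, 1) sample should sit near 0.1, 0.2, ..., 0.9
print(np.quantile(u, np.linspace(0.1, 0.9, 9)).round(3))
```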

Change of Single Variable

Theorem (change of variable) Let $X$ be a continuous r.v. and let the function $g$ be differentiable and injective. Then

$$\text{pdf}_Y(y) = \text{pdf}_X(g^{-1}(y))\left|\frac{d}{dy}g^{-1}(y)\right|$$

proof. WLOG assume $g$ is increasing (and an appropriate transformation), then

$$\begin{align*} \text{pdf}_Y(y) &= \frac{d}{dy}\text{cdf}_Y(y)\\ &= \frac{d}{dy}\int_{-\infty}^{g^{-1}(y)} \text{pdf}_X(x)\,dx\\ &= \frac{d}{dy}\,\text{cdf}_X(g^{-1}(y))\\ &= \text{pdf}_X(g^{-1}(y))\left|\frac{d}{dy}g^{-1}(y)\right| \end{align*}$$

where the last step is the chain rule; for increasing $g$ we have $\frac{d}{dy}g^{-1}(y) > 0$, and for decreasing $g$ the same computation picks up a minus sign that the absolute value absorbs.
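A numerical sketch of the formula, assuming $X \sim \text{Exponential}(1)$ and the increasing map $g(x) = \sqrt x$ (so $g^{-1}(y) = y^2$ and $|\frac{d}{dy}g^{-1}(y)| = 2y$): the predicted density $e^{-y^2}\cdot 2y$ should match a histogram of transformed samples.

```python
import numpy as np

rng = np.random.default_rng(0)
y_samples = np.sqrt(rng.exponential(size=200_000))   # Y = g(X) = sqrt(X)

hist, edges = np.histogram(y_samples, bins=50, density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
pdf_formula = np.exp(-mid ** 2) * 2 * mid   # pdf_X(g^{-1}(y)) |d/dy g^{-1}(y)|

print(np.max(np.abs(hist - pdf_formula)))   # small binning/sampling error
```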

Change of Variables for Multivariate Functions

Theorem For discrete random variables $\mathbf X = (X_1,...,X_n)$, let $\mathbf G:\mathbb R^n\rightarrow\mathbb R^m$ be the transformation s.t. $\mathbf Y = (Y_1,...,Y_m)$, $\mathbf Y = \mathbf G(\mathbf X)$, $Y_i = g_i(X_1, ..., X_n)$. Then

$$\text{pmf}_{\mathbf Y}(\mathbf y) = \sum_{\mathbf x: \mathbf G(\mathbf x) = \mathbf y}\text{pmf}_{\mathbf X}(\mathbf x)$$
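For instance, assuming two independent fair dice (a standard example, not from the notes), the pmf of the sum $Y = X_1 + X_2$ comes from summing the joint pmf over each preimage:

```python
from collections import defaultdict
from itertools import product

pmf_Y = defaultdict(float)
for x1, x2 in product(range(1, 7), repeat=2):
    pmf_Y[x1 + x2] += 1 / 36    # joint pmf of two independent fair dice

print(pmf_Y[7])                 # 6/36 ≈ 0.1667, the most likely sum
```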

Random variables $X_1,...,X_n$ are said to be independent and identically distributed (iid) if the $X_i$'s are independent and have the same distribution.

Example

The sum of $n$ independent Bernoulli trials with common success probability $p$ follows a binomial distribution.

proof. Let $X_i \sim \text{Bern.}(p)$ iid and $Y_n = \sum^n X_i$. We will prove by induction on $n$.

Obviously $Y_1 = X_1 \sim \text{Bern.}(p)\equiv \text{binomial}(1, p)$.

Assume $Y_k \sim \text{binomial}(k, p)$. Then

$$\begin{align*} P(Y_{k+1} = 0) &= P(Y_k=0, X_{k+1} = 0) \\ &= P(Y_k = 0)P(X_{k+1}=0)\\ &= {k\choose 0}(1-p)^k (1-p) \\ &= {k+1\choose 0}(1-p)^{k+1}\\ P(Y_{k+1}=j) &= P((Y_k = j, X_{k+1} = 0)\cup (Y_k = j-1, X_{k+1} = 1))\\ &= P(Y_k = j)P(X_{k+1} = 0) + P(Y_k = j-1)P(X_{k+1} = 1)\\ &= {k\choose j}p^j(1-p)^{k-j}(1-p) + {k\choose j-1}p^{j-1}(1-p)^{k-(j-1)}p\\ &= {k+1\choose j}p^j(1-p)^{k+1-j} &&\text{(Pascal's rule)}\\ Y_{k+1}&\sim \text{binomial} (k+1,p) \end{align*}$$
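A simulation check of this result, assuming scipy is available for the reference pmf and taking $n = 10$, $p = 0.3$ as an arbitrary test case:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
n, p = 10, 0.3
y = rng.binomial(1, p, size=(500_000, n)).sum(axis=1)   # sums of Bernoulli trials

empirical = np.bincount(y, minlength=n + 1) / len(y)
print(np.max(np.abs(empirical - binom.pmf(np.arange(n + 1), n, p))))  # ~1e-3
```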

Theorem

If $X, Y$ are independent continuous r.v.'s, then

$$\text{pdf}_{X+Y}(z) = \int \text{pdf}_X(x)\,\text{pdf}_Y(z-x)\,dx$$

proof. Let $Z = X + Y$. Then

$$\begin{align*} \text{cdf}_Z(z) &= P(X+Y \leq z)\\ &= \iint_{\{(x,y):\, x+y \leq z\}} \text{pdf}_{X,Y}(x,y)\,dy\,dx\\ &= \int_{-\infty}^{\infty} \int_{-\infty}^{z-x}\text{pdf}_X(x)\,\text{pdf}_Y(y)\,dy\,dx &&\text{(independence)}\\ &= \int_{-\infty}^{\infty} \text{pdf}_X(x)\, \text{cdf}_Y(z-x)\,dx\\ \text{pdf}_Z(z) &= \frac{d}{dz} \int_{-\infty}^{\infty} \text{pdf}_X(x)\, \text{cdf}_Y(z-x)\,dx\\ &= \int_{-\infty}^{\infty} \text{pdf}_X(x)\, \text{pdf}_Y(z-x)\,dx \end{align*}$$
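A numerical instance of the convolution formula, assuming $X, Y \sim \text{uniform}(0, 1)$ independent: the integral gives the triangular density on $[0, 2]$, which a Riemann sum reproduces.

```python
import numpy as np

def pdf_uniform(t):
    # density of uniform(0, 1), evaluated elementwise
    return ((0 <= t) & (t <= 1)).astype(float)

x = np.linspace(-1, 3, 4001)
dx = x[1] - x[0]
for z in (0.5, 1.0, 1.5):
    conv = np.sum(pdf_uniform(x) * pdf_uniform(z - x)) * dx   # Riemann sum
    print(z, round(conv, 3))   # triangular pdf: ≈ 0.5, 1.0, 0.5
```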

Example

For $X_1,...,X_n$ iid, let $Y_n = \max(X_1,...,X_n)$ and $Y_1 = \min(X_1,...,X_n)$. Then

$$\begin{align*} \text{cdf}_{Y_n}(y) &= P(\max(X_1,...,X_n) \leq y) \\ &= P(X_1 \leq y,...,X_n\leq y)\\ &= \prod^n P(X_i \leq y) \\ &= \text{cdf}_X(y)^n\\ \text{cdf}_{Y_1}(y) &= 1 - P(Y_1 > y)\\ &= 1 - P(\min(X_1,...,X_n) > y) \\ &= 1 - P(X_1 > y,...,X_n > y)\\ &= 1 - \prod^n P(X_i > y) \\ &= 1 - (1-\text{cdf}_X(y))^n\\ \text{cdf}_{Y_1,Y_n}(y_1,y_2)&= P(Y_1\leq y_1, Y_n\leq y_2)\\ &= P(Y_n\leq y_2) - P(Y_1 > y_1, Y_n\leq y_2)\\ &= \text{cdf}_X(y_2)^n - \prod^n P(y_1 < X_i \leq y_2)\\ &= \text{cdf}_X(y_2)^n - (\text{cdf}_X(y_2) - \text{cdf}_X(y_1))^n \end{align*}$$

(the joint cdf computation assumes $y_1 < y_2$; each product step uses independence)
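A simulation check of the max/min cdfs, assuming $X_i \sim \text{uniform}(0, 1)$ so that $\text{cdf}_{Y_n}(y) = y^n$ and $\text{cdf}_{Y_1}(y) = 1 - (1-y)^n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, y = 5, 0.8
samples = rng.uniform(size=(200_000, n))

print(np.mean(samples.max(axis=1) <= y), y ** n)            # both ≈ 0.328
print(np.mean(samples.min(axis=1) <= y), 1 - (1 - y) ** n)  # both ≈ 0.99968
```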

Theorem (change of variables)

For random variables $\mathbf X = (X_1,...,X_n)$, let $\mathbf G:\mathbb R^n\rightarrow\mathbb R^m$ be the transformation s.t. $\mathbf Y = (Y_1,...,Y_m)$, $\mathbf Y = \mathbf G(\mathbf X)$, $Y_i = g_i(X_1, ..., X_n)$. If $\mathbf G$ is injective and differentiable with $m = n$ (so that the Jacobian below is square), then

$$\text{pdf}_{\mathbf Y}(\mathbf y) = \text{pdf}_{\mathbf X}(\mathbf G^{-1}(\mathbf y))\left|\det\left(\frac{D}{D\mathbf y}\mathbf G^{-1}(\mathbf y)\right)\right|$$
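For example (a standard instance, not from the notes), take $n = 2$ and $\mathbf G(x_1, x_2) = (x_1 + x_2, x_1 - x_2)$, so $\mathbf G^{-1}(y_1, y_2) = (\frac{y_1+y_2}{2}, \frac{y_1-y_2}{2})$ and

$$\frac{D}{D\mathbf y}\mathbf G^{-1}(\mathbf y) = \begin{pmatrix} 1/2 & 1/2\\ 1/2 & -1/2 \end{pmatrix},\qquad |\det| = \frac12$$

hence $\text{pdf}_{\mathbf Y}(y_1, y_2) = \frac12\,\text{pdf}_{\mathbf X}(\frac{y_1+y_2}{2}, \frac{y_1-y_2}{2})$.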