Properties of the Distribution

Calculation of Conditional Mean and Variance

Because we are dealing with a joint distribution of two variables, we will consider the conditional means and variances of X and Y for fixed y and x, respectively. The means and variances of the marginal distributions were given in the first section of the worksheet. Interestingly, the conditional distributions of X and Y are themselves normal, as an examination of the conditional densities below will show.

> restart:

> with(plots,display,textplot3d):

> f(x,y):=exp((-1/(2*(1-rho^2)))*(((x-mu1)/sigma1)^2-2*rho*(x-mu1)*(y-mu2)/(sigma1*sigma2)+((y-mu2)/sigma2)^2))/(2*Pi*sigma1*sigma2*sqrt(1-rho^2)):

> assume(rho>-1):additionally(rho<1):

> assume(sigma1>0):assume(sigma2>0):

> g(x):=int(f(x,y),y=-infinity..infinity):
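
As a quick check, simplifying g(x) confirms that the marginal distribution of X is the univariate normal distribution with mean mu1 and variance sigma1^2 described in the first section of the worksheet:

> simplify(g(x));

Maple should return exp(-(x-mu1)^2/(2*sigma1^2))/(sqrt(2*Pi)*sigma1), possibly written in an algebraically equivalent form.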

So, the conditional density of Y given X = x is

> f[givenX](y):=simplify((f(x,y)/g(x)));

f[givenX](y) := exp(-(y - mu2 - rho*sigma2*(x - mu1)/sigma1)^2/(2*sigma2^2*(1 - rho^2)))/(sqrt(2*Pi)*sigma2*sqrt(1 - rho^2))

and the conditional expectation of Y given X = x is

> EY[givenX]:=simplify(int(y*f[givenX](y),y=-infinity..infinity));

EY[givenX] := mu2 + rho*sigma2*(x - mu1)/sigma1

> E_Y_SQ[givenX]:=int((y^2)*f[givenX](y),y=-infinity..infinity);

E_Y_SQ[givenX] := (mu2 + rho*sigma2*(x - mu1)/sigma1)^2 + sigma2^2*(1 - rho^2)

Calculating the conditional variance using the usual computational formula Var(Y | X = x) = E(Y^2 | X = x) - [E(Y | X = x)]^2:

> VarY[givenX]:=simplify(E_Y_SQ[givenX]-EY[givenX]^2);

VarY[givenX] := sigma2^2*(1 - rho^2)
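
As a numerical sanity check, we can substitute arbitrary, purely illustrative parameter values and integrate the conditional density directly (the name params is introduced here just to hold the substitutions). With the values below, sigma2^2*(1 - rho^2) = 4*(3/4) = 3, so the numerical integral should return approximately 3.:

> params:=mu1=0,mu2=3,sigma1=1,sigma2=2,rho=1/2,x=1:

> evalf(Int(subs(params,(y-EY[givenX])^2*f[givenX](y)),y=-infinity..infinity));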

Similarly, the conditional mean and variance for X given Y = y are mu1 + rho*sigma1*(y - mu2)/sigma2 and sigma1^2*(1 - rho^2), respectively.
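
These follow from the symmetric calculation; here is a short sketch of the same steps (the names g2, f[givenY], EX[givenY], and VarX[givenY] are introduced here only for this check):

> g2(y):=int(f(x,y),x=-infinity..infinity):

> f[givenY](x):=simplify(f(x,y)/g2(y)):

> EX[givenY]:=simplify(int(x*f[givenY](x),x=-infinity..infinity));

> VarX[givenY]:=simplify(int((x-EX[givenY])^2*f[givenY](x),x=-infinity..infinity));

Maple should return mu1 + rho*sigma1*(y - mu2)/sigma2 for the conditional mean and sigma1^2*(1 - rho^2) for the conditional variance, matching the formulas above.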

Moment Generating Function for the Bivariate Normal Distribution

The joint moment generating function for two random variables X and Y is given by M[X,Y](t[1], t[2]) = E(exp(t[1]*X + t[2]*Y)).

We now find this MGF for the bivariate normal distribution.

> restart:

> with(plots,display,textplot3d): with(student):

> f(x,y):=exp((-1/(2*(1-rho^2)))*(((x-mu1)/sigma1)^2-2*rho*(x-mu1)*(y-mu2)/(sigma1*sigma2)+((y-mu2)/sigma2)^2))/(2*Pi*sigma1*sigma2*sqrt(1-rho^2)):

> assume(rho>-1):additionally(rho<1):

> assume(sigma1>0):assume(sigma2>0):

> value(Doubleint(f(x,y)*exp(t[1]*x+t[2]*y),x=-infinity..infinity,y=-infinity..infinity));

exp(t[1]*mu1 + t[2]*mu2 + (sigma1^2*t[1]^2 + 2*rho*sigma1*sigma2*t[1]*t[2] + sigma2^2*t[2]^2)/2)

So, the MGF of a bivariate normal distribution is given by exp(t[1]*mu1 + t[2]*mu2 + (sigma1^2*t[1]^2 + 2*rho*sigma1*sigma2*t[1]*t[2] + sigma2^2*t[2]^2)/2).

We now define this MGF as a function of t[1] and t[2], writing the parameters with the indexed names mu[1], mu[2], sigma[1], and sigma[2].

> M[X,Y](t[1],t[2]):=exp(t[1]*mu[1]+t[2]*mu[2]+(sigma[1]^2*t[1]^2+2*rho*sigma[1]*sigma[2]*t[1]*t[2]+sigma[2]^2*t[2]^2)/2);

M[X, Y](t[1], t[2]) := exp(t[1]*mu[1] + t[2]*mu[2] + (sigma[1]^2*t[1]^2 + 2*rho*sigma[1]*sigma[2]*t[1]*t[2] + sigma[2]^2*t[2]^2)/2)
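
One immediate payoff of the closed-form MGF: when rho = 0 it factors into the product of the two marginal normal MGFs, which is the MGF characterization of independence. Here is a quick sketch, where M[X] and M[Y] are the standard univariate normal MGFs written out by hand:

> M[X](t[1]):=exp(t[1]*mu[1]+sigma[1]^2*t[1]^2/2):

> M[Y](t[2]):=exp(t[2]*mu[2]+sigma[2]^2*t[2]^2/2):

> simplify(expand(subs(rho=0,M[X,Y](t[1],t[2]))-M[X](t[1])*M[Y](t[2])));

The result should be 0, confirming that rho = 0 implies independence for the bivariate normal distribution.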

The joint MGF gives us an alternative way of finding the mean and variance of each marginal distribution, as well as an alternative way of finding Cov( X, Y ), by way of the following formulas:

E(X) = (d/dt[1]) M[X,Y](t[1], t[2]) at t[1] = t[2] = 0    E(X^2) = (d^2/dt[1]^2) M[X,Y](t[1], t[2]) at t[1] = t[2] = 0

E(Y) = (d/dt[2]) M[X,Y](t[1], t[2]) at t[1] = t[2] = 0    E(Y^2) = (d^2/dt[2]^2) M[X,Y](t[1], t[2]) at t[1] = t[2] = 0

E(XY) = (d^2/dt[1] dt[2]) M[X,Y](t[1], t[2]) at t[1] = t[2] = 0,    so that Cov(X, Y) = E(XY) - E(X)*E(Y)

Let's start by finding the mean of the marginal distribution of X :

> EX:=simplify(subs(t[1]=0,t[2]=0,diff(M[X,Y](t[1],t[2]),t[1])));

EX := mu[1]

which is what we expected. Now we use the computational formula for variance to find the variance of the marginal distribution of X :

> E_X_SQ:=simplify(subs(t[1]=0,t[2]=0,diff(M[X,Y](t[1],t[2]),t[1]$2)));

E_X_SQ := sigma[1]^2 + mu[1]^2

> VarX:=E_X_SQ-EX^2;

VarX := sigma[1]^2

which is also what we expected. Similarly, we will use the computational formula for covariance to find Cov( X , Y ):

> E_XY:=simplify(subs(t[1]=0,t[2]=0,diff(M[X,Y](t[1],t[2]),t[1],t[2])));

E_XY := rho*sigma[1]*sigma[2] + mu[1]*mu[2]

> EY:=simplify(subs(t[1]=0,t[2]=0,diff(M[X,Y](t[1],t[2]),t[2])));

EY := mu[2]

> CovXY:=E_XY-EX*EY;

CovXY := rho*sigma[1]*sigma[2]
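
Finally, the same approach recovers Var( Y ), and dividing Cov( X, Y ) by sigma[1]*sigma[2] confirms that the parameter rho is exactly the correlation coefficient of X and Y (the names E_Y_SQ and VarY are introduced here just for this check):

> E_Y_SQ:=simplify(subs(t[1]=0,t[2]=0,diff(M[X,Y](t[1],t[2]),t[2]$2)));

> VarY:=E_Y_SQ-EY^2;

> simplify(CovXY/(sigma[1]*sigma[2]));

Maple should return sigma[2]^2 for the variance and rho for the last expression, since Cov( X, Y ) = rho*sigma[1]*sigma[2] while the marginal standard deviations are sigma[1] and sigma[2].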