Properties of the Distribution
Calculation of Mean, Variance, and Moment Generating Function
To calculate the mean of the Gamma distribution, we can simply integrate x · (Gamma PDF) over (0, ∞), per the definition of mathematical expectation.
> restart;
> with(plots, display):
> interface(showassumed=0);
> assume(alpha > 0); assume(beta > 0);
> f:=x->GammaPDF(alpha,beta,x);
> EX:=int(x*f(x),x=0..infinity);
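As a check on the Maple result, here is a short hand derivation, assuming GammaPDF(alpha, beta, x) is the scale parameterization f(x) = x^(α-1) e^(-x/β) / (Γ(α) β^α); this is the form consistent with the moment generating function (1 - βt)^(-α) obtained later in this section:

$$
E(X) = \int_0^\infty x\,\frac{x^{\alpha-1}e^{-x/\beta}}{\Gamma(\alpha)\,\beta^{\alpha}}\,dx
     = \frac{\Gamma(\alpha+1)\,\beta^{\alpha+1}}{\Gamma(\alpha)\,\beta^{\alpha}}
     = \alpha\beta .
$$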
Calculating Var(X), the variance of the Gamma(α, β) distribution.
We will employ the formula: Var(X) = E(X^2) - [E(X)]^2.
> E_X_SQ:=int((x^2)*f(x),x=0..infinity);
> VarX:=simplify(E_X_SQ-EX^2);
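Written out under the same assumed scale parameterization, the two pieces of this formula are:

$$
E(X^2) = \int_0^\infty x^{2}\,\frac{x^{\alpha-1}e^{-x/\beta}}{\Gamma(\alpha)\,\beta^{\alpha}}\,dx
       = \frac{\Gamma(\alpha+2)\,\beta^{\alpha+2}}{\Gamma(\alpha)\,\beta^{\alpha}}
       = \alpha(\alpha+1)\beta^{2},
\qquad
\operatorname{Var}(X) = \alpha(\alpha+1)\beta^{2} - (\alpha\beta)^{2} = \alpha\beta^{2}.
$$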
The Moment Generating Function (MGF) can be easily calculated via Maple. Recall the moment generating function of a random variable X is defined as M(t) = E(e^(tX)), provided this expectation exists. In taking the integral of e^(tx) · GammaPDF(x) over the range (0, ∞), the argument of the exponential term is (t - 1/β)x, and for the resulting integral to be convergent, we must have t - 1/β < 0. Equivalently, we require t < 1/β.
> assume(t<1/beta);
> int(exp(t*x)*f(x),x=0..infinity);
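Carried out by hand (again a sketch under the assumed scale parameterization), the integral reduces to a standard Gamma integral:

$$
E\!\left(e^{tX}\right)
  = \frac{1}{\Gamma(\alpha)\,\beta^{\alpha}} \int_0^\infty x^{\alpha-1} e^{-(1/\beta - t)x}\,dx
  = \frac{\Gamma(\alpha)}{\Gamma(\alpha)\,\beta^{\alpha}}\left(\frac{1}{\beta}-t\right)^{-\alpha}
  = (1-\beta t)^{-\alpha},
\qquad t < \frac{1}{\beta}.
$$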
So the moment generating function for a Gamma(α, β) random variable is given by
M(t) = (1 - βt)^(-α), for t < 1/β.
We will define M(t) as a function of t.
> restart; with(plots):
> M:=t->(1-beta*t)^(-alpha);
The moment generating function provides us with alternative ways to calculate the mean and variance by way of the formula:
M^(r)(0) = E(X^r), where M^(r)(t) denotes the r-th derivative of M(t) with respect to t.
This formula holds as long as M(t) exists in an open interval containing zero. See, for example, Mathematical Statistics and Data Analysis by John A. Rice for more on the moment generating function. Looking at the first derivative, we find:
> M_p:=diff(M(t),t);
> simplify(M_p);
> simplify(subs(t=0,M_p));
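For comparison with the Maple output, the first derivative can be written out explicitly (a quick hand computation from M(t) = (1 - βt)^(-α)):

$$
M'(t) = \alpha\beta\,(1-\beta t)^{-\alpha-1}, \qquad M'(0) = \alpha\beta .
$$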
This agrees with our previous result, E(X) = αβ. Turning to the second derivative, we find:
> M_pp:=diff(M_p,t);
> simplify(subs(t=0,M_pp));
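Likewise, differentiating once more by hand gives:

$$
M''(t) = \alpha(\alpha+1)\beta^{2}\,(1-\beta t)^{-\alpha-2}, \qquad M''(0) = \alpha(\alpha+1)\beta^{2}.
$$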
Therefore, E(X^2) = α(α + 1)β^2, which again is in agreement with the value calculated previously. The variance is now quickly calculated as
Var(X) = E(X^2) - [E(X)]^2
= α(α + 1)β^2 - (αβ)^2
= αβ^2.