Statistics Definitions > Moment Generating Function (MGF)

If you aren’t familiar with moments, you may want to read this article first: what is a moment?

## What is a Moment Generating Function?

Moment generating functions are a way to find moments like the mean (μ) and the variance (σ^{2}). They are an alternative way to represent a probability distribution with a simple one-variable function. **Each probability distribution has a unique MGF**, which means they are especially useful for solving problems like finding the distribution for sums of random variables. They can also be used in a proof of the Central Limit Theorem.

There isn’t an intuitive definition for exactly what an MGF *is*; it’s merely **a computational tool**.

## How to Find an MGF

Finding an MGF for a discrete random variable involves summation; for continuous random variables, calculus is used. It's actually very simple to create moment generating functions if you are comfortable with summation and/or differentiation and integration:

For a discrete random variable X:

M(t) = E(e^{tX}) = Σ_{x} e^{tx} f(x)

For a continuous random variable X:

M(t) = E(e^{tX}) = ∫_{-∞}^{∞} e^{tx} f(x) dx

For the above formulas, f(x) is the probability mass function (discrete case) or probability density function (continuous case) of X, and the integration range (listed as -∞ to ∞) will change depending on what range your function is defined for.
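The discrete summation can be carried out symbolically. As a quick sketch (not from the original article, and assuming the sympy library is available), here is the MGF of a fair coin, where X is 0 or 1, each with probability 1/2:

```python
import sympy as sp

t = sp.symbols('t')

# Discrete sketch: X = 0 or 1, each with probability 1/2.
# M(t) = sum over outcomes of e^{tx} * P(X = x)
p = sp.Rational(1, 2)
M = (1 - p) * sp.exp(0 * t) + p * sp.exp(1 * t)

print(sp.simplify(M))  # exp(t)/2 + 1/2
```

The result, (1 + e^{t})/2, is the MGF of a Bernoulli(1/2) random variable.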

**Example**: Find the MGF for e^{-x}.

**Solution**:

Step 1: Plug e^{-x} in for f(x) to get:

M(t) = ∫_{0}^{∞} e^{tx} e^{-x} dx = ∫_{0}^{∞} e^{(t − 1)x} dx

Note that I changed the lower bound to zero, because this function is only valid for values higher than zero.

Step 2: Integrate:

∫_{0}^{∞} e^{(t − 1)x} dx = [e^{(t − 1)x} / (t − 1)]_{0}^{∞} = 0 − 1/(t − 1) = 1/(1 − t)

The MGF is M(t) = 1 / (1 − t).

The moment generating function only exists when the integral converges to a particular number. The above integral *diverges* (spreads out) for t values of 1 or more, so this MGF exists only for values of t less than 1. Many MGFs are similarly defined only on a restricted range of t values. This is usually *not* an issue: in order to find expected values and variances, the MGF only needs to exist for t values in a small interval around zero.
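The integration above can be double-checked symbolically. A minimal sketch, assuming the sympy library is available (`conds='none'` asks sympy to return the value of the integral without attaching convergence conditions, which is why the t < 1 restriction must be remembered separately):

```python
import sympy as sp

t, x = sp.symbols('t x')

# M(t) = integral from 0 to oo of e^{tx} * e^{-x} dx = integral of e^{(t-1)x} dx
M = sp.integrate(sp.exp((t - 1) * x), (x, 0, sp.oo), conds='none')

# Matches the hand calculation: 1/(1 - t)
print(sp.simplify(M))
```

At t = 1 the integrand becomes e^{0} = 1, whose integral over (0, ∞) diverges, which is the convergence restriction discussed above.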

## Using the MGF

Once you’ve found the moment generating function, you can use it to find expected value, variance, and other moments.

M(0) = 1,

M'(0) = E(X),

M''(0) = E(X^{2}),

M'''(0) = E(X^{3}),

and so on;

Var(X) = M''(0) − [M'(0)]^{2}.
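Applying these rules to the MGF 1/(1 − t) found above recovers the mean and variance of the e^{-x} distribution. A sketch, assuming the sympy library is available:

```python
import sympy as sp

t = sp.symbols('t')
M = 1 / (1 - t)   # MGF of the e^{-x} example above

EX  = sp.diff(M, t).subs(t, 0)      # M'(0)  = E(X)
EX2 = sp.diff(M, t, 2).subs(t, 0)   # M''(0) = E(X^2)
var = EX2 - EX**2                   # Var(X) = M''(0) - [M'(0)]^2

print(EX, EX2, var)  # 1 2 1
```

So E(X) = 1 and Var(X) = 1, the familiar mean and variance of the exponential distribution with rate 1.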

**Example:** Find E(X^{3}) using the MGF (1-2t)^{-10}.

Step 1: Find the third derivative of the function (the list above defines M'''(0) as being equal to E(X^{3}); before you can evaluate the derivative at 0, you first need to find it):

M'''(t) = (−2)^{3}(−10)(−11)(−12)(1 − 2t)^{-13}

Step 2: Evaluate the derivative at 0:

M'''(0) = (−2)^{3}(−10)(−11)(−12)(1 − 2(0))^{-13}

= (−2)^{3}(−10)(−11)(−12)(1)

= 10,560.

**Solution**: E(X^{3}) = 10,560.
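The derivative work above can be verified symbolically. A sketch, assuming the sympy library is available:

```python
import sympy as sp

t = sp.symbols('t')
M = (1 - 2 * t) ** -10   # the given MGF

# E(X^3) = M'''(0): take the third derivative, then evaluate at t = 0
EX3 = sp.diff(M, t, 3).subs(t, 0)

print(EX3)  # 10560
```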



So let me see if I understand. The mean and variance are typically the first and second moments, respectively. However, for some cases, where the mean and variance don’t work, we would use a moment generating function to come up with the first and second moments in order to proceed?

If so, why wouldn’t the mean/variance work? Because of the shape of the distribution? Or because of the nature of the question we are trying to solve? Also, would we just treat the first and second moments calculated using the MGF the same as mean and variance for doing additional analysis (e.g. hypothesis testing)?

Thanks.

I wouldn’t say that finding the mean/variance wouldn’t work… it’s more that MGFs are sometimes easier to work with (e.g., if you have a distribution that is a sum of random variables).

“Also, would we just treat the first and second moments calculated using the MGF the same as mean and variance”…yes, as you can use the MGF to prove that your mean is μ.

“…for doing additional analysis (e.g. hypothesis testing)?” This is a big *depends*. Could you be more explicit with what kind of hypothesis testing you are doing on what kind of data?

How can I find the mean of an MGF about the mean μ?

Evaluate the first derivative of the MGF at t = 0.