Gauss Markov Theorem & Assumptions

Gauss Markov Theorem

The Gauss Markov theorem tells us that if a certain set of assumptions is met, the ordinary least squares (OLS) estimate of the regression coefficients is the best linear unbiased estimate (BLUE) possible.
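
To make this concrete, here is a minimal sketch in Python with NumPy (the data and variable names are made up for illustration) that computes the OLS coefficients directly from the normal equations:

  import numpy as np

  # Illustrative data: y depends linearly on one regressor x, plus noise.
  rng = np.random.default_rng(0)
  x = rng.uniform(0, 10, size=50)
  y = 2.0 + 3.0 * x + rng.normal(0, 1, size=50)

  # Design matrix with an intercept column.
  X = np.column_stack([np.ones_like(x), x])

  # OLS estimate: beta_hat = (X'X)^(-1) X'y, solved as a linear system.
  beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
  print(beta_hat)  # roughly [2.0, 3.0]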

Gauss Markov Assumptions

There are five Gauss Markov assumptions (also called conditions):

  1. Linearity: the model must be linear in the parameters we are estimating with the OLS method.
  2. Random: our data must have been randomly sampled from the population.
  3. Non-Collinearity: the regressors being calculated aren’t perfectly correlated with each other.
  4. Exogeneity: the regressors aren’t correlated with the error term.
  5. Homoscedasticity: no matter what the values of our regressors might be, the variance of the error term is constant.

Purpose of the Assumptions

The Gauss Markov assumptions guarantee the validity of ordinary least squares for estimating regression coefficients.

Checking how well our data matches these assumptions is an important part of estimating regression coefficients. When you know where these conditions are violated, you may be able to plan ways to change your experiment setup to help your situation fit the ideal Gauss Markov situation more closely.
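
As a rough sketch of what such a check might look like (continuing the illustrative NumPy example above; these are informal eyeball checks, not formal tests):

  import numpy as np

  # Illustrative data and OLS fit, as in the earlier sketch.
  rng = np.random.default_rng(0)
  x = rng.uniform(0, 10, size=200)
  y = 2.0 + 3.0 * x + rng.normal(0, 1, size=200)
  X = np.column_stack([np.ones_like(x), x])
  beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

  residuals = y - X @ beta_hat
  fitted = X @ beta_hat

  # Exogeneity: residuals should be roughly uncorrelated with the regressor.
  print(np.corrcoef(x, residuals)[0, 1])              # close to 0

  # Homoscedasticity: residual spread should look similar for low and high fitted values.
  low = fitted < np.median(fitted)
  print(residuals[low].std(), residuals[~low].std())  # similar magnitudes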

In practice, the Gauss Markov assumptions are rarely all met perfectly, but they are still useful as a benchmark, and because they show us what ‘ideal’ conditions would be. They also allow us to pinpoint problem areas that might cause our estimated regression coefficients to be inaccurate or even unusable.

The Gauss-Markov Assumptions In Algebra

We can summarize the Gauss-Markov Assumptions succinctly in algebra, by saying that a linear regression model represented by

yi = xi‘ β + εi

and generated by the ordinary least squares estimate is the best linear unbiased estimate (BLUE) possible if

  • E{εi} = 0, i = 1, . . . , N
  • {ε1, . . . , εN} and {x1, . . . , xN} are independent
  • cov{εi, εj} = 0, i, j = 1, . . . , N, i ≠ j
  • V{εi} = σ², i = 1, . . . , N

The first of these assumptions can be read as “the expected value of the error term is zero.” The second assumption is exogeneity (the error terms are independent of the regressors), the third says the error terms are uncorrelated with each other, and the fourth is homoscedasticity (the error variance is constant).
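
To illustrate the “unbiased” part of BLUE, here is a small simulation sketch (NumPy, with made-up parameter values): when the error terms satisfy the four conditions above, the OLS estimates average out to the true coefficients over many repeated samples.

  import numpy as np

  rng = np.random.default_rng(1)
  true_beta = np.array([2.0, 3.0])   # illustrative intercept and slope
  n, reps = 100, 5000

  estimates = np.empty((reps, 2))
  for r in range(reps):
      x = rng.uniform(0, 10, size=n)
      X = np.column_stack([np.ones_like(x), x])
      # Errors: zero mean, independent of x, uncorrelated, constant variance.
      eps = rng.normal(0, 1.5, size=n)
      y = X @ true_beta + eps
      estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)

  # The average of the OLS estimates is close to the true coefficients.
  print(estimates.mean(axis=0))      # approximately [2.0, 3.0]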
