## What is Extra Sums of Squares?

**Extra Sums of Squares (ESS)** is the difference between the Error Sums of Squares (SSE) of two models. More specifically, ESS measures the marginal reduction in SSE when an additional set of predictors is added to the model.

It is a tool for model comparison that summarizes the comparison in a single number. If ESS = 0, the added predictors explain no variation beyond what the reduced model already explains.

## Formula and Example

The formula for Extra Sums of Squares is:

**ESS = SSE(reduced model) – SSE(full model).**

Let’s say your model contains one predictor variable, X_{1}. If you add a second predictor, X_{2}, to the model, ESS is the additional variation explained by X_{2}. We can write that as:

SSR(X_{2} | X_{1}).

In terms of SSE: starting from the model with X_{1} alone, adding X_{2} explains part of the SSE left unexplained by the original variable (X_{1}). We can write that as:

SSR(X_{2} | X_{1}) = SSE(X_{1}) – SSE(X_{1}, X_{2})
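The formula above can be checked numerically: fit the reduced model (X_{1} only) and the full model (X_{1} and X_{2}), compute each model's SSE, and subtract. Here is a minimal sketch using NumPy least squares; the data are made up purely for illustration.

```python
import numpy as np

# Hypothetical data: y depends on both x1 and x2 (plus noise).
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 + 0.8 * x2 + rng.normal(size=n)

def sse(X, y):
    """Error (residual) sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

ones = np.ones(n)
sse_reduced = sse(np.column_stack([ones, x1]), y)      # SSE(X1): reduced model
sse_full = sse(np.column_stack([ones, x1, x2]), y)     # SSE(X1, X2): full model

ess = sse_reduced - sse_full   # SSR(X2 | X1)
print(ess)
```

Because the reduced model is nested inside the full model, the full model's SSE can never exceed the reduced model's, so ESS is always nonnegative; here it is well above zero since y genuinely depends on x2.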
