James-Stein Estimator: Definition, Formulas

What is the James-Stein Estimator?

It’s common in statistics to take averages to make predictions. For example, the sample mean (the average score from all samples) is used as an estimator for the population mean. James-Stein estimators improve upon these averages by shrinking them towards a more central average. The technique is named after Willard James and Charles Stein, whose 1961 paper simplified Stein’s original 1956 method.

Calculations

The basic steps are:

  1. Calculate the sample mean (x̄).
  2. “Shrink” individual scores towards x̄: reduce larger values and increase smaller values. Each of these shrunken values is a James-Stein estimator, z.

The basic formula for the James-Stein estimator is:
z = x̄ + c(y – x̄)
Where:

  • (y – x̄) = difference between an individual score and the sample mean,
  • c = a shrinking factor.
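As a minimal sketch of this step in Python (the function name is my own, and the shrinking factor c is simply taken as given here; its formula appears below):

  # Sketch: apply James-Stein shrinkage for a given shrinking factor c.
  def james_stein(scores, c):
      """Return z = x̄ + c(y − x̄) for each individual score y."""
      x_bar = sum(scores) / len(scores)              # sample mean, x̄
      return [x_bar + c * (y - x_bar) for y in scores]

For example, james_stein([2, 4, 6, 8, 10], c=0.8) moves each score 20 percent of the way towards the sample mean of 6, returning [2.8, 4.4, 6.0, 7.6, 9.2].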

Other formulas exist, but they all have the shrinking factor in common. For example, instead of the sample mean you could shrink towards the mean of a prior distribution (m); in that case, x̄ would be replaced by m. The shrinking factor’s value is calculated after collecting the sample data and is given by the formula:
c = 1 − [(k − 3)σ²] / Σ(y − x̄)²
Where:

  • y = an individual value,
  • x̄ = sample mean,
  • k = number of unknown means (must be 4 or more for this version of the formula to shrink),
  • σ² = the (known) variance.

The shrinking factor’s value should be less than 1. For example, a value of 0.3 shrinks each score’s distance from the sample mean by about 70 percent.
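As an illustrative sketch in Python (assuming, as in the classic setup, that the variance σ² is known; the function name and sample values are my own):

  import numpy as np

  def shrinking_factor(y, sigma2):
      # c = 1 - (k - 3) * sigma^2 / sum((y - x_bar)^2)
      y = np.asarray(y, dtype=float)
      k = len(y)                           # number of unknown means
      x_bar = y.mean()                     # sample mean
      return 1 - (k - 3) * sigma2 / np.sum((y - x_bar) ** 2)

  y = [2.0, 4.0, 6.0, 8.0, 10.0]           # five observed values (assumed data)
  c = shrinking_factor(y, sigma2=4.0)      # = 1 - (2 * 4) / 40 = 0.8

With these assumed numbers, c = 0.8, so each value keeps 80 percent of its distance from the sample mean (a shrinkage of 20 percent).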

James-Stein Estimators vs. Sample Means

The James-Stein estimator is a significant departure from the “traditional” school of thought, which holds that the sample mean is the best estimator for the population mean. James and Stein proved that a better estimator than this “perfect” estimator exists, which seems to be something of a paradox. The catch is that the James-Stein estimator only outperforms the sample mean when there are several unknown population means (three or more), not just one. The means do not have to be related, but they do have to be carefully chosen: combining completely unrelated means will give you a result, but it will be a nonsensical one. Bradley Efron and Carl Morris (1977) offer the extreme example of combining batting averages in baseball with proportions of imported cars; you can calculate a mean for these, but it will make no sense at all.
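The claim that shrinkage beats the raw values can be checked with a small simulation (a sketch under assumed true means and variance, not taken from the article):

  import numpy as np

  rng = np.random.default_rng(0)
  theta = np.array([1.0, 3.0, 5.0, 7.0, 9.0])    # true means (assumed for the demo)
  sigma2 = 4.0                                    # known variance
  trials, wins = 10_000, 0
  for _ in range(trials):
      y = rng.normal(theta, np.sqrt(sigma2))      # one observation per unknown mean
      c = 1 - (len(y) - 3) * sigma2 / np.sum((y - y.mean()) ** 2)
      z = y.mean() + c * (y - y.mean())           # James-Stein estimates
      wins += np.sum((z - theta) ** 2) < np.sum((y - theta) ** 2)
  print(f"James-Stein had smaller total squared error in {wins / trials:.0%} of trials")

On a typical run the shrunken estimates come out ahead more often than not, and their average total squared error is smaller, which is Stein’s paradox in action.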

References

Efron, B. and Morris, C. (1977). “Stein’s Paradox in Statistics.” Scientific American, 236(5), pp. 119–127.
James, W. and Stein, C. (1961). “Estimation with Quadratic Loss.” Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1. Berkeley, CA: University of California Press, pp. 361–379.
Stein, C. (1956). “Inadmissibility of the Usual Estimator for the Mean of a Multivariate Normal Distribution.” Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1. Berkeley, CA: University of California Press, pp. 197–208.

