What is the Error Function?
The error function (also called the Gaussian error function or Cramp function) is one way to give us probabilities for normally distributed random variables. More specifically, the error function gives us the probability that a normally distributed random variable, with mean 0 and standard deviation 1/√2, will fall into the range [-x, x].
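As a quick sanity check of that interpretation, the short Python snippet below (assuming NumPy and SciPy are installed) compares erf(x) with the corresponding normal probability:

```python
import numpy as np
from scipy.special import erf
from scipy.stats import norm

x = 1.0
sigma = 1 / np.sqrt(2)  # standard deviation 1/sqrt(2), i.e. variance 1/2

# P(-x <= X <= x) for X ~ N(0, 1/sqrt(2))
prob = norm.cdf(x, loc=0, scale=sigma) - norm.cdf(-x, loc=0, scale=sigma)

print(erf(x))   # 0.8427007929497149
print(prob)     # same value: erf(x) matches the probability
```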
The erf is a special function which gets its name from its importance in the study of errors. It is sometimes called the Gauss or Gaussian error function and occasionally a Cramp function [1].
Besides error theory, the error function is used in probability theory, mathematical physics (where it can be expressed as a special case of the Whittaker function), and a wide variety of other theoretical and practical applications. For example, the Fresnel integrals, which are closely related to the error function, appear in the theory of optics.
A related function is the complementary error function, which gives us the area in two tails.
Formula and Properties
The error function is defined by the following integral:
erf(x) = (2/√π) ∫₀ˣ e^(−t²) dt.
The integral is the area under the curve of a probability distribution, so it gives us probabilities. The factor 2/√π ensures that erf(x) approaches 1 as x approaches ∞, but some authors (e.g. [2]) omit this factor. The function has the following four properties:
- erf (-∞) = -1
- erf (+∞) = 1
- erf (-x) = -erf (x)
- erf (x*) = [erf (x)]*
(* denotes the complex conjugate, which flips the sign of the imaginary part. For example, a + bi → a – bi.)
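You can check these properties numerically. The sketch below assumes SciPy is installed; SciPy's erf also accepts complex arguments, which lets us test the conjugate property:

```python
import numpy as np
from scipy.special import erf

print(erf(-np.inf), erf(np.inf))   # -1.0  1.0

x = 0.7
print(erf(-x), -erf(x))            # same value: erf is an odd function

z = 0.5 + 0.25j
print(erf(np.conj(z)))             # equals the conjugate of erf(z)
print(np.conj(erf(z)))
```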
Graph of the Error Function
The error function is an odd function, which means it is symmetric about the origin.
Table of Values
For a full list of the table of values, download this pdf [3].
Approximation for Erf
If you have a programmable calculator, the following formula serves as a good approximation to the function. It is accurate to about 1 part in 10⁷ [4]:
erf(z) ≈ 1 – (a₁T + a₂T² + a₃T³ + a₄T⁴ + a₅T⁵) e^(−z²)
Where:
- T = 1 / (1 + (0.3275911 * z)),
- z = the value at which to evaluate the function (a z-score)
- a1 = 0.254829592
- a2 = -0.284496736
- a3 = 1.421413741
- a4 = -1.453152027
- a5 = 1.061405429
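If you would rather code this than key it into a calculator, here is a short Python sketch of the approximation, compared against Python's built-in math.erf:

```python
import math

# Coefficients a1..a5 from the list above
A = [0.254829592, -0.284496736, 1.421413741, -1.453152027, 1.061405429]

def erf_approx(z: float) -> float:
    """Polynomial approximation to erf(z) using the coefficients above.

    The formula is stated for z >= 0; the odd symmetry erf(-z) = -erf(z)
    handles negative inputs.
    """
    sign = -1.0 if z < 0 else 1.0
    z = abs(z)
    T = 1.0 / (1.0 + 0.3275911 * z)
    poly = sum(a * T ** (i + 1) for i, a in enumerate(A))
    return sign * (1.0 - poly * math.exp(-z * z))

for z in (0.5, 1.0, 2.0, -1.5):
    print(z, erf_approx(z), math.erf(z))  # agree to roughly 7 decimal places
```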
Complementary error function
The complementary error function gives the area under the two tails of a normal distribution curve with mean zero and standard deviation 1/√2 (i.e., variance ½). It is defined as [5]
erfc(x) = 1 − erf(x) = (2/√π) ∫ₓ^∞ e^(−t²) dt.
The complementary error function and the error function are complements, so the error function can equivalently be written as
erf(x) = 1 – erfc(x).
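The sketch below (again assuming SciPy is available) checks both the complement relationship and the two-tail interpretation:

```python
import numpy as np
from scipy.special import erf, erfc
from scipy.stats import norm

x = 1.2
sigma = 1 / np.sqrt(2)                 # N(0, 1/sqrt(2)), i.e. variance 1/2

print(erfc(x))                         # complementary error function
print(1 - erf(x))                      # same value: erfc(x) = 1 - erf(x)
print(2 * norm.sf(x, scale=sigma))     # area in the two tails beyond +/- x
```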
Why is it called the “error” function?
The “error” function is so-named because it was originally used to quantify the error between a theoretical value and an experimental measurement.
Historically, the normal distribution was called the law of errors and was used by Gauss in his study of astronomical observations [6]. The law of errors rests on two grounds [7]:
- the hypothesis that the error of any individual observation is the result of a combination of many comparable and independent components, and
- a comparison with the frequencies found in actual series of observations.
Gauss wasn’t the first to study errors: Galileo was the first scientist to note that measurement errors deserve a systematic and scientific treatment [8]. Gauss abandoned use of the normal error function in 1821, instead presenting an argument “making use of mathematical probability to assess uncertainty and make inferences” to justify the method [9, p. 158].
References
[1] Cramp Function. Retrieved March 9, 2022 from: https://p-distribution.com/cramp-function-distribution/
[2] Whittaker, E. T. and Watson, G. N. A Course in Modern Analysis, 4th ed. Cambridge, England: Cambridge University Press, 1990.
[3] Washington State University. Error Function. Retrieved November 27, 2019 from: http://courses.washington.edu/overney/privateChemE530/Handouts/Error%20Function.pdf
[4] Cheung. Properties of … erf(z) And … erfc(z). Retrieved November 27, 2019 from: http://www.sci.utah.edu/~jmk/papers/ERF01.pdf
[5] Kschischang, F. (2017). The Complementary Error Function. Retrieved September 11, 2023 from: https://www.comm.toronto.edu/~frank/notes/erfc.pdf
[6] Stahl, S. (2006). The evolution of the normal distribution. Mathematics Magazine, 79(2), 96–113.
[7] Jeffreys, H. (1938). The law of error and the combination of observations. Retrieved September 12, 2023 from: https://royalsocietypublishing.org/doi/10.1098/rsta.1938.0008#:~:text=The%20normal%20or%20Gaussian%20law,in%20actual%20series%20of%20observations.
[8] Appendix C: Gaussian Distribution
[9] Stigler, S. M. (1986). The History of Statistics: The Measurement of Uncertainty before 1900. Cambridge, MA: The Belknap Press of Harvard University Press.