
## What is the Stable Distribution?

**Stable distributions** are a general family of probability distributions that share certain properties. They were first described by Paul Lévy (1925) and so are also sometimes informally called *Lévy distributions*. However, this can cause confusion, as a "Lévy distribution" is actually a specific member of the stable distribution family.

Most of these distributions do not have a closed-form probability density function (the exceptions are the Cauchy distribution, the Lévy distribution, and the normal distribution), but they do share certain properties, like skewness and heavy tails.

## Properties of Stable Distributions

The general stable distribution has four parameters (Barndorff-Nielsen et al., 2001):

- **Index of stability**: α ∈ (0, 2]. This parameter determines the probability in the extreme tails (i.e. it tells you something about the heaviness of the tails). A normal distribution has α = 2; distributions with α < 2 are more heavy-tailed. The Cauchy distribution has α = 1.
- **Skewness parameter**: β. If β = 0, the distribution is symmetric. A normal distribution has β = 0.
- **Scale parameter**: γ. A measure of dispersion. For the normal distribution, γ² equals half the population variance (γ = σ/√2). For other symmetric distributions in the family, one suggested estimator excludes the top and bottom 28% of observations (Fama and Roll, 1968).
- **Location parameter**: δ. This parameter equals the median; when α > 1, it also equals the mean. For a normal distribution, the sample mean can be used as an estimate of δ. For other distributions, it may be necessary to discard extreme values to get a good estimate of δ; depending on how heavy the tails are, you may need to exclude the first and last quartiles and use only the central half of the observations.
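To see how α controls the tails, you can sample from SciPy's `levy_stable` distribution. This is a quick sketch: SciPy's `alpha`, `beta`, `loc`, and `scale` arguments correspond to α, β, δ, and γ above, and the α values and cutoff of 10 are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
n = 20_000

# alpha = 1 is the Cauchy distribution; alpha = 1.9 is close to normal.
heavy = levy_stable.rvs(alpha=1.0, beta=0.0, size=n, random_state=rng)
light = levy_stable.rvs(alpha=1.9, beta=0.0, size=n, random_state=rng)

# Fraction of draws far out in the tails: smaller alpha, heavier tails.
heavy_tail_frac = np.mean(np.abs(heavy) > 10)
light_tail_frac = np.mean(np.abs(light) > 10)
print(heavy_tail_frac, light_tail_frac)
```

For α = 1 (Cauchy), roughly 6% of draws land beyond ±10; for α = 1.9 it is a tiny fraction of that.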

One of the most important mathematical characteristics of stable distributions is that **they retain the same α and β under convolution of random variables**: convolution is the operation that combines the density functions f and g of two independent random variables into the density function of their sum, so the sum of independent stable variables with a given α and β is again stable with that same α and β.
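This closure under sums can be checked empirically for the Cauchy case (α = 1, β = 0): the sum of two independent standard Cauchy variables is again Cauchy, with scale γ = 2. A minimal sketch using NumPy, with sample sizes chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Two independent standard Cauchy samples (alpha = 1, beta = 0, gamma = 1).
x = rng.standard_cauchy(n)
y = rng.standard_cauchy(n)
s = x + y  # the sum is again Cauchy, with scale gamma = 2

# The IQR of a Cauchy(0, gamma) distribution is 2 * gamma, so the sum's
# IQR should be about 4, double that of a single sample (about 2).
iqr_single = np.subtract(*np.percentile(x, [75, 25]))
iqr_sum = np.subtract(*np.percentile(s, [75, 25]))
print(iqr_single, iqr_sum)
```

Quantile-based summaries like the IQR are used here because the Cauchy distribution has no finite variance to compare.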

## Advantages and Disadvantages for Practical Use

A very useful property of stable distributions is that they are scalable (up to a certain factor) — a small part of the distribution looks just like the whole.
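One striking consequence of this scaling behavior in the Cauchy case (α = 1) is that the average of n independent Cauchy variables has exactly the same distribution as a single one, so averaging does not "tame" the data at all. A quick empirical check, with the group size of 50 chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)

# 10,000 sample means, each taken over n = 50 standard Cauchy draws.
means = rng.standard_cauchy((10_000, 50)).mean(axis=1)

# For a standard Cauchy the IQR is 2; the sample means match it, whereas
# for a normal distribution the IQR would shrink like 1/sqrt(n).
iqr_means = np.subtract(*np.percentile(means, [75, 25]))
print(iqr_means)
```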

**A major drawback to the use of stable distributions is that, for α < 2, any moment of order α or greater is undefined.** That means that, in general, any theory based on the variance (the second moment) isn't useful. However, it's sometimes possible to modify the distributions, for example by truncating the tails. This does require you to have a good grasp of the particular distribution you're dealing with, as well as of the discipline from which your data was drawn in the first place. For example, Paul and Baschnagel describe how a Lévy distribution can be used to model a human heartbeat if the tails are truncated, but you would only know to truncate the tails if you were aware that deviations in heartbeats can't be arbitrarily large.
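As a toy illustration of the truncation idea (this uses a hypothetical cutoff of ±10, not the heartbeat model from Paul and Baschnagel): the sample variance of raw Cauchy data is dominated by a few extreme draws, while the truncated sample has a bounded and therefore meaningful variance.

```python
import numpy as np

rng = np.random.default_rng(7)
raw = rng.standard_cauchy(100_000)

# Truncate the tails: keep only observations with |x| <= 10
# (a hypothetical cutoff chosen purely for illustration).
truncated = raw[np.abs(raw) <= 10]

var_raw = raw.var()
var_truncated = truncated.var()
print(var_raw, var_truncated)
```

Because every retained value is bounded by 10, the truncated variance can never exceed 100, whereas the raw sample variance is driven arbitrarily high by the handful of extreme observations.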

**References:**

Barndorff-Nielsen, O. E., Mikosch, T., and Resnick, S. I. (2001). Lévy Processes: Theory and Applications.

Fama, E. F. and Roll, R. (1968). Some Properties of Symmetric Stable Distributions.

Lévy, P. (1925). Calcul des probabilités.

Mann, J. S. and Heifner, R. G. (1976). The Distribution of Shortrun Commodity Price Movements. U.S. Department of Agriculture, Economic Research Service.

Paul, W. and Baschnagel, J. Stochastic Processes: From Physics to Finance.
