Range (statistics)

In statistics, the range of a set of data is the difference between the largest and smallest values.[1]

In descriptive statistics, the range has a more precise meaning: it is the size of the smallest interval which contains all the data, and it provides an indication of statistical dispersion. It is measured in the same units as the data. Since it depends on only two of the observations, it is most useful in representing the dispersion of small data sets.[2]
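As a minimal illustration (the data values here are arbitrary), the range is simply the largest observation minus the smallest:

```python
# Range as a simple measure of dispersion: largest value minus smallest value.
data = [7, 1, 4, 9, 3]
data_range = max(data) - min(data)
print(data_range)  # 9 - 1 = 8
```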

Independent identically distributed continuous random variables

Let X1, X2, ..., Xn be n independent and identically distributed continuous random variables with cumulative distribution function G(x) and probability density function g(x). Let T denote the range of a sample of size n from a population with distribution function G(x).


The range has cumulative distribution function[3][4]

F(t) = n \int_{-\infty}^{\infty} g(x) \left[ G(x+t) - G(x) \right]^{n-1} \, dx

Gumbel notes that the "beauty of this formula is completely marred by the facts that, in general, we cannot express G(x + t) by G(x), and that the numerical integration is lengthy and tiresome."[3]
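The "lengthy and tiresome" numerical integration Gumbel refers to is easy to carry out with modern quadrature routines. The sketch below (assuming SciPy is available; the sample size n = 5 and threshold t = 2 are illustrative) evaluates the integral for standard normal variables and checks it against a Monte Carlo estimate:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def range_cdf(t, n):
    """P(T <= t) for the range T of n i.i.d. standard normal variables,
    computed as n * integral of g(x) * [G(x + t) - G(x)]^(n - 1) dx."""
    integrand = lambda x: stats.norm.pdf(x) * (stats.norm.cdf(x + t) - stats.norm.cdf(x)) ** (n - 1)
    val, _ = quad(integrand, -np.inf, np.inf)
    return n * val

# Monte Carlo check: simulate many samples of size 5 and compare
rng = np.random.default_rng(0)
samples = rng.standard_normal((100_000, 5))
ranges = samples.max(axis=1) - samples.min(axis=1)
print(range_cdf(2.0, 5), (ranges <= 2.0).mean())  # the two estimates should agree closely
```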

If the distribution of each Xi is limited to the right (or left) then the asymptotic distribution of the range is equal to the asymptotic distribution of the largest (smallest) value. For more general distributions the asymptotic distribution can be expressed as a Bessel function.[3]


The mean range is given by[5]

E[T] = n \int_0^1 x(G) \left[ G^{n-1} - (1 - G)^{n-1} \right] \, dG

where x(G) is the inverse of the distribution function G (the quantile function). In the case where each of the Xi has a standard normal distribution, with cumulative distribution function Φ, the mean range is given by[6]

E[T] = \int_{-\infty}^{\infty} \left( 1 - \left( 1 - \Phi(x) \right)^n - \Phi(x)^n \right) \, dx
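This integral for the standard normal case is straightforward to evaluate numerically. The sketch below (assuming SciPy) checks it for n = 2, where the mean range has the known closed form 2/√π, since X1 − X2 is normal with variance 2:

```python
import math
from scipy import stats
from scipy.integrate import quad

def mean_range(n):
    """Mean range of n i.i.d. standard normal variables, via the integral
    of 1 - (1 - Phi(x))^n - Phi(x)^n over the whole real line."""
    integrand = lambda x: 1 - (1 - stats.norm.cdf(x)) ** n - stats.norm.cdf(x) ** n
    val, _ = quad(integrand, -math.inf, math.inf)
    return val

print(mean_range(2), 2 / math.sqrt(math.pi))  # the two values should agree
```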