# Moment (mathematics)

*See also moment (physics).*

The concept of **moment** in mathematics evolved from the concept of **moment** in physics. The *n*th moment of a real-valued function *f*(*x*) of a real variable about a value *c* is

$$\mu_n = \int_{-\infty}^{\infty} (x - c)^n\, f(x)\, dx.$$

Moments can also be defined for random variables taking values in a general metric space, in a more general fashion than for real-valued functions; see *Moments in metric spaces* below.

The moments about zero are usually referred to simply as the moments of a function. Usually, except in the special context of the problem of moments below, the function will be a probability density function. The *n*^{th} moment (about zero) of a probability density function *f*(*x*) is the expected value of *X*^{n}. The moments of *f* about its mean μ are called *central* moments; these describe the shape of the function, independently of translation.

If *f* is a probability density function, then the value of the integral above is called the *n*th moment of the probability distribution. More generally, if *F* is a cumulative probability distribution function of any probability distribution, which may not have a density function, then the *n*th moment of the probability distribution is given by the Riemann–Stieltjes integral

$$\mu_n' = \operatorname{E}[X^n] = \int_{-\infty}^{\infty} x^n\, dF(x),$$

where *X* is a random variable that has this distribution and **E** the expectation operator.

When

$$\operatorname{E}\left[\left|X^n\right|\right] = \int_{-\infty}^{\infty} |x^n|\, dF(x) = \infty,$$

then the moment is said not to exist. If the *n*th moment about any point exists, so does the (*n* − 1)th moment, and all lower-order moments, about every point.

## Significance of the moments

The first moment about zero, if it exists, is the expectation of *X*, i.e. the mean of the probability distribution of *X*, designated *μ*. In higher orders, the central moments are more interesting than the moments about zero.

The *n*th **central moment** of the probability distribution of a random variable *X* is

$$\mu_n = \operatorname{E}\left[(X - \mu)^n\right] = \int_{-\infty}^{\infty} (x - \mu)^n\, f(x)\, dx.$$

The first central moment is thus 0.
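As a concrete illustration (a minimal sketch; the data set is hypothetical, chosen for clean arithmetic), the raw and central moments of a finite sample can be computed directly from the definitions above:

```python
# Raw and central moments of a discrete data set, computed directly
# from the definitions (the empirical distribution puts equal mass
# on each observation).

def raw_moment(xs, n):
    """n-th moment about zero: E[X^n] for the empirical distribution."""
    return sum(x ** n for x in xs) / len(xs)

def central_moment(xs, n):
    """n-th central moment: E[(X - mu)^n] for the empirical distribution."""
    mu = raw_moment(xs, 1)
    return sum((x - mu) ** n for x in xs) / len(xs)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

print(raw_moment(data, 1))      # first moment (mean): 5.0
print(central_moment(data, 1))  # first central moment: 0.0, as stated above
print(central_moment(data, 2))  # second central moment (variance): 4.0
```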

### Variance

The second central moment is the variance, the positive square root of which is the standard deviation, σ.

#### Normalized moments

The *normalised* *n*th central moment is the *n*th central moment divided by σ^{n}; the *n*th moment of *t* = (*x* − μ)/σ. These normalised central moments are dimensionless quantities, which represent the distribution independently of any linear change of scale.

### Skewness

The third central moment is a measure of the lopsidedness of the distribution; any symmetric distribution will have a third central moment, if defined, of zero. The normalised third central moment is called the skewness, often denoted γ. A distribution that is skewed to the left (the tail of the distribution is heavier on the left) will have a negative skewness; a distribution that is skewed to the right (the tail is heavier on the right) will have a positive skewness.

For distributions that are not too different from the normal (or "Gaussian") distribution, the median will be somewhere near μ − γσ/6; the mode about μ − γσ/2.
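A short sketch of the normalised third central moment in practice (the data set is an illustrative sample, not from the source; note the longer right tail):

```python
# Skewness as the third central moment divided by sigma^3,
# using the population (n-denominator) convention from the text.
import math

def skewness(xs):
    mu = sum(xs) / len(xs)
    m2 = sum((x - mu) ** 2 for x in xs) / len(xs)  # variance
    m3 = sum((x - mu) ** 3 for x in xs) / len(xs)  # third central moment
    return m3 / math.sqrt(m2) ** 3

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(skewness(data))  # 0.65625 — positive, since the right tail is heavier
```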

### Kurtosis

The fourth central moment is a measure of whether the distribution is tall and skinny or short and squat, compared to the normal distribution of the same variance. Since it is the expectation of a fourth power, the fourth central moment, where defined, is always positive (except for a point distribution); the fourth central moment of a normal distribution is 3σ^{4}.

The kurtosis κ is defined to be the normalized fourth central moment minus 3. (Equivalently, as in the next section, it is the fourth cumulant divided by the square of the variance.) Some authorities do not subtract three, but it is usually more convenient to have the normal distribution at the origin of coordinates. If a distribution has a peak at the mean and long tails, the fourth moment will be high and the kurtosis positive, and conversely; thus, bounded distributions tend to have low kurtosis.

The kurtosis can be positive without limit, but κ must be greater than or equal to γ^{2} − 2; equality only holds for binary distributions. For unbounded skew distributions not too far from normal, κ tends to be somewhere in the area of γ^{2} and 2γ^{2}.
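A quick numerical sanity check of this bound (the data set is the same hypothetical sample used above, not from the source):

```python
# Excess kurtosis (normalized fourth central moment minus 3) and a
# check of the inequality kappa >= gamma^2 - 2 on a small sample.

def central_moment(xs, n):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** n for x in xs) / len(xs)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
var = central_moment(data, 2)
gamma = central_moment(data, 3) / var ** 1.5    # skewness
kappa = central_moment(data, 4) / var ** 2 - 3  # excess kurtosis

print(kappa)                    # -0.21875: lighter tails than a normal
print(kappa >= gamma ** 2 - 2)  # True — the bound holds
```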

The inequality can be proven by considering

$$\operatorname{E}\left[(T^2 - aT - 1)^2\right],$$

where *T* = (*X* − μ)/σ. This is the expectation of a square, so it is non-negative for every *a*; on the other hand, it is also a quadratic polynomial in *a*. Its discriminant must be non-positive, which gives the required relationship.
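Spelling the calculation out (a standard expansion, using the standardized moments E[*T*] = 0, E[*T*²] = 1, E[*T*³] = γ, and E[*T*⁴] = κ + 3):

```latex
\begin{aligned}
0 \le \operatorname{E}\!\left[(T^2 - aT - 1)^2\right]
  &= \operatorname{E}[T^4] - 2a\operatorname{E}[T^3]
     + (a^2 - 2)\operatorname{E}[T^2]
     + 2a\operatorname{E}[T] + 1 \\
  &= (\kappa + 3) - 2a\gamma + (a^2 - 2) + 1 \\
  &= a^2 - 2\gamma a + \kappa + 2 .
\end{aligned}
```

Since this quadratic in *a* is non-negative for every *a*, its discriminant 4γ² − 4(κ + 2) must be non-positive, i.e. κ ≥ γ² − 2.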

## Cumulants

The first moment and the second and third *unnormalized central* moments are additive in the sense that if *X* and *Y* are independent random variables then

$$\operatorname{E}[X + Y] = \operatorname{E}[X] + \operatorname{E}[Y],$$

$$\operatorname{var}(X + Y) = \operatorname{var}(X) + \operatorname{var}(Y),$$

and

$$\mu_3(X + Y) = \mu_3(X) + \mu_3(Y).$$
(These can also hold for variables that satisfy weaker conditions than independence. The first always holds; if the second holds, the variables are called uncorrelated).

This is true because these moments are the first three cumulants; the fourth cumulant is the kurtosis times σ^{4}.

All the cumulants are polynomials in the moments; so are the factorial moments. The central moments are polynomials in the moments about zero, and conversely.

## Sample moments

The moments of a population can be estimated using the sample *k*-th moment

$$\frac{1}{n}\sum_{i=1}^{n} X_i^k$$

applied to a sample *X*_{1},*X*_{2},..., *X*_{n} drawn from the population.

It follows directly from linearity of expectation that the expected value of the sample moment is equal to the *k*-th moment of the population, if that moment exists, for any sample size *n*. It is thus an unbiased estimator.
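Unbiasedness can be checked by exhaustive enumeration on a toy population (the population and parameters here are hypothetical; `Fraction` keeps the arithmetic exact):

```python
# Averaged over every equally likely size-n sample drawn with
# replacement, the sample k-th moment equals the population k-th moment.
from itertools import product
from fractions import Fraction

population = [Fraction(1), Fraction(3), Fraction(8)]  # tiny toy population
k, n = 2, 2

pop_moment = sum(x ** k for x in population) / len(population)

samples = list(product(population, repeat=n))  # all equally likely samples
avg_sample_moment = sum(
    sum(x ** k for x in s) / n for s in samples
) / len(samples)

print(pop_moment == avg_sample_moment)  # True: the estimator is unbiased
```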

## Problem of moments

The **problem of moments** seeks characterizations of sequences { μ′_{n} : *n* = 1, 2, 3, ... } that are sequences of moments of some function *f*.

## Partial moments

Partial moments are sometimes referred to as "one-sided moments." The *n*th-order lower and upper partial moments with respect to a reference point *r* may be expressed as

$$\mu_n^-(r) = \int_{-\infty}^{r} (r - x)^n\, f(x)\, dx,$$

$$\mu_n^+(r) = \int_{r}^{\infty} (x - r)^n\, f(x)\, dx.$$

Partial moments are normalized by being raised to the power 1/*n*. The upside potential ratio may be expressed as a ratio of a first-order upper partial moment to a normalized second-order lower partial moment.
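A minimal sketch of these definitions for an empirical distribution (the return series and reference point are hypothetical):

```python
# First- and second-order partial moments of a discrete sample about a
# reference point r: observations on the "wrong" side contribute zero.

def lower_partial_moment(xs, r, n):
    """Average of (r - x)^n over observations below r, 0 elsewhere."""
    return sum(max(r - x, 0) ** n for x in xs) / len(xs)

def upper_partial_moment(xs, r, n):
    """Average of (x - r)^n over observations above r, 0 elsewhere."""
    return sum(max(x - r, 0) ** n for x in xs) / len(xs)

returns = [-0.02, 0.01, 0.03, -0.01, 0.04]  # hypothetical return series
r = 0.0
upm1 = upper_partial_moment(returns, r, 1)        # upside potential
lpm2 = lower_partial_moment(returns, r, 2) ** 0.5  # normalized (power 1/2)
print(upm1 / lpm2)  # a ratio of this form is the upside potential ratio
```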

## Moments in metric spaces

Let (*M*, *d*) be a metric space, and let B(*M*) be the Borel σ-algebra on *M*, the σ-algebra generated by the *d*-open subsets of *M*. (For technical reasons, it is also convenient to assume that *M* is a separable space with respect to the metric *d*.) Let 1 ≤ *p* ≤ +∞.

The **p^{th} moment** of a measure *μ* on the measurable space (*M*, B(*M*)) about a given point *x*_{0} ∈ *M* is defined to be

$$\int_{M} d(x, x_{0})^{p}\, \mathrm{d}\mu(x).$$

*μ* is said to have **finite p^{th} moment** if the *p*^{th} moment of *μ* about *x*_{0} is finite for some *x*_{0} ∈ *M*.

This terminology for measures carries over to random variables in the usual way: if (Ω, Σ, **P**) is a probability space and *X* : Ω → *M* is a random variable, then the **p^{th} moment** of *X* about *x*_{0} ∈ *M* is defined to be

$$\int_{M} d(x, x_{0})^{p}\, \mathrm{d}\big(X_{*}(\mathbf{P})\big)(x) = \operatorname{E}\left[d(X, x_{0})^{p}\right],$$

and *X* has **finite p^{th} moment** if the *p*^{th} moment of *X* about *x*_{0} is finite for some *x*_{0} ∈ *M*.
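As a minimal sketch of the metric-space definition (the points and base point are hypothetical; here *M* = **R**² with the Euclidean metric and *X* uniform on a finite set):

```python
# Empirical p-th moment of a random point in the plane about a base
# point x0, i.e. E[d(X, x0)^p] for the uniform distribution on a
# finite set of sample points, with d the Euclidean metric.
import math

def pth_moment(points, x0, p):
    dists = [math.dist(x, x0) for x in points]
    return sum(d ** p for d in dists) / len(points)

points = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]
x0 = (0.0, 0.0)
print(pth_moment(points, x0, 1))  # (0 + 5 + 10) / 3 = 5.0
print(pth_moment(points, x0, 2))  # (0 + 25 + 100) / 3
```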

## See also

- Binomial distribution
- Cumulant
- Hamburger moment problem
- Hausdorff moment problem
- Method of moments
- Moment about the mean
- Moment-generating function
- Normal distribution
- Standardized moment
- Stieltjes moment problem
- Taylor expansions for the moments of functions of random variables
