The **variance** of a random variable or distribution is the expectation, or mean, of the squared deviation of that variable from its expected value or mean. Thus the variance is a measure of the amount of variation of the values of that variable, taking account of all possible values and their probabilities or weightings (not just the extremes which give the range).
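The verbal definition above can be written compactly as a formula (a standard restatement, not taken from this text):

$$\operatorname{Var}(X) = \mathrm{E}\!\left[(X - \mu)^2\right], \qquad \text{where } \mu = \mathrm{E}[X].$$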

For example, a fair six-sided die, when thrown, has expected value

$$\mu = \frac{1 + 2 + 3 + 4 + 5 + 6}{6} = 3.5.$$

Its expected absolute deviation—the mean of the equally likely absolute deviations from the mean—is

$$\frac{|1 - 3.5| + |2 - 3.5| + \cdots + |6 - 3.5|}{6} = \frac{2.5 + 1.5 + 0.5 + 0.5 + 1.5 + 2.5}{6} = 1.5.$$

But its expected *squared* deviation—its variance (the mean of the equally likely squared deviations)—is

$$\sigma^2 = \frac{(1 - 3.5)^2 + (2 - 3.5)^2 + \cdots + (6 - 3.5)^2}{6} = \frac{17.5}{6} \approx 2.92.$$

As another example, if a coin is tossed twice, the number of heads is 0 with probability 0.25, 1 with probability 0.5, and 2 with probability 0.25. Thus the expected value of the number of heads is

$$\mathrm{E}[X] = 0 \cdot 0.25 + 1 \cdot 0.5 + 2 \cdot 0.25 = 1,$$

and the variance is

$$\operatorname{Var}(X) = 0.25\,(0 - 1)^2 + 0.5\,(1 - 1)^2 + 0.25\,(2 - 1)^2 = 0.5.$$
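The die and coin calculations above can be checked with a short script. This is a minimal sketch (the helper names `mean` and `variance` are my own, not from the text): it computes the probability-weighted mean and the probability-weighted squared deviation for any discrete distribution.

```python
# Mean and variance of a discrete distribution, given its possible
# values and their probabilities (which should sum to 1).

def mean(values, probs):
    """Expected value: sum of value * probability."""
    return sum(v * p for v, p in zip(values, probs))

def variance(values, probs):
    """Expected squared deviation from the mean."""
    mu = mean(values, probs)
    return sum(p * (v - mu) ** 2 for v, p in zip(values, probs))

# Fair six-sided die: faces 1..6, each with probability 1/6.
die_faces = [1, 2, 3, 4, 5, 6]
die_probs = [1 / 6] * 6
print(mean(die_faces, die_probs))      # ≈ 3.5
print(variance(die_faces, die_probs))  # ≈ 2.9167, i.e. 17.5/6

# Number of heads in two coin tosses.
heads = [0, 1, 2]
head_probs = [0.25, 0.5, 0.25]
print(mean(heads, head_probs))      # ≈ 1.0
print(variance(heads, head_probs))  # ≈ 0.5
```

Note that the variance (about 2.92 for the die) differs from the expected absolute deviation (1.5): squaring weights large deviations more heavily than small ones.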

