
Applied Mathematics

Statistics

What is the variance?

The variance is the average of the squared deviations of a set of values from their mean. It is used to characterize the spread among the measures of a given population. First, calculate the mean of the scores; then measure the amount that each score deviates from the mean. Finally, square each deviation (multiply it by itself), add all of the squared deviations together, and divide by the total number of scores. An even easier way is to average the squares of the numbers first and then subtract the square of the mean. (Note: Taking the square root of the variance gives the standard deviation.)

For example, take the numbers 3, 5, 8, and 9, with a mean of 6.25 (the sum of the numbers divided by the total number of numbers). To calculate the variance, determine the deviation of each number from 6.25 (3.25, 1.25, 1.75, 2.75), square each deviation (10.5625, 1.5625, 3.0625, 7.5625), then take the average: 22.75/4 = 5.6875, which is the variance. An easier way to calculate the variance is to square all the numbers first (9, 25, 64, 81) and determine their mean ((9 + 25 + 64 + 81) divided by 4 = 44.75). Then subtract the square of the first mean (6.25² = 39.0625): 44.75 − 39.0625 = 5.6875.
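The worked example above can be sketched in a few lines of Python; this is an illustrative snippet (not from the book) showing that both methods give the same result:

```python
numbers = [3, 5, 8, 9]
mean = sum(numbers) / len(numbers)  # 6.25

# Method 1: average of the squared deviations from the mean
variance = sum((x - mean) ** 2 for x in numbers) / len(numbers)

# Method 2: mean of the squares minus the square of the mean
variance2 = sum(x * x for x in numbers) / len(numbers) - mean ** 2

# Both methods give 5.6875; the square root is the standard deviation
std_dev = variance ** 0.5
```

Running either method on the numbers 3, 5, 8, and 9 yields a variance of 5.6875, matching the hand calculation.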


