Use the definition of the mean to show that the sum of the deviations of the observations from their mean is always zero. This is one reason why variance and standard deviation use squared deviations.

Think of the mean as the fulcrum (balance point) of a distribution. For the distribution to balance, the total weight-times-distance on one side of the fulcrum must equal the total on the other side, so subtracting one side from the other always gives 0 (assuming your calculations are correct).

Sorry for the repeat. However, you need to indicate your subject in the "School Subject" box, so those with expertise in the area will respond to the question. You just lucked out with me responding.

To show that the sum of the deviations of the observations from their mean is always zero, we first need to understand the definition of the mean.

The mean, also known as the average, is calculated by summing up all the observations and dividing the total by the number of observations. Mathematically, the mean is denoted by the symbol μ (mu) and can be calculated as:

μ = (x₁ + x₂ + x₃ + ... + xₙ) / n

where x₁, x₂, x₃, ..., xₙ represent the individual observations, and n represents the total number of observations.
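
For example, with the three observations 2, 4, and 9:

μ = (2 + 4 + 9) / 3 = 15 / 3 = 5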

Now, let's consider the deviation of each observation from the mean. The deviation of an observation measures how far it lies above or below the mean. Mathematically, the deviation of an individual observation xᵢ from the mean μ can be calculated as:

dᵢ = xᵢ - μ

where dᵢ represents the deviation of the i-th observation.
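
Continuing the example above (μ = 5), the deviations are:

d₁ = 2 - 5 = -3
d₂ = 4 - 5 = -1
d₃ = 9 - 5 = 4

Notice that these already add up to zero: -3 - 1 + 4 = 0. The algebra below shows this is no coincidence.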

To prove that the sum of the deviations is always zero, we need to sum up all the deviations for all observations and see what happens:

d₁ + d₂ + d₃ + ... + dₙ = (x₁ - μ) + (x₂ - μ) + (x₃ - μ) + ... + (xₙ - μ)

Let's simplify this expression:

= (x₁ + x₂ + x₃ + ... + xₙ) - (μ + μ + μ + ... + μ)

Since the second group is just μ added n times, it equals n times μ, so we can rewrite this expression as:

= (x₁ + x₂ + x₃ + ... + xₙ) - n(μ)

Now, we can substitute the mean value from the definition:

= (x₁ + x₂ + x₃ + ... + xₙ) - n[(x₁ + x₂ + x₃ + ... + xₙ) / n]

The n outside the brackets cancels with the division by n, giving:

= (x₁ + x₂ + x₃ + ... + xₙ) - (x₁ + x₂ + x₃ + ... + xₙ)

And simplifying further, we find:

= 0

Therefore, the sum of the deviations of the observations from their mean is always zero.
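
If you want to check this numerically, here is a quick Python sketch using the same example data (any data set would work; the numbers are only illustrative):

observations = [2, 4, 9]
mean = sum(observations) / len(observations)   # (2 + 4 + 9) / 3 = 5.0
deviations = [x - mean for x in observations]  # [-3.0, -1.0, 4.0]
print(sum(deviations))                         # 0.0 (up to floating-point rounding)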

This is one reason why variance and standard deviation use squared deviations. When deviations are squared, both positive and negative deviations contribute non-negative amounts to the sum, so they can no longer cancel each other out. The sum of squared deviations is zero only when every observation equals the mean, which makes variance and standard deviation genuine measures of variability: they are always non-negative and grow as the observations spread out.
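
As a rough sketch of how the squared deviations feed into variance and standard deviation (this uses the population formulas, dividing by n; a sample version would divide by n - 1):

observations = [2, 4, 9]
mean = sum(observations) / len(observations)            # 5.0
squared_devs = [(x - mean) ** 2 for x in observations]  # [9.0, 1.0, 16.0]
variance = sum(squared_devs) / len(observations)        # 26 / 3 ≈ 8.67
std_dev = variance ** 0.5                               # ≈ 2.94
print(variance, std_dev)

Unlike the raw deviations, the squared deviations cannot cancel each other out, so the variance is zero only when every observation equals the mean.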