How do you calculate the standard deviation of a sample or set of numbers?


The mathematical explanation:

http://mathworld.wolfram.com/StandardDeviation.htm...

A more practical explanation using an example:

First take the average of your set. If you have {1, 1, 3, 6, 9} the average is 4. Next, subtract the average from each data point in your set. Using the same example set you get {-3, -3, -1, 2, 5}. Now square each of these values. This gives you {9, 9, 1, 4, 25}.

Now take the average of those numbers. In this example, (9+9+1+4+25)/5 = 9.6. Finally, take the square root of this number: sqrt(9.6) ≈ 3.098. This is the standard deviation of your set.
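The steps above can be sketched in a few lines of Python (a minimal sketch; the function name is my own, and the standard library's `statistics.pstdev` computes the same thing):

```python
import math

def population_std_dev(data):
    """Standard deviation of a full population: divide by n."""
    mean = sum(data) / len(data)                      # step 1: the average
    squared_diffs = [(x - mean) ** 2 for x in data]   # steps 2-3: subtract and square
    variance = sum(squared_diffs) / len(data)         # step 4: average the squares
    return math.sqrt(variance)                        # step 5: square root

print(population_std_dev([1, 1, 3, 6, 9]))  # about 3.098
```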

If your set happens to be a sample taken from a larger population, the procedure is a little different: when you average the squared differences, divide by n − 1 (here, 4) instead of n (here, 5). This adjustment is known as Bessel's correction, and it compensates for the fact that a sample tends to underestimate the population's spread. Either way, it's easier to use an online calculator or program for the task rather than do it by hand.
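For the sample case, the only change is the n − 1 divisor; a quick sketch (the function name is mine, and `statistics.stdev` from the standard library gives the same answer):

```python
import math
import statistics

def sample_std_dev(data):
    """Standard deviation of a sample: divide by n - 1 (Bessel's correction)."""
    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / (len(data) - 1)
    return math.sqrt(variance)

data = [1, 1, 3, 6, 9]
print(sample_std_dev(data))      # about 3.464 (larger than the 3.098 population value)
print(statistics.stdev(data))    # same result from the standard library
```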

http://www.had2know.com/academics/variance-standar...

I also have some stats tutorials if you want to check out my hubs ;)
