### Best Answer buckleupdorothy says

From what I remember of Social Statistics, standard deviations are used to describe what counts as "normal." For a normal (bell-shaped) distribution, the very middle of the curve is the mean (which for a normal distribution is the same as the median). About 68% of cases fall within 1 standard deviation of the mean, about 95% within 2 standard deviations (both left and right), and about 99.7% within 3. The rare cases beyond that are often called outliers.
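Those percentages (the "68-95-99.7 rule") can be checked directly with Python's standard library, which has a built-in normal distribution:

```python
from statistics import NormalDist

# Standard normal distribution: mean 0, standard deviation 1
nd = NormalDist(mu=0, sigma=1)

# Fraction of cases within k standard deviations of the mean
for k in (1, 2, 3):
    frac = nd.cdf(k) - nd.cdf(-k)
    print(f"within {k} SD: {frac:.4f}")
# within 1 SD: 0.6827
# within 2 SD: 0.9545
# within 3 SD: 0.9973
```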

One common use for standard deviations used to be (and maybe still is?) calculating grade curves. If you have ever had to put your class on a grading curve, the mean was usually a B- or so, with about 68% of the class at that level (within 1 standard deviation of the mean), the next 27% or so between 1 and 2 standard deviations away (the Cs and the B+/A-s), and the last few percent beyond 2 standard deviations: the As and the Fs.
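A minimal sketch of that kind of curve in Python, using made-up exam scores and hypothetical z-score cutoffs (real curves vary by instructor):

```python
from statistics import mean, stdev

# Hypothetical exam scores, for illustration only
scores = [55, 62, 68, 70, 71, 73, 74, 75, 77, 78, 80, 83, 85, 91, 96]
m, s = mean(scores), stdev(scores)

def curve_grade(score):
    """Letter grade based on distance from the class mean, in standard deviations."""
    z = (score - m) / s
    if z >= 2:
        return "A"    # more than 2 SD above the mean
    if z >= 1:
        return "B+"
    if z >= -1:
        return "B-"   # the middle ~68% cluster here
    if z >= -2:
        return "C"
    return "F"        # more than 2 SD below the mean

for sc in scores:
    print(sc, curve_grade(sc))
```

A score exactly at the class mean lands on the B-, and only scores more than two standard deviations out earn the A or the F.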

### TripleAMom says

Buckleupdorothy is right, as I remember as well. Another example is IQ testing. The average (mean) IQ score is 100, with each standard deviation on either side of the mean being 15. So one standard deviation higher is 115, 2 standard deviations is 130, and so on. In the other direction, one standard deviation lower would be 85, etc. The "normal" IQ range is within one standard deviation on either side of the average, so between 85 and 115. Anything outside that range is considered significant as far as the test goes.
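The IQ example maps directly onto z-scores (distance from the mean in standard deviations). A short sketch using the standard library's normal distribution with mean 100 and SD 15:

```python
from statistics import NormalDist

# IQ scores are scaled to mean 100, standard deviation 15
iq = NormalDist(mu=100, sigma=15)

def z_score(score):
    """How many standard deviations a score is from the mean."""
    return (score - iq.mean) / iq.stdev

print(z_score(115))              # 1.0  -> one SD above the mean
print(z_score(85))               # -1.0 -> one SD below
print(iq.cdf(115) - iq.cdf(85))  # ~0.68 of people fall in the "normal" 85-115 range
```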

Standard deviation is mostly used for statistical purposes, such as testing. I am not sure what types of real-life scenarios you may be referring to besides the two listed.

Hope this helps.