Standard deviation is a measure of how spread out the values of a numeric attribute are from the attribute's average (mean) value. The dispersion (that is, how spread out the numbers are) is measured in levels of standard deviation from the mean. If the mean of a set of numbers is M and the standard deviation is S, then all values that fall between M-S and M+S are said to fall within 1 standard deviation of the mean.
When the data has a normal distribution, about 68% of the values fall within 1 standard deviation of the mean, about 95% fall within 2 standard deviations, and about 99.7% fall within 3 standard deviations. For example, the input data 1 2 3 4 3 2 1 2 3 5 6 1 has a mean of 2.75 and a standard deviation of 1.534. Values that fall toward the outermost deviation ranges are called outliers.
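The following sketch (plain Python, not part of the product) reproduces the example figures above: it computes the population mean and standard deviation of the sample input and flags values that lie more than 2 standard deviations from the mean as candidate outliers. The 2-standard-deviation cutoff is only an illustrative threshold, not a product setting.

from statistics import mean, pstdev

# Sample input from the example above.
values = [1, 2, 3, 4, 3, 2, 1, 2, 3, 5, 6, 1]

m = mean(values)    # 2.75
s = pstdev(values)  # ~1.534 (population standard deviation)
print(f"mean = {m}, standard deviation = {s:.3f}")

# Illustrative outlier check: values more than 2 standard deviations from the mean.
outliers = [v for v in values if abs(v - m) > 2 * s]
print("values beyond 2 standard deviations:", outliers)  # [6]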
How Standard Deviation is Calculated
Standard deviation is calculated as the square root of the variance.
Standard deviation is calculated automatically when you load numeric values into an attribute with an inferred numeric datatype of either integer or decimal. You can also calculate standard deviation for attributes that contain numeric data but have an inferred datatype of string by running an analysis on one or more attributes in a real entity and selecting the Std. Deviation option.
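As an illustration of the square-root-of-variance relationship, the following sketch (plain Python, independent of how the product computes the statistic) uses the population form of variance (dividing by N), which matches the 1.534 figure in the example above.

import math

values = [1, 2, 3, 4, 3, 2, 1, 2, 3, 5, 6, 1]

n = len(values)
m = sum(values) / n                                 # mean: 2.75
variance = sum((v - m) ** 2 for v in values) / n    # population variance: ~2.354
std_dev = math.sqrt(variance)                       # standard deviation: ~1.534
print(f"variance = {variance:.3f}, standard deviation = {std_dev:.3f}")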