About Standard Deviation

Trillium Control Center

Product type
Software
Portfolio
Verify
Product family
Trillium
Product
Trillium > Trillium Discovery
Version
Latest
Language
English
Product name
Trillium Quality and Discovery
Title
Trillium Control Center
Copyright
2024
First publish date
2008
Last updated
2024-10-18
Published on
2024-10-18T15:02:04.502478

Standard deviation is a measure of how spread out the values of a numeric attribute are from the attribute's average (mean) value. The dispersion range (that is, how spread out the numbers are) is measured in levels. If the mean of a set of numbers is M and the standard deviation is S, then all values falling between M-S and M+S are said to fall within 1 standard deviation of the mean.

When the data has a normal distribution, about 68% of the values fall within 1 standard deviation of the mean, about 95% fall within 2 standard deviations, and about 99.7% fall within 3 standard deviations. For example, input data of 1 2 3 4 3 2 1 2 3 5 6 1 has a mean of 2.75 and a standard deviation of 1.534. Values that fall toward the outer range of deviations are called outliers.

How Standard Deviation is Calculated

Standard deviation is calculated as the square root of the variance.

The calculation runs automatically when you load numeric values into an attribute with an inferred numeric datatype of integer or decimal. You can also calculate standard deviation for attributes that contain numeric data but have an inferred datatype of string; to do so, run an analysis on one or more attributes in a real entity and select the Std. Deviation option.

Note: Standard deviation is calculated for numeric attributes only.
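Trillium performs the calculation internally; as a sketch only, the square-root-of-variance computation for an attribute whose values are stored as strings (the inferred-string case described above) might look like this, assuming the strings all parse as numbers:

```python
import math

def std_deviation(raw_values):
    """Population standard deviation: the square root of the variance."""
    nums = [float(v) for v in raw_values]  # coerce string-typed data to numeric
    mean = sum(nums) / len(nums)
    variance = sum((n - mean) ** 2 for n in nums) / len(nums)
    return math.sqrt(variance)

# Same example data as above, but loaded as strings
print(round(std_deviation(["1", "2", "3", "4", "3", "2", "1", "2", "3", "5", "6", "1"]), 3))
```

A string value that does not parse as a number would raise an error here, which mirrors the note above: standard deviation is meaningful only for attributes whose contents are numeric.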