Standard Deviation

Standard deviation is a measure of how dispersed the values of a numeric attribute are from the attribute's average (mean) value. The dispersion (that is, how spread out the numbers are) is measured in levels: if the mean of a set of numbers is M and the standard deviation is S, then all values falling in the range M-S to M+S are said to fall within 1 standard deviation of the mean.

When the data has a normal distribution, about 68% of the values fall within 1 standard deviation of the mean, about 95% fall within 2 standard deviations, and about 99.7% fall within 3 standard deviations. For example, the input data 1 2 3 4 3 2 1 2 3 5 6 1 has a mean of 2.75 and a standard deviation of 1.534. Values that fall toward the outer range of deviations are called outliers.
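For illustration only (this is not part of the product), the following minimal Python sketch reproduces the figures above. The 1.534 result quoted in the text matches the population formula (dividing the summed squared deviations by the number of values), so that is the variant assumed here:

    import math

    # Example data from the text above.
    values = [1, 2, 3, 4, 3, 2, 1, 2, 3, 5, 6, 1]

    # Mean (average value): 33 / 12 = 2.75
    mean = sum(values) / len(values)

    # Population variance: the average squared deviation from the mean.
    variance = sum((v - mean) ** 2 for v in values) / len(values)

    # Standard deviation is the square root of the variance.
    std_dev = math.sqrt(variance)

    print(f"mean={mean}, std_dev={std_dev:.3f}")  # mean=2.75, std_dev=1.534

    # Values within 1 standard deviation of the mean (M-S to M+S).
    lo, hi = mean - std_dev, mean + std_dev
    within_one = [v for v in values if lo <= v <= hi]
    print(f"{len(within_one)} of {len(values)} values fall within 1 standard deviation")

Because this small sample is not normally distributed, only 7 of the 12 values (about 58%) fall within 1 standard deviation, rather than the roughly 68% expected for normal data.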

Standard deviation is calculated as the square root of the variance when you load numeric values into an attribute with an inferred numeric datatype of either integer or decimal.

Note: Standard deviation is calculated for numeric attributes only.