# Statistics (Introduction to Radiochemistry)


## Revision as of 12:26, 3 July 2012

In statistics, data are often characterized by being exact and reproducible. An exact result is close to the true value, while a reproducible result gives information about how precise the measurements are. However, a reproducible data set does not necessarily imply that the result is exact. The average of a data set is given by:

<math>\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i</math>

Where N is the number of measurements.

To describe a data set, the variance or standard deviation must be specified as well. This shows the spread of the values in the data set around the average value. The variance (σ^{2}) is given by:

<math>\sigma^2 = \frac{1}{N-1}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2</math>

and the standard deviation (σ) is given by:

<math>\sigma = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}</math>
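As a minimal sketch of these three formulas, assuming a small hypothetical set of count-rate measurements (the values below are illustrative, not from the text):

```python
import math

# Hypothetical sample: five repeated count measurements
data = [112, 108, 115, 110, 109]
N = len(data)

mean = sum(data) / N                                     # average (eqn 1)
variance = sum((x - mean) ** 2 for x in data) / (N - 1)  # sample variance (eqn 2)
std_dev = math.sqrt(variance)                            # standard deviation (eqn 3)

print(mean, variance, std_dev)  # → 110.8 7.7 2.7748...
```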

A coin toss has two possible outcomes. For instance, the probability of 10 heads in a row can be calculated as <math>(1/2)^{10} \approx 0.00098</math>, since every coin toss is independent of the others. This is a binary process that can be described by a binomial probability distribution function (which gives the probability of each possible result of the measurement).

Over a period of time, a given radioactive nucleus has two possible outcomes. It can either disintegrate or remain unchanged and, as with a coin toss, can be classified as a binary process. Within the half-life of a nuclide, half of the observed atoms will have undergone disintegration, exactly as the distribution of an exceedingly large number of coin tosses would be 0.5 heads and 0.5 tails. The binary process applies to whether or not a given radioactive particle (α, β, ...) will be registered by a detector as well.

Radioactive decay usually involves systems with so many atoms that calculation of the probability for individual atoms is rather impractical. When the number of atoms is large (''N'' >> 1) and the time of observation is short compared to the half-life (λ''t'' << 1, where λ is the decay constant), the Poisson distribution is normally used. The Poisson distribution describes rare random events. ''P(x)'' is the probability of obtaining a certain number of counts ''x'' when <math>\bar{x}</math> is the expected average value:

<math>P(x) = \frac{\bar{x}^x e^{-\bar{x}}}{x!}</math>

The number of events ''N'' is large and the probability ''p'' of each individual event is small. The mean value and standard deviation are given by:

<math>\bar{x} = pN</math>

<math>\sigma = \sqrt{\bar{x}}</math>

where ''p'' is the probability of decay within the time ''t''.
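A minimal sketch of the Poisson counting statistics above; `poisson_pmf` and the value of the expected average are illustrative assumptions:

```python
import math

def poisson_pmf(x, mean):
    """Probability of observing x counts when the expected
    average number of counts is `mean` (eqn 4)."""
    return mean ** x * math.exp(-mean) / math.factorial(x)

mean = 100.0               # hypothetical expected counts in the interval
sigma = math.sqrt(mean)    # standard deviation of a Poisson distribution (eqn 6)

print(sigma)  # → 10.0
print(poisson_pmf(100, mean))
```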

The normal distribution (Gaussian distribution) is another common probability function. It can be used as a simplification of the Poisson distribution when ''x'' is large. The Gaussian distribution is symmetrical around <math>\bar{x}</math>, and the standard deviation <math>\sigma = \sqrt{\bar{x}}</math> still applies.

In a series of measurements where the mean value <math>\bar{x}</math> is given with uncertainty σ, the interval between <math>\bar{x} - \sigma</math> and <math>\bar{x} + \sigma</math> is called the confidence interval. The probability that the true value lies within this confidence interval is 68.3%. The table below shows how the probability of the true value being within the confidence interval increases with the number of standard deviations σ.

| Interval around the measurement | Probability (%) |
|---|---|
| <math>\bar{x} \pm 0.674\sigma</math> | 50.0 |
| <math>\bar{x} \pm 1.00\sigma</math> | 68.3 |
| <math>\bar{x} \pm 1.96\sigma</math> | 95.0 |
| <math>\bar{x} \pm 3.00\sigma</math> | 99.7 |
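The probabilities in the table can be checked from the Gaussian distribution: the probability that the true value lies within ±kσ is erf(k/√2). A minimal sketch:

```python
import math

def coverage(k):
    """Probability that the true value lies within +/- k standard
    deviations of the measurement, assuming a Gaussian distribution."""
    return math.erf(k / math.sqrt(2))

for k in (0.674, 1.00, 1.96, 3.00):
    print(f"{k:5.3f} sigma -> {100 * coverage(k):.1f} %")
```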

Calculations with values that each have an uncertainty are often conducted, e.g. subtraction of the background from measurements. The following rules apply for the calculations in the experimental part:

- Multiplication/division with a number ''c'' without uncertainty:

<math>c\left(x \pm \sigma\right) = cx \pm c\sigma</math>

- Addition/subtraction of numbers with uncertainty:

<math>\left(x_1 \pm \sigma_1\right) + \left(x_2 \pm \sigma_2\right) = \left(x_1 + x_2\right) \pm \sqrt{\sigma_1^2 + \sigma_2^2}</math>
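The two rules above can be sketched for the background-subtraction example mentioned in the text; the gross and background counts below are hypothetical:

```python
import math

gross, bkg = 1600, 900            # hypothetical gross and background counts
sigma_gross = math.sqrt(gross)    # Poisson counting: sigma = sqrt(N)
sigma_bkg = math.sqrt(bkg)

# Addition/subtraction rule: uncertainties add in quadrature
net = gross - bkg
sigma_net = math.sqrt(sigma_gross ** 2 + sigma_bkg ** 2)

# Multiplication by an exact number c scales the uncertainty by c
c = 2.0
scaled, sigma_scaled = c * net, c * sigma_net

print(net, sigma_net)        # → 700 50.0
print(scaled, sigma_scaled)  # → 1400.0 100.0
```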