An Exploration of the Exact Distribution and Probabilities for Sample Means for Small Random Samples
Published Web Location: https://doi.org/10.5070/T561000179
The computer algebra system Mathematica™ is used to determine the exact distributions of sums and means of small random samples drawn from a specific probability density function. The method used is the inverse Laplace transform for real-valued functions. These exact distributions are used to compare exact probabilities for probability interval statements about sums and means with the normal approximations to those probabilities given by the Central Limit Theorem. The maximum normal approximation errors are determined for probability intervals at various sample sizes.
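The approach described above can be sketched in a few lines of computer algebra. The snippet below is a minimal illustration, not the paper's actual computation: it uses SymPy in place of Mathematica, and an Exponential(1) density as a stand-in for the paper's unspecified pdf. The Laplace transform of the density is raised to the n-th power (the transform of a sum of n i.i.d. variables), inverted to recover the exact density of the sum, and the resulting exact interval probability is compared with the Central Limit Theorem's normal approximation.

```python
import sympy as sp

# Illustrative setup (assumption): Exponential(1) density e^{-t}, n = 3.
s, t = sp.symbols('s t', positive=True)
n = 3  # small sample size

# Laplace transform of the Exponential(1) density is 1/(s + 1);
# by independence, the transform of the sum S_n is its n-th power.
transform_sum = (sp.Integer(1) / (s + 1)) ** n

# Invert the transform to obtain the exact density of S_n
# (here a Gamma(n, 1) density).
pdf_sum = sp.inverse_laplace_transform(transform_sum, s, t)

# Exact probability of an interval statement for the sum: P(2 <= S_3 <= 4).
p_exact = sp.integrate(pdf_sum, (t, 2, 4))

# CLT approximation: S_n is approximately Normal(mean = n, variance = n).
z = lambda x: (x - n) / sp.sqrt(n)
p_normal = (sp.erf(z(4) / sp.sqrt(2)) - sp.erf(z(2) / sp.sqrt(2))) / 2

# Normal approximation error for this interval.
print(float(p_exact), float(p_normal), abs(float(p_exact) - float(p_normal)))
```

For this toy density the exact interval probability and its normal approximation differ only in the third decimal place even at n = 3; the paper's contribution is to quantify the maximum such error systematically across intervals and sample sizes for its chosen density.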