Authors: Pushpa N. Rathie (University of Brasilia, Brazil), Luan C.S.M. Ozelim (ITA, Brazil), and Miodrag Lovric (Radford University, USA)
This article offers an extensive and accessible overview of the normal distribution, its mathematical foundations, key properties, statistical applications, and philosophical reflections. It begins with a historical account tracing the evolution of the normal curve from De Moivre (1733) and Laplace (1774) to Gauss (1809), and highlights the formulation of the central limit theorem by Lyapunov (1900) and Lindeberg (1922).
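For orientation, the classical (Lindeberg–Lévy) form of the central limit theorem that this lineage culminated in can be stated as follows. This is a standard textbook statement given for convenience, not a quotation from the article.

```latex
% Classical (Lindeberg-Levy) central limit theorem:
% if X_1, X_2, ... are i.i.d. with mean mu and finite variance sigma^2, then
\[
\frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}}
  \;=\; \frac{1}{\sigma\sqrt{n}} \sum_{i=1}^{n} \left( X_i - \mu \right)
  \;\xrightarrow{d}\; N(0,1) \qquad \text{as } n \to \infty .
\]
```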
The article presents a detailed enumeration of the normal distribution's core properties: the bell-shaped probability density function, symmetry about the mean, standardization to the standard normal distribution, and its relationships with other distributions (e.g., the Cauchy and folded normal). It covers both classical results, such as the moment-generating and characteristic functions, and less familiar characterizations, including L-moments, Shannon and Rényi entropy, Fisher information, and sufficiency.
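As a quick reference, the core quantities listed above take the following standard forms for X ~ N(μ, σ²); these are well-known results stated here for convenience rather than reproduced from the article.

```latex
% Probability density function of N(mu, sigma^2):
\[
f(x \mid \mu, \sigma^2) = \frac{1}{\sigma\sqrt{2\pi}}
  \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \qquad x \in \mathbb{R}.
\]
% Standardization to the standard normal Z ~ N(0,1):
\[
Z = \frac{X - \mu}{\sigma}.
\]
% Moment-generating and characteristic functions:
\[
M_X(t) = \exp\!\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right), \qquad
\varphi_X(t) = \exp\!\left(i\mu t - \tfrac{1}{2}\sigma^2 t^2\right).
\]
% Shannon (differential) entropy:
\[
H(X) = \tfrac{1}{2}\ln\!\left(2\pi e \sigma^2\right).
\]
```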
A large section discusses the normal distribution’s foundational role in statistical theory and practice, including parameter estimation, confidence intervals, hypothesis testing, and Bayesian inference. It contrasts real-world phenomena that approximate normality—such as height, blood pressure, reaction times, and test scores—with historical skepticism from Pearson and Geary, who cautioned against treating the normal distribution as an exact law of nature.
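The inferential machinery mentioned above can be illustrated with a minimal sketch, assuming NumPy and SciPy are available; the simulated data, the null value of 100, and the 5% level are illustrative choices, not taken from the article.

```python
# Minimal sketch: normal-theory inference for a mean (illustrative data only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=100.0, scale=15.0, size=50)  # simulated "test scores"

n = x.size
xbar = x.mean()
s = x.std(ddof=1)                 # unbiased sample standard deviation

# 95% t-based confidence interval for the mean (sigma unknown)
alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
ci = (xbar - t_crit * s / np.sqrt(n), xbar + t_crit * s / np.sqrt(n))

# One-sample t-test of H0: mu = 100
t_stat, p_value = stats.ttest_1samp(x, popmean=100.0)

print(f"mean = {xbar:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```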
The article critically reflects on the use of normality in science and methodology. In quality control, finance, and education, the normal distribution serves as a benchmark despite real-world deviations from it. The authors stress that normality is a mathematical idealization and emphasize the need for careful testing, interpretation, and model selection in applied settings.
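One common way to carry out such testing is a formal normality check before relying on normal-theory methods; the sketch below uses the Shapiro–Wilk test from SciPy on deliberately non-normal, simulated data. The choice of test and the 5% level are illustrative assumptions, not prescriptions from the article.

```python
# Minimal sketch: checking normality before applying normal-theory methods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=200)   # deliberately non-normal data

stat, p_value = stats.shapiro(sample)           # Shapiro-Wilk test
print(f"Shapiro-Wilk W = {stat:.3f}, p = {p_value:.4f}")

# A small p-value is evidence against normality; consider a transformation
# or a different model rather than forcing the normal assumption.
if p_value < 0.05:
    print("Normality rejected at the 5% level.")
else:
    print("No strong evidence against normality.")
```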
Recent computational and algorithmic developments are also reviewed, including the role of the normal distribution in deep learning (e.g., variational autoencoders) and simulation techniques such as the Box–Muller transform. Approximations based on logistic and generalized gamma distributions are introduced to handle non-normal settings effectively.
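The Box–Muller transform named above maps two independent Uniform(0, 1) draws to two independent standard normal draws. The following is a minimal NumPy sketch of that transform, written for illustration rather than copied from the article.

```python
# Minimal sketch of the Box-Muller transform.
import numpy as np

def box_muller(n, rng=None):
    """Return 2*n independent N(0, 1) samples via the Box-Muller transform."""
    rng = rng or np.random.default_rng()
    u1 = 1.0 - rng.random(n)             # shift to (0, 1] so log(u1) is finite
    u2 = rng.random(n)
    r = np.sqrt(-2.0 * np.log(u1))       # radius from the first uniform
    theta = 2.0 * np.pi * u2             # angle from the second uniform
    z0 = r * np.cos(theta)
    z1 = r * np.sin(theta)
    return np.concatenate([z0, z1])

samples = box_muller(50_000, rng=np.random.default_rng(42))
print(samples.mean(), samples.std())      # should be close to 0 and 1
```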
For foundational theory, proofs, philosophical insights, and modern applications, refer to the complete article in the International Encyclopedia of Statistical Science.