Degree
Doctor of Philosophy
Program
Statistics and Actuarial Sciences
Supervisor
David Bellhouse
2nd Supervisor
Duncan Murdoch
Joint Supervisor
Abstract
Some elementary data smoothing techniques emerged during the eighteenth century. At that time, smoothing consisted of simple interpolation of the data; these techniques eventually evolved into the more complex methods used today. Some of the significant milestones in the smoothing, or graduation, of population data will be described, including the smoothing methods of W.F. Sheppard in the early twentieth century. Sheppard's statistical interests focused on data smoothing, the construction of mathematical tables, and education. Throughout his career, Sheppard consulted Karl Pearson for advice pertaining to his statistical research. An examination of his correspondence with Pearson will be presented, and his smoothing methods will be described and compared to modern methods such as local polynomial regression and Bayesian smoothing models.
In the second part of the thesis, the development of Bayesian smoothing will be presented and a simulation-based Bayesian model will be implemented using historical data. The objective of the Bayesian model is to predict the probability of life from grouped mortality data. A Metropolis-Hastings MCMC algorithm will be employed, and the results will then be compared to the original eighteenth-century analysis.
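To make the reference to Metropolis-Hastings MCMC concrete, the sketch below illustrates the general idea only; it is not the thesis code, and the data, prior, and tuning values are invented for illustration. It samples age-specific death probabilities from grouped mortality counts, assuming a binomial likelihood and a random-walk smoothness prior on the logit scale.

# A minimal sketch (not the thesis code): random-walk Metropolis-Hastings
# for smoothed death probabilities from grouped mortality data, assuming a
# binomial likelihood and a Gaussian smoothness prior on the logit scale.
# All numbers below are hypothetical, not historical data.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical grouped data: number at risk and deaths in each age group.
at_risk = np.array([1000, 950, 900, 820, 700, 550, 380, 200])
deaths  = np.array([  50,  48,  55,  70,  90, 110, 120,  90])
k = len(at_risk)

def log_posterior(theta, tau=5.0):
    # Binomial log-likelihood plus a penalty on second differences of
    # theta = logit(q), which encourages a smooth sequence of q values.
    q = 1.0 / (1.0 + np.exp(-theta))
    loglik = np.sum(deaths * np.log(q) + (at_risk - deaths) * np.log(1.0 - q))
    smooth = -0.5 * tau * np.sum(np.diff(theta, n=2) ** 2)
    return loglik + smooth

# Random-walk Metropolis-Hastings over the whole logit vector.
theta = np.zeros(k)                    # start at q = 0.5 for every group
current = log_posterior(theta)
samples = []
for it in range(20000):
    proposal = theta + rng.normal(scale=0.05, size=k)
    cand = log_posterior(proposal)
    if np.log(rng.uniform()) < cand - current:    # accept/reject step
        theta, current = proposal, cand
    if it >= 5000:                                # discard burn-in
        samples.append(theta.copy())

q_hat = 1.0 / (1.0 + np.exp(-np.mean(samples, axis=0)))
print("posterior mean death probabilities:   ", np.round(q_hat, 3))
print("posterior mean survival probabilities:", np.round(1 - q_hat, 3))

Because the proposal is a symmetric Gaussian random walk, the acceptance ratio reduces to the posterior ratio; the model, priors, and data actually used in the thesis may differ.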
Recommended Citation
Murray, Lori L., "Data Smoothing Techniques: Historical and Modern" (2016). Electronic Thesis and Dissertation Repository. 3679.
https://ir.lib.uwo.ca/etd/3679