Inference involves drawing conclusions about some general phenomenon from limited empirical observations in the face of random variability. In a scientific context, the general phenomenon must also accommodate the completely unforeseen if all possibilities are to be considered. Many of the statistical models most used to describe such phenomena belong to one of a small number of families: the exponential, transformation, and stable families. Over the past 25 years, the likelihood function has come to be recognized as the fundamental element of the approach to drawing scientific conclusions. This book brings these two components of statistics together for the first time as the central themes of statistical inference. Chapters focus on model building, approximations, and examples. There are also appendices on the elements of measure theory, probability theory, and numerical methods. The discussion is appropriate for students of mathematical statistics.
Part 1: Model Building
1. Model building
2. Exponential family
Part 2: Inference
3. Likelihood
4. Goodness of fit
Part 3: Approximations
5. Asymptotics
6. Factoring the likelihood function
Part 4: Decisions
7. Frequentist decision-making
8. Bayesian decision-making
Part 5: Examples
9. Poisson regression
10. Binomial regression
Part 6: Appendices
A. Elements of measure theory
B. Review of probability theory
C. Normal distribution statistics
D. Numerical methods
Bibliography
Index