Bayesian Methods For Statistical Analysis


BAYESIAN METHODS for Statistical Analysis

BY BOREK PUZA

Published by ANU eView
The Australian National University
Acton ACT 2601, Australia
Email: enquiries.eview@anu.edu.au
This title is also available online at http://eview.anu.edu.au

National Library of Australia Cataloguing-in-Publication entry
Creator: Puza, Borek, author.
Title: Bayesian methods for statistical analysis / Borek Puza.
ISBN: 9781921934254 (paperback) 9781921934261 (ebook)
Subjects: Bayesian statistical decision theory. Statistical decision.
Dewey Number: 519.542

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying or otherwise, without the prior permission of the publisher.

Cover design and layout by ANU Press
Printed by Griffin Press
This edition © 2015 ANU eView

Contents

Abstract  ix
Acknowledgements  xi
Preface  xiii
Overview  xv

Chapter 1: Bayesian Basics Part 1  1
1.1 Introduction  1
1.2 Bayes’ rule  2
1.3 Bayes factors  9
1.4 Bayesian models  11
1.5 The posterior distribution  12
1.6 The proportionality formula  13
1.7 Continuous parameters  19
1.8 Finite and infinite population inference  22
1.9 Continuous data  23
1.10 Conjugacy  24
1.11 Bayesian point estimation  25
1.12 Bayesian interval estimation  26
1.13 Inference on functions of the model parameter  31
1.14 Credibility estimates  34

Chapter 2: Bayesian Basics Part 2  61
2.1 Frequentist characteristics of Bayesian estimators  61
2.2 Mixture prior distributions  74
2.3 Dealing with a priori ignorance  80
2.4 The Jeffreys prior  81
2.5 Bayesian decision theory  86
2.6 The posterior expected loss  93
2.7 The Bayes estimate  98

Chapter 3: Bayesian Basics Part 3  109
3.1 Inference given functions of the data  109
3.2 Bayesian predictive inference  116
3.3 Posterior predictive p-values  130
3.4 Bayesian models with multiple parameters  135

Chapter 4: Computational Tools  153
4.1 Solving equations  153
4.2 The Newton-Raphson algorithm  153
4.3 The multivariate Newton-Raphson algorithm  161
4.4 The Expectation-Maximisation (EM) algorithm  165
4.5 Variants of the NR and EM algorithms  175
4.6 Integration techniques  188
4.7 The optim() function  194

Chapter 5: Monte Carlo Basics  201
5.1 Introduction  201
5.2 The method of Monte Carlo integration for estimating means  202
5.3 Other uses of the MC sample  205
5.4 Importance sampling  209
5.5 MC estimation involving two or more random variables  213
5.6 The method of composition  214
5.7 Monte Carlo estimation of a binomial parameter  216
5.8 Random number generation  227
5.9 Sampling from an arbitrary discrete distribution  228
5.10 The inversion technique  231
5.11 Random number generation via compositions  234
5.12 Rejection sampling  236
5.13 Methods based on the rejection algorithm  240
5.14 Monte Carlo methods in Bayesian inference  241
5.15 MC predictive inference via the method of composition  251
5.16 Rao-Blackwell methods for estimation and prediction  253
5.17 MC estimation of posterior predictive p-values  258

Chapter 6: MCMC Methods Part 1  263
6.1 Introduction  263
6.2 The Metropolis algorithm  263
6.3 The batch means method  274
6.4 Computational issues  285
6.5 Non-symmetric drivers and the general Metropolis algorithm  286
6.6 The Metropolis-Hastings algorithm  290
6.7 Independence drivers and block sampling  305
6.8 Gibbs steps and the Gibbs sampler  306

Chapter 7: MCMC Methods Part 2  321
7.1 Introduction  321
7.2 Data augmentation  321

Chapter 8: Inference via WinBUGS  365
8.1 Introduction to BUGS  365
8.2 A first tutorial in BUGS  367
8.3 Tutorial on calling BUGS in R  389

Chapter 9: Bayesian Finite Population Theory  407
9.1 Introduction  407
9.2 Finite population notation and terminology  408
9.3 Bayesian finite population models  409
9.4 Two types of sampling mechanism  411
9.5 Two types of inference  412
9.6 Analytic inference  413
9.7 Descriptive inference  414

Chapter 10: Normal Finite Population Models  467
10.1 The basic normal-normal finite population model  467
10.2 The general normal-normal finite population model  477
10.3 Derivation of the predictive distribution of the nonsample vector  480
10.4 Alternative formulae for the predictive distribution of the nonsample vector  481
10.5 Prediction of the finite population mean and other linear combinations  483
10.6 Special cases including ratio estimation  484
10.7 The normal-normal-gamma finite population model  494
10.8 Special cases of the normal-normal-gamma finite population model  497
10.9 The case of an informative prior on the regression parameter  501

Chapter 11: Transformations and Other Topics  515
11.1 Inference on complicated quantities  515
11.2 Data transformations  526
11.3 Frequentist properties of Bayesian finite population estimators  536

Chapter 12: Biased Sampling and Nonresponse  559
12.1 Review of sampling mechanisms  559
12.2 Nonresponse mechanisms  560
12.3 Selection bias in volunteer surveys  578
12.4 A classical model for self-selection bias  578
12.5 Uncertainty regarding the sampling mechanism  583
12.6 Finite population inference under selection bias in volunteer surveys  588

Appendix A: Additional Exercises  609
Exercise A.1 Practice with the Metropolis algorithm  609
Exercise A.2 Practice with the MH algorithm  619
Exercise A.3 Practice with a Bayesian finite population regression model  626
Exercise A.4 Case study in Bayesian finite population models with biased sampling  638

Appendix B: Distributions and Notation  667
B.1 The normal distribution  667
B.2 The gamma distribution  668
B.3 The exponential distribution  669
B.4 The chi-squared distribution  669
B.5 The inverse gamma distribution  670
B.6 The t distribution  670
B.7 The F distribution  671
B.8 The (continuous) uniform distribution  671
B.9 The discrete uniform distribution  671
B.10 The binomial distribution  672
B.11 The Bernoulli distribution  672
B.12 The geometric distribution  672

Appendix C: Abbreviations and Acronyms  673
Bibliography  677

Abstract

‘Bayesian Methods for Statistical Analysis’ is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete computer code. It is suitable for self-study or a semester-long course, with three hours of lectures and one tutorial per week for 13 weeks.

Acknowledgements

‘Bayesian Methods for Statistical Analysis’ derives from the lecture notes for a four-day course titled ‘Bayesian Methods’, which was presented to staff of the Australian Bureau of Statistics, at ABS House in Canberra, in 2013. Lectures of three hours each were held in the mornings of 11, 18 and 25 November and 9 December, and three-hour tutorials were held in the mornings of 14, 20 and 27 November and 11 December.

Of the 30-odd participants, some of whom attended via video link from regional ABS offices, special thanks go to Anura Amarasinghe, Rachel Barker, Geoffrey Brent, Joseph Chien, Alexander Hanysz, Sebastien Lucie, Peter Radisich and Anthony Russo, who asked insightful questions, pointed out errors, and contributed to an improved second edition of the lecture notes. Thanks also to Siu-Ming Tam, First Australian Statistician of the Methodology and Data Management Division at ABS, for useful comments, and for inviting the author to present the course in the first place, after having read Puza (1995). Last but not least, special thanks go to Kylie Johnson for her excellent work as the course administrator.

Further modifications to ‘Bayesian Methods’ led to the present work, which is published as an eView textbook by the ANU Press, Canberra. Many thanks to David Gardiner, Brian Kennett, Teresa Prowse, Emily Tinker and two anonymous referees for useful comments and suggestions which helped to further improve the quality of the book. Thanks also to Yi (James) Zhang for his proofreading of the book whilst learning the material as part of his Honours studies.

Preface

‘Bayesian Methods for Statistical Analysis’ is a book which can be used as the text for a semester-long course and is suitable for anyone who is familiar with statistics at the level of ‘Mathematical Statistics with Applications’ by Wackerly, Mendenhall and Scheaffer (2008). The book does not attempt to cover all aspects of Bayesian methods but to provide a ‘guided tour’ through the subject matter, one which naturally reflects the author's particular interests gained over years of research and teaching. For a more comprehensive account of Bayesian methods, the reader is referred to the very extensive literature on this subject, including ‘Theory of Probability’ by Jeffreys (1961), ‘Bayesian Inference in Statistical Analysis’ by Box and Tiao (1973), ‘Markov Chain Monte Carlo in Practice’ by Gilks et al. (1996), ‘Bayesian Statistics: An Introduction’ by Lee (1997), ‘Bayesian Methods: An Analysis for Statisticians and Interdisciplinary Researchers’ by Leonard and Hsu (1999), ‘Bayesian Data Analysis’ by Gelman et al. (2004), ‘Computational Bayesian Statistics’ by Bolstad (2009) and ‘Handbook of Markov Chain Monte Carlo’ by Brooks et al. (2011). See also Smith and Gelfand (1992) and O'Hagan and Forster (2004).

The software packages which feature in this book are R and WinBUGS. R is a general software environment for statistical computing and graphics which compiles and runs on UNIX platforms, Windows and MacOS. This software is available for free at www.r-project.org/. Two useful guides to R are ‘Bayesian Computation With R’ by Albert (2009) and ‘Data Analysis and Graphics Using R: An Example-Based Approach’ by Maindonald and Braun (2010).

BUGS stands for ‘Bayesian Inference Using Gibbs Sampling’ and is a specialised software environment for the Bayesian analysis of complex statistical models using Markov chain Monte Carlo methods. WinBUGS, a version of BUGS for Microsoft Windows, is available for free at www.mrc-bsu.cam.ac.uk/software/bugs/. Two useful guides to WinBUGS are ‘Bayesian Modeling Using WinBUGS’ by Ntzoufras (2009) and ‘Bayesian Population Analysis Using WinBUGS’ by Kéry and Schaub (2012).
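For readers who want an early feel for how the two packages interact (the topic is treated properly in Chapter 8), the following is a skeletal sketch, not taken from the book, of calling WinBUGS from R via the R2WinBUGS package. The model file name 'model.txt', the data values and all settings are illustrative placeholders, and WinBUGS must already be installed for the call to run.

# Skeletal call to WinBUGS from R via the R2WinBUGS package.
# Assumes WinBUGS is installed and that the file 'model.txt' contains
# a BUGS model with a parameter named 'theta'; all values below are
# illustrative placeholders.
library(R2WinBUGS)
data  <- list(x = 6, n = 20)
inits <- function() list(theta = 0.5)
fit <- bugs(data, inits, parameters.to.save = "theta",
            model.file = "model.txt", n.chains = 3, n.iter = 5000)
print(fit)   # posterior summaries for the saved parameters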

The present book includes a large number of exercises, interspersed throughout and each followed by a detailed solution, including complete computer code. A student should be able to reproduce all of the numerical and graphical results in the book by running the provided code. Although many of the exercises are straightforward, some are fairly involved, and a few will be of interest only to the particularly keen or advanced student. All of the code in this book is also available in the form of an electronic text document which can be obtained from the same website as the book.

This book is in the form of an Adobe PDF file saved from Microsoft Word 2013 documents, with the equations as MathType 6.9 objects. The figures in the book were created using Microsoft Paint, the Snipping Tool in Windows, WinBUGS and R. In the few instances where color is used, this is only for additional clarity. Thus, the book can be printed in black and white with no loss of essential information.

The following chapter provides an overview of the book. Appendix A contains several additional exercises with worked solutions, Appendix B has selected distributions and notation, and Appendix C lists some abbreviations and acronyms. Following the appendices is a bibliography for the entire book.

The last four of the 12 chapters in this book constitute a practical companion to ‘Monte Carlo Methods for Finite Population Inference’, a largely theoretical manuscript written by the author (Puza, 1995) during the last year of his employment at the Australian Bureau of Statistics in Canberra.

Overview

Chapter 1: Bayesian Basics Part 1 (pages 1–60)
Introduces Bayes’ rule, Bayes factors, Bayesian models, posterior distributions, and the proportionality formula. Also covered are the binomial-beta model (a small illustrative sketch of which appears after the Chapter 5 summary below), Jeffreys’ famous tramcar problem, the distinction between finite population inference and superpopulation inference, conjugacy, point and interval estimation, inference on functions of parameters, credibility estimation, the normal-normal model, and the normal-gamma model.

Chapter 2: Bayesian Basics Part 2 (pages 61–108)
Covers the frequentist characteristics of Bayesian estimators including bias and coverage probabilities, mixture priors, uninformative priors including the Jeffreys prior, and Bayesian decision theory including the posterior expected loss and Bayes risk.

Chapter 3: Bayesian Basics Part 3 (pages 109–152)
Covers inference based on functions of the data including censoring and rounded data, predictive inference, posterior predictive p-values, multiple-parameter models, and the normal-normal-gamma model including an example of Bayesian finite population inference.

Chapter 4: Computational Tools (pages 153–200)
Covers the Newton-Raphson (NR) algorithm including its multivariate version, the expectation-maximisation (EM) algorithm, hybrid search algorithms, integration techniques including double integration, optimisation in R, and specification of prior distributions.

Chapter 5: Monte Carlo Basics (pages 201–262)
Covers Monte Carlo integration, importance sampling, the method of composition, Buffon’s needle problem, testing the coverage of Monte Carlo confidence intervals, random number generation including the inversion technique, rejection sampling, and applications to Bayesian inference including prediction in the normal-normal-gamma model, Rao-Blackwell estimation, and estimation of posterior predictive p-values.
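As a small taste of the material in Chapters 1 and 5, here is a minimal R sketch (illustrative only, not the book's own code) of the binomial-beta model: with x successes observed in n Bernoulli trials and a Beta(a, b) prior on the success probability, the proportionality formula yields a Beta(a + x, b + n - x) posterior. The prior and data values below are arbitrary.

# Binomial-beta model: Beta(a, b) prior, x successes in n trials.
# By the proportionality formula f(theta|x) proportional to
# f(theta) f(x|theta), the posterior is Beta(a + x, b + n - x).
a <- 1; b <- 1                      # uniform prior
n <- 20; x <- 6                     # illustrative data
apost <- a + x; bpost <- b + n - x  # posterior parameters: Beta(7, 15)
apost / (apost + bpost)             # posterior mean, 7/22 = 0.318
qbeta(c(0.025, 0.975), apost, bpost)   # 95% central credible interval
# Monte Carlo check in the spirit of Chapter 5
set.seed(1)
theta <- rbeta(100000, apost, bpost)
c(mean(theta), quantile(theta, c(0.025, 0.975)))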

Chapter 6: MCMC Methods Part 1 (pages 263–320)
Covers Markov chain Monte Carlo (MCMC) methods including the Metropolis-Hastings algorithm (a minimal sketch of the Metropolis algorithm appears after the Chapter 10 summary below), the Gibbs sampler, specification of tuning parameters, the batch means method, computational issues, and applications to the normal-normal-gamma model.

Chapter 7: MCMC Methods Part 2 (pages 321–364)
Covers stochastic data augmentation, a comparison of classical and Bayesian methods for linear regression and logistic regression, respectively, and a Bayesian model for correlated Bernoulli data.

Chapter 8: MCMC Inference via WinBUGS (pages 365–406)
Provides a detailed tutorial in the WinBUGS computer package including running WinBUGS within R, and shows how WinBUGS can be used for linear regression, logistic regression and ARIMA time series analysis.

Chapter 9: Bayesian Finite Population Theory (pages 407–466)
Introduces notation and terminology for Bayesian finite population inference in the survey sampling context, and discusses ignorable and nonignorable sampling mechanisms. These concepts are illustrated by way of examples and exercises, some of which involve MCMC methods.

Chapter 10: Normal Finite Population Models (pages 467–514)
Contains a generalisation of the normal-normal-gamma model to the finite population context with covariates. Useful vector and matrix formulae are provided, special cases such as ratio estimation are treated in detail, and it is shown how MCMC methods can be used for both descriptive and analytic inferences.
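To make the Chapter 6 material concrete, here is a minimal random-walk Metropolis sketch in R (again illustrative, not the book's code), targeting the Beta(7, 15) posterior from the sketch above; the driver standard deviation of 0.1, the chain length and the burn-in are arbitrary tuning choices.

# Random-walk Metropolis algorithm targeting a Beta(7, 15) density.
set.seed(1)
ltarget <- function(th) dbeta(th, 7, 15, log = TRUE)
J <- 10000                          # chain length (arbitrary)
th <- numeric(J); th[1] <- 0.5      # arbitrary starting value
for (j in 2:J) {
  prop <- th[j - 1] + rnorm(1, 0, 0.1)   # symmetric normal driver
  # log acceptance ratio; proposals outside (0, 1) are always rejected
  logr <- if (prop <= 0 || prop >= 1) -Inf else ltarget(prop) - ltarget(th[j - 1])
  th[j] <- if (log(runif(1)) < logr) prop else th[j - 1]
}
mean(th[-(1:1000)])   # post burn-in mean; should be near 7/22 = 0.318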

Chapter 11: Transformations and Other Topics (pages 515–558)
Shows how MCMC methods can be used for inference on complicated functions of superpopulation and finite population quantities, as well as for inference based on transformed data. Frequentist characteristics of Bayesian estimators are discussed in the finite population context, with examples of how Monte Carlo methods can be used to estimate model bias, design bias, model coverage and design coverage (a toy coverage check is sketched after the Appendix A summary below).

Chapter 12: Biased Sampling and Nonresponse (pages 559–608)
Discusses and provides examples of ignorable and nonignorable response mechanisms, with an exercise involving follow-up data. The topic of self-selection bias in volunteer surveys is studied from a frequentist perspective, then treated using Bayesian methods, and finally extended to the finite population context.

Appendix A: Additional Exercises (pages 609–666)
Provides practice at applying concepts in the last chapters of the book, by way of four additional exercises, each with a fully worked solution.
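As a rough illustration of the kind of frequentist check described in Chapter 11 (the design here is a simple invention, not an example from the book), the following R code repeatedly simulates binomial data from a fixed true proportion and uses Monte Carlo to estimate the coverage of the 95% central credible interval under a uniform prior; the true value 0.3, the sample size and the number of replications are arbitrary.

# Monte Carlo estimate of the frequentist coverage of a Bayesian 95%
# credible interval (binomial data, uniform Beta(1, 1) prior).
set.seed(1)
ptrue <- 0.3; n <- 20; reps <- 10000
covered <- logical(reps)
for (r in 1:reps) {
  x <- rbinom(1, n, ptrue)                        # simulate data
  ci <- qbeta(c(0.025, 0.975), 1 + x, 1 + n - x)  # posterior interval
  covered[r] <- ci[1] <= ptrue && ptrue <= ci[2]
}
mean(covered)   # estimated coverage; typically close to 0.95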

