Bayesian Statistics - An Introduction

Dr Lawrence Pettit
School of Mathematical Sciences, Queen Mary, University of London
July 22, 2008

What is Bayesian Statistics?

- Based on an idea of subjective probability.
- You have knowledge and beliefs about the matter in hand.
- Express these as a (prior) probability distribution.
- Collect some data (likelihood).
- Use Bayes theorem to combine prior knowledge and new information to find a new (posterior) probability distribution.

- Today's posterior is tomorrow's prior.
- Although different people will start with different priors, with enough data their opinions will converge.
- One coherent paradigm for all problems.
- Can handle more realistic, complex models.
- Makes predictions.
- The interpretation of interval estimates is more natural.
- Nuisance parameters and constraints on parameters can easily be dealt with.

Bayes theorem

- Bayes Theorem is named after the Rev. Thomas Bayes, a nonconformist minister who lived in England in the first half of the eighteenth century.
- The theorem was published posthumously in 1763 in 'An essay towards solving a problem in the doctrine of chances'.

- Let $\Omega$ be a sample space and $B_1, B_2, \ldots, B_k$ be mutually exclusive and exhaustive events in $\Omega$ (i.e. $B_i \cap B_j = \emptyset$ for $i \neq j$ and $\bigcup_{i=1}^{k} B_i = \Omega$; the $B_i$ form a partition of $\Omega$).
- Let $A$ be any event with $\Pr[A] > 0$. Then

$$\Pr[B_i \mid A] = \frac{\Pr[B_i]\,\Pr[A \mid B_i]}{\Pr[A]} = \frac{\Pr[B_i]\,\Pr[A \mid B_i]}{\sum_{j=1}^{k}\Pr[B_j]\,\Pr[A \mid B_j]}$$

Example

- A diagnostic test for a disease gives a correct result 99% of the time. Suppose 2% of the population have the disease. If a person selected at random from the population is given the test and produces a positive result, what is the probability that the person has the disease? Suppose they are given a second independent test and it is also positive; what is the probability that they have the disease now?
- Let '$+$' and '$-$' denote the events that a test is positive or negative, respectively. Let $D$ denote the event that the person has the disease.
- We require $p(D \mid +)$ and $p(D \mid ++)$.

Solution

- We are told that
$$p(+ \mid D) = 0.99, \qquad p(- \mid \bar{D}) = 0.99, \qquad p(D) = 0.02.$$
- By Bayes theorem
$$p(D \mid +) = \frac{p(+ \mid D)\,p(D)}{p(+ \mid D)\,p(D) + p(+ \mid \bar{D})\,p(\bar{D})} = \frac{0.99 \times 0.02}{0.99 \times 0.02 + 0.01 \times 0.98} = 0.6689.$$

Solution for a second test

- Suppose that we have a second positive test result. Then
$$p(D \mid ++) = \frac{p(++ \mid D)\,p(D)}{p(++ \mid D)\,p(D) + p(++ \mid \bar{D})\,p(\bar{D})} = \frac{0.99^2 \times 0.02}{0.99^2 \times 0.02 + 0.01^2 \times 0.98} = 0.9950.$$
- Thus although the test is very accurate, we need two positive results before we can say with confidence that the person has the disease.
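Both numbers are easy to check numerically. A minimal sketch in plain Python (the function name posterior_disease_prob is ours, not from the slides):

```python
def posterior_disease_prob(n_positive, sensitivity=0.99,
                           specificity=0.99, prevalence=0.02):
    """P(disease | n independent positive tests), by Bayes theorem."""
    like_disease = sensitivity ** n_positive        # p(+...+ | D)
    like_healthy = (1 - specificity) ** n_positive  # p(+...+ | not D)
    numerator = like_disease * prevalence
    return numerator / (numerator + like_healthy * (1 - prevalence))

print(posterior_disease_prob(1))  # 0.6689...
print(posterior_disease_prob(2))  # 0.9950...
```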

Density form of Bayes Theorem

- Let $X$, $\theta$ be two continuous random variables (possibly multivariate). Then
$$f(\theta \mid x) = \frac{f(\theta)\,f(x \mid \theta)}{f(x)} = \frac{f(\theta)\,f(x \mid \theta)}{\int f(\theta')\,f(x \mid \theta')\,d\theta'}$$
- Posterior $\propto$ Likelihood $\times$ Prior:
$$p(\theta \mid x) \propto p(x \mid \theta)\,p(\theta)$$

The Likelihood Principle

- To illustrate the difference between the classical and Bayesian approaches we start with an example.
- Suppose we toss a drawing pin and get 9 'ups' and 3 'downs'. We denote 'up' by U and 'down' by D. Is the pin unbiased?
- Classically we might test $H_0: p = \frac{1}{2}$ versus $H_1: p > \frac{1}{2}$, where $p = p(U)$. The probability of the observed result or something more extreme (tail area) if $H_0$ is true is
$$\left(\tfrac{1}{2}\right)^{12}\left[\binom{12}{9} + \binom{12}{10} + \binom{12}{11} + \binom{12}{12}\right] = \frac{299}{4096} \approx 7.3\%.$$
Thus we would accept $H_0$ at the 5% level.

- However this assumes that we did the experiment by deciding to toss the drawing pin 12 times.
- What if we decided to toss the pin until we achieved 3 D's?
- Now the number of tosses $N$ is negative binomial, and the probability of the observed result or something more extreme if $H_0$ is true is
$$\Pr[N \geq 12] = \sum_{n=12}^{\infty} \binom{n-1}{2}\left(\tfrac{1}{2}\right)^{n}.$$
We may calculate this via the complement: $N \geq 12$ occurs exactly when the first 11 tosses contain at most two D's, so
$$\Pr[N \geq 12] = \left(\tfrac{1}{2}\right)^{11}\left[\binom{11}{0} + \binom{11}{1} + \binom{11}{2}\right] = \frac{67}{2048} = \frac{134}{4096} \approx 3.27\%.$$
It follows that the P-value is $134/4096 \approx 3.27\%$, so we would reject $H_0$ at the 5% level: the same data give a different conclusion under a different sampling rule.
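Both tail areas can be verified in a few lines of Python (math.comb is in the standard library from Python 3.8 onwards):

```python
from math import comb

# Fixed n = 12 tosses: P(9 or more ups | p = 1/2)
binomial_tail = sum(comb(12, k) for k in range(9, 13)) / 2**12
print(binomial_tail)       # 299/4096 = 0.0730...

# Toss until 3 downs: P(12 or more tosses | p = 1/2)
# = P(at most 2 downs in the first 11 tosses)
neg_binomial_tail = sum(comb(11, k) for k in range(3)) / 2**11
print(neg_binomial_tail)   # 134/4096 = 0.0327...
```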

- In order to perform a significance test we are required to specify a sample space, i.e. the space of all possible outcomes.
- Possibilities for the drawing pin are:
  (i) $\{(u, d) : u + d = 12\}$;
  (ii) $\{(u, d) : d = 3\}$;
  or, if I carry on tossing the pin until my coffee is ready, so that there is a random stopping point,
  (iii) $\{\text{all } (u, d)\}$.

- The Bayesian analysis of this problem is somewhat different. Let $\theta$ be the chance that the pin lands up.
- $\theta$ is a "long run frequency" of U's. It is an objective property of the pin. It does not depend on You.
- You have beliefs about $\theta$ which you express in the form of a probability density function (pdf) $p(\theta)$. You use Bayes theorem to update your beliefs:
$$p(\theta \mid \text{data}) \propto p(\text{data} \mid \theta)\,p(\theta) \propto \theta^{9}(1-\theta)^{3}\,p(\theta).$$
The sampling rule is irrelevant.
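The proportionality can be made concrete numerically: evaluate the unnormalized posterior on a grid of $\theta$ values and normalize. A minimal sketch (our own illustration, here with a uniform prior):

```python
import numpy as np

theta = np.linspace(0.0005, 0.9995, 1000)  # grid over (0, 1)
prior = np.ones_like(theta)                # uniform prior, Be(1, 1)
likelihood = theta**9 * (1 - theta)**3     # 9 ups, 3 downs; stopping rule irrelevant
posterior = prior * likelihood
posterior /= posterior.sum()               # normalize the grid weights to sum to 1

print((theta * posterior).sum())  # posterior mean: approx 10/14 = 0.714 (Be(10, 4))
```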

- In deriving the posterior distribution the only contribution from the data is through the likelihood $p(\text{data} \mid \theta)$. Thus a Bayesian inference, which will depend only on the posterior distribution, obeys the likelihood principle.
- This roughly says that if two experiments give the same likelihoods then the inferences we make should be the same in each case.
- Classical hypothesis tests or confidence intervals violate the likelihood principle.

- We need to give a prior distribution for $\theta$.
- Suppose we take
$$p(\theta) = \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\,\theta^{a-1}(1-\theta)^{b-1}, \qquad a, b > 0,$$
i.e. a beta distribution with mean $a/(a+b)$ and variance
$$\frac{a}{a+b} \cdot \frac{b}{a+b} \cdot \frac{1}{a+b+1}.$$
We shall write this distribution as Be($a$, $b$).
- It then follows that
$$p(\theta \mid \text{data}) \propto \theta^{9+a-1}(1-\theta)^{3+b-1},$$
that is, Be($9+a$, $3+b$). Thus if we take a beta prior for $\theta$ we shall obtain a beta posterior: the beta family is conjugate for Bernoulli trials.
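This conjugate update is one line of code. A sketch using scipy.stats, whose beta distribution is parameterized exactly as Be($a$, $b$) above (the helper function is ours, not from the slides):

```python
from scipy import stats

def beta_posterior(a, b, ups, downs):
    """Conjugate update: Be(a, b) prior plus Bernoulli data -> Be(a + ups, b + downs)."""
    return stats.beta(a + ups, b + downs)

post = beta_posterior(a=2, b=2, ups=9, downs=3)  # Be(11, 5) for the pin data
print(post.mean())          # 11/16 = 0.6875
print(post.var())           # (11/16)(5/16)(1/17) = 0.0126...
print(post.interval(0.95))  # 95% equal-tailed credible interval
```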

- The choice $a = b = 1$ gives a uniform distribution as the prior, i.e. we think any value of $\theta$ is equally likely. More realistically, for a pin we might take $a = b = 2$, reflecting a belief that $\theta$ is more likely to be near 0.5 than 0 or 1, while not being very sure.
- Others might choose an asymmetric prior, perhaps arguing that a pin with a very long point would very likely land down, so a pin with any point would land down more often than up. I am not convinced by this argument, but it shows that different people do have different beliefs, and one of the advantages of the Bayesian approach is that the analysis can reflect these.

- If we were throwing a coin rather than a pin we would almost certainly choose a different prior. We have much more experience throwing coins than pins and are much more sure that $\theta$, the chance the coin will land heads, is close to 0.5. Thus we might take $a = b = 50$, say.
- The posteriors we get for the pin and the coin will be very different. For the pin ($a = b = 2$) the posterior is Be(11, 5): the posterior mean is $11/16 = 0.6875$ and the posterior variance is $(11/16)(5/16)(1/17) \approx 0.0126$. For the coin ($a = b = 50$) the posterior is Be(59, 53), with posterior mean $59/112 \approx 0.527$, which is close to 0.5.
- The classical unbiased estimate is $9/12 = 0.75$ if the number of throws is fixed, or $9/11 \approx 0.818$ if we continue until we have three 'failures'. The classical answers are the same whether for pins or coins, ignoring the extra information that we have.
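Applying the same conjugate update to both priors reproduces these numbers (a minimal sketch, again with scipy.stats):

```python
from scipy import stats

ups, downs = 9, 3
for label, a, b in [("pin,  a = b = 2 ", 2, 2), ("coin, a = b = 50", 50, 50)]:
    post = stats.beta(a + ups, b + downs)
    print(f"{label}: mean {post.mean():.4f}, variance {post.var():.4f}")
# pin,  a = b = 2 : mean 0.6875, variance 0.0126
# coin, a = b = 50: mean 0.5268, variance 0.0022
```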

Plot for pins (figure not reproduced in the transcription)

Plot for coins (figure not reproduced in the transcription)

Mixtures of conjugate priors

- By taking mixtures of conjugate priors we can represent more realistic beliefs.
- As an example, consider the result when a coin is spun on its edge. Experience has shown that when spinning a coin the proportion of heads is more likely to be 1/3 or 2/3 than 1/2. Therefore a bimodal prior seems appropriate. Since spinning coins will be Bernoulli trials, the beta distribution will be conjugate.

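A mixture of betas stays tractable: the posterior is again a mixture of betas, with each component updated conjugately and its weight multiplied by that component's marginal likelihood. A sketch of this standard result (the component parameters and the spin counts below are illustrative choices, not from the slides):

```python
import numpy as np
from scipy.special import betaln

# Bimodal prior: equal mixture of Be(10, 20) and Be(20, 10),
# with modes near 1/3 and 2/3 (illustrative values).
weights = np.array([0.5, 0.5])
a = np.array([10.0, 20.0])
b = np.array([20.0, 10.0])

heads, tails = 7, 3  # hypothetical spin data

# Each component updates to Be(a_i + heads, b_i + tails); its new weight is
# proportional to w_i * B(a_i + heads, b_i + tails) / B(a_i, b_i).
log_w = np.log(weights) + betaln(a + heads, b + tails) - betaln(a, b)
new_weights = np.exp(log_w - log_w.max())
new_weights /= new_weights.sum()

print(new_weights)           # updated mixture weights
print(a + heads, b + tails)  # updated component parameters
```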

