Some Discrete Distributions


CHAPTER 6

Some discrete distributions

6.1. Examples: Bernoulli, binomial, Poisson, geometric distributions

Bernoulli distribution

A random variable $X$ such that $P(X = 1) = p$ and $P(X = 0) = 1 - p$ is said to be a Bernoulli random variable with parameter $p$. Note that $EX = p$ and $EX^2 = p$, so $\operatorname{Var} X = p - p^2 = p(1-p)$. We denote such a random variable by $X \sim \operatorname{Bern}(p)$.

Binomial distribution

A random variable $X$ has a binomial distribution with parameters $n$ and $p$ if
\[
P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}.
\]
We denote such a random variable by $X \sim \operatorname{Binom}(n, p)$. The number of successes in $n$ Bernoulli trials is a binomial random variable. After some cumbersome calculations one can derive $EX = np$. An easier way is to realize that if $X$ is binomial, then $X = Y_1 + \cdots + Y_n$, where the $Y_i$ are independent Bernoulli variables, so
\[
EX = EY_1 + \cdots + EY_n = np.
\]
We have not defined yet what it means for random variables to be independent, but here we mean that events such as $(Y_i = 1)$ are independent.

Proposition 6.1

Suppose $X = Y_1 + \cdots + Y_n$, where $\{Y_i\}_{i=1}^{n}$ are independent Bernoulli random variables with parameter $p$. Then
\[
EX = np, \qquad \operatorname{Var} X = np(1-p).
\]
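Before the proof, a quick empirical sanity check: the sketch below simulates $X = Y_1 + \cdots + Y_n$ as a sum of independent Bernoulli trials and compares the empirical mean and variance with $np$ and $np(1-p)$. The values $n = 10$, $p = 0.3$, the sample size, and the helper name `binomial_sample` are illustrative choices, not from the text.

```python
import random
from statistics import mean, pvariance

# Simulate X = Y_1 + ... + Y_n as a sum of n independent Bernoulli(p) trials.
# Proposition 6.1 predicts E X = n*p and Var X = n*p*(1 - p).
n, p = 10, 0.3            # illustrative parameters, not from the text
num_samples = 200_000     # illustrative sample size

def binomial_sample(n: int, p: float) -> int:
    """One draw of Y_1 + ... + Y_n with Y_i ~ Bern(p)."""
    return sum(1 if random.random() < p else 0 for _ in range(n))

samples = [binomial_sample(n, p) for _ in range(num_samples)]
print("empirical mean    :", mean(samples), "   n*p =", n * p)
print("empirical variance:", pvariance(samples), "   n*p*(1-p) =", n * p * (1 - p))
```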

Proof. First we use the definition of expectation to see that
\[
EX = \sum_{i=0}^{n} i \binom{n}{i} p^i (1-p)^{n-i} = \sum_{i=1}^{n} i \binom{n}{i} p^i (1-p)^{n-i}.
\]
Then
\[
EX = \sum_{i=1}^{n} i \, \frac{n!}{i!(n-i)!}\, p^i (1-p)^{n-i}
   = np \sum_{i=1}^{n} \frac{(n-1)!}{(i-1)!\,((n-1)-(i-1))!}\, p^{i-1} (1-p)^{(n-1)-(i-1)}
\]
\[
   = np \sum_{i=0}^{n-1} \frac{(n-1)!}{i!\,((n-1)-i)!}\, p^{i} (1-p)^{(n-1)-i}
   = np \sum_{i=0}^{n-1} \binom{n-1}{i} p^{i} (1-p)^{(n-1)-i} = np,
\]
where we used the Binomial Theorem (Theorem 1.1).

To get the variance of $X$, we first observe that
\[
EX^2 = \sum_{i=1}^{n} EY_i^2 + \sum_{i \neq j} EY_iY_j.
\]
Now
\[
EY_iY_j = 1 \cdot P(Y_i = Y_j = 1) + 0 \cdot P(Y_iY_j = 0) = P(Y_i = 1, Y_j = 1) = P(Y_i = 1)P(Y_j = 1) = p^2,
\]
using independence of the random variables $\{Y_i\}_{i=1}^{n}$. Expanding $(Y_1 + \cdots + Y_n)^2$ yields $n^2$ terms, of which $n$ are of the form $Y_k^2$. So we have $n^2 - n$ terms of the form $Y_iY_j$ with $i \neq j$. Hence
\[
\operatorname{Var} X = EX^2 - (EX)^2 = np + (n^2 - n)p^2 - (np)^2 = np(1-p). \qquad \square
\]

Later we will see that the variance of the sum of independent random variables is the sum of the variances, so we could quickly get $\operatorname{Var} X = np(1-p)$. Alternatively, one can compute $E(X^2) - EX = E(X(X-1))$ using binomial coefficients and derive the variance of $X$ from that.

Poisson distribution

A random variable $X$ has the Poisson distribution with parameter $\lambda$ if
\[
P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}.
\]
Note that $\sum_{i=0}^{\infty} \lambda^i / i! = e^{\lambda}$, so the probabilities add up to one.
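Since $\sum_{i=0}^{\infty} \lambda^i/i! = e^{\lambda}$, the Poisson probabilities should sum to one. The short check below evaluates the PMF directly from the formula; the value $\lambda = 2$ and the truncation point are arbitrary choices for illustration.

```python
from math import exp, factorial

# Poisson PMF: P(X = i) = e^{-lam} * lam**i / i!
lam = 2.0   # illustrative parameter

def poisson_pmf(i: int, lam: float) -> float:
    return exp(-lam) * lam**i / factorial(i)

# The probabilities should add up to (essentially) one,
# since sum_i lam^i / i! = e^lam.
total = sum(poisson_pmf(i, lam) for i in range(100))
print(total)   # ~ 1.0
```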

Proposition 6.2

Suppose $X$ is a Poisson random variable with parameter $\lambda$. Then
\[
EX = \lambda, \qquad \operatorname{Var} X = \lambda.
\]

Proof. We start with the expectation:
\[
EX = \sum_{i=0}^{\infty} i\, e^{-\lambda} \frac{\lambda^i}{i!} = \lambda e^{-\lambda} \sum_{i=1}^{\infty} \frac{\lambda^{i-1}}{(i-1)!} = \lambda.
\]
Similarly one can show that
\[
E(X^2) - EX = E X(X-1) = \sum_{i=0}^{\infty} i(i-1)\, e^{-\lambda} \frac{\lambda^i}{i!} = \lambda^2 e^{-\lambda} \sum_{i=2}^{\infty} \frac{\lambda^{i-2}}{(i-2)!} = \lambda^2,
\]
so $EX^2 = E(X^2 - X) + EX = \lambda^2 + \lambda$, and hence $\operatorname{Var} X = \lambda$. $\square$

Example 6.1. Suppose on average there are 5 homicides per month in a given city. What is the probability there will be at most 1 in a certain month?

Solution: If $X$ is the number of homicides, we are given that $EX = 5$. Since the expectation for a Poisson is $\lambda$, then $\lambda = 5$. Therefore
\[
P(X = 0) + P(X = 1) = e^{-5} + 5e^{-5}.
\]

Example 6.2. Suppose on average there is one large earthquake per year in California. What's the probability that next year there will be exactly 2 large earthquakes?

Solution: $\lambda = EX = 1$, so $P(X = 2) = e^{-1}\left(\tfrac{1}{2}\right)$.
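For reference, the two example answers can be evaluated numerically; this is just a direct computation of the expressions above.

```python
from math import exp

# Example 6.1: lambda = 5, P(X <= 1) = e^{-5} + 5 e^{-5}
print(exp(-5) + 5 * exp(-5))    # ~ 0.0404

# Example 6.2: lambda = 1, P(X = 2) = e^{-1} * 1/2
print(exp(-1) / 2)              # ~ 0.1839
```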

We have the following proposition connecting the binomial and Poisson distributions.

Proposition 6.3 (Binomial approximation of the Poisson distribution)

If $X_n$ is a binomial random variable with parameters $n$ and $p_n$, and $np_n \to \lambda$, then
\[
P(X_n = i) \to P(Y = i),
\]
where $Y$ is Poisson with parameter $\lambda$.

(6.1) (Approximation of Poisson by binomials) Note that by setting $p_n := \lambda/n$ for $n \geq \lambda$ we can approximate the Poisson distribution with parameter $\lambda$ by binomial distributions with parameters $n$ and $p_n$.

This proposition shows that the Poisson distribution models binomials when the probability of a success is small. The number of misprints on a page, the number of automobile accidents, the number of people entering a store, etc. can all be modeled by a Poisson distribution.

Proof. For simplicity, let us suppose that $\lambda = np_n$ for $n \geq \lambda$. In the general case we can use $\lambda_n = np_n \to \lambda$. We write
\[
P(X_n = i) = \frac{n!}{i!(n-i)!}\, p_n^i (1-p_n)^{n-i}
 = \frac{n(n-1)\cdots(n-i+1)}{n^i}\, \frac{\lambda^i}{i!} \left(1 - \frac{\lambda}{n}\right)^{n-i}
 = \frac{n(n-1)\cdots(n-i+1)}{n^i}\, \frac{\lambda^i}{i!}\, \frac{(1 - \lambda/n)^n}{(1 - \lambda/n)^i}.
\]
Observe that the following three limits exist:
\[
\frac{n(n-1)\cdots(n-i+1)}{n^i} \xrightarrow[n\to\infty]{} 1, \qquad
\left(1 - \frac{\lambda}{n}\right)^{i} \xrightarrow[n\to\infty]{} 1, \qquad
\left(1 - \frac{\lambda}{n}\right)^{n} \xrightarrow[n\to\infty]{} e^{-\lambda},
\]
which completes the proof. $\square$

In Section 2.2.3 we considered discrete uniform distributions with $P(X = k) = \frac{1}{n}$ for $k = 1, 2, \ldots, n$. This is the distribution of the number showing on a die (with $n = 6$), for example.
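A quick numerical illustration of Proposition 6.3 and (6.1): for a fixed $i$, the binomial probabilities $P(X_n = i)$ with $p_n = \lambda/n$ approach the Poisson probability as $n$ grows. The values $\lambda = 3$ and $i = 2$ are arbitrary illustrative choices.

```python
from math import comb, exp, factorial

lam, i = 3.0, 2                     # illustrative choices

def binom_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p) ** (n - i)

def poisson_pmf(i, lam):
    return exp(-lam) * lam**i / factorial(i)

# P(X_n = i) for X_n ~ Binom(n, lam/n) approaches the Poisson(lam) PMF.
for n in (10, 100, 1000, 10000):
    print(n, binom_pmf(i, n, lam / n))
print("Poisson limit:", poisson_pmf(i, lam))
```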

Geometric distribution

A random variable $X$ has the geometric distribution with parameter $p$, $0 < p < 1$, if
\[
P(X = i) = (1-p)^{i-1} p \quad \text{for } i = 1, 2, \ldots.
\]
Using a geometric series sum formula we see that
\[
\sum_{i=1}^{\infty} P(X = i) = \sum_{i=1}^{\infty} (1-p)^{i-1} p = \frac{p}{1 - (1-p)} = 1.
\]
In Bernoulli trials, if we let $X$ be the first time we have a success, then $X$ will be a geometric random variable. For example, if we toss a coin over and over and $X$ is the first time we get a heads, then $X$ will have a geometric distribution. To see this, note that to have the first success occur on the $k$-th trial, we have to have $k-1$ failures in the first $k-1$ trials and then a success. The probability of that is $(1-p)^{k-1} p$.

Proposition 6.4

If $X$ is a geometric random variable with parameter $p$, $0 < p < 1$, then
\[
EX = \frac{1}{p}, \qquad \operatorname{Var} X = \frac{1-p}{p^2}, \qquad F_X(k) = P(X \leq k) = 1 - (1-p)^k.
\]

Proof. We will use
\[
\sum_{n=0}^{\infty} n r^{n-1} = \frac{1}{(1-r)^2},
\]
which we can show by differentiating the formula for the geometric series $1/(1-r) = \sum_{n=0}^{\infty} r^n$. Then
\[
EX = \sum_{i=1}^{\infty} i \cdot P(X = i) = \sum_{i=1}^{\infty} i (1-p)^{i-1} p = \frac{1}{(1-(1-p))^2} \cdot p = \frac{1}{p}.
\]
The variance is then $\operatorname{Var} X = E(X - EX)^2 = EX^2 - \left(\tfrac{1}{p}\right)^2$. To find $EX^2$ we will use another sum. First,
\[
\sum_{n=0}^{\infty} n r^{n} = \frac{r}{(1-r)^2},
\]
which we can differentiate to see that
\[
\sum_{n=1}^{\infty} n^2 r^{n-1} = \frac{1+r}{(1-r)^3}.
\]
Then
\[
EX^2 = \sum_{i=1}^{\infty} i^2 \cdot P(X = i) = \sum_{i=1}^{\infty} i^2 (1-p)^{i-1} p = \frac{2-p}{(1-(1-p))^3} \cdot p = \frac{2-p}{p^2}.
\]
Thus
\[
\operatorname{Var} X = EX^2 - (EX)^2 = \frac{2-p}{p^2} - \frac{1}{p^2} = \frac{1-p}{p^2}.
\]

The cumulative distribution function (CDF) can be found by using the geometric series sum formula:
\[
1 - F_X(k) = P(X > k) = \sum_{i=k+1}^{\infty} P(X = i) = \sum_{i=k+1}^{\infty} (1-p)^{i-1} p = \frac{p (1-p)^{k}}{1 - (1-p)} = (1-p)^k. \qquad \square
\]

Negative binomial distribution

A random variable $X$ has a negative binomial distribution with parameters $r$ and $p$ if
\[
P(X = n) = \binom{n-1}{r-1} p^r (1-p)^{n-r}, \qquad n = r, r+1, \ldots.
\]
A negative binomial random variable represents the number of trials until $r$ successes. To get the above formula, note that to have the $r$-th success occur in the $n$-th trial, we must have exactly $r-1$ successes in the first $n-1$ trials and then a success in the $n$-th trial.

Hypergeometric distribution

A random variable $X$ has a hypergeometric distribution with parameters $m$, $n$ and $N$ if
\[
P(X = i) = \frac{\binom{m}{i}\binom{N-m}{n-i}}{\binom{N}{n}}.
\]
This comes up in sampling without replacement: if there are $N$ balls, of which $m$ are one color and the other $N-m$ are another, and we choose $n$ balls at random without replacement, then $X$ represents the number of balls of the first color, and $P(X = i)$ is the probability of having $i$ balls of the first color.

Another way to describe the hypergeometric distribution is that the probability of a success changes on each draw, since each draw decreases the population; in other words, it arises when we consider sampling without replacement from a finite population. Then $N$ is the population size, $m$ is the number of success states in the population, $n$ is the number of draws, that is, the quantity drawn in each trial, and $i$ is the number of observed successes.
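The following sketch evaluates the geometric, negative binomial and hypergeometric PMFs directly from the formulas above; the function names and the numeric parameters are illustrative choices, not part of the text. It also checks the geometric mean $EX = 1/p$ against a truncated sum.

```python
from math import comb

# PMFs written directly from the formulas in this section.

def geometric_pmf(i: int, p: float) -> float:
    """P(X = i) = (1-p)^(i-1) * p: first success on trial i."""
    return (1 - p) ** (i - 1) * p

def neg_binomial_pmf(n: int, r: int, p: float) -> float:
    """P(X = n): the r-th success occurs on trial n (n >= r)."""
    return comb(n - 1, r - 1) * p**r * (1 - p) ** (n - r)

def hypergeometric_pmf(i: int, N: int, m: int, n: int) -> float:
    """P(X = i): i balls of the first color when drawing n of N without replacement."""
    return comb(m, i) * comb(N - m, n - i) / comb(N, n)

p = 0.3   # illustrative parameter
ex = sum(i * geometric_pmf(i, p) for i in range(1, 2000))   # truncated sum
print(ex, 1 / p)                                            # both ~ 3.333

# e.g. probability the 3rd head appears on the 5th fair-coin toss,
# and probability of 2 "success" balls when drawing 4 of 10 with 6 successes.
print(neg_binomial_pmf(5, 3, 0.5))        # C(4,2) * 0.5^5 = 0.1875
print(hypergeometric_pmf(2, 10, 6, 4))    # C(6,2)C(4,2)/C(10,4) ~ 0.4286
```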

6.2. Further examples and applications

6.2.1. Bernoulli and binomial random variables.

Example 6.3. A company prices its hurricane insurance using the following assumptions:
(i) In any calendar year, there can be at most one hurricane.
(ii) In any calendar year, the probability of a hurricane is 0.05.
(iii) The numbers of hurricanes in different calendar years are mutually independent.
Using the company's assumptions, find the probability that there are fewer than 3 hurricanes in a 20-year period.

Solution: denote by $X$ the number of hurricanes in a 20-year period. From the assumptions we see that $X \sim \operatorname{Binom}(20, 0.05)$, therefore
\[
P(X < 3) = P(X \leq 2) = \binom{20}{0}(0.05)^0(0.95)^{20} + \binom{20}{1}(0.05)^1(0.95)^{19} + \binom{20}{2}(0.05)^2(0.95)^{18} \approx 0.9245.
\]

Example 6.4. Phan has a 0.6 probability of making a free throw. Suppose each free throw is independent of the others. If he attempts 10 free throws, what is the probability that he makes at least 2 of them?

Solution: If $X \sim \operatorname{Binom}(10, 0.6)$, then
\[
P(X \geq 2) = 1 - P(X = 0) - P(X = 1) = 1 - \binom{10}{0}(0.6)^0(0.4)^{10} - \binom{10}{1}(0.6)^1(0.4)^{9} \approx 0.998.
\]
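Both binomial answers can be reproduced by summing the PMF directly; this is only a numerical check of the expressions derived above.

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Example 6.3: P(X <= 2) for X ~ Binom(20, 0.05)
print(sum(binom_pmf(k, 20, 0.05) for k in range(3)))       # ~ 0.9245

# Example 6.4: P(X >= 2) for X ~ Binom(10, 0.6)
print(1 - binom_pmf(0, 10, 0.6) - binom_pmf(1, 10, 0.6))   # ~ 0.998
```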

6.2.2. The Poisson distribution. Recall that a Poisson distribution models well events that have a low probability and where the number of trials is high. For example, the probability of a misprint is small and the number of words on a page is usually relatively large compared to the number of misprints. Typical quantities modeled by a Poisson distribution include:
(1) The number of misprints on a random page of a book.
(2) The number of people in a community that survive to age 100.
(3) The number of telephone numbers that are dialed in an average day.
(4) The number of customers entering a post office on an average day.

Example 6.5. Levi receives an average of two texts every 3 minutes. If we assume that the number of texts is Poisson distributed, what is the probability that he receives five or more texts in a 9-minute period?

Solution: Let $X$ be the number of texts in a 9-minute period. Then $\lambda = 3 \cdot 2 = 6$ and
\[
P(X \geq 5) = 1 - P(X \leq 4) = 1 - \sum_{n=0}^{4} \frac{e^{-6} 6^n}{n!} \approx 1 - 0.285 = 0.715.
\]

Example 6.6. Let $X_1, \ldots, X_k$ be independent Poisson random variables, each with expectation $\lambda$. What is the distribution of the random variable $Y := X_1 + \cdots + X_k$?

Solution: The distribution of $Y$ is Poisson with expectation $k\lambda$. To show this, we use Proposition 6.3 and (6.1) to choose $n = mk$ Bernoulli random variables with parameter $p_n = k\lambda/n = \lambda/m$ to approximate the Poisson random variables. If we sum them all together, the limit as $n \to \infty$ gives us a Poisson distribution with expectation $\lim_{n\to\infty} np_n = k\lambda$. However, we can re-arrange the same $n = mk$ Bernoulli random variables into $k$ groups, each group having $m$ Bernoulli random variables. Then the limit gives us the distribution of $X_1 + \cdots + X_k$. This argument can be made rigorous, but this is beyond the scope of this course. Note that we do not show here that we have convergence in distribution.

Example 6.7. Let $X_1, \ldots, X_k$ be independent Poisson random variables with expectations $\lambda_1, \ldots, \lambda_k$, respectively. What is the distribution of the random variable $Y = X_1 + \cdots + X_k$?

Solution: The distribution of $Y$ is Poisson with expectation $\lambda = \lambda_1 + \cdots + \lambda_k$. To show this, we again use Proposition 6.3 and (6.1) with parameter $p_n = \lambda/n$. If $n$ is large, we can separate these $n$ Bernoulli random variables into $k$ groups, each having $n_i \approx \lambda_i n / \lambda$ Bernoulli random variables. The result follows if $\lim_{n\to\infty} n_i/n = \lambda_i/\lambda$ for each $i = 1, \ldots, k$.

This entire set-up, which is quite common, involves what are called independent identically distributed Bernoulli random variables (i.i.d. Bernoulli r.v.).

Example 6.8. Can we use the binomial approximation to find the mean and the variance of a Poisson random variable?

Solution: Yes, and this is really simple. Recall again from Proposition 6.3 and (6.1) that we can approximate a Poisson random variable $Y$ with parameter $\lambda$ by a binomial random variable $\operatorname{Binom}(n, p_n)$, where $p_n = \lambda/n$. Each such binomial random variable is a sum of $n$ independent Bernoulli random variables with parameter $p_n$. Therefore
\[
EY = \lim_{n\to\infty} np_n = \lim_{n\to\infty} n\,\frac{\lambda}{n} = \lambda, \qquad
\operatorname{Var}(Y) = \lim_{n\to\infty} np_n(1 - p_n) = \lim_{n\to\infty} n\,\frac{\lambda}{n}\left(1 - \frac{\lambda}{n}\right) = \lambda.
\]
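Two small checks, assuming nothing beyond the formulas above: the first reproduces the number in Example 6.5, and the second simulates a sum of independent Poisson draws (with arbitrary illustrative rates) to see that its empirical mean matches $\lambda_1 + \cdots + \lambda_k$ as in Example 6.7. The helper `poisson_sample` is a simple home-made sampler, not a library routine.

```python
import random
from math import exp, factorial
from statistics import mean

# Example 6.5: P(X >= 5) for X ~ Pois(6)
print(1 - sum(exp(-6) * 6**n / factorial(n) for n in range(5)))   # ~ 0.715

# Examples 6.6/6.7, empirically: a sum of independent Poisson draws behaves
# like a single Poisson whose mean is the sum of the individual means.
lams = [1.0, 2.5, 0.5]   # illustrative rates

def poisson_sample(lam: float) -> int:
    """Simple inverse-CDF sampler for Pois(lam)."""
    u, i = random.random(), 0
    term = cdf = exp(-lam)       # term = P(X = 0)
    while u > cdf:
        i += 1
        term *= lam / i          # term = P(X = i)
        cdf += term
    return i

sums = [sum(poisson_sample(l) for l in lams) for _ in range(100_000)]
print(mean(sums), sum(lams))     # empirical mean ~ 4.0 = 1.0 + 2.5 + 0.5
```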

6.2. FURTHER EXAMPLES AND APPLICATIONS6.2.3. Table of distributions.The following table summarizes the discrete distribu-tions we have seen in this chapter.N0 N {0}HereNstands for the set of positive integers, andis the set of nonnegative integers.PMF (kNameNotationParametersBernoulliBern(p)p [0, 1]1kBinomialBinom(n, p)n Np [0, 1]nkPoissonPois(λ)λ 0GeometricGeo(p)p (0, 1)NegativeNBin(r, p)r Np (0, 1)binomialHypergeometric89Hyp(N, m, n) N N0n, m N0 N0 )E[X] Var(X)pk (1 p)1 kpp(1 p)pk (1 p)n knpnp(1 p) ke λ λk!λ((1 p)k 1 p, for k 1,1p0,else.( k 1 rp (1 p)k r , if k r, rr 1p0,else. m(mk)(Nn k)N(n)nmNλ1 pp2r(1 p)p2nm(N n)m(1 N)N (N 1)

6.3. Exercises

Exercise 6.1. A UConn student claims that she can distinguish Dairy Bar ice cream from Friendly's ice cream. As a test, she is given ten samples of ice cream (each sample is either from the Dairy Bar or Friendly's) and asked to identify each one. She is right eight times. What is the probability that she would be right exactly eight times if she guessed randomly for each sample?

Exercise 6.2. A pharmaceutical company conducted a study on a new drug that is supposed to treat patients suffering from a certain disease. The study concluded that the drug did not help 25% of those who participated in the study. What is the probability that of 6 randomly selected patients, 4 will recover?

Exercise 6.3. 20% of all students are left-handed. A class of size 20 meets in a room with 18 right-handed desks and 5 left-handed desks. What is the probability that every student will have a suitable desk?

Exercise 6.4. A ball is drawn from an urn containing 4 blue and 5 red balls. After the ball is drawn, it is replaced and another ball is drawn. Suppose this process is done 7 times.
(a) What is the probability that exactly 2 red balls were drawn in the 7 draws?
(b) What is the probability that at least 3 blue balls were drawn in the 7 draws?

Exercise 6.5. The expected number of typos on a page of the new Harry Potter book is 0.2. What is the probability that the next page you read contains
(a) 0 typos?
(b) 2 or more typos?
(c) Explain what assumptions you used.

Exercise 6.6. The monthly average number of car crashes in Storrs, CT is 3.5. What is the probability that there will be
(a) at least 2 accidents in the next month?
(b) at most 1 accident in the next month?
(c) Explain what assumptions you used.

Exercise 6.7. Suppose that, some time in a distant future, the average number of burglaries in New York City in a week is 2.2. Approximate the probability that there will be
(a) no burglaries in the next week;
(b) at least 2 burglaries in the next week.

Exercise 6.8. The number of accidents per working week in a particular shipyard is Poisson distributed with mean 0.5. Find the probability that:
(a) In a particular week there will be at least 2 accidents.

(b) In a particular two-week period there will be exactly 5 accidents.
(c) In a particular month (i.e. a 4-week period) there will be exactly 2 accidents.

Exercise 6.9. Jennifer is baking cookies. She mixes 400 raisins and 600 chocolate chips into her cookie dough and ends up with 500 cookies.
(a) Find the probability that a randomly picked cookie will have three raisins in it.
(b) Find the probability that a randomly picked cookie will have at least one chocolate chip in it.
(c) Find the probability that a randomly picked cookie will have no more than two bits in it (a bit is either a raisin or a chocolate chip).

Exercise 6.10. A roulette wheel has 38 numbers on it: the numbers 0 through 36 and a 00. Suppose that Lauren always bets that the outcome will be a number between 1 and 18 (including 1 and 18).
(a) What is the probability that Lauren will lose her first 6 bets?
(b) What is the probability that Lauren will first win on her sixth bet?

Exercise 6.11. In the US, albinism occurs in about one in 17,000 births. Estimate the probabilities that there will be no albino person, at least one albino, or more than one albino at a football game with 5,000 attendants. Use the Poisson approximation to the binomial to estimate the probabilities.

Exercise 6.12. An egg carton contains 20 eggs, of which 3 have a double yolk. To make a pancake, 5 eggs from the carton are picked at random. What is the probability that at least 2 of them have a double yolk?

Exercise 6.13. Around 30,000 couples married this year in CT. Approximate the probability that in at least one of these couples
(a) both partners have their birthday on January 1st;
(b) both partners celebrate their birthday in the same month.

Exercise 6.14. A telecommunications company has discovered that users are three times as likely to make two-minute calls as to make four-minute calls. The length of a typical call (in minutes) has a Poisson distribution. Find the expected length (in minutes) of a typical call.

6.4. Selected solutions

Solution to Exercise 6.1: This should be modeled using a binomial random variable $X$, since there is a sequence of trials with the same probability of success in each one. If she guesses randomly for each sample, the probability that she will be right each time is $\frac{1}{2}$. Therefore
\[
P(X = 8) = \binom{10}{8}\left(\frac{1}{2}\right)^{8}\left(\frac{1}{2}\right)^{2} = \frac{45}{2^{10}}.
\]

Solution to Exercise 6.2: $\binom{6}{4}(0.75)^4(0.25)^2$.

Solution to Exercise 6.3: For each student to have the kind of desk he or she prefers, there must be no more than 18 right-handed students and no more than 5 left-handed students, so the number of left-handed students must be between 2 and 5 (inclusive). This means that we want the probability that there will be 2, 3, 4, or 5 left-handed students. We use the binomial distribution and get
\[
\sum_{i=2}^{5} \binom{20}{i}\left(\frac{1}{5}\right)^{i}\left(\frac{4}{5}\right)^{20-i}.
\]

Solution to Exercise 6.4(A): $\binom{7}{2}\left(\frac{5}{9}\right)^{2}\left(\frac{4}{9}\right)^{5}$.

Solution to Exercise 6.4(B):
\[
P(X \geq 3) = 1 - P(X \leq 2) = 1 - \binom{7}{0}\left(\frac{4}{9}\right)^{0}\left(\frac{5}{9}\right)^{7} - \binom{7}{1}\left(\frac{4}{9}\right)^{1}\left(\frac{5}{9}\right)^{6} - \binom{7}{2}\left(\frac{4}{9}\right)^{2}\left(\frac{5}{9}\right)^{5}.
\]

Solution to Exercise 6.5(A): $e^{-0.2}$.

Solution to Exercise 6.5(B): $1 - e^{-0.2} - 0.2e^{-0.2} = 1 - 1.2e^{-0.2}$.

Solution to Exercise 6.5(C): Since each word has a small probability of being a typo, the number of typos should be approximately Poisson distributed.

Solution to Exercise 6.6(A): $1 - e^{-3.5} - 3.5e^{-3.5} = 1 - 4.5e^{-3.5}$.

Solution to Exercise 6.6(B): $4.5e^{-3.5}$.

Solution to Exercise 6.6(C): Since each accident has a small probability, it seems reasonable to suppose that the number of car accidents is approximately Poisson distributed.

Solution to Exercise 6.7(A): $e^{-2.2}$.

Solution to Exercise 6.7(B): $1 - e^{-2.2} - 2.2e^{-2.2} = 1 - 3.2e^{-2.2}$.
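The binomial answers in Solutions 6.1, 6.3 and 6.4(A) can be evaluated numerically as a check; this only re-computes the expressions already given above.

```python
from math import comb

# Solution 6.1: P(X = 8) for X ~ Binom(10, 1/2)
print(comb(10, 8) / 2**10)                                                # 45/1024 ~ 0.0439

# Solution 6.3: P(2 <= X <= 5) for X ~ Binom(20, 1/5)
print(sum(comb(20, i) * 0.2**i * 0.8 ** (20 - i) for i in range(2, 6)))   # ~ 0.735

# Solution 6.4(A): exactly 2 red balls in 7 draws, P(red) = 5/9
print(comb(7, 2) * (5 / 9) ** 2 * (4 / 9) ** 5)                           # ~ 0.112
```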

Solution to Exercise 6.8(A): We have
\[
P(X \geq 2) = 1 - P(X \leq 1) = 1 - e^{-0.5}\frac{(0.5)^0}{0!} - e^{-0.5}\frac{(0.5)^1}{1!}.
\]

Solution to Exercise 6.8(B): In two weeks the average number of accidents will be $\lambda = 0.5 + 0.5 = 1$. Then $P(X = 5) = e^{-1}\frac{1^5}{5!}$.

Solution to Exercise 6.8(C): In a 4-week period the average number of accidents will be $\lambda = 4 \cdot (0.5) = 2$. Then $P(X = 2) = e^{-2}\frac{2^2}{2!}$.

Solution to Exercise 6.9(A): This calls for a Poisson random variable $R$. The average number of raisins per cookie is 0.8, so we take this as our $\lambda$. We are asking for $P(R = 3)$, which is $e^{-0.8}\frac{(0.8)^3}{3!} \approx 0.0383$.

Solution to Exercise 6.9(B): This calls for a Poisson random variable $C$. The average number of chocolate chips per cookie is 1.2, so we take this as our $\lambda$. We are asking for $P(C \geq 1)$, which is $1 - P(C = 0) = 1 - e^{-1.2}\frac{(1.2)^0}{0!} \approx 0.6988$.

Solution to Exercise 6.9(C): This calls for a Poisson random variable $B$. The average number of bits per cookie is $0.8 + 1.2 = 2$, so we take this as our $\lambda$. We are asking for $P(B \leq 2)$, which is
\[
P(B = 0) + P(B = 1) + P(B = 2) = e^{-2}\frac{2^0}{0!} + e^{-2}\frac{2^1}{1!} + e^{-2}\frac{2^2}{2!} \approx 0.6767.
\]

Solution to Exercise 6.10(A): $\left(1 - \frac{18}{38}\right)^{6}$.

Solution to Exercise 6.10(B): $\left(1 - \frac{18}{38}\right)^{5} \frac{18}{38}$.

Solution to Exercise 6.11: Let $X$ denote the number of albinos at the game. We have that $X \sim \operatorname{Binom}(5000, p)$ with $p = 1/17000 \approx 0.00006$. The binomial distribution gives us
\[
P(X = 0) = \left(\tfrac{16999}{17000}\right)^{5000} \approx 0.745,
\]
\[
P(X \geq 1) = 1 - P(X = 0) = 1 - \left(\tfrac{16999}{17000}\right)^{5000} \approx 0.255,
\]
\[
P(X > 1) = P(X \geq 1) - P(X = 1) = 1 - \left(\tfrac{16999}{17000}\right)^{5000} - 5000 \cdot \tfrac{1}{17000}\left(\tfrac{16999}{17000}\right)^{4999} \approx 0.035633.
\]
Approximating the distribution of $X$ by a Poisson with parameter $\lambda = \frac{5000}{17000} = \frac{5}{17}$ gives
\[
P(Y = 0) = \exp\left(-\tfrac{5}{17}\right) \approx 0.745,
\]
\[
P(Y \geq 1) = 1 - P(Y = 0) = 1 - \exp\left(-\tfrac{5}{17}\right) \approx 0.255,
\]
\[
P(Y > 1) = P(Y \geq 1) - P(Y = 1) = 1 - \exp\left(-\tfrac{5}{17}\right) - \tfrac{5}{17}\exp\left(-\tfrac{5}{17}\right) \approx 0.035638.
\]

Solution to Exercise 6.12: Let $X$ be the random variable that denotes the number of eggs with double yolk in the set of chosen 5. Then $X \sim \operatorname{Hyp}(20, 3, 5)$ and we have that
\[
P(X \geq 2) = P(X = 2) + P(X = 3) = \frac{\binom{3}{2}\binom{17}{3}}{\binom{20}{5}} + \frac{\binom{3}{3}\binom{17}{2}}{\binom{20}{5}}.
\]
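For Solutions 6.11 and 6.12, the exact binomial and hypergeometric values and the Poisson approximation can be compared directly; this is a numerical restatement of the expressions above, not an alternative method.

```python
from math import comb, exp

# Solution 6.11: exact binomial vs Poisson approximation, n = 5000, p = 1/17000.
n, p = 5000, 1 / 17000
lam = n * p                                        # = 5/17

print((1 - p) ** n, exp(-lam))                     # P(X = 0): both ~ 0.745
print(1 - (1 - p) ** n - n * p * (1 - p) ** (n - 1),
      1 - exp(-lam) - lam * exp(-lam))             # P(X > 1): both ~ 0.0356

# Solution 6.12: hypergeometric probability of at least 2 double-yolk eggs.
print((comb(3, 2) * comb(17, 3) + comb(3, 3) * comb(17, 2)) / comb(20, 5))   # ~ 0.140
```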

Solution to Exercise 6.13: We will use the Poisson approximation.

(a) The probability that both partners have their birthday on January 1st is $p = \frac{1}{365^2}$. If $X$ denotes the number of married couples where this is the case, we can approximate the distribution of $X$ by a Poisson with parameter $\lambda = 30{,}000 \cdot \frac{1}{365^2} \approx 0.2251$. Hence, $P(X \geq 1) = 1 - P(X = 0) = 1 - e^{-0.2251}$.

(b) In this case, the probability of both partners celebrating their birthday in the same month is $1/12$, and therefore we approximate the distribution by a Poisson with parameter $\lambda = 30{,}000/12 = 2500$. Thus, $P(X \geq 1) = 1 - P(X = 0) = 1 - e^{-2500}$.

Solution to Exercise 6.14: Let $X$ denote the duration (in minutes) of a call. By assumption, $X \sim \operatorname{Pois}(\lambda)$ for some parameter $\lambda > 0$, so that the expected duration of a call is $E[X] = \lambda$. In addition, we know that $P(X = 2) = 3P(X = 4)$, which means
\[
e^{-\lambda}\frac{\lambda^2}{2!} = 3 e^{-\lambda}\frac{\lambda^4}{4!}.
\]
From here we deduce that $\lambda^2 = 4$ and hence $E[X] = \lambda = 2$.

Copyright 2017 Phanuel Mariano, Patricia Alonso Ruiz. Copyright 2020 Masha Gordina.

