Lecture 3 Gaussian Probability Distribution Introduction

Introduction

The Gaussian probability distribution is perhaps the most used distribution in all of science. It is also called the "bell shaped curve" or normal distribution.

Unlike the binomial and Poisson distributions, the Gaussian is a continuous distribution:

    P(y) = \frac{1}{\sigma\sqrt{2\pi}} e^{-(y-\mu)^2 / 2\sigma^2}

- \mu = mean of the distribution (also the location of the mode and median)
- \sigma^2 = variance of the distribution
- y is a continuous variable (-\infty < y < \infty)

The probability P of y being in the range [a, b] is given by an integral:

    P(a \le y \le b) = \frac{1}{\sigma\sqrt{2\pi}} \int_a^b e^{-(y-\mu)^2 / 2\sigma^2} \, dy

(The distribution is named after Karl Friedrich Gauss, 1777-1855.)

The integral for arbitrary a and b cannot be evaluated analytically; its value has to be looked up in a table (e.g. Appendices A and B of Taylor).

[Figure: plot of the Gaussian pdf p(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-(x-\mu)^2 / 2\sigma^2}.]

K.K. Gan, L3: Gaussian Probability Distribution
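Instead of looking the integral up in a table, it can be evaluated with the standard-library error function. A minimal Python sketch (not part of the original notes; function names are my own):

```python
import math

def gaussian_pdf(y, mu=0.0, sigma=1.0):
    """The Gaussian pdf: exp(-(y - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def gaussian_prob(a, b, mu=0.0, sigma=1.0):
    """P(a <= y <= b): the integral that the tables (e.g. Taylor) tabulate,
    written in terms of the error function erf."""
    z = lambda x: (x - mu) / (sigma * math.sqrt(2))
    return 0.5 * (math.erf(z(b)) - math.erf(z(a)))

print(gaussian_prob(-1.0, 1.0))  # probability of being within 1 sigma of the mean
```

The identity used is P(a <= y <= b) = [erf((b - mu)/(sigma sqrt(2))) - erf((a - mu)/(sigma sqrt(2)))]/2.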

The total area under the curve is normalized to one by the probability integral:

    \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(y-\mu)^2 / 2\sigma^2} \, dy = 1

so P(-\infty \le y \le \infty) = 1.

We often talk about a measurement being a certain number of standard deviations (\sigma) away from the mean (\mu) of the Gaussian. We can associate a probability for a measurement to be more than n\sigma from the mean just by calculating the area outside of this region:

    n       Prob. of exceeding \pm n\sigma
    0.67    0.5
    1       0.32
    2       0.05
    3       0.003
    4       0.00006

It is very unlikely (< 0.3%) that a measurement taken at random from a Gaussian pdf will be more than 3\sigma from the true mean of the distribution.

Relationship between Gaussian and binomial distributions

The Gaussian distribution can be derived from the binomial (or Poisson) assuming:
- p is finite
- N is very large
- we have a continuous variable rather than a discrete variable

An example illustrating the small difference between the two distributions under the above conditions: consider tossing a coin 10,000 times, with p(heads) = 0.5 and N = 10,000.
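The n\sigma table can be reproduced in a couple of lines, since the two-sided tail probability of a standard Gaussian is 1 - erf(n/\sqrt{2}). A sketch (mine, not from the notes):

```python
import math

def prob_exceeding(n):
    """Probability that a Gaussian measurement falls more than n sigma
    (on either side) from the mean: 1 - erf(n / sqrt(2))."""
    return 1.0 - math.erf(n / math.sqrt(2))

# reproduce the table from the slide
for n in (0.67, 1, 2, 3, 4):
    print(f"{n:>5} sigma: {prob_exceeding(n):.5f}")
```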

For a binomial distribution:
- mean number of heads: \mu = Np = 5000
- standard deviation: \sigma = [Np(1-p)]^{1/2} = 50

The probability to be within \pm 1\sigma for this binomial distribution is:

    P = \sum_{m=5000-50}^{5000+50} \frac{10^4!}{(10^4 - m)!\, m!} (0.5)^m (0.5)^{10^4 - m} = 0.69

For a Gaussian distribution:

    P(\mu - \sigma \le y \le \mu + \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \int_{\mu-\sigma}^{\mu+\sigma} e^{-(y-\mu)^2 / 2\sigma^2} \, dy \approx 0.68

Both distributions give about the same probability!

Central Limit Theorem

The Gaussian distribution is important because of the Central Limit Theorem. A crude statement of the Central Limit Theorem: things that are the result of the addition of lots of small effects tend to become Gaussian.

A more exact statement:
- Let Y_1, Y_2, \ldots, Y_n be an infinite sequence of independent random variables, each with the same probability distribution. (Actually, the Y's can be from different pdf's!)
- Suppose that the mean (\mu) and variance (\sigma^2) of this distribution are both finite. Then for any numbers a and b:

    \lim_{n \to \infty} P\left(a \le \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} \le b\right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-y^2/2} \, dy

The C.L.T. tells us that under a wide range of circumstances the probability distribution that describes the sum of random variables tends towards a Gaussian distribution as the number of terms in the sum \to \infty.
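The 0.69 vs. 0.68 coin-toss comparison earlier on this page can be checked numerically. A sketch (mine, not from the notes): the binomial pmf is evaluated in log space via lgamma, since 0.5^10000 underflows an ordinary float.

```python
import math

N, p = 10_000, 0.5
mu = N * p                            # 5000
sigma = math.sqrt(N * p * (1 - p))    # 50

def binom_pmf(m):
    """Binomial pmf C(N, m) p^m (1-p)^(N-m), computed in log space."""
    log_pmf = (math.lgamma(N + 1) - math.lgamma(m + 1) - math.lgamma(N - m + 1)
               + m * math.log(p) + (N - m) * math.log(1 - p))
    return math.exp(log_pmf)

# exact binomial sum over mu - sigma <= m <= mu + sigma
p_binom = sum(binom_pmf(m) for m in range(int(mu - sigma), int(mu + sigma) + 1))

# Gaussian: P(mu - sigma <= y <= mu + sigma) = erf(1/sqrt(2))
p_gauss = math.erf(1 / math.sqrt(2))

print(f"binomial: {p_binom:.3f}   gaussian: {p_gauss:.3f}")
```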

Alternatively, in terms of the sample mean \bar{Y} = (Y_1 + Y_2 + \cdots + Y_n)/n:

    \lim_{n \to \infty} P\left(a \le \frac{\bar{Y} - \mu}{\sigma/\sqrt{n}} \le b\right) = \lim_{n \to \infty} P\left(a \le \frac{\bar{Y} - \mu}{\sigma_m} \le b\right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-y^2/2} \, dy

\sigma_m = \sigma/\sqrt{n} is sometimes called "the error in the mean" (more on that later).

For the CLT to be valid:
- \mu and \sigma of the pdf must be finite.
- No one term in the sum should dominate the sum.

A random variable is not the same as a random number. Devore (Probability and Statistics for Engineering and the Sciences): "A random variable is any rule that associates a number with each outcome in S," where S is the set of possible outcomes.

Recall that if y is described by a Gaussian pdf with \mu = 0 and \sigma = 1, then the probability that a \le y \le b is given by:

    P(a \le y \le b) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-y^2/2} \, dy

The CLT is true even if the Y's are from different pdf's, as long as the means and variances are defined for each pdf! See the Appendix of Barlow for a proof of the Central Limit Theorem.
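The claim that the CLT holds even when the Y's come from different pdf's can be checked by simulation. A sketch (my own construction, not from the notes): alternate draws from a uniform and an exponential pdf, standardize the sum with the exact total mean and variance, and check that the result looks like a standard Gaussian.

```python
import math
import random
import statistics

random.seed(1)

def standardized_mixed_sum(n=300):
    """Sum of n draws alternating between uniform[0,1] (mu=1/2, var=1/12)
    and exponential with mean 1 (mu=1, var=1), standardized by the
    exact total mean and total variance."""
    s = total_mu = total_var = 0.0
    for i in range(n):
        if i % 2 == 0:
            s += random.random()
            total_mu += 0.5
            total_var += 1 / 12
        else:
            s += random.expovariate(1.0)
            total_mu += 1.0
            total_var += 1.0
    return (s - total_mu) / math.sqrt(total_var)

samples = [standardized_mixed_sum() for _ in range(2000)]
print(f"mean  = {statistics.mean(samples):+.3f}  (expect ~0)")
print(f"stdev = {statistics.stdev(samples):.3f}  (expect ~1)")
```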

Example: A watch makes an error of at most \pm 1/2 minute per day. After one year, what's the probability that the watch is accurate to within \pm 25 minutes?

Assume that the daily errors are uniform in [-1/2, 1/2]:
- For each day, the average error is zero and the standard deviation is 1/\sqrt{12} minutes.
- The error over the course of a year is just the sum of the daily errors.
- Since the daily errors come from a uniform distribution with a well-defined mean and variance, the Central Limit Theorem is applicable:

    \lim_{n \to \infty} P\left(a \le \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} \le b\right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-y^2/2} \, dy

The upper limit corresponds to +25 minutes:

    b = \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} = \frac{25 - 365 \cdot 0}{\frac{1}{\sqrt{12}}\sqrt{365}} = 4.5

The lower limit corresponds to -25 minutes:

    a = \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} = \frac{-25 - 365 \cdot 0}{\frac{1}{\sqrt{12}}\sqrt{365}} = -4.5

The probability to be within \pm 25 minutes:

    P = \frac{1}{\sqrt{2\pi}} \int_{-4.5}^{4.5} e^{-y^2/2} \, dy = 0.999997

so 1 - P \approx 3 \times 10^{-6}: less than a three-in-a-million chance that the watch will be off by more than 25 minutes in a year!
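The watch calculation above can be sketched in a few lines of Python (my own check, not from the notes), using erf for the Gaussian integral:

```python
import math

sigma_daily = 1 / math.sqrt(12)        # std dev of a uniform error on [-1/2, 1/2] min
n_days = 365
sigma_year = sigma_daily * math.sqrt(n_days)  # CLT: std dev of the yearly sum

b = 25 / sigma_year                    # upper limit in standard deviations (~4.5)
p_within = math.erf(b / math.sqrt(2))  # P(-b <= Z <= b) for standard Gaussian Z

print(f"b = {b:.2f} sigma")
print(f"P(within +/-25 min) = {p_within:.6f}")
print(f"P(off by more)      = {1 - p_within:.1e}")
```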

Example: Generate a Gaussian distribution using random numbers.

A random number generator gives numbers distributed uniformly in the interval [0, 1], with \mu = 1/2 and \sigma^2 = 1/12.

Procedure:
- Take 12 numbers (r_i) from your computer's random number generator.
- Add them together.
- Subtract 6.
- You get a number that looks as if it is from a Gaussian pdf!

With n = 12, \mu = 1/2, and \sigma = 1/\sqrt{12}, the standardizing factor is \sigma\sqrt{n} = 1, so:

    P\left(a \le \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} \le b\right) = P\left(-6 \le \sum_{i=1}^{12} r_i - 6 \le 6\right) = \frac{1}{\sqrt{2\pi}} \int_{-6}^{6} e^{-y^2/2} \, dy

[Figure: histograms of A) 5000 random numbers, B) 5000 pairs (r_1 + r_2) of random numbers, C) 5000 triplets (r_1 + r_2 + r_3) of random numbers, D) 5000 12-plets (r_1 + \cdots + r_{12}) of random numbers, and E) 5000 12-plets (r_1 + \cdots + r_{12} - 6) of random numbers, the last compared with a Gaussian with \mu = 0 and \sigma = 1 over the range [-6, 6].]

Thus the sum of 12 uniform random numbers minus 6 is distributed as if it came from a Gaussian pdf with \mu = 0 and \sigma = 1.
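The procedure above translates directly into code. A sketch (mine, not from the notes) that generates 5000 such numbers and checks that their mean, width, and 1\sigma coverage match a standard Gaussian:

```python
import random
import statistics

random.seed(42)

# 5000 samples of (sum of 12 uniform[0,1] numbers) - 6
samples = [sum(random.random() for _ in range(12)) - 6 for _ in range(5000)]

frac_1sigma = sum(abs(s) < 1 for s in samples) / len(samples)

print(f"mean  = {statistics.mean(samples):+.3f}   (expect ~0)")
print(f"stdev = {statistics.stdev(samples):.3f}    (expect ~1)")
print(f"fraction within 1 sigma = {frac_1sigma:.3f}  (expect ~0.68)")
```

This trick was historically popular because it needs no transcendental functions, though modern code would typically use a dedicated normal generator instead.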

Example: The daily income of a "card shark" has a uniform distribution in the interval [-$40, $50]. What is the probability that s/he wins more than $500 in 60 days?

Let's use the CLT to estimate this probability:

    \lim_{n \to \infty} P\left(a \le \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} \le b\right) = \frac{1}{\sqrt{2\pi}} \int_a^b e^{-y^2/2} \, dy

The probability distribution of daily income is uniform, p(y) = 1, which needs to be normalized in computing the average daily winning (\mu) and its standard deviation (\sigma):

    \mu = \frac{\int_{-40}^{50} y\, p(y)\, dy}{\int_{-40}^{50} p(y)\, dy} = \frac{\frac{1}{2}[50^2 - (-40)^2]}{50 - (-40)} = 5

    \sigma^2 = \frac{\int_{-40}^{50} y^2 p(y)\, dy}{\int_{-40}^{50} p(y)\, dy} - \mu^2 = \frac{\frac{1}{3}[50^3 - (-40)^3]}{50 - (-40)} - 25 = 675

The lower limit corresponds to winning $500:

    a = \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} = \frac{500 - 60 \cdot 5}{\sqrt{675}\sqrt{60}} = \frac{200}{201} \approx 1

The upper limit is the maximum that the shark could win ($50/day for 60 days):

    b = \frac{Y_1 + Y_2 + \cdots + Y_n - n\mu}{\sigma\sqrt{n}} = \frac{3000 - 60 \cdot 5}{\sqrt{675}\sqrt{60}} = \frac{2700}{201} = 13.4

The probability to win more than $500:

    P = \frac{1}{\sqrt{2\pi}} \int_1^{13.4} e^{-y^2/2} \, dy \approx \frac{1}{\sqrt{2\pi}} \int_1^{\infty} e^{-y^2/2} \, dy = 0.16

There is a 16% chance to win more than $500 in 60 days.
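The card-shark estimate can be reproduced numerically. A sketch (my own check, not from the notes), using the closed form \sigma^2 = (\text{width})^2/12 for a uniform distribution, which gives the same 675 as the integrals above:

```python
import math

lo, hi = -40.0, 50.0                  # daily income uniform on [-$40, $50]
n = 60

mu = (lo + hi) / 2                    # 5 ($/day)
var = (hi - lo) ** 2 / 12             # 675, variance of a uniform distribution

sum_mu = n * mu                       # 300, mean total winnings
sum_sigma = math.sqrt(n * var)        # ~201, std dev of total winnings

a = (500 - sum_mu) / sum_sigma        # ~1.0: the $500 threshold
b = (n * hi - sum_mu) / sum_sigma     # ~13.4: the maximum possible winnings

# P(a <= Z <= b) for a standard Gaussian Z
p = 0.5 * (math.erf(b / math.sqrt(2)) - math.erf(a / math.sqrt(2)))
print(f"a = {a:.2f}, b = {b:.2f}, P(win > $500) = {p:.2f}")
```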

