
Probability Generating Functions and Branching Processes
Thursday, November 05, 2015, 2:02 PM

Reading:
- Karlin and Taylor Sec. 1.1E
- Resnick Secs. 1.3, 1.4
- Lawler Sec. 2.4

Homework 3 will be posted by tomorrow morning; due Monday, November 23 at 2 PM.

We will prepare to discuss a special kind of countable state Markov chain, whose probability transition matrix is awkward to write down, but which can be analyzed efficiently because it involves more independence than standard Markov chains do. We will develop, for this purpose, the technique of probability generating functions, which are often useful when dealing with sums of independent random variables.

The probability generating function of a discrete random variable $X$ taking values in the nonnegative integers is defined as follows:

$\phi_X(z) = E[z^X] = \sum_{k=0}^{\infty} P(X = k)\, z^k.$

Here $z$ is a dummy variable. (One can also extend the concept to the case when $X$ takes on negative values, but we'll leave this out of our consideration.)

The probability generating function so defined is related to, but not the same as, the moment generating function or the characteristic function in probability theory. The purpose of a generating function is analogous to transforms that are used to solve or analyze differential equations by converting the formulation to one where calculations are simpler:
- Probability generating function: z-transform
- Characteristic function: Fourier transform
- Moment generating function: Laplace transform

Characteristic and moment generating functions are versions of the probability generating function that work for a wider class of random variables; the probability generating function is well suited only to discrete random variables.

Essential to all these concepts is that the transforms are invertible; from the transformed function, one can recover uniquely the function which transforms into it. What that means for the probability generating function is that it is a transform of the probability distribution, or probability mass function,

$p_k = P(X = k)$ for $k = 0, 1, 2, \ldots$
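As a concrete illustration of this definition, here is a minimal Python sketch (not from the notes; the pmf values are made up) that builds the generating function of a small discrete distribution directly from its probability mass function:

```python
import numpy as np

# hypothetical pmf of a random variable X supported on {0, 1, 2, 3}
pmf = np.array([0.1, 0.4, 0.3, 0.2])

def phi(z):
    """Probability generating function: phi(z) = E[z^X] = sum_k P(X = k) z^k."""
    return sum(p_k * z**k for k, p_k in enumerate(pmf))

print(phi(1.0))   # 1.0, since evaluating at z = 1 just sums the pmf
print(phi(0.5))   # 0.4, the value of E[0.5^X] for this particular pmf
```

Evaluating at $z = 1$ always returns 1, which is a quick sanity check that a candidate function really is a probability generating function.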

How do we invert the probability generating function to get back the probability mass function? Read off the Taylor coefficients of $\phi_X$ about $z = 0$:

$P(X = k) = \dfrac{1}{k!}\, \dfrac{d^k \phi_X}{dz^k}(0).$

Well, that assumes the probability generating function is a convergent Taylor series about $z = 0$, but the fact that $\sum_{k=0}^{\infty} P(X = k) = 1$, with all terms nonnegative, means that the Taylor series will converge at least on the interval $|z| \le 1$.

The purpose of generating functions is that they make some calculations simpler. If we know the probability generating function, we can compute any moment of the random variable by differentiation (rather than summation of the probability mass function):

$\phi_X'(1) = E[X], \qquad \phi_X''(1) = E[X(X-1)], \qquad \text{so that } \operatorname{Var}(X) = \phi_X''(1) + \phi_X'(1) - \big(\phi_X'(1)\big)^2.$

Example: Compute the mean and variance of the binomial distribution. Doing this directly, by summing against the binomial probability mass function, is somewhat tedious.

Let's instead approach this by using the probability generating function. For $X \sim \mathrm{Binomial}(n, p)$, the binomial theorem gives

$\phi_X(z) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} z^k = (1 - p + pz)^n.$

Differentiating,

$\phi_X'(z) = np\,(1 - p + pz)^{n-1} \quad\Rightarrow\quad E[X] = \phi_X'(1) = np,$

$\phi_X''(z) = n(n-1)p^2\,(1 - p + pz)^{n-2} \quad\Rightarrow\quad E[X(X-1)] = \phi_X''(1) = n(n-1)p^2,$

so

$\operatorname{Var}(X) = n(n-1)p^2 + np - (np)^2 = np(1-p).$
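The same computation can be checked symbolically; here is a short sympy sketch (an illustration, not part of the original notes) that differentiates the binomial pgf and recovers the mean and variance:

```python
import sympy as sp

z, p, n = sp.symbols('z p n', positive=True)

# pgf of the Binomial(n, p) distribution, as derived above
phi = (1 - p + p*z)**n

mean = sp.diff(phi, z).subs(z, 1)                  # E[X] = phi'(1)
second_factorial = sp.diff(phi, z, 2).subs(z, 1)   # E[X(X-1)] = phi''(1)
variance = sp.simplify(second_factorial + mean - mean**2)

print(mean)       # n*p
print(variance)   # n*p*(1 - p), possibly printed in an algebraically equivalent form
```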

Actually, for computing moments the moment generating function is a little bit easier to work with, but that's not our ultimate objective here. More important than this, probability generating functions expedite the analysis of sums of independent random variables:

$S = X_1 + X_2 + \cdots + X_n,$

with the $X_i$ as independent random variables with known probability distributions. The probability distribution for $S$ is related in a somewhat awkward way to the probability distributions for the $X_i$, namely by convolution (which means you have an $(n-1)$-fold sum). However, the probability generating function of $S$ is easily related to the probability generating functions of the $X_i$:

$\phi_S(z) = E\big[z^{X_1 + \cdots + X_n}\big] = E[z^{X_1}] \cdots E[z^{X_n}] = \prod_{i=1}^{n} \phi_{X_i}(z),$

where the expectation factors because the $X_i$ are independent.
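This is just the familiar fact that multiplying polynomials convolves their coefficient sequences. A minimal numpy sketch (the pmf values below are made up for illustration):

```python
import numpy as np

p_X = np.array([0.2, 0.5, 0.3])   # hypothetical pmf of X on {0, 1, 2}
p_Y = np.array([0.6, 0.4])        # hypothetical pmf of Y on {0, 1}

# The pgf coefficients are just the pmf values, so the pgf of X + Y is the
# product of the two polynomials...
pgf_sum = np.polynomial.polynomial.polymul(p_X, p_Y)

# ...which has exactly the coefficients given by the convolution that
# defines the pmf of X + Y.
pmf_sum = np.convolve(p_X, p_Y)

print(pgf_sum)                         # [0.12 0.38 0.38 0.12]
print(np.allclose(pgf_sum, pmf_sum))   # True
```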

Probability generating functions also work well with "random sums," meaning a sum of a random number of random variables. So let us define

$S = \sum_{i=1}^{N} X_i,$

where the $X_i$ are iid random variables, and $N$ is a nonnegative integer random variable which is independent of the $X_i$. Then again, the probability generating function for $S$ can be related to the probability generating functions for $X$ and $N$. We now use the law of total expectation because the calculation simplifies if we "know" the value of the random variable $N$. Partition on the value of $N$:

$\phi_S(z) = E[z^S] = \sum_{n=0}^{\infty} P(N = n)\, E\big[z^{X_1 + \cdots + X_n}\big] = \sum_{n=0}^{\infty} P(N = n)\, \phi_X(z)^n = \phi_N\big(\phi_X(z)\big).$

A mathematical shorthand for this conditional expectation calculation:

$\phi_S(z) = E[z^S] = E\big[E[z^S \mid N]\big] = E\big[\phi_X(z)^N\big] = \phi_N\big(\phi_X(z)\big).$
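The composition rule is easy to check numerically. The sketch below (illustrative parameter values, not from the notes) takes $N \sim \mathrm{Poisson}(\lambda)$ and $X_i \sim \mathrm{Bernoulli}(p)$ and compares a Monte Carlo estimate of $E[z^S]$ with $\phi_N(\phi_X(z))$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, z = 3.0, 0.4, 0.7           # hypothetical parameter values
n_samples = 200_000

N = rng.poisson(lam, size=n_samples)
# Random sum S = X_1 + ... + X_N with X_i ~ Bernoulli(p) iid, independent of N;
# adding up N Bernoulli(p) variables is the same as drawing Binomial(N, p).
S = rng.binomial(N, p)

mc_estimate = np.mean(z ** S)              # Monte Carlo estimate of E[z^S]
phi_X = 1 - p + p * z                      # pgf of Bernoulli(p) at z
phi_composed = np.exp(lam * (phi_X - 1))   # phi_N(phi_X(z)) for N ~ Poisson(lam)

print(mc_estimate, phi_composed)   # the two values should agree to about 3 decimals
```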

More details on generating functions are in Resnick Ch. 0.

Branching Processes (Galton-Watson model)

This is a special version of a countable state Markov chain with state space $\{0, 1, 2, \ldots\}$ that is modeled as follows: At any epoch $n$, $X_n$ represents the number of active "agents" at that epoch. At each epoch, each agent gives rise to a random number of offspring. If the agent survives to the next epoch, we just count it as one of its own offspring. The agents' offspring counts are assumed to be iid. The process continues in this way at each epoch.

This describes a Markov chain because it combines the current state with independent noise at each epoch. The probability transition matrix is awkward, but the stochastic update rule can be compactly represented as a random recursion, namely a random sum:

$X_{n+1} = \sum_{j=1}^{X_n} Y_j^{(n)},$

where $Y_j^{(n)}$ represents the number of offspring of agent $j$ at epoch $n$. The $Y_j^{(n)}$ are taken as independent, identically distributed random variables with a common probability mass function

$p_k = P\big(Y_j^{(n)} = k\big)$ for any $k = 0, 1, 2, \ldots$

This is known as the offspring distribution. And as always, we need to supply the probability distribution for $X_0$, the number of agents that are present at epoch 0; call this

$P(X_0 = k)$ for $k = 0, 1, 2, \ldots$

(A simulation sketch of this random recursion appears after the application list below.)

We have described the simplest type of branching process, but the ideas can be generalized in a number of ways:
- multiple types of agents
- age structure
- continuous time
- see Branching Processes in Biology, Kimmel and Axelrod

Applications of branching processes:
- genealogy
- genetics
- biomolecular reproduction processes (cells, PCR)
- population and cell growth (e.g., tumors)
- queueing models
- dynamics on networks: social networks, networks of computers, neuronal networks
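Here is a minimal Python sketch of the random recursion above (the offspring distribution and the number of generations are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
offspring_pmf = np.array([0.25, 0.5, 0.25])   # hypothetical offspring distribution on {0, 1, 2}

def step(x_n):
    """One epoch of the branching process: each of the x_n agents draws an iid
    offspring count, and the next state is the random sum of those counts."""
    if x_n == 0:
        return 0                                  # extinction is absorbing
    offspring = rng.choice(len(offspring_pmf), size=x_n, p=offspring_pmf)
    return int(offspring.sum())

# simulate a few generations starting from a single agent
x = 1
trajectory = [x]
for _ in range(10):
    x = step(x)
    trajectory.append(x)
print(trajectory)
```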

One often uses an approximation for the dynamics on a network in which the spread of an infection/product/buzz/virus/signal is represented by a branching process. This is appropriate so long as loops in the dynamics can be neglected, which is often the case at early times. The branching process approximation works better than you would think; see Melnik, Hackett, Porter, Mucha, and Gleeson, "The Unreasonable Effectiveness of Tree-Based Theory for Networks with Clustering," Physical Review E 83 (2011), 036112.

How do we compute with branching processes? The probability transition matrix is awkward to construct, but the stochastic update rule looks like a random sum, so probability generating functions should work well. If we apply the random sum rule for probability generating functions, we get:

$\phi_{X_{n+1}}(z) = \phi_{X_n}\big(\psi(z)\big),$

where $\psi$ is the pgf of the offspring distribution. Thus, we can recursively compute the probability generating function for $X_n$ by iterating composition of the pgf of the offspring distribution $n$ times back to the initial pgf for $X_0$:

$\phi_{X_n}(z) = \phi_{X_0}\big(\psi(\psi(\cdots \psi(z) \cdots))\big)$ ($n$-fold composition).
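To illustrate this recursion, the sketch below (illustrative values, not from the notes) starts from a single agent, so $\phi_{X_0}(z) = z$, and evaluates the $n$-fold composition of the offspring pgf at $z = 0$; this gives $P(X_n = 0)$, the probability that the population is extinct by epoch $n$:

```python
import numpy as np

offspring_pmf = np.array([0.25, 0.5, 0.25])   # same hypothetical offspring distribution as above

def psi(z):
    """pgf of the offspring distribution: psi(z) = sum_k p_k z^k."""
    return np.polynomial.polynomial.polyval(z, offspring_pmf)

# With X_0 = 1 we have phi_{X_0}(z) = z, so phi_{X_n}(z) is the n-fold
# composition psi(psi(...psi(z)...)); evaluating at z = 0 gives P(X_n = 0).
z = 0.0
for n in range(1, 21):
    z = psi(z)
    print(n, z)
# This offspring distribution has mean exactly 1 (the critical case), so
# P(X_n = 0) increases toward 1, but only slowly.
```

These extinction probabilities can be compared against the empirical fraction of simulated trajectories (as in the simulation sketch above) that have reached 0 by epoch $n$.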
