Generating And Characteristic Functions


Contents: Probability generating function (convolution theorem) · Moment generating function (power series expansion, convolution theorem) · Characteristic function (characteristic function and moments, convolution and unicity, inversion, joint characteristic functions)

September 13, 2013

Probability generating function

Let $X$ be a nonnegative integer-valued random variable. The probability generating function of $X$ is defined to be
$$G_X(s) = E(s^X) = \sum_{k \ge 0} s^k P(X = k).$$
- If $X$ takes a finite number of values, the above expression is a finite sum.
- Otherwise, it is a series that converges at least for $s \in [-1, 1]$ and sometimes in a larger interval.

If $X$ takes a finite number of values $x_0, x_1, \ldots, x_n$, then $G_X(s)$ is a polynomial:
$$G_X(s) = \sum_{k=0}^{n} s^k P(X = k) = P(X = 0) + P(X = 1)\, s + \cdots + P(X = n)\, s^n.$$
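The definition above is easy to probe numerically. A minimal sketch, assuming a fair six-sided die as an illustrative pmf (not taken from the slides): `pgf` evaluates the finite sum $\sum_k s^k P(X=k)$ directly.

```python
# Numerical sketch of the PGF definition G_X(s) = E(s^X) = sum_k s^k P(X = k).
# The fair-die pmf below is a hypothetical example chosen for illustration.
def pgf(pmf, s):
    """Evaluate G_X(s) for a pmf given as {value: probability}."""
    return sum(s**k * p for k, p in pmf.items())

die = {k: 1 / 6 for k in range(1, 7)}

print(pgf(die, 1.0))   # G_X(1) = total probability mass = 1
print(pgf(die, 0.0))   # G_X(0) = P(X = 0), which is 0 for a die
```

Evaluating at $s = 1$ recovers the total probability, and at $s = 0$ the mass at zero, matching the properties stated later in the slides.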

If $X$ takes a countable number of values $x_0, x_1, \ldots, x_k, \ldots$, then
$$G_X(s) = \sum_{k \ge 0} s^k P(X = k) = P(X = 0) + P(X = 1)\, s + \cdots + P(X = k)\, s^k + \cdots$$
is a series that converges at least for $|s| \le 1$, because
$$\sum_{k \ge 0} \left| s^k P(X = k) \right| \le \sum_{k \ge 0} |s|^k P(X = k) \le \sum_{k \ge 0} P(X = k) = 1.$$

Examples

Let $X$ be a Bernoulli random variable, $X \sim B(p)$, so that $P(X = 0) = q$ and $P(X = 1) = p$. We have
$$G_X(s) = \sum_{k \ge 0} s^k P(X = k) = q + sp, \qquad s \in \mathbb{R}.$$

Let $X \sim \mathrm{Bin}(n, p)$, with
$$P(X = k) = \binom{n}{k} p^k q^{n-k}, \qquad k = 0, 1, \ldots, n.$$
Then
$$G_X(s) = \sum_{k \ge 0} s^k P(X = k) = \sum_{k=0}^{n} \binom{n}{k} (sp)^k q^{n-k} = (q + sp)^n, \qquad s \in \mathbb{R}.$$

Let $X \sim \mathrm{Poiss}(\lambda)$, with
$$P(X = k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, \ldots$$
Then
$$G_X(s) = \sum_{k \ge 0} s^k P(X = k) = e^{-\lambda} \sum_{k \ge 0} \frac{(s\lambda)^k}{k!} = e^{-\lambda} e^{s\lambda} = e^{\lambda(s - 1)}, \qquad s \in \mathbb{R}.$$

Let $X \sim \mathrm{Geom}(p)$, with
$$P(X = k) = q^{k-1} p, \qquad k = 1, 2, \ldots, \qquad 0 < p < 1.$$
We have
$$G_X(s) = \sum_{k \ge 1} s^k q^{k-1} p = sp \sum_{k \ge 1} (sq)^{k-1} = \frac{sp}{1 - sq}, \qquad |s| < \frac{1}{q}.$$

Unicity

If two nonnegative, integer-valued random variables have the same generating function, then they follow the same probability law.

Theorem. Let $X$ and $Y$ be nonnegative integer-valued random variables such that $G_X(s) = G_Y(s)$. Then
$$P(X = k) = P(Y = k) \qquad \text{for all } k \ge 0.$$
The result is a special case of the uniqueness theorem for power series.

Convolution theorem

Theorem (Convolution). If $X$ and $Y$ are independent random variables and $Z = X + Y$, then
$$G_Z(s) = G_X(s)\, G_Y(s).$$
Proof:
$$G_Z(s) = E(s^Z) = E\left(s^{X+Y}\right) = E\left(s^X s^Y\right) = E\left(s^X\right) E\left(s^Y\right) = G_X(s)\, G_Y(s).$$

Example

Let $X \sim \mathrm{Bin}(n, p)$, $Y \sim \mathrm{Bin}(m, p)$ be independent random variables and let $Z = X + Y$. We have
$$G_Z(s) = G_X(s)\, G_Y(s) = (q + sp)^n (q + sp)^m = (q + sp)^{n+m}.$$
Observe that $G_Z(s)$ is the probability generating function of a $\mathrm{Bin}(n + m, p)$ random variable. By the unicity theorem,
$$X + Y \sim \mathrm{Bin}(n + m, p).$$
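The binomial example can be verified without generating functions at all, by convolving the two pmfs directly and comparing with $\mathrm{Bin}(n+m, p)$; this is exactly what the unicity argument predicts. Parameters `n`, `m`, `p` are illustrative choices.

```python
# Numerical check of the convolution theorem's consequence: the discrete
# convolution of Bin(n, p) and Bin(m, p) pmfs equals the Bin(n+m, p) pmf.
from math import comb

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 4, 6, 0.35
conv = [sum(binom_pmf(n, p, j) * binom_pmf(m, p, k - j)
            for j in range(max(0, k - m), min(n, k) + 1))
        for k in range(n + m + 1)]
direct = [binom_pmf(n + m, p, k) for k in range(n + m + 1)]

print(max(abs(a - b) for a, b in zip(conv, direct)))  # essentially zero
```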

More generally,

Theorem. Let $X_1, X_2, \ldots, X_n$ be independent, nonnegative, integer-valued random variables and set
$$S_n = X_1 + X_2 + \cdots + X_n.$$
Then
$$G_S(s) = \prod_{k=1}^{n} G_{X_k}(s).$$

A case of particular importance is:

Corollary. If, in addition, $X_1, X_2, \ldots, X_n$ are equidistributed, with common probability generating function $G_X(s)$, then
$$G_S(s) = (G_X(s))^n.$$

Example: Negative binomial probability law

A biased coin such that $P(\text{heads}) = p$ is repeatedly tossed until a total of $k$ heads has been obtained. Let $X$ be the number of tosses. Notice that
$$X = X_1 + X_2 + \cdots + X_k,$$
where $X_i \sim \mathrm{Geom}(p)$ is the number of tosses between the $(i-1)$-th and the $i$-th head.

As $X_1, \ldots, X_k$ are independent and identically distributed, we can apply the convolution theorem. Thus,
$$G_X(s) = G_{X_1}(s)\, G_{X_2}(s) \cdots G_{X_k}(s) = (G_{X_1}(s))^k = \left( \frac{sp}{1 - sq} \right)^k.$$
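The coin-tossing construction can be simulated: summing $k$ independent geometric waiting times should give a variable with mean $k \cdot E(X_i) = k/p$ (using $E(\mathrm{Geom}(p)) = 1/p$, derived later in these slides). A Monte Carlo sketch; `k`, `p`, and the sample size are illustrative.

```python
# Simulation of the construction above: the number of tosses until the
# k-th head is a sum of k independent Geom(p) gaps between heads.
import random

random.seed(0)

def geom(p):
    """Number of tosses until the first head; support 1, 2, ..."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

k, p, N = 3, 0.4, 50_000
samples = [sum(geom(p) for _ in range(k)) for _ in range(N)]
mean = sum(samples) / N
print(mean)  # should be near k / p = 7.5
```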

Recall that if $\alpha \in \mathbb{R}$, then the Taylor series expansion about $0$ of the function $(1 + x)^{\alpha}$ is, for $x \in (-1, 1)$,
$$(1 + x)^{\alpha} = 1 + \alpha x + \frac{\alpha(\alpha - 1)}{2}\, x^2 + \cdots + \frac{\alpha(\alpha - 1) \cdots (\alpha - r + 1)}{r!}\, x^r + \cdots = \sum_{r \ge 0} \binom{\alpha}{r} x^r,$$
where
$$\binom{\alpha}{r} = \frac{\alpha(\alpha - 1) \cdots (\alpha - r + 1)}{r!}.$$

Consider the series expansion of $G_X(s)$:
$$G_X(s) = (sp)^k (1 - sq)^{-k} = (sp)^k \sum_{r \ge 0} \binom{-k}{r} (-sq)^r,$$
where
$$\binom{-k}{r} = \frac{(-k)(-k-1) \cdots (-k-r+1)}{r!} = (-1)^r \binom{k + r - 1}{r}.$$

Therefore,
$$G_X(s) = \sum_{r \ge 0} \binom{k + r - 1}{r} p^k q^r\, s^{k+r} = \sum_{n \ge k} \binom{n - 1}{k - 1} p^k q^{n-k}\, s^n.$$
Hence,
$$P(X = n) = \begin{cases} \dbinom{n-1}{k-1} p^k q^{n-k}, & n = k, k+1, \ldots \\[1ex] 0, & \text{otherwise.} \end{cases}$$
This is the negative binomial probability law.

Properties

- $G_X(0) = P(X = 0)$
- $G_X(1) = 1$. Indeed,
$$G_X(1) = \left[ \sum_{k \ge 0} s^k P(X = k) \right]_{s=1} = \sum_{k \ge 0} P(X = k) = 1.$$
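A quick sanity check of the pmf read off from the series: the terms $\binom{n-1}{k-1} p^k q^{n-k}$ for $n \ge k$ should sum to $1$. A sketch with illustrative parameters `k`, `p`, truncating the sum where the tail is negligible.

```python
# The negative binomial pmf derived above should be a probability law:
# sum over n >= k of C(n-1, k-1) p^k q^(n-k) must equal 1.
from math import comb

k, p = 3, 0.4
q = 1 - p
total = sum(comb(n - 1, k - 1) * p**k * q**(n - k) for n in range(k, 400))
print(total)  # converges to 1
```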

Proposition. Let $R$ be the radius of convergence of $G_X(s)$. If $R > 1$, then
$$E(X) = G_X'(1).$$
Indeed,
$$G_X'(s) = \frac{d}{ds} \sum_{k \ge 0} s^k P(X = k) = \sum_{k \ge 1} k\, s^{k-1} P(X = k),$$
hence
$$G_X'(1) = \sum_{k \ge 1} k\, P(X = k) = E(X).$$

More generally,

Proposition.
- $E(X) = G_X'(1) = \lim_{s \to 1^-} G_X'(s)$
- $E(X(X - 1) \cdots (X - k + 1)) = G_X^{(k)}(1) = \lim_{s \to 1^-} G_X^{(k)}(s)$

Examples

Let $X \sim \mathrm{Bin}(n, p)$.
$$E(X) = G_X'(1) = \left. \frac{d}{ds} (q + sp)^n \right|_{s=1} = \left. np\,(q + sp)^{n-1} \right|_{s=1} = np\,(q + p)^{n-1} = np.$$

Let $X \sim \mathrm{Poiss}(\lambda)$.
$$E(X) = G_X'(1) = \left. \frac{d}{ds}\, e^{\lambda(s-1)} \right|_{s=1} = \left. \lambda\, e^{\lambda(s-1)} \right|_{s=1} = \lambda.$$
Analogously,
$$E(X(X - 1)) = G_X''(1) = \left. \lambda^2 e^{\lambda(s-1)} \right|_{s=1} = \lambda^2.$$
Hence,
$$E(X^2) = \lambda^2 + \lambda, \qquad \mathrm{Var}(X) = E(X^2) - (E(X))^2 = \lambda.$$
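The identity $E(X) = G_X'(1)$ can be checked numerically: a central finite difference of the Poisson PGF at $s = 1$ should recover $\lambda$. A sketch; `lam` and the step `h` are illustrative choices.

```python
# Numerical sketch of E(X) = G'_X(1): differentiate the Poisson PGF
# exp(lam * (s - 1)) at s = 1 by a central difference and compare with lam.
from math import exp

lam = 2.3
G = lambda s: exp(lam * (s - 1))
h = 1e-6
deriv = (G(1 + h) - G(1 - h)) / (2 * h)
print(deriv)  # close to lam = 2.3
```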

Examples

Let $X \sim \mathrm{Geom}(p)$.
$$E(X) = G_X'(1) = \left. \frac{d}{ds} \frac{sp}{1 - sq} \right|_{s=1} = \left. \frac{p}{(1 - sq)^2} \right|_{s=1} = \frac{1}{p}.$$
Analogously,
$$E(X(X - 1)) = G_X''(1) = \left. \frac{2pq}{(1 - sq)^3} \right|_{s=1} = \frac{2pq}{p^3} = \frac{2q}{p^2}.$$
Therefore,
$$E(X^2) = \frac{2q}{p^2} + \frac{1}{p}, \qquad \mathrm{Var}(X) = E(X^2) - (E(X))^2 = \frac{q}{p^2}.$$

Let $X$ be a negative binomial random variable.
$$E(X) = G_X'(1) = \left. \frac{d}{ds} \left( \frac{sp}{1 - sq} \right)^k \right|_{s=1} = \left. k \left( \frac{sp}{1 - sq} \right)^{k-1} \frac{p}{(1 - sq)^2} \right|_{s=1} = \frac{k}{p}.$$
This result can also be obtained from $X = X_1 + \cdots + X_k$, with each $X_i \sim \mathrm{Geom}(p)$. Therefore
$$E(X) = \sum_{i=1}^{k} E(X_i) = \frac{k}{p}.$$

Moment generating function

The moment generating function of a random variable $X$ is defined as
$$\varphi_X(t) = E\left(e^{tX}\right) = \begin{cases} \displaystyle \sum_{i \ge 1} e^{t x_i}\, P(X = x_i), & \text{if } X \text{ is discrete} \\[2ex] \displaystyle \int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx, & \text{if } X \text{ is continuous} \end{cases}$$
provided that the sum or the integral converges.
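The discrete branch of the definition is a one-line sum. A sketch using a Bernoulli pmf as a hypothetical example (parameters are illustrative); for $B(p)$ the result should match $q + p e^t$, the $n = 1$ case of the binomial MGF derived next.

```python
# The discrete branch of the MGF definition:
# phi_X(t) = sum_i exp(t * x_i) * P(X = x_i).
from math import exp

def mgf(pmf, t):
    return sum(exp(t * x) * p for x, p in pmf.items())

bern = {0: 0.7, 1: 0.3}                  # Bernoulli with p = 0.3
t = 0.5
print(mgf(bern, t), 0.7 + 0.3 * exp(t))  # q + p e^t, as for Bin(1, p)
```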

Unicity

The moment generating function specifies uniquely the probability distribution.

Theorem. Let $X$ and $Y$ be random variables. If there exists $h > 0$ such that $\varphi_X(t) = \varphi_Y(t)$ for $|t| < h$, then $X$ and $Y$ are identically distributed.

Examples

Let $X \sim \mathrm{Bin}(n, p)$.
$$\varphi_X(t) = \sum_{k=0}^{n} e^{tk} P(X = k) = \sum_{k=0}^{n} \binom{n}{k} \left( p e^t \right)^k q^{n-k} = \left( q + p e^t \right)^n, \qquad t \in \mathbb{R}.$$

Let $X \sim \mathrm{Exp}(\mu)$.
$$\varphi_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx = \int_0^{\infty} \mu\, e^{-(\mu - t)x}\, dx = \frac{\mu}{\mu - t}, \qquad t < \mu.$$
For continuous random variables, $\varphi_X$ is related to the Laplace transform of $f_X(x)$.

Let $X \sim \mathrm{Poiss}(\lambda)$.
$$\varphi_X(t) = \sum_{k \ge 0} e^{tk} P(X = k) = \sum_{k \ge 0} e^{tk} e^{-\lambda} \frac{\lambda^k}{k!} = e^{-\lambda} \sum_{k \ge 0} \frac{(\lambda e^t)^k}{k!} = e^{\lambda(e^t - 1)}, \qquad t \in \mathbb{R}.$$
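The exponential MGF can be checked against the defining integral. A sketch using a crude midpoint rule over a truncated domain (the tail beyond the cutoff decays like $e^{-(\mu - t)x}$ and is negligible); `mu`, `t`, and the grid are illustrative choices.

```python
# Verify phi_X(t) = mu / (mu - t) for X ~ Exp(mu), t < mu, by numerically
# integrating exp(t x) * mu * exp(-mu x) over [0, infinity).
from math import exp

def exp_mgf_numeric(mu, t, upper=20.0, steps=200_000):
    h = upper / steps
    # midpoint rule on [0, upper]; the tail beyond `upper` is negligible here
    return sum(exp(t * (i + 0.5) * h) * mu * exp(-mu * (i + 0.5) * h)
               for i in range(steps)) * h

mu, t = 2.0, 0.5
print(exp_mgf_numeric(mu, t), mu / (mu - t))  # both about 4/3
```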

Let $Z \sim N(0, 1)$.
$$\varphi_Z(t) = \int_{-\infty}^{\infty} e^{tz} f_Z(z)\, dz = \int_{-\infty}^{\infty} e^{tz}\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}\, dz = e^{\frac{t^2}{2}} \underbrace{\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{(z - t)^2}{2}}\, dz}_{1} = e^{\frac{t^2}{2}}, \qquad t \in \mathbb{R},$$
because
$$\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{(z - t)^2}{2}}\, dz = P(-\infty < N(t, 1) < \infty) = 1.$$

More generally, if
$$X = \sigma Z + m,$$
then $X \sim N(m, \sigma^2)$. We have
$$\varphi_X(t) = E\left(e^{tX}\right) = E\left(e^{t(\sigma Z + m)}\right) = e^{tm}\, E\left(e^{t \sigma Z}\right) = e^{tm}\, \varphi_Z(\sigma t) = e^{\frac{\sigma^2 t^2}{2} + tm}.$$

Power series expansion

Notice that
$$\varphi_X'(t) = \frac{d}{dt} E\left(e^{tX}\right) = E\left(\frac{d}{dt}\, e^{tX}\right) = E\left(X e^{tX}\right).$$
Therefore,
$$\varphi_X'(0) = E(X).$$
Analogously,
$$\varphi_X''(t) = \frac{d}{dt}\, \varphi_X'(t) = \frac{d}{dt} E\left(X e^{tX}\right) = E\left(X^2 e^{tX}\right).$$
Thus,
$$\varphi_X''(0) = E(X^2).$$

For instance, if $X \sim \mathrm{Exp}(\mu)$, then $\varphi_X(t) = \dfrac{\mu}{\mu - t}$, and
$$E(X) = \varphi_X'(0) = \left. \frac{d}{dt} \frac{\mu}{\mu - t} \right|_{t=0} = \left. \frac{\mu}{(\mu - t)^2} \right|_{t=0} = \frac{1}{\mu}.$$
Analogously,
$$E(X^2) = \varphi_X''(0) = \left. \frac{d}{dt} \frac{\mu}{(\mu - t)^2} \right|_{t=0} = \left. \frac{2\mu}{(\mu - t)^3} \right|_{t=0} = \frac{2}{\mu^2}.$$
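The derivative identities $\varphi'(0) = E(X)$ and $\varphi''(0) = E(X^2)$ can be probed with finite differences on the standard normal MGF $e^{t^2/2}$ derived above, for which $E(Z) = 0$ and $E(Z^2) = 1$. The step size `h` is an illustrative choice.

```python
# Finite-difference sketch of phi'(0) = E(X) and phi''(0) = E(X^2),
# using the standard normal MGF phi(t) = exp(t^2 / 2).
from math import exp

phi = lambda t: exp(t * t / 2)
h = 1e-4
first = (phi(h) - phi(-h)) / (2 * h)              # approximates E(Z) = 0
second = (phi(h) - 2 * phi(0) + phi(-h)) / h**2   # approximates E(Z^2) = 1
print(first, second)
```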

More generally,
$$\varphi_X(t) = E\left(e^{tX}\right) = E\left(1 + tX + \frac{(tX)^2}{2!} + \cdots + \frac{(tX)^k}{k!} + \cdots\right) = 1 + E(X)\, t + \frac{E(X^2)}{2!}\, t^2 + \cdots + \frac{E(X^k)}{k!}\, t^k + \cdots$$
This is the power series expansion of $\varphi_X(t)$.

Theorem. If $\varphi_X(t)$ converges on some open interval containing the origin $t = 0$, then $X$ has moments of any order,
$$E\left(X^k\right) = \varphi_X^{(k)}(0),$$
and
$$\varphi_X(t) = \sum_{k \ge 0} \frac{\varphi_X^{(k)}(0)}{k!}\, t^k = \sum_{k \ge 0} \frac{E(X^k)}{k!}\, t^k.$$

For instance, let $X \sim \mathrm{Exp}(\mu)$. If $|t| < \mu$ we have
$$\varphi_X(t) = \frac{\mu}{\mu - t} = \frac{1}{1 - (t/\mu)} = 1 + \frac{t}{\mu} + \left( \frac{t}{\mu} \right)^2 + \cdots$$
Hence,
$$\frac{E(X^n)}{n!} = \frac{1}{\mu^n}, \qquad \text{that is,} \qquad E(X^n) = \frac{n!}{\mu^n}.$$

Convolution theorem

The convolution theorem applies also to moment generating functions.

Theorem. Let $X_1, X_2, \ldots, X_n$ be independent random variables and let $S = X_1 + X_2 + \cdots + X_n$. Then
$$\varphi_S(t) = \prod_{k=1}^{n} \varphi_{X_k}(t).$$
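The moment formula $E(X^n) = n!/\mu^n$ read off from the geometric series can be cross-checked by numerically integrating $x^n \mu e^{-\mu x}$. A midpoint-rule sketch over a truncated domain; `mu` and the grid are illustrative choices.

```python
# Check E(X^n) = n! / mu^n for X ~ Exp(mu), matching the Taylor
# coefficients of mu / (mu - t) found above.
from math import exp, factorial

def exp_moment(mu, n, upper=40.0, steps=150_000):
    h = upper / steps
    # midpoint rule; the integrand's tail beyond `upper` is negligible
    return sum(((i + 0.5) * h)**n * mu * exp(-mu * (i + 0.5) * h)
               for i in range(steps)) * h

mu = 1.5
for n in (1, 2, 3):
    print(exp_moment(mu, n), factorial(n) / mu**n)  # pairs agree
```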

Examples

$X \sim \mathrm{Poiss}(\lambda_X)$, $Y \sim \mathrm{Poiss}(\lambda_Y)$, independent. Let $Z = X + Y$. We have
$$\varphi_Z(t) = \varphi_X(t)\, \varphi_Y(t) = e^{\lambda_X (e^t - 1)}\, e^{\lambda_Y (e^t - 1)} = e^{(\lambda_X + \lambda_Y)(e^t - 1)}.$$
Hence,
$$Z \sim \mathrm{Poiss}(\lambda_X + \lambda_Y).$$

$X \sim N(m_X, \sigma_X^2)$, $Y \sim N(m_Y, \sigma_Y^2)$, independent. Let $Z = X + Y$. We have
$$\varphi_Z(t) = \varphi_X(t)\, \varphi_Y(t) = e^{\frac{\sigma_X^2 t^2}{2} + t m_X}\, e^{\frac{\sigma_Y^2 t^2}{2} + t m_Y} = e^{\frac{(\sigma_X^2 + \sigma_Y^2) t^2}{2} + t (m_X + m_Y)}.$$
Therefore,
$$Z \sim N\left(m_X + m_Y,\; \sigma_X^2 + \sigma_Y^2\right).$$

Characteristic function

The characteristic function of a random variable $X$ is the complex-valued function of the real argument $\omega$
$$M_X : \mathbb{R} \to \mathbb{C}, \qquad \omega \mapsto M_X(\omega),$$
defined as
$$M_X(\omega) = E\left(e^{i \omega X}\right) = E(\cos(\omega X)) + i\, E(\sin(\omega X)).$$
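The Poisson closure property above is easy to test by simulation: summing independent $\mathrm{Poiss}(\lambda_1)$ and $\mathrm{Poiss}(\lambda_2)$ draws should give sample mean and variance both near $\lambda_1 + \lambda_2$. A Monte Carlo sketch with inverse-transform sampling; rates and sample size are illustrative.

```python
# Monte Carlo sketch of Poisson closure under sums: X + Y for independent
# Poiss(lam1), Poiss(lam2) behaves like Poiss(lam1 + lam2).
import random
from math import exp

random.seed(1)

def poisson(lam):
    # inverse-transform sampling through the cumulative pmf
    u = random.random()
    k, p = 0, exp(-lam)
    c = p
    while u > c:
        k += 1
        p *= lam / k
        c += p
    return k

lam1, lam2, N = 1.2, 2.1, 40_000
s = [poisson(lam1) + poisson(lam2) for _ in range(N)]
mean = sum(s) / N
var = sum((x - mean)**2 for x in s) / N
print(mean, var)  # both near lam1 + lam2 = 3.3
```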

Therefore,
$$M_X(\omega) = \begin{cases} \displaystyle \sum_{k} e^{i \omega x_k}\, P(X = x_k), & \text{if } X \text{ is discrete} \\[2ex] \displaystyle \int_{-\infty}^{\infty} e^{i \omega x} f_X(x)\, dx, & \text{if } X \text{ is continuous} \end{cases}$$

- The characteristic function exists for all $\omega$ and for all random variables, since $|M_X(\omega)| \le E\left(\left|e^{i \omega X}\right|\right) = E(1) = 1$.
- If $X$ is continuous, $M_X(\omega)$ is the Fourier transform of $f_X(x)$. (Notice the change of sign from the usual definition.)
- If $X$ is discrete, $M_X(\omega)$ is related to Fourier series.

Properties

- $|M_X(\omega)| \le M_X(0) = 1$ for all $\omega \in \mathbb{R}$. On the other hand, $M_X(0) = E\left(e^{i \cdot 0 \cdot X}\right) = E(1) = 1$.
- $M_X(-\omega) = \overline{M_X(\omega)}$. Indeed,
$$M_X(-\omega) = E(\cos(-\omega X)) + i\, E(\sin(-\omega X)) = E(\cos(\omega X)) - i\, E(\sin(\omega X)) = \overline{M_X(\omega)}.$$
- $M_X(\omega)$ is uniformly continuous in $\mathbb{R}$.

Examples

Let $X \sim \mathrm{Binom}(n, p)$. Then
$$M_X(\omega) = \left( p\, e^{i \omega} + q \right)^n.$$
If $X \sim \mathrm{Poiss}(\lambda)$, then
$$M_X(\omega) = e^{\lambda (e^{i \omega} - 1)}.$$
If $X \sim N(m, \sigma^2)$, then
$$M_X(\omega) = e^{i \omega m - \frac{1}{2} \sigma^2 \omega^2}.$$
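The boundedness property $|M_X(\omega)| \le M_X(0) = 1$ can be observed directly on the binomial closed form $(p e^{i\omega} + q)^n$ using complex arithmetic. A sketch; `n`, `p`, and the grid of $\omega$ values are illustrative.

```python
# The characteristic function is bounded: |M_X(w)| <= 1 and M_X(0) = 1.
# Checked on the Bin(n, p) closed form (p e^{iw} + q)^n from above.
import cmath

def binom_cf(n, p, w):
    return (p * cmath.exp(1j * w) + (1 - p))**n

n, p = 9, 0.25
print(abs(binom_cf(n, p, 0.0)))                                  # equals 1
print(max(abs(binom_cf(n, p, w / 10)) for w in range(-60, 61)))  # never above 1
```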

Characteristic function and moments

Theorem. If $E\left(|X|^n\right) < \infty$ for some $n = 1, 2, \ldots$, then
$$M_X(\omega) = \sum_{k=0}^{n} \frac{E(X^k)}{k!}\, (i \omega)^k + o(|\omega|^n) \quad \text{as } \omega \to 0.$$
So,
$$E(X^k) = \frac{M_X^{(k)}(0)}{i^k} \qquad \text{for } k = 1, 2, \ldots, n.$$

Indeed,
$$M_X(\omega) = E\left(e^{i \omega X}\right) = E\left(\sum_{k \ge 0} \frac{(i \omega X)^k}{k!}\right) = \sum_{k \ge 0} \frac{i^k E(X^k)}{k!}\, \omega^k.$$
But this is the Taylor series expansion of $M_X(\omega)$:
$$M_X(\omega) = \sum_{k \ge 0} \frac{M_X^{(k)}(0)}{k!}\, \omega^k.$$
Therefore,
$$i^k E(X^k) = M_X^{(k)}(0).$$

In particular, if $E(X) = 0$ and $\mathrm{Var}(X) = \sigma^2$, then
$$M_X(\omega) = 1 - \frac{1}{2} \sigma^2 \omega^2 + o(\omega^2) \quad \text{as } \omega \to 0.$$

Convolution theorem

Theorem. Let $X_1, X_2, \ldots, X_n$ be independent random variables and let $S = X_1 + X_2 + \cdots + X_n$. Then
$$M_S(\omega) = \prod_{k=1}^{n} M_{X_k}(\omega).$$
Indeed,
$$M_S(\omega) = E\left(e^{i \omega S}\right) = E\left(e^{i \omega (X_1 + X_2 + \cdots + X_n)}\right) = E\left(e^{i \omega X_1}\, e^{i \omega X_2} \cdots e^{i \omega X_n}\right) = E\left(e^{i \omega X_1}\right) E\left(e^{i \omega X_2}\right) \cdots E\left(e^{i \omega X_n}\right) = M_{X_1}(\omega)\, M_{X_2}(\omega) \cdots M_{X_n}(\omega).$$
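The relation $E(X) = M_X'(0)/i$ can be checked with a complex central difference on the Poisson characteristic function $e^{\lambda(e^{i\omega} - 1)}$, whose mean is $\lambda$. The rate `lam` and step `h` are illustrative choices.

```python
# Sketch of E(X) = M'_X(0) / i: a central difference on the Poisson
# characteristic function recovers the mean lam.
import cmath

lam = 1.7
M = lambda w: cmath.exp(lam * (cmath.exp(1j * w) - 1))
h = 1e-5
EX = ((M(h) - M(-h)) / (2 * h)) / 1j
print(EX.real)  # close to lam = 1.7
```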

In the case $n = 2$ we have essentially the convolution theorem for Fourier transforms.

Theorem. If $X$ and $Y$ are continuous and independent random variables and $Z = X + Y$, then
$$f_Z = f_X * f_Y.$$
This implies
$$\mathcal{F}(f_Z) = \mathcal{F}(f_X) \cdot \mathcal{F}(f_Y),$$
that is,
$$M_Z(\omega) = M_X(\omega)\, M_Y(\omega).$$

Unicity

Theorem. Let $X$ have probability distribution function $F_X$ and characteristic function $M_X$. Let $\bar{F}_X(x) = (F_X(x) + F_X(x^-))/2$. Then
$$\bar{F}_X(b) - \bar{F}_X(a) = \lim_{T \to \infty} \frac{1}{2\pi} \int_{-T}^{T} \frac{e^{-i a \omega} - e^{-i b \omega}}{i \omega}\, M_X(\omega)\, d\omega.$$
$M_X$ specifies uniquely the probability law of $X$. Two random variables have the same characteristic function if and only if they have the same distribution function.

Inversion

Theorem (Inversion of the Fourier transform). Let $X$ be a continuous r.v. with density $f_X$ and characteristic function $M_X$. Then
$$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-i \omega x} M_X(\omega)\, d\omega$$
at every point $x$ at which $f_X$ is differentiable.

In the discrete case, $M_X(\omega)$ is related to Fourier series.

Theorem. If $X$ is an integer-valued random variable, then
$$P(X = k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} e^{-i k \omega} M_X(\omega)\, d\omega.$$
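The integer-valued inversion formula is simple enough to evaluate numerically: integrating $e^{-ik\omega} M_X(\omega)$ over $[-\pi, \pi]$ for the Poisson characteristic function should recover the Poisson pmf. A midpoint-rule sketch; `lam` and the grid size are illustrative.

```python
# Numerical sketch of the inversion formula for integer-valued X:
# P(X = k) = (1 / 2pi) * integral over [-pi, pi] of e^{-ikw} M_X(w) dw,
# evaluated for the Poisson characteristic function.
import cmath
from math import pi, exp, factorial

lam = 2.0
M = lambda w: cmath.exp(lam * (cmath.exp(1j * w) - 1))

def pmf_by_inversion(k, steps=20_000):
    h = 2 * pi / steps
    s = sum(cmath.exp(-1j * k * (-pi + (i + 0.5) * h)) * M(-pi + (i + 0.5) * h)
            for i in range(steps))
    return (s * h / (2 * pi)).real

for k in range(4):
    print(pmf_by_inversion(k), exp(-lam) * lam**k / factorial(k))  # pairs agree
```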

Joint characteristic functions

The joint characteristic function of the random variables $X_1, X_2, \ldots, X_n$ is defined to be
$$M_{\mathbf{X}}(\omega_1, \omega_2, \ldots, \omega_n) = E\left(e^{i (\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_n X_n)}\right).$$
Using vectorial notation one can write
$$\boldsymbol{\omega} = (\omega_1, \omega_2, \cdots, \omega_n)^t, \qquad \mathbf{X} = (X_1, X_2, \cdots, X_n)^t,$$
and
$$M_{\mathbf{X}}(\boldsymbol{\omega}) = E\left(e^{i \boldsymbol{\omega}^t \mathbf{X}}\right).$$

Joint moments

The joint characteristic function allows us to calculate joint moments. For instance, given $X$, $Y$:
$$m_{kl} = E\left(X^k Y^l\right) = \frac{1}{i^{k+l}} \left. \frac{\partial^{k+l} M_{XY}(\omega_1, \omega_2)}{\partial \omega_1^k\, \partial \omega_2^l} \right|_{(\omega_1, \omega_2) = (0, 0)}.$$

Marginal characteristic functions

Marginal characteristic functions are easily derived from the joint characteristic function. For instance, given $X$, $Y$:
$$M_X(\omega) = E\left(e^{i \omega X}\right) = \left. E\left(e^{i (\omega_1 X + \omega_2 Y)}\right) \right|_{(\omega_1 = \omega,\; \omega_2 = 0)} = M_{XY}(\omega, 0).$$
Analogously,
$$M_Y(\omega) = M_{XY}(0, \omega).$$

Independent random variables

Theorem. The random variables $X_1, X_2, \ldots, X_n$ are independent if and only if
$$M_{\mathbf{X}}(\omega_1, \omega_2, \ldots, \omega_n) = M_{X_1}(\omega_1)\, M_{X_2}(\omega_2) \cdots M_{X_n}(\omega_n).$$
If the random variables are independent, then
$$M_{\mathbf{X}}(\omega_1, \omega_2, \ldots, \omega_n) = E\left(e^{i (\omega_1 X_1 + \cdots + \omega_n X_n)}\right) = E\left(e^{i \omega_1 X_1}\right) \cdots E\left(e^{i \omega_n X_n}\right) = M_{X_1}(\omega_1) \cdots M_{X_n}(\omega_n).$$
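The factorisation criterion can be illustrated with empirical characteristic functions: for independent samples, the joint empirical CF approximately equals the product of the marginal ones, up to Monte Carlo noise of order $1/\sqrt{N}$. The Gaussian samples, frequencies, and sample size are illustrative choices.

```python
# Monte Carlo sketch of the independence criterion: for independent X, Y
# the empirical joint CF approximately factorises as M_X(w1) * M_Y(w2).
import cmath
import random

random.seed(2)
N = 20_000
xs = [random.gauss(0, 1) for _ in range(N)]
ys = [random.gauss(0, 1) for _ in range(N)]

def ecf(sample, w):
    """Empirical (marginal) characteristic function at frequency w."""
    return sum(cmath.exp(1j * w * v) for v in sample) / len(sample)

def joint_ecf(w1, w2):
    """Empirical joint characteristic function at (w1, w2)."""
    return sum(cmath.exp(1j * (w1 * x + w2 * y)) for x, y in zip(xs, ys)) / N

w1, w2 = 0.7, -0.4
print(abs(joint_ecf(w1, w2) - ecf(xs, w1) * ecf(ys, w2)))  # small residual
```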

