Martingale Problems and Stochastic Equations for Markov Processes


Martingale problems and stochastic equations for Markov processes

1. Basics of stochastic processes
2. Markov processes and generators
3. Martingale problems
4. Existence of solutions and forward equations
5. Stochastic integrals for Poisson random measures
6. Weak and strong solutions of stochastic equations
7. Stochastic equations for Markov processes in $\mathbb{R}^d$
8. Convergence for Markov processes characterized by martingale problems

9. Convergence for Markov processes characterized by stochastic differential equations
10. Martingale problems for conditional distributions
11. Equivalence of stochastic equations and martingale problems
12. Genealogies and ordered representations of measure-valued processes
13. Poisson representations
14. Stochastic partial differential equations
15. Information and conditional expectation
16. Technical lemmas
17. Exercises
18. Stochastic analysis exercises
19. References

http://www.math.wisc.edu/~kurtz/FrankLect.htm

1. Basics of stochastic processes

- Filtrations
- Stopping times
- Martingales
- Optional sampling theorem
- Doob's inequalities
- Stochastic integrals
- Local martingales
- Semimartingales
- Computing quadratic variations
- Covariation
- Itô's formula

Conventions and caveats

State spaces are always complete, separable metric spaces (sometimes called Polish spaces), usually denoted $(E, r)$.

All probability spaces are complete.

All identities involving conditional expectations (or conditional probabilities) only hold almost surely (even when I don't say so).

If the filtration $\{\mathcal{F}_t\}$ involved is obvious, I will say adapted, rather than $\{\mathcal{F}_t\}$-adapted, stopping time, rather than $\{\mathcal{F}_t\}$-stopping time, etc.

All processes are cadlag (right continuous with left limits at each $t \geq 0$), unless otherwise noted.

A process is real-valued if that is the only way the formula makes sense.

References

Kurtz, Lecture Notes for Math 735, http://www.math.wisc.edu/~kurtz/m735.htm

Ethier and Kurtz, Markov Processes: Characterization and Convergence

Protter, Stochastic Integration and Differential Equations, Second Edition

Filtrations

$(\Omega, \mathcal{F}, P)$ a probability space. Available information is modeled by a sub-$\sigma$-algebra of $\mathcal{F}$.

$\mathcal{F}_t$ = information available at time $t$. $\{\mathcal{F}_t\}$ is a filtration: $t \leq s$ implies $\mathcal{F}_t \subset \mathcal{F}_s$.

$\{\mathcal{F}_t\}$ is complete if $\mathcal{F}_0$ contains all subsets of sets of probability zero.

A stochastic process $X$ is adapted to $\{\mathcal{F}_t\}$ if $X(t)$ is $\mathcal{F}_t$-measurable for each $t \geq 0$.

An $E$-valued stochastic process $X$ adapted to $\{\mathcal{F}_t\}$ is $\{\mathcal{F}_t\}$-Markov if
$$E[f(X(t+r)) \mid \mathcal{F}_t] = E[f(X(t+r)) \mid X(t)], \qquad t, r \geq 0,\ f \in B(E).$$

Measurability for stochastic processes

A stochastic process is an indexed family of random variables, but if the index set is $[0, \infty)$, then we may want to know more about $X(t, \omega)$ than that it is a measurable function of $\omega$ for each $t$. For example, for an $\mathbb{R}$-valued process $X$, when are
$$\int_a^b X(s, \omega)\,ds \quad\text{and}\quad X(\tau(\omega), \omega)$$
random variables?

$X$ is measurable if $(t, \omega) \in [0, \infty) \times \Omega \to X(t, \omega) \in E$ is $\mathcal{B}([0, \infty)) \times \mathcal{F}$-measurable.

Lemma 1.1 If $X$ is measurable and $\int_a^b |X(s, \omega)|\,ds < \infty$, then $\int_a^b X(s, \omega)\,ds$ is a random variable.

If, in addition, $\tau$ is a nonnegative random variable, then $X(\tau(\omega), \omega)$ is a random variable.

Proof. The first part is a standard result for measurable functions on a product space. Verify the result for $X(s, \omega) = 1_A(s) 1_B(\omega)$, $A \in \mathcal{B}[0, \infty)$, $B \in \mathcal{F}$, and apply the Dynkin class theorem to extend the result to $1_C$, $C \in \mathcal{B}[0, \infty) \times \mathcal{F}$.

If $\tau$ is a nonnegative random variable, then $\omega \in \Omega \to (\tau(\omega), \omega) \in [0, \infty) \times \Omega$ is measurable. Consequently, $X(\tau(\omega), \omega)$ is the composition of two measurable functions.

Measurability continued

A stochastic process $X$ is $\{\mathcal{F}_t\}$-adapted if for all $t \geq 0$, $X(t)$ is $\mathcal{F}_t$-measurable.

If $X$ is measurable and adapted, the restriction of $X$ to $[0, t] \times \Omega$ is $\mathcal{B}[0, t] \times \mathcal{F}$-measurable, but it may not be $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable.

$X$ is progressive if for each $t \geq 0$, $(s, \omega) \in [0, t] \times \Omega \to X(s, \omega) \in E$ is $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable.

Let
$$\mathcal{W} = \{A \in \mathcal{B}[0, \infty) \times \mathcal{F} : A \cap ([0, t] \times \Omega) \in \mathcal{B}[0, t] \times \mathcal{F}_t,\ t \geq 0\}.$$
Then $\mathcal{W}$ is a $\sigma$-algebra and $X$ is progressive if and only if $(s, \omega) \to X(s, \omega)$ is $\mathcal{W}$-measurable.

Since pointwise limits of measurable functions are measurable, pointwise limits of progressive processes are progressive.

Stopping times

Let $\{\mathcal{F}_t\}$ be a filtration. $\tau$ is an $\{\mathcal{F}_t\}$-stopping time if and only if $\{\tau \leq t\} \in \mathcal{F}_t$ for each $t \geq 0$.

If $\tau$ is a stopping time, $\mathcal{F}_\tau = \{A \in \mathcal{F} : A \cap \{\tau \leq t\} \in \mathcal{F}_t,\ t \geq 0\}$.

If $\tau_1$ and $\tau_2$ are stopping times with $\tau_1 \leq \tau_2$, then $\mathcal{F}_{\tau_1} \subset \mathcal{F}_{\tau_2}$.

If $\tau_1$ and $\tau_2$ are stopping times, then $\tau_1$ and $\tau_1 \wedge \tau_2$ are $\mathcal{F}_{\tau_1}$-measurable.

A process observed at a stopping time

If $X$ is measurable and $\tau$ is a stopping time, then $X(\tau(\omega), \omega)$ is a random variable.

Lemma 1.2 If $\tau$ is a stopping time and $X$ is progressive, then $X(\tau)$ is $\mathcal{F}_\tau$-measurable.

Proof. $\omega \in \Omega \to (\tau(\omega) \wedge t, \omega) \in [0, t] \times \Omega$ is measurable as a mapping from $(\Omega, \mathcal{F}_t)$ to $([0, t] \times \Omega, \mathcal{B}[0, t] \times \mathcal{F}_t)$. Consequently,
$$\omega \to X(\tau(\omega) \wedge t, \omega)$$
is $\mathcal{F}_t$-measurable, and
$$\{X(\tau) \in A\} \cap \{\tau \leq t\} = \{X(\tau \wedge t) \in A\} \cap \{\tau \leq t\} \in \mathcal{F}_t.$$

Right continuous processes

Most of the processes you know are either continuous (e.g., Brownian motion) or right continuous (e.g., Poisson process).

Lemma 1.3 If $X$ is right continuous and adapted, then $X$ is progressive.

Proof. If $X$ is adapted, then
$$(s, \omega) \in [0, t] \times \Omega \to Y_n(s, \omega) \equiv X\Big(\frac{[ns]+1}{n} \wedge t, \omega\Big) = \sum_k X\Big(\frac{k+1}{n} \wedge t, \omega\Big) 1_{[\frac{k}{n}, \frac{k+1}{n})}(s)$$
is $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable. By the right continuity of $X$, $Y_n(s, \omega) \to X(s, \omega)$ pointwise on $[0, t] \times \Omega$, so $(s, \omega) \in [0, t] \times \Omega \to X(s, \omega)$ is $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable and $X$ is progressive.

Examples and properties

Define $\mathcal{F}_{t+} = \bigcap_{s > t} \mathcal{F}_s$. $\{\mathcal{F}_t\}$ is right continuous if $\mathcal{F}_t = \mathcal{F}_{t+}$ for all $t \geq 0$.

If $\{\mathcal{F}_t\}$ is right continuous, then $\tau$ is a stopping time if and only if $\{\tau < t\} \in \mathcal{F}_t$ for all $t > 0$.

Let $X$ be cadlag and adapted. If $K \subset E$ is closed, $\tau_K^h = \inf\{t : X(t) \in K \text{ or } X(t-) \in K\}$ is a stopping time, but $\inf\{t : X(t) \in K\}$ may not be; however, if $\{\mathcal{F}_t\}$ is right continuous and complete, then for any $B \in \mathcal{B}(E)$, $\tau_B = \inf\{t : X(t) \in B\}$ is an $\{\mathcal{F}_t\}$-stopping time. This result is a special case of the debut theorem, a very technical result from set theory. Note that
$$\{\omega : \tau_B(\omega) < t\} = \{\omega : \exists\, s < t \ni X(s, \omega) \in B\} = \mathrm{proj}_\Omega\{(s, \omega) : X(s, \omega) \in B,\ s < t\}.$$

Piecewise constant approximations

Fix $\epsilon > 0$, let $\tau_0^\epsilon = 0$, and define
$$\tau_{i+1}^\epsilon = \inf\{t > \tau_i^\epsilon : r(X(t), X(\tau_i^\epsilon)) \vee r(X(t-), X(\tau_i^\epsilon)) \geq \epsilon\}.$$
Define $X^\epsilon(t) = X(\tau_i^\epsilon)$ for $\tau_i^\epsilon \leq t < \tau_{i+1}^\epsilon$. Then $r(X(t), X^\epsilon(t)) \leq \epsilon$, as illustrated in the sketch below.

If $X$ is adapted to $\{\mathcal{F}_t\}$, then the $\{\tau_i^\epsilon\}$ are $\{\mathcal{F}_t\}$-stopping times and $X^\epsilon$ is $\{\mathcal{F}_t\}$-adapted. See Exercise 4.
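The construction is easy to carry out on a discretized path. A minimal numpy sketch (mine, not from the notes), assuming a Brownian path sampled on a fine grid and the metric $r(x, y) = |x - y|$; the grid size, seed, and tolerance are illustrative:

```python
import numpy as np

# Epsilon-skeleton X^eps of a discretized Brownian path, r(x, y) = |x - y|.
rng = np.random.default_rng(0)
n, T, eps = 100_000, 1.0, 0.05
X = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(T / n), n))])

X_eps = np.empty_like(X)
anchor = X[0]                         # X(tau_i^eps), the most recent anchor value
for k in range(len(X)):
    if abs(X[k] - anchor) >= eps:     # tau_{i+1}^eps: distance from anchor reaches eps
        anchor = X[k]
    X_eps[k] = anchor

# r(X(t), X^eps(t)) stays below eps, up to the overshoot of one grid increment:
print(np.max(np.abs(X - X_eps)))
```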

Martingales

An $\mathbb{R}$-valued stochastic process $M$ adapted to $\{\mathcal{F}_t\}$ is an $\{\mathcal{F}_t\}$-martingale if
$$E[M(t+r) \mid \mathcal{F}_t] = M(t), \qquad t, r \geq 0.$$

Every martingale has finite quadratic variation:
$$[M]_t = \lim \sum_i (M(t \wedge t_{i+1}) - M(t \wedge t_i))^2$$
where $0 = t_0 < t_1 < \cdots$, $t_i \to \infty$, and the limit is in probability as $\max(t_{i+1} - t_i) \to 0$. More precisely, for $\epsilon > 0$ and $T > 0$,
$$\lim P\Big\{\sup_{t \leq T} \Big|[M]_t - \sum_i (M(t \wedge t_{i+1}) - M(t \wedge t_i))^2\Big| \geq \epsilon\Big\} = 0.$$

For standard Brownian motion $W$, $[W]_t = t$.
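For Brownian motion the partition sums can be checked numerically. A short simulation sketch (mine, not from the notes; parameters arbitrary): sum the squared increments over a fine partition and compare with $t$; the total variation sum, by contrast, blows up as the mesh shrinks.

```python
import numpy as np

# Check [W]_t = t via squared-increment sums over a fine partition.
rng = np.random.default_rng(1)
t, n = 2.0, 1_000_000
dW = rng.normal(0.0, np.sqrt(t / n), n)   # increments W(t_{i+1}) - W(t_i)
print(np.sum(dW**2))                      # close to t = 2.0
print(np.sum(np.abs(dW)))                 # total variation sum ~ sqrt(2nt/pi): diverges
```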

Optional sampling theorem

A real-valued process $X$ is a submartingale if $E[|X(t)|] < \infty$, $t \geq 0$, and
$$E[X(t+s) \mid \mathcal{F}_t] \geq X(t), \qquad t, s \geq 0.$$

If $\tau_1$ and $\tau_2$ are stopping times, then
$$E[X(t \wedge \tau_2) \mid \mathcal{F}_{\tau_1}] \geq X(t \wedge \tau_1 \wedge \tau_2).$$

If $\tau_2$ is finite a.s., $E[|X(\tau_2)|] < \infty$, and $\lim_{t \to \infty} E[|X(t)| 1_{\{\tau_2 > t\}}] = 0$, then
$$E[X(\tau_2) \mid \mathcal{F}_{\tau_1}] \geq X(\tau_1 \wedge \tau_2).$$

Of course, if $X$ is a martingale,
$$E[X(t \wedge \tau_2) \mid \mathcal{F}_{\tau_1}] = X(t \wedge \tau_1 \wedge \tau_2).$$

Square integrable martingales

Let $M$ be a martingale satisfying $E[M(t)^2] < \infty$. Then
$$M(t)^2 - [M]_t$$
is a martingale. In particular, for $t > s$,
$$E[(M(t) - M(s))^2] = E[[M]_t - [M]_s].$$

Doob's inequalities

Let $X$ be a submartingale. Then for $x > 0$,
$$P\{\sup_{s \leq t} X(s) \geq x\} \leq x^{-1} E[X(t)^+]$$
$$P\{\inf_{s \leq t} X(s) \leq -x\} \leq x^{-1}(E[X(t)^+] - E[X(0)])$$

If $X$ is nonnegative and $\alpha > 1$, then
$$E[\sup_{s \leq t} X(s)^\alpha] \leq \Big(\frac{\alpha}{\alpha - 1}\Big)^\alpha E[X(t)^\alpha].$$

Note that by Jensen's inequality, if $M$ is a martingale, then $|M|$ is a submartingale. In particular, if $M$ is a square integrable martingale, then
$$E[\sup_{s \leq t} |M(s)|^2] \leq 4 E[M(t)^2].$$
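A quick Monte Carlo illustration of the last inequality (my sketch), taking $M = W$ on $[0, 1]$; the path count and grid are arbitrary:

```python
import numpy as np

# Check E[sup_{s<=t} |M(s)|^2] <= 4 E[M(t)^2] for M = W on [0, 1].
rng = np.random.default_rng(2)
paths, n, t = 5000, 1000, 1.0
W = np.cumsum(rng.normal(0.0, np.sqrt(t / n), (paths, n)), axis=1)
lhs = np.mean(np.max(np.abs(W), axis=1) ** 2)   # E[sup_s |W(s)|^2]
rhs = 4 * np.mean(W[:, -1] ** 2)                # 4 E[W(t)^2], roughly 4t
print(lhs, "<=", rhs)
```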

Stochastic integrals

Definition 1.4 For cadlag processes $X$, $Y$,
$$X_- \cdot Y(t) = \int_0^t X(s-)\,dY(s) = \lim_{\max|t_{i+1} - t_i| \to 0} \sum_i X(t_i)(Y(t_{i+1} \wedge t) - Y(t_i \wedge t))$$
whenever the limit exists in probability.

Sample paths of bounded variation: If $Y$ is a finite variation process, the stochastic integral exists (apply the dominated convergence theorem) and
$$\int_0^t X(s-)\,dY(s) = \int_{(0,t]} X(s-)\,\alpha_Y(ds),$$
where $\alpha_Y$ is the signed measure with
$$\alpha_Y(0, t] = Y(t) - Y(0).$$
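The left-endpoint evaluation in the definition matters. A simulation sketch (assumptions mine), taking $X = Y = W$: the sums converge to $\int_0^t W\,dW = (W(t)^2 - t)/2$, the Itô (left-endpoint) answer rather than the Stratonovich one.

```python
import numpy as np

# Left-endpoint Riemann sums for X = Y = W converge to (W(t)^2 - t)/2.
rng = np.random.default_rng(3)
t, n = 1.0, 1_000_000
dW = rng.normal(0.0, np.sqrt(t / n), n)
W = np.concatenate([[0.0], np.cumsum(dW)])
riemann = np.sum(W[:-1] * dW)     # sum of W(t_i) (W(t_{i+1}) - W(t_i))
print(riemann, (W[-1] ** 2 - t) / 2)   # agree to ~1/sqrt(n)
```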

Existence for square integrable martingales

If $M$ is a square integrable martingale, then
$$E[(M(t+s) - M(t))^2 \mid \mathcal{F}_t] = E[[M]_{t+s} - [M]_t \mid \mathcal{F}_t].$$

For partitions $\{t_i\}$ and $\{r_i\}$,
$$E\Big[\Big(\sum_i X(t_i)(M(t_{i+1} \wedge t) - M(t_i \wedge t)) - \sum_i X(r_i)(M(r_{i+1} \wedge t) - M(r_i \wedge t))\Big)^2\Big]$$
$$= E\Big[\int_0^t (X(t(s-)) - X(r(s-)))^2\,d[M]_s\Big] = E\Big[\int_{(0,t]} (X(t(s-)) - X(r(s-)))^2\,\alpha_{[M]}(ds)\Big],$$
where $t(s) = t_i$ for $s \in [t_i, t_{i+1})$ and $r(s) = r_i$ for $s \in [r_i, r_{i+1})$.

Cauchy property

Let $X$ be bounded by a constant. As $\sup_i(t_{i+1} - t_i) + \sup_i(r_{i+1} - r_i) \to 0$, the right side converges to zero by the dominated convergence theorem.

$M_X^{\{t_i\}}(t) = \sum_i X(t_i)(M(t_{i+1} \wedge t) - M(t_i \wedge t))$ is a square integrable martingale, so
$$E\Big[\sup_{t \leq T}\Big(\sum_i X(t_i)(M(t_{i+1} \wedge t) - M(t_i \wedge t)) - \sum_i X(r_i)(M(r_{i+1} \wedge t) - M(r_i \wedge t))\Big)^2\Big]$$
$$\leq 4 E\Big[\int_{(0,T]} (X(t(s-)) - X(r(s-)))^2\,\alpha_{[M]}(ds)\Big].$$

A completeness argument gives existence of the stochastic integral, and the uniformity implies the integral is cadlag.

Local martingales

Definition 1.5 $M$ is a local martingale if there exist stopping times $\{\tau_n\}$ satisfying $\tau_1 \leq \tau_2 \leq \cdots$ and $\tau_n \to \infty$ a.s. such that $M^{\tau_n}$ defined by $M^{\tau_n}(t) = M(\tau_n \wedge t)$ is a martingale. $M$ is a local square-integrable martingale if the $\tau_n$ can be selected so that $M^{\tau_n}$ is square integrable.

$\{\tau_n\}$ is called a localizing sequence for $M$.

Remark 1.6 If $\{\tau_n\}$ is a localizing sequence for $M$, and $\{\gamma_n\}$ is another sequence of stopping times satisfying $\gamma_1 \leq \gamma_2 \leq \cdots$, $\gamma_n \to \infty$ a.s., then the optional sampling theorem implies that $\{\tau_n \wedge \gamma_n\}$ is localizing.

Local martingales with bounded jumps

Remark 1.7 If $M$ is a continuous, local martingale, then
$$\tau_n = \inf\{t : |M(t)| \geq n\}$$
will be a localizing sequence. More generally, if $|\Delta M(t)| \leq c$ for some constant $c$, then
$$\tau_n = \inf\{t : |M(t)| \vee |M(t-)| \geq n\}$$
will be a localizing sequence.

Note that $|M^{\tau_n}| \leq n + c$, so $M$ is local square integrable.

Semimartingales

Definition 1.8 $Y$ is an $\{\mathcal{F}_t\}$-semimartingale if and only if $Y = M + V$, where $M$ is a local square integrable martingale with respect to $\{\mathcal{F}_t\}$ and $V$ is an $\{\mathcal{F}_t\}$-adapted finite variation process.

In particular, if $X$ is cadlag and adapted and $Y$ is a semimartingale, then $\int X_-\,dY$ exists.

Computing quadratic variations

Let $\Delta Z(t) = Z(t) - Z(t-)$.

Lemma 1.9 If $Y$ is finite variation, then
$$[Y]_t = \sum_{s \leq t} \Delta Y(s)^2.$$

Lemma 1.10 If $Y$ is a semimartingale, $X$ is adapted, and $Z(t) = \int_0^t X(s-)\,dY(s)$, then
$$[Z]_t = \int_0^t X(s-)^2\,d[Y]_s.$$

Proof. Check first for piecewise constant $X$ and then approximate general $X$ by piecewise constant processes.

Covariation

The covariation of $Y_1$, $Y_2$ is defined by
$$[Y_1, Y_2]_t = \lim \sum_i (Y_1(t_{i+1} \wedge t) - Y_1(t_i \wedge t))(Y_2(t_{i+1} \wedge t) - Y_2(t_i \wedge t)).$$

Itô's formula

If $f : \mathbb{R} \to \mathbb{R}$ is $C^2$ and $Y$ is a semimartingale, then
$$f(Y(t)) = f(Y(0)) + \int_0^t f'(Y(s-))\,dY(s) + \int_0^t \frac{1}{2} f''(Y(s))\,d[Y]^c_s$$
$$+ \sum_{s \leq t} \big(f(Y(s)) - f(Y(s-)) - f'(Y(s-))\Delta Y(s)\big),$$
where $[Y]^c$ is the continuous part of the quadratic variation given by
$$[Y]^c_t = [Y]_t - \sum_{s \leq t} \Delta Y(s)^2.$$

Itô's formula for vector-valued semimartingales

If $f : \mathbb{R}^m \to \mathbb{R}$ is $C^2$, $Y_1, \ldots, Y_m$ are semimartingales, and $Y = (Y_1, \ldots, Y_m)$, then defining
$$[Y_k, Y_l]^c_t = [Y_k, Y_l]_t - \sum_{s \leq t} \Delta Y_k(s) \Delta Y_l(s),$$
$$f(Y(t)) = f(Y(0)) + \sum_{k=1}^m \int_0^t \partial_k f(Y(s-))\,dY_k(s) + \frac{1}{2} \sum_{k,l=1}^m \int_0^t \partial_k \partial_l f(Y(s-))\,d[Y_k, Y_l]^c_s$$
$$+ \sum_{s \leq t} \Big(f(Y(s)) - f(Y(s-)) - \sum_{k=1}^m \partial_k f(Y(s-))\Delta Y_k(s)\Big).$$

Examples

Let $W$ be standard Brownian motion and
$$Z(t) = \exp\{W(t) - \tfrac{1}{2}t\}.$$
Then by Itô's formula,
$$Z(t) = 1 + \int_0^t Z(s)\,d(W(s) - \tfrac{1}{2}s) + \frac{1}{2}\int_0^t Z(s)\,ds = 1 + \int_0^t Z(s)\,dW(s).$$
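A simulation sketch of this example (sample sizes mine): since $Z = 1 + \int Z\,dW$ is a martingale, $E[Z(t)] = 1$ for every $t$, which Monte Carlo reproduces.

```python
import numpy as np

# E[Z(t)] = 1 for the exponential martingale Z(t) = exp(W(t) - t/2).
rng = np.random.default_rng(4)
paths, t = 500_000, 1.0
W_t = rng.normal(0.0, np.sqrt(t), paths)   # W(t) sampled directly
Z_t = np.exp(W_t - 0.5 * t)
print(Z_t.mean())                          # close to 1.0
```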

2. Markov processes and generators

- Time homogeneous Markov processes
- Markov processes and semigroups
- Semigroup generators
- Martingale properties
- Dynkin's identity
- Strongly continuous contraction semigroups
- Resolvent operator
- Transition functions

Time homogeneous Markov processes

A process $X$ is Markov with respect to a filtration $\{\mathcal{F}_t\}$ provided
$$E[f(X(t+r)) \mid \mathcal{F}_t] = E[f(X(t+r)) \mid X(t)]$$
for all $t, r \geq 0$ and all $f \in B(E)$.

The conditional expectation on the right can be written as $g_{f,t,r}(X(t))$ for a measurable function $g_{f,t,r}$ depending on $f$, $t$, and $r$.

If the function can be selected independently of $t$, that is,
$$E[f(X(t+r)) \mid X(t)] = g_{f,r}(X(t)),$$
then the Markov process is time homogeneous. A time inhomogeneous Markov process can be made time homogeneous by including time in the state. That is, set $Z(t) = (X(t), t)$.

Note that $g_{f,r}$ will be linear in $f$, so we can write $g_{f,r} = T(r)f$, where $T(r)$ is a linear operator on $B(E)$ (the bounded measurable functions on $E$). The Markov property then implies $T(r+s)f = T(r)T(s)f$.

Markov processes and semigroups

$\{T(t) : B(E) \to B(E),\ t \geq 0\}$ is an operator semigroup if $T(t)T(s)f = T(t+s)f$.

$X$ is a Markov process with operator semigroup $\{T(t)\}$ if and only if
$$E[f(X(t+s)) \mid \mathcal{F}^X_t] = T(s)f(X(t)), \qquad t, s \geq 0,\ f \in B(E).$$

Then
$$T(s+r)f(X(t)) = E[f(X(t+s+r)) \mid \mathcal{F}^X_t] = E[E[f(X(t+s+r)) \mid \mathcal{F}^X_{t+s}] \mid \mathcal{F}^X_t]$$
$$= E[T(r)f(X(t+s)) \mid \mathcal{F}^X_t] = T(s)T(r)f(X(t)).$$
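For a finite state space the semigroup is explicit: if $Q$ is the transition rate matrix of a Markov jump process, then $T(t) = e^{tQ}$ acting on vectors $f : E \to \mathbb{R}$. A numerical sketch (the matrix $Q$ and times are arbitrary examples of mine) verifying the semigroup property:

```python
import numpy as np
from scipy.linalg import expm

# T(t) = e^{tQ} for a 3-state chain; Q is an arbitrary example (rows sum to 0).
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])
T = lambda u: expm(u * Q)
t, s = 0.7, 0.4
print(np.max(np.abs(T(t) @ T(s) - T(t + s))))   # ~1e-16: T(t)T(s) = T(t+s)
```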

Semigroup and finite dimensional distributions

Lemma 2.1 If $X$ is a Markov process corresponding to $\{T(t)\}$, then the finite dimensional distributions of $X$ are determined by $\{T(t)\}$ and the distribution of $X(0)$.

Proof. For $0 \leq t_1 \leq t_2$,
$$E[f_1(X(t_1)) f_2(X(t_2))] = E[f_1(X(t_1))\, T(t_2 - t_1) f_2(X(t_1))] = E\big[T(t_1)[f_1\, T(t_2 - t_1)f_2](X(0))\big].$$

Semigroup generators

$f$ is in the domain of the strong generator of the semigroup if there exists $g \in B(E)$ such that
$$\lim_{t \to 0} \Big\|g - \frac{T(t)f - f}{t}\Big\| = 0.$$
Then $Af = g$.

$f$ is in the domain of the weak generator $\widetilde{A}$ (see Dynkin (1965)) if $\sup_{t} \|t^{-1}(T(t)f - f)\| < \infty$ and there exists $g \in B(E)$ such that
$$\lim_{t \to 0} \frac{T(t)f(x) - f(x)}{t} = g(x) \equiv \widetilde{A}f(x), \qquad x \in E.$$

The full generator $\widehat{A}$ (see Ethier and Kurtz (1986)) is
$$\widehat{A} = \Big\{(f, g) \in B(E) \times B(E) : T(t)f - f = \int_0^t T(s)g\,ds,\ t \geq 0\Big\}.$$
Then $A \subset \widetilde{A} \subset \widehat{A}$.
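For the finite-state example above, all three notions coincide with the rate matrix $Q$, and the defining identity of the full generator can be checked by quadrature. A sketch (the $f$, $t$, and grid are illustrative choices of mine):

```python
import numpy as np
from scipy.linalg import expm

# Check T(t)f - f = int_0^t T(s) Qf ds for a finite-state generator Q.
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])
f = np.array([1.0, -0.5, 2.0])
g = Q @ f                                  # g = Af
t, m = 0.9, 2000
s = np.linspace(0.0, t, m + 1)
vals = np.array([expm(u * Q) @ g for u in s])
ds = s[1] - s[0]                           # trapezoid rule for int_0^t T(s)g ds
integral = ds * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])
print(np.max(np.abs(expm(t * Q) @ f - f - integral)))   # ~0
```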

Martingale properties

Lemma 2.2 If $X$ is a progressive Markov process corresponding to $\{T(t)\}$ and $(f, g) \in \widehat{A}$, then
$$M_f(t) = f(X(t)) - f(X(0)) - \int_0^t g(X(s))\,ds$$
is a martingale (not necessarily right continuous).

Proof.
$$E[M_f(t+r) - M_f(t) \mid \mathcal{F}_t] = E\Big[f(X(t+r)) - f(X(t)) - \int_t^{t+r} g(X(s))\,ds \,\Big|\, \mathcal{F}_t\Big]$$
$$= T(r)f(X(t)) - f(X(t)) - \int_t^{t+r} T(s-t)g(X(t))\,ds = 0.$$

Dynkin's identity

Change of notation: simply write $\widehat{A}f$ for $g$, if $(f, g) \in \widehat{A}$.

If $M_f$ is right continuous, the optional sampling theorem implies
$$E[f(X(t \wedge \tau))] = E[f(X(0))] + E\Big[\int_0^{t \wedge \tau} \widehat{A}f(X(s))\,ds\Big].$$

Exit times

Assume $D$ is open and $X$ is right continuous. Let $\tau_D^h = \inf\{t : X(t) \notin D \text{ or } X(t-) \notin D\}$. Write $E_x$ for expectations under the condition that $X(0) = x$.

Suppose $f$ is bounded and continuous, $\widehat{A}f = 0$, and $\tau_D^h < \infty$ a.s. Then
$$f(x) = E_x[f(X(\tau_D^h))].$$

If $f$ is bounded and continuous, $\widehat{A}f(x) = -1$, $x \in D$, $f(y) = 0$, $y \notin D$, and $P\{X(\tau_D^h) \in D\} = 0$, then
$$f(x) = E_x[\tau_D^h].$$

Exit distributions in one dimension

For a one-dimensional diffusion process,
$$Lf(x) = \frac{1}{2}a(x)f''(x) + b(x)f'(x).$$
Find $f$ such that $Lf(x) = 0$ (i.e., solve the linear first order differential equation for $f'$). Then $f(X(t))$ is a local martingale.

Fix $a < b$, and define $\tau = \inf\{t : X(t) \notin (a, b)\}$. If $\sup_{a \leq x \leq b} |f(x)| < \infty$, then $E_x[f(X(t \wedge \tau))] = f(x)$.

Moreover, if $\tau < \infty$ a.s., $E_x[f(X(\tau))] = f(x)$. Hence
$$f(a)P_x(X(\tau) = a) + f(b)P_x(X(\tau) = b) = f(x),$$
and therefore the probability of exiting the interval at the right endpoint is given by
$$P_x(X(\tau) = b) = \frac{f(x) - f(a)}{f(b) - f(a)}. \tag{2.1}$$
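A concrete check of (2.1) (my sketch), assuming $X$ solves $dX = \mu\,dt + \sigma\,dW$, so that $Lf = \frac{1}{2}\sigma^2 f'' + \mu f' = 0$ is solved by $f(x) = e^{-2\mu x/\sigma^2}$; the parameters and Euler step are illustrative:

```python
import numpy as np

# Compare (2.1) with Euler-scheme simulation for dX = mu dt + sigma dW on (a, b).
rng = np.random.default_rng(5)
mu, sigma, a, b, x0 = 0.5, 1.0, -1.0, 1.0, 0.0
f = lambda x: np.exp(-2.0 * mu * x / sigma**2)   # solves (1/2) sigma^2 f'' + mu f' = 0
exact = (f(x0) - f(a)) / (f(b) - f(a))

paths, dt = 20_000, 1e-3
x = np.full(paths, x0)
alive = np.ones(paths, dtype=bool)
while alive.any():                                # run each path until it exits (a, b)
    x[alive] += mu * dt + sigma * np.sqrt(dt) * rng.normal(size=alive.sum())
    alive &= (a < x) & (x < b)
print(np.mean(x >= b), exact)                     # simulated vs formula (2.1)
```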

Exit time

To find conditions under which $P_x(\tau < \infty) = 1$, or more precisely, under which $E_x[\tau] < \infty$, solve $Lg = -1$. Then $g(X(t \wedge \tau)) + t \wedge \tau$ is a local martingale, so if $C \equiv \sup_{a \leq x \leq b} |g(x)| < \infty$,
$$E_x[g(X(t \wedge \tau))] = g(x) - E_x[t \wedge \tau]$$
and $2C \geq E_x[t \wedge \tau]$; letting $t \to \infty$, $2C \geq E_x[\tau]$, which implies $\tau < \infty$ a.s. By (2.1),
$$E_x[\tau] = g(x) - E_x[g(X(\tau))] = g(x) - g(a)\frac{f(b) - f(x)}{f(b) - f(a)} - g(b)\frac{f(x) - f(a)}{f(b) - f(a)}.$$

Strongly continuous contraction semigroups

Semigroups associated with Markov processes are contraction semigroups, i.e.,
$$\|T(t)f\| \leq \|f\|, \qquad f \in B(E).$$
Let $L_0 = \{f \in B(E) : \lim_{t \to 0} \|T(t)f - f\| = 0\}$. Then

- $\mathcal{D}(A)$ is dense in $L_0$.
- $\|\lambda f - Af\| \geq \lambda \|f\|$, $f \in \mathcal{D}(A)$, $\lambda > 0$.
- $\mathcal{R}(\lambda - A) \supset L_0$, $\lambda > 0$.

The resolvent

Lemma 2.3 For $\lambda > 0$ and $h \in L_0$,
$$(\lambda - A)^{-1} h = \int_0^\infty e^{-\lambda t} T(t)h\,dt.$$

Proof. Let $f = \int_0^\infty e^{-\lambda t} T(t)h\,dt$. Then
$$r^{-1}(T(r)f - f) = r^{-1}\Big(\int_0^\infty e^{-\lambda t} T(t+r)h\,dt - \int_0^\infty e^{-\lambda t} T(t)h\,dt\Big)$$
$$= r^{-1}\Big(e^{\lambda r}\int_r^\infty e^{-\lambda t} T(t)h\,dt - \int_0^\infty e^{-\lambda t} T(t)h\,dt\Big) \to \lambda f - h$$
as $r \to 0$.
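In the finite-state setting of the earlier examples, both sides of Lemma 2.3 are computable: $(\lambda - Q)^{-1}h$ by a linear solve, and the Laplace transform by truncated quadrature. A sketch (truncation point, grid, $h$, and $\lambda$ are my choices):

```python
import numpy as np
from scipy.linalg import expm

# (lambda - Q)^{-1} h vs the truncated integral int_0^inf e^{-lambda t} T(t)h dt.
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])
h = np.array([0.3, -1.0, 0.7])
lam = 1.5
direct = np.linalg.solve(lam * np.eye(3) - Q, h)

ts = np.linspace(0.0, 20.0, 20_001)               # e^{-20 lam} makes the tail negligible
vals = np.array([np.exp(-lam * u) * (expm(u * Q) @ h) for u in ts])
dt = ts[1] - ts[0]                                # trapezoid rule
quad = dt * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])
print(np.max(np.abs(direct - quad)))              # small
```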

Hille-Yosida theorem

Theorem 2.4 The closure of $A$ is the generator of a strongly continuous contraction semigroup on $L_0$ if and only if

- $\mathcal{D}(A)$ is dense in $L_0$.
- $\|\lambda f - Af\| \geq \lambda \|f\|$, $f \in \mathcal{D}(A)$, $\lambda > 0$.
- $\mathcal{R}(\lambda - A)$ is dense in $L_0$.

Proof. Necessity is discussed above. Assuming $A$ is closed (otherwise, replace $A$ by its closure), the conditions imply $\mathcal{R}(\lambda - A) = L_0$ and the semigroup is obtained by
$$T(t)f = \lim_{n \to \infty} \Big(I - \frac{1}{n}A\Big)^{-[nt]} f.$$
(One must show that the right side is Cauchy.)

Probabilistic interpretation of the limit

If $T(t)$ corresponds to a Markov process $X$, then
$$\Big(I - \frac{1}{n}A\Big)^{-1} f(x) = E_x\Big[f\Big(X\Big(\frac{1}{n}\Delta\Big)\Big)\Big],$$
where $\Delta$ is a unit exponential independent of $X$, and
$$\Big(I - \frac{1}{n}A\Big)^{-[nt]} f(x) = E_x\Big[f\Big(X\Big(\frac{1}{n}\sum_{i=1}^{[nt]} \Delta_i\Big)\Big)\Big]$$
for independent unit exponentials $\Delta_1, \Delta_2, \ldots$.
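For a matrix generator the Hille-Yosida limit can be watched converge. A sketch (the $Q$, $f$, and $t$ are arbitrary examples of mine): each factor $(I - \frac{1}{n}Q)^{-1}$ averages the chain over an independent exponential time of mean $1/n$, and $[nt]$ of them approximate running the chain to time $t$.

```python
import numpy as np
from scipy.linalg import expm

# (I - (1/n)Q)^{-[nt]} f -> e^{tQ} f as n grows.
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])
f = np.array([1.0, -0.5, 2.0])
t = 1.3
target = expm(t * Q) @ f
for n in (10, 100, 1000):
    R = np.linalg.inv(np.eye(3) - Q / n)          # one resolvent step (I - (1/n)Q)^{-1}
    approx = np.linalg.matrix_power(R, int(n * t)) @ f
    print(n, np.max(np.abs(approx - target)))     # error shrinks roughly like 1/n
```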

Transition functions

Definition 2.5 $P(t, x, \Gamma)$ defined on $[0, \infty) \times E \times \mathcal{B}(E)$ is a transition function if $P(\cdot, \cdot, \Gamma)$ is Borel measurable for each $\Gamma \in \mathcal{B}(E)$, $P(t, x, \cdot) \in \mathcal{P}(E)$ for each $(t, x) \in [0, \infty) \times E$, and $P$ satisfies the Chapman-Kolmogorov relation
$$P(t+s, x, \Gamma) = \int_E P(s, y, \Gamma) P(t, x, dy).$$

A Markov process $X$ corresponds to a transition function $P$ provided
$$P\{X(t) \in \Gamma \mid X(0) = x\} = P(t, x, \Gamma).$$
$T(t)f(x) = \int_E f(y) P(t, x, dy)$ defines a semigroup on $B(E)$.

The resolvent for the full generator

Lemma 2.6 Suppose $T(t) : B(E) \to B(E)$ is given by a transition function, $T(t)f(x) = \int_E f(y) P(t, x, dy)$. For $h \in B(E)$, define
$$f(x) = \int_0^\infty e^{-\lambda t} T(t)h(x)\,dt.$$
Then $(f, \lambda f - h) \in \widehat{A}$.

Proof.
$$\int_0^t T(s)(\lambda f - h)\,ds = \lambda \int_0^t \int_0^\infty e^{-\lambda u} T(s+u)h\,du\,ds - \int_0^t T(s)h\,ds$$
$$= \lambda \int_0^t e^{\lambda s} \int_s^\infty e^{-\lambda u} T(u)h\,du\,ds - \int_0^t T(s)h\,ds$$
$$= e^{\lambda t} \int_t^\infty e^{-\lambda u} T(u)h\,du - \int_0^\infty e^{-\lambda u} T(u)h\,du = T(t)f - f.$$

A convergence lemma

Lemma 2.7 Let $E$ be compact and suppose $\{f_k\} \subset C(E)$ separates points. If $\{x_n\}$ satisfies $\lim_n f_k(x_n)$ exists for every $f_k$, then $\lim_n x_n$ exists.

Proof. If $x$ and $x'$ are limit points of $\{x_n\}$, we must have $f_k(x) = f_k(x')$ for all $k$. But then $x = x'$, since $\{f_k\}$ separates points.

Feller processes

Lemma 2.8 Assume $E$ is compact, $T(t) : C(E) \to C(E)$, and
$$\lim_{t \to 0} T(t)f(x) = f(x), \qquad x \in E,\ f \in C(E).$$
If $X$ is a Markov process corresponding to $\{T(t)\}$, then $X$ has a modification with cadlag sample paths.

Proof. For $h \in C(E)$, $f = R_\lambda h \equiv \int_0^\infty e^{-\lambda t} T(t)h\,dt \in C(E)$, so setting $g = \lambda f - h$,
$$f(X(t)) - \int_0^t g(X(s))\,ds$$
is a martingale. By the upcrossing inequality, there exists a set $\Omega_f \subset \Omega$ with $P(\Omega_f) = 1$ such that for $\omega \in \Omega_f$, $\lim_{s \to t+,\, s \in \mathbb{Q}} f(X(s, \omega))$ exists for each $t \geq 0$ and $\lim_{s \to t-,\, s \in \mathbb{Q}} f(X(s, \omega))$ exists for each $t > 0$.

Suppose $\{h_k,\ k \geq 1\} \subset C(E)$ is dense. Then $\{R_\lambda h_k : \lambda \in \mathbb{Q} \cap (0, \infty),\ k \geq 1\}$ separates points in $E$.

3. Martingale problems

- Definition
- Equivalent formulations
- Uniqueness of 1-dimensional distributions implies uniqueness of fdd
- Uniqueness under the Hille-Yosida conditions
- Markov property
- Quasi-left continuity

http://www.math.wisc.edu/~kurtz/FrankLect.htm

Martingale problems: Definition

$E$: state space (a complete, separable metric space).
$A$: generator (a linear operator with domain and range in $B(E)$).
$\mu \in \mathcal{P}(E)$.

$X$ is a solution of the martingale problem for $(A, \mu)$ if and only if $\mu = P X(0)^{-1}$ and there exists a filtration $\{\mathcal{F}_t\}$ such that
$$M_f(t) = f(X(t)) - \int_0^t Af(X(s))\,ds$$
is an $\{\mathcal{F}_t\}$-martingale for each $f \in \mathcal{D}(A)$.

Examples of generators

Standard Brownian motion ($E = \mathbb{R}^d$):
$$Af = \frac{1}{2}\Delta f, \qquad \mathcal{D}(A) = C_c^2(\mathbb{R}^d)$$

Poisson process ($E = \{0, 1, 2, \ldots\}$, $\mathcal{D}(A) = B(E)$):
$$Af(k) = \lambda(f(k+1) - f(k))$$

Pure jump process ($E$ arbitrary):
$$Af(x) = \lambda(x) \int_E (f(y) - f(x))\,\mu(x, dy)$$

Diffusion ($E = \mathbb{R}^d$, $\mathcal{D}(A) = C_c^2(\mathbb{R}^d)$):
$$Af(x) = \frac{1}{2}\sum_{i,j} a_{ij}(x) \frac{\partial^2}{\partial x_i \partial x_j} f(x) + \sum_i b_i(x) \frac{\partial}{\partial x_i} f(x) \tag{3.1}$$
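The Poisson case can be tested directly by simulation. A sketch (my choice of $f(k) = k^2$, rate, and grid): with $Af(k) = \lambda(2k + 1)$, the martingale property forces $E[f(N(t)) - \int_0^t Af(N(s))\,ds] = f(N(0)) = 0$.

```python
import numpy as np

# Poisson martingale problem with f(k) = k^2, so Af(k) = lam*(2k + 1).
rng = np.random.default_rng(6)
lam, t, n, paths = 2.0, 1.0, 500, 10_000
dt = t / n
N = np.cumsum(rng.poisson(lam * dt, (paths, n)), axis=1)   # Poisson process on a grid
f_end = N[:, -1].astype(float) ** 2                        # f(N(t))
integral = (lam * (2 * N + 1)).sum(axis=1) * dt            # int_0^t Af(N(s)) ds
print(np.mean(f_end - integral))                           # ~0 (= f(N(0)) = 0)
```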

Equivalent formulations

Suppose, without loss of generality, that $\mathcal{D}(A)$ is closed under addition of constants ($A1 = 0$). Then the following are equivalent:

a) $X$ is a solution of the martingale problem for $(A, \mu)$.

b) $P X(0)^{-1} = \mu$ and there exists a filtration $\{\mathcal{F}_t\}$ such that for each $\lambda > 0$ and each $f \in \mathcal{D}(A)$,
$$e^{-\lambda t} f(X(t)) + \int_0^t e^{-\lambda s}\big(\lambda f(X(s)) - Af(X(s))\big)\,ds$$
is an $\{\mathcal{F}_t\}$-martingale.

c) $P X(0)^{-1} = \mu$ and there exists a filtration $\{\mathcal{F}_t\}$ such that for each $f \in \mathcal{D}(A)$ with $\inf_{x \in E} f(x) > 0$,
$$R_f(t) = \frac{f(X(t))}{f(X(0))} \exp\Big\{-\int_0^t \frac{Af(X(s))}{f(X(s))}\,ds\Big\}$$
is an $\{\mathcal{F}_t\}$-martingale.

Proof. For Part (c), assume $\mathcal{D}(A) \subset C_b(E)$ and $X$ is right continuous. Then
$$f(X(t)) \exp\Big\{-\int_0^t \frac{Af(X(s))}{f(X(s))}\,ds\Big\}$$
$$= f(X(0)) + \int_0^t \exp\Big\{-\int_0^r \frac{Af(X(s))}{f(X(s))}\,ds\Big\}\,df(X(r)) - \int_0^t \exp\Big\{-\int_0^r \frac{Af(X(s))}{f(X(s))}\,ds\Big\} \frac{Af(X(r))}{f(X(r))} f(X(r))\,dr$$
$$= f(X(0)) + \int_0^t \exp\Big\{-\int_0^r \frac{Af(X(s))}{f(X(s))}\,ds\Big\}\,dM_f(r),$$
so if $M_f$ is a martingale, then $R_f$ is a martingale.

Conversely, if $R_f$ is a martingale, then
$$M_f(t) = f(X(0)) \int_0^t \exp\Big\{\int_0^r \frac{Af(X(s))}{f(X(s))}\,ds\Big\}\,dR_f(r)$$
is a martingale.

Note that considering only $f$ that are strictly positive is no restriction, since we can always add a constant to $f$.

Conditions for the martingale property

Lemma 3.1 For $(f, g) \in A$, $h_1, \ldots, h_m \in C(E)$, and $t_1 < t_2 < \cdots < t_{m+1}$, let
$$\eta(Y) = \eta(Y, (f, g), \{h_i\}, \{t_i\}) = \Big(f(Y(t_{m+1})) - f(Y(t_m)) - \int_{t_m}^{t_{m+1}} g(Y(s))\,ds\Big) \prod_{i=1}^m h_i(Y(t_i)).$$
Then $Y$ is a solution of the martingale problem for $A$ if and only if $E[\eta(Y)] = 0$ for all such $\eta$.

The assertion that $Y$ is a solution of the martingale problem for $A$ is an assertion about the finite dimensional distributions of $Y$.

Uniqueness of 1-dimensional distributions implies uniqueness of fdd

Theorem 3.2 If any two solutions of the martingale problem for $A$ satisfying $P X_1(0)^{-1} = P X_2(0)^{-1}$ also satisfy $P X_1(t)^{-1} = P X_2(t)^{-1}$ for all $t \geq 0$, then the f.d.d. of a solution $X$ are uniquely determined by $P X(0)^{-1}$.

Proof. If $X$ is a solution of the MGP for $A$ and $X_a(t) = X(a + t)$, then $X_a$ is a solution of the MGP for $A$. Furthermore, for positive $f_i \in B(E)$ and $0 \leq t_1 < t_2 < \cdots < t_m \leq a$,
$$Q(B) = \frac{E[1_B(X_a) \prod_{i=1}^m f_i(X(t_i))]}{E[\prod_{i=1}^m f_i(X(t_i))]}$$
defines a probability measure on $\mathcal{F} = \sigma(X_a(s), s \geq 0)$, and under $Q$, $X_a$ is a solution of the martingale problem for $A$ with initial distribution
$$\mu(\Gamma) = \frac{E[1_\Gamma(X(a)) \prod_{i=1}^m f_i(X(t_i))]}{E[\prod_{i=1}^m f_i(X(t_i))]}.$$

Proceeding by induction, fix $m$ and suppose $E[\prod_{i=1}^m f_i(X(t_i))]$ is uniquely determined for all $0 \leq t_1 < t_2 < \cdots < t_m$ and all $f_i$. Then $\mu$ is uniquely determined, and the one dimensional distributions of $X_a$ under $Q$ are uniquely determined, that is,
$$\frac{E[f_{m+1}(X(t_{m+1})) \prod_{i=1}^m f_i(X(t_i))]}{E[\prod_{i=1}^m f_i(X(t_i))]}$$
is uniquely determined for $t_{m+1} \geq a$. Since $a$ is arbitrary and the denominator is uniquely determined, the numerator is uniquely determined, completing the induction step.

Adding a time component

Lemma 3.3 Suppose that $g(t, x)$ has the property that $g(t, \cdot) \in \mathcal{D}(A)$ for each $t$ and that $g$, $\partial_t g$, and $Ag$ are all bounded in $t$ and $x$ and are continuous functions of $t$. If $X$ is a solution of the martingale problem for $A$, then
$$g(t, X(t)) - \int_0^t \big(\partial_s g(s, X(s)) + Ag(s, X(s))\big)\,ds$$
is a martingale.

Proof. For a partition $0 = s_0 < s_1 < \cdots$ of $[0, r]$,
$$E[g(t+r, X(t+r)) - g(t, X(t)) \mid \mathcal{F}_t]$$
$$= \sum_k E[g(t+s_{k+1}, X(t+s_{k+1})) - g(t+s_k, X(t+s_k)) \mid \mathcal{F}_t]$$
$$= \sum_k E[g(t+s_{k+1}, X(t+s_{k+1})) - g(t+s_{k+1}, X(t+s_k)) \mid \mathcal{F}_t] + \sum_k E[g(t+s_{k+1}, X(t+s_k)) - g(t+s_k, X(t+s_k)) \mid \mathcal{F}_t]$$
$$= \sum_k E\Big[\int_{s_k}^{s_{k+1}} Ag(t+s_{k+1}, X(t+r'))\,dr' \,\Big|\, \mathcal{F}_t\Big] + \sum_k E\Big[\int_{s_k}^{s_{k+1}} \partial_{r'} g(t+r', X(t+s_k))\,dr' \,\Big|\, \mathcal{F}_t\Big].$$

To complete the proof, see Exercise 14.

Uniqueness under the Hille-Yosida conditions

Theorem 3.4 If $A$ satisfies the conditions of Theorem 2.4 and $\mathcal{D}(A)$ is separating, then there is at most one solution to the martingale problem.

Proof. If $X$ is a solution of the martingale problem for $A$, then by Lemma 3.3, for each $t > 0$ and each $f \in \mathcal{D}(A)$, $T(t-s)f(X(s))$, $0 \leq s \leq t$, is a martingale. This martingale property extends to all $f$ in the closure of $\mathcal{D}(A)$. Consequently,
$$E[f(X(t)) \mid \mathcal{F}_s] = T(t-s)f(X(s)),$$
and $E[f(X(t))] = E[T(t)f(X(0))]$, which determines the one dimensional distributions, implying uniqueness.

Markov property

Theorem 3.5 Suppose the conclusion of Theorem 3.2 holds. If $X$ is a solution of the martingale problem for $A$ with respect to a filtration $\{\mathcal{F}_t\}$, then $X$ is Markov with respect to $\{\mathcal{F}_t\}$.

Proof. Let $F \in \mathcal{F}_r$ with $P(F) > 0$, and for $B \in \mathcal{F}$, define
$$P_1(B) = \frac{E[1_F E[1_B \mid \mathcal{F}_r]]}{P(F)}, \qquad P_2(B) = \frac{E[1_F E[1_B \mid X(r)]]}{P(F)}.$$
Define $Y(t) = X(r + t)$. Then
$$P_1\{Y(0) \in \Gamma\} = \frac{E[1_F E[1_{\{X(r) \in \Gamma\}} \mid \mathcal{F}_r]]}{P(F)} = \frac{E[1_F 1_{\{X(r) \in \Gamma\}}]}{P(F)} = \frac{E[1_F E[1_{\{X(r) \in \Gamma\}} \mid X(r)]]}{P(F)} = P_2\{Y(0) \in \Gamma\}.$$
Check that $E^{P_1}[\eta(Y)] = E^{P_2}[\eta(Y)] = 0$ for all $\eta(Y)$ as in Lemma 3.1.

Therefore
$$E[1_F E[f(X(r+t)) \mid \mathcal{F}_r]] = P(F) E^{P_1}[f(Y(t))] = P(F) E^{P_2}[f(Y(t))] = E[1_F E[f(X(r+t)) \mid X(r)]].$$
Since $F \in \mathcal{F}_r$ is arbitrary, $E[f(X(r+t)) \mid \mathcal{F}_r] = E[f(X(r+t)) \mid X(r)]$ and the Markov property follows.

Cadlag versions

Lemma 3.6 Suppose $E$ is compact and $A \subset C(E) \times B(E)$. If $\mathcal{D}(A)$ is separating, then any solution of the martingale problem for $A$ has a cadlag modification.

Proof. See Lemma 2.8.

Quasi-left continuity

$X$ is quasi-left continuous if and only if for each sequence of stopping times $\tau_1 \leq \tau_2 \leq \cdots$ such that $\tau = \lim_n \tau_n < \infty$ a.s.,
$$\lim_{n \to \infty} X(\tau_n) = X(\tau) \quad \text{a.s.}$$

Lemma 3.7 Let $A \subset C(E) \times B(E)$, and suppose that $\mathcal{D}(A)$ is separating. Let $X$ be a cadlag solution of the martingale problem for $A$. Then $X$ is quasi-left continuous.

Proof. For $(f, g) \in A$,
$$\lim_n f(X(\tau_n \wedge t)) = \lim_n E\Big[f(X(\tau \wedge t)) - \int_{\tau_n \wedge t}^{\tau \wedge t} g(X(s))\,ds \,\Big|\, \mathcal{F}_{\tau_n}\Big] = E\Big[f(X(\tau \wedge t)) \,\Big|\, \textstyle\bigvee_n \mathcal{F}_{\tau_n}\Big].$$

Since $X$ is cadlag,
$$\lim_n X(\tau_n \wedge t) = \begin{cases} X(\tau \wedge t) & \text{if } \tau_n \wedge t = \tau \wedge t \text{ for } n \text{ sufficiently large,} \\ X(\tau \wedge t\,-) & \text{if } \tau_n \wedge t < \tau \wedge t \text{ for all } n. \end{cases}$$
To complete the proof, see Exercise 6.

Continuity of diffusion process

Lemma 3.8 Suppose $E = \mathbb{R}^d$ and
$$Af(x) = \frac{1}{2}\sum_{i,j} a_{ij}(x) \frac{\partial^2}{\partial x_i \partial x_j} f(x) + \sum_i b_i(x) \frac{\partial}{\partial x_i} f(x), \qquad \mathcal{D}(A) = C_c^2(\mathbb{R}^d).$$
If $X$ is a solution of the martingale problem for $A$, then $X$ has a modification that is cadlag in $\mathbb{R}^d \cup \{\infty\}$. If $X$ is cadlag, then $X$ is continuous.

Proof. The existence of a cadlag modification follows by Lemma 3.6. To show continuity, it is enough to show that for $f \in C_c^2(\mathbb{R}^d)$, $f \circ X$ is continuous. To show $f \circ X$ is continuous, it is enough to show
$$\lim_{\max|t_{i+1} - t_i| \to 0} \sum_i \big(f(X(t_{i+1} \wedge t)) - f(X(t_i \wedge t))\big)^4 = 0.$$

From the martingale properties,
$$E[(f(X(t+h)) - f(X(t)))^4] = E\Big[\int_t^{t+h} \big(Af^4(X(s)) - 4f(X(t))Af^3(X(s)) + 6f^2(X(t))Af^2(X(s)) - 4f^3(X(t))Af(X(s))\big)\,ds\Big].$$
Check that
$$Af^4(x) - 4f(x)Af^3(x) + 6f^2(x)Af^2(x) - 4f^3(x)Af(x) \ldots$$

