
Nonlinear Time Series Modeling
Richard A. Davis
Colorado State University (http://www.stat.colostate.edu/~rdavis/lectures)
MaPhySto Workshop, Copenhagen
September 27-30, 2004

Part I: Introduction to Linear and Nonlinear Time Series
1. Introduction
2. Examples
3. Linear processes
   3.1 Preliminaries
   3.2 Wold Decomposition
   3.3 Reversibility
   3.4 Identifiability
   3.5 Linear tests
   3.6 Prediction
4. Allpass models
   4.1 Applications of allpass models: noninvertible MA model fitting; Microsoft; Muddy Creek; seismogram deconvolution
   4.2 Estimation

Part II: Time Series Models in Finance
1. Classification of white noise
2. Examples
3. "Stylized facts" concerning financial time series
4. ARCH and GARCH models
5. Forecasting with GARCH
6. IGARCH
7. Stochastic volatility models
8. Regular variation and applications to financial TS
   8.1 Univariate case
   8.2 Multivariate case
   8.3 Applications of multivariate regular variation
   8.4 Application of multivariate RV equivalence
   8.5 Examples
   8.6 Extremes for GARCH and SV models
   8.7 Summary of results for ACF of GARCH and SV models

Part III: Nonlinear and Non-Gaussian State-Space Models
1. Introduction
   1.1 Motivating examples
   1.2 Linear state-space models
   1.3 Generalized state-space models
2. Observation-driven models
   2.1 GLARMA models for TS of counts
   2.2 GLARMA extensions
   2.3 Other
3. Parameter-driven models
   3.1 Estimation
   3.2 Simulation and application
   3.3 How good is the posterior approximation?

Part IV: Structural Break Detection in Time Series
1. Piecewise AR models
2. Minimum description length (MDL)
3. Genetic algorithm (GA)
4. Simulation examples
5. Applications (EEG and speech examples)
6. Application to nonlinear models

References:
- Brockwell and Davis (1991). Time Series: Theory and Methods.
- Brockwell and Davis (2001). Introduction to Time Series and Forecasting.
- Durbin and Koopman (2001). Time Series Analysis by State-Space Models.
- Embrechts, Klüppelberg, and Mikosch (1997). Modelling Extremal Events.
- Fan and Yao (2001). Nonlinear Time Series.
- Franses and van Dijk (2000). Nonlinear Time Series Models in Empirical Finance.
- Harvey (1989). Forecasting, Structural Time Series Models and the Kalman Filter.
- Rosenblatt (2000). Gaussian and Non-Gaussian Linear Time Series and Random Fields.
- Subba Rao and Gabr (1984). An Introduction to Bispectral Analysis and Bilinear Time Series Models.
- Tong (1990). Non-linear Time Series: A Dynamical System Approach.

1. Introduction

Why nonlinear time series models?
- What are the limitations of linear time series models?
- What key features in data cannot be captured by linear time series models?
- What diagnostic tools (visual or statistical) suggest incompatibility of a linear model with the data?

Example: Z1, . . . , Zn IID(0, σ²).

[Figure: realization of an IID series, n = 200.]

Sample autocorrelation function (ACF):

  ρ̂_Z(h) = γ̂(h) / γ̂(0),  where  γ̂(h) = n⁻¹ Σ_{t=1}^{n−h} (Z_t − Z̄)(Z_{t+h} − Z̄)

is the sample autocovariance function (ACVF).

Theorem. If {Zt} ~ IID(0, σ²), then

  (ρ̂_Z(1), . . . , ρ̂_Z(h))′ is approximately IID N(0, 1/n).

Proof: see Problem 6.24 of TSTM.

[Figure: sample ACFs of two IID realizations with ±1.96/√n bounds, lags 0-40.]
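The theorem is easy to check by simulation. Below is a minimal sketch (NumPy assumed) that computes the sample ACF exactly as defined above and verifies that roughly 95% of the ρ̂_Z(h), h ≥ 1, fall inside the ±1.96/√n bounds drawn in the figure.

```python
import numpy as np

def sample_acf(z, max_lag):
    """Sample ACF: rho_hat(h) = gamma_hat(h)/gamma_hat(0), with
    gamma_hat(h) = n^{-1} sum_{t=1}^{n-h} (z_t - zbar)(z_{t+h} - zbar)."""
    z = np.asarray(z, dtype=float)
    n = len(z)
    d = z - z.mean()
    gamma = np.array([np.dot(d[: n - h], d[h:]) / n for h in range(max_lag + 1)])
    return gamma / gamma[0]

rng = np.random.default_rng(0)
n = 2000
z = rng.standard_normal(n)          # an IID(0,1) sample
rho = sample_acf(z, 40)
bound = 1.96 / np.sqrt(n)

# under IID noise, about 95% of the rho_hat(h), h >= 1, lie within +/- 1.96/sqrt(n)
frac_inside = np.mean(np.abs(rho[1:]) <= bound)
print(frac_inside)
```

This is the standard visual white-noise check: bars of the sample ACF escaping the ±1.96/√n band more often than about 1 lag in 20 speak against the IID hypothesis.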

Corollary. If {Zt} ~ IID(0, σ²) and E Z1⁴ < ∞, then

  (ρ̂_{Z²}(1), . . . , ρ̂_{Z²}(h))′ is approximately IID N(0, 1/n).

[Figure: residual ACFs of the absolute values and of the squares, lags 0-40.]

What if E Z1² = ∞? For example, suppose {Zt} ~ IID Cauchy.

[Figure: realization of an IID Cauchy series and its sample ACF, lags 0-40.]

Result (see TSTM Section 13.3): If {Zt} ~ IID Cauchy, then

  (n / ln n) ρ̂_Z(h) →d S₁ / S₀.₅,

where S₁ and S₀.₅ are independent stable random variables.

How about the ACF of the squares?

[Figure: ACFs of the absolute values and of the squares of the Cauchy series, lags 0-40.]

Result: If {Zt} ~ IID Cauchy, then

  (n / ln n)² ρ̂_{Z²}(h) →d S₀.₅ / S₀.₂₅,

where S₀.₅ and S₀.₂₅ are independent stable random variables.

Reversibility. The stationary sequence of random variables {Xt} is time-reversible if (X1, . . . , Xn) =d (Xn, . . . , X1).

Result: IID sequences {Zt} are time-reversible.

Application: If a plot of the time series does not look time-reversible, then it cannot be modeled as an IID sequence. Use the "flip and compare" inspection test!

[Figure: realization of a series, n = 200.]

Reversibility. Does the following series look time-reversible?

[Figure: time series plot with sample ACFs of the absolute values and of the squares, lags 0-40.]

2. Examples

[Figure: closing price for IBM.]

[Figure: log returns (100 × log returns) for IBM, 1/3/62-11/3/00 (blue: 1961-1981).]

Sample ACF of IBM returns: (a) 1962-1981, (b) 1982-2000.

[Figure: sample ACFs for the two halves, lags 0-40.]

Remark: Both halves look like white noise?

Sample ACF of absolute values for IBM: (a) 1961-1981, (b) 1982-2000.

[Figure: ACFs of the absolute values for the two halves, lags 0-40.]

Remark: The series are not independent white noise?

ACF of squares for IBM: (a) 1961-1981, (b) 1982-2000.

[Figure: ACFs of the squares for the two halves, lags 0-40.]

Remark: The series are not independent white noise? Try GARCH or a stochastic volatility model.

Example: Pound-Dollar exchange rates (Oct 1, 1981 - Jun 28, 1985; Koopman website).

[Figure: log returns of the exchange rates, with ACFs of the series, of its absolute values, and of its squares, lags 0-40.]

Example: Daily Asthma Presentations (1990-1993).

[Figure: daily counts, January-December, for each of the years 1990-1993.]

Remark: Usually the marginal distribution of a linear process is continuous.

Example: Muddy Creek, a tributary of the Sun River in central Montana.

Muddy Creek bed elevation: surveyed every 15.24 meters, for a total of 5456 m.

[Figure: bed elevation (m) versus distance (m).]

Minimum AICc ARMA model for the residuals:

  ARMA(1,1): Yt = .574 Y_{t−1} + εt − .311 ε_{t−1},  {εt} ~ WN(0, .0564).

Noncausal ARMA(1,1) model: Yt = 1.743 Y_{t−1} + εt − .311 ε_{t−1}.

Some theory:
- LS estimates of trend parameters are asymptotically efficient.
- LS estimates are asymptotically independent of covariance parameter estimates.

[Figure: Muddy Creek residuals from a polynomial (degree 4) fit, with sample (blue) and model (red) ACFs, lags 0-400 m.]

Muddy Creek (cont)

Summary of models fitted to the Muddy Creek bed elevation:

  Degree   AICc    ARMA
  0        347.1   (1,1)   7.12
  4        34.0    (1,1)   2.78
  5        35.5    (1,1)   4.68

Example: NEE (Net Ecosystem Exchange) in the Harvard Forest.

- About half of the CO2 emitted by humans accumulates in the atmosphere.
- The other half is absorbed by "sink" processes on land and in the oceans.

  NEE = (Rh + Ra) − GPP  (carbon flux),

where
  GPP = Gross Primary Production (photosynthesis),
  Rh = heterotrophic (microbial) respiration,
  Ra = autotrophic (plant) respiration.

The NEE data from the Harvard Forest consist of hourly measurements. We will aggregate over the day and consider daily data from Jan 1, 1992 to Dec 31, 2001.

Go to ITSM Demo

3. Linear Processes

3.1 Preliminaries

Def: The stochastic process {Xt, t = 0, ±1, ±2, . . .} defined on a probability space is called a discrete-time time series.

Def: {Xt} is stationary (or weakly stationary) if
  i. E|Xt|² < ∞ for all t;
  ii. EXt = m for all t;
  iii. Cov(Xt, X_{t+h}) = γ(h) depends on h only.

Def: {Xt} is strictly stationary if (X1, . . . , Xn) =d (X_{1+h}, . . . , X_{n+h}) for all n ≥ 1 and h = 0, ±1, ±2, . . .

Remarks:
  i. SS + (E|Xt|² < ∞) ⇒ weak stationarity.
  ii. WS ⇏ SS (think of an example).
  iii. WS + Gaussian ⇒ SS (why?).

3.1 Preliminaries (cont)

Def: {Xt} is a Gaussian time series if (Xm, . . . , Xn) is multivariate normal for all integers m ≤ n, i.e., all finite-dimensional distributions are normal.

Remark: A Gaussian time series is completely determined by the mean function and covariance function,

  m(t) = EXt  and  γ(s,t) = Cov(Xs, Xt).

It follows that a Gaussian TS is stationary (SS or WS) if and only if m(t) ≡ m and γ(s,t) = γ(t−s) depends only on the time lag t−s.

3.1 Preliminaries (cont)

Def: {Xt} is a linear time series with mean 0 if

  Xt = Σ_{j=−∞}^{∞} ψj Z_{t−j},  where {Zt} ~ WN(0, σ²) and Σ_j ψj² < ∞.

Important remark: As a reminder, WN means uncorrelated random variables, not necessarily independent noise nor independent Gaussian noise.

Proposition: A linear TS is stationary with
  i. EXt = 0 for all t;
  ii. γ(h) = σ² Σ_j ψj ψ_{j+h}  and  ρ(h) = Σ_j ψj ψ_{j+h} / Σ_j ψj².

If {Zt} ~ IID(0, σ²), then the linear TS is strictly stationary.
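The proposition can be checked numerically. The sketch below uses the MA(1) filter ψ = (1, 0.8) as an assumed example and compares the sample ACF of a long simulated path with ρ(h) = Σⱼ ψⱼψ_{j+h} / Σⱼ ψⱼ².

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.8
n = 100_000
z = rng.standard_normal(n + 1)
x = z[1:] + theta * z[:-1]                  # X_t = Z_t + 0.8 Z_{t-1}, a linear process

# theoretical ACF from the proposition: rho(1) = psi_0 psi_1 / (psi_0^2 + psi_1^2)
psi = np.array([1.0, theta])
rho1_theory = (psi[0] * psi[1]) / np.dot(psi, psi)   # = 0.8 / 1.64

def sample_acf(a, h):
    d = a - a.mean()
    return np.dot(d[:-h], d[h:]) / np.dot(d, d)

rho1_hat = sample_acf(x, 1)
rho2_hat = sample_acf(x, 2)                 # should be near 0 for an MA(1)
print(rho1_theory, rho1_hat, rho2_hat)
```

With n this large the sample ACF at lag 1 sits within a few thousandths of the theoretical value, and the lag-2 value is indistinguishable from zero, as the MA(1) structure requires.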

Is the converse of the previous proposition true? That is, are all stationary processes linear?

Answer: Almost.

3.2 Wold Decomposition (TSTM Section 5.7)

Example: Set

  Xt = A cos(ωt) + B sin(ωt),  ω ∈ (0, π),

where A, B ~ WN(0, σ²). Then {Xt} is stationary, since EXt = 0 and γ(h) = σ² cos(ωh).

Def: Let Pn( ) be the best linear predictor operator onto the linear span of the observations Xn, X_{n−1}, . . . For this example, P_{n−1}(Xn) = Xn. Processes with this property are called deterministic.

3.2 Wold Decomposition (cont)

The Wold Decomposition. If {Xt} is a nondeterministic stationary time series with mean zero, then

  Xt = Σ_{j=0}^{∞} ψj Z_{t−j} + Vt,

where
  i. ψ0 = 1, Σ ψj² < ∞;
  ii. {Zt} ~ WN(0, σ²);
  iii. Cov(Zs, Vt) = 0 for all s and t;
  iv. Pt(Zt) = Zt for all t;
  v. Ps(Vt) = Vt for all s and t;
  vi. {Vt} is deterministic.

The sequences {Zt}, {Vt}, and {ψj} are unique and can be written as

  Zt = Xt − P_{t−1}(Xt),  ψj = E(Xt Z_{t−j}) / E(Zt²),  Vt = Xt − Σ_{j=0}^{∞} ψj Z_{t−j}.

3.2 Wold Decomposition (cont)

Remark. For many time series (in particular, for all ARMA processes) the deterministic component Vt is 0 for all t, and the series is then said to be purely nondeterministic.

Example. Let Xt = Ut + Y, where {Ut} ~ WN(0, σ²) and is independent of Y ~ (0, τ²). Then, in this case, Zt = Ut and Vt = Y (see TSTM, Problem 5.24).

Remarks:
- If {Xt} is purely nondeterministic, then {Xt} is a linear process.
- The spectral distribution for nondeterministic processes has the form FX = FU + FV, where Ut = Σ_{j=0}^{∞} ψj Z_{t−j} has spectral density

  f(λ) = (σ²/2π) |Σ_{j=0}^{∞} ψj e^{−ijλ}|² = (σ²/2π) |ψ(e^{−iλ})|².

3.2 Wold Decomposition (cont)

If σ² = E(Xt − P_{t−1}(Xt))² > 0, then FX = FU + FV is the Lebesgue decomposition of the spectral distribution function; FU is the absolutely continuous part and FV is the singular part.

Example. Let Xt = Ut + Y, where {Ut} ~ WN(0, σ²) and is independent of Y ~ (0, τ²). Then

  FX(dλ) = (σ²/2π) dλ + τ² δ0(dλ).

Kolmogorov's Formula.

  σ² = 2π exp{ (2π)⁻¹ ∫_{−π}^{π} ln f(λ) dλ },  where σ² = E(Xt − P_{t−1}(Xt))².

Clearly σ² = 0 iff ∫_{−π}^{π} ln f(λ) dλ = −∞.

3.2 Wold Decomposition (cont)

Example (TSTM Problem 5.23). Let

  Xt = Σ_j ψj Z_{t−j},  {Zt} ~ WN(0, τ²),  ψj = sin(j/2)/(πj).

This process has a spectral density function but is deterministic!!

Example (see TSTM Problem 5.20). Let

  Xt = εt + 2ε_{t−1},  {εt} ~ WN(0, τ²),

and set

  Zt = (1 + .5B)⁻¹ Xt = Σ_{j=0}^{∞} (−.5)^j X_{t−j}
     = εt + 2ε_{t−1} − .5(ε_{t−1} + 2ε_{t−2}) + .5²(ε_{t−2} + 2ε_{t−3}) − ⋯
     = εt − 3 Σ_{j=1}^{∞} (−.5)^j ε_{t−j}.

It follows that {Zt} ~ WN(0, σ²) and Xt = Zt + .5Z_{t−1} is the Wold decomposition for {Xt}.

a) If {εt} ~ IID N(0, τ²), is {Zt} IID? Answer?
b) If {εt} ~ IID(0, τ²), is {Zt} IID? Answer?
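The point of part (b) shows up in a short simulation. This is a sketch only; centered exponential noise is an assumed choice of non-Gaussian {εt}. The Wold innovations Zt come out uncorrelated, but their squares do not, so {Zt} is white noise without being IID.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
eps = rng.exponential(1.0, n) - 1.0      # IID, mean 0, non-Gaussian (assumed choice)
x = eps.copy()
x[1:] += 2.0 * eps[:-1]                  # X_t = eps_t + 2 eps_{t-1}

# Wold innovations Z_t = (1 + .5B)^{-1} X_t via the recursion Z_t = X_t - .5 Z_{t-1}
z = np.empty(n)
z[0] = x[0]
for t in range(1, n):
    z[t] = x[t] - 0.5 * z[t - 1]

def corr_at_lag1(a):
    d = a - a.mean()
    return np.dot(d[:-1], d[1:]) / np.dot(d, d)

print(corr_at_lag1(z))        # ~ 0: {Z_t} is white noise
print(corr_at_lag1(z ** 2))   # clearly positive: {Z_t} is not IID
```

The same diagnostic — flat ACF of the series, nonzero ACF of the squares — reappears throughout these lectures as the signature of uncorrelated-but-dependent noise.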

3.2 Wold Decomposition (cont)

Remark: In this last example, the process {Zt} is called an allpass model of order 1. More on this type of process later.

Go to ITSM Demo

3.3 Reversibility

Recall that the stationary time series {Xt} is time-reversible if (X1, . . . , Xn) =d (Xn, . . . , X1) for all n.

3.3 Reversibility

The stationary time series {Xt} is time-reversible if (X1, . . . , Xn) =d (Xn, . . . , X1) for all n.

Theorem (Breidt & Davis 1991). Consider the linear time series

  Xt = Σ_j ψj Z_{t−j},  {Zt} ~ IID,

where ψ(z) ≠ z^r ψ(z⁻¹) for any integer r. Assume either

(a) Z0 has mean 0 and finite variance, and {Xt} has a spectral density positive almost everywhere; or
(b) 1/ψ(z) = π(z) = Σ_j πj z^j, the series converging absolutely in some annulus D containing the unit circle, and π(B)Xt = Σ_j πj X_{t−j} = Zt.

Then {Xt} is time-reversible if and only if Z0 is Gaussian.

3.3 Reversibility (cont)

Remark: The condition ψ(z) ≠ z^r ψ(z⁻¹) precludes the filter from being symmetric about one of its coefficients. In the symmetric case, the time series would be time-reversible even for non-Gaussian noise. For example, consider the series

  Xt = Zt − .5Z_{t−1} + Z_{t−2},  {Zt} ~ IID.

Here ψ(z) = 1 − .5z + z² = z²(1 − .5z⁻¹ + z⁻²) = z² ψ(z⁻¹), and the series is time-reversible.

Proof of Theorem: Clearly any stationary Gaussian time series is time-reversible (why?). So suppose Z0 is non-Gaussian and assume (a). If {Xt} is time-reversible, then

  Zt = (1/ψ(B)) Xt =d (1/ψ(B)) X_{−t},  so that  Zt =d (ψ(B⁻¹)/ψ(B)) Zt = Σ_j aj Z_{t−j}.

3.3 Reversibility (cont)

The first equality takes a bit of argument and relies on the spectral representation of {Xt},

  Xt = ∫_{(−π,π]} e^{itλ} dZ(λ),

where Z(λ) is a process of orthogonal increments (see TSTM, Chapter 4). It follows, by the assumptions on the spectral density of {Xt}, that

  (1/ψ(B)) Xt = ∫_{(−π,π]} (e^{itλ} / ψ(e^{−iλ})) dZ(λ)

is well defined. So

  Zt =d (ψ(B⁻¹)/ψ(B)) Zt = Σ_j aj Z_{t−j},

and, by the assumption on ψ(z), the right-hand side is a non-trivial sum. Note that Σ_j aj² = 1. Why? The above relation is a characterization of the Gaussian distribution (see Kagan, Linnik, and Rao (1973)).

3.3 Reversibility (cont)

Example: Recall that for the example

  Xt = εt + 2ε_{t−1},  {εt} ~ IID(0, τ²)

and non-normal, the Wold decomposition is given by

  Xt = Zt + .5Z_{t−1},  where  Zt = εt − 3 Σ_{j=1}^{∞} (−.5)^j ε_{t−j}.

By the previous result, {Zt} cannot be time-reversible and hence is not IID.

Remark: This theorem can be used to show identifiability of the parameters and noise sequence for an ARMA process.

3.4 Identifiability

Motivating example: The invertible MA(1) process

  Xt = Zt + θZ_{t−1},  {Zt} ~ IID(0, σ²),  |θ| < 1,

has a non-invertible MA(1) representation,

  Xt = εt + θ⁻¹ε_{t−1},  {εt} ~ WN(0, θ²σ²),  |θ| < 1.

Question: Can the {εt} also be IID?
Answer: Only if the Zt are Gaussian.

If the Zt are Gaussian, then there is an identifiability problem:

  (θ, σ²) and (θ⁻¹, θ²σ²),  |θ| < 1,

give the same model.
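The identifiability problem is visible already in the autocovariances: both parameterizations produce the same γ(0) = σ²(1 + θ²) and γ(1) = σ²θ, so second-order (Gaussian) methods cannot tell them apart. A one-line check, with θ = .5 and σ² = 2 as assumed example values:

```python
# invertible MA(1): gamma(0) = sigma^2 (1 + theta^2), gamma(1) = sigma^2 theta
theta, sigma2 = 0.5, 2.0
g0_inv = sigma2 * (1 + theta**2)
g1_inv = sigma2 * theta

# non-invertible counterpart: parameters (theta^{-1}, theta^2 sigma^2)
theta_ni, sigma2_ni = 1 / theta, theta**2 * sigma2
g0_ni = sigma2_ni * (1 + theta_ni**2)
g1_ni = sigma2_ni * theta_ni

print(g0_inv, g0_ni)   # identical
print(g1_inv, g1_ni)   # identical
```

Distinguishing the two representations therefore requires non-Gaussianity and moments (or likelihoods) beyond second order, which is the theme of the rest of this section.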

3.4 Identifiability (cont)

For ARMA processes {Xt} satisfying the recursions

  Xt − φ1X_{t−1} − ⋯ − φpX_{t−p} = Zt + θ1Z_{t−1} + ⋯ + θqZ_{t−q},  {Zt} ~ IID(0, σ²),

i.e., φ(B)Xt = θ(B)Zt, causality and invertibility are typically assumed:

  φ(z) ≠ 0 and θ(z) ≠ 0 for |z| ≤ 1.

By flipping roots of the AR and MA polynomials from outside the unit circle to inside the unit circle, there are approximately 2^{p+q} equivalent ARMA representations of Xt driven by noise that is white (not IID). For each of these equivalent representations, the noise is IID only in the Gaussian case.

Bottom line: For non-Gaussian ARMA, there is a distinction between causal and noncausal, and between invertible and non-invertible models.

3.4 Identifiability (cont)

Theorem (Cheng 1992): Suppose the linear time series

  Xt = Σ_j ψj Z_{t−j},  {Zt} ~ IID(0, σ²),  Σ_j ψj² < ∞,

has a positive spectral density a.e. and can also be represented as

  Xt = Σ_j ηj Y_{t−j},  {Yt} ~ IID(0, τ²),  Σ_j ηj² < ∞.

Then if {Xt} is non-Gaussian, it follows that

  Yt = c Z_{t−t₀},  ηj = (1/c) ψ_{j+t₀},

for some positive constant c and integer t₀.

Proof of Theorem: As in the proof of the reversibility result, we can write

  Zt = (1/ψ(B)) Xt = (η(B)/ψ(B)) Yt = Σ_j aj Y_{t−j}  and  Yt = Σ_j bj Z_{t−j}.

3.4 Identifiability (cont)

Now let {Y(s,t)} be IID with Y(s,t) =d Y1 and set

  Ut = Σ_s a_s Y(s,t).

Clearly, {Ut} is IID with the same distribution as Z1. Consequently,

  Y1 =d Σ_t b_t Ut = Σ_t b_t Σ_s a_s Y(s,t).

Since Σ_t Σ_s b_t² a_s² = 1, applying Theorems 5.6.1 and 3.3.1 in Kagan, Linnik, and Rao (1973), the sum above is trivial; i.e., there exist integers m and n such that a_m and b_n are the only two nonzero coefficients. It follows that

  Yt = b_n Z_{t−n},  ηj = (1/b_n) ψ_{j+n}.

3.5 Linear Tests

Cumulants and polyspectra. We cannot base tests for linearity on second moments. A direct approach is to consider moments of higher order and the corresponding generalizations of spectral analysis.

Suppose that {Xt} satisfies sup_t E|Xt|^k < ∞ for some k ≥ 3, and

  E(X_{t0} X_{t1} ⋯ X_{tj}) = E(X_{t0+h} X_{t1+h} ⋯ X_{tj+h})

for all t0, t1, . . . , tj, h = 0, ±1, . . . , and j = 0, . . . , k−1.

kth-order cumulant. Ck(r1, . . . , r_{k−1}) is the coefficient of i^k z1 z2 ⋯ zk in the Taylor series expansion about (0, 0, . . . , 0) of

  χ(z1, . . . , zk) = ln E exp(i z1 Xt + i z2 X_{t+r1} + ⋯ + i zk X_{t+r_{k−1}}).

3.5 Linear Tests (cont)

3rd-order cumulant.

  C3(r, s) = E((Xt − μ)(X_{t+r} − μ)(X_{t+s} − μ)).

If Σ_r Σ_s |C3(r, s)| < ∞, then we define the bispectral density (or 3rd-order polyspectral density) to be the Fourier transform

  f3(ω1, ω2) = (2π)⁻² Σ_r Σ_s C3(r, s) e^{−irω1 − isω2},  −π ≤ ω1, ω2 ≤ π.

3.5 Linear Tests (cont)

kth-order polyspectral density. Provided

  Σ_{r1} ⋯ Σ_{r_{k−1}} |Ck(r1, . . . , r_{k−1})| < ∞,

define

  f_k(ω1, . . . , ω_{k−1}) := (2π)^{−(k−1)} Σ_{r1} ⋯ Σ_{r_{k−1}} Ck(r1, . . . , r_{k−1}) e^{−ir1ω1 − ⋯ − ir_{k−1}ω_{k−1}},

for −π ≤ ω1, . . . , ω_{k−1} ≤ π. (See Rosenblatt (1985), Stationary Sequences and Random Fields, for more details.)

3.5 Linear Tests (cont)

Applied to a linear process. If {Xt} has the Wold decomposition

  Xt = Σ_{j=0}^{∞} ψj Z_{t−j},  {Zt} ~ IID(0, σ²),

with E|Zt|³ < ∞, EZt³ = η, and Σ_j |ψj| < ∞, then

  C3(r, s) = η Σ_j ψj ψ_{j+r} ψ_{j+s},

where ψj := 0 for j < 0. Hence

  f3(ω1, ω2) = (η/4π²) ψ(e^{i(ω1+ω2)}) ψ(e^{−iω1}) ψ(e^{−iω2}).

3.5 Linear Tests (cont)

The spectral density of {Xt} is

  f(ω) = (σ²/2π) |ψ(e^{−iω})|².

Hence, defining

  φ(ω1, ω2) = |f3(ω1, ω2)|² / (f(ω1) f(ω2) f(ω1+ω2)),

we find that

  φ(ω1, ω2) = η² / (2πσ⁶).

Testing for constancy of φ( ) thus provides a test for linearity of {Xt} (see Subba Rao and Gabr (1980)).
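The constancy of φ(ω1, ω2) can be verified numerically from the formulas above. In this sketch the MA(2) filter, σ², and η are assumed example values; the ratio comes out equal to η²/(2πσ⁶) at every frequency pair, because the three ψ factors in |f3|² cancel against those in f(ω1)f(ω2)f(ω1+ω2).

```python
import numpy as np

psi = np.array([1.0, 0.8, -0.3])   # assumed example MA(2) filter
sigma2, eta = 1.5, 2.0             # assumed noise variance and third moment E Z_t^3

def psi_at(w):
    """psi(e^{-iw}) = sum_j psi_j e^{-ijw}."""
    j = np.arange(len(psi))
    return np.sum(psi * np.exp(-1j * j * w))

def f(w):
    """Spectral density f(w) = sigma^2/(2 pi) |psi(e^{-iw})|^2."""
    return sigma2 / (2 * np.pi) * abs(psi_at(w)) ** 2

def f3(w1, w2):
    """Bispectral density f3 = eta/(4 pi^2) psi(e^{i(w1+w2)}) psi(e^{-i w1}) psi(e^{-i w2})."""
    return eta / (4 * np.pi**2) * psi_at(-(w1 + w2)) * psi_at(w1) * psi_at(w2)

const = eta**2 / (2 * np.pi * sigma2**3)
phis = [abs(f3(w1, w2)) ** 2 / (f(w1) * f(w2) * f(w1 + w2))
        for (w1, w2) in [(0.3, 0.7), (1.1, -0.4), (2.0, 0.5)]]
print(phis, const)   # the same constant at every frequency pair
```

In practice the test replaces f and f3 by estimates and checks whether the estimated φ surface is flat, which is exactly the Subba Rao-Gabr construction referenced above.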

3.5 Linear Tests (cont)

Gaussian linear process. If {Xt} is Gaussian, then EZ³ = 0 and the third-order cumulant is zero (why?). In fact, Ck = 0 for all k > 2.

It follows that f3(ω1, ω2) = 0 for all ω1, ω2 ∈ [0, π]. A test for linear Gaussianity can therefore be obtained by estimating f3(ω1, ω2) and testing the hypothesis that f3 = 0 (see Subba Rao and Gabr (1980)).

3.6 Prediction

Suppose {Xt} is a purely nondeterministic process with Wold decomposition

  Xt = Σ_{j=0}^{∞} ψj Z_{t−j},  {Zt} ~ WN(0, σ²).

Then

  Zt = Xt − P_{t−1}(Xt),  so that  P_{t−1}(Xt) = Σ_{j=1}^{∞} ψj Z_{t−j}.

Question. When does the best linear predictor equal the best predictor? That is, when does

  P_{t−1}(Xt) = E(Xt | X_{t−1}, X_{t−2}, . . .)?

3.6 Prediction (cont)

  P_{t−1}(Xt) = E(Xt | X_{t−1}, X_{t−2}, . . .)?

Answer. We need Zt = Xt − P_{t−1}(Xt) to be orthogonal to σ(X_{t−1}, X_{t−2}, . . .), or, equivalently,

  E(Zt | X_{t−1}, X_{t−2}, . . .) = 0.

That is, BLP = BP if and only if {Zt} is a martingale-difference sequence.

Def. {Zt} is a martingale-difference sequence wrt a filtration Ft (an increasing sequence of sigma fields) if E|Zt| < ∞ for all t and
  a) Zt is Ft-measurable;
  b) E(Zt | F_{t−1}) = 0 a.s.

3.6 Prediction (cont)

Remarks.
1) An IID sequence with mean zero is a MG-difference sequence.
2) A purely nondeterministic Gaussian process is a Gaussian linear process. This follows from the Wold decomposition and the fact that the resulting {Zt} sequence must be IID N(0, σ²).

Example (Whittle): Consider the noncausal AR(1) process given by

  Xt = 2X_{t−1} + Zt,

where {Zt} ~ IID with P(Zt = −1) = P(Zt = 0) = .5. Iterating backwards in time, we find that

  X_{t−1} = .5Xt − .5Zt = .5²X_{t+1} − .5²Z_{t+1} − .5Zt = ⋯ = −.5(Zt + .5Z_{t+1} + .5²Z_{t+2} + ⋯).

3.6 Prediction (cont)

  Xt = −.5(Z_{t+1} + .5Z_{t+2} + .5²Z_{t+3} + ⋯) = Z*_{t+1}/2 + Z*_{t+2}/2² + Z*_{t+3}/2³ + ⋯,  Z*_t = −Z_t,

which is a binary expansion of a uniform (0,1) random variable. Notice that from Xt we can find X_{t+1} by lopping off the first term in the binary expansion. This operation is exactly

  X_{t+1} = 2Xt mod 1 = { 2Xt, if Xt < .5;  2Xt − 1, if Xt ≥ .5 }.
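The binary-expansion representation can be simulated directly; this is a sketch, with the truncation level K an assumed tuning choice. It confirms both the recursion Xt = 2X_{t−1} + Zt and the uniform marginal. (Iterating 2x mod 1 in floating point instead would collapse to 0 after about 53 doublings, since each step shifts the mantissa bits out, so the series is built from the Zt.)

```python
import numpy as np

rng = np.random.default_rng(3)
n, K = 5_000, 40                        # K-term truncation of the infinite sum
z = -rng.integers(0, 2, n + K)          # Z_t IID with P(Z = -1) = P(Z = 0) = 1/2

# X_t = -(.5 Z_{t+1} + .5^2 Z_{t+2} + ...): a binary expansion of a Uniform(0,1) r.v.
w = 0.5 ** np.arange(1, K + 1)
x = np.array([-np.dot(w, z[t + 1 : t + K + 1]) for t in range(n)])

# the noncausal AR(1) recursion X_{t+1} = 2 X_t + Z_{t+1}, i.e. X_{t+1} = 2 X_t mod 1,
# holds up to the 2^{-K} truncation error
err = np.max(np.abs(x[1:] - (2 * x[:-1] + z[1:n])))
print(err)
print(x.mean(), x.var())                # ~ 1/2 and ~ 1/12, the Uniform(0,1) values
```

Since X_{t+1} is an exact deterministic (but nonlinear) function of Xt, the best predictor has zero error while the best linear predictor does not: a concrete case where BLP ≠ BP.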

4. Allpass models

[Figure: realization of length 1000 from an allpass model of order 2 with t₃ noise, with sample ACFs of the series and of its squares, lags 0-40.]

4. Allpass models (cont)

Causal AR polynomial: φ(z) = 1 − φ1z − ⋯ − φpz^p,  φ(z) ≠ 0 for |z| ≤ 1.

Define the MA polynomial:

  θ(z) = z^p φ(z⁻¹) / (−φp) = (z^p − φ1z^{p−1} − ⋯ − φp) / (−φp),

so θ(z) ≠ 0 for |z| ≥ 1 (the MA polynomial is non-invertible).

Model for data {Xt}: φ(B)Xt = θ(B)Zt, {Zt} ~ IID (non-Gaussian), where B^k Xt = X_{t−k}.

Examples:
  All-pass(1): Xt = φX_{t−1} + Zt − φ⁻¹Z_{t−1},  |φ| < 1.
  All-pass(2): Xt = φ1X_{t−1} + φ2X_{t−2} + Zt + (φ1/φ2)Z_{t−1} − (1/φ2)Z_{t−2}.

Properties:
- Causal, non-invertible ARMA with MA representation

  Xt = [B^p φ(B⁻¹) / (−φp φ(B))] Zt = Σ_{j=0}^{∞} ψj Z_{t−j}.

- Uncorrelated (flat spectrum):

  f_X(ω) = (σ²/2πφp²) |φ(e^{iω})|² / |φ(e^{−iω})|² = σ²/(2πφp²).

- Zero mean.
- Data are dependent if the noise is non-Gaussian (e.g., Breidt & Davis 1991).
- Squares and absolute values are correlated.
- Xt is heavy-tailed if the noise is heavy-tailed.
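These properties show up clearly in simulation. Below is a sketch of an all-pass(1) with φ = 0.5; Laplace noise is an assumed stand-in for the t₃ noise used in the figures, chosen to keep fourth moments finite. The sample ACF of Xt is flat, while the squares are visibly correlated.

```python
import numpy as np

rng = np.random.default_rng(4)
n, burn = 200_000, 500
phi = 0.5
z = rng.laplace(0.0, 1.0, n + burn)     # IID non-Gaussian noise (assumed choice)

# all-pass(1): X_t = phi X_{t-1} + Z_t - (1/phi) Z_{t-1}
x = np.zeros(n + burn)
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + z[t] - z[t - 1] / phi
x = x[burn:]

def acf1(a):
    d = a - a.mean()
    return np.dot(d[:-1], d[1:]) / np.dot(d, d)

print(acf1(x))        # ~ 0: flat spectrum, uncorrelated
print(acf1(x ** 2))   # clearly positive: dependent, hence not IID
```

Run with Gaussian noise instead, the squares also decorrelate — consistent with the fact that a Gaussian all-pass is indistinguishable from IID noise.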

Estimation for All-Pass Models

- Second-order moment techniques do not work:
  - least squares
  - Gaussian likelihood
- Higher-order cumulant methods:
  - Giannakis and Swami (1990)
  - Chi and Kung (1995)
- Non-Gaussian likelihood methods:
  - likelihood approximation assuming a known density
  - quasi-likelihood
- Other:
  - LAD (least absolute deviation)
  - R-estimation (minimum dispersion)

4.1 Application of Allpass Models

Noninvertible MA models with heavy-tailed noise:

  Xt = Zt + θ1Z_{t−1} + ⋯ + θqZ_{t−q},

a. {Zt} ~ IID non-normal;
b. θ(z) = 1 + θ1z + ⋯ + θqz^q.

No zeros inside the unit circle ⇒ invertible.
Some zero(s) inside the unit circle ⇒ noninvertible.

Realizations of invertible and noninvertible MA(2) processes.

Model: Xt = θ(B)Zt, {Zt} ~ IID (α = 1), where

  θi(B) = (1 + (1/2)B)(1 + (1/3)B)  and  θni(B) = (1 + 2B)(1 + 3B).

[Figure: realizations of the two processes with their sample ACFs, lags 0-10.]

Application of all-pass to noninvertible MA model fitting.

Suppose {Xt} follows the noninvertible MA model

  Xt = θi(B) θni(B) Zt,  {Zt} ~ IID.

Step 1: Let {Ut} be the residuals obtained by fitting a purely invertible MA model, i.e.,

  Xt = θ̂(B)Ut ≈ θi(B) θni*(B) Ut,  (θni* is the invertible version of θni).

So

  Ut = [θni(B) / θni*(B)] Zt.

Step 2: Fit a purely causal AP model to {Ut}:

  θni*(B)Ut = θni(B)Zt.

Volumes of Microsoft (MSFT) stock traded over 755 transaction days (6/3/96 to 5/28/99).

[Figure: daily trading volume of MSFT.]

Analysis of MSFT:

Step 1: Log(volume) follows an MA(4):

  Xt = (1 + .513B + .277B² + .270B³ + .202B⁴) Ut  (invertible MA(4)).

Step 2: All-pass model of order 4 fitted to {Ut} using MLE (t-dist):

  (1 − .628B − .229B² − .131B³ − .202B⁴) Ut = (1 + .649B + 1.135B² + 3.116B³ − 4.960B⁴) Zt,  (ν̂ = 6.26).

(The model using R-estimation is nearly the same.)

Conclude that {Xt} follows a noninvertible MA(4), which after refitting has the form

  Xt = (1 + 1.34B + 1.374B² + 2.54B³ + 4.96B⁴) Zt,  {Zt} ~ IID t(6.3).

[Figure: ACFs of the squares and of the absolute values of Ut and of Zt, lags 0-40.]

Summary: Microsoft Trading Volume

- Two-step fit of a noninvertible MA(4):
  - invertible MA(4): residuals not IID
  - causal AP(4): residuals IID
- Direct fit of a purely noninvertible MA(4):

  (1 + 1.34B + 1.374B² + 2.54B³ + 4.96B⁴)

- For MCHP, an invertible MA(4) fits.

Muddy Creek (cont)

Minimum AICc ARMA model:

  ARMA(1,1): Yt = .574 Y_{t−1} + εt − .311 ε_{t−1},  {εt} ~ WN(0, .0564).

[Figure: Muddy Creek residuals from the polynomial (degree 4) fit, with sample (blue) and model (red) ACFs, lags 0-400 m.]

Causal ARMA(1,1) model: Yt = .574 Y_{t−1} + εt − .311 ε_{t−1},  {εt} ~ WN(0, .0564).
Noncausal ARMA(1,1) model: Yt = 1.743 Y_{t−1} + εt − .311 ε_{t−1}.

[Figure: residual ACFs (absolute values and squares) for the causal and noncausal models, lags 0-40.]

Example: Seismogram Deconvolution

Simulated water gun seismogram:
- {βk}: wavelet sequence (Lii and Rosenblatt, 1988)
- {Zt}: IID reflectivity sequence

[Figure: simulated seismogram, time 0-1000.]

Water Gun Seismogram Fit

Step 1: AICC suggests an ARMA(12,13) fit.
- Fit an invertible ARMA(12,13) via Gaussian MLE.
- Residuals not IID.

Step 2: Fit an all-pass model to the residuals.
- Order selected is r = 2.
- Residuals appear IID.

Step 3: Conclude that {Xt} follows a non-invertible ARMA.

[Figure: ACF of Wt and ACF of Zt, lags h = 0-30.]

Water Gun Seismogram Fit (cont)

[Figure: recorded water gun wavelet and its estimate.]

Water Gun Seismogram Fit (cont)

[Figure: simulated reflectivity sequence and its estimates.]

4.2 Estimation for Allpass Models: Approximating the Likelihood

Data: (X1, . . . , Xn).

Model:

  Xt − φ01X_{t−1} − ⋯ − φ0pX_{t−p} = (Z_{t−p} − φ01Z_{t−p+1} − ⋯ − φ0pZt) / φ0r,

where φ0r is the last non-zero coefficient among the φ0j's.

Noise:

  z_{t−p} − φ01z_{t−p+1} − ⋯ − φ0pzt = Xt − φ01X_{t−1} − ⋯ − φ0pX_{t−p},

where zt = Zt / φ0r.

More generally, define

  zt(φ) = 0,  if t = n−p+1, . . . , n;
  z_{t−p}(φ) = φ1z_{t−p+1}(φ) + ⋯ + φpzt(φ) + φ(B)Xt,  for t = n, . . . , p+1.

Note: zt(φ0) is a close approximation to zt (up to initialization error).
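For p = 1 the backward recursion is easy to code. This sketch (φ = 0.6 assumed) simulates an all-pass(1) under the convention above and checks that zt(φ), run from the zero terminal condition, reproduces zt = Zt/φ up to an initialization error that decays like φ^{n−t}.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
phi = 0.6
zt_true = rng.standard_normal(n + 1) / phi        # z_t = Z_t / phi, Z_t IID N(0,1)
Z = zt_true * phi

# simulate the all-pass(1): X_t - phi X_{t-1} = (Z_{t-1} - phi Z_t) / phi
x = np.zeros(n + 1)
for t in range(1, n + 1):
    x[t] = phi * x[t - 1] + (Z[t - 1] - phi * Z[t]) / phi

# backward recursion: z_n(phi) = 0, then
# z_{t-1}(phi) = phi z_t(phi) + (X_t - phi X_{t-1}),  t = n, ..., 2
z_hat = np.zeros(n + 1)
for t in range(n, 1, -1):
    z_hat[t - 1] = phi * z_hat[t] + (x[t] - phi * x[t - 1])

# away from the terminal end, z_t(phi) reproduces z_t to machine precision
err = np.max(np.abs(z_hat[1 : n - 100] - zt_true[1 : n - 100]))
print(err)
```

The recovered zt(φ) are what get plugged into the noise density fσ to build the approximate likelihood on the next slide; because |φ| < 1, the wrong terminal value washes out geometrically fast going backwards.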

Assume that Zt has density function fσ and consider the vector

  z = (X_{1−p}, . . . , X0, z_{1−p}(φ), . . . , z0(φ), z1(φ), . . . , z_{n−p}(φ))′,

whose two blocks — the initial values (X_{1−p}, . . . , X0, z_{1−p}(φ), . . . , z0(φ)) and the noise terms z1(φ), . . . , z_{n−p}(φ) — are independent pieces.

Joint density of z:

  h(z) = h1(X_{1−p}, . . . , X0, z_{1−p}(φ), . . . , z0(φ)) ∏_{t=1}^{n−p} fσ(φr zt(φ)) |φr|.

