1D Kalman Filter: Kalman Filter for Computing an On-line Average

Transcription

6.869 Computer Vision: Particle Filter Tracking
Prof. Bill Freeman
– Particle filtering
Readings: F&P extra chapter, "Particle Filtering"

Schedule
– Tuesday, May 3: Particle filters, tracking humans; Exam 2 out
– Thursday, May 5: Tracking humans, and how to write conference papers & give talks; Exam 2 due
– Tuesday, May 10: Motion microscopy, separating shading and paint ("fun things my group is doing")
– Thursday, May 12: 5-10 min. student project presentations, projects due

1D Kalman filter

Kalman filter for computing an on-line average
What Kalman filter parameters and initial conditions should we pick so that the optimal estimate for x at each iteration is just the average of all the observations seen so far?

Kalman filter model: d_i = 1, m_i = 1, σ_di = 0, σ_mi = 1
Initial conditions: x_0 = 0, σ_0 = ∞

Iteration i            0      1                2
x_i (predicted mean)   0      y_0              (y_0 + y_1)/2
x_i (corrected mean)   y_0    (y_0 + y_1)/2    (y_0 + y_1 + y_2)/3
σ_i (prediction)       ∞      1                1/√2
σ_i (correction)       1      1/√2             1/√3
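The table above can be checked numerically. Below is a minimal sketch (not from the slides; the function and variable names, such as scalar_kalman and var0, are mine) of a scalar Kalman filter with d_i = m_i = 1, where a very large prior variance stands in for σ_0 = ∞. With σ_d = 0 and σ_m = 1 the corrected estimate at each step is exactly the running average of the observations seen so far:

```python
import numpy as np

def scalar_kalman(ys, sigma_d, sigma_m, x0=0.0, var0=1e12):
    """Scalar Kalman filter with d_i = m_i = 1 (illustrative sketch).

    Dynamics:     x_i = x_{i-1} + N(0, sigma_d^2)
    Measurement:  y_i = x_i     + N(0, sigma_m^2)
    var0 is a huge prior variance standing in for sigma_0 = infinity.
    """
    x, var = x0, var0
    estimates = []
    for y in ys:
        var = var + sigma_d ** 2          # predict: dynamics noise inflates the variance
        K = var / (var + sigma_m ** 2)    # Kalman gain
        x = x + K * (y - x)               # correct: blend prediction and measurement
        var = (1.0 - K) * var
        estimates.append(x)
    return np.array(estimates)

ys = np.array([2.0, 4.0, 9.0])
print(scalar_kalman(ys, sigma_d=0.0, sigma_m=1.0))    # [2. 3. 5.]
print(np.cumsum(ys) / np.arange(1, len(ys) + 1))      # the same running averages
```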

What happens if the x dynamics are given a non-zero variance?

Kalman filter model: d_i = 1, m_i = 1, σ_di = 1, σ_mi = 1
Initial conditions: x_0 = 0, σ_0 = ∞

Iteration i            0      1                 2
x_i (predicted mean)   0      y_0               (y_0 + 2y_1)/3
x_i (corrected mean)   y_0    (y_0 + 2y_1)/3    (y_0 + 2y_1 + 5y_2)/8
σ_i (prediction)       ∞      √2                √(5/3)
σ_i (correction)       1      √(2/3)            √(5/8)

Recent observations are now weighted more heavily than older ones, because the dynamics noise makes old measurements less informative about the current state.
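As a quick check of the second table, the hypothetical scalar_kalman sketch from above can be re-run with the non-zero dynamics variance (again an illustration, not code from the lecture):

```python
# Reuses the scalar_kalman sketch defined earlier.
ys = [2.0, 4.0, 9.0]
print(scalar_kalman(ys, sigma_d=1.0, sigma_m=1.0)[-1])   # 6.875
print((ys[0] + 2 * ys[1] + 5 * ys[2]) / 8)               # 6.875, matching (y_0 + 2y_1 + 5y_2)/8
```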

(KF) Distribution propagation [Isard 1998]
– Prediction from the previous time frame
– Noise added to that prediction
– Make a new measurement at the next time frame

Representing non-linear distributions
– Unimodal parametric models fail to capture real-world densities.
– Mixture models are appealing, but very hard to propagate analytically!

Representing distributions using weighted samples
– Rather than a parametric form, use a set of samples to represent a density.
– You can also think of this as a sum of Dirac delta functions, each of weight w:
  p_f(x) = Σ_i w_i δ(x − u_i)

Marginalizing a sampled density
– If we have a sampled representation of a joint density and we wish to marginalize over one variable, we can simply ignore the corresponding components of the samples (!).
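The following is a small sketch of the weighted-sample representation and of the marginalize-by-ignoring-components trick; the array names and the choice of test density are mine, not the lecture's:

```python
import numpy as np

rng = np.random.default_rng(0)

# N weighted samples u_i of a 2-D joint density p(x, y) ~= sum_i w_i * delta((x, y) - u_i).
N = 10000
samples = rng.normal(size=(N, 2))       # here the u_i happen to come from a standard normal
weights = np.full(N, 1.0 / N)           # a fair sample gets equal weights

# Expectations under the sampled density are weighted sums over the samples.
print(np.sum(weights * samples[:, 0] ** 2))   # ~= E[x^2] = 1

# Marginalizing over y: keep the weights, ignore the y-component of every sample.
x_samples = samples[:, 0]
print(np.sum(weights * x_samples))            # ~= E[x] = 0
```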

Sampled Bayes rule
– Transforming a sampled representation of a prior into a sampled representation of a posterior:
  posterior ∝ likelihood × prior

Sampled prediction
– Drop elements to marginalize, to get a sampled representation of the prediction.

Sampled correction (Bayes rule): prior → posterior
– Reweight every sample with the likelihood of the observations, given that sample, yielding a set of samples describing the probability distribution after the correction (update) step.

Naïve PF tracking
– Start with samples from something simple (Gaussian)
– Repeat: correct, predict
– Test with the linear case (kf: x, pf: o)
– But it doesn't work that well, because of sample impoverishment
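A sketch of this naïve predict/correct loop on a scalar linear test case follows; it is only an illustration under assumptions of mine (Gaussian dynamics and measurement noise, a synthetic random-walk observation sequence, and names like predict and correct), not the code behind the slides:

```python
import numpy as np

rng = np.random.default_rng(1)

sigma_d, sigma_m, N = 1.0, 1.0, 100
particles = rng.normal(0.0, 1.0, size=N)   # start with samples from something simple (Gaussian)
weights = np.full(N, 1.0 / N)

def predict(particles):
    # Sampled prediction: push every sample through the dynamics and add dynamics noise.
    return particles + rng.normal(0.0, sigma_d, size=particles.shape)

def correct(particles, weights, y):
    # Sampled Bayes rule: reweight every sample by the likelihood of the observation y.
    likelihood = np.exp(-0.5 * ((y - particles) / sigma_m) ** 2)
    weights = weights * likelihood
    return weights / weights.sum()

ys = rng.normal(0.0, sigma_d, size=20).cumsum()   # a synthetic random-walk track to follow
for y in ys:
    particles = predict(particles)
    weights = correct(particles, weights, y)
    estimate = np.sum(weights * particles)        # weighted-mean state estimate

# With no resampling, a few particles end up carrying most of the total weight:
print(np.sort(weights)[-5:].sum())                # sample impoverishment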

Sample impoverishment
[Plot: 10 of the 100 particles, along with the true Kalman filter track and its variance, shown over time.]

Resample the prior
– In a sampled density representation, the frequency of samples can be traded off against weight, such that the new samples are a representation of the same density.
– I.e., make N draws with replacement from the original set of samples, using the weights as the probability of drawing a sample.

A practical particle filter with resampling

Resampling concentrates samples

A variant (predict, then resample, then correct) [Isard 1998]
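A sketch of the resampling step described above (the function name is mine; NumPy's Generator.choice does the draw-with-replacement using the weights as probabilities):

```python
import numpy as np

rng = np.random.default_rng(2)

def resample(particles, weights):
    # Make N draws with replacement from the particle set, using the weights
    # as the probability of drawing each sample. The resampled set, now with
    # uniform weights, represents the same density, but concentrates
    # particles where the weight was high.
    N = len(particles)
    idx = rng.choice(N, size=N, replace=True, p=weights)
    return particles[idx], np.full(N, 1.0 / N)
```

In a practical particle filter this step is inserted into the predict/correct loop sketched earlier, either after the correction or, in the variant shown on the slides, between prediction and correction.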

A variant (animation) [Isard 1998]

Applications
– Tracking: hands, bodies, leaves

Contour tracking [Isard 1998]

Head tracking [Isard 1998]

Leaf tracking [Isard 1998]

Hand tracking [Isard 1998]

Mixed state tracking [Isard 1998]

Outline
– Sampling densities
– Particle filtering

[Figures from F&P except as noted]

