Observers and Kalman Filters

Stochastic Models of an Uncertain World

• Actions are uncertain.
• Observations are uncertain.
• The εᵢ ~ N(0, σᵢ²) are random variables.

A deterministic model

  ẋ = F(x, u)
  y = G(x)

becomes a stochastic model once process and observation noise are included:

  ẋ = F(x, u, ε₁)
  y = G(x, ε₂)

Estimates and Uncertainty
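The stochastic model above can be sketched as a short simulation. This is a minimal illustrative sketch, not part of the original slides: it assumes the same simplified forms the deck uses later, F(x, u, ε₁) = u + ε₁ and G(x, ε₂) = x + ε₂, with an Euler discretization; the function name, step size, and noise levels are all illustrative choices.

```python
import random

def simulate(x0, u, steps, dt=0.1, sigma1=0.05, sigma2=0.1, seed=0):
    """Euler-discretized simulation of a scalar stochastic system
    (illustrative sketch):
      x' = F(x, u, eps1) = u + eps1   (process noise)
      y  = G(x, eps2)    = x + eps2   (observation noise)
    with eps_i ~ N(0, sigma_i^2)."""
    rng = random.Random(seed)
    x = x0
    ys = []
    for _ in range(steps):
        eps1 = rng.gauss(0.0, sigma1)
        x += (u + eps1) * dt           # process noise perturbs the dynamics
        eps2 = rng.gauss(0.0, sigma2)
        ys.append(x + eps2)            # observation noise corrupts each measurement
    return x, ys
```

With u = 1.0 the noiseless state after 50 steps of dt = 0.1 would be exactly 5.0; the noisy run lands near it, while each observation in `ys` scatters around the true state.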
• Conditional probability density function

[Figure: conditional probability density of the state given the observations; details unrecoverable from extraction.]
Gaussian (Normal) Distribution

• Completely described by N(μ, σ²)
  • Mean μ
  • Standard deviation σ, variance σ²

  p(x) = (1 / (√(2π) σ)) · e^(−(x − μ)² / 2σ²)

The Central Limit Theorem

• The sum of many random variables
  • with the same mean, but
  • with arbitrary conditional density functions,
  converges to a Gaussian density function.
• If a model omits many small unmodeled effects, then the resulting error should converge to a Gaussian density function.

Estimating a Value

• Suppose there is a constant value x.
  • Distance to wall; angle to wall; etc.
• At time t₁, observe value z₁ with variance σ₁².
• The optimal estimate is x̂(t₁) = z₁ with variance σ₁².

A Second Observation

• At time t₂, observe value z₂ with variance σ₂².

Merged Evidence
• Combine the two observations, weighting each by its certainty (inverse variance):

  x̂(t₂) = (σ₂² / (σ₁² + σ₂²)) · z₁ + (σ₁² / (σ₁² + σ₂²)) · z₂

  1/σ²(t₂) = 1/σ₁² + 1/σ₂²

[Figure: the two observation densities and the narrower merged density.]
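The merged-evidence formula can be written as a tiny fusion routine. A minimal sketch (the function name is illustrative); it uses the gain form K = σ₁²/(σ₁² + σ₂²), which is algebraically identical to the weighted average above:

```python
def merge(z1, var1, z2, var2):
    """Fuse two noisy observations of the same quantity by
    inverse-variance weighting (the merged-evidence formula)."""
    K = var1 / (var1 + var2)       # gain: how much to trust the second observation
    x_hat = z1 + K * (z2 - z1)     # equals (var2*z1 + var1*z2) / (var1 + var2)
    var = (1.0 - K) * var1         # satisfies 1/var = 1/var1 + 1/var2
    return x_hat, var
```

Note that the more certain observation (smaller variance) pulls the estimate toward itself, and the merged variance is smaller than either input variance; for example, `merge(10.0, 1.0, 12.0, 1.0)` gives the midpoint 11.0 with variance 0.5.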
Predictor-Corrector

• Update best estimate given new data:

  x̂(t₂) = x̂(t₁) + K(t₂) (z₂ − x̂(t₁)),   where K(t₂) = σ₁² / (σ₁² + σ₂²)

• Update variance:

  σ²(t₂) = σ²(t₁) − K(t₂) σ²(t₁) = (1 − K(t₂)) σ²(t₁)

Static to Dynamic

• Now suppose x changes according to

  ẋ = F(x, u, ε) = u + ε,   ε ~ N(0, σ_ε²)

Dynamic Prediction

• At t₂ we know x̂(t₂) with variance σ²(t₂).
• At t₃⁻ (after the change, before an observation):

  x̂(t₃⁻) = x̂(t₂) + u·[t₃ − t₂]
  σ²(t₃⁻) = σ²(t₂) + σ_ε²·[t₃ − t₂]

• Next, we correct this prediction with the observation at time t₃.

Kalman Filter

• Takes a stream of observations, and a dynamical model.
• At each step, a weighted average between
  • prediction from the dynamical model
  • correction from the observation.
• The Kalman gain K(t) is the weighting,
  • based on the variances σ²(t) and the observation variance.
• With time, K(t) and σ²(t) tend to stabilize.

Simplifications

• We have only discussed a one-dimensional system:

  ẋ = F(x, u, ε₁) = u + ε₁
  z = G(x, ε₂) = x + ε₂

  • Most applications are higher dimensional.
• We have assumed the state variable is observable.
  • In general, sense data give indirect evidence.
• We will discuss the more complex case next.

Up To Higher Dimensions

• Our previous Kalman Filter discussion was of a simple one-dimensional model.
• Now we go up to higher dimensions:
  • State vector:  x ∈ ℝⁿ
  • Sense vector:  z ∈ ℝᵐ
  • Motor vector:  u ∈ ℝˡ
• First, a little statistics.

Covariance Matrix

• Along the diagonal, Cᵢᵢ are variances σᵢ².
• Off-diagonal Cᵢⱼ are essentially correlations.

  C = [ C₁,₁ = σ₁²   C₁,₂   …   C₁,N
        C₂,₁         C₂,₂ = σ₂²  …
         ⋮                  ⋱
        C_N,₁         …        C_N,N = σ_N² ]

Independent Variation

• x and y are independent Gaussian random variables (N = 100 samples).
• Generated with σ_x = 1, σ_y = 3.
• Measured covariance matrix:

  C_xy = [ 0.90   0.44
           0.44   8.82 ]

Dependent Variation

• c and d are random variables.
• Generated with c = x + y, d = x − y.
• Measured covariance matrix:

  C_cd = [ 10.62   −7.93
           −7.93    8.84 ]

  (The off-diagonal is negative: Cov(c, d) = Var(x) − Var(y) ≈ 0.90 − 8.82 < 0.)

Time Update (Predictor)

• Update expected value of x:

  x̂ₖ⁻ = A x̂ₖ₋₁ + B uₖ₋₁

• Update error covariance matrix P:

  Pₖ⁻ = A Pₖ₋₁ Aᵀ + Q

• Previous statements were simplified versions of the same idea:

  x̂(t₃⁻) = x̂(t₂) + u·[t₃ − t₂]
  σ²(t₃⁻) = σ²(t₂) + σ_ε²·[t₃ − t₂]

Measurement Update (Corrector)

• Update expected value:

  x̂ₖ = x̂ₖ⁻ + Kₖ (zₖ − H x̂ₖ⁻)

  • The innovation is zₖ − H x̂ₖ⁻.
• Update error covariance matrix:

  Pₖ = (I − Kₖ H) Pₖ⁻

• Compare with the previous form:

  x̂(t₃) = x̂(t₃⁻) + K(t₃) (z₃ − x̂(t₃⁻))
  σ²(t₃) = (1 − K(t₃)) σ²(t₃⁻)

The Kalman Gain

• The optimal Kalman gain Kₖ is

  Kₖ = Pₖ⁻ Hᵀ (H Pₖ⁻ Hᵀ + R)⁻¹

• Compare with the previous form:

  K(t₃) = σ²(t₃⁻) / (σ²(t₃⁻) + σ₃²)

Linearize the Non-Linear

• Let A be the Jacobian of f with respect to x:

  Aᵢⱼ = ∂fᵢ/∂xⱼ (x̂ₖ₋₁, uₖ₋₁)

• Let H be the Jacobian of h with respect to x:

  Hᵢⱼ = ∂hᵢ/∂xⱼ (x̂ₖ⁻)

• Then the Kalman Filter equations are almost the same as before!

EKF Update Equations

• Predictor step:

  x̂ₖ⁻ = f(x̂ₖ₋₁, uₖ₋₁)
  Pₖ⁻ = A Pₖ₋₁ Aᵀ + Q

• Kalman gain:

  Kₖ = Pₖ⁻ Hᵀ (H Pₖ⁻ Hᵀ + R)⁻¹

• Corrector step:

  x̂ₖ = x̂ₖ⁻ + Kₖ (zₖ − h(x̂ₖ⁻))
  Pₖ = (I − Kₖ H) Pₖ⁻
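The matrix time update and measurement update can be combined into one predict/correct cycle. A minimal sketch of those equations; the constant-velocity model and the numbers in the usage below are illustrative assumptions, not from the slides:

```python
import numpy as np

def kalman_step(x, P, u, z, A, B, H, Q, R):
    """One predict/correct cycle of the linear Kalman filter."""
    # Time update (predictor): x_k^- = A x_{k-1} + B u_{k-1},  P_k^- = A P A^T + Q
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Kalman gain: K = P^- H^T (H P^- H^T + R)^{-1}
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Measurement update (corrector); the innovation is z - H x_pred
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative usage: a 2-state constant-velocity model (position, velocity)
# observed through position measurements only (H picks out the first state).
x = np.zeros(2)
P = np.eye(2) * 100.0               # large initial uncertainty
A = np.array([[1.0, 1.0],            # position += velocity each step
              [0.0, 1.0]])
B = np.zeros((2, 1)); u = np.zeros(1)
H = np.array([[1.0, 0.0]])
Q = np.eye(2) * 1e-4                 # small process noise
R = np.array([[0.25]])               # measurement noise variance
for t in range(1, 21):
    z = np.array([2.0 * t])          # object moving at velocity 2
    x, P = kalman_step(x, P, u, z, A, B, H, Q, R)
```

After a few steps the filter infers the unobserved velocity from the position measurements alone, which is exactly the "indirect evidence" point made in the Simplifications slide.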