Manual for the Matlab toolbox EKF/UKF

CHAPTER 4. MULTIPLE MODEL SYSTEMS

4.1 Linear Systems

We can now modify the equations of linear systems described in (2.12) to have the form

x_k = A_{k-1}^j x_{k-1} + q_{k-1}^j,    (4.1)

y_k = H_k^j x_k + r_k^j,

where now we have denoted by j the model (or mode) which is in effect during the time step k-1. Conditioned on the currently active mode, we can use the classical Kalman filter (Section 2.1.2) for estimating the state of the system on each time step. However, the active mode of the system is usually not known, so we must estimate it as well.
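As a concrete illustration, the following sketch simulates a hypothetical two-mode system of this form in NumPy. All matrices, noise levels, and transition probabilities below are invented for illustration; the toolbox itself is MATLAB code and contains no such script.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical modes for a position/velocity state:
# mode 0 = nearly constant velocity, mode 1 = strongly damped velocity.
A = [np.array([[1.0, 1.0], [0.0, 1.0]]),
     np.array([[1.0, 1.0], [0.0, 0.5]])]
Q = [0.01 * np.eye(2), 0.1 * np.eye(2)]
H = np.array([[1.0, 0.0]])            # both modes measure position only
R = np.array([[0.25]])
p = np.array([[0.95, 0.05],           # mode transition probabilities p_ij
              [0.05, 0.95]])

x = np.zeros(2)
mode = 0
xs, ys, modes = [], [], []
for k in range(100):
    mode = rng.choice(2, p=p[mode])   # Markovian mode switch
    x = A[mode] @ x + rng.multivariate_normal(np.zeros(2), Q[mode])
    y = H @ x + rng.multivariate_normal(np.zeros(1), R)
    xs.append(x); ys.append(y); modes.append(mode)
```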

4.1.1 Interacting Multiple Model Filter

The interacting multiple model (IMM) filter (Bar-Shalom et al., 2001) is a computationally efficient and in many cases well-performing suboptimal estimation algorithm for Markovian switching systems of the type described above. It consists of three major steps: interaction (mixing), filtering and combination. In each time step we obtain the initial conditions for each model-matched filter by mixing the state estimates produced by all filters from the previous time step, under the assumption that this particular model is the correct one at the current time step. We then perform standard Kalman filtering for each model, and finally compute a weighted combination of the updated state estimates produced by all the filters, which yields the final Gaussian estimate of the state mean and covariance at that time step. The weights are chosen according to the probabilities of the models, which are computed in the filtering step of the algorithm.

The equations for each step are as follows:

• Interaction:

The mixing probabilities \mu_k^{i|j} for each model M^i and M^j are calculated as

\bar{c}_j = \sum_{i=1}^n p_{ij} \mu_{k-1}^i,    (4.2)

\mu_k^{i|j} = \frac{1}{\bar{c}_j} p_{ij} \mu_{k-1}^i,    (4.3)

where \mu_{k-1}^i is the probability of model M^i in the time step k-1 and \bar{c}_j a normalization factor.


Now we can compute the mixed inputs (that is, means and covariances) for each filter as

 

m_{k-1}^{0j} = \sum_{i=1}^n \mu_k^{i|j} m_{k-1}^i,    (4.4)

P_{k-1}^{0j} = \sum_{i=1}^n \mu_k^{i|j} \left[ P_{k-1}^i + (m_{k-1}^i - m_{k-1}^{0j})(m_{k-1}^i - m_{k-1}^{0j})^T \right],    (4.5)

where m_{k-1}^i and P_{k-1}^i are the updated mean and covariance for model i at time step k-1.
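The interaction step (4.2)-(4.5) above can be sketched as follows. This is an illustrative NumPy translation under our own naming (imm_mix), not the toolbox's MATLAB implementation.

```python
import numpy as np

def imm_mix(p, mu_prev, means_prev, covs_prev):
    """IMM interaction step, equations (4.2)-(4.5).

    p          : (n, n) transition probabilities p_ij
    mu_prev    : (n,)  model probabilities mu_{k-1}^i
    means_prev : (n, d) updated means m_{k-1}^i
    covs_prev  : (n, d, d) updated covariances P_{k-1}^i
    """
    n, d = means_prev.shape
    cbar = p.T @ mu_prev                           # (4.2)
    mu_ij = p * mu_prev[:, None] / cbar[None, :]   # (4.3), columns sum to 1
    m0 = mu_ij.T @ means_prev                      # (4.4) mixed means
    P0 = np.zeros((n, d, d))
    for j in range(n):                             # (4.5) mixed covariances
        for i in range(n):
            dm = means_prev[i] - m0[j]
            P0[j] += mu_ij[i, j] * (covs_prev[i] + np.outer(dm, dm))
    return cbar, mu_ij, m0, P0
```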

• Filtering:

Now, for each model M^i the filtering is done as

[m_k^{-,i}, P_k^{-,i}] = \mathrm{KF_p}(m_{k-1}^{0i}, P_{k-1}^{0i}, A_{k-1}^i, Q_{k-1}^i),    (4.6)

[m_k^{i}, P_k^{i}] = \mathrm{KF_u}(m_k^{-,i}, P_k^{-,i}, y_k, H_k^i, R_k^i),    (4.7)

where we have denoted the prediction and update steps (equations (2.18) and (2.19)) of the standard Kalman filter with \mathrm{KF_p}(\cdot) and \mathrm{KF_u}(\cdot), respectively. In addition to the mean and covariance we also compute the likelihood of the measurement for each filter as

\lambda_k^i = N(v_k^i; 0, S_k^i),    (4.8)

(4.8)

where v_k^i is the measurement residual and S_k^i its covariance for model M^i in the KF update step.

The probabilities of each model Mi at time step k are calculated as

 

 

 

c = \sum_{i=1}^n \lambda_k^i \bar{c}_i,    (4.9)

\mu_k^i = \frac{1}{c} \lambda_k^i \bar{c}_i,    (4.10)

where c is a normalizing factor.
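A sketch of the likelihood evaluation and probability update (4.8)-(4.10), again as illustrative NumPy code rather than toolbox code; the residuals and their covariances are assumed to come from the per-model KF update step.

```python
import numpy as np

def imm_update_probs(cbar, residuals, S_list):
    """Model likelihoods and probabilities, equations (4.8)-(4.10).

    cbar      : (n,) normalization factors from the mixing step
    residuals : list of measurement residuals v_k^i
    S_list    : list of residual covariances S_k^i
    """
    lam = np.empty(len(S_list))
    for i, (v, S) in enumerate(zip(residuals, S_list)):
        d = len(v)
        # (4.8): Gaussian likelihood N(v; 0, S)
        lam[i] = np.exp(-0.5 * v @ np.linalg.solve(S, v)) / \
                 np.sqrt((2 * np.pi) ** d * np.linalg.det(S))
    c = np.sum(lam * cbar)        # (4.9)
    mu = lam * cbar / c           # (4.10)
    return lam, mu
```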

• Combination:

In the final stage of the algorithm the combined estimates for the state mean and covariance are computed as

m_k = \sum_{i=1}^n \mu_k^i m_k^i,    (4.11)

P_k = \sum_{i=1}^n \mu_k^i \left[ P_k^i + (m_k^i - m_k)(m_k^i - m_k)^T \right].    (4.12)
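The combination step (4.11)-(4.12) is a standard moment matching of a Gaussian mixture; a minimal sketch (function name our own):

```python
import numpy as np

def imm_combine(mu, means, covs):
    """Moment-matched combination, equations (4.11)-(4.12)."""
    m = mu @ means                    # (4.11) combined mean
    P = np.zeros_like(covs[0])
    for i in range(len(mu)):          # (4.12) combined covariance
        dm = means[i] - m
        P += mu[i] * (covs[i] + np.outer(dm, dm))
    return m, P
```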


In this toolbox the prediction and update steps of the IMM filter can be computed with the functions imm_kf_predict and imm_kf_update. For convenience we have also provided the function imm_filter, which combines the two previously mentioned steps.

4.1.2 Interacting Multiple Model Smoother

As in the single-model case, it is also useful to smooth the state estimates of the IMM filter by using all the obtained measurements. Since optimal fixed-interval smoothing with n models and N measurements requires running n^N smoothers, we must resort to suboptimal approaches. One possibility (Helmick et al., 1995) is to combine the estimates of two IMM filters, one running forwards and the other backwards in time. This approach is restricted to systems having invertible state dynamics (i.e., systems for which the inverse of the matrix A^j in (4.1) exists), which is always the case for discretized continuous-time systems.

First we shall review the equations for the IMM-filter running backwards in time, and then the actual smoothing equations combining the estimates of the two filters.

Backward-time IMM-filter

Our aim is now to compute the backward filtering density p(x_k | y_{k:N}) for each time step, which is expressed as a sum of model-conditioned densities:

p(x_k | y_{k:N}) = \sum_{j=1}^n \mu_k^{b,j} p(x_k^j | y_{k:N}),    (4.13)

where \mu_k^{b,j} is the backward-time filtered model probability of M_k^j. In the last time step N this is the same as the forward filter's model probability, that is, \mu_N^{b,j} = \mu_N^j. Assuming the model-conditioned densities p(x_k^j | y_{k:N}) are Gaussian, the backward density in (4.13) is a mixture of Gaussians, which is now approximated with a single Gaussian via moment matching.

The model conditioned backward-filtering densities can be expressed as

p(x_k^j | y_{k:N}) = \frac{1}{c} p(y_k | x_k^j) p(x_k^j | y_{k+1:N}),    (4.14)

where c is a normalizing constant, p(y_k | x_k^j) the model-conditioned measurement likelihood and p(x_k^j | y_{k+1:N}) the model-conditioned density of the state given the future measurements. The latter density is expressed as

p(x_k^j | y_{k+1:N}) = \sum_{i=1}^n \mu_{k+1}^{b,i|j} p(x_k^j | M_{k+1}^i, y_{k+1:N}),    (4.15)


where \mu_{k+1}^{b,i|j} is the conditional model probability computed as

\mu_{k+1}^{b,i|j} = P\{M_{k+1}^i | M_k^j, y_{k+1:N}\}    (4.16)

                  = \frac{1}{a_j} p_{ij}^{b,k} \mu_{k+1}^{b,i},    (4.17)

where a_j is a normalization constant given by

a_j = \sum_{i=1}^n p_{ij}^{b,k} \mu_{k+1}^{b,i}.    (4.18)

The backward-time transition probabilities of switching from model M_{k+1}^i to model M_k^j in (4.17) and (4.18) are defined as p_{ij}^{b,k} = P\{M_k^j | M_{k+1}^i\}. The prior model probabilities can be computed off-line recursively for each time step k as

P\{M_k^j\} = \sum_{i=1}^n P\{M_k^j | M_{k-1}^i\} P\{M_{k-1}^i\}    (4.19)

           = \sum_{i=1}^n p_{ij} P\{M_{k-1}^i\},    (4.20)

and using these we can compute p_{ij}^{b,k} as

p_{ij}^{b,k} = \frac{1}{b_i} p_{ji} P\{M_k^j\},    (4.21)

where b_i is the normalizing constant

b_i = \sum_{j=1}^n p_{ji} P\{M_k^j\}.    (4.22)
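The off-line recursions (4.19)-(4.22) for the prior model probabilities and the backward-time transition probabilities can be sketched as follows (illustrative NumPy, function name our own):

```python
import numpy as np

def backward_transition_probs(p, mu0, N):
    """Off-line computation of the prior model probabilities
    (4.19)-(4.20) and the backward-time transition probabilities
    (4.21)-(4.22) for k = 0..N."""
    prior = [np.asarray(mu0, dtype=float)]
    for _ in range(N):
        prior.append(p.T @ prior[-1])      # (4.20)
    prior = np.array(prior)                # prior[k, j] = P{M_k^j}
    p_b = np.empty((N + 1,) + p.shape)
    for k in range(N + 1):
        unnorm = p.T * prior[k][None, :]   # entries p_ji P{M_k^j}
        b = unnorm.sum(axis=1)             # (4.22) normalizers b_i
        p_b[k] = unnorm / b[:, None]       # (4.21) rows sum to one
    return prior, p_b
```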

The density p(x_k^j | M_{k+1}^i, y_{k+1:N}) is now approximated with a Gaussian N(x_k | \hat{m}_k^{b,i}, \hat{P}_k^{b,i}), where the mean and covariance are given by the Kalman filter prediction step using the inverse of the state transition matrix:

[\hat{m}_k^{b,i}, \hat{P}_k^{b,i}] = \mathrm{KF_p}(m_{k+1}^{b,i}, P_{k+1}^{b,i}, (A_{k+1}^i)^{-1}, Q_{k+1}^i).    (4.23)
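Assuming \mathrm{KF_p} is the standard prediction step (m -> Am, P -> APA^T + Q), equation (4.23) amounts to running that step with the inverted dynamics; a minimal sketch:

```python
import numpy as np

def kf_predict(m, P, A, Q):
    """Standard Kalman filter prediction step KF_p."""
    return A @ m, A @ P @ A.T + Q

def backward_predict(m_next, P_next, A, Q):
    """Backward-time one-step prediction, equation (4.23):
    the prediction step run with the inverse dynamics matrix."""
    A_inv = np.linalg.inv(A)
    return kf_predict(m_next, P_next, A_inv, Q)
```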

The density p(x_k^j | y_{k+1:N}) in (4.15) is a mixture of Gaussians, which is now approximated with a single Gaussian as

 

 

p(x_k^j | y_{k+1:N}) = N(x_k | \hat{m}_k^{b,0j}, \hat{P}_k^{b,0j}),    (4.24)

where the mixed predicted mean and covariance are given as

 

 

 

 

\hat{m}_k^{b,0j} = \sum_{i=1}^n \mu_{k+1}^{b,i|j} \hat{m}_k^{b,i},    (4.25)

\hat{P}_k^{b,0j} = \sum_{i=1}^n \mu_{k+1}^{b,i|j} \left[ \hat{P}_k^{b,i} + (\hat{m}_k^{b,i} - \hat{m}_k^{b,0j})(\hat{m}_k^{b,i} - \hat{m}_k^{b,0j})^T \right].    (4.26)


Now, the filtered density p(x_k^j | y_{k:N}) is a Gaussian N(x_k | m_k^{b,j}, P_k^{b,j}), and solving its mean and covariance corresponds to running the Kalman filter update step as follows:

[m_k^{b,j}, P_k^{b,j}] = \mathrm{KF_u}(\hat{m}_k^{b,0j}, \hat{P}_k^{b,0j}, y_k, H_k^j, R_k^j).    (4.27)

The measurement likelihoods for each model are computed as

\lambda_k^{b,i} = N(v_k^{b,i}; 0, S_k^{b,i}),    (4.28)

 

where v_k^{b,i} is the measurement residual and S_k^{b,i} its covariance for model M^i in the KF update step. With these we can update the model probabilities for time step k as

\mu_k^{b,j} = \frac{1}{a} a_j \lambda_k^{b,j},    (4.29)

where a is a normalizing constant

 

 

 

 

 

 

a = \sum_{j=1}^n a_j \lambda_k^{b,j}.    (4.30)

Finally, we can form the Gaussian approximation to the overall backward filtered distribution as

 

 

p(x_k | y_{k:N}) = N(x_k | m_k^b, P_k^b),    (4.31)

where the mean and covariance are mixed as

 

 

 

 

 

 

m_k^b = \sum_{j=1}^n \mu_k^{b,j} m_k^{b,j},    (4.32)

P_k^b = \sum_{j=1}^n \mu_k^{b,j} \left[ P_k^{b,j} + (m_k^{b,j} - m_k^b)(m_k^{b,j} - m_k^b)^T \right].    (4.33)

4.1.3 Two-filter based fixed-interval IMM-Smoother

We can now proceed to evaluate the fixed-interval smoothing distribution

 

 

 

 

p(x_k | y_{1:N}) = \sum_{j=1}^n \mu_k^{s,j} p(x_k^j | y_{1:N}),    (4.34)

where the smoothed model probabilities are computed as

 

 

 

\mu_k^{s,j} = P\{M_k^j | y_{1:N}\}    (4.35)

            = \frac{1}{d} d_j \mu_k^j,    (4.36)

where \mu_k^j is the forward-time filtered model probability, d_j the density d_j = p(y_{k+1:N} | M_k^j, y_{1:k}) (which is to be evaluated later) and d the normalization constant given by

 

 

d = \sum_{j=1}^n d_j \mu_k^j.    (4.37)


The model-conditioned smoothing distributions p(x_k^j | y_{1:N}) in (4.34) are expressed as mixtures of Gaussians

 

 

 

 

p(x_k^j | y_{1:N}) = \sum_{i=1}^n \mu_{k+1}^{s,i|j} p(x_k^j | M_{k+1}^i, y_{1:N}),    (4.38)

where the conditional probability \mu_{k+1}^{s,i|j} is given by

\mu_{k+1}^{s,i|j} = P\{M_{k+1}^i | M_k^j, y_{1:N}\}    (4.39)

                  = \frac{1}{d_j} p_{ji} \Lambda_k^{ji},    (4.40)

and the likelihood \Lambda_k^{ji} by

\Lambda_k^{ji} = p(y_{k+1:N} | M_k^j, M_{k+1}^i, y_{1:k}).    (4.41)

We now approximate this as

\Lambda_k^{ji} \approx p(\hat{x}_k^{b,i} | M_k^j, M_{k+1}^i, x_k^j),    (4.42)

which means that the future measurements y_{k+1:N} are replaced with the n model-conditioned backward-time one-step predicted means and covariances \{\hat{m}_k^{b,i}, \hat{P}_k^{b,i}\}_{i=1}^n, and y_{1:k} is replaced by the n model-conditioned forward-time filtered means and covariances \{m_k^i, P_k^i\}_{i=1}^n. It follows that the likelihoods can be evaluated as

\Lambda_k^{ji} = N(\Delta_k^{ji} | 0, D_k^{ji}),    (4.43)

where

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

\Delta_k^{ji} = \hat{m}_k^{b,i} - m_k^j,    (4.44)

D_k^{ji} = \hat{P}_k^{b,i} + P_k^j.    (4.45)

The terms d_j can now be computed as

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

d_j = \sum_{i=1}^n p_{ji} \Lambda_k^{ji}.    (4.46)

The smoothing distribution p(x_k^j | M_{k+1}^i, y_{1:N}) of the state matched to the models M_k^j and M_{k+1}^i over two successive sampling periods can be expressed as

p(x_k^j | M_{k+1}^i, y_{1:N}) = \frac{1}{c} p(y_{k+1:N} | M_{k+1}^i, x_k) p(x_k^j | y_{1:k}),    (4.47)


where p(y_{k+1:N} | M_{k+1}^i, x_k) corresponds to the backward-time one-step predictive distribution, p(x_k^j | y_{1:k}) is the forward-time model-conditioned filtering distribution and c is a normalizing constant. Thus, the smoothing distribution can be expressed as

p(x_k^j | M_{k+1}^i, y_{1:N}) \propto N(x_k | \hat{m}_k^{b,i}, \hat{P}_k^{b,i}) N(x_k | m_k^j, P_k^j)    (4.48)

                              = N(x_k | m_k^{s,ji}, P_k^{s,ji}),    (4.49)

where

m_k^{s,ji} = P_k^{s,ji} \left[ (P_k^j)^{-1} m_k^j + (\hat{P}_k^{b,i})^{-1} \hat{m}_k^{b,i} \right],    (4.50)

P_k^{s,ji} = \left[ (P_k^j)^{-1} + (\hat{P}_k^{b,i})^{-1} \right]^{-1}.    (4.51)
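Equations (4.50)-(4.51) are the usual information-form fusion of two Gaussians; a minimal sketch (hypothetical function name, illustrative NumPy rather than toolbox code):

```python
import numpy as np

def fuse_gaussians(m_f, P_f, m_b, P_b):
    """Information-form fusion of the forward filtered and backward
    predicted Gaussians, equations (4.50)-(4.51)."""
    P_f_inv = np.linalg.inv(P_f)
    P_b_inv = np.linalg.inv(P_b)
    P_s = np.linalg.inv(P_f_inv + P_b_inv)        # (4.51)
    m_s = P_s @ (P_f_inv @ m_f + P_b_inv @ m_b)   # (4.50)
    return m_s, P_s
```

With equal covariances the fused mean is simply the average of the two means, and the fused covariance is halved, as one would expect from combining two equally informative estimates.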

The model-conditioned smoothing distributions p(x_k^j | y_{1:N}), which were expressed as mixtures of Gaussians in (4.38), are now each approximated by a single Gaussian via moment matching to yield

 

 

p(x_k^j | y_{1:N}) \approx N(x_k | m_k^{s,j}, P_k^{s,j}),    (4.52)

where

 

 

 

 

 

 

 

 

m_k^{s,j} = \sum_{i=1}^n \mu_{k+1}^{s,i|j} m_k^{s,ji},    (4.53)

P_k^{s,j} = \sum_{i=1}^n \mu_{k+1}^{s,i|j} \left[ P_k^{s,ji} + (m_k^{s,ji} - m_k^{s,j})(m_k^{s,ji} - m_k^{s,j})^T \right].    (4.54)

With these we can match the moments of the overall smoothing distribution to give a single Gaussian approximation

 

 

p(x_k | y_{1:N}) \approx N(x_k | m_k^s, P_k^s),    (4.55)

where

 

 

 

 

 

 

 

 

m_k^s = \sum_{j=1}^n \mu_k^{s,j} m_k^{s,j},    (4.56)

P_k^s = \sum_{j=1}^n \mu_k^{s,j} \left[ P_k^{s,j} + (m_k^{s,j} - m_k^s)(m_k^{s,j} - m_k^s)^T \right].    (4.57)

These smoothing equations can be computed with the function imm_smooth.
