Probability Hypothesis Density Filter

Matthieu Bloch

Thursday, December 1, 2022

Today in ECE 6555

  • Announcements
    • 2 lectures left (including today)
    • One more homework (optional, I didn't realize how close the end of the semester was…)
    • One final exam take-home
  • Last time
    • Gaussian processes
    • Check office hours video for discussion of
  • Today
    • PHD filter for multi-target tracking
  • Questions?

Single target tracking

  • Probabilistic state-space model
    • State vector \(x\in\bbR^n\)
    • Markov process of order 1 with transition density \(f_{k|k-1}(x_k|x_{k-1})\) governing state evolution
    • Observation \(z\in\bbR^d\)
    • Noisy observation with likelihood \(g_k(z_k|x_k)\)
  • Bayesian filtering
    1. Initialization: start from known \(p(x_0)\)
    2. For \(i\geq 1\)
      1. Prediction: compute \(p(x_i|z_{0:i-1})\) by the Chapman-Kolmogorov equation \[ p(x_i|z_{0:i-1})= \int f_i(x_i|x_{i-1})p(x_{i-1}|z_{0:i-1})dx_{i-1} \]
      2. Update: compute \(p(x_i|z_{0:i})\) by Bayes' rule \[ p(x_i|z_{0:i})= \frac{1}{Z_i}g_i(z_i|x_i)p(x_i|z_{0:i-1})\textsf{ with } Z_i= \int g_i(z_i|x_i)p(x_i|z_{0:i-1})dx_i \]
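The two-step recursion above can be illustrated numerically. Below is a minimal sketch for a hypothetical 1-D random-walk state with a scalar Gaussian observation, where both densities are discretized on a grid; the noise variances and measurements are made up for illustration.

```python
import numpy as np

grid = np.linspace(-10.0, 10.0, 401)
dx = grid[1] - grid[0]

def gauss(x, mean, var):
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def predict(p_prev, q_var=0.5):
    # Chapman-Kolmogorov: p(x_i|z_{0:i-1}) = int f(x_i|x) p(x|z_{0:i-1}) dx
    f = gauss(grid[:, None], grid[None, :], q_var)   # f[a, b] = f(x_a | x_b)
    return f @ p_prev * dx

def update(p_pred, z, r_var=1.0):
    # Bayes' rule: multiply by likelihood g(z|x), divide by Z_i
    post = gauss(z, grid, r_var) * p_pred
    return post / (post.sum() * dx)

p = gauss(grid, 0.0, 1.0)            # known p(x_0)
for z in [0.3, 0.5, 0.1]:            # made-up measurements
    p = update(predict(p), z)
```

The renormalization in `update` plays the role of \(Z_i\), so the posterior always integrates to one on the grid.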

Multi-target tracking

  • Multi-target tracking extends the problem to multiple objects
    • At time \(k\), \(M(k)\) targets with state vectors \(\set{x_i}_{i=1}^{M(k)}\), each in \(\calX\subset\bbR^n\)
      • Targets may die, evolve, or new targets may appear
    • At time \(k\), \(N(k)\) measurements \(\set{z_i}_{i=1}^{N(k)}\), each in \(\calZ\subset\bbR^d\)
      • Measurements cannot be associated to targets
  • No ordering on the targets and measurements (no association)
    • Represented as collections \(X_k\eqdef \set{x_i}_{i=1}^{M(k)}\in\calF(\calX)\), \(Z_k\eqdef \set{z_i}_{i=1}^{N(k)}\in\calF(\calZ)\)
    • Collections modeled as random finite sets (RFS) to capture uncertainty
  • RFS model
    • Given \(X_{k-1}\), each target \(x_{k-1}\in X_{k-1}\) survives with probability \(p_{S,k}(x_{k-1})\) or dies
    • Conditioned on survival, the state evolves according to the transition density \(f_{k|k-1}(x_k|x_{k-1})\)
    • A given state \(x_{k-1}\) thus evolves into an RFS \(S_{k|k-1}(x_{k-1})\) equal to \(\set{x_k}\) or \(\emptyset\)
    • Spontaneous births of targets described by independent RFS \(\Gamma_k\)
    • Existing target \(\xi\) can spawn new targets described by independent RFS \(B_{k|k-1}(\xi)\)
    \[ X_k = \left[\bigcup_{\xi\in X_{k-1}}S_{k|k-1}(\xi)\right]\bigcup\left[\bigcup_{\xi\in X_{k-1}}B_{k|k-1}(\xi)\right]\bigcup\Gamma_k \]
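One Monte-Carlo draw of the RFS dynamics above can be sketched in 1-D, here without spawning for brevity; the survival probability, birth rate, noise scale, and surveillance region are all made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

def step_rfs(X_prev, p_S=0.95, birth_rate=0.2, q_std=0.3):
    X_next = []
    for x in X_prev:
        # S_{k|k-1}(x): target survives w.p. p_S, then moves per f_{k|k-1}
        if rng.random() < p_S:
            X_next.append(x + q_std * rng.standard_normal())
    # Gamma_k: Poisson number of spontaneous births over the region
    for _ in range(rng.poisson(birth_rate)):
        X_next.append(rng.uniform(-10.0, 10.0))
    return X_next

X = [0.0, 2.0, -1.5]     # made-up initial target states
for _ in range(10):
    X = step_rfs(X)
```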

RFS model

  • Measurement model

    • Target \(x_k\in X_k\) detected with probability \(p_{D,k}(x_k)\)
    • If detected, target generates observation \(z_k\) according to \(g_k(z_k|x_k)\)
    • A given state \(x_{k}\in X_k\) thus generates an RFS \(\Theta_{k}(x_{k})\) equal to \(\set{z_k}\) or \(\emptyset\)
    • Sensors can also receive clutter (false measurements) \(K_k\)

    \[ Z_k = \left[\bigcup_{\xi\in X_{k}}\Theta_{k}(\xi)\right]\bigcup K_k \]
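A draw of the measurement RFS \(Z_k\) above can be sketched the same way in 1-D; the detection probability, clutter rate, and noise scale are made-up values.

```python
import numpy as np

rng = np.random.default_rng(1)

def measure_rfs(X, p_D=0.9, clutter_rate=2.0, r_std=0.5):
    Z = []
    for x in X:
        # Theta_k(x): detection w.p. p_D, then z drawn per g_k(.|x)
        if rng.random() < p_D:
            Z.append(x + r_std * rng.standard_normal())
    # K_k: Poisson clutter, uniform over the observation region
    for _ in range(rng.poisson(clutter_rate)):
        Z.append(rng.uniform(-10.0, 10.0))
    rng.shuffle(Z)   # measurements carry no target association
    return Z

Z = measure_rfs([0.0, 3.0, -2.0])
```

The final shuffle emphasizes the key difficulty: nothing in \(Z_k\) indicates which measurement, if any, came from which target.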

  • Ideal Bayes filter to track multiple-target posterior density \[ p_{k|k-1}(X_k|Z_{1:k-1}) = \int f_{k|k-1}(X_k|X)p_{k-1}(X|Z_{1:k-1})\mu_s(dX) \] \[ p_k(X_k|Z_{1:k}) = \frac{g_k(Z_k|X_k)p_{k|k-1}(X_k|Z_{1:k-1})}{\int g_k(Z_k|X)p_{k|k-1}(X|Z_{1:k-1})\mu_s(dX)} \]

    • Intractable!

PHD filter

  • The Probability Hypothesis Density (PHD) filter plays a role in multi-target tracking analogous to that of the Kalman filter in single-target tracking
    • Approximation to alleviate computational burden in the multiple-target Bayes filter
    • Propagates a first-order statistical moment of the posterior multiple-target state
  • Intensity function: first-order moment of an RFS \(X\) on \(\calX\)
    • The intensity is the function \(v:\calX\to\bbR\) such that for any \(\calS\subset\calX\) \[ \int\abs{X\cap \calS} P(dX) = \int_\calS v(x) dx \]
    • \(\hat{N}=\int v(x)dx\) is the expected number of targets in \(X\)
    • Local maxima of \(v\) can be used to generate estimates of the elements of \(X\)
  • Poisson RFS
    • Completely characterized by intensity function
    • \(\abs{X}\) is Poisson with mean \(\hat{N}\)
    • Elements of \(X\) are i.i.d. according to the probability density function \(v(x)/\hat{N}\)
  • Clutter, Birth, Spawn RFS modeled as Poisson RFS
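Sampling a Poisson RFS from its intensity follows directly from the two properties above. Below is a sketch with a made-up 1-D intensity discretized on a grid: draw the cardinality from Poisson(\(\hat{N}\)), then draw that many i.i.d. points from \(v/\hat{N}\).

```python
import numpy as np

rng = np.random.default_rng(2)
grid = np.linspace(-5.0, 5.0, 201)
dx = grid[1] - grid[0]
# made-up intensity: Gaussian bump scaled so that N_hat is about 3
v = 3.0 * np.exp(-grid ** 2 / 2) / np.sqrt(2 * np.pi)

N_hat = v.sum() * dx                          # expected number of targets
n = rng.poisson(N_hat)                        # |X| ~ Poisson(N_hat)
X = rng.choice(grid, size=n, p=v / v.sum())   # i.i.d. draws from v / N_hat
```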

PHD filter

  • Intensities
    • \(\gamma_k\): intensity of birth RFS \(\Gamma_k\)
    • \(\beta_{k|k-1}(\cdot|\xi)\): intensity of spawn RFS \(B_{k|k-1}(\xi)\)
    • \(p_{S,k}(\xi)\): probability that target survives
    • \(p_{D,k}(\xi)\): probability that target is detected
    • \(\kappa_k\): intensity of clutter RFS
  • Assumptions
    • Each target evolves and generates observations independently of one another
    • Clutter is Poisson and independent of target-originated measurements
    • The predicted multiple-target RFS governed by \(p_{k|k-1}\) is Poisson
      • an approximation in general, but exact without spawning if \(X_{k-1}\) and \(\Gamma_k\) are Poisson
  • Recursions for intensity \[ v_{k|k-1}(x) = \int p_{S,k}(\xi)f_{k|k-1}(x|\xi)v_{k-1}(\xi)d\xi + \int \beta_{k|k-1}(x|\xi)v_{k-1}(\xi)d\xi + \gamma_k(x) \] \[ v_k(x) = (1-p_{D,k}(x))v_{k|k-1}(x) + \sum_{z\in Z_k}\frac{p_{D,k}(x)g_{k}(z|x)v_{k|k-1}(x)}{\kappa_k(z)+\int p_{D,k}(\xi)g_k(z|\xi)v_{k|k-1}(\xi)d\xi} \]

Linear Gaussian models

  • Linear Gaussian models for dynamics and measurement

    • \(f_{k|k-1}(x|\xi) = \calN(x;F_{k-1}\xi,Q_{k-1})\)
    • \(g_k(z|x) = \calN(z;H_k x,R_k)\)
  • Survival and detection probabilities state independent \[ p_{S,k}(x) = p_{S,k}\quad p_{D,k}(x) = p_{D,k} \]

  • Gaussian mixtures for birth and spawn intensities \[ \gamma_k(x)\eqdef \sum_{i=1}^{J_{\gamma,k}}w_{\gamma,k}^{(i)}\calN(x;m_{\gamma,k}^{(i)},P_{\gamma,k}^{(i)}) \] \[ \beta_{k|k-1}(x|\xi)\eqdef \sum_{j=1}^{J_{\beta,k}}w_{\beta,k}^{(j)}\calN(x;F_{\beta,k-1}^{(j)}\xi+d_{\beta,k-1}^{(j)},Q_{\beta,k-1}^{(j)}) \]

  • Key insight: Gaussian mixtures are preserved

Gaussian Mixture PhD recursion (1/3)

  • Assume that \[ v_{k-1}(x) = \sum_{i=1}^{J_{k-1}}w_{k-1}^{(i)}\calN(x;m_{k-1}^{(i)},P_{k-1}^{(i)}) \] then \[ v_{k|k-1}(x) = v_{S,k|k-1}(x) + v_{\beta,k|k-1}(x) + \gamma_k(x) \]
  • where \[ v_{S,k|k-1}(x)=p_{S,k}\sum_{j=1}^{J_{k-1}}w_{k-1}^{(j)}\calN(x;m_{S,k|k-1}^{(j)},P_{S,k|k-1}^{(j)}) \] \[ m_{S,k|k-1}^{(j)} = F_{k-1}m_{k-1}^{(j)}\qquad P_{S,k|k-1}^{(j)} = F_{k-1}P_{k-1}^{(j)}F_{k-1}^T+Q_{k-1} \]

Gaussian Mixture PhD recursion (2/3)

  • \[ v_{\beta,k|k-1}(x)=\sum_{j=1}^{J_{k-1}}\sum_{\ell=1}^{J_{\beta,k}}w_{k-1}^{(j)}w_{\beta,k}^{(\ell)}\calN(x;m_{\beta,k|k-1}^{(j,\ell)},P_{\beta,k|k-1}^{(j,\ell)}) \]
  • where \[ m_{\beta,k|k-1}^{(j,\ell)} = F_{\beta,k-1}^{(\ell)}m_{k-1}^{(j)}+d_{\beta,k-1}^{(\ell)}\quad P_{\beta,k|k-1}^{(j,\ell)} = F_{\beta,k-1}^{(\ell)}P_{k-1}^{(j)}{F_{\beta,k-1}^{(\ell)}}^T+Q_{\beta,k-1}^{(\ell)} \]

Gaussian Mixture PhD recursion (3/3)

  • Assume that \[ v_{k|k-1}(x) = \sum_{i=1}^{J_{k|k-1}}w_{k|k-1}^{(i)}\calN(x;m_{k|k-1}^{(i)},P_{k|k-1}^{(i)}) \] then \[ v_{k}(x) = (1-p_{D,k})v_{k|k-1}(x) + \sum_{z\in Z_k}v_{D,k}(x;z) \] \[ v_{D,k}(x;z) = \sum_{j=1}^{J_{k|k-1}}w_{k}^{(j)}(z)\calN(x;m_{k|k}^{(j)}(z),P_{k|k}^{(j)}) \] \[ w_{k}^{(j)}(z)=\frac{p_{D,k}w_{k|k-1}^{(j)}q_k^{(j)}(z)}{\kappa_k(z)+p_{D,k}\sum_{\ell=1}^{J_{k|k-1}}w_{k|k-1}^{(\ell)}q_k^{(\ell)}(z)}\textsf{ with } q_k^{(j)}(z) = \calN(z;H_k m_{k|k-1}^{(j)},H_k P_{k|k-1}^{(j)}H_k^T+R_k) \]
  • where \(m_{k|k}^{(j)}(z)\) and \(P_{k|k}^{(j)}\) follow the usual Kalman update \[ m_{k|k}^{(j)}(z) = m_{k|k-1}^{(j)} + K_k^{(j)}(z-H_k m_{k|k-1}^{(j)})\quad P_{k|k}^{(j)} = (I-K_k^{(j)}H_k)P_{k|k-1}^{(j)}\quad K_k^{(j)} = P_{k|k-1}^{(j)}H_k^T(H_k P_{k|k-1}^{(j)}H_k^T+R_k)^{-1} \]
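The update step can be sketched in the scalar case (\(n = d = 1\)), assuming the standard Kalman forms for the updated means, covariances, and \(q_k^{(j)}(z)\); the model parameters, predicted mixture, and measurements below are all made up.

```python
import numpy as np

p_D, kappa = 0.9, 0.05      # detection prob., constant clutter intensity
H, R = 1.0, 0.5             # scalar measurement model

# hypothetical predicted mixture: list of (weight, mean, variance)
pred = [(0.5, 0.0, 1.0), (0.5, 4.0, 1.0)]

def gm_phd_update(pred, Z):
    # missed-detection term (1 - p_D) v_{k|k-1}
    comps = [((1.0 - p_D) * w, m, P) for w, m, P in pred]
    for z in Z:
        new = []
        for w, m, P in pred:
            S = H * P * H + R                     # innovation variance
            q = np.exp(-(z - H * m) ** 2 / (2 * S)) / np.sqrt(2 * np.pi * S)
            K = P * H / S                         # Kalman gain
            new.append((p_D * w * q, m + K * (z - H * m), (1.0 - K * H) * P))
        denom = kappa + sum(w for w, _, _ in new)  # weight normalization
        comps += [(w / denom, m, P) for w, m, P in new]
    return comps

post = gm_phd_update(pred, [0.1, 3.8])            # made-up measurements
N_hat = sum(w for w, _, _ in post)                # expected number of targets
```

Each measurement spawns one updated component per predicted component, so the mixture size grows as \(J_{k|k-1}(1 + \abs{Z_k})\); in practice pruning and merging keep it bounded.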