Traditional particle filter resampling is driven only by some approximation of our posterior uncertainty; it ignores any prior knowledge we have about the dynamics. In this project we aim to build normalizing flows that act as discriminators for the resampling step of a particle filter. The resampling we propose reduces the effective dimension of the resampling step from that of the Euclidean state representation to that of the underlying manifold of the dynamics.
In the figure above, the left panel shows the three-standard-deviation contour of a Gaussian distribution in blue, and an empirically known discriminator, representing the attractor of the Ikeda map, as the yellow shaded region. The Gaussian distribution describes our prior uncertainty about the state of some system (such as the position of a target), and the yellow outline describes our prior uncertainty about the valid states on the Ikeda map attractor. The right panel shows our combined uncertainty, which is a highly non-Gaussian distribution.
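To make the figure concrete, here is a minimal numpy sketch of the same idea: the Ikeda attractor is approximated empirically by iterating a cloud of points, and samples from a Gaussian prior are kept only if they land near the attractor. The mean, spread, and acceptance radius below are illustrative choices, not values from the figure, and the nearest-neighbor test is a crude stand-in for a learned discriminator.

```python
import numpy as np

def ikeda_step(z, u=0.9):
    """One iteration of the Ikeda map applied to an (N, 2) array of points."""
    x, y = z[..., 0], z[..., 1]
    t = 0.4 - 6.0 / (1.0 + x**2 + y**2)
    return np.stack([1.0 + u * (x * np.cos(t) - y * np.sin(t)),
                     u * (x * np.sin(t) + y * np.cos(t))], axis=-1)

rng = np.random.default_rng(0)

# Empirical stand-in for the yellow region: iterate a point cloud until it
# settles onto the attractor.
attractor = rng.normal(size=(2000, 2))
for _ in range(100):
    attractor = ikeda_step(attractor)

# Gaussian prior over the state (blue ellipse); parameters are illustrative.
samples = 0.5 * rng.normal(size=(5000, 2)) + np.array([1.0, 0.0])

# Crude discriminator: accept a sample if it lies within `radius` of any
# attractor point (a trained normalizing flow would replace this test).
radius = 0.1
subset = attractor[:500]  # a small subset keeps the distance matrix cheap
dists = np.linalg.norm(samples[:, None, :] - subset[None, :, :], axis=-1)
accepted = samples[dists.min(axis=1) < radius]

# `accepted` approximates the highly non-Gaussian combined distribution
# shown in the right panel.
```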
The Gaussian distribution can be generalized to any distribution, which is typically a mixture model in the case of an ensemble mixture model filter, and the discriminator can be approximated from data using a normalizing flow. A normalizing flow is a type of data-driven model (a "neural network" in machine-learning terms) that learns to transform between a simple distribution, such as the Epanechnikov distribution, and a complex distribution, like the one induced by a chaotic dynamical system. Normalizing flows are invertible, and thus induce a discriminator whenever the base distribution is compactly supported, as is the case with the Epanechnikov distribution.
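The compact-support argument can be illustrated with a toy one-dimensional "flow": any invertible map pushes the base support forward, so a data point is valid exactly when its preimage under the flow lands inside the base support. The affine map below is a deliberately simple stand-in for a trained, nonlinear flow.

```python
import numpy as np

# Toy invertible "flow": an affine map z -> a*z + b. A trained normalizing
# flow would be a learned, nonlinear invertible map instead.
a, b = 2.0, 1.0

def forward(z):
    return a * z + b    # base space -> data space

def inverse(x):
    return (x - b) / a  # data space -> base space

# Epanechnikov base density, compactly supported on [-1, 1].
def base_density(z):
    return np.where(np.abs(z) <= 1.0, 0.75 * (1.0 - z**2), 0.0)

# Induced discriminator: a data point is valid iff its preimage lies in the
# base support, i.e. the pulled-back density is positive.
def discriminator(x):
    return base_density(inverse(x)) > 0.0

# This flow maps [-1, 1] to [-1, 3], so 0.0 and 2.9 are valid but 5.0 is not.
```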
Stay tuned!