I am a computational scientist; I create methods that enable practitioners to make better predictions and smarter
decisions while using as little computational power as is practical. My primary research focus is Bayesian inference
and uncertainty quantification in the broad sense, with particular emphasis on computational methods for
medium-to-large-scale scientific problems such as multi-target tracking, sensor tasking, data assimilation, and
equation discovery.
We point a sensor at the night sky expecting to see a satellite. It is not there. Since absence of evidence is not
evidence of absence, we must use this information to the maximal extent allowed, but how? My new approach
reformulates Bayesian inference as a geometric problem involving uniform distributions on convex H-polytopes and
H-polyhedra. In this formulation, Bayesian inference reduces to matrix and vector concatenation. Extending the idea to
mixture models of uniform distributions on H-polytopes yields methods that converge in distribution to true Bayesian
inference. This allows us to ``subtract out'' the sensor window from our representation of uncertainty about where the
satellite is, at little additional computational cost.
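To make the concatenation concrete, consider the simplest case as a minimal sketch: a uniform prior supported on one
H-polytope and a likelihood that is the indicator of a second (for instance, a region of the sky). The posterior is
then uniform on their intersection, whose H-representation is obtained by stacking the constraint matrices and
vectors. The symbols $A_0, b_0, A_1, b_1$ below are generic constraint data introduced only for illustration, not the
notation of the full mixture construction:
\[
\pi_0(x) \propto \mathbf{1}\{A_0 x \le b_0\}, \qquad
\ell(x) \propto \mathbf{1}\{A_1 x \le b_1\}
\quad\Longrightarrow\quad
\pi_{\mathrm{post}}(x) \propto
\mathbf{1}\!\left\{
\begin{bmatrix} A_0 \\ A_1 \end{bmatrix} x \le
\begin{bmatrix} b_0 \\ b_1 \end{bmatrix}
\right\}.
\]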
Traditional particle filter resampling is driven only by an approximation of the posterior uncertainty, ignoring any
prior knowledge we have about the dynamics. In this project we aim to build normalizing flows that act as
discriminators for the resampling step of a particle filter. The proposed resampling reduces the dimension of the
resampling step from that of the ambient Euclidean representation to that of the underlying manifold of the dynamics.
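A minimal sketch of what a flow-informed resampling step could look like, assuming the trained flow exposes a
log-density over particle states; the function names (flow_informed_resample, flow_log_density), the multiplicative
combination of likelihood and flow density, and the Gaussian stand-ins in the usage example are assumptions for
illustration, and the reduction to the dimension of the dynamics manifold is not shown here. Weights are combined in
log-space and shifted by their maximum purely for numerical stability.

```python
import numpy as np

def flow_informed_resample(particles, log_likelihood, flow_log_density, rng):
    """Resample particles with weights that combine the measurement
    likelihood with a learned density encoding dynamics knowledge.

    particles        : (N, d) array of particle states
    log_likelihood   : callable mapping (N, d) -> (N,) values of log p(y | x)
    flow_log_density : callable mapping (N, d) -> (N,) values of log q(x),
                       a stand-in for the trained normalizing flow
    """
    log_w = log_likelihood(particles) + flow_log_density(particles)
    log_w -= log_w.max()                 # stabilize before exponentiating
    weights = np.exp(log_w)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Toy usage with Gaussian stand-ins for the likelihood and the flow density.
rng = np.random.default_rng(0)
particles = rng.normal(size=(500, 3))
obs = np.array([1.0, 0.0, 0.0])
log_lik = lambda x: -0.5 * np.sum((x - obs) ** 2, axis=1)
log_q = lambda x: -0.5 * np.sum(x ** 2, axis=1)
resampled = flow_informed_resample(particles, log_lik, log_q, rng)
```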