Kalman Filter in 3 Ways
Slide: Discover the Kalman filter through the geometric perspective of orthogonal projection, the probabilistic perspective of Bayesian filtering, and the optimization perspective of weighted least squares. Inspired by Ling Shi's 2024-25 Spring lecture "Networked Sensing, Estimation and Control".
Introduction
System Model and Assumptions
Consider a discrete-time linear Gaussian system

$$x_{k+1} = A x_k + w_k, \qquad y_k = C x_k + v_k$$

with initial condition $x_0 \sim \mathcal{N}(\bar{x}_0, \Sigma_0)$, process noise $w_k \sim \mathcal{N}(0, Q)$, and measurement noise $v_k \sim \mathcal{N}(0, R)$.
Assumptions:
- $(A, \sqrt{Q})$ is controllable and $(A, C)$ is observable
- $x_0$, $\{w_k\}$, and $\{v_k\}$ are mutually uncorrelated
- The future state of the system is conditionally independent of the past states given the current state (Markov property)
Goal: Find the minimum mean-square-error estimate $\hat{x}_{k|k} = \mathbb{E}[x_k \mid y_1, \dots, y_k]$ and its error covariance $P_{k|k}$.
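Before deriving the estimator, it helps to have the system in code. The sketch below simulates the linear Gaussian model above; the specific matrices $A$, $C$, $Q$, $R$ are illustrative assumptions (a constant-velocity model with position-only measurements), not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative matrices (assumed for this sketch, not from the slides):
# x_{k+1} = A x_k + w_k,  y_k = C x_k + v_k
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # constant-velocity dynamics
C = np.array([[1.0, 0.0]])       # position-only measurement
Q = 0.01 * np.eye(2)             # process-noise covariance
R = np.array([[0.25]])           # measurement-noise covariance

def simulate(steps, x0):
    """Roll the linear Gaussian system forward; return states and measurements."""
    xs, ys = [], []
    x = x0
    for _ in range(steps):
        x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
        y = C @ x + rng.multivariate_normal(np.zeros(1), R)
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

xs, ys = simulate(50, np.zeros(2))
```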
Geometric Perspective: Orthogonal Projection
Hilbert Space of Random Variables
Key Idea:
- View random variables as vectors in Hilbert space
- Inner product: $\langle X, Y \rangle = \mathbb{E}[X^\top Y]$
- Orthogonality: $X \perp Y \iff \langle X, Y \rangle = 0$
- Optimal estimate is orthogonal projection onto observation space
Geometric Interpretation: (figure: the optimal estimate is the orthogonal projection of the state onto the subspace spanned by the observations)
Courtesy: https://math.stackexchange.com/users/117818/qqo
Time Update
State Prediction: $\hat{x}_{k|k-1} = A \hat{x}_{k-1|k-1}$
Covariance Prediction: $P_{k|k-1} = A P_{k-1|k-1} A^\top + Q$
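The time update is a direct transcription of the two equations above. A minimal sketch (the toy numbers for $A$, $Q$, and the posterior are assumed for illustration):

```python
import numpy as np

def time_update(x_post, P_post, A, Q):
    """Prediction step: propagate the posterior estimate through the dynamics."""
    x_prior = A @ x_post               # x_{k|k-1} = A x_{k-1|k-1}
    P_prior = A @ P_post @ A.T + Q     # P_{k|k-1} = A P_{k-1|k-1} A^T + Q
    return x_prior, P_prior

# Toy numbers (assumed): posterior mean [0, 1], posterior covariance I
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Q = 0.01 * np.eye(2)
x_prior, P_prior = time_update(np.array([0.0, 1.0]), np.eye(2), A, Q)
```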
Innovation Process
Definition: $e_k = y_k - C \hat{x}_{k|k-1}$, the new information in $y_k$ that is not predictable from past measurements.
Properties:
- Zero Mean: $\mathbb{E}[e_k] = 0$
- White Sequence: $\mathbb{E}[e_k e_j^\top] = 0$ for $k \neq j$
- Orthogonality Principle: $e_k \perp y_j$ for all $j < k$
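The whiteness property can be checked numerically. Writing the estimation error as $\tilde{x}_k = x_k - \hat{x}_{k|k-1}$, a short calculation gives $\mathbb{E}[e_k e_{k-1}^\top] = C A \left[ P C^\top - K (C P C^\top + R) \right]$, which vanishes exactly when $K$ is the optimal gain. The matrices below are assumed toy values:

```python
import numpy as np

# Check that the innovation cross-covariance E[e_k e_{k-1}^T] vanishes
# for the optimal gain. Toy matrices (assumed):
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])     # prior covariance P_{k-1|k-2} (assumed)
R = np.array([[0.25]])

S = C @ P @ C.T + R                    # innovation covariance
K = P @ C.T @ np.linalg.inv(S)         # optimal Kalman gain
cross_cov = C @ A @ (P @ C.T - K @ S)  # E[e_k e_{k-1}^T]
```

Replacing `K` with any sub-optimal gain makes `cross_cov` nonzero, i.e. the innovations would still carry predictable structure.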
Measurement Update
State Update: $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k e_k$
Covariance Update: $P_{k|k} = (I - K_k C) P_{k|k-1}$
Kalman Gain Derivation
Optimal Kalman Gain: $K_k = P_{k|k-1} C^\top (C P_{k|k-1} C^\top + R)^{-1}$
Covariance Derivation: $P_{k|k} = \mathbb{E}[(x_k - \hat{x}_{k|k})(x_k - \hat{x}_{k|k})^\top] = (I - K_k C) P_{k|k-1} (I - K_k C)^\top + K_k R K_k^\top$, which simplifies to $(I - K_k C) P_{k|k-1}$ when $K_k$ is the optimal gain.
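The measurement update and gain formula above fit in a few lines. A minimal sketch, with assumed toy values for the prior and measurement:

```python
import numpy as np

def measurement_update(x_prior, P_prior, y, C, R):
    """Correction step: fuse measurement y using the optimal Kalman gain."""
    S = C @ P_prior @ C.T + R                # innovation covariance
    K = P_prior @ C.T @ np.linalg.inv(S)     # K = P C^T (C P C^T + R)^{-1}
    e = y - C @ x_prior                      # innovation
    x_post = x_prior + K @ e
    P_post = (np.eye(len(x_prior)) - K @ C) @ P_prior
    return x_post, P_post

# Toy numbers (assumed): prior mean [1, 1], prior covariance diag(2, 1)
C = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
x_post, P_post = measurement_update(np.array([1.0, 1.0]),
                                    np.diag([2.0, 1.0]),
                                    np.array([1.5]), C, R)
```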
Probabilistic Perspective: Bayesian Filtering
Bayesian Filtering Framework
Prediction (Chapman-Kolmogorov): $p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1}$
Update (Bayes' rule): $p(x_k \mid y_{1:k}) \propto p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})$
Prediction Step: Gaussian Propagation
Predicted Mean: $\hat{x}_{k|k-1} = A \hat{x}_{k-1|k-1}$
Predicted Covariance: $P_{k|k-1} = A P_{k-1|k-1} A^\top + Q$
Update Step: Gaussian Product
Gaussian Product: $p(x_k \mid y_{1:k}) \propto \mathcal{N}(y_k;\, C x_k,\, R)\; \mathcal{N}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1})$
Posterior Result: $p(x_k \mid y_{1:k}) = \mathcal{N}(x_k;\, \hat{x}_{k|k},\, P_{k|k})$ with $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (y_k - C \hat{x}_{k|k-1})$ and $P_{k|k} = (I - K_k C) P_{k|k-1}$, matching the geometric derivation.
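Completing the square in the Gaussian product yields the posterior in information form, $P_{k|k} = (P_{k|k-1}^{-1} + C^\top R^{-1} C)^{-1}$. The sketch below checks numerically, on assumed toy values, that this form and the Kalman-gain form give the same posterior:

```python
import numpy as np

# Toy prior and measurement (assumed for illustration)
m = np.array([1.0, 1.0])
P = np.diag([2.0, 1.0])
C = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
y = np.array([1.5])

# Information (Gaussian-product) form of the posterior
P_post_info = np.linalg.inv(np.linalg.inv(P) + C.T @ np.linalg.inv(R) @ C)
m_post_info = P_post_info @ (np.linalg.inv(P) @ m + C.T @ np.linalg.inv(R) @ y)

# Kalman-update form of the same posterior
K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
m_post_kf = m + K @ (y - C @ m)
P_post_kf = (np.eye(2) - K @ C) @ P
```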
Optimization Perspective: MAP Estimation
Maximum A Posteriori Formulation
MAP Estimation: $\hat{x}_{k|k} = \arg\max_{x_k} p(x_k \mid y_{1:k})$
Weighted Least Squares: $\hat{x}_{k|k} = \arg\min_{x_k} \left[ (x_k - \hat{x}_{k|k-1})^\top P_{k|k-1}^{-1} (x_k - \hat{x}_{k|k-1}) + (y_k - C x_k)^\top R^{-1} (y_k - C x_k) \right]$
MAP as Weighted Least Squares
Posterior Distribution: $p(x_k \mid y_{1:k}) \propto p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})$
Assume Gaussian Distributions: $p(x_k \mid y_{1:k-1}) = \mathcal{N}(\hat{x}_{k|k-1}, P_{k|k-1})$ and $p(y_k \mid x_k) = \mathcal{N}(C x_k, R)$
Negative Log-Posterior: $-\log p(x_k \mid y_{1:k}) = \tfrac{1}{2} J(x_k) + \text{const}$,
where $J(x_k) = (x_k - \hat{x}_{k|k-1})^\top P_{k|k-1}^{-1} (x_k - \hat{x}_{k|k-1}) + (y_k - C x_k)^\top R^{-1} (y_k - C x_k)$
MAP Solution
Weighted Least Squares Form: $\hat{x}_{k|k} = \arg\min_{x_k} J(x_k)$; setting $\nabla_{x_k} J = 0$ gives the normal equations $(P_{k|k-1}^{-1} + C^\top R^{-1} C)\, \hat{x}_{k|k} = P_{k|k-1}^{-1} \hat{x}_{k|k-1} + C^\top R^{-1} y_k$
MAP Estimate: $\hat{x}_{k|k} = (P_{k|k-1}^{-1} + C^\top R^{-1} C)^{-1} \left( P_{k|k-1}^{-1} \hat{x}_{k|k-1} + C^\top R^{-1} y_k \right)$
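The weighted least-squares view can also be solved by whitening each quadratic term and stacking the prior and the measurement into one ordinary least-squares problem. A sketch on assumed toy values, checked against the Kalman update:

```python
import numpy as np

# Toy prior and measurement (assumed for illustration)
m = np.array([1.0, 1.0])
P = np.diag([2.0, 1.0])
C = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
y = np.array([1.5])

# Whitening: with P^{-1} = L L^T, the cost (x-m)^T P^{-1} (x-m) = ||L^T (x-m)||^2
Lp = np.linalg.cholesky(np.linalg.inv(P))
Lr = np.linalg.cholesky(np.linalg.inv(R))

H = np.vstack([Lp.T @ np.eye(2), Lr.T @ C])   # stacked, whitened design matrix
b = np.concatenate([Lp.T @ m, Lr.T @ y])      # stacked, whitened observations
x_map, *_ = np.linalg.lstsq(H, b, rcond=None)

# Same numbers via the Kalman measurement update
K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
x_kf = m + K @ (y - C @ m)
```

This stacked form is the usual bridge from filtering to batch smoothing and regularized regression: the prior acts exactly like a Tikhonov penalty on the least-squares fit to $y_k$.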
Equivalence Proof
Using Matrix Inversion Lemma: $(P_{k|k-1}^{-1} + C^\top R^{-1} C)^{-1} = P_{k|k-1} - P_{k|k-1} C^\top (C P_{k|k-1} C^\top + R)^{-1} C P_{k|k-1} = (I - K_k C) P_{k|k-1} = P_{k|k}$
Proof: substituting this identity into the MAP estimate and expanding yields $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (y_k - C \hat{x}_{k|k-1})$ with $K_k = P_{k|k-1} C^\top (C P_{k|k-1} C^\top + R)^{-1}$.
This shows the equivalence between the MAP solution and the Kalman update.
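The matrix inversion lemma is easy to spot-check numerically on a random symmetric positive-definite covariance (dimensions and seed are arbitrary choices for this sketch):

```python
import numpy as np

# Spot-check: (P^{-1} + C^T R^{-1} C)^{-1} = P - P C^T (C P C^T + R)^{-1} C P
rng = np.random.default_rng(1)
n, p = 3, 2                            # state and measurement dimensions (arbitrary)
Sq = rng.standard_normal((n, n))
P = Sq @ Sq.T + n * np.eye(n)          # random symmetric positive-definite covariance
C = rng.standard_normal((p, n))
R = np.eye(p)

lhs = np.linalg.inv(np.linalg.inv(P) + C.T @ np.linalg.inv(R) @ C)
rhs = P - P @ C.T @ np.linalg.inv(C @ P @ C.T + R) @ C @ P
```

Note the practical point hiding in the identity: the left side inverts an $n \times n$ matrix, the right side only a $p \times p$ one, which is why the gain form is preferred when measurements are low-dimensional.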
Conclusion
Theoretical Insights and Extensions
Key Insights:
- Geometric: Reveals orthogonality principle and innovation process
- Probabilistic: Shows optimality under Gaussian assumptions
- Optimization: Connects to weighted least squares and regularization
Unified Algorithm: All approaches yield the same recursive equations:
- Time update: $\hat{x}_{k|k-1} = A \hat{x}_{k-1|k-1}$, $\quad P_{k|k-1} = A P_{k-1|k-1} A^\top + Q$
- Measurement update: $K_k = P_{k|k-1} C^\top (C P_{k|k-1} C^\top + R)^{-1}$, $\quad \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (y_k - C \hat{x}_{k|k-1})$, $\quad P_{k|k} = (I - K_k C) P_{k|k-1}$
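The full recursion, whichever perspective it is derived from, is a short loop. A minimal sketch (matrices and the three measurements are assumed toy values):

```python
import numpy as np

def kalman_filter(ys, A, C, Q, R, x0, P0):
    """Recursive filter: alternate time update and measurement update."""
    x, P = x0, P0
    estimates = []
    for y in ys:
        # Time update (prediction)
        x = A @ x
        P = A @ P @ A.T + Q
        # Measurement update (correction)
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
        x = x + K @ (y - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        estimates.append(x)
    return np.array(estimates), P

# Toy run on assumed matrices and measurements
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])
ys = [np.array([0.9]), np.array([2.1]), np.array([2.9])]
est, P_final = kalman_filter(ys, A, C, Q, R, np.zeros(2), np.eye(2))
```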
Extensions:
- Nonlinear systems: EKF, UKF, particle filters
- Non-Gaussian noise: robust Kalman filters