Calculus of variations and optimal control; optimization

See also 34H05, 34K35, 65Kxx, 90Cxx, 93-XX.

Optimal Control Theory

Course topics: the optimal control problem; basic mathematical notions from the calculus of variations; minimization of functionals; the Euler-Lagrange equation; minimization of functionals under constraints; optimal control of continuous- and discrete-time systems with or without state/input constraints; Pontryagin's minimum principle; the linear quadratic (LQ) regulation and tracking problems; the Riccati equation; bang-bang control; Hamilton-Jacobi-Bellman theory; dynamic programming; the linear quadratic Gaussian (LQG) problem; applications in Matlab.
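As a small illustration of the LQ regulator and Riccati equation topics listed above (sketched here in Python with SciPy rather than the Matlab used in the course), one can solve the continuous-time algebraic Riccati equation for a double-integrator plant and recover the optimal state-feedback gain; the plant and cost weights below are illustrative choices, not from the course itself.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)          # state weighting (illustrative choice)
R = np.array([[1.0]])  # input weighting (illustrative choice)

# Solve the algebraic Riccati equation A'P + PA - P B R^{-1} B' P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal LQ feedback u = -Kx with K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)

# The closed-loop matrix A - BK is Hurwitz (all eigenvalues in the open
# left half-plane), so the regulator stabilizes the plant
eigs = np.linalg.eigvals(A - B @ K)
print(np.all(eigs.real < 0))  # prints True
```

For this particular plant the gain works out analytically to K = [1, sqrt(3)], which is a standard check on the numerical solution.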

Prof. Nicholas Karampetakis