Valid from: Spring 2014
Decided by: FN1/Anders Gustafsson
Date of establishment: 2014-03-12
Division: Automatic Control
Course type: Third-cycle course
Teaching language: English
Optimal control is the problem of determining a control function for a dynamical system so as to minimize a cost associated with the system trajectory. The subject has its roots in the calculus of variations but evolved into an independent branch of applied mathematics and engineering in the 1950s. The overall goal of the course is to provide an understanding of the main results in the calculus of variations and optimal control, and of how these results can be used in applications such as robotics, finance, economics, and biology.
Knowledge and Understanding
For a passing grade the doctoral student must
Competences and Skills
For a passing grade the doctoral student must
The Euler-Lagrange equation. Pontryagin's maximum principle. Dynamic programming. The Hamilton-Jacobi-Bellman equation. Finite and infinite horizon optimal control problems. Viscosity solutions to partial differential equations. The linear quadratic regulator. Necessary and sufficient conditions for optimality. Numerical methods for optimal control problems.
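As a small illustration of one of the topics listed above (this sketch is not part of the official course material), the infinite-horizon discrete-time linear quadratic regulator can be computed by iterating the Riccati recursion to a fixed point; the double-integrator system matrices below are an arbitrary example chosen for the illustration.

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Iterate the discrete-time Riccati recursion
    P <- Q + A'P(A - BK),  K = (R + B'PB)^{-1} B'PA,
    until (approximate) convergence, returning the feedback gain K
    for the control law u = -K x and the cost matrix P."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    return K, P

# Example system: a double integrator with unit weights.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])
K, P = dlqr(A, B, Q, R)

# With Q, R positive definite and (A, B) controllable, the resulting
# closed loop A - BK is stable (all eigenvalues inside the unit circle).
assert max(abs(np.linalg.eigvals(A - B @ K))) < 1.0
```

The same fixed point can be obtained directly with `scipy.linalg.solve_discrete_are`; the explicit iteration above mirrors the dynamic-programming viewpoint covered in the course.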
Liberzon, D.: Calculus of Variations and Optimal Control: A Concise Introduction.
Types of instruction: Lectures, seminars, project
Examination format: Seminars given by participants
Grading scale: Failed, Pass
Examiner:
Course coordinators:
Web page: http://www.control.lth.se/Education/DoctorateProgram/optimal-control.html