On completion of the course, participants shall be able to:
• Describe modern algorithms for minimizing a convex functional, such as forward-backward splitting (FBS), Dykstra, Douglas-Rachford, ADMM, and Chambolle-Pock, and give an overview of the pros, cons, and typical areas of application of each algorithm (a minimal forward-backward splitting sketch follows this list).
• Clarify under which circumstances the above algorithms converge, and understand the underlying convergence proofs in basic terms.
• Understand how minimization problems can be reformulated in terms of maximally monotone operators, and how this reformulation connects to various fixed-point theorems.
• Understand how duality and the Fenchel transform are used to reformulate minimization problems.
• Clarify basic properties of convex functions and their connections to lower semicontinuous functions and subdifferential calculus.
• Understand the role of proximal operators in convex optimization routines (the standard definition is recalled after this list).
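
For orientation, the two central objects named above, the proximal operator of a proper lower semicontinuous convex function f and its Fenchel conjugate, are commonly defined as follows (standard textbook definitions, not notation taken from the course material):

\[
\operatorname{prox}_{\gamma f}(v) \;=\; \arg\min_{x}\Big( f(x) + \tfrac{1}{2\gamma}\,\|x - v\|^{2} \Big),
\qquad
f^{*}(y) \;=\; \sup_{x}\big( \langle x, y\rangle - f(x) \big).
\]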
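As a concrete illustration of the algorithm family in the first outcome, the sketch below applies forward-backward splitting (proximal gradient) to the lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. It is a minimal sketch only; the function names (fbs_lasso, soft_threshold), the step-size choice, and the random test data are illustrative assumptions and not taken from the course.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (componentwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fbs_lasso(A, b, lam, num_iters=500):
    # Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1:
    # forward (gradient) step on the smooth term, backward (prox) step on the l1 term.
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, with L the Lipschitz constant of the gradient
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)            # gradient of the quadratic data term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Example usage on synthetic data (illustrative only):
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = fbs_lasso(A, b, lam=0.1)

The step size 1/L, where L is the largest eigenvalue of A^T A, is the standard choice that guarantees convergence of forward-backward splitting for this objective.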