# 401-4938-14L Stochastic Optimal Control

| Semester | Spring Semester 2018 |
| --- | --- |
| Lecturers | M. Soner |
| Periodicity | two-yearly recurring course |
| Course | Does not take place this semester. |
| Language of instruction | English |
| Abstract | The dynamic programming approach to stochastic optimal control problems will be developed. In addition to the general theory, a detailed analysis of several important control problems will be given. |
| Learning objective | The goals are to achieve a deep understanding of: 1. the dynamic programming approach to optimal control; 2. several classes of important optimal control problems and their solutions; 3. the use of these models in engineering and economic modeling. |
| Content | In this course, we develop the dynamic programming approach to stochastic optimal control problems. The general approach will be described, and several subclasses of problems will be discussed, including: 1. standard exit time problems; 2. finite and infinite horizon problems; 3. optimal stopping problems; 4. singular problems; 5. impulse control problems. After the general theory is developed, it will be applied to several classical problems, including: 1. the linear quadratic regulator; 2. the Merton problem of optimal investment and consumption; 3. the optimal dividend problem (Jeanblanc and Shiryayev); 4. the finite fuel problem; 5. utility maximization with transaction costs; 6. a deterministic differential game related to geometric flows. The textbook will be Controlled Markov Processes and Viscosity Solutions, 2nd edition (W. H. Fleming and H. M. Soner), Springer-Verlag, 2005. Lecture notes will also be provided. |
| Literature | Controlled Markov Processes and Viscosity Solutions, 2nd edition (W. H. Fleming and H. M. Soner), Springer-Verlag, 2005. Lecture notes will also be provided. |
| Prerequisites / Notice | Basic knowledge of Brownian motion, stochastic differential equations, and probability theory is needed. |
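
To illustrate the dynamic programming approach at the heart of the course, the following is a standard sketch of how it leads to a Hamilton-Jacobi-Bellman (HJB) equation for a finite horizon problem. The notation (`b`, `σ`, `f`, `g`, the control set `A`) is generic textbook notation chosen here for illustration, not taken from the course materials.

```latex
% State dynamics: a controlled diffusion driven by a Brownian motion W,
%   dX_t = b(X_t, \alpha_t)\,dt + \sigma(X_t, \alpha_t)\,dW_t .
% Value function of the finite horizon problem:
%   v(t,x) = \sup_{\alpha} \mathbb{E}\Big[ \int_t^T f(X_s, \alpha_s)\,ds
%            + g(X_T) \,\Big|\, X_t = x \Big].
% The dynamic programming principle then yields, at least formally,
% the Hamilton--Jacobi--Bellman equation:
\[
  -\partial_t v(t,x)
  - \sup_{a \in A} \Big\{ b(x,a) \cdot D v(t,x)
    + \tfrac{1}{2}\,\mathrm{tr}\!\big( \sigma\sigma^{\top}(x,a)\, D^2 v(t,x) \big)
    + f(x,a) \Big\} = 0,
  \qquad v(T,x) = g(x).
\]
```

Since the value function need not be smooth, equations of this type are interpreted in the viscosity sense, which is the framework of the Fleming-Soner textbook listed above.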