Hamilton-Jacobi-Bellman Equations and Approximate Dynamic Programming on Time Scales

John E. Seiffertt IV, Missouri University of Science and Technology
Suman Sanyal
Donald C. Wunsch, Missouri University of Science and Technology

Abstract

The time scales calculus is a key emerging area of mathematics due to its potential use in a wide variety of multidisciplinary applications. We extend this calculus to approximate dynamic programming (ADP). The core backward induction algorithm of dynamic programming is extended from its traditional discrete case to all isolated time scales. Hamilton-Jacobi-Bellman equations, whose solution is the fundamental problem of dynamic programming, are motivated and proven on time scales. By drawing together the calculus of time scales and the applied area of stochastic control via ADP, we connect two major fields of research.
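To illustrate the kind of extension the abstract describes, the sketch below shows backward induction for a finite-horizon control problem posed on an isolated time scale, where each point t_k has forward jump sigma(t_k) = t_{k+1} and graininess mu(t_k) = t_{k+1} - t_k. This is not the paper's implementation; the dynamics, costs, and the particular way the graininess weights the running cost are hypothetical placeholders chosen only to show the shape of the algorithm, which collapses to the classical discrete Bellman recursion when mu is identically 1.

```python
def backward_induction(times, states, controls, dynamics, stage_cost, terminal_cost):
    """Illustrative backward induction over an isolated time scale.

    times:         sorted list of isolated time points t_0 < ... < t_N
    states:        iterable of discrete states
    controls:      iterable of admissible controls
    dynamics:      f(t, mu, x, u) -> next state (assumed to lie in `states`)
    stage_cost:    L(t, mu, x, u) -> running cost, weighted below by graininess mu
    terminal_cost: g(x) -> cost at the final time t_N
    """
    N = len(times) - 1
    V = [{x: 0.0 for x in states} for _ in range(N + 1)]
    policy = [{x: None for x in states} for _ in range(N)]

    # Terminal condition at t_N.
    for x in states:
        V[N][x] = terminal_cost(x)

    # Backward sweep: step from t_{N-1} down to t_0, using the time scale's
    # graininess mu(t_k) = t_{k+1} - t_k to weight the running cost.
    for k in range(N - 1, -1, -1):
        t, mu = times[k], times[k + 1] - times[k]
        for x in states:
            best_cost, best_u = float("inf"), None
            for u in controls:
                x_next = dynamics(t, mu, x, u)
                cost = mu * stage_cost(t, mu, x, u) + V[k + 1][x_next]
                if cost < best_cost:
                    best_cost, best_u = cost, u
            V[k][x] = best_cost
            policy[k][x] = best_u
    return V, policy


# Toy usage on a non-uniform isolated time scale {0, 0.5, 1.5, 3.0} with a
# made-up quadratic cost; every quantity here is hypothetical.
times = [0.0, 0.5, 1.5, 3.0]
states = range(4)
controls = [-1, 0, 1]
V, policy = backward_induction(
    times, states, controls,
    dynamics=lambda t, mu, x, u: max(0, min(3, x + u)),
    stage_cost=lambda t, mu, x, u: x * x + u * u,
    terminal_cost=lambda x: x * x,
)
```

Because the only time-scale ingredients are sigma and mu, the same loop covers the uniform discrete case, sampled-data grids with varying step sizes, and any other isolated time scale without changing the algorithm's structure.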