Abstract
The time scales calculus is a key emerging area of mathematics due to its potential use in a wide variety of multidisciplinary applications. We extend this calculus to approximate dynamic programming (ADP). The core backward induction algorithm of dynamic programming is extended from its traditional discrete case to all isolated time scales. Hamilton-Jacobi-Bellman equations, whose solution is the central problem of dynamic programming, are motivated and proven on time scales. By drawing together the calculus of time scales and the applied area of stochastic control via ADP, we connect two major fields of research.
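The backward induction the abstract refers to can be illustrated on an isolated time scale, where the points t_0 < t_1 < ... < t_N need not be evenly spaced and the graininess mu(t_k) = t_{k+1} - t_k weights each stage cost. The sketch below is illustrative only: the grid, dynamics, and cost function are assumptions for a toy example, not the model analyzed in the paper.

```python
# Hedged sketch: backward induction (dynamic programming) on an
# isolated time scale with nonuniform graininess mu(t_k).
# All names and dynamics here are illustrative assumptions.

def backward_induction(ts, states, actions, step, cost, terminal):
    """Value function and policy on the isolated time scale ts.

    ts       : increasing list of points t_0 < ... < t_N
    step     : step(t, x, u) -> state at the successor time sigma(t)
    cost     : cost(t, x, u, mu) -> stage cost over graininess mu
    terminal : terminal(x) -> terminal cost at t_N
    """
    V = {x: terminal(x) for x in states}      # value at final time t_N
    policy = []
    for k in range(len(ts) - 2, -1, -1):
        mu = ts[k + 1] - ts[k]                # graininess mu(t_k)
        Vk, pk = {}, {}
        for x in states:
            value, u_best = min(
                ((cost(ts[k], x, u, mu) + V[step(ts[k], x, u)], u)
                 for u in actions),
                key=lambda pair: pair[0],
            )
            Vk[x], pk[x] = value, u_best
        V = Vk
        policy.insert(0, pk)                  # policy[k] maps x -> u at t_k
    return V, policy

# Toy usage on a nonuniform grid (made-up dynamics and costs):
ts = [0.0, 0.5, 1.5]                          # unequal gaps: mu = 0.5, then 1.0
states = [0, 1, 2]
actions = [-1, 0, 1]
step = lambda t, x, u: min(max(x + u, 0), 2)  # clamped integrator dynamics
cost = lambda t, x, u, mu: mu * (x * x + u * u)
terminal = lambda x: x * x
V, policy = backward_induction(ts, states, actions, step, cost, terminal)
```

Note that the stage cost is scaled by the local graininess, which is what distinguishes the time-scales formulation from the classical unit-step discrete case; with mu identically 1 the recursion reduces to ordinary discrete-time backward induction.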
Recommended Citation
J. E. Seiffertt et al., "Hamilton-Jacobi-Bellman Equations and Approximate Dynamic Programming on Time Scales," IEEE Transactions on Systems, Man, and Cybernetics, Part B, Institute of Electrical and Electronics Engineers (IEEE), Aug 2008.
The definitive version is available at https://doi.org/10.1109/TSMCB.2008.923532
Department(s)
Electrical and Computer Engineering
Second Department
Computer Science
Keywords and Phrases
Calculus; Dynamic Programming; Stochastic Systems
Document Type
Article - Conference proceedings
Document Version
Final Version
File Type
text
Language(s)
English
Rights
© 2008 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
Publication Date
01 Aug 2008