Backpropagation and Ordered Derivatives in the Time Scales Calculus

Abstract

Backpropagation is the most widely used neural network learning technique. It is based on the mathematical notion of an ordered derivative. In this paper, we present a formulation of ordered derivatives and the backpropagation training algorithm using the time scales calculus, an emerging area of mathematics with potential applications to a wide variety of interdisciplinary problems. This calculus unifies continuous and discrete analysis within one coherent theoretical framework. Using it, we present a generalization of backpropagation that is appropriate for cases beyond the specifically continuous or discrete. We develop a new multivariate chain rule for this calculus, define ordered derivatives on time scales, prove a key theorem about them, and derive the backpropagation weight-update equations for a feedforward multilayer neural network architecture. By drawing together the time scales calculus and neural network learning, we present the first connection between these two major fields of research.
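
For readers unfamiliar with the time scales calculus, the following sketch uses the standard definitions of Hilger and of Bohner and Peterson (not equations taken from this paper) to show how a single delta derivative recovers both the classical derivative and the forward difference. On a time scale $\mathbb{T}$ (an arbitrary nonempty closed subset of $\mathbb{R}$), the forward jump operator and graininess are

  $\sigma(t) = \inf\{\, s \in \mathbb{T} : s > t \,\}, \qquad \mu(t) = \sigma(t) - t,$

and the delta derivative $f^{\Delta}(t)$ reduces to the familiar objects in the two classical cases:

  $\mathbb{T} = \mathbb{R}: \quad f^{\Delta}(t) = f'(t)$  (ordinary derivative),
  $\mathbb{T} = \mathbb{Z}: \quad f^{\Delta}(t) = f(t+1) - f(t)$  (forward difference).

A backpropagation rule stated in terms of delta derivatives therefore specializes, on these two time scales, to continuous-time gradient flow and to the classical discrete weight update $\Delta w_{ij} = -\eta\, \partial^{+}E / \partial w_{ij}$, where $\partial^{+}$ denotes Werbos's ordered derivative, $E$ the training error, and $\eta$ the learning rate.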

Department(s)

Electrical and Computer Engineering

International Standard Serial Number (ISSN)

1045-9227

Document Type

Article - Journal

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2010 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.

Publication Date

01 Jan 2010
