Symbolic Finite Differencing-Part I
Programming practice is limited by labor costs such as implementation design, program development, debugging, and maintenance (including evolution and integration). Because resource utilization is often difficult to predict precisely, the economics of software development also depends on the risk that an implementation fails to meet its performance requirements. Consequently, complex algorithms are frequently avoided in large systems, even in optimizing compilers, where the run-time performance of the compiled code is so important. Our aim is to overcome some of these limitations by means of a transformational programming tool that facilitates the implementation of complex algorithms with guaranteed worst-case asymptotic time and space bounds. RAPTS, a working prototype of such a tool, is scheduled to be demonstrated at ESOP '90. This paper is in two parts. In Part I we specify a general finite differencing framework that unifies aspects of compiler and programming methodologies and is the basis for one of the three main transformations implemented in RAPTS. In Part II we illustrate how the transformational methodology underlying RAPTS can be used to improve the implementation of its own finite differencing transformation. Improved reduction-in-strength models and algorithms for conventional languages are produced as an outgrowth of this discussion.
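As a rough illustration of the general idea (a minimal sketch, not taken from the paper itself): finite differencing, in the tradition of compiler strength reduction, replaces the repeated recomputation of an expression inside a loop with incremental maintenance of its value. The function names below are hypothetical and chosen for exposition.

```python
def squares_naive(n):
    # Recompute i * i from scratch on every iteration:
    # one multiplication per step.
    return [i * i for i in range(n)]

def squares_differenced(n):
    # Maintain the invariant sq == i * i across the loop by applying
    # the finite difference (i + 1)**2 - i**2 == 2*i + 1, so each
    # multiplication is replaced by an addition (classical reduction
    # in strength).
    result = []
    sq = 0
    for i in range(n):
        result.append(sq)
        sq += 2 * i + 1
    return result
```

Both functions compute the same list; the differenced version does so with cheaper per-iteration work, which is the kind of transformation a tool like RAPTS aims to perform systematically, with guaranteed asymptotic bounds.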
R. Paige, "Symbolic Finite Differencing-Part I," Lecture Notes in Computer Science: ESOP '90, Springer-Verlag, Jan. 1990.
The definitive version is available at https://doi.org/10.1007/3-540-52592-0_54
Mathematics and Statistics
Article - Journal
© 1990 Springer-Verlag. All rights reserved.