Compiling Fast Partial Derivatives of Functions Given by Algorithms
Speelpenning, Bert
Permalink
https://hdl.handle.net/2142/66437
Description

Title: Compiling Fast Partial Derivatives of Functions Given by Algorithms
Author(s): Speelpenning, Bert
Issue Date: 1980
Department of Study: Computer Science
Discipline: Computer Science
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree Name: Ph.D.
Degree Level: Dissertation
Keyword(s): Computer Science
Language: eng
Abstract
If the gradient of the function y = f(x_1, ..., x_n) is desired, where f is given by an algorithm Af(x, n, y), most numerical analysts will use numerical differencing. This is a sampling scheme that approximates derivatives by the slopes of secants through closely spaced points. Symbolic methods that make full use of the program text of Af should be able to produce a better way to evaluate the gradient of f. The system "Jake" described in this thesis produces gradients significantly faster than numerical differencing. (A system sketch of Jake is presented in the thesis.)
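For concreteness, a minimal Python sketch of numerical differencing (an illustration written for this record, not code from the thesis; the function name numerical_gradient, the step size h, and the test function are choices made here):

    def numerical_gradient(f, x, h=1e-6):
        # Forward differencing: approximate each partial derivative
        # dy/dx_i by the slope of a secant through two nearby points.
        # Costs n + 1 evaluations of f for n input variables.
        y0 = f(x)
        grad = []
        for i in range(len(x)):
            xp = list(x)
            xp[i] += h
            grad.append((f(xp) - y0) / h)
        return grad

    # Example: f is the product of its inputs.
    f = lambda x: x[0] * x[1] * x[2]
    print(numerical_gradient(f, [1.0, 2.0, 3.0]))  # approximately [6.0, 3.0, 2.0]

The number of evaluations of f grows linearly with n, which is the O(nT) behavior the abstract contrasts Jake against.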
Jake can handle algorithms Af with arbitrary flow of control. If algorithm Af requires time T to evaluate y for given values of x_1, ..., x_n, the algorithm Af' produced by Jake will evaluate the gradient ∂y/∂x_1, ..., ∂y/∂x_n in time O(T). In contrast, numerical differencing requires O(nT). The space requirements of Af' are modest. Measurements performed on one particular machine suggest that Jake is faster than numerical differencing for n > 8. Somewhat weaker results have been obtained for the problem of computing Jacobians of arbitrary shape.
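A rough reading of these bounds (an inference drawn here, not a figure from the thesis): if one evaluation of Af costs time T, numerical differencing needs about n + 1 evaluations, i.e. roughly (n + 1)T, while Af' costs cT for a machine-dependent constant c that does not grow with n. A crossover at n > 8 is then consistent with c being on the order of 9 on the machine measured.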
Jake is based on a view of an execution of the algorithm Af as a sequence of transformations in a state space. The Jacobian of the entire execution of the algorithm is the matrix product of the Jacobians of the individual transformations. Choosing the optimal multiplication order for these Jacobians leads to substantial savings over numerical differencing; in the case of gradients (y being a scalar) these savings can be shown to be O(n). In this approach the algorithms Af and Af' are sufficiently close that Af can be transformed into Af' by automatic means. Current compiler technology suffices to build a practical translator; no automatic theorem proving or heuristic search procedures are required.
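A minimal Python sketch of this state-space view (an illustration, not Jake's implementation, which transformed source programs): each step of the computation is recorded on a tape together with its local partial derivatives, and the gradient is accumulated by multiplying the per-step Jacobians in reverse order, starting from the scalar output y. The example function is the product y = x_1 * x_2 * ... * x_n.

    def f_with_tape(x):
        # Evaluate y = x[0] * x[1] * ... * x[n-1], recording each
        # multiplication step and its local partial derivatives on a tape.
        tape = []
        acc = x[0]
        for i in range(1, len(x)):
            # Step: acc_new = acc * x[i].  Local Jacobian entries:
            # d(acc_new)/d(acc) = x[i],  d(acc_new)/d(x[i]) = acc.
            tape.append((i, x[i], acc))
            acc = acc * x[i]
        return acc, tape

    def gradient(x):
        y, tape = f_with_tape(x)
        grad = [0.0] * len(x)
        adj = 1.0  # dy/dy = 1: adjoint of the running product
        # Walk the tape backwards: one vector-Jacobian product per step.
        for i, xi, acc_before in reversed(tape):
            grad[i] = adj * acc_before  # dy/dx_i through this step
            adj = adj * xi              # propagate the adjoint to earlier steps
        grad[0] = adj                   # remaining adjoint is dy/dx_1
        return y, grad

    print(gradient([1.0, 2.0, 3.0, 4.0]))  # (24.0, [24.0, 12.0, 8.0, 6.0])

The reverse traversal costs a constant multiple of the forward evaluation regardless of n, which is the source of the O(T) gradient bound; differencing the same function would need n + 1 forward evaluations.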
Numerical analysts who are currently hesitant to use methods of functional iteration that require knowledge of derivatives may find in Jake a tool that makes such methods the preferred ones.