[Pl-seminar] Semantics Seminar Schedule

Mitchell Wand wand at ccs.neu.edu
Sun Mar 29 00:05:00 EDT 2009


NU Programming Languages Seminar
Wednesday, April 1, 2009
11:45-1:30
Room 366 WVH (http://www.ccs.neu.edu/home/wand/directions.html)

              Automatic Differentiation of Functional Programs
                 and its use for Probabilistic Programming

                            Jeffrey Mark Siskind
               School of Electrical and Computer Engineering
                             Purdue University

It is extremely useful to be able to take derivatives of functions
expressed as computer programs to support function optimization and
approximation, parameter estimation, machine learning, and ultimately
computational science and engineering design.  The established
discipline of Automatic Differentiation (AD) has largely focused on
imperative languages, where it is most efficiently implemented as a
source-to-source transformation performed by a preprocessor.  This
talk will present a novel formulation of AD for functional programs
expressed in the lambda calculus.  A key novel aspect of our
formulation is that AD is performed by higher-order functions that
reflectively transform closure bodies.  Our methods exhibit an
important closure property that prior AD formulations lack: the
ability to transform the entire space of input programs, including
those produced by the AD transformations themselves.  This is crucial
for solving the kinds of nested optimizations that arise in domains
such as game theory and automatic control.  Furthermore, since the
input and output of our transformations is the lambda calculus,
efficient implementation is facilitated by novel extensions of
standard compilation techniques.  We exhibit a novel "almost"
union-free polyvariant flow-analysis algorithm, formulated as abstract
interpretation, that partially evaluates calls to the AD operators,
migrating the reflective source-to-source transformation to compile
time.  The resulting code exceeds the run-time performance of the best
existing AD implementations for imperative languages by a factor of
two and outperforms all AD implementations for functional languages by
two to three orders of magnitude.
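
To make the flavor of the approach concrete, here is a minimal sketch
of forward-mode AD written as a higher-order function over dual
numbers (in Haskell, with hypothetical names; the talk's actual method
instead reflectively transforms closure bodies in the lambda calculus
and also covers reverse mode):

  data Dual = Dual { primal :: Double, tangent :: Double }

  instance Num Dual where
    Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
    Dual x dx - Dual y dy = Dual (x - y) (dx - dy)
    Dual x dx * Dual y dy = Dual (x * y) (dx * y + x * dy)
    abs (Dual x dx)       = Dual (abs x) (signum x * dx)
    signum (Dual x _)     = Dual (signum x) 0
    fromInteger n         = Dual (fromInteger n) 0

  -- The derivative operator is itself a higher-order function: it maps
  -- a function on Duals to the derivative of the corresponding function
  -- on Doubles by seeding the tangent component with 1.
  deriv :: (Dual -> Dual) -> Double -> Double
  deriv f x = tangent (f (Dual x 1))

  main :: IO ()
  main = print (deriv (\x -> x * x * x) 2.0)   -- prints 12.0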

AD has traditionally been applied to purely numeric programs written
in imperative languages like Fortran.  Our novel methods can be
applied to mixed symbolic-numeric programs written in functional
languages.  This is useful for developing complex stochastic models of
the kind built in the emerging field of probabilistic programming.
We demonstrate how our methods support this enterprise by constructing
evaluators for two different probabilistic programming languages, one
based on Scheme and one based on Prolog.  Using both forward-mode and
reverse-mode variants of our AD methods, we take the gradients of these
evaluators as they execute probabilistic programs in their respective
target languages, and use those gradients to perform gradient-based
maximum-likelihood estimation of the distribution parameters of the
free random variables in those programs.  We demonstrate that for this
domain our methods yield performance exceeding that of straightforward
implementations of AD in functional languages by many orders of
magnitude.
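
As a toy illustration of that last step (not the speaker's
implementation), the sketch below reuses the Dual type and deriv
operator from above; the data, model, and step size are made up.  It
treats a tiny "probabilistic program" of Bernoulli draws with a shared
parameter p and climbs the gradient of its log-likelihood to estimate
p:

  -- Log-likelihood of observed Boolean outcomes under Bernoulli(p),
  -- computed on Duals so it can be differentiated with respect to p.
  logLikelihood :: [Bool] -> Dual -> Dual
  logLikelihood obs p = sum [ if o then logD p else logD (1 - p) | o <- obs ]
    where logD (Dual x dx) = Dual (log x) (dx / x)

  -- One step of gradient ascent on the log-likelihood.
  mleStep :: [Bool] -> Double -> Double
  mleStep obs p = p + 0.1 * deriv (logLikelihood obs) p

  -- iterate (mleStep [True, True, False]) 0.5 !! 100   ==>  about 2/3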

Joint work with Barak A. Pearlmutter

================================================================

Upcoming Events:

# Nothing scheduled :-(  Wouldn't you like to give a talk?

--Mitch




