Introduction

FastDifferentiation (FD) is a package for generating efficient executables to evaluate derivatives of Julia functions. It can also generate efficient true symbolic derivatives for symbolic analysis. Unlike forward and reverse mode automatic differentiation, FD automatically generates efficient derivatives for arbitrary function types: ℝ¹->ℝ¹, ℝ¹->ℝᵐ, ℝⁿ->ℝ¹, and ℝⁿ->ℝᵐ.

For f:ℝⁿ->ℝᵐ with n and m large, FD may have better performance than conventional AD algorithms because the FD algorithm finds expressions shared between partials and computes them only once. In some cases FD derivatives can be as efficient as manually coded derivatives (see the Lagrangian dynamics example in the D* paper, or the Benchmarks section of the documentation, for another example).

FD may take much less time to compute symbolic derivatives than Symbolics.jl even in the ℝ¹->ℝ¹ case. The executables generated by FD may also be much faster (see Symbolic Processing).

You should consider using FastDifferentiation when you need:

  • a fast executable for evaluating the derivative of a function, where the overhead of preprocessing/compilation time is swamped by evaluation time.
  • to do additional symbolic processing on your derivative. FD can generate a true symbolic derivative to be processed further in Symbolics.jl or another computer algebra system. (A minimal sketch of both workflows follows this list.)
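
A minimal sketch of both workflows, using an illustrative function chosen just for this example (the printed output of each call is elided):

julia> using FastDifferentiation

julia> @variables x y
y

julia> symbolic_jac = jacobian([x^2 * y, cos(x + y)], [x, y])   # true symbolic Jacobian, suitable for further symbolic processing
...

julia> jac_exe = make_function(symbolic_jac, [x, y])            # compiled executable for fast numeric evaluation
...

julia> jac_exe([1.0, 2.0])   # evaluates [2xy x²; -sin(x+y) -sin(x+y)] at (1.0, 2.0)
...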

This is the FD feature set. Each entry is available both as a compiled function and as a symbolic expression:

  • Dense Jacobian
  • Sparse Jacobian
  • Dense Hessian
  • Sparse Hessian
  • Higher order derivatives
  • Jᵀv
  • Hv

Jᵀv and Jv compute the Jacobian transpose times a vector and the Jacobian times a vector, without explicitly forming the Jacobian matrix. For applications see this paper.
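
For intuition, Jv is the directional derivative of f in the direction v. The sketch below builds the Jacobian explicitly, using only the jacobian and make_function calls shown elsewhere in this README, purely to illustrate the quantity being computed; FD's Jᵀv/Jv support avoids forming J:

julia> @variables x y
y

julia> J_exe = make_function(jacobian([x * y, sin(x) + y^2], [x, y]), [x, y])
...

julia> v = [1.0, 0.0];   # direction vector

julia> J_exe([0.5, 2.0]) * v   # Jacobian times a vector, formed explicitly here only for illustration
...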

Hv computes the Hessian times a vector without explicitly forming the Hessian matrix. This can be useful when the Hessian matrix is large and sparse.
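
The same kind of illustration works for Hv, assuming the hessian function from the feature list above (again, the matrix is formed explicitly here only to show what is being computed):

julia> @variables x y
y

julia> H_exe = make_function(hessian(x^2 * y + y^3, [x, y]), [x, y])
...

julia> H_exe([1.0, 2.0]) * [0.0, 1.0]   # Hessian times a vector
...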

If you use FD in your work, please share the functions you differentiate with me; I'll add them to the benchmarks. The more functions available to test, the easier it is for others to determine whether FD will help with their problem.

This is beta software being modified on a daily basis. Expect bugs and frequent, possibly breaking, changes over the next month or so. Documentation is frequently updated, so check the latest docs before filing an issue; your problem may already have been fixed and documented.

Notes about special derivatives

The derivative of |u| is u/|u|, which evaluates to NaN when u == 0. This is not a bug: the derivative of the absolute value function is undefined at 0, and FD signals this by returning NaN.
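
A minimal sketch of this behavior, using the same make_function/jacobian workflow shown in the Conditionals section below:

julia> @variables u
u

julia> d_abs = make_function(jacobian([abs(u)], [u]), [u])
...

julia> d_abs([0.0])    # derivative of |u| at u == 0 evaluates to NaN
...

julia> d_abs([-2.0])   # u/|u| == -1.0 for negative u
...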

Conditionals

As of version 0.4.1, FD allows you to create expressions with conditionals using either the built-in ifelse function or a new function, if_else. ifelse evaluates both of its branch arguments. By contrast, if_else has the semantics of if...else...end: only the true or the false branch is executed. This is useful when your conditional is there to prevent exceptions caused by illegal input values:

julia> @variables x y
y

julia> f = if_else(x<0,NaN,sqrt(x))
(if_else  (x < 0) NaN sqrt(x))

julia> g = make_function([f],[x])
...

julia> g([-1])
1-element Vector{Float64}:
 NaN

julia> g([2.0])
1-element Vector{Float64}:
 1.4142135623730951

In this case you wouldn't want to use ifelse, because it evaluates both the true and false branches, which causes a runtime exception:

julia> f = ifelse(x<0,NaN,sqrt(x))
(ifelse  (x < 0) NaN sqrt(x))

julia> g = make_function([f],[x])
...

julia> g([-1])
ERROR: DomainError with -1.0:
sqrt was called with a negative real argument but will only return a complex result if called with a complex argument. Try sqrt(Complex(x)).

However, you cannot yet compute derivatives of expressions that contain conditionals:

julia> jacobian([f],[x,y])
ERROR: Your expression contained ifelse. FastDifferentiation does not yet support differentiation through ifelse or any of these conditionals (max, min, copysign, &, |, xor, <, >, <=, >=, !=, ==, signbit, isreal, iszero, isfinite, isnan, isinf, isinteger, !)

This may be a breaking change for some users. In previous versions the expression x == y returned a Bool; it now returns an expression graph. Some data structures, such as Dict, use == (via isequal) by default to determine whether two keys are the same, so they will no longer work with FD variables as keys. Use an IdDict instead, since it compares keys with ===.
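
A small sketch of the workaround (the cache shown here is purely illustrative):

julia> @variables x y
y

julia> cache = IdDict{Any,Int}()   # IdDict compares keys with ===, so FD variables work as keys
IdDict{Any, Int64}()

julia> cache[x] = 1;

julia> cache[x]
1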

A future PR will add support for differentiating through conditionals.