API
FastDifferentiation.@variables
— Macro
@variables args...
Creates FD variables to use in symbolic expressions. Example:
julia> @variables x y
y
julia> f = x*y
(x * y)
FastDifferentiation.clear_cache
— Method
clear_cache()
Clears the global expression cache. To maximize the efficiency of expressions, the differentiation system automatically eliminates common subexpressions by checking for their existence in the global expression cache. Over time this cache can become arbitrarily large. Best practice is to clear the cache before you start defining expressions, define your expressions, and then clear the cache again.
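A minimal sketch of this pattern; the expressions and variable names are placeholders:
using FastDifferentiation
clear_cache()                        # start from an empty cache
@variables x y
jac = jacobian([x*y, y*x], [x, y])   # define expressions
fjac = make_function(jac, [x, y])
clear_cache()                        # release cached subexpressions when done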
FastDifferentiation.derivative
— Method
derivative(A::AbstractArray{<:Node}, variables...)
Computes ∂A/(∂variables[1],...,∂variables[n]), i.e. repeated differentiation with respect to each variable in turn, rather than different columns of the Jacobian.
Example
julia> A = [t t^2;3t^2 5]
2×2 Matrix{Node}:
 t              (t ^ 2)
 (3 * (t ^ 2))  5
julia> derivative(A,t)
2×2 Matrix{Node}:
 1.0      (2 * t)
 (6 * t)  0.0
julia> derivative(A,t,t)
2×2 Matrix{Node{T, 0} where T}:
 0.0  2
 6    0.0
FastDifferentiation.derivative
— Method
Convenience derivative for scalar functions. Takes a scalar input and returns a scalar output.
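A minimal sketch, assuming the scalar method mirrors the array method's call form derivative(expr, variables...):
@variables x
df = derivative(x^2, x)      # expected to return the expression (2 * x)
d2f = derivative(x^2, x, x)  # repeated differentiation; expected to return 2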
FastDifferentiation.differential
— Method
differential(variables::Node...)
Returns an anonymous function that takes the derivative of a scalar function with respect to variables.
Example
julia> @variables t
t
julia> f = t^2
(t ^ 2)
julia> Dt = differential(t)
#69 (generic function with 1 method)
julia> Dt(f)
(2 * t)
julia> Dt = differential(t,t)
#69 (generic function with 1 method)
julia> Dt(f)
2
FastDifferentiation.hessian
— Method
hessian(expression::Node, variable_order::AbstractVector{<:Node})
Returns the dense symbolic Hessian matrix.
Example
julia> @variables x y
julia> hessian(x^2*y^2,[x,y])
2×2 Matrix{FastDifferentiation.Node}:
 (2 * (y ^ 2))  (4 * (y * x))
 (4 * (x * y))  (2 * (x ^ 2))
FastDifferentiation.hessian_times_v
— Method
hessian_times_v(term::Node, partial_variables::AbstractVector{<:Node})
Computes the product of the Hessian and a vector v without explicitly forming the Hessian matrix. Useful when the Hessian would be impractically large.
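A minimal sketch, assuming hessian_times_v returns the symbolic product together with a vector of the v variables, analogously to jacobian_times_v below:
@variables x y
hv, v_vec = hessian_times_v(x^2 * y^2, [x, y])   # assumed return: (symbolic Hv, v variables)
hv_exe = make_function(hv, [[x, y]; v_vec])      # pass x,y values first, then the v values
hv_exe([1.0, 2.0, 3.0, 4.0])                     # Hv at (x, y) = (1, 2) with v = (3, 4)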
FastDifferentiation.if_else
— Function
A special if_else to use for conditionals instead of the builtin ifelse, because the latter evaluates all of its arguments. Many ifelse statements are used as guards against computations that would cause an exception, so a version is needed that is transformed into
condition ? true_branch : false_branch
during code generation.
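A minimal sketch, assuming the argument order if_else(condition, true_branch, false_branch) implied by the ternary form above, and that comparison operators such as >= are defined on FD variables:
@variables x
safe_sqrt = if_else(x >= 0, sqrt(x), 0.0)   # guard: sqrt is only evaluated when x >= 0
f = make_function([safe_sqrt], [x])
f([-4.0])                                   # takes the false branch; sqrt(-4.0) is never evaluated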
FastDifferentiation.jacobian
— Method
jacobian(
    terms::AbstractVector{<:Node},
    partial_variables::AbstractVector{<:Node}
)
Jacobian matrix of the n element function defined by terms. Each element of terms is a Node expression graph. Only the columns of the Jacobian corresponding to the elements of partial_variables will be computed, and the partial columns in the Jacobian matrix will be in the order specified by partial_variables. Examples:
julia> @variables x y
julia> jacobian([x*y,y*x],[x,y])
2×2 Matrix{Node}:
y x
y x
julia> jacobian([x*y,y*x],[y,x])
2×2 Matrix{Node}:
x y
x y
julia> jacobian([x*y,y*x],[x])
2×1 Matrix{Node}:
y
y
FastDifferentiation.jacobian_times_v
— Method
jacobian_times_v(
    terms::AbstractVector{<:Node},
    partial_variables::AbstractVector{<:Node}
)
Returns a vector of Node, where each element in the vector is the symbolic form of Jv. Also returns v_vector, a vector of the v variables. This is useful if you want to generate a function to evaluate Jv and you want to separate the function's inputs from the v variables.
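A minimal sketch of that pattern; passing a single concatenated input vector to the generated function is an assumption about its calling convention:
@variables x y
jv, v_vec = jacobian_times_v([x*y, y*x], [x, y])
jv_exe = make_function(jv, [[x, y]; v_vec])   # x,y values first, then the v values
jv_exe([1.0, 2.0, 3.0, 4.0])                  # Jv at (x, y) = (1, 2) with v = (3, 4)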
FastDifferentiation.jacobian_transpose_v
— Method
jacobian_transpose_v(
    terms::AbstractVector{<:Node},
    partial_variables::AbstractVector{<:Node}
)
Returns a vector of Node, where each element in the vector is the symbolic form of Jᵀv. Also returns v_vector, a vector of the v variables. This is useful if you want to generate a function to evaluate Jᵀv and you want to separate the function's inputs from the v variables.
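The same sketch applies here, except that v now has one component per term rather than one per variable (again assuming the concatenated calling convention):
@variables x y
jtv, v_vec = jacobian_transpose_v([x*y, y*x, x + y], [x, y])   # 3 terms, 2 variables
length(v_vec)                                                  # 3: one v component per term
jtv_exe = make_function(jtv, [[x, y]; v_vec])
jtv_exe([1.0, 2.0, 3.0, 4.0, 5.0])                             # Jᵀv at (x, y) = (1, 2) with v = (3, 4, 5)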
FastDifferentiation.make_Expr
— Method
make_Expr(
    func_array::AbstractArray{<:Node},
    input_variables::AbstractVector{<:Node},
    in_place::Bool,
    init_with_zeros::Bool
)
FastDifferentiation.make_Expr
— Method
make_Expr(
    A::SparseMatrixCSC{<:Node,<:Integer},
    input_variables::AbstractVector{<:Node},
    in_place::Bool,
    init_with_zeros::Bool
)
The init_with_zeros argument is not used for sparse matrices.
FastDifferentiation.make_function
— Method
make_function(
    func_array::AbstractArray{<:Node},
    input_variables::AbstractVector{<:Node}...;
    in_place::Bool=false, init_with_zeros::Bool=true
)
Makes a function to evaluate the symbolic expressions in func_array. Every variable that is used in func_array must also be in input_variables. However, it will not cause an error if variables in input_variables are not used in func_array.
julia> @variables x
x
julia> f = x+1
(x + 1)
julia> jac = jacobian([f],[x]) #the Jacobian has a single constant element, 1, and is no longer a function of x
1×1 Matrix{FastDifferentiation.Node}:
1
julia> fjac = make_function(jac,[x])
...
julia> fjac(2.0) #the value 2.0 is passed in for the variable x but has no effect on the output. Does not cause a runtime exception.
1×1 Matrix{Float64}:
1.0
If in_place=false then a new array will be created to hold the result each time the function is called. If in_place=true the function expects a user-supplied array to hold the result. The user-supplied array must be the first argument to the function.
julia> @variables x
x
julia> f! = make_function([x,x^2],[x],in_place=true)
...
julia> result = zeros(2)
2-element Vector{Float64}:
0.0
0.0
julia> f!(result,[2.0])
4.0
julia> result
2-element Vector{Float64}:
2.0
4.0
If the array is sparse then the keyword argument init_with_zeros has no effect. If the array is dense and in_place=true then the keyword argument init_with_zeros affects how the in-place array is initialized. If init_with_zeros=true then the in-place array is initialized with zeros. If init_with_zeros=false it is the user's responsibility to initialize the array with zeros before passing it to the runtime-generated function.
This can be useful for dense matrices that are only modestly sparse, say with at least 1/4 of the entries non-zero. In this case a sparse matrix may not be as efficient as a dense matrix, but a large fraction of time could be spent unnecessarily setting elements to zero on every call. Instead you can initialize the in-place Jacobian array once with zeros before calling the runtime-generated function.
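A minimal sketch of that pattern, assuming the generated code only writes the structurally non-zero entries when init_with_zeros=false:
@variables x y
jac = jacobian([x*y, y + 1.0], [x, y])   # illustrative Jacobian with a structural zero: [y x; 0 1]
fjac! = make_function(jac, [x, y], in_place=true, init_with_zeros=false)
result = zeros(2, 2)                     # zero the buffer once, up front
fjac!(result, [1.0, 2.0])                # repeated calls never rewrite the zero entries
fjac!(result, [3.0, 4.0])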
FastDifferentiation.make_variables
— Method
make_variables(name::Symbol, array_size::T...)
Returns an Array of variables with names corresponding to their indices in the Array.
Example:
julia> make_variables(:x,3)
3-element Vector{FastDifferentiation.Node}:
x1
x2
x3
julia> make_variables(:x,2,3)
2×3 Matrix{FastDifferentiation.Node}:
x1_1 x1_2 x1_3
x2_1 x2_2 x2_3
julia> make_variables(:x,2,3,2)
2×3×2 Array{FastDifferentiation.Node, 3}:
[:, :, 1] =
x1_1_1 x1_2_1 x1_3_1
x2_1_1 x2_2_1 x2_3_1
[:, :, 2] =
x1_1_2 x1_2_2 x1_3_2
x2_1_2 x2_2_2 x2_3_2
FastDifferentiation.sparse_hessian
— Method
sparse_hessian(expression::Node, variable_order::AbstractVector{<:Node})
Computes a sparse symbolic Hessian, returned as a sparse matrix of symbolic expressions. Can be used in combination with make_function to generate an executable function that returns a sparse matrix or takes one as an in-place argument.
Example
julia> @variables x y
julia> a = sparse_hessian(x*y,[x,y])
2×2 SparseArrays.SparseMatrixCSC{FastDifferentiation.Node, Int64} with 2 stored entries:
⋅ 1
1 ⋅
julia> f1 = make_function(a,[x,y])
...
julia> f1([1.0,2.0])
2×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
⋅ 1.0
1.0 ⋅
julia> tmp = similar(a,Float64)
2×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
⋅ 4.24399e-314
4.24399e-314 ⋅
julia> f2 = make_function(a,[x,y],in_place=true)
...
julia> f2(tmp, [1.0,2.0])
2×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
⋅ 1.0
1.0 ⋅
julia> tmp
2×2 SparseArrays.SparseMatrixCSC{Float64, Int64} with 2 stored entries:
⋅ 1.0
1.0 ⋅
FastDifferentiation.sparse_jacobian
— Method
sparse_jacobian(
    terms::AbstractVector{<:Node},
    partial_variables::AbstractVector{<:Node}
)
Returns a sparse array containing the Jacobian of the function defined by terms.
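A minimal sketch, assuming the call form matches the dense jacobian shown above:
@variables x y
sjac = sparse_jacobian([x*y, y*x], [x, y])   # SparseMatrixCSC of symbolic expressions
fjac = make_function(sjac, [x, y])           # the generated function returns a sparse matrix
fjac([1.0, 2.0])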
FastDifferentiation.sparsity
— Method
sparsity(sym_func::AbstractArray{<:Node})
Computes a number representing the sparsity of the array of expressions. If nelts is the number of elements in the array and nzeros is the number of zero elements in the array, then sparsity = (nelts - nzeros)/nelts.
Frequently used in combination with a call to make_function to determine whether to set the keyword argument init_with_zeros to false.
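A minimal sketch of that use, based on the formula above; the 0.25 threshold echoes the guideline in the make_function entry:
@variables x y
jac = jacobian([x*y, y + 1.0], [x, y])   # symbolic Jacobian [y x; 0 1], one zero entry
s = sparsity(jac)                        # (4 - 1)/4 = 0.75 by the formula above
# With at least 1/4 of the entries non-zero, a dense in-place array with
# init_with_zeros=false (and a single up-front zero fill) may be preferable.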