index.md: +8 −7
@@ -33,9 +33,9 @@ The [autodiff.org](http://www.autodiff.org/) site serves as a portal for the aca
 
 ## The Big List
 
-What follows is a big list of Julia differentiation packages and related tooling, last updated in January 2024.
+What follows is a big list of Julia differentiation packages and related tooling, last updated in February 2025.
 If you notice something inaccurate or outdated, please [open an issue](https://github.com/JuliaDiff/juliadiff.github.io/issues) to signal it.
-The packages marked as inactive are those which have had no release in 2023.
+The packages marked as inactive are those which have had no release since 2023.
 
 The list aims to be comprehensive in coverage.
 By necessity, this means it is not comprehensive in detail.
@@ -46,9 +46,8 @@ It is worth investigating each package yourself to really understand its ins and
 - [JuliaDiff/ReverseDiff.jl](https://github.com/JuliaDiff/ReverseDiff.jl): Operator overloading AD backend
 - [FluxML/Zygote.jl](https://github.com/FluxML/Zygote.jl): Source transformation AD backend
 - [EnzymeAD/Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl): LLVM-level source transformation AD backend
+- [compintell/Mooncake.jl](https://github.com/compintell/Mooncake.jl): Source transformation AD backend
 - [FluxML/Tracker.jl](https://github.com/FluxML/Tracker.jl): Operator overloading AD backend
-- [compintell/Tapir.jl](https://github.com/compintell/Tapir.jl): Source transformation AD backend (experimental)
-- [dfdx/Yota.jl](https://github.com/dfdx/Yota.jl): Source transformation AD backend
 
 ### Forward mode automatic differentiation
 
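The backends in this hunk (ReverseDiff.jl, Zygote.jl, Enzyme.jl, Mooncake.jl, Tracker.jl) all compute reverse-mode gradients, though each exposes its own entry point. As a rough illustration of how such a backend is typically called, here is a minimal sketch with Zygote.jl; the function `f` and the input `x` are invented for the example.

```julia
using Zygote  # reverse-mode, source-transformation backend

# Toy scalar-valued function and input, made up for this sketch
f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

# `gradient` returns one gradient per argument, as a tuple
grad, = Zygote.gradient(f, x)
grad ≈ 2 .* x  # true: d/dx sum(x.^2) = 2x
```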
@@ -72,11 +71,12 @@ It is worth investigating each package yourself to really understand its ins and
 - [JuliaDiff/TaylorSeries.jl](https://github.com/JuliaDiff/TaylorSeries.jl): Taylor polynomial expansions in one or more variables
 - [JuliaDiff/TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl): Higher order directional derivatives (experimental)
 - [JuliaDiff/Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): Source transformation AD backend (experimental)
+- [bmad-sim/GTPSA.jl](https://github.com/bmad-sim/GTPSA.jl): Truncated power series
 
 ### Interfaces
 
-- [gdalle/DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl): Generic interface for first- and second-order differentiation with any AD backend on 1-argument functions (`f(x) = y` or `f!(y, x)`).
-- [JuliaDiff/AbstractDifferentiation.jl](https://github.com/JuliaDiff/AbstractDifferentiation.jl): Generic interface for first- and second-order differentiation with a subset of AD backends on functions with more than one argument (will soon wrap DifferentiationInterface.jl).
+- [JuliaDiff/DifferentiationInterface.jl](https://github.com/JuliaDiff/DifferentiationInterface.jl): Generic interface for first- and second-order differentiation with any AD backend, including sparsity handling.
+- [JuliaDiff/AbstractDifferentiation.jl](https://github.com/JuliaDiff/AbstractDifferentiation.jl): Generic interface for first- and second-order differentiation with a subset of AD backends (precursor to DifferentiationInterface.jl).
 
 ### Rulesets
 
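DifferentiationInterface.jl, described in the hunk above, lets the same call work with any supported backend. A minimal sketch follows, assuming the backend objects (`AutoForwardDiff()` and friends) are the ADTypes.jl types re-exported by the package; the function and input are invented for the example.

```julia
using DifferentiationInterface
import ForwardDiff  # the chosen backend package must be loaded alongside the interface

f(x) = sum(abs2, x)            # toy function, made up for this sketch
x = [1.0, 2.0, 3.0]

backend = AutoForwardDiff()    # swap for AutoZygote(), AutoEnzyme(), ... without touching the rest
grad = gradient(f, backend, x)
hess = hessian(f, backend, x)  # second-order operators follow the same pattern
```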
@@ -93,9 +93,9 @@ These packages define derivatives for basic functions, and enable users to do th
 
 ### Sparsity
 
-- [JuliaDiff/SparseDiffTools.jl](https://github.com/JuliaDiff/SparseDiffTools.jl): Exploit sparsity to speed up FiniteDiff.jl and ForwardDiff.jl, as well as other algorithms.
 - [adrhill/SparseConnectivityTracer.jl](https://github.com/adrhill/SparseConnectivityTracer.jl): Sparsity pattern detection for Jacobians and Hessians.
 - [gdalle/SparseMatrixColorings.jl](https://github.com/gdalle/SparseMatrixColorings.jl): Efficient coloring and decompression algorithms for sparse Jacobians and Hessians.
+- [JuliaDiff/SparseDiffTools.jl](https://github.com/JuliaDiff/SparseDiffTools.jl): Exploit sparsity to speed up FiniteDiff.jl and ForwardDiff.jl, as well as other algorithms.
 
 ### Differentiating through more stuff
 
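The two sparsity packages above are commonly combined: SparseConnectivityTracer.jl detects which Jacobian entries can be nonzero, and SparseMatrixColorings.jl groups columns so that only a few dense passes are needed. A hedged sketch follows, assuming the `AutoSparse` wrapper and its keyword names from ADTypes.jl as used through DifferentiationInterface.jl; the function `f` is invented for the example.

```julia
using DifferentiationInterface
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm
import ForwardDiff

f(x) = diff(x)  # banded Jacobian: each output depends on only two inputs
x = rand(10)

# Wrap a dense backend with a sparsity detector and a coloring algorithm
sparse_backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)

J = jacobian(f, sparse_backend, x)  # returned in sparse storage, structural zeros skipped
```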
@@ -111,6 +111,7 @@ Some complex algorithms are not natively differentiable, which is why derivative