Efficiently computes derivatives of NumPy code
Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.
Release | Stable | Testing |
---|---|---|
Fedora Rawhide | 1.7.0-2.fc42 | - |
Fedora 42 | 1.7.0-2.fc42 | - |
Fedora 41 | 1.7.0-1.fc41 | 1.7.0-1.fc41 |
Fedora 40 | 1.7.0-1.fc40 | - |
You can contact the maintainers of this package via email at python-autograd dash maintainers at fedoraproject dot org.