Autograd can automatically differentiate native Python and Numpy code. It can
handle a large subset of Python's features, including loops, ifs, recursion and
closures, and it can even take derivatives of derivatives of derivatives. It
supports reverse-mode differentiation (a.k.a. backpropagation), which means it
can efficiently take gradients of scalar-valued functions with respect to
array-valued arguments, as well as forward-mode differentiation, and the two
can be composed arbitrarily. The main intended application of Autograd is
gradient-based optimization.

WWW: https://github.com/HIPS/autograd
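As a minimal sketch of the usage described above (illustrative only, not part
of the upstream description; it assumes the autograd package is importable),
the library's grad function can be applied to an ordinary Python function and
re-applied to its result to take higher-order derivatives:

    import autograd.numpy as np   # thin wrapper around Numpy that traces operations
    from autograd import grad     # grad(f) returns a function that computes df/dx

    def tanh(x):
        # plain Python/Numpy code; no special annotations required
        return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

    d_tanh = grad(tanh)           # first derivative via reverse-mode differentiation
    dd_tanh = grad(d_tanh)        # derivatives compose: derivative of the derivative
    print(d_tanh(1.0), dd_tanh(1.0))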