Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It uses reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments. The main intended application is gradient-based optimization.
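As an illustration of that intended use, here is a minimal sketch (not part of this package's files; the function names are made up for the example) of autograd's grad API differentiating a scalar-valued NumPy function and then differentiating the derivative again:

    import autograd.numpy as np   # NumPy wrapper that autograd can trace
    from autograd import grad

    def tanh(x):
        # An ordinary scalar-valued Python/NumPy function.
        return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

    d_tanh = grad(tanh)      # reverse-mode derivative of tanh
    dd_tanh = grad(d_tanh)   # a derivative of a derivative also works

    print(d_tanh(1.0), dd_tanh(1.0))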
$NetBSD: distinfo,v 1.1 2016/08/24 23:50:12 markd Exp $
SHA1 (autograd-1.1.5.tar.gz) = 1ed7727ac1d634b47b9ebe7244a851e76e3edd81
RMD160 (autograd-1.1.5.tar.gz) = 27ae3c0ef6a69141c1dddaa5975640f35ad63d94
SHA512 (autograd-1.1.5.tar.gz) = 4c41363acc2fbddad9bf587b6f6b9dbe151c0c1ef95059b192262f6d4eec2309e69d906f40bb3b39677323735af20ba7706993267e2b91607b251b09ea61aa7c
Size (autograd-1.1.5.tar.gz) = 24986 bytes
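These digests and the size are what the build framework checks the downloaded distfile against. A rough Python sketch of the same verification, assuming autograd-1.1.5.tar.gz has already been fetched into the current directory (RMD160 is omitted because hashlib support for it depends on the local OpenSSL build):

    import hashlib

    # Digests and size recorded in this distinfo file.
    EXPECTED_SHA1 = "1ed7727ac1d634b47b9ebe7244a851e76e3edd81"
    EXPECTED_SHA512 = ("4c41363acc2fbddad9bf587b6f6b9dbe151c0c1ef95059b192262f6d4eec2309"
                       "e69d906f40bb3b39677323735af20ba7706993267e2b91607b251b09ea61aa7c")
    EXPECTED_SIZE = 24986

    data = open("autograd-1.1.5.tar.gz", "rb").read()
    assert len(data) == EXPECTED_SIZE
    assert hashlib.sha1(data).hexdigest() == EXPECTED_SHA1
    assert hashlib.sha512(data).hexdigest() == EXPECTED_SHA512
    print("autograd-1.1.5.tar.gz matches distinfo")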