gradiatorjs
GradiatorJS is a lightweight, from-scratch autodiff engine and neural network library written in TypeScript. It features an automatic differentiation engine that builds a computation graph to enable backpropagation on dynamic network architectures.
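
As a rough illustration of what such an engine does (a minimal sketch of the general technique, not GradiatorJS's actual API), here is scalar reverse-mode autodiff over a dynamically built computation graph in TypeScript:

```ts
// Each Value records its data, its parents in the graph, and a closure that
// propagates the output gradient back to those parents.
class Value {
  grad = 0;
  private backwardFn: () => void = () => {};
  constructor(public data: number, private parents: Value[] = []) {}

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += out.grad;
      other.grad += out.grad;
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += other.data * out.grad;
      other.grad += this.data * out.grad;
    };
    return out;
  }

  // Topologically sort the graph, then propagate gradients from the output back.
  backward(): void {
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value) => {
      if (visited.has(v)) return;
      visited.add(v);
      v.parents.forEach(build);
      topo.push(v);
    };
    build(this);
    this.grad = 1;
    for (const v of topo.reverse()) v.backwardFn();
  }
}

// Usage: z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
const x = new Value(3), y = new Value(2);
const z = x.mul(y).add(x);
z.backward();
console.log(z.data, x.grad, y.grad); // 9 3 3
```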
@2bad/micrograd
[npm](https://www.npmjs.com/package/@2bad/micrograd) · [MIT license](https://opensource.org/license/MIT)
scalar-autograd
Scalar-based reverse-mode automatic differentiation in TypeScript.
micrograd
A port of Karpathy's micrograd to JavaScript.
adnn.ts
adnn provides type-safe, JavaScript-native neural networks on top of general scalar/tensor reverse-mode automatic differentiation. You can use just the AD code, or the NN layer built on top of it. This architecture makes it easy to define big, complex numerical computations and compute derivatives with respect to their inputs/parameters.
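
A rough sketch of that layered design, reusing the `Value` class from the sketch above (adnn's real API differs; `Dense` and `sgdStep` are hypothetical names used here only to show a network layer written purely in terms of the AD primitives):

```ts
// A dense layer whose forward pass is ordinary Value arithmetic, so the
// computation graph (and therefore gradients) come for free from the AD core.
class Dense {
  weights: Value[][];
  bias: Value[];
  constructor(inDim: number, outDim: number) {
    this.weights = Array.from({ length: outDim }, () =>
      Array.from({ length: inDim }, () => new Value(Math.random() - 0.5))
    );
    this.bias = Array.from({ length: outDim }, () => new Value(0));
  }
  forward(xs: Value[]): Value[] {
    return this.weights.map((row, i) =>
      row.reduce((acc, w, j) => acc.add(w.mul(xs[j])), this.bias[i])
    );
  }
  params(): Value[] {
    return [...this.weights.flat(), ...this.bias];
  }
}

// One SGD step: zero the parameter gradients, backprop from a scalar loss,
// then nudge each parameter against its gradient.
function sgdStep(loss: Value, params: Value[], lr = 0.01): void {
  params.forEach(p => (p.grad = 0));
  loss.backward();
  params.forEach(p => (p.data -= lr * p.grad));
}

// Usage: fit y = 2 for input x = 1 with a 1->1 layer (purely illustrative).
const layer = new Dense(1, 1);
for (let step = 0; step < 100; step++) {
  const pred = layer.forward([new Value(1)])[0];
  const err = pred.add(new Value(-2));        // pred - target
  sgdStep(err.mul(err), layer.params(), 0.1); // squared-error loss
}
```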