gradiatorjs

GradiatorJS is a lightweight, from-scratch autodiff engine and neural network library written in TypeScript. It features an automatic differentiation engine that builds a computation graph, enabling backpropagation through dynamic network architectures.
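
To make the computation-graph idea concrete, here is a minimal TypeScript sketch of the reverse-mode pattern such engines are built around. The `Scalar` class and its methods are illustrative assumptions for this sketch, not GradiatorJS's actual API.

```typescript
// A tiny scalar autodiff node: each operation records its inputs and a
// closure that propagates gradients back to them (illustrative, not
// GradiatorJS's real API).
class Scalar {
  grad = 0;
  private backwardFn: () => void = () => {};
  constructor(public value: number, private parents: Scalar[] = []) {}

  add(other: Scalar): Scalar {
    const out = new Scalar(this.value + other.value, [this, other]);
    out.backwardFn = () => {
      this.grad += out.grad;
      other.grad += out.grad;
    };
    return out;
  }

  mul(other: Scalar): Scalar {
    const out = new Scalar(this.value * other.value, [this, other]);
    out.backwardFn = () => {
      this.grad += other.value * out.grad;
      other.grad += this.value * out.grad;
    };
    return out;
  }

  // Backpropagation: topologically sort the graph, then apply the chain rule
  // from this output node back to every input.
  backward(): void {
    const topo: Scalar[] = [];
    const visited = new Set<Scalar>();
    const build = (n: Scalar) => {
      if (visited.has(n)) return;
      visited.add(n);
      n.parents.forEach(build);
      topo.push(n);
    };
    build(this);
    this.grad = 1;
    for (const n of topo.reverse()) n.backwardFn();
  }
}

// Usage: for z = x*y + x, dz/dx = y + 1 = 4 and dz/dy = x = 2
const x = new Scalar(2), y = new Scalar(3);
const z = x.mul(y).add(x);
z.backward();
console.log(x.grad, y.grad); // 4 2
```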

@2bad/micrograd

[![NPM version](https://img.shields.io/npm/v/@2bad/micrograd)](https://www.npmjs.com/package/@2bad/micrograd) [![License](https://img.shields.io/npm/l/@2bad/micrograd)](https://opensource.org/license/MIT)

adnn.ts

adnn provides type-safe, JavaScript-native neural networks on top of general scalar/tensor reverse-mode automatic differentiation. You can use just the AD code, or the NN layer built on top of it. This architecture makes it easy to define big, complex numerical computations.
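
As a rough illustration of that layering, the sketch below writes a one-neuron "linear + tanh" unit purely in terms of AD primitives. The names are hypothetical, not adnn.ts's exports, and it uses forward-mode dual numbers rather than adnn's reverse mode only to keep the example short; the point is that the NN layer is ordinary code built on top of the AD layer.

```typescript
// Illustrative only: these names are assumptions, not adnn.ts's exports.
// Forward-mode dual numbers stand in for the AD layer to keep the sketch
// compact; adnn itself is described as reverse-mode.

// --- AD layer: a dual number carries a value and a derivative ---
type Dual = { val: number; der: number };
const lift = (x: number): Dual => ({ val: x, der: 0 });   // constant input
const param = (x: number): Dual => ({ val: x, der: 1 });  // differentiate w.r.t. this
const add = (a: Dual, b: Dual): Dual => ({ val: a.val + b.val, der: a.der + b.der });
const mul = (a: Dual, b: Dual): Dual => ({
  val: a.val * b.val,
  der: a.der * b.val + a.val * b.der, // product rule
});
const tanh = (a: Dual): Dual => {
  const t = Math.tanh(a.val);
  return { val: t, der: (1 - t * t) * a.der }; // chain rule
};

// --- NN layer: a one-neuron unit built only from the AD ops above ---
const neuron = (w: Dual, b: Dual, x: Dual): Dual => tanh(add(mul(w, x), b));

// Output value and d(output)/dw for w = 0.8, b = 0.1, x = 0.5
const out = neuron(param(0.8), lift(0.1), lift(0.5));
console.log(out.val, out.der);
```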