# TinyGrad
A JavaScript/TypeScript autograd engine with operator overloading, inspired by [micrograd](https://github.com/karpathy/micrograd).
## Features
- 🔥 **Automatic Differentiation**: Full backpropagation support for scalar values
- ⚡ **Operator Overloading**: Natural mathematical syntax via a build-time operator-overloading transform
- 🧠 **Neural Networks**: Built-in neuron, layer, and MLP implementations
- 📦 **Lightweight**: Zero dependencies for the core library
- 🎯 **TypeScript**: Fully typed with excellent IDE support
- 🌐 **Universal**: Works in browsers and Node.js
## Installation
```bash
npm install tinygrad
# or
pnpm add tinygrad
# or
yarn add tinygrad
# or
bun add tinygrad
```
## Quick Start
```typescript
"use operator overloading";
import { engine, nn } from "tinygrad";
const { Value } = engine;
const { MLP } = nn;
// Scalar operations with automatic differentiation
const a = new Value(2.0);
const b = new Value(-3.0);
const c = new Value(10.0);
const e = a * b;
const d = e + c;
const f = d.relu();
// Compute gradients
f.backward();
console.log(f.data); // 4.0
console.log(a.grad); // -3.0
console.log(b.grad); // 2.0
// Build a neural network
const model = new MLP(3, [4, 4, 1]); // 3 inputs, 2 hidden layers of 4 neurons, 1 output
const x = [
  new Value(2.0),
  new Value(3.0),
  new Value(-1.0)
];
const output = model.call(x);
console.log(output.data); // Forward pass result
```
## API Reference
### Value
The `Value` class represents a scalar value with gradient tracking.
**Constructor:**
```typescript
new Value(data: number, children?: Value[], _op?: string)
```
**Supported Operations:**
- `add(other)` or `+` - Addition
- `sub(other)` or `-` - Subtraction
- `mul(other)` or `*` - Multiplication
- `div(other)` or `/` - Division
- `pow(n)` or `**` - Power
- `neg()` or unary `-` - Negation
- `relu()` - ReLU activation
**Methods:**
- `backward()` - Compute gradients via backpropagation
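Without the build-time plugin (see [Operator Overloading](#operator-overloading) below), the same expressions can be written with the documented methods directly. A minimal sketch; the gradient values in the comments follow from the chain rule, not from any extra API:

```typescript
import { engine } from "tinygrad";
const { Value } = engine;

const x = new Value(3.0);
const y = new Value(4.0);

// z = relu(x * y + x^2), written with explicit method calls
const z = x.mul(y).add(x.pow(2)).relu();
z.backward();

console.log(z.data); // 21
console.log(x.grad); // dz/dx = y + 2x = 10
console.log(y.grad); // dz/dy = x = 3
```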
### Neural Network Modules
#### Neuron
```typescript
new Neuron(nin: number, nonlin: boolean = true)
```
#### Layer
```typescript
new Layer(nin: number, nout: number, nonlin: boolean = true)
```
#### MLP (Multi-Layer Perceptron)
```typescript
new MLP(nin: number, nouts: number[])
```
**Methods:**
- `call(x: Value[])` - Forward pass
- `parameters()` - Get all trainable parameters
- `zeroGrad()` - Reset gradients to zero
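As a quick sanity check, the constructors and `parameters()` compose as you would expect. A sketch assuming `Neuron` is exported from `nn` alongside `MLP`, that `parameters()` is available on every module, and that each neuron holds one weight per input plus a bias (as in micrograd):

```typescript
import { nn } from "tinygrad";
const { Neuron, MLP } = nn; // Neuron export assumed; see note above

// A single linear neuron over 2 inputs (nonlin = false skips the ReLU)
const neuron = new Neuron(2, false);
console.log(neuron.parameters().length); // 3: two weights + one bias (assumed layout)

// 3 -> [4, 4, 1]: two hidden ReLU layers of 4 neurons, then 1 output
const model = new MLP(3, [4, 4, 1]);

// (3*4 + 4) + (4*4 + 4) + (4*1 + 1) = 41 trainable parameters
console.log(model.parameters().length);

// Clear accumulated gradients before the next backward pass
model.zeroGrad();
```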
## Operator Overloading
TinyGrad uses the `unplugin-op-overloading` plugin to enable natural mathematical syntax. Add the following to the top of your file:
```typescript
"use operator overloading";
```
This enables:
```typescript
const x = new Value(2);
const y = new Value(3);
const z = x * y + x ** 2; // Much cleaner than z = x.mul(y).add(x.pow(2))
```
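Note that the directive only takes effect once the transform plugin is registered with your bundler; without it, the `*`/`+` expressions above will not be rewritten. The snippet below is an assumed Vite setup, relying on the usual unplugin convention of a `/vite` entry point and a default export — check the `unplugin-op-overloading` docs for the authoritative configuration:

```typescript
// vite.config.ts — assumed wiring, not taken from this project's docs
import { defineConfig } from "vite";
import OpOverloading from "unplugin-op-overloading/vite"; // entry path assumed

export default defineConfig({
  plugins: [OpOverloading()],
});
```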
## Training Example
```typescript
"use operator overloading";
import { engine, nn } from "tinygrad";
const { Value } = engine;
const { MLP } = nn;
// Dataset
const X = [[2, 3, -1], [3, -1, 0.5], [0.5, 1, 1], [1, 1, -1]];
const y = [1, -1, -1, 1]; // targets
const model = new MLP(3, [4, 4, 1]);
// Training loop
for (let i = 0; i < 100; i++) {
  // Forward pass
  const inputs = X.map(row => row.map(x => new Value(x)));
  const scores = inputs.map(x => model.call(x));

  // Loss (sum of squared errors)
  let loss = new Value(0);
  for (let j = 0; j < y.length; j++) {
    const diff = scores[j] - new Value(y[j]);
    loss = loss + diff * diff;
  }

  // Backward pass
  model.zeroGrad();
  loss.backward();

  // Update (SGD)
  const lr = 0.01;
  for (const p of model.parameters()) {
    p.data -= lr * p.grad;
  }

  if (i % 10 === 0) {
    console.log(`Step ${i}, Loss: ${loss.data}`);
  }
}
```
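After the loop finishes, you can sanity-check the fit by running the trained model over the dataset; the predictions should approach the targets `[1, -1, -1, 1]` as the loss falls (a sketch reusing the same `call` API as above):

```typescript
// Inspect predictions after training
X.forEach((row, j) => {
  const pred = model.call(row.map(v => new Value(v)));
  console.log(`target ${y[j]}, predicted ${pred.data.toFixed(3)}`);
});
```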
## Demo
Check out the [interactive demo](https://github.com/yourusername/tinygrad) to see TinyGrad in action with:
- Real-time visualization of training progress
- Decision boundary visualization
- Interactive controls for learning rate and training steps
## Development
```bash
# Install dependencies
bun install
# Run development server (demo)
bun run dev
# Build library
bun run build:lib
# Type checking
bun run typecheck
```
## License
MIT
## Credits
Inspired by [micrograd](https://github.com/karpathy/micrograd) by Andrej Karpathy.