
ruvector-attention-wasm

Version: 0.1.0

High-performance attention mechanisms for WebAssembly - Transformer, Hyperbolic, Flash, MoE, and Graph attention

package.json: 38 lines (37 loc) 940 B
{ "name": "ruvector-attention-wasm", "version": "0.1.0", "type": "module", "description": "High-performance attention mechanisms for WebAssembly - Transformer, Hyperbolic, Flash, MoE, and Graph attention", "license": "MIT OR Apache-2.0", "repository": { "type": "git", "url": "https://github.com/ruvnet/ruvector" }, "homepage": "https://github.com/ruvnet/ruvector", "author": "rUv <ruv@ruv.io>", "keywords": [ "wasm", "webassembly", "attention", "transformer", "machine-learning", "neural-networks", "hyperbolic", "moe", "flash-attention", "graph-attention", "rust" ], "files": [ "ruvector_attention_wasm_bg.wasm", "ruvector_attention_wasm.js", "ruvector_attention_wasm.d.ts", "ruvector_attention_wasm_bg.wasm.d.ts" ], "main": "ruvector_attention_wasm.js", "types": "ruvector_attention_wasm.d.ts", "sideEffects": [ "./snippets/*" ] }