tokenlens
A lightweight registry of LLM model metadata, such as names and context window sizes, for building AI-powered apps.
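A minimal usage sketch of the kind of lookup such a registry enables; the import, function name, and fields below (`getModel`, `contextWindow`) are assumptions for illustration, not the confirmed tokenlens API.

```ts
// Hypothetical sketch: names and fields are assumptions, not the confirmed API.
import { getModel } from "tokenlens";

const promptTokens = 12_000; // token count produced by your own tokenizer

// Look up static metadata for a model id and check the prompt against
// its advertised context window before sending a request.
const model = getModel("openai/gpt-4o-mini");
if (model && promptTokens > (model.contextWindow ?? 0)) {
  throw new Error("Prompt exceeds the model's context window");
}
```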
@tokenlens/models
Tree-shakeable static models.dev catalog split by provider for TokenLens.
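A sketch of what "split by provider" buys you; the subpath and export shape are assumptions used to illustrate the tree-shaking idea, not confirmed entry points.

```ts
// Hypothetical sketch: subpath and export names are assumptions.
import { openai } from "@tokenlens/models/openai";

// Because only the OpenAI slice of the models.dev catalog is imported,
// bundlers can drop every other provider's data from the final bundle.
console.log(Object.keys(openai));
```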
node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
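A sketch of loading a local model and constraining its output with a JSON schema, based on the node-llama-cpp v3 API as documented; exact names and options may differ between versions, and the model path is a placeholder.

```ts
import { getLlama, LlamaChatSession } from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({ modelPath: "./models/my-model.gguf" });
const context = await model.createContext();
const session = new LlamaChatSession({ contextSequence: context.getSequence() });

// Constrain generation to a JSON schema so the output is always parseable.
const grammar = await llama.createGrammarForJsonSchema({
  type: "object",
  properties: { answer: { type: "string" } },
});
const raw = await session.prompt("Reply with a short answer.", { grammar });
console.log(grammar.parse(raw).answer);
```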
snow-flow
ServiceNow development with SnowCode: 75+ LLM providers (Claude, GPT, Gemini, Llama, Mistral, DeepSeek, Groq, Ollama) • 410 optimized tools • 2 MCP servers • native Predictive Intelligence builder • multi-agent orchestration • works with any AI coding assistant.
@lenml/tokenizers
A lightweight, dependency-free fork of transformers.js (tokenizers only).
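A sketch assuming the fork keeps the transformers.js tokenizer API (`AutoTokenizer.from_pretrained` / `encode`); treat the names and the model id as assumptions rather than confirmed specifics of this package.

```ts
// Assumes the transformers.js-style tokenizer API; model id is a placeholder.
import { AutoTokenizer } from "@lenml/tokenizers";

const tokenizer = await AutoTokenizer.from_pretrained("Xenova/llama-tokenizer");
const ids = tokenizer.encode("Counting tokens without pulling in model code.");
console.log(ids.length);
```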