node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
v3.11.0 URL: https://unpkg.com/node-llama-cpp@3.11.0/dist/index.js
Keywords: llama, llama-cpp, llama.cpp, bindings, ai, cmake, cmake-js, prebuilt-binaries, llm, gguf, metal, cuda, vulkan, grammar, embedding, rerank, reranking, json-grammar, json-schema-grammar, functions, function-calling, token-prediction, speculative-decoding, temperature, minP, topK, topP, seed, json-schema, raspberry-pi, self-hosted, local, catai, mistral, deepseek, qwen, qwq, typescript, lora, batching, gpu
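
The "Enforce a JSON schema on the model output at the generation level" claim refers to grammar-constrained sampling: the schema is compiled into a llama.cpp grammar, so each sampled token keeps the output valid against the schema rather than validating text after the fact. Below is a minimal sketch of that flow following the node-llama-cpp v3 documentation; the model file name and the schema itself are placeholders, not part of this listing.

```ts
import path from "path";
import {fileURLToPath} from "url";
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Load a local GGUF model (placeholder file name) and open a chat session
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "model.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// Compile a JSON schema into a grammar; generation is then constrained
// so the produced text always matches the schema
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        summary: {type: "string"},
        sentiment: {enum: ["positive", "neutral", "negative"]}
    }
} as const);

const answer = await session.prompt("Summarize this review: ...", {grammar});
const parsed = grammar.parse(answer); // typed object matching the schema
console.log(parsed.summary, parsed.sentiment);
```

Because the constraint is applied while tokens are sampled, the result can be parsed directly instead of being validated and retried after generation.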

@aibrow/node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
v1.7.0 URL: https://unpkg.com/@aibrow/node-llama-cpp@1.7.0/dist/index.js
Keywords: llama, llama-cpp, llama.cpp, bindings, ai, cmake, cmake-js, prebuilt-binaries, llm, gguf, metal, cuda, vulkan, grammar, embedding, rerank, reranking, json-grammar, json-schema-grammar, functions, function-calling, token-prediction, speculative-decoding, temperature, minP, topK, topP, seed, json-schema, raspberry-pi, self-hosted, local, catai, mistral, deepseek, qwen, qwq, typescript, lora, batching, gpu

quiad

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.
v1.3.1 URL: https://unpkg.com/quiad@1.3.1/dist/index.js
Keywords: similique, id, llama.cpp, bindings, ai, sequi, blanditiis, error, llm, laboriosam, metal, cuda, grammar, json-grammar, json-schema-grammar, temperature, cupiditate, assumenda, json-schema, raspberry-pi, self-hosted, vel, praesentium

custom-koya-node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.
v0.1.0 URL: https://unpkg.com/custom-koya-node-llama-cpp@0.1.0/dist/index.js
Keywords: llama, llama-cpp, llama.cpp, bindings, ai, cmake, cmake-js, prebuilt-binaries, llm, gguf, metal, cuda, grammar, json-grammar, json-schema-grammar, temperature, topK, topP, json-schema, raspberry-pi, self-hosted, local, catai