Search
@presidio-dev/hai-guardrails
A set of guards for LLM Apps
v1.10.1 URL:
https://unpkg.com/@presidio-dev/hai-guardrails@1.10.1/dist/index.cjs
Open
Browse Files
presidio
guardrails
llm
hai
redaction
security
defence
governance
guards
human-ai
prompt-injection
llm-guardrails
halucination
llm-guard
A TypeScript library for validating and securing LLM prompts
v0.1.8 URL:
https://unpkg.com/llm-guard@0.1.8/dist/index.js
Open
Browse Files
llm
security
validation
prompt
jailbreak
pii
toxicity
profanity
prompt-injection
relevance
prompt-cop
A lightweight security tool to detect potential prompt injection vulnerabilities in code files
v1.0.4 URL:
https://unpkg.com/prompt-cop@1.0.4/index.js
Open
Browse Files
security
prompt-injection
vulnerability-scanner
code-analysis
cli
breaker-ai
CLI to scan prompts for injection risks
v1.0.2 URL:
https://unpkg.com/breaker-ai@1.0.2
Open
Browse Files
breaker
ai
prompt
injection
security
breaker-ai
cli
prompt-security
prompt-injection
prompt-safety
prompt-risk
prompt-scanner
prompt-checker
prompt-validator
prompt-manners
A prompt injection mitigation library
v0.0.11 URL:
https://unpkg.com/prompt-manners@0.0.11/dist/index.mjs
Open
Browse Files
llm
ai
prompt-injection
injection
security
security-tool