Ultra-fast, low latency LLM prompt injection/jailbreak detection ⛓️
jailbreak security-tools large-language-models prompt-engineering chatgpt-prompts llm-security llm-local llm-guard llm-guardrails
Updated Jul 26, 2024 · Python
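A minimal sketch of what heuristic prompt injection/jailbreak detection can look like. The pattern list, threshold-free boolean API, and function name below are illustrative assumptions, not this repository's actual rules or interface:

```python
import re

# Hypothetical pattern list of common jailbreak phrasings (illustrative only).
JAILBREAK_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (
        r"ignore (all |any )?(previous|prior) instructions",
        r"pretend (you are|to be) .+ (no|without) (restrictions|rules)",
        r"disregard (the )?(system|developer) prompt",
    )
]

def is_prompt_injection(text: str) -> bool:
    """Return True if any known jailbreak pattern matches the input text."""
    return any(p.search(text) for p in JAILBREAK_PATTERNS)
```

Regex matching like this is cheap (microseconds per prompt), which is one way a detector can stay "ultra-fast, low latency"; production detectors typically combine such heuristics with a small classifier model.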