Security scanner for local LLMs scanning LLM vulnerabilities including jailbreaks, prompt injection, training data leakage, and adversarial abuse
ai jailbreak vulnerability pentest scanning pentesting-tools red-team-tools llm ollama llm-vulnerabilities ai-jailbreak-prompts ai-vulnerability-assessment llmstudio llm-pentesting
-
Updated
Nov 22, 2025 - Python