LangChain vulnerable to code injection
Critical severity
GitHub Reviewed
Published
Apr 5, 2023
to the GitHub Advisory Database
•
Updated Sep 30, 2024
Description
Published by the National Vulnerability Database
Apr 5, 2023
Published to the GitHub Advisory Database
Apr 5, 2023
Reviewed
Apr 5, 2023
Last updated
Sep 30, 2024
In LangChain through 0.0.131, the
LLMMathChain
chain allows prompt injection attacks that can execute arbitrary code via the Pythonexec()
method.References