CoolPrompt is a framework for automatic prompt creation and optimization.
- Automatic prompt engineering for solving tasks using LLMs
- (Semi-)automatic generation of markup for fine-tuning
- Formalized response-quality assessment using LLMs
- Prompt tuning for agent systems
- Optimize prompts with our autoprompting optimizers: HyPE, ReflectivePrompt, DistillPrompt
- LLM-agnostic: works with your custom LLM (from open-source to proprietary) through supported LangChain LLMs
- Generate synthetic evaluation data when no input dataset is provided
- Evaluate prompts incorporating multiple metrics for both classification and generation tasks
- Retrieve feedback to interpret prompt optimization results
- Automatic task detection for scenarios without an explicit user-defined task specification
- Install with pip:
pip install coolprompt
- Install with git:
git clone https://github.com/CTLab-ITMO/CoolPrompt.git
pip install -r requirements.txt
Import and initialize PromptTuner using the qwen3-4b-instruct model via HuggingFace:
from coolprompt.assistant import PromptTuner
prompt_tuner = PromptTuner()
prompt_tuner.run('Write an essay about autumn')
print(prompt_tuner.final_prompt)
# You are an expert writer and seasonal observer tasked with composing a rich,
# well-structured, and vividly descriptive essay on the theme of autumn...
See more examples in notebooks to familiarize yourself with our framework.
- The framework is developed by the Computer Technologies Lab (CT-Lab) at ITMO University.
- API Reference
- We welcome and value contributions and collaborations, so please contact us. Before submitting new code, check out CONTRIBUTING.md.
For technical details and full experimental results, please check our papers.
Publications
ReflectivePrompt
@misc{zhuravlev2025reflectivepromptreflectiveevolutionautoprompting,
title={ReflectivePrompt: Reflective evolution in autoprompting algorithms},
author={Viktor N. Zhuravlev and Artur R. Khairullin and Ernest A. Dyagin and Alena N. Sitkina and Nikita I. Kulin},
year={2025},
eprint={2508.18870},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2508.18870},
}
DistillPrompt
@misc{dyagin2025automaticpromptoptimizationprompt,
title={Automatic Prompt Optimization with Prompt Distillation},
author={Ernest A. Dyagin and Nikita I. Kulin and Artur R. Khairullin and Viktor N. Zhuravlev and Alena N. Sitkina},
year={2025},
eprint={2508.18992},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2508.18992},
}