
🎓 FrugalGPT: Better Quality and Lower Cost for LLM Applications

The FrugalGPT framework offers a collection of techniques for building LLM applications with budget constraints.
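One of the core techniques is the LLM cascade: route each query to a cheap model first and escalate to a stronger, more expensive model only when a scorer judges the cheap answer unreliable. The sketch below illustrates the idea with stub models and a made-up confidence scorer; it is not the FrugalGPT API, just a minimal illustration of the cascade strategy.

```python
# Toy sketch of an LLM cascade: try models from cheapest to priciest and
# stop as soon as one returns a sufficiently confident answer.
# All model names, costs, and confidence values here are hypothetical stubs.

from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class Model:
    name: str
    cost_per_query: float  # hypothetical cost in dollars
    # Each stub returns (answer, confidence in [0, 1]).
    answer: Callable[[str], Tuple[str, float]]


def cascade(query: str, models: List[Model], threshold: float = 0.8) -> Tuple[str, float]:
    """Query models in order of increasing cost; return the first answer
    whose confidence clears the threshold, along with total spend."""
    spent = 0.0
    answer = ""
    for model in sorted(models, key=lambda m: m.cost_per_query):
        spent += model.cost_per_query
        answer, confidence = model.answer(query)
        if confidence >= threshold:
            break  # cheap model was confident enough; stop here
    return answer, spent  # falls back to the last (strongest) model's answer


# Stub "models": the cheap one is only confident on short queries.
cheap = Model("small-llm", 0.001,
              lambda q: ("cheap answer", 0.9 if len(q) < 20 else 0.4))
strong = Model("big-llm", 0.03, lambda q: ("strong answer", 0.95))

print(cascade("short query", [cheap, strong]))
print(cascade("a much longer and harder query", [cheap, strong]))
```

In the second call the cheap model's confidence falls below the threshold, so the query escalates to the strong model and the total spend includes both calls; the first call never pays for the expensive model at all.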

🚀 Getting Started

You can directly run the Google Colab Notebook to experience FrugalGPT. You don't even need API keys to get started with it.

Once you go through the notebook, you'll be ready to build your own LLM applications with FrugalGPT!

🔧 Installation

You can also install FrugalGPT locally by running the following commands:

```bash
git clone https://github.com/stanford-futuredata/FrugalGPT
cd FrugalGPT
pip install git+https://github.com/stanford-futuredata/FrugalGPT
wget https://github.com/lchen001/DataHolder/releases/download/v0.0.1/HEADLINES.zip
unzip HEADLINES.zip -d strategy/
rm HEADLINES.zip
wget -P db/ https://github.com/lchen001/DataHolder/releases/download/v0.0.1/HEADLINES.sqlite
wget -P db/ https://github.com/lchen001/DataHolder/releases/download/v0.0.1/qa_cache.sqlite
```

Now you are ready to use the local intro notebook!

📚 Read More

You can learn more about FrugalGPT from:

  • An overview in our Twitter threads
  • The academic paper
  • A detailed blog post with code examples

📣 Updates & Changelog

🔹 2024.09.18 - Provided trade-off evaluation examples

  • ✅ Added examples for evaluating cost–quality trade-offs. For instance, you can run the following Colab notebook to evaluate the trade-offs achieved on the SCIQ dataset: Open in Colab

🔹 2024.09.10 - Added support for more recent models

  • ✅ Added support for several new models, including proprietary models such as GPT-4o, GPT-4-Turbo, and GPT-4o-mini, and open-source models such as Llama 3.1 (405B), Llama 3 (70B), and Gemma 2 (9B)
  • ✅ Released prompts and in-context examples used for SCIQ

🔹 2024.01.01 - Extracted API generations

  • ✅ Added the generations from 12 commercial LLM APIs for each dataset evaluated in the paper
  • ✅ Included both the input queries and their associated parameters (e.g., temperature and stop tokens)
  • ✅ Released them as CSV files here

🎯 Reference

If you use FrugalGPT in a research paper, please cite our work as follows:

```bibtex
@article{chen2023frugalgpt,
  title={FrugalGPT: How to Use Large Language Models While Reducing Cost and Improving Performance},
  author={Chen, Lingjiao and Zaharia, Matei and Zou, James},
  journal={arXiv preprint arXiv:2305.05176},
  year={2023}
}
```