"Effortlessly harness the power of LLMs on Excel and DataFrames—seamless, smart, and efficient!"
LLMWorkbook is a Python package designed to integrate Large Language Models (LLMs) into your workflow with tabular data, whether it lives in Excel, CSV files, DataFrames, or arrays. It lets you configure an LLM, send prompts row-wise from any tabular dataset, and store the responses back in the DataFrame with minimal effort.
For comprehensive guides, examples, and the API reference, visit the dedicated documentation website.
- Map LLM responses to a specific column in a pandas DataFrame, Excel workbook, or CSV file.
- Run a list of prompts easily (a short sketch follows this list).
- Get started quickly with easy-to-follow examples.
- New: support for the OpenAI Responses endpoint.
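
As a rough illustration of the prompt-list workflow, here is a minimal sketch. It assumes the runner exposes a synchronous `run_sync` method; the exact method name may differ, so treat this as an outline and check the documentation for the actual API.

```python
from llmworkbook import LLMConfig, LLMRunner

# Minimal sketch; `run_sync` is an assumed entry point, consult the docs for the real API.
config = LLMConfig(
    provider="openai",
    system_prompt="Answer each prompt concisely.",
    options={"model": "gpt-4o-mini", "temperature": 1, "max_tokens": 256},
)
runner = LLMRunner(config)

prompts = ["Summarize row 1", "Summarize row 2"]
responses = [runner.run_sync(p) for p in prompts]  # assumed synchronous call per prompt
print(responses)
```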
Install the package with pip:

```bash
pip install llmworkbook
```

LLMWorkbook provides wrapper utilities that prepare various data formats for LLM consumption. These utilities transform input data into a format suitable for LLM processing, ensuring consistency and compatibility.
These wrapper methods handle popular data sources such as Excel (xlsx), CSV, pandas DataFrames, and multi-dimensional arrays.
See the examples in the GitHub repository for details.
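
As a rough illustration of the wrapping step, the sketch below assumes a `WrapDataFrame` helper that takes a prompt column and data columns; the class name and parameters are assumptions here, so treat it as an outline rather than the definitive API.

```python
import pandas as pd
from llmworkbook import WrapDataFrame  # assumed wrapper name; check the docs

df = pd.DataFrame(
    {
        "prompt_text": ["Classify the sentiment", "Classify the sentiment"],
        "review": ["Great product!", "Terrible support."],
    }
)

# Combine the prompt column and supporting data columns into an LLM-ready format
# (parameter names are assumptions).
wrapper = WrapDataFrame(df, prompt_column="prompt_text", data_columns=["review"])
wrapped_df = wrapper.wrap()
print(wrapped_df.head())
```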
A typical end-to-end flow looks like this:

```python
import pandas as pd
from llmworkbook import LLMConfig, LLMRunner, LLMDataFrameIntegrator

# Provide a DataFrame with a prompt column (small example data shown here)
data = {"prompt_text": ["Summarize: LLMs are large language models."]}
df = pd.DataFrame(data)

config = LLMConfig(
    provider="openai",
    system_prompt="Process these data rows as per the provided prompt",
    options={
        "model": "gpt-4o-mini",
        "temperature": 1,
        "max_tokens": 1024,
    },
)

runner = LLMRunner(config)
integrator = LLMDataFrameIntegrator(runner=runner, df=df)

updated_df = integrator.add_llm_responses(
    prompt_column="prompt_text",
    response_column="llm_response",
    async_mode=False,  # set to True for asynchronous requests
)
```

Complete example code is available in the GitHub repository for easy reference.
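
For larger tables, asynchronous mode can help. The snippet below simply flips the `async_mode` flag shown above; everything else is unchanged.

```python
# Send requests asynchronously; useful when the DataFrame has many rows.
updated_df = integrator.add_llm_responses(
    prompt_column="prompt_text",
    response_column="llm_response",
    async_mode=True,
)
print(updated_df[["prompt_text", "llm_response"]])
```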
- Add support for more LLM providers (Google VertexAI, Cohere, Groq, MistralAI).
- Add a frontend interface for low-code applications.
- Implement rate-limiting and token usage tracking.
- Persist summarized history across sessions to provide quick context for the next session.
Detailed documentation for each module is available in the documentation:
- Wrapping data
- Providers: OpenAI, GPT4All, Ollama
- CLI usage
- LLMDataFrameIntegrator: row/batch processing
Homepage · Repository · Documentation · Examples · Bug Tracker · Issues