
---
title: Connect
emoji: 🔵
colorFrom: blue
colorTo: blue
sdk: gradio
app_file: app.py
sdk_version: 5.15.0
pinned: false
python_version: 3.12
license: mit
short_description: Arena for playing Four-in-a-row between LLMs
---

Four-in-a-row Arena

A battleground for pitting LLMs against each other in the classic board game Connect.

It has been great fun making this Arena and watching LLMs duke it out!

Quick links:

If you'd like to learn more about this:

  • I have a best-selling intensive 8-week Mastering LLM engineering course that covers models and APIs, along with RAG, fine-tuning and Agents.
  • I'm running a number of Live Events with O'Reilly and Pearson.

Installing the code

  1. Clone the repo with git clone https://github.com/ed-donner/connect.git
  2. Change to the project directory with cd connect
  3. Create a Python virtual environment with python -m venv venv
  4. Activate your environment with either venv\Scripts\activate on Windows, or source venv/bin/activate on Mac/Linux
  5. Run pip install -r requirements.txt to install the required packages

If you wish to experiment with the prototype, run jupyter lab to launch Jupyter Lab, then open the notebook prototype.ipynb.

To launch the app locally, run python app.py.
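
If you haven't used Gradio before, the launch pattern behind app.py looks roughly like the sketch below. This is not the project's actual code, only a minimal illustration of how a Gradio Blocks app is started; the real app builds the Four-in-a-row board and model-selection UI.

```python
# Minimal illustration of the Gradio launch pattern (NOT the project's app.py).
import gradio as gr

with gr.Blocks() as demo:
    gr.Markdown("# Four-in-a-row Arena")  # placeholder content

if __name__ == "__main__":
    demo.launch()  # serves the UI locally, by default at http://127.0.0.1:7860
```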

Setting up your API keys

Please create a file with the exact name .env in the project root directory (connect).

You would typically use Notepad (Windows) or nano (Mac) for this.

If you're not familiar with setting up a .env file this way, ask ChatGPT! It will give much more eloquent instructions than me. 😂

Your .env file should contain the following; add whichever keys you would like to use.

OPENAI_API_KEY=sk-proj-...
ANTHROPIC_API_KEY=sk-ant-...
DEEPSEEK_API_KEY=sk...
GROQ_API_KEY=...
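
For reference, here is a minimal sketch of how keys defined this way are typically read at runtime. It assumes the python-dotenv package (a common choice for .env files); the actual loading code in app.py may differ.

```python
# Illustrative only: reads .env into environment variables with python-dotenv.
import os
from dotenv import load_dotenv

load_dotenv()  # searches the current and parent directories for a .env file

openai_key = os.getenv("OPENAI_API_KEY")
if openai_key is None:
    print("OPENAI_API_KEY not set - check that .env is in the project root")
```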

Optional - using Ollama

You can run Ollama locally, and the Arena will connect to it to run local models (see the sketch after the steps below).

  1. Download and install Ollama from https://ollama.com, noting that on a PC you might need administrator permissions for the install to work properly.
  2. On a PC, start a Command Prompt / PowerShell (press Win + R, type cmd, and press Enter). On a Mac, start a Terminal (Applications > Utilities > Terminal).
  3. Run ollama run llama3.2, or for smaller machines try ollama run llama3.2:1b.
  4. If this doesn't work, you may need to run ollama serve in another PowerShell (Windows) or Terminal (Mac), then try step 3 again.
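
As an illustration of how a client can reach local models once Ollama is running: Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1, so the standard openai client works against it. This is a sketch, not the Arena's actual code; the model name and prompt are examples only.

```python
# Example only: talk to a local Ollama model via its OpenAI-compatible API.
from openai import OpenAI

ollama = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")  # key is ignored by Ollama

reply = ollama.chat.completions.create(
    model="llama3.2",  # must already be pulled, e.g. via `ollama run llama3.2`
    messages=[{"role": "user", "content": "You are playing Four-in-a-row. Which column (1-7) do you choose?"}],
)
print(reply.choices[0].message.content)
```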
