Built on LMArena.ai's FastChat foundation, this is a modified version for testing LLMs locally. If the local FastChat setup fails, it connects to Ollama and runs the test there instead.
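A minimal sketch of that fallback flow, assuming default local ports and illustrative model names (`vicuna-7b-v1.5`, `llama3` are placeholders, not part of this repo): try a locally served FastChat OpenAI-compatible endpoint first, and if the request fails, send the same prompt to Ollama.

```python
# Sketch: query a local FastChat OpenAI-compatible server, fall back to Ollama on failure.
import requests

PROMPT = "Say hello in one sentence."

def ask_fastchat(prompt: str) -> str:
    # FastChat's OpenAI-compatible API server defaults to port 8000;
    # the model name here is an assumption for illustration.
    resp = requests.post(
        "http://localhost:8000/v1/chat/completions",
        json={
            "model": "vicuna-7b-v1.5",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def ask_ollama(prompt: str) -> str:
    # Ollama listens on port 11434 by default; "llama3" is an assumed model tag.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    try:
        print("FastChat:", ask_fastchat(PROMPT))
    except Exception as err:
        print(f"FastChat test failed ({err}); falling back to Ollama.")
        print("Ollama:", ask_ollama(PROMPT))
```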
python testing automation experimental-project opensource-projects beginner-projects llm fastchat llm-testing lmarena-ai
Updated Aug 20, 2025 - Python