Starting a new coding project can be boring and repetitive. As a software developer, you waste countless hours creating the first draft of your project and burn precious energy deciding how to name your scripts. Multiply that by a couple of projects a month, and you're wasting hundreds of hours of your life.
Headstart is a multi-agent LLM application that solves all of those problems. Using a set of specialized LLM agents, it creates a draft of your project in the blink of an eye, saving you time, energy, and decision fatigue. It offers an interactive experience for adjusting the draft to your needs and fixing any imperfections the bots might have made.
Python 3.11 is strongly recommended. To install the necessary dependencies, run:
```shell
make install
```

To eliminate LLM expenses and make the program quicker to set up, Headstart uses a local model run by Ollama. Before running Headstart, make sure Ollama is installed and running on your system, and that the llama3.1 model is downloaded. The model requires around 5 GB of memory to run.
```shell
ollama pull llama3.1
ollama serve
```
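If you want to verify the setup before launching Headstart, Ollama's local HTTP API lists pulled models at `GET /api/tags`. The sketch below (an illustrative helper, not part of Headstart itself) separates the response parsing from the network call so the check is easy to reuse:

```python
# Sketch: verify that the llama3.1 model has been pulled into Ollama.
# check_ollama() assumes the default Ollama address http://localhost:11434.
import json
import urllib.request


def has_model(tags: dict, name: str) -> bool:
    """Return True if any pulled model's name starts with `name`.

    `tags` is the parsed JSON from Ollama's /api/tags endpoint, which has
    the shape {"models": [{"name": "llama3.1:latest", ...}, ...]}.
    """
    return any(m.get("name", "").startswith(name) for m in tags.get("models", []))


def check_ollama(base_url: str = "http://localhost:11434") -> bool:
    """Query the running Ollama server and check for llama3.1."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return has_model(json.load(resp), "llama3.1")
```

If `check_ollama()` raises a connection error, `ollama serve` is not running; if it returns `False`, run `ollama pull llama3.1` first.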
After installing the project, Headstart can be run from the CLI:
```shell
cd headstart
crewai run
```

This project is licensed under the MIT License.
You might get a `list index out of range` error on your first run of the tool. This is a known issue with litellm, described in this thread: https://community.crewai.com/t/list-index-out-of-range-msg-i/5612/8. It has not been fixed upstream yet, so you have to patch it manually: edit the `.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py` file, in the `ollama_pt` function around line 225:
```diff
 ## MERGE CONSECUTIVE ASSISTANT CONTENT ##
 while msg_i < len(messages) and messages[msg_i]["role"] == "assistant":
     assistant_content_str += convert_content_list_to_str(messages[msg_i])
-    msg_i += 1
     tool_calls = messages[msg_i].get("tool_calls")
     ollama_tool_calls = []
     if tool_calls:
         for call in tool_calls:
             call_id: str = call["id"]
             function_name: str = call["function"]["name"]
             arguments = json.loads(call["function"]["arguments"])
             ollama_tool_calls.append(
                 {
                     "id": call_id,
                     "type": "function",
                     "function": {
                         "name": function_name,
                         "arguments": arguments,
                     },
                 }
             )
     if ollama_tool_calls:
         assistant_content_str += (
             f"Tool Calls: {json.dumps(ollama_tool_calls, indent=2)}"
         )
-        msg_i += 1
+    msg_i += 1
```
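To see why the patch works: the unpatched loop increments `msg_i` before reading `tool_calls`, so when the last message in the list comes from the assistant, `messages[msg_i]` indexes past the end of the list. Below is a minimal, self-contained simulation of the patched loop. The stand-in `convert_content_list_to_str` is a simplification (litellm's real helper handles richer content formats), included only so the sketch runs on its own:

```python
# Simulation of the merge loop after the patch: one msg_i increment,
# at the bottom of the while body, after tool calls have been read.
import json


def convert_content_list_to_str(message):
    # Simplified stand-in for litellm's helper.
    return message.get("content") or ""


def merge_assistant_messages(messages, msg_i=0):
    """Merge consecutive assistant messages starting at msg_i."""
    assistant_content_str = ""
    while msg_i < len(messages) and messages[msg_i]["role"] == "assistant":
        assistant_content_str += convert_content_list_to_str(messages[msg_i])
        # tool_calls is read from the CURRENT message -- with the buggy
        # extra increment above this line, it was read from the next one.
        tool_calls = messages[msg_i].get("tool_calls")
        ollama_tool_calls = []
        if tool_calls:
            for call in tool_calls:
                ollama_tool_calls.append(
                    {
                        "id": call["id"],
                        "type": "function",
                        "function": {
                            "name": call["function"]["name"],
                            "arguments": json.loads(call["function"]["arguments"]),
                        },
                    }
                )
        if ollama_tool_calls:
            assistant_content_str += (
                f"Tool Calls: {json.dumps(ollama_tool_calls, indent=2)}"
            )
        msg_i += 1  # single increment, at while-loop level (the patch)
    return assistant_content_str, msg_i
```

With the single increment at loop level, `msg_i` always points at a valid message when `tool_calls` is read, and the loop exits cleanly when the assistant run ends or the list is exhausted.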