English | 简体中文

# OpenManusX 🙋

Manus and OpenManus are great, but OpenManus currently has no front-end. So I spent two hours building a simple WebUI on the Flask framework that calls OpenManus from the browser. 🛫

📑 The front-end page still needs continuous polish. Roadmap:

- OpenManusX
  - Open-source the initial version of the WebUI;
  - Support previews of PDF, PPT, Word, and Excel files, as well as syntax-highlighted code, in the OpenManusX file preview and save areas;
  - Beautify the LLM dialog output and improve the display of the OpenManus runtime log, e.g. with code highlighting;
  - Continuously polish the front end and back end toward fully automated execution.

Start your intelligent-agent journey with OpenManusX!

## Installation Guide

1. Create a new conda environment:

   ```shell
   conda create -n OpenManusX python=3.12
   conda activate OpenManusX
   ```

2. Clone the repository (installing OpenManus first makes the later installation of OpenManusX's WebUI very fast):

   ```shell
   git clone https://github.com/mannaandpoem/OpenManus.git
   cd OpenManus
   ```

3. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

4. Install OpenManusX (repository `OpenManusAI`) in either of two ways:

   ```shell
   # Option 1: clone the repository and install its dependencies
   git clone https://github.com/Shybert-AI/OpenManusAI.git
   cd OpenManusAI
   pip install -r requirements.txt
   ```

   ```python
   # Option 2: copy OpenManus's agent entry point into the app.py file
   import asyncio

   from app.agent.manus import Manus  # agent class provided by OpenManus

   async def main(prompt):
       agent = Manus()
       await agent.run(prompt)
   ```
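The coroutine above still needs an event loop to drive it. A minimal, self-contained sketch of that pattern follows; the `EchoAgent` stub is a stand-in for the real `Manus` class (an assumption made so the sketch runs without OpenManus installed):

```python
import asyncio


class EchoAgent:
    """Stub standing in for OpenManus's Manus agent (assumption for illustration)."""

    async def run(self, prompt: str) -> str:
        # A real agent would plan and execute tool calls here.
        return f"agent handled: {prompt}"


async def main(prompt: str) -> str:
    agent = EchoAgent()
    return await agent.run(prompt)


if __name__ == "__main__":
    # asyncio.run() creates the event loop and awaits the coroutine.
    print(asyncio.run(main("write hello world")))
```

The same `asyncio.run(main(...))` call is how a synchronous caller, such as a Flask view, would invoke the agent.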

## Configuration

Like OpenManus, OpenManusX requires an LLM API to be configured. Follow the steps below to configure the DeepSeek R1 model.

1. Create a `config.toml` file in the `config` directory (it can be copied from the example):

   ```shell
   cp config/config.example.toml config/config.toml
   ```

2. Edit `config/config.toml` to add your API keys and custom settings:
```toml
## Global LLM configuration (template)
#[llm]
#model = "deepseek-chat"
#base_url = "https://api.deepseek.com/v1"
#api_key = "sk-xxxxxxxxxxxx"
#max_tokens = 4096
#temperature = 0.6
#
## Optional configuration for specific LLM models
#[llm.vision]
#model = "deepseek-chat"
#base_url = "https://api.deepseek.com/v1"
#api_key = "sk-xxxxxxxxxxxx"


# Global LLM configuration
[llm]
model = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"
base_url = "https://api.siliconflow.cn/v1/"
api_key = "sk-xxxxxxxxxxxxxxxxxx"
max_tokens = 4096
temperature = 0.6

# Optional configuration for specific LLM models
[llm.vision]
model = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"
base_url = "https://api.siliconflow.cn/v1/"
api_key = "sk-xxxxxxxxxxxxxxxxxx"
```

## Quick Start

Run OpenManusX with one command, then open http://127.0.0.1:5000 in your browser:

```shell
python app.py
```

## Acknowledgments

Special thanks to OpenManus and browser-use for the foundational support they provide to this project!

## ⭐ Star History

Star History Chart