English | 简体中文
Manus and OpenManus are great, but OpenManus currently has no front-end. I therefore spent two hours building a simple WebUI on the Flask framework that invokes OpenManus, so the agent can be driven from the browser. 🛫
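As a minimal sketch of the idea, a Flask endpoint can accept a prompt and hand it to the agent. The route name (`/run`) and the JSON shape below are hypothetical, not taken from the actual OpenManusX code; here the handler just echoes the prompt back instead of calling the real agent.

```python
# Hypothetical sketch of the WebUI's Flask entry point; the /run route
# and response fields are assumptions, not the project's real API.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run_agent():
    # Read the user's prompt from the JSON request body.
    prompt = request.get_json(force=True).get("prompt", "")
    # In the real WebUI this is where the prompt would be handed to the
    # OpenManus agent; here we simply echo it back.
    return jsonify({"status": "ok", "prompt": prompt})
```

Starting the server (e.g. via `python app.py`) and POSTing `{"prompt": "..."}` to `/run` would then return the JSON response.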
- OpenManusX roadmap:
- Open-source the initial version of the WebUI;
- Add preview and save support in OpenManusX's file area for PDF, PPT, Word, and Excel files, plus syntax-highlighted code previews;
- Beautify the LLM dialog output and improve the display of OpenManus runtime logs, e.g. with code highlighting;
- Continuously polish the front end and back end toward fully automated execution.
Start your journey of intelligent agents with OpenManusX!
- Create a new conda environment:

```shell
conda create -n OpenManusX python=3.12
conda activate OpenManusX
```
- Clone the repository. Install OpenManus first; installing OpenManusX's WebUI afterwards is then very fast:

```shell
git clone https://github.com/mannaandpoem/OpenManus.git
cd OpenManus
```
- Install dependencies:

```shell
pip install -r requirements.txt
```
- Install OpenManusAI in either of two ways:

```shell
# 1 Repository installation
git clone https://github.com/Shybert-AI/OpenManusAI.git
cd OpenManusAI
pip install -r requirements.txt
```
```python
# 2 Copy OpenManus's agent-running entry point into the app.py file
from app.agent.manus import Manus  # import path as used in OpenManus's main.py

async def main(prompt):
    agent = Manus()
    await agent.run(prompt)
```
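Since Flask views are synchronous, the async `main` above has to be bridged with `asyncio.run` somewhere in `app.py`. The sketch below shows that pattern with `EchoAgent`, a hypothetical stand-in for the real Manus agent, so it runs without OpenManus installed.

```python
# Sketch of bridging a sync Flask view into the async agent entry point.
# EchoAgent is a hypothetical stand-in for the Manus agent.
import asyncio

class EchoAgent:
    """Hypothetical stand-in for the Manus agent."""
    async def run(self, prompt: str) -> str:
        await asyncio.sleep(0)  # yield control, as a real agent would
        return f"handled: {prompt}"

async def main(prompt: str) -> str:
    agent = EchoAgent()
    return await agent.run(prompt)

def handle_request(prompt: str) -> str:
    # A synchronous Flask view can call this to drive the async agent.
    return asyncio.run(main(prompt))
```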
OpenManusX, like OpenManus, requires configuring the API of the LLM you want to use. Follow the steps below to configure the DeepSeek R1 model.
- Create a `config.toml` file in the `config` directory (it can be copied from the example):

```shell
cp config/config.example.toml config/config.toml
```

- Edit `config/config.toml` to add your API keys and custom settings:
```toml
## Global LLM configuration (commented-out example)
#[llm]
#model = "deepseek-chat"
#base_url = "https://api.deepseek.com/v1"
#api_key = "sk-xxxxxxxxxxxx"
#max_tokens = 4096
#temperature = 0.6
#
## Optional configuration for specific LLM models
#[llm.vision]
#model = "deepseek-chat"
#base_url = "https://api.deepseek.com/v1"
#api_key = "sk-xxxxxxxxxxxx"

# Global LLM configuration
[llm]
model = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"
base_url = "https://api.siliconflow.cn/v1/"
api_key = "sk-xxxxxxxxxxxxxxxxxx"
max_tokens = 4096
temperature = 0.6

# Optional configuration for specific LLM models
[llm.vision]
model = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"
base_url = "https://api.siliconflow.cn/v1/"
api_key = "sk-xxxxxxxxxxxxxxxxxx"
```
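Before launching, the config can be sanity-checked with `tomllib`, which is in the standard library since Python 3.11 (so it is available in the 3.12 environment created above). This is an optional check, not part of the project itself.

```python
# Sketch: parse and sanity-check the [llm] section of config.toml.
# tomllib is stdlib in Python 3.11+.
import tomllib

# Inline sample mirroring the config above; in practice you would use
# tomllib.load(open("config/config.toml", "rb")).
sample = """
[llm]
model = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"
base_url = "https://api.siliconflow.cn/v1/"
api_key = "sk-xxxxxxxxxxxxxxxxxx"
max_tokens = 4096
temperature = 0.6
"""

config = tomllib.loads(sample)
llm = config["llm"]
# A placeholder key like "sk-xxx..." will pass this check; replace it
# with your real key before running the agent.
assert llm["api_key"].startswith("sk-"), "set a valid API key"
print(llm["model"])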
Run OpenManusX with one command, then open http://127.0.0.1:5000 in your browser:

```shell
python app.py
```
Special thanks to OpenManus and browser-use for the foundational support they provide for this project!