
feat: Complete the basic RAG framework for the DistributedRAG project #64


Open

wants to merge 8 commits into master

Conversation

winnnnnnd

No description provided.

mindspore-courses deleted a comment from winnnnnnd Jul 28, 2025
winnnnnnd (Author)

Test 2

Tridu33 (Contributor) commented Jul 28, 2025

Test 2

test

Contributor

This file shouldn't be committed; delete it and then git commit/push again.
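For reference, a minimal sketch of the clean-up, with a placeholder path standing in for the file in question:

```bash
# Delete the file (placeholder path), record the deletion, and push the fix.
git rm path/to/unneeded_file
git commit -m "chore: remove file that should not be committed"
git push
```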

# Note: mindnlp may depend on specific versions of torch or mindspore; pip will resolve this automatically


mindspore~=2.3.0
Contributor

This version is too old; use MindSpore 2.6.0 now.
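For example, the pin could be bumped roughly like this (2.6.0 is the version mentioned in this review; the matching mindnlp release should be checked separately):

```
# bumped per review: MindSpore 2.3 is too old
mindspore~=2.6.0
```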

Contributor

This folder shouldn't be committed at all; ignore this directory in the .gitignore file.

    Accepts a list of text strings and returns the corresponding list of vectors.
    """
    if not embedding_model:
        raise HTTPException(status_code=500, detail="The embedding model failed to load; the service is unavailable.")
Contributor

For exceptions and the various control flows, it's recommended to use logging rather than plain single-machine print, because in a distributed multi-node setup you can end up with lots of duplicated output or output printed in odd places; alternatively use a Python logger library. See https://zhuanlan.zhihu.com/p/445411809 . Please also clean up the existing print calls later.
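As a minimal sketch of this suggestion (the logger name and helper function below are illustrative, not from the PR), the embedding service could configure the standard logging module once and log the error before raising:

```python
import logging
import socket

from fastapi import HTTPException

# Configure logging once at service start-up. Including the hostname in the
# format helps tell apart lines coming from different nodes in a distributed run.
logging.basicConfig(
    level=logging.INFO,
    format=f"%(asctime)s {socket.gethostname()} %(levelname)s [%(name)s] %(message)s",
)
logger = logging.getLogger("embedding-server")  # illustrative logger name

def ensure_model_loaded(embedding_model):
    """Log and raise a 500 error if the embedding model is not available."""
    if not embedding_model:
        logger.error("Embedding model failed to load; service unavailable.")
        raise HTTPException(
            status_code=500,
            detail="Embedding model failed to load; service unavailable.",
        )
```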

# transformers
# Depending on the MiniCPM model you choose, other specific dependencies may be needed

mindspore~=2.3.0
Contributor

Same version issue as above.

# Depending on the MiniCPM model you choose, other specific dependencies may be needed

mindspore~=2.3.0
mindnlp~=0.4.0

# ---------------------------------------------------
# 3. LLM inference service
# ---------------------------------------------------
llm-server:
Contributor

Besides the llm-server that uses MindNLP to pull up various small models on its own, you also need to bring up a vLLM-MindSpore container for proper, production-grade inference: https://gitee.com/mindspore/vllm-mindspore/blob/master/install_depend_pkgs.sh . For the docker command, see https://gitee.com/wang_hua_2019/cmb/blob/master/dockerfiles/dockerfile_unified . You can start by bringing up qwen2.5_7b single-NPU online inference: https://www.mindspore.cn/vllm_mindspore/docs/zh-CN/master/getting_started/tutorials/qwen2.5_7b_singleNPU/qwen2.5_7b_singleNPU.html . The point of being distributed is to make full use of the multiple machines and cards in the cluster, but for now just use a single machine with 127.0.0.1 as the IP. On a single machine, use environment variables to distinguish the cards: when creating the docker container you can pass -e ASCEND_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 to specify which NPU cards to mount. Once the compute vouchers arrive I'll record a vLLM tutorial for you; you can leave this container alone for now, I'm just mentioning it here.
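A rough sketch of what such a service entry might look like alongside the existing llm-server in docker-compose; the image name, port, and volume path are placeholders (the real image would be built from the Dockerfile referenced above), and only the ASCEND_VISIBLE_DEVICES setting comes from this comment. Extra Ascend runtime or device options may also be required depending on the host setup:

```yaml
  # vLLM-MindSpore inference service (sketch only; names and paths are placeholders)
  vllm-server:
    image: vllm-mindspore:latest            # placeholder: build from the referenced Dockerfile
    environment:
      - ASCEND_VISIBLE_DEVICES=0,1,2,3,4,5,6,7   # which NPU cards to expose to the container
    ports:
      - "8000:8000"                         # placeholder port for the inference API
    volumes:
      - ./models:/models                    # placeholder: directory with model weights
```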

uvicorn[standard]

# MindNLP and related dependencies
# Note: mindnlp may depend on specific versions of torch or mindspore; pip will resolve this automatically
Contributor

mindnlp does not depend on torch.

Contributor

This isn't needed either; you can push a new commit that removes it.

@@ -0,0 +1 @@
summer-ospp/DistributedRAG/volumes/
Contributor

**/__pycache__/
*.pyc
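Putting the two points together, the project-level .gitignore might end up looking roughly like this (the volumes path is the one shown in the diff above):

```
# Runtime data written by the docker-compose services
summer-ospp/DistributedRAG/volumes/

# Python bytecode caches
**/__pycache__/
*.pyc
```

If any of these paths were committed earlier, they also need to be removed from the index (git rm -r --cached) so the ignore rules take effect.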
