
AttributeError: 'Body' object has no attribute 'llm' #52

Open
@Dzz2004


Checked other resources

  • I searched the Codefuse documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in Codefuse-Repos rather than my code.
  • I added a very descriptive title to this issue.

System Info

Windows

Code Version

Latest Release

Description

While trying to connect a local chatglm2-6b model, entering any question produces no output and the terminal shows the error below. About the ports: the project's default is 8888, but since I am running fastchat I changed it to 8000, and I also changed the model port to 21002 (the model_worker port in fastchat).

Does this error mean the local model was never connected?
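One quick way to answer the connectivity question is to check whether anything is actually listening on the ports involved. This is a generic sketch (8000 and 21002 are the ports from this report, not project defaults):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Check the fastchat api server port and the model_worker port.
    for port in (8000, 21002):
        print(port, "open" if port_open("127.0.0.1", port) else "closed")
```

If either port reports closed, the webui cannot reach the model regardless of the config values.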

Example Code

The problem seems to originate here:

```python
def _fastapi_stream2generator(self, response: StreamingResponse, as_json: bool = False):
    '''
    Convert the StreamingResponse returned by the view functions in api.py
    into a synchronous generator.
    '''
    try:
        loop = asyncio.get_event_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()

    try:
        for chunk in iter_over_async(response.body_iterator, loop):
            if as_json and chunk:
                yield json.loads(chunk)
            elif chunk.strip():
                yield chunk
    except Exception as e:
        logger.error(traceback.format_exc())
```
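For context, `iter_over_async` (from muagent's `server_utils.py`, per the traceback) bridges an async iterator into synchronous code by driving it one step at a time on an event loop. A minimal self-contained sketch of that pattern (not the library's exact code):

```python
import asyncio

def iter_over_async(ait, loop):
    """Consume an async iterator from synchronous code via the given loop."""
    ait = ait.__aiter__()

    async def get_next():
        try:
            return False, await ait.__anext__()
        except StopAsyncIteration:
            return True, None

    while True:
        done, obj = loop.run_until_complete(get_next())
        if done:
            break
        yield obj

async def counter():
    # Stand-in for response.body_iterator.
    for i in range(3):
        yield i

loop = asyncio.new_event_loop()
chunks = list(iter_over_async(counter(), loop))
loop.close()
print(chunks)  # → [0, 1, 2]
```

Note that the exception in the report is raised inside this loop-driving step, so the real failure happens downstream in `chat_iterator`, not in the generator plumbing itself.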

Error Message and Stack Trace (if applicable)

```
2024-10-07 22:00:10,718 - _client.py[line:1038] - INFO: HTTP Request: GET http://127.0.0.1:7862/sdfiles/list "HTTP/1.1 200 OK"
2024-10-07 22:00:11,154 - _client.py[line:1038] - INFO: HTTP Request: GET http://127.0.0.1:7862/sdfiles/download?filename=&save_filename= "HTTP/1.1 200 OK"
2024-10-07 22:00:11.191 | DEBUG | webui.dialogue:dialogue_page:294 - prompt: 你好
2024-10-07 22:00:11.674 | ERROR | webui.utils:_fastapi_stream2generator:252 - Traceback (most recent call last):
  File "C:\Users\Lenovo\Desktop\codefuse-chatbot\examples\webui\utils.py", line 246, in _fastapi_stream2generator
    for chunk in iter_over_async(response.body_iterator, loop):
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\muagent\utils\server_utils.py", line 120, in iter_over_async
    done, obj = loop.run_until_complete(get_next())
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\asyncio\base_events.py", line 647, in run_until_complete
    return future.result()
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\muagent\utils\server_utils.py", line 115, in get_next
    obj = await ait.__anext__()
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\starlette\concurrency.py", line 63, in iterate_in_threadpool
    yield await anyio.to_thread.run_sync(_next, iterator)
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\anyio\_backends\_asyncio.py", line 2405, in run_sync_in_worker_thread
    return await future
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\anyio\_backends\_asyncio.py", line 914, in run
    result = context.run(func, *args)
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\starlette\concurrency.py", line 53, in _next
    return next(iterator)
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\muagent\chat\base_chat.py", line 80, in chat_iterator
    model = getChatModelFromConfig(llm_config)
  File "C:\Users\Lenovo\.conda\envs\devopsgpt\lib\site-packages\muagent\llm_models\openai_model.py", line 117, in getChatModelFromConfig
    if llm_config and llm_config.llm and isinstance(llm_config.llm, LLM):
AttributeError: 'Body' object has no attribute 'llm'
```
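Reading the last frame, the failing expression `llm_config.llm` indicates that `llm_config` reached `getChatModelFromConfig` as FastAPI's `Body(...)` default sentinel rather than a populated config object, i.e. the request payload was never parsed into the expected model. A minimal reproduction of the failure mode plus a defensive guard (hypothetical class names, not muagent's actual API):

```python
class Body:
    """Stand-in for the FastAPI Body sentinel seen in the traceback."""
    pass

class LLMConfig:
    """Stand-in for a properly populated llm config."""
    def __init__(self, llm):
        self.llm = llm

def get_llm(llm_config):
    # getattr lets a Body sentinel fail with a clear message instead of
    # an AttributeError deep inside the chat iterator.
    llm = getattr(llm_config, "llm", None)
    if llm is None:
        raise ValueError(
            "llm_config carries no 'llm'; the request body was not parsed "
            "into a config object - check the api server port and payload"
        )
    return llm

print(get_llm(LLMConfig("chatglm2-6b")))  # → chatglm2-6b
```

So the port changes are a plausible cause: if the webui posts to an endpoint that never fills in `llm_config`, the default `Body` object propagates all the way to this check.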
