
I can see the UI but it doesn't work; I can't talk; the server fails in TTS. #18

Closed
sontoriyama opened this issue Sep 16, 2024 · 7 comments
Labels
bug Something isn't working

Comments

@sontoriyama

I have Ollama and Miniconda on an i7 7th gen, GTX 1070, 16 GB RAM.

I changed the config yaml, e.g. `medium.en` to `medium` (because I want to speak in Spanish), and changed "en" to "es" in the places it appears. But I don't know what I need to get good local TTS in Spanish. Sorry for my English, I'm Spanish ;)
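For reference, the changes described above would look roughly like this in the project's config yaml. The key names here are illustrative assumptions, not the project's actual schema; check `conf.yaml` itself for the real field names:

```yaml
# Sketch only -- verify key names against the project's own conf.yaml.
# ASR: switch from the English-only Whisper model to the multilingual one.
faster-whisper:
  model_path: medium   # was: medium.en (English-only)
  language: es         # was: en
```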

error:
(vt) C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber>python server.py
INFO: Started server process [10100]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://localhost:12393 (Press CTRL+C to quit)
INFO: ::1:57092 - "GET / HTTP/1.1" 304 Not Modified
INFO: ('::1', 57094) - "WebSocket /client-ws" [accepted]
INFO: connection open
Connection established
Model Information Loaded.
Model Information Loaded.
config.json: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.26k/2.26k [00:00<?, ?B/s]
vocabulary.txt: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████| 460k/460k [00:00<00:00, 1.55MB/s]
tokenizer.json: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2.20M/2.20M [00:00<00:00, 4.28MB/s]
model.bin: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1.53G/1.53G [00:35<00:00, 42.9MB/s]
Error: Piper TTS voice model not found at path "./models/piper_voice/es_ES-amy-medium.onnx"
Downloading the default voice model...
Downloading from https://huggingface.co/rhasspy/piper-voices/resolve/v1.0.0/en/en_US/amy/medium/en_US-amy-medium.onnx
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\server.py", line 94, in websocket_endpoint
    self.open_llm_vtuber = OpenLLMVTuberMain(self.open_llm_vtuber_config)
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\main.py", line 73, in __init__
    self.tts = self.init_tts()
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\main.py", line 133, in init_tts
    tts = TTSFactory.get_tts_engine(tts_model, **tts_config)
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\tts\tts_factory.py", line 43, in get_tts_engine
    return PiperTTSEngine(voice_path=kwargs.get("voice_model_path"), verbose=kwargs.get("verbose"))
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\tts\piperTTS.py", line 26, in __init__
    scripts.install_piper_tts.download_default_model()
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\scripts\install_piper_tts.py", line 203, in download_default_model
    download_file(default_voice_model_url, voice_model_path)
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\scripts\install_piper_tts.py", line 44, in download_file
    with open(save_path, "wb") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\models\piper_voice\en_US-amy-medium.onnx'
INFO: ('::1', 57137) - "WebSocket /client-ws" [accepted]
INFO: ('127.0.0.1', 57268) - "WebSocket /client-ws" [accepted]
INFO: connection closed
INFO: connection closed
INFO: connection closed
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 332, in asgi_send
    await self.send(data)  # type: ignore[arg-type]
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\websockets\legacy\protocol.py", line 630, in send
    await self.ensure_open()
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\websockets\legacy\protocol.py", line 931, in ensure_open
    raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: no close frame received or sent

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\websockets.py", line 85, in send
    await self._send(message)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 48, in sender
    await send(message)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 345, in asgi_send
    raise ClientDisconnected from exc
uvicorn.protocols.utils.ClientDisconnected

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\server.py", line 87, in websocket_endpoint
    await websocket.send_text(
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\websockets.py", line 165, in send_text
    await self.send({"type": "websocket.send", "text": data})
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\websockets.py", line 88, in send
    raise WebSocketDisconnect(code=1006)
starlette.websockets.WebSocketDisconnect
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 332, in asgi_send
    await self.send(data)  # type: ignore[arg-type]
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\websockets\legacy\protocol.py", line 630, in send
    await self.ensure_open()
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\websockets\legacy\protocol.py", line 931, in ensure_open
    raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: no close frame received or sent

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\websockets.py", line 85, in send
    await self._send(message)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 48, in sender
    await send(message)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 345, in asgi_send
    raise ClientDisconnected from exc
uvicorn.protocols.utils.ClientDisconnected

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber\server.py", line 87, in websocket_endpoint
    await websocket.send_text(
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\websockets.py", line 165, in send_text
    await self.send({"type": "websocket.send", "text": data})
  File "C:\programes\miniconda3\envs\vt\lib\site-packages\starlette\websockets.py", line 88, in send
    raise WebSocketDisconnect(code=1006)
starlette.websockets.WebSocketDisconnect
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [10100]
^C
(vt) C:\eines\llm-amb-config-fastwhisperetc-tts-detotscolors-ollama-animee\Open-LLM-VTuber>
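The first traceback in the log bottoms out in `open(save_path, "wb")` raising `FileNotFoundError`, which typically means the `models/piper_voice` directory was never created before the download started; `open` does not create intermediate directories. A minimal sketch of the kind of fix involved, using a hypothetical stand-in for the project's `download_file` helper (not the project's actual code):

```python
import os
import urllib.request


def download_file(url: str, save_path: str) -> None:
    """Download url to save_path, creating parent directories first.

    Hypothetical sketch: the key point is os.makedirs before open(),
    so a missing models/piper_voice/ folder cannot raise FileNotFoundError.
    """
    parent = os.path.dirname(save_path)
    if parent:
        os.makedirs(parent, exist_ok=True)
    with urllib.request.urlopen(url) as resp, open(save_path, "wb") as f:
        # Stream in chunks so large voice models don't sit in memory.
        while chunk := resp.read(8192):
            f.write(chunk)
```

The project's real `install_piper_tts.py` may structure this differently; the directory-creation step is the part the traceback shows was missing.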

@t41372
Owner

t41372 commented Sep 17, 2024

Fixing it

@t41372 t41372 added the bug Something isn't working label Sep 17, 2024
@t41372
Owner

t41372 commented Oct 11, 2024

Sorry for taking so long to reply. The main issue is the Piper TTS implementation. I quickly fixed the part of the problem shown in this thread, but then found another major issue with this project's Piper TTS implementation that basically makes it unusable. I wrote about that problem in the release notes for v0.2.3.

I recommend using MeloTTS or EdgeTTS for Spanish instead of Piper TTS, because this issue may not be fixed for a while.
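Switching the TTS backend in the config yaml might look something like the fragment below. The key names are assumptions about the project's config schema, so verify them against `conf.yaml`; `es-ES-ElviraNeural` is one of Edge TTS's Spanish (Spain) voices:

```yaml
# Sketch only -- check the exact keys in the project's conf.yaml.
TTS_MODEL: edgeTTS
edgeTTS:
  voice: es-ES-ElviraNeural   # Spanish (Spain) Edge TTS voice
```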

@lonngxiang

It errors out as soon as I run it. What is this problem: ERROR: Exception in ASGI application
(two screenshots attached)

@t41372
Owner

t41372 commented Oct 17, 2024

Friend, could you capture the full error message? There's no useful information in the screenshots... What is that OS.Error at the very top?
Also, could you open a new issue? Your problem doesn't seem related to this one...

@lonngxiang

Friend, could you capture the full error message? There's no useful information in the screenshots... What is that OS.Error at the very top? Also, could you open a new issue? Your problem doesn't seem related to this one...

I just ran python server.py directly:

python server.py
INFO:     Started server process [26252]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://localhost:12393 (Press CTRL+C to quit)
INFO:     ::1:12221 - "GET / HTTP/1.1" 200 OK
INFO:     ::1:12221 - "GET /libs/live2dcubismcore.min.js HTTP/1.1" 200 OK
INFO:     ::1:12220 - "GET /libs/live2d.min.js HTTP/1.1" 200 OK
INFO:     ::1:12236 - "GET /TaskQueue.js HTTP/1.1" 200 OK
INFO:     ::1:12234 - "GET /libs/pixi.min.js HTTP/1.1" 200 OK
INFO:     ::1:12235 - "GET /libs/index.min.js HTTP/1.1" 200 OK
INFO:     ::1:12237 - "GET /index.css HTTP/1.1" 200 OK
INFO:     ::1:12237 - "GET /libs/ort.js HTTP/1.1" 200 OK
INFO:     ::1:12236 - "GET /libs/bundle.min.js HTTP/1.1" 200 OK
INFO:     ::1:12235 - "GET /live2d.js HTTP/1.1" 200 OK
INFO:     ('::1', 12239) - "WebSocket /client-ws" [accepted]
INFO:     connection open
Connection established
Model Information Loaded.
Model Information Loaded.
C:\Users\loong\.conda\envs\nlp\lib\site-packages\rotary_embedding_torch\rotary_embedding_torch.py:35: FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.
  @autocast(enabled = False)
C:\Users\loong\.conda\envs\nlp\lib\site-packages\rotary_embedding_torch\rotary_embedding_torch.py:253: FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.
  @autocast(enabled = False)
2024-10-17 16:29:26,410 - modelscope - WARNING - Using branch: master as version is unstable, use with caution
Downloading: 100%|████████████████████████████████████████████████████████████████| 6.00k/6.00k [00:00<00:00, 12.9kB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████| 116k/116k [00:00<00:00, 260kB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████| 10.9k/10.9k [00:00<00:00, 33.9kB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████| 238k/238k [00:00<00:00, 478kB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████| 368k/368k [00:00<00:00, 614kB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████| 1.81k/1.81k [00:00<00:00, 5.07kB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████████| 396/396 [00:00<00:00, 1.08kB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████| 56.1k/56.1k [00:00<00:00, 137kB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████| 935k/935k [00:00<00:00, 1.23MB/s]
Downloading: 100%|█████████████████████████████████████████████████████████████████| 56.5k/56.5k [00:00<00:00, 113kB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████| 27.3k/27.3k [00:00<00:00, 66.1kB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████| 893M/893M [00:20<00:00, 46.3MB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████| 9.09k/9.09k [00:00<00:00, 27.3kB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████| 880k/880k [00:00<00:00, 1.18MB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████| 194k/194k [00:00<00:00, 381kB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████| 318k/318k [00:00<00:00, 479kB/s]
Downloading: 100%|███████████████████████████████████████████████████████████████████| 344k/344k [00:00<00:00, 487kB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████| 30.5k/30.5k [00:00<00:00, 48.3kB/s]
Downloading: 100%|████████████████████████████████████████████████████████████████| 43.9k/43.9k [00:00<00:00, 85.8kB/s]
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 242, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\Users\loong\Downloads\Open-LLM-VTuber-main\Open-LLM-VTuber-main\server.py", line 61, in websocket_endpoint
    open_llm_vtuber = OpenLLMVTuberMain(self.open_llm_vtuber_config)
  File "C:\Users\loong\Downloads\Open-LLM-VTuber-main\Open-LLM-VTuber-main\main.py", line 69, in __init__
    self.asr = self.init_asr()
  File "C:\Users\loong\Downloads\Open-LLM-VTuber-main\Open-LLM-VTuber-main\main.py", line 140, in init_asr
    asr = ASRFactory.get_asr_system(asr_model, **asr_config)
  File "C:\Users\loong\Downloads\Open-LLM-VTuber-main\Open-LLM-VTuber-main\asr\asr_factory.py", line 28, in get_asr_system
    return FunASR(
  File "C:\Users\loong\Downloads\Open-LLM-VTuber-main\Open-LLM-VTuber-main\asr\fun_asr.py", line 29, in __init__
    self.model = AutoModel(
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\funasr\auto\auto_model.py", line 104, in __init__
    model, kwargs = self.build_model(**kwargs)
  File "C:\Users\loong\.conda\envs\nlp\lib\site-packages\funasr\auto\auto_model.py", line 192, in build_model
    model = model_class(**model_conf, vocab_size=vocab_size)
TypeError: 'NoneType' object is not callable
INFO:     connection closed
INFO:     ::1:12237 - "GET /bg/cityscape.jpeg HTTP/1.1" 200 OK
INFO:     ::1:12237 - "GET /favicon.ico HTTP/1.1" 404 Not Found

@t41372
Owner

t41372 commented Oct 17, 2024

I've opened a new issue for the problem you ran into; let's discuss it there.

@t41372
Owner

t41372 commented Dec 13, 2024

There is a new TTS provider, sherpa-onnx, merged in v0.5.0-alpha.1, which can run MeloTTS and Piper TTS inference while being very easy to install. The old Piper TTS implementation will likely be removed in favor of this new inference package.
#50

@t41372 t41372 closed this as completed Dec 13, 2024