
Completely offline #324

Open
MoonDragon-MD opened this issue Dec 4, 2024 · 1 comment
Labels
use case cool demo by users

Comments

@MoonDragon-MD

Hi, I am trying the program through Docker. Since I use gpt4all, I first configured it without an API key, with http://localhost:4891 as the endpoint and Mistral-Nemo-Instruct-2407-Q4_K_M.gguf as the model.

That failed because no key was set (even though gpt4all doesn't need one), so I entered a random one, “0000a0000b0000c0000”. However, I now get these errors:
```
NLP Spacy model loaded successfully!
✅ All sentences have been successfully split!
2024-12-04 16:10:18.457 Uncaught app exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
    yield
  File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 236, in handle_request
    resp = self._pool.handle_request(req)
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_sync/connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.10/dist-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 981, in _request
    response = self._client.send(
  File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 926, in send
    response = self._send_handling_auth(
  File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 954, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 991, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/usr/local/lib/python3.10/dist-packages/httpx/_client.py", line 1027, in _send_single_request
    response = transport.handle_request(request)
  File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 235, in handle_request
    with map_httpcore_exceptions():
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py", line 89, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/core/ask_gpt.py", line 66, in ask_gpt
    response = client.chat.completions.create(
  File "/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py", line 274, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py", line 704, in create
    return self._post(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1268, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 945, in request
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1005, in _request
    return self._retry_request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1083, in _retry_request
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1005, in _request
    return self._retry_request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1083, in _retry_request
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1015, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
    result = func()
  File "/usr/local/lib/python3.10/dist-packages/streamlit/runtime/scriptrunner/script_runner.py", line 590, in code_to_exec
    exec(code, module.__dict__)
  File "/app/st.py", line 124, in <module>
    main()
  File "/app/st.py", line 120, in main
    text_processing_section()
  File "/app/st.py", line 36, in text_processing_section
    process_text()
  File "/app/st.py", line 56, in process_text
    step4_1_summarize.get_summary()
  File "/app/core/step4_1_summarize.py", line 39, in get_summary
    summary = ask_gpt(summary_prompt, response_json=True, valid_def=valid_summary, log_title='summary')
  File "/app/core/ask_gpt.py", line 103, in ask_gpt
    raise Exception(f"Still failed after {max_retries} attempts: {e}")
Exception: Still failed after 3 attempts: Connection error.
```
The above is a summary of what happens against the gpt4all API.
I was wondering if there is a way to make it work?
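The `[Errno 111] Connection refused` in the traceback happens before any key is checked: the app simply cannot open a TCP connection to `localhost:4891`. One likely cause is that inside a Docker container, `localhost` refers to the container itself, not the host where gpt4all is listening. A minimal sketch for checking this in isolation (the port comes from the report above; `host.docker.internal` is a Docker convention for reaching the host and is an assumption here — on Linux it may need `--add-host=host.docker.internal:host-gateway`):

```python
import socket

def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers [Errno 111] Connection refused, timeouts, DNS failures
        return False

# Inside a container, "localhost" is the container itself; a gpt4all
# server running on the host is usually reachable as host.docker.internal.
for host in ("localhost", "host.docker.internal"):
    print(host, endpoint_reachable(host, 4891))
```

If the second check succeeds where the first fails, pointing the app's base URL at `http://host.docker.internal:4891` instead of `http://localhost:4891` should resolve the connection error.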

Right now I don't care about the dubbing, but couldn't you use the built-in Windows TTS instead of downloading anything else?

@Huanshere
Owner

Thanks for sharing. In v2.1.0 I added a note for local LLMs: set max_workers to 1 and summary_length to 2000 in config.yaml, because local LLMs can't handle long context. Hope this helps!
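As a sketch, the suggested change would look like this in config.yaml (only the two keys and values above are from the reply; the comments are illustrative):

```yaml
# config.yaml — suggested settings when using a local LLM
max_workers: 1        # one request at a time, so the local server isn't overloaded
summary_length: 2000  # shorter summaries keep the context within what a local model handles
```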

@Huanshere Huanshere added the use case cool demo by users label Dec 5, 2024