Crash: Assertion '!this->empty()' failed #1696

Closed

Description

System Info

aur/gpt4all-chat 2.5.4-1 (+0 0.00) (Installed)
Linux pc 6.6.3-arch1-1 #1 SMP PREEMPT_DYNAMIC Wed, 29 Nov 2023 00:37:40 +0000 x86_64 GNU/Linux

Information

  • The official example notebooks/scripts
  • My own modified scripts
  • GUI

Reproduction

  1. Open GPT4All.
  2. Start downloading mistral-7b-openorca.Q4_0.gguf.
  3. When the download finishes, the app crashes.

Relevant log output:
[Debug] (Thu Nov 30 10:55:58 2023): deserializing chats took: 0 ms
[Warning] (Thu Nov 30 10:56:52 2023): Opening temp file for writing: "/home/moti/AI/Text Generation/gpt4all/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Thu Nov 30 10:57:18 2023): Opening temp file for writing: "/home/moti/AI/Text Generation/gpt4all/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Thu Nov 30 10:59:05 2023): stream 3 finished with error: "Internal server error"
[Warning] (Thu Nov 30 10:59:05 2023): Opening temp file for writing: "/home/moti/AI/Text Generation/gpt4all/incomplete-mistral-7b-openorca.Q4_0.gguf"
[Warning] (Thu Nov 30 10:59:05 2023): "ERROR: Downloading failed with code 401 \"Internal server error\""
llama_new_context_with_model: max tensor size =   102.55 MB
llama.cpp: using Vulkan on /usr/include/c++/13.2.1/bits/stl_vector.h:1208: constexpr std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::front() [with _Tp = ggml_vk_device; _Alloc = std::allocator<ggml_vk_device>; reference = ggml_vk_device&]: Assertion '!this->empty()' failed.
[Debug] (Thu Nov 30 11:01:34 2023): deserializing chats took: 6 ms
[Warning] (Thu Nov 30 11:01:35 2023): ERROR: Previous attempt to load model resulted in crash for `mistral-7b-openorca.Q4_0.gguf` most likely due to insufficient memory. You should either remove this model or decrease your system RAM by closing other applications. id "0eabbb6e-765e-4319-af82-c03c81e9e303"
llama_new_context_with_model: max tensor size =   102.55 MB
llama.cpp: using Vulkan on /usr/include/c++/13.2.1/bits/stl_vector.h:1208: constexpr std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::front() [with _Tp = ggml_vk_device; _Alloc = std::allocator<ggml_vk_device>; reference = ggml_vk_device&]: Assertion '!this->empty()' failed.
[2]    36718 IOT instruction (core dumped)  gpt4all-chat
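For reference, the abort comes from std::vector::front() being called on an empty vector of ggml_vk_device, i.e. the Vulkan backend apparently enumerated no usable device and then dereferenced the first entry anyway. Below is a minimal sketch (not the actual llama.cpp code; vk_device_info is a stand-in type) that trips the same libstdc++ assertion and shows what a guard would look like:

```cpp
// Minimal sketch: front() on an empty std::vector trips the same
// "Assertion '!this->empty()' failed" abort seen in the log above
// when libstdc++ assertions are enabled.
#include <cstdio>
#include <string>
#include <vector>

struct vk_device_info {           // stand-in for ggml_vk_device
    std::string name;
};

int main() {
    std::vector<vk_device_info> devices;  // empty: no Vulkan device found

    // Unguarded access, as the assertion suggests happens in the backend:
    // devices.front();  // -> aborts with "Assertion '!this->empty()' failed"

    // Guarded access would fall back instead of aborting:
    if (devices.empty()) {
        std::puts("no Vulkan device available, falling back to CPU");
        return 0;
    }
    std::printf("using Vulkan device: %s\n", devices.front().name.c_str());
}
```

As far as I know, the assertion only fires because _GLIBCXX_ASSERTIONS is enabled in the Arch build flags by default; without it, front() on an empty vector would just be silent undefined behavior rather than a clean abort.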

Expected behavior

...



    Labels

bug (Something isn't working), vulkan
