Description
Carrying this over from feature request issue 23...
Bug? -- If the default llama-index is not base, but base still exists, and you begin a chat with chat-with-files enabled, the model will not find your indexed files... You have to switch to Chat-with-files mode, select the database, and switch back to Chat.
Could you please describe it in more detail, step by step, with an example setup? Unfortunately, I can't reproduce this problem.
More context:
I have two indexes: base ("Base") and custom ("Custom").
On start-up, the debug "Indexes" view shows:

    Current idx: base
    Indexes (list):
    [
        {
            "id": "base",
            "name": "Base"
        },
        {
            "id": "custom",
            "name": "Custom"
        }
    ]
Plugin settings:
- Chat with files (Llama-index, inline): custom
- Ask Llama-index first: enabled
- Command: Files I/O (Index to use when indexing files): base (note: this setting was hard to find, or maybe it has changed?)
- Context History: I thought I could designate where to save context history, more specifically the auto-indexed conversation history, which I had set to base, but maybe I'm confusing it with the Files I/O setting; I can't find that setting any longer.
Effectively, on startup 'base' is the index being referenced. I have 'Ask Llama-index first' enabled, but I don't think it is actually being consulted. I can change the current index by switching the Mode to 'Chat with files', selecting my Custom index (this selector is not visible in 'Chat' mode), and then switching back to 'Chat'.
I suspect we should be able to set the default index on startup. The other use case I'm facing is keeping my curated data separate: my data lives in Custom, while the chat history and context should, I think, go into the base index.
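To make the request concrete, here is a rough sketch of the behavior I'm after. The option names (`default_index`, `history_index`) and the `IndexRouter` class are hypothetical, not existing py-gpt settings or APIs; they only illustrate the idea of routing queries and history indexing to separate, configurable indexes.

```python
# Hypothetical sketch -- these option names do not exist in py-gpt today;
# they only illustrate the requested behavior.

class IndexRouter:
    """Picks which llama-index store to use for each purpose."""

    def __init__(self, config: dict):
        # Curated documents live in one index (e.g. "custom")...
        self.default_index = config.get("default_index", "base")
        # ...while auto-indexed conversation history goes to another.
        self.history_index = config.get("history_index", "base")

    def index_for_query(self) -> str:
        # 'Ask Llama-index first' would query the configured default
        # index on startup, without switching modes manually.
        return self.default_index

    def index_for_history(self) -> str:
        # Auto-indexed chat history stays out of the curated data.
        return self.history_index


# Example: curated data in "custom", chat history kept in "base".
router = IndexRouter({"default_index": "custom", "history_index": "base"})
print(router.index_for_query())    # -> custom
print(router.index_for_history())  # -> base
```

With something like this, 'Ask Llama-index first' would hit Custom right from startup, while the auto-indexed conversation history would keep going into base, without having to switch modes to change the current index.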