
IndexErrors when streaming API returns empty choices #72

Open
jvnn opened this issue Dec 29, 2023 · 0 comments

I'm using vim-ai together with the Azure OpenAI service, which is almost compatible with the official OpenAI API but does have its quirks. I'm using a self-made proxy service in between, which is supposed to handle the differences and allow applications that aren't "Azure aware" to use pure OpenAI paths and mechanisms transparently with the Azure models.

Since updating the proxy to use the latest Azure API version, vim-ai stopped working correctly. It throws IndexErrors as the response from the API contains an empty 'choices' array, but vim-ai always assumes that there is at least one item in there. Here's an example response (via debug logging):

[2023-12-29 09:12:56.427605] [chat] response: {'id': '', 'object': '', 'created': 0, 'model': '', 'prompt_filter_results': [{'prompt_index': 0, 'content_filter_results': {'hate': {'filtered': False, 'severity': 'safe'}, 'self_harm': {'filtered': False, 'severity': 'safe'}, 'sexual': {'filtered': False, 'severity': 'safe'}, 'violence': {'filtered': False, 'severity': 'safe'}}}], 'choices': []}

I managed to get it working correctly again with the simple patch below. Sorry for not making a proper PR for this, but I really don't have the time right now (to think the change through or do proper testing, for example), so I decided to file a quick issue instead.

$ git diff
diff --git a/py/chat.py b/py/chat.py
index ff70904..a62d9fc 100644
--- a/py/chat.py
+++ b/py/chat.py
@@ -76,7 +76,10 @@ try:
         response = openai_request(url, request, http_options)
         def map_chunk(resp):
             printDebug("[chat] response: {}", resp)
-            return resp['choices'][0]['delta'].get('content', '')
+            if resp['choices']:
+                return resp['choices'][0]['delta'].get('content', '')
+            else:
+                return ""
         text_chunks = map(map_chunk, response)
         render_text_chunks(text_chunks, is_selection)
 
diff --git a/py/complete.py b/py/complete.py
index debe275..a598394 100644
--- a/py/complete.py
+++ b/py/complete.py
@@ -24,7 +24,10 @@ def complete_engine(prompt):
     response = openai_request(url, request, http_options)
     def map_chunk(resp):
         printDebug("[engine-complete] response: {}", resp)
-        return resp['choices'][0].get('text', '')
+        if resp['choices']:
+            return resp['choices'][0].get('text', '')
+        else:
+            return ""
     text_chunks = map(map_chunk, response)
     return text_chunks
 
@@ -43,7 +46,10 @@ def chat_engine(prompt):
     response = openai_request(url, request, http_options)
     def map_chunk(resp):
         printDebug("[engine-chat] response: {}", resp)
-        return resp['choices'][0]['delta'].get('content', '')
+        if resp['choices']:
+            return resp['choices'][0]['delta'].get('content', '')
+        else:
+            return ""
     text_chunks = map(map_chunk, response)
     return text_chunks
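The same guard is repeated in three places; if it helps, the three `map_chunk` variants could be folded into one defensive helper along these lines (just a sketch, not part of vim-ai; the function name and the `streaming_chat` flag are my own invention):

```python
def safe_chunk_text(resp, streaming_chat=True):
    """Extract the text from one streamed response chunk.

    Returns '' when 'choices' is empty, which newer Azure OpenAI API
    versions send for chunks that only carry content-filter results.
    """
    choices = resp.get('choices') or []
    if not choices:
        return ''
    if streaming_chat:
        # Chat completions stream partial messages under 'delta'.
        return choices[0].get('delta', {}).get('content', '')
    # Legacy completions put the text directly on the choice.
    return choices[0].get('text', '')
```

With this, the Azure content-filter chunk from the debug log above (`'choices': []`) maps to an empty string instead of raising an IndexError, while normal chunks behave as before.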