
Commit

Merge pull request #17 from jhakulin/jhakulin/bug-fixes-1
jhakulin authored May 2, 2024
2 parents 0af2933 + fd27e4c commit 9617aa7
Showing 16 changed files with 94 additions and 76 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -92,7 +92,7 @@ Build the wheel for `azure.ai.assistant` library using the following instruction
- Go to the `sdk/azure-ai-assistant` folder
- Build the wheel using the following command: `python setup.py sdist bdist_wheel`
- Go to generated `dist` folder
- Install the generated wheel using following command: `pip install --force-reinstall azure_ai_assistant-0.3.0a1-py3-none-any.whl`
- Install the generated wheel using following command: `pip install --force-reinstall azure_ai_assistant-0.3.1a1-py3-none-any.whl`
- This installation will pick the necessary dependencies for the library (openai, python-Levenshtein, fuzzywuzzy, Pillow, requests)

### Step 4: Install Python UI libraries
8 changes: 4 additions & 4 deletions config/ConversationTitleCreator_assistant_config.yaml
@@ -7,14 +7,14 @@ instructions: |-
1. You are required to create title of given text by finding the overall theme.
2. The end result(title) must be only 3 words long at max. Returning more than 3 words will be a failure.
model: ''
assistant_id: ''
knowledge_files: {}
selected_functions: []
knowledge_retrieval: false
assistant_id: null
functions: []
code_interpreter: false
output_folder_path: ''
ai_client_type: AZURE_OPEN_AI
assistant_type: chat_assistant
assistant_role: system
file_references: []
completion_settings: null
tool_resources: null
file_search: false
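The config schema change above renames `selected_functions` to `functions` and drops the `knowledge_files`/`knowledge_retrieval` keys. A minimal sketch of how downstream code might read configs written in either shape; the fallback helper is hypothetical, not part of this commit:

```python
def get_functions(config: dict) -> list:
    # Hypothetical compatibility helper: the new key 'functions' replaces the
    # old 'selected_functions'; fall back so pre-change configs still load.
    return config.get("functions", config.get("selected_functions", []))

old_cfg = {"selected_functions": [{"type": "function"}], "knowledge_retrieval": False}
new_cfg = {"functions": [], "file_search": False}

print(get_functions(old_cfg))  # [{'type': 'function'}]
print(get_functions(new_cfg))  # []
```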
8 changes: 4 additions & 4 deletions config/FunctionImplCreator_assistant_config.yaml
@@ -31,14 +31,14 @@ instructions: |-
7. Ensure function returns result using json.dumps() and where "result" is key and its value is the result.
8. The end result must be only code and must not contain triple backtics, otherwise, it is considered a failure.
model: ''
assistant_id: ''
knowledge_files: {}
selected_functions: []
knowledge_retrieval: false
assistant_id: null
code_interpreter: false
output_folder_path: ''
ai_client_type: AZURE_OPEN_AI
assistant_type: chat_assistant
assistant_role: system
file_references: []
completion_settings: null
tool_resources: null
file_search: false
functions: []
8 changes: 4 additions & 4 deletions config/FunctionSpecCreator_assistant_config.yaml
@@ -36,14 +36,14 @@ instructions: |-
7. The module value must not be changed from what is in the template.
8. The end result must not contain triple backtics, otherwise, it is considered a failure.
model: ''
assistant_id: ''
knowledge_files: {}
selected_functions: []
knowledge_retrieval: false
assistant_id: null
code_interpreter: false
output_folder_path: ''
ai_client_type: AZURE_OPEN_AI
assistant_type: chat_assistant
assistant_role: system
file_references: []
completion_settings: null
tool_resources: null
file_search: false
functions: []
8 changes: 4 additions & 4 deletions config/InstructionsReviewer_assistant_config.yaml
@@ -18,14 +18,14 @@ instructions: |-
- Provides recommendations on how to revise the instructions to meet the requirements.
- Suggests examples that could enhance the clarity and completeness of the instructions.
model: ''
assistant_id: ''
knowledge_files: {}
selected_functions: []
knowledge_retrieval: false
assistant_id: null
code_interpreter: false
output_folder_path: ''
ai_client_type: AZURE_OPEN_AI
assistant_type: chat_assistant
assistant_role: system
file_references: []
completion_settings: null
tool_resources: null
file_search: false
functions: []
8 changes: 4 additions & 4 deletions config/SpeechTranscriptionSummarizer_assistant_config.yaml
@@ -29,14 +29,14 @@ instructions: |-
give me some other context where I could provide more tailored suggestions.
"""
model: ''
assistant_id: ''
knowledge_files: {}
selected_functions: []
knowledge_retrieval: false
assistant_id: null
code_interpreter: false
output_folder_path: ''
ai_client_type: AZURE_OPEN_AI
assistant_type: chat_assistant
assistant_role: system
file_references: []
completion_settings: null
tool_resources: null
file_search: false
functions: []
8 changes: 4 additions & 4 deletions config/TaskRequestsCreator_assistant_config.yaml
@@ -26,9 +26,8 @@ instructions: |-
["Please review the ./folder1/input1.py file and suggest improvements.", "Please review the ./folder2/input2.py file and suggest improvements."]
4. The end result must be always valid list, e.g. ["formatted user request1", "formatted user request2"], otherwise result is considered as failure.
model: ''
assistant_id: ''
knowledge_files: {}
selected_functions:
assistant_id: null
functions:
- type: function
function:
name: find_files_by_extension_in_directory
@@ -47,11 +46,12 @@ selected_functions:
required:
- directory
- file_extension
knowledge_retrieval: false
code_interpreter: false
output_folder_path: ''
ai_client_type: AZURE_OPEN_AI
assistant_type: chat_assistant
assistant_role: system
file_references: []
completion_settings: null
tool_resources: null
file_search: false
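The `functions` entry above declares a tool spec for `find_files_by_extension_in_directory` with required `directory` and `file_extension` parameters. A hypothetical implementation consistent with that spec and with the repo convention of returning `json.dumps()` output under a `"result"` key — the real implementation lives elsewhere in the repo:

```python
import json
import os

def find_files_by_extension_in_directory(directory: str, file_extension: str) -> str:
    # Walk the directory tree and collect paths matching the extension;
    # return the list under the "result" key, as the assistant tooling expects.
    matches = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            if name.endswith(file_extension):
                matches.append(os.path.join(root, name))
    return json.dumps({"result": matches})
```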
24 changes: 16 additions & 8 deletions gui/assistant_dialogs.py
@@ -574,7 +574,6 @@ def update_model_combobox(self):
self.modelComboBox.setToolTip("Select a model deployment name from the Azure OpenAI resource")

def assistant_selection_changed(self):
self.reset_fields()
selected_assistant = self.assistantComboBox.currentText()
if selected_assistant == "New Assistant":
self.is_create = True
@@ -583,6 +582,7 @@ def assistant_selection_changed(self):
self.outputFolderPathEdit.setText(self.default_output_folder_path)
# if selected_assistant is not empty string, load the assistant config
elif selected_assistant != "":
self.reset_fields()
self.is_create = False
self.pre_load_assistant_config(selected_assistant)
self.createAsNewCheckBox.setEnabled(True)
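The hunk above moves `self.reset_fields()` from the top of `assistant_selection_changed` into the branch that loads an existing assistant, so choosing "New Assistant" no longer wipes freshly set defaults. A toy model of the reordered control flow; the dict-based form and default path are illustrative, not the actual widget API:

```python
def assistant_selection_changed(selected: str, form: dict) -> dict:
    # Toy model: reset only when loading an existing assistant's config.
    if selected == "New Assistant":
        form["output_folder"] = "/default/output"  # assumed default path
    elif selected != "":
        form.clear()                               # reset_fields() now runs here
        form["loaded_assistant"] = selected
    return form

print(assistant_selection_changed("New Assistant", {"name": "draft"}))
# {'name': 'draft', 'output_folder': '/default/output'}
```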
@@ -755,7 +755,7 @@ def pre_load_assistant_config(self, name):
item.setData(Qt.UserRole, file_id)
self.file_search_files[file_path] = file_id
self.fileSearchList.addItem(item)
self.fileSearchCheckBox.setChecked(bool(self.assistant_config.file_search))
self.fileSearchCheckBox.setChecked(bool(self.assistant_config.file_search))

# Load completion settings
self.load_completion_settings(self.assistant_config.text_completion_config)
@@ -776,10 +776,13 @@ def load_completion_settings(self, text_completion_config):
self.responseFormatComboBox.setCurrentText(completion_settings.get('response_format', 'text'))
self.maxCompletionTokensEdit.setValue(completion_settings.get('max_completion_tokens', 1000))
self.maxPromptTokensEdit.setValue(completion_settings.get('max_prompt_tokens', 1000))
truncation_strategy = completion_settings.get('truncation_strategy', 'auto')
self.truncationTypeComboBox.setCurrentText(truncation_strategy)
if truncation_strategy == 'last_messages':
self.lastMessagesSpinBox.setValue(completion_settings.get('last_messages', 10))
truncation_strategy = completion_settings.get('truncation_strategy', {'type': 'auto'})
truncation_type = truncation_strategy.get('type', 'auto') # Default to 'auto' if 'type' is missing
self.truncationTypeComboBox.setCurrentText(truncation_type)
if truncation_type == 'last_messages':
last_messages = truncation_strategy.get('last_messages')
if last_messages is not None:
self.lastMessagesSpinBox.setValue(last_messages)
elif self.assistant_type == "chat_assistant":
self.frequencyPenaltySlider.setValue(completion_settings.get('frequency_penalty', 0) * 100)
self.maxTokensEdit.setValue(completion_settings.get('max_tokens', 1000))
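The change above reflects `truncation_strategy` now being stored as a dict such as `{'type': 'last_messages', 'last_messages': 10}` rather than a bare string. The parsing logic sketched in isolation, with the same defaults as the diff:

```python
def parse_truncation(completion_settings: dict):
    # truncation_strategy is a dict; default to {'type': 'auto'} when absent.
    strategy = completion_settings.get('truncation_strategy', {'type': 'auto'})
    t_type = strategy.get('type', 'auto')  # default to 'auto' if 'type' is missing
    last_messages = None
    if t_type == 'last_messages':
        last_messages = strategy.get('last_messages')  # may legitimately be None
    return t_type, last_messages

print(parse_truncation({}))  # ('auto', None)
print(parse_truncation({'truncation_strategy': {'type': 'last_messages', 'last_messages': 10}}))
# ('last_messages', 10)
```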
@@ -818,6 +821,8 @@ def pre_select_functions(self):
# Check if the function is in the current category and set the corresponding item as checked
for func_config in funcs:
if func_config.name == func_name:
if func_config.get_full_spec() not in self.functions:
self.functions.append(func_config.get_full_spec())
list_widget = self.systemFunctionsList if func_type == 'system' else self.userFunctionsList
for i in range(list_widget.count()):
listItem = list_widget.item(i)
@@ -839,15 +844,18 @@ def handle_function_selection(self, item):
# Since the method now receives an item, we can check directly if this item is checked
if item.checkState() == Qt.Checked:
functionConfig = item.data(Qt.UserRole)
self.functions.append(functionConfig.get_full_spec())
# if functionConfig is already in the list, don't add it again
if functionConfig.get_full_spec() not in self.functions:
self.functions.append(functionConfig.get_full_spec())

# However, to maintain a complete list of checked items, we still need to iterate over all items
for listWidget in [self.systemFunctionsList, self.userFunctionsList]:
for i in range(listWidget.count()):
listItem = listWidget.item(i)
if listItem.checkState() == Qt.Checked:
functionConfig = listItem.data(Qt.UserRole)
self.functions.append(functionConfig.get_full_spec())
if functionConfig.get_full_spec() not in self.functions:
self.functions.append(functionConfig.get_full_spec())

def add_reference_file(self):
self.fileReferenceList.addItem(QFileDialog.getOpenFileName(None, "Select File", "", "All Files (*)")[0])
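Both hunks above wrap `self.functions.append(...)` in a membership check so that re-running the selection handler cannot add the same spec twice. The guard pattern in isolation (the helper name is illustrative):

```python
def add_function_spec(functions: list, spec: dict) -> list:
    # Append the full spec only if an identical spec is not already present.
    if spec not in functions:
        functions.append(spec)
    return functions

specs = []
spec = {"type": "function", "function": {"name": "demo"}}
add_function_spec(specs, spec)
add_function_spec(specs, spec)  # second call is a no-op
print(len(specs))  # 1
```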
2 changes: 1 addition & 1 deletion gui/conversation_sidebar.py
@@ -221,7 +221,7 @@ class ConversationSidebar(QWidget):
def __init__(self, main_window):
super().__init__(main_window)
self.main_window = main_window
self.setMinimumWidth(300)
self.setMinimumWidth(250)
self.assistant_config_manager = AssistantConfigManager.get_instance()
self.assistant_client_manager = AssistantClientManager.get_instance()

2 changes: 1 addition & 1 deletion sdk/azure-ai-assistant/azure/ai/assistant/_version.py
@@ -6,4 +6,4 @@
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------

VERSION = "0.3.0a1"
VERSION = "0.3.1a1"
@@ -343,8 +343,9 @@ def _update_tool_resources(
existing_file_ids = set()
if assistant.tool_resources.file_search:
existing_vs_ids = assistant.tool_resources.file_search.vector_store_ids
all_files_in_vs = list(self._ai_client.beta.vector_stores.files.list(existing_vs_ids[0], timeout=timeout))
existing_file_ids = set([file.id for file in all_files_in_vs])
if existing_vs_ids:
all_files_in_vs = list(self._ai_client.beta.vector_stores.files.list(existing_vs_ids[0], timeout=timeout))
existing_file_ids = set([file.id for file in all_files_in_vs])

# if there are new files to upload or delete, recreate the vector store
assistant_config_vs = None
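The fix above adds an `if existing_vs_ids:` guard before indexing `existing_vs_ids[0]`, since an assistant can have `file_search` configured with an empty `vector_store_ids` list. A stripped-down sketch, where the `list_files` callable stands in for `beta.vector_stores.files.list`:

```python
def collect_existing_file_ids(vector_store_ids, list_files):
    # Only query the first vector store when at least one id exists;
    # otherwise return an empty set instead of raising IndexError.
    existing_file_ids = set()
    if vector_store_ids:
        existing_file_ids = {f["id"] for f in list_files(vector_store_ids[0])}
    return existing_file_ids

fake_store = {"vs_1": [{"id": "file_a"}, {"id": "file_b"}]}
print(collect_existing_file_ids([], fake_store.get))        # set()
print(collect_existing_file_ids(["vs_1"], fake_store.get))  # {'file_a', 'file_b'} (order may vary)
```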
@@ -493,7 +494,8 @@ def _process_messages_non_streaming(
run_start_time = str(datetime.now())
user_request = self._conversation_thread_client.retrieve_conversation(thread_name).get_last_text_message("user").content
self._callbacks.on_run_start(self._name, run.id, run_start_time, user_request)
self._user_input_processing_cancel_requested = False
if self._cancel_run_requested.is_set():
self._cancel_run_requested.clear()
is_first_message = True

while True:
@@ -512,9 +514,9 @@

logger.info(f"Processing run: {run.id} with status: {run.status}")

if self._user_input_processing_cancel_requested:
if self._cancel_run_requested.is_set():
self._ai_client.beta.threads.runs.cancel(thread_id=thread_id, run_id=run.id, timeout=timeout)
self._user_input_processing_cancel_requested = False
self._cancel_run_requested.clear()
logger.info("Processing run cancelled by user, exiting the loop.")
return None
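The diff replaces the `_user_input_processing_cancel_requested` boolean with a `threading.Event` named `_cancel_run_requested`, which can be set safely from the UI thread while the run loop polls it. A minimal sketch of the pattern; the class here is illustrative, not the SDK's:

```python
import threading

class RunLoop:
    def __init__(self):
        self._cancel_run_requested = threading.Event()

    def cancel(self):
        # May be called from any thread.
        self._cancel_run_requested.set()

    def process(self, steps):
        # Clear a stale cancel request left over from a previous run.
        if self._cancel_run_requested.is_set():
            self._cancel_run_requested.clear()
        for _step in steps:
            if self._cancel_run_requested.is_set():
                self._cancel_run_requested.clear()
                return None  # run cancelled
        return "completed"
```

`Event.set()`/`is_set()`/`clear()` give well-defined cross-thread visibility, which a plain shared boolean does not guarantee.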

@@ -136,57 +136,59 @@ def get_config(
logger.warning(f"No configuration found for '{name}'")
return None

# ensure the configurations are up-to-date
self._load_config(name)

# Return the AssistantConfig object for the given name
return self._configs.get(name, None)

def load_configs(self) -> None:
"""
Loads all assistant local configurations from JSON and YAML (both .yaml and .yml) files.
Loads all assistant local configurations from JSON and YAML (both .yaml and .yml) files using the load_config method.
"""
config_files = []
loaded_assistants = set() # Track loaded assistant names to prevent duplicates
try:
config_files = os.listdir(self._config_folder)
except FileNotFoundError:
logger.warning("No assistant configurations found in the config folder.")
return

# Sort to prefer YAML over JSON if both are present for the same configuration
for filename in sorted(config_files, reverse=True):
loaded_assistants = set() # Track loaded assistant names to prevent duplicates
# Identify base names to be processed
for filename in config_files:
if filename.endswith(('_assistant_config.json', '_assistant_config.yaml', '_assistant_config.yml')):
base_name = filename.split('_assistant_config')[0]
if base_name in loaded_assistants:
# Skip if configuration already loaded to avoid duplication
continue
if base_name not in loaded_assistants:
self._load_config(base_name)
loaded_assistants.add(base_name)

if not self._configs:
logger.warning("No valid assistant configurations found.")

self._set_last_modified_assistant()

file_path = os.path.join(self._config_folder, filename)
def _load_config(self, base_name: str) -> None:
# Try loading JSON first, then YAML
extensions = ['_assistant_config.json', '_assistant_config.yaml', '_assistant_config.yml']
for ext in extensions:
file_path = os.path.join(self._config_folder, base_name + ext)
if os.path.exists(file_path):
try:
logger.info(f"Loading assistant configuration from '{file_path}'")
if filename.endswith('.json'):
if file_path.endswith('.json'):
with open(file_path, 'r') as file:
config_data = json.load(file)
elif filename.endswith(('.yaml', '.yml')):
else: # For .yaml or .yml
with open(file_path, 'r') as file:
config_data = yaml.safe_load(file)
else:
# Skip if the file is not in an expected format
continue

assistant_name = config_data.get('name')
if assistant_name:
logger.info(f"Loaded assistant configuration for '{assistant_name}'")
assistant_config = AssistantConfig(config_data)
self._configs[assistant_name] = assistant_config
loaded_assistants.add(base_name)

return # Stop after successfully loading one format to avoid duplicates
except (json.JSONDecodeError, yaml.YAMLError) as e:
logger.warning(f"Invalid format in the assistant configuration file '{file_path}': {e}")
continue # Skip this file and continue with the next

if not self._configs:
logger.warning("No valid assistant configurations found.")

self._set_last_modified_assistant()

def _set_last_modified_assistant(self):
latest_mod_time = None
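The refactor above splits per-assistant loading into `_load_config`, which tries `.json` first, then `.yaml`/`.yml`, and returns after the first file that parses so the same assistant is never loaded twice. A stdlib-only sketch of that preference order; YAML parsing is stubbed out here, since the real code uses `yaml.safe_load`:

```python
import json
import os

EXTENSIONS = ['_assistant_config.json', '_assistant_config.yaml', '_assistant_config.yml']

def load_config(config_folder: str, base_name: str):
    # Return the first config that exists and parses, preferring JSON.
    for ext in EXTENSIONS:
        path = os.path.join(config_folder, base_name + ext)
        if not os.path.exists(path):
            continue
        try:
            with open(path, 'r') as fh:
                if path.endswith('.json'):
                    return json.load(fh)
                # Placeholder for yaml.safe_load(fh), kept stdlib-only here.
                return {'name': base_name}
        except json.JSONDecodeError:
            continue  # invalid file: fall through to the next format
    return None
```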
@@ -359,8 +359,9 @@ async def _update_tool_resources(
existing_file_ids = set()
if assistant.tool_resources.file_search:
existing_vs_ids = assistant.tool_resources.file_search.vector_store_ids
all_files_in_vs = list(self._async_client.beta.vector_stores.files.list(existing_vs_ids[0], timeout=timeout))
existing_file_ids = set([file.id for file in all_files_in_vs])
if existing_vs_ids:
all_files_in_vs = list(self._async_client.beta.vector_stores.files.list(existing_vs_ids[0], timeout=timeout))
existing_file_ids = set([file.id for file in all_files_in_vs])

# if there are new files to upload or delete, recreate the vector store
assistant_config_vs = None
@@ -510,7 +511,8 @@ async def _process_messages_non_streaming(
conversation = await self._conversation_thread_client.retrieve_conversation(thread_name)
user_request = conversation.get_last_text_message("user").content
await self._callbacks.on_run_start(self._name, run.id, run_start_time, user_request)
self._user_input_processing_cancel_requested = False
if self._cancel_run_requested.is_set():
self._cancel_run_requested.clear()
is_first_message = True

while True:
@@ -529,9 +531,9 @@

logger.info(f"Processing run: {run.id} with status: {run.status}")

if self._user_input_processing_cancel_requested:
if self._cancel_run_requested.is_set():
await self._async_client.beta.threads.runs.cancel(thread_id=thread_id, run_id=run.id, timeout=timeout)
self._user_input_processing_cancel_requested = False
self._cancel_run_requested.clear()
logger.info("Processing run cancelled by user, exiting the loop.")
return None

@@ -202,14 +202,15 @@ async def process_messages(
await self._callbacks.on_run_start(self._name, run_id, run_start_time, user_request)

continue_processing = True
self._user_input_processing_cancel_requested = False
if self._cancel_run_requested.is_set():
self._cancel_run_requested.clear()

response = None
while continue_processing:

if self._user_input_processing_cancel_requested:
if self._cancel_run_requested.is_set():
logger.info("User input processing cancellation requested.")
self._user_input_processing_cancel_requested = False
self._cancel_run_requested.clear()
break

text_completion_config = self._assistant_config.text_completion_config