Features contained in this repository are in private preview. Preview versions are provided without a service level agreement, and they are not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
This repository is part of the Azure AI Studio preview.
To get started quickly, you can use a pre-built development environment. Click the button below to open the repo in GitHub Codespaces, and then continue with this README!
If you want to get started in your local environment, first install the packages:
git clone https://github.com/azure/aistudio-copilot-sample
cd aistudio-copilot-sample
pip install -r requirements.txt
Then install the Azure AI CLI, on Ubuntu:
curl -sL https://aka.ms/InstallAzureAICLIDeb | sudo bash
To install the CLI on Windows and MacOS, follow the instructions here.
Run ai init to create new Azure resources and/or connect to existing ones:
ai init
- This will first prompt you to log in to Azure
- Then it will ask you to select or create resources. Choose the AI Project resource and follow the prompts to create an Azure OpenAI resource, model deployments, and an Azure AI Search resource
- This will generate a config.json file in the root of the repo; the SDK will use this file when authenticating to Azure AI services
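Because config.json is plain JSON, it can be read with the standard library alone. A minimal sketch follows; the field names shown are assumptions for illustration, so check the file that `ai init` actually writes:

```python
import json
import tempfile
from pathlib import Path

def load_ai_config(path):
    """Read the project config that `ai init` writes to the repo root."""
    return json.loads(Path(path).read_text())

# Illustrative contents only -- the real file is generated by `ai init`,
# and its exact field names may differ.
example = {
    "subscription_id": "<your-subscription-id>",
    "resource_group": "<your-resource-group>",
    "project_name": "<your-ai-project>",
}

with tempfile.TemporaryDirectory() as tmp:
    cfg_path = Path(tmp) / "config.json"
    cfg_path.write_text(json.dumps(example))
    cfg = load_ai_config(cfg_path)

print(cfg["project_name"])
```

In the repo itself you would point `load_ai_config` at the generated file rather than writing a sample first.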
Note: You can open your project in AI Studio to view your project's configuration and components (generated indexes, evaluation runs, and endpoints).
Run the following CLI command to create an index that our code can use for data retrieval:
ai search index update --files "./data/3-product-info/*.md" --index-name "product-info"
Now, generate a .env file that configures the code in the subsequent steps to use the resources we've created:
ai dev new .env
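A .env file is just KEY=VALUE lines that the samples read at startup. Here is a dependency-free sketch of loading one; the variable names are illustrative guesses, so inspect the file `ai dev new` generates for the real ones (many projects use the python-dotenv package for this instead):

```python
import os
import tempfile
from pathlib import Path

def load_dotenv(path):
    """Parse simple KEY=VALUE lines from a .env file into os.environ."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip().strip('"'))

# Illustrative variable names only -- check the generated .env for the real ones.
sample = (
    'AZURE_OPENAI_ENDPOINT="https://example.openai.azure.com"\n'
    'AZURE_AI_SEARCH_INDEX_NAME="product-info"\n'
)

with tempfile.TemporaryDirectory() as tmp:
    env_path = Path(tmp) / ".env"
    env_path.write_text(sample)
    load_dotenv(env_path)

print(os.environ["AZURE_AI_SEARCH_INDEX_NAME"])
```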
To run a single question & answer through the sample co-pilot:
python src/run.py --question "which tent is the most waterproof?"
You can try out different sample implementations by specifying the `--implementation` flag with `promptflow`, `semantickernel`, `langchain`, or `aisdk`. To try running with Semantic Kernel:
python src/run.py --implementation semantickernel --question "what is the waterproof rating of the tent I just ordered?"
To try out the promptflow implementation, check that the deployment names (both embedding and chat) and the index name (if it changed from the previous steps) in `src/copilot_promptflow/flow.dag.yaml` match what's in the `.env` file.
python src/run.py --question "which tent is the most waterproof?" --implementation promptflow
The `--implementation` flag can be used in combination with the evaluate command below as well.
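Internally, a flag like `--implementation` typically just selects which chat function to call. A hedged sketch of that dispatch pattern follows; the function names and module layout here are stand-ins, not the repo's actual structure:

```python
import argparse

# Stand-in implementations; in the real repo these would live in
# modules such as src/copilot_promptflow and src/copilot_semantickernel.
def chat_promptflow(question):
    return f"[promptflow] {question}"

def chat_semantickernel(question):
    return f"[semantickernel] {question}"

IMPLEMENTATIONS = {
    "promptflow": chat_promptflow,
    "semantickernel": chat_semantickernel,
}

parser = argparse.ArgumentParser()
parser.add_argument("--question", required=True)
parser.add_argument("--implementation", choices=IMPLEMENTATIONS, default="promptflow")

# Simulate the command line from the example above.
args = parser.parse_args(["--implementation", "semantickernel",
                          "--question", "which tent is the most waterproof?"])

answer = IMPLEMENTATIONS[args.implementation](args.question)
print(answer)
```

Using `choices=IMPLEMENTATIONS` makes argparse reject unknown implementation names with a helpful error instead of failing later with a `KeyError`.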
You can also use the `ai` CLI to submit a single question and/or chat interactively with the sample co-pilots, or the default "chat with your data" co-pilot:
ai chat --interactive # uses default "chat with your data" copilot
ai chat --interactive --function src/copilot_aisdk/chat:chat_completion
To run evaluation on a copilot implementation:
python src/run.py --evaluate
You can also run pytest to run tests that use the evaluation results to pass or fail:
pytest
This will run the tests in `src/test_copilot.py` using `evaluation_dataset.jsonl` as the test dataset. This computes a set of metrics scored by ChatGPT on a 1-5 scale, and fails a metric if its average score is less than 4.
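The pass/fail logic amounts to averaging each metric over the dataset and asserting a threshold. A minimal sketch of that style of test follows; the metric names and scores here are made up for illustration:

```python
# Illustrative scores only -- a real run would take these from the
# evaluation results rather than hard-coding them.
results = [
    {"gpt_groundedness": 5, "gpt_relevance": 4},
    {"gpt_groundedness": 4, "gpt_relevance": 5},
    {"gpt_groundedness": 5, "gpt_relevance": 4},
]

def average(metric, rows):
    return sum(row[metric] for row in rows) / len(rows)

def test_metrics_meet_threshold():
    # Fail a metric if its average 1-5 score drops below 4.
    for metric in ("gpt_groundedness", "gpt_relevance"):
        assert average(metric, results) >= 4, f"{metric} below threshold"

test_metrics_meet_threshold()
print("all metrics >= 4")
```

Under pytest, any function named `test_*` is collected automatically, so the explicit call at the bottom is only there to make the sketch runnable on its own.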
You can also use the `ai` CLI to do bulk runs and evaluations:
ai chat evaluate --input-data src/tests/evaluation_dataset.jsonl # uses default "chat with your data" copilot
ai chat evaluate --input-data src/tests/evaluation_dataset.jsonl --function src/copilot_aisdk/chat:chat_completion
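The `--input-data` file is JSON Lines: one JSON object per line, one test case each. A sketch of reading such a file follows; the field names (`question`, `truth`) are an assumption about this dataset's schema, so check `src/tests/evaluation_dataset.jsonl` for the real ones:

```python
import json
import tempfile
from pathlib import Path

# Assumed schema -- inspect the real dataset for its actual field names.
sample = (
    '{"question": "which tent is the most waterproof?", "truth": "<expected answer>"}\n'
    '{"question": "what tents are available?", "truth": "<expected answer>"}\n'
)

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "evaluation_dataset.jsonl"
    path.write_text(sample)
    # Each non-empty line is an independent JSON document.
    cases = [json.loads(line) for line in path.read_text().splitlines() if line.strip()]

print(len(cases), "cases; first question:", cases[0]["question"])
```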
To deploy one of the implementations to an online endpoint, use:
python src/run.py --deploy
To test the online endpoint, run:
python src/run.py --invoke
For a more detailed tutorial using this notebook, you can follow the Build a co-pilot using the Azure AI SDK tutorial.
You can pip install packages into your development environment, but they will disappear if you rebuild your container and will need to be reinstalled (rebuilding is not automatic). You may want this, so that you can easily reset back to a clean environment. Or, you may want to install some packages into the container by default, so that you don't need to reinstall them after a rebuild.
To add packages to the default container, update the Dockerfile in `.devcontainer/Dockerfile`, then rebuild the development container from the command palette by pressing `Ctrl/Cmd+Shift+P` and selecting the `Rebuild Container` command.
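For example, a line like the following could be appended to `.devcontainer/Dockerfile` to bake an extra package into the image; the package shown is purely illustrative:

```dockerfile
# Installed at image build time, so it survives container rebuilds.
RUN pip install --no-cache-dir azure-identity
```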
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.