This project implements a chain of thought reasoning and entity extraction system using Azure OpenAI GPT-4o, deployed as an Azure Function with comprehensive observability through OpenTelemetry and Azure Application Insights.
```
Input Text → Chain of Thought Reasoning → Entity Extraction → Structured Output
     ↓                   ↓                       ↓                   ↓
Azure Function  →  PromptFlow  →  Azure OpenAI  →  Response + Telemetry
                                        ↓
                 Application Insights (Traces, Metrics, Logs)
```
- Chain of Thought Reasoning: Step-by-step analysis of input text
- Entity Extraction: Structured extraction of persons, organizations, locations, dates, products, and other entities
- Azure Functions: Serverless hosting with HTTP triggers
- OpenTelemetry: Comprehensive observability with traces, metrics, and logs
- Azure Application Insights: Centralized monitoring and analytics
- Error Handling: Robust error handling with detailed logging
- Health Checks: Built-in health monitoring endpoint
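In outline, the two LLM stages compose like this — a minimal sketch with a stubbed model so it runs offline (the `run_pipeline` helper, the prompt strings, and the `ExtractionResult` shape are illustrative assumptions; the real flow is defined in flow.dag.yaml):

```python
import json
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ExtractionResult:
    reasoning: str
    entities: dict = field(default_factory=dict)

def run_pipeline(text: str, llm: Callable[[str], str]) -> ExtractionResult:
    # Stage 1: chain-of-thought reasoning over the input text.
    reasoning = llm(f"Analyze step by step: {text}")
    # Stage 2: entity extraction, conditioned on the reasoning from stage 1.
    raw = llm(f"Given this analysis:\n{reasoning}\nExtract entities as JSON from: {text}")
    return ExtractionResult(reasoning=reasoning, entities=json.loads(raw))

# Stubbed "model" so the sketch runs without an Azure OpenAI deployment.
def fake_llm(prompt: str) -> str:
    if prompt.startswith("Analyze"):
        return "1. The text mentions a person named John Smith."
    return json.dumps({"persons": ["John Smith"], "organizations": ["Microsoft"]})

result = run_pipeline("John Smith from Microsoft ...", fake_llm)
print(result.entities["persons"])  # expected: ['John Smith']
```

The second stage receives the first stage's reasoning as context, which is the same hand-off the PromptFlow DAG performs between the two prompt templates.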
An Azure subscription with the following resources:
- Azure OpenAI Service with GPT-4o deployment
- Azure Functions (Python 3.9+)
- Azure Application Insights
- Azure Storage Account
Development Environment:
- Python 3.9 or higher
- Azure Functions Core Tools
- Azure CLI
- Git
```bash
git clone <your-repo-url>
cd Promptflowex
pip install -r requirements.txt
```

Copy the template files and update them with your Azure resource details:

```bash
cp .env.template .env
cp local.settings.json.template local.settings.json
```

Update the following values in your `.env` and `local.settings.json`:

- `AZURE_OPENAI_ENDPOINT`: Your Azure OpenAI service endpoint
- `AZURE_OPENAI_API_KEY`: Your Azure OpenAI API key
- `AZURE_OPENAI_DEPLOYMENT_NAME`: Your GPT-4o deployment name
- `APPLICATIONINSIGHTS_CONNECTION_STRING`: Your Application Insights connection string
- `AzureWebJobsStorage`: Your Azure Storage connection string
The application will automatically create the PromptFlow connection, or you can create it manually:
```bash
# Via Azure CLI (if using an Azure ML workspace)
az ml connection create -f azure_openai_connection.yaml -g <resource-group> -w <workspace-name>
```

Run the function locally:

```bash
# Start the Azure Functions runtime
func start

# The function will be available at:
# POST http://localhost:7071/api/process_text
# GET  http://localhost:7071/api/health
```

POST /api/process_text
Request Body:

```json
{
  "text": "John Smith from Microsoft will visit our New York office on December 15th, 2024 to discuss the new Azure AI services."
}
```

Response:

```json
{
  "status": "success",
  "input": "John Smith from Microsoft will visit our New York office on December 15th, 2024 to discuss the new Azure AI services.",
  "reasoning": "Step-by-step analysis of the text...",
  "extracted_entities": {
    "persons": ["John Smith"],
    "organizations": ["Microsoft"],
    "locations": ["New York"],
    "dates": ["December 15th, 2024"],
    "products": ["Azure AI services"],
    "other_entities": {}
  },
  "processing_time_ms": 1250.75
}
```

GET /api/health
Response:

```json
{
  "status": "healthy",
  "service": "promptflow-entity-extraction",
  "timestamp": "2024-11-05T10:30:00.000Z",
  "version": "1.0.0"
}
```

Example requests:

```bash
curl -X POST http://localhost:7071/api/process_text \
  -H "Content-Type: application/json" \
  -d '{
    "text": "Sarah Johnson from Contoso Corporation will attend the quarterly review meeting in Seattle on January 20th, 2025 to present the Q4 financial results."
  }'
```

```bash
curl -X POST http://localhost:7071/api/process_text \
  -H "Content-Type: application/json" \
  -d '{
    "text": "The deployment of the new microservices architecture on Azure Kubernetes Service is scheduled for March 1st, 2025. The team lead, Alex Chen, will coordinate with the DevOps team in London."
  }'
```
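The same endpoint can be called from Python with only the standard library — a sketch in which `build_request` and `process_text` are illustrative helpers, and the local URL matches the `func start` output above:

```python
import json
import urllib.request

def build_request(url: str, text: str) -> urllib.request.Request:
    # JSON body matching the /api/process_text contract shown above.
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}, method="POST"
    )

def process_text(text: str, url: str = "http://localhost:7071/api/process_text") -> dict:
    # Send the request and decode the structured JSON response.
    with urllib.request.urlopen(build_request(url, text), timeout=30) as resp:
        return json.loads(resp.read())

# With the function running locally:
# result = process_text("John Smith from Microsoft will visit our New York office...")
# print(result["extracted_entities"]["persons"])
```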
Provision the Azure resources:

```bash
# Create resource group
az group create --name rg-promptflow --location eastus

# Create storage account
az storage account create --name stpromptflow --resource-group rg-promptflow --location eastus --sku Standard_LRS

# Create Application Insights
az monitor app-insights component create --app promptflow-insights --location eastus --resource-group rg-promptflow

# Create Function App
az functionapp create --resource-group rg-promptflow --consumption-plan-location eastus --runtime python --runtime-version 3.9 --functions-version 4 --name func-promptflow --storage-account stpromptflow --app-insights promptflow-insights
```
Configure and deploy:

```bash
# Set environment variables in the Azure Function App
az functionapp config appsettings set --name func-promptflow --resource-group rg-promptflow --settings \
  "AZURE_OPENAI_ENDPOINT=<your-endpoint>" \
  "AZURE_OPENAI_API_KEY=<your-key>" \
  "AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o" \
  "OTEL_SERVICE_NAME=promptflow-entity-extraction"

# Deploy using Azure Functions Core Tools
func azure functionapp publish func-promptflow
```

The application includes comprehensive observability:
- Distributed Tracing:
  - Request tracing across all components
  - Custom spans for prompt flow operations
  - Context propagation and baggage
- Metrics:
  - Request counters with success/error status
  - Processing duration histograms
  - Custom business metrics
- Logging:
  - Structured logging with correlation IDs
  - Error logging with stack traces
  - Performance logging
Monitor your application in Azure Application Insights:
- Live Metrics: Real-time request and performance data
- Application Map: Service dependencies and call flows
- Performance: Request duration and throughput analysis
- Failures: Error rates and exception analysis
- Logs: Structured log analysis with KQL queries
Example KQL queries:

```kusto
// Request performance over time
requests
| where timestamp > ago(1h)
| summarize avg(duration), count() by bin(timestamp, 5m)

// Error analysis
exceptions
| where timestamp > ago(24h)
| summarize count() by type, outerMessage

// Custom metrics
customMetrics
| where name in ("promptflow_requests_total", "promptflow_processing_duration_ms")
| summarize avg(value) by name, bin(timestamp, 1h)
```

Project structure:

```
Promptflowex/
├── function_app.py               # Azure Function app with OpenTelemetry
├── flow.dag.yaml                 # PromptFlow definition
├── chain_of_thought.jinja2       # Chain of thought prompt template
├── entity_extraction.jinja2      # Entity extraction prompt template
├── requirements.txt              # Python dependencies
├── host.json                     # Azure Functions host configuration
├── azure_openai_connection.yaml  # Connection configuration
├── .env.template                 # Environment variables template
├── local.settings.json.template  # Local Azure Functions settings
├── .gitignore                    # Git ignore rules
└── README.md                     # This documentation
```
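For orientation, a two-node flow of this shape would look roughly like the following in flow.dag.yaml — an illustrative sketch of the structure only, not the file's actual contents (node names, input names, and the connection name are assumptions):

```yaml
# Illustrative sketch only - not the repository's actual flow.dag.yaml
inputs:
  text:
    type: string
outputs:
  entities:
    type: string
    reference: ${entity_extraction.output}
nodes:
- name: chain_of_thought
  type: llm
  source:
    type: code
    path: chain_of_thought.jinja2
  inputs:
    deployment_name: gpt-4o
    text: ${inputs.text}
  connection: azure_openai_connection
  api: chat
- name: entity_extraction
  type: llm
  source:
    type: code
    path: entity_extraction.jinja2
  inputs:
    deployment_name: gpt-4o
    text: ${inputs.text}
    reasoning: ${chain_of_thought.output}
  connection: azure_openai_connection
  api: chat
```

The key structural point is the `${chain_of_thought.output}` reference, which feeds the first node's reasoning into the extraction prompt.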
- Connection Errors:
  - Verify the Azure OpenAI endpoint and API key
  - Check network connectivity
  - Ensure proper RBAC permissions
- PromptFlow Errors:
  - Verify flow.dag.yaml syntax
  - Check template file paths
  - Ensure the connection exists
- Observability Issues:
  - Verify the Application Insights connection string
  - Check the OpenTelemetry configuration
  - Ensure proper instrumentation
Enable debug logging by setting:

```bash
export PYTHONPATH="."
export AZURE_FUNCTIONS_ENVIRONMENT="Development"
```

- Use Azure Key Vault for sensitive configuration
- Implement proper authentication and authorization
- Enable HTTPS only
- Use managed identities where possible
- Monitor cold start times
- Optimize prompt templates for efficiency
- Implement proper caching strategies
- Use appropriate Function App hosting plans
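One possible shape for the caching bullet above — an in-memory memoization sketch keyed by a hash of the input text (illustrative only; `cached_process` and the `_cache` dict are assumptions, and the project itself does not ship a cache):

```python
import hashlib

# Simple in-memory cache: SHA-256 of the input text -> structured result.
_cache: dict[str, dict] = {}

def cached_process(text: str, process) -> dict:
    """Return the cached result for identical input text, else compute and store it."""
    key = hashlib.sha256(text.encode("utf-8")).hexdigest()
    if key not in _cache:
        _cache[key] = process(text)
    return _cache[key]

# Stand-in for the real LLM pipeline, counting how often it actually runs.
calls = []
def fake_process(text: str) -> dict:
    calls.append(text)
    return {"input": text, "extracted_entities": {}}

cached_process("same text", fake_process)
cached_process("same text", fake_process)
print(len(calls))  # expected: 1
```

Identical inputs skip the second model call entirely; in a serverless setting the cache would live in an external store rather than process memory.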
- Set up alerts for error rates and performance thresholds
- Create custom dashboards for business metrics
- Implement proper log retention policies
- Use structured logging for better analysis
This project is licensed under the MIT License - see the LICENSE file for details.
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests and documentation
- Submit a pull request
For issues and questions:
- Check the troubleshooting section
- Review Azure Functions and PromptFlow documentation
- Open an issue in the repository
- Contact the development team