Stream Analytics CI/CD Yaml file for the Tech Sample #549

Merged (16 commits) on Oct 27, 2022
25 changes: 25 additions & 0 deletions single_tech_samples/streamanalytics/JobConfig.json
@@ -0,0 +1,25 @@
{
  "ScriptType": "JobConfig",
  "DataLocale": "en-US",
  "OutputErrorPolicy": "Retry",
  "EventsLateArrivalMaxDelayInSeconds": 5,
  "EventsOutOfOrderMaxDelayInSeconds": 0,
  "EventsOutOfOrderPolicy": "Adjust",
  "StreamingUnits": 3,
  "CompatibilityLevel": "1.2",
  "UseSystemAssignedIdentity": false,
  "GlobalStorage": {
    "AccountName": null,
    "AccountKey": null,
    "AuthenticationMode": "ConnectionString"
  },
  "ContentStoragePolicy": "SystemAccount",
  "CustomCodeStorage": {
    "AccountName": null,
    "AccountKey": null,
    "ContainerName": null,
    "Path": "UserCustomCode.zip"
  },
  "DataSourceCredentialDomain": null,
  "Tags": {}
}
8 changes: 8 additions & 0 deletions single_tech_samples/streamanalytics/README.md
@@ -166,3 +166,11 @@ This sample combines [Azure IoT device SDK](https://www.npmjs.com/package/azure-
![test result output screen capture](docs/images/e2e-test.PNG)

Within the test file [e2e/e2e.ts](e2e/e2e.ts), `EXPECTED_E2E_LATENCY_MS` is defined as 1s, so this would also need to be adjusted for a real implementation.

#### CI/CD

A sample CI/CD pipeline YAML file, [sampleyaml.yml](sampleyaml.yml), is included with this sample. To use it:

1. Replace the values in curly braces (for example `{SUBSCRIPTION ID}`) with your Azure subscription information.
2. Run the build command locally (`azure-streamanalytics-cicd build -project streamanalytics/asaproj.json -outputpath streamanalytics/Output/Deploy`) and look for parameters whose value is `null` in the generated parameters file. Each of these must be supplied in the `overrideParameters` input of the YAML file, where the sample uses `key_vault_secret`. Storing the values in a key vault and referencing the secrets as demonstrated in the sample is recommended, but they can also be passed directly by removing the `$()`; see the sketch after this list.
3. For additional jobs, copy the `AzureResourceManagerTemplateDeployment@3` task and replace `streamanalytics` with the new job name.
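
As an illustration only, the deployment task's `overrideParameters` might end up looking like the snippet below. The parameter names `Input_example_accesskey` and `Output_example_accesskey` and the variable names `key_vault_secret` / `key_vault_secret_2` are placeholders taken from the sample pipeline; your build output will report its own parameter names, and the secret names must match secrets in your key vault.

```yaml
# Sketch only: parameter and secret names below are placeholders from the sample.
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    # ...other inputs unchanged from sampleyaml.yml...
    # Key Vault variant: $(secret-name) resolves to a secret fetched by the AzureKeyVault@2 task.
    overrideParameters: '-Location "Southeast Asia" -Input_example_accesskey $(key_vault_secret) -Output_example_accesskey $(key_vault_secret_2)'
    # Direct variant (less secure): remove the $() and pass the literal value instead.
    # overrideParameters: '-Location "Southeast Asia" -Input_example_accesskey "<access key>"'
```

The Key Vault variant keeps secrets out of the pipeline definition, which is why the sample fetches them with the `AzureKeyVault@2` task first.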
7 changes: 6 additions & 1 deletion single_tech_samples/streamanalytics/asaproj.json
@@ -1,5 +1,10 @@
 {
   "name": "streamanalytics",
   "startFile": "streamanalytics-tech-sample.asaql",
-  "configurations": []
+  "configurations": [
+    {
+      "filePath": "JobConfig.json",
+      "subType": "JobConfig"
+    }
+  ]
 }
33 changes: 33 additions & 0 deletions single_tech_samples/streamanalytics/sampleyaml.yml
@@ -0,0 +1,33 @@
trigger:
- main
pool:
  name: Azure Pipelines
  demands: npm
steps:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: '{RESOURCE MANAGER CONNECTION}'
    KeyVaultName: '{KEY VAULT NAME}'
    SecretsFilter: '*'
    RunAsPreJob: false
- task: Npm@1
  displayName: 'Install Azure stream analytics ci cd'
  inputs:
    command: custom
    verbose: false
    customCommand: 'install -g azure-streamanalytics-cicd'
- script: 'azure-streamanalytics-cicd build -project streamanalytics/asaproj.json -outputpath streamanalytics/Output/Deploy'
  displayName: 'Build Stream Analytics proj'
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: '{RESOURCE MANAGER CONNECTION}'
    subscriptionId: '{SUBSCRIPTION ID}'
    action: 'Create Or Update Resource Group'
    resourceGroupName: '{RESOURCE GROUP}'
    location: 'Southeast Asia'
    templateLocation: 'Linked artifact'
    csmFile: 'streamanalytics/Output/Deploy/streamanalytics.JobTemplate.json'
    csmParametersFile: 'streamanalytics/Output/Deploy/streamanalytics.JobTemplate.parameters.json'
    overrideParameters: '-Location "Southeast Asia" -Input_example_accesskey $(key_vault_secret) -Output_example_accesskey $(key_vault_secret_2)'
    deploymentMode: 'Incremental'
123 changes: 123 additions & 0 deletions single_tech_samples/streamanalytics/template-parameters-generator.py
@@ -0,0 +1,123 @@
import json

import typer

singleindent = " " * 2
doubleindent = " " * 4


# To run this, pass the parameters of main() below on the command line. To
# generate the job template path, first run the command
# 'azure-streamanalytics-cicd build -project "asaproj.json"' from within the
# project folder; this generates the template and parameters files.
def main(
    subscription_id: str,
    subscription_name: str,
    job_name: str,
    job_location: str,
    job_template_path: str,
    resource_group_name: str,
    triggerbranch: str,
    key_vault_name: str = "",
):
    # Write the pipeline header (trigger, pool and the start of the steps list).
    yamlfile = open("asa_cicd_pipeline_yaml", "w")
    yamlfile.write("trigger:\n")
    yamlfile.write(
        f"- {triggerbranch}\npool:\n{singleindent}name: Azure Pipelines\n{singleindent}demands: npm\nsteps:\n"
    )
    using_key_vault = key_vault_name != ""
    if using_key_vault:
        # Fetch secrets up front so later steps can reference them as $(secret-name).
        yamlfile.write(
            f"- task: AzureKeyVault@2\n{singleindent}inputs:\n{doubleindent}azureSubscription: '{subscription_name} ({subscription_id})'\n{doubleindent}KeyVaultName: '{key_vault_name}'\n{doubleindent}SecretsFilter: '*'\n{doubleindent}RunAsPreJob: false\n"
        )
    yamlfile.write(
        f"- task: Npm@1\n{singleindent}displayName: 'Install Azure stream analytics ci cd'\n{singleindent}inputs:\n{doubleindent}command: custom\n{doubleindent}verbose: false\n{doubleindent}customCommand: 'install -g azure-streamanalytics-cicd'"
    )
    yamlfile.close()
    create_all_asa_jobs(
        yamlfile,
        subscription_id,
        using_key_vault,
        subscription_name,
        job_location,
        job_name,
        resource_group_name,
        job_template_path,
    )


def create_all_asa_jobs(
    yamlfile,
    subscription_id,
    using_key_vault,
    subscription_name,
    job_location,
    job_name,
    resource_group_name,
    job_template_path,
):
    # Reopen the pipeline file in append mode and add the build and deploy steps.
    yamlfile = open("asa_cicd_pipeline_yaml", "a")
    template_params = open(job_template_path, "r")
    output = template_params.read()
    template_json = json.loads(output)["parameters"]
    new_json = create_asa_job_default_values(job_name)
    create_project_arr = create_asa_job(
        template_json,
        job_location,
        subscription_id,
        job_name,
        using_key_vault,
        resource_group_name,
        subscription_name,
    )
    yamlfile.write(new_json)
    yamlfile.writelines(create_project_arr)

    yamlfile.close()


def create_asa_job_default_values(project_name):
    # Build step plus the opening of the ARM template deployment task.
    new_json = "\n"

    new_json += f"\n- script: 'azure-streamanalytics-cicd build -project {project_name}/asaproj.json -outputpath {project_name}/Output/Deploy'\n"
    new_json += "  displayName: 'Build Stream Analytics proj'"

    new_json += "\n"
    new_json += f"- task: AzureResourceManagerTemplateDeployment@3\n{singleindent}inputs:\n{doubleindent}"
    return new_json


def create_asa_job(
    template_json,
    project_location,
    subscription_id,
    project_name,
    usingkeyvault,
    resourcegroup,
    subscription_name,
):
    # Parameters whose value is null in the generated template must be supplied
    # through overrideParameters, ideally as Key Vault secret references.
    overrideparams = f'-Location "{project_location}"'
    for key in template_json:
        if template_json[key]["value"] is None:
            if usingkeyvault:
                overrideparams += (
                    f" -{key} $([REQUIRED: Add Secret Name From Keyvault])"
                )
            else:
                overrideparams += f' -{key} "[REQUIRED: Add Name]"'

    create_project_arr = [
        "deploymentScope: 'Resource Group'\n",
        f"{doubleindent}azureResourceManagerConnection: '{subscription_name} ({subscription_id})'\n",
        f"{doubleindent}subscriptionId: '{subscription_id}'\n",
        f"{doubleindent}action: 'Create Or Update Resource Group'\n",
        f"{doubleindent}resourceGroupName: '{resourcegroup}'\n",
        f"{doubleindent}location: '{project_location}'\n",
        f"{doubleindent}templateLocation: 'Linked artifact'\n",
        f"{doubleindent}csmFile: '{project_name}/Output/Deploy/{project_name}.JobTemplate.json'\n",
        f"{doubleindent}csmParametersFile: '{project_name}/Output/Deploy/{project_name}.JobTemplate.parameters.json'\n",
        f"{doubleindent}overrideParameters: '{overrideparams}'\n",
        f"{doubleindent}deploymentMode: 'Incremental'\n",
    ]
    return create_project_arr


if __name__ == "__main__":
    typer.run(main)
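
As a sketch of how this generator might be used: invoking it with placeholder values such as `python template-parameters-generator.py <subscription-id> <subscription-name> streamanalytics "Southeast Asia" streamanalytics/Output/Deploy/streamanalytics.JobTemplate.parameters.json <resource-group> main --key-vault-name <key-vault>` (the parameters-file path is an assumption based on the script reading each parameter's `value` field) writes a file named `asa_cicd_pipeline_yaml` resembling the following:

```yaml
# Sketch of the generated pipeline; values in angle brackets are placeholders.
trigger:
- main
pool:
  name: Azure Pipelines
  demands: npm
steps:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: '<subscription-name> (<subscription-id>)'
    KeyVaultName: '<key-vault>'
    SecretsFilter: '*'
    RunAsPreJob: false
- task: Npm@1
  displayName: 'Install Azure stream analytics ci cd'
  inputs:
    command: custom
    verbose: false
    customCommand: 'install -g azure-streamanalytics-cicd'

- script: 'azure-streamanalytics-cicd build -project streamanalytics/asaproj.json -outputpath streamanalytics/Output/Deploy'
  displayName: 'Build Stream Analytics proj'
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: '<subscription-name> (<subscription-id>)'
    subscriptionId: '<subscription-id>'
    action: 'Create Or Update Resource Group'
    resourceGroupName: '<resource-group>'
    location: 'Southeast Asia'
    templateLocation: 'Linked artifact'
    csmFile: 'streamanalytics/Output/Deploy/streamanalytics.JobTemplate.json'
    csmParametersFile: 'streamanalytics/Output/Deploy/streamanalytics.JobTemplate.parameters.json'
    overrideParameters: '-Location "Southeast Asia" -<parameter reported as null> $([REQUIRED: Add Secret Name From Keyvault])'
    deploymentMode: 'Incremental'
```

Each `[REQUIRED: ...]` placeholder emitted by the script must be replaced by hand before the pipeline is run.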
8 changes: 4 additions & 4 deletions single_tech_samples/streamanalytics/test/testConfig.json
@@ -34,8 +34,8 @@
"ExpectedOutputs": [
{
"OutputAlias": "bloboutput",
"FilePath": "/dev/null",
"Required": true
"FilePath": "temperature_less_than_27_degrees.json",
"Required": false
snorris31 marked this conversation as resolved.
Show resolved Hide resolved
}
]
},
@@ -53,8 +53,8 @@
"ExpectedOutputs": [
{
"OutputAlias": "bloboutput",
"FilePath": "/dev/null",
"Required": true
"FilePath": "temperature_equal_to_27_degrees.json",
"Required": false
snorris31 marked this conversation as resolved.
Show resolved Hide resolved
}
]
}