.NET Framework for Windows (prerequisite): https://dotnet.microsoft.com/en-us/download/dotnet-framework
In the context of this project, data streams are event data generated by sensors or other sources, which can then be analyzed by another technology. A data stream is typically analyzed to measure the state change of a component or to capture information about an area of interest. The process of consuming data streams, analyzing them, and deriving actionable insights from them is called event processing. It requires an event producer, an event processor, and an event consumer.
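To make this concrete, a single event in such a stream is often a small JSON document emitted by a sensor. The payload below is hypothetical (the field names are illustrative, loosely matching the wind-turbine scenario used later in this guide):

```json
{
  "deviceId": "WindTurbineSensorID",
  "timestamp": "2024-01-15T10:30:00Z",
  "windSpeed": 12.4,
  "rpm": 18.2
}
```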
- Once you have access to your Azure subscription, create a resource group.
- Create a source storage account. (The destination storage account is created in a later step.)
- Under the storage account resource, click Containers under the Data storage panel.
- Add a new container and give it a name; this guide uses input.
- After creating the container, click Storage browser in the left panel and select Blob containers. Navigate to the container you just created. We will now create a new folder in this container: click Add Directory and name it using the current date in YYYY-MM-DD format.
- Repeat the storage account and container creation steps above to create a destination storage account; this guide uses output for its container name. An equivalent Azure CLI sketch follows.
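If you prefer the command line, the portal steps above can be approximated with the Azure CLI. All resource names below (rg-streaming, sourcesa123, and the region) are placeholders, not values mandated by this guide:

```bash
# Create a resource group (name and region are placeholders)
az group create --name rg-streaming --location eastus

# Create the source storage account
az storage account create --name sourcesa123 --resource-group rg-streaming \
    --location eastus --sku Standard_LRS

# Create the "input" container inside it
az storage container create --account-name sourcesa123 --name input --auth-mode login

# List the account keys (the CLI equivalent of Access keys > key1 in the portal)
az storage account keys list --account-name sourcesa123 --resource-group rg-streaming
```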
- Now, we will create a Stream Analytics job in the Azure portal that will transfer data from the source storage account to the destination storage account.
- Ensure the hosting environment is set to Cloud and set Streaming units to 1. Click Create to create the job.
- Before proceeding to the next step, visit your source storage account. Select Access keys under Security + networking. Click Show for the key under key1 and copy it. We will need this key shortly.
- Once the Stream Analytics job has been deployed, go to the resource. Select Inputs under Job topology. Since this is a new job, you should see no inputs yet. Let's add a new stream input: click the Add stream input dropdown and select Blob storage/ADLS Gen2.
- Under Blob storage/ADLS Gen2, enter your desired input alias and select the source storage account you created. Change the authentication mode to Connection string. For the storage account key, paste the key you copied from Access keys earlier. Click Save.
- Your input should look like this.
- To create an output, select Outputs under Job topology on the left panel. Add a Blob storage/ADLS Gen2 output and configure it the same way you configured the input, this time using the destination storage account and its access key.
- Your output should look like this.
- Click Query under Job topology. Verify your query by making sure that [output] and [input] match the aliases you assigned to your output and input respectively; the default query is shown below.
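For reference, a new job's default pass-through query has the shape below; the bracketed names are placeholders for whatever aliases you chose:

```sql
-- Pass-through query: copy every event from the input to the output unchanged
SELECT
    *
INTO
    [output]
FROM
    [input]
```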
- Go to the Overview tab and start the job.
- Once the job has started successfully, go to storage accounts > source storage account > containers > input. Here, you'll see the folder you created with the YYYY-MM-DD naming format. Select the folder and upload input-01.json.
- While the Stream Analytics job is running, go to storage accounts > destination storage account > containers > output. Here you'll find a newly generated JSON file. On checking the file, you can verify that the contents of the input and output JSON files match.
- Create an IoT hub by searching for IoT Hub in the Azure portal (see the linked guide at the end of this section).
- Go to Devices under Device management.
- Click Add Device and enter WindTurbineSensorID as the device ID.
- Create the device and select it. Copy the primary connection string and keep it handy; an equivalent CLI sketch follows.
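The same hub and device setup can be sketched with the Azure CLI. The hub name is a placeholder, and the device-identity commands require the azure-iot extension (az extension add --name azure-iot):

```bash
# Create the IoT hub (name is a placeholder; S1 is the standard tier)
az iot hub create --name my-wind-hub --resource-group rg-streaming --sku S1

# Register the device identity
az iot hub device-identity create --hub-name my-wind-hub --device-id WindTurbineSensorID

# Retrieve the device's primary connection string
az iot hub device-identity connection-string show \
    --hub-name my-wind-hub --device-id WindTurbineSensorID
```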
- Now we'll test an IoT device simulator using VS Code.
- You must have .NET installed on your system for this step: https://dotnet.microsoft.com/en-us/download/dotnet?cid=getdotnetcorecli
- Paste the primary connection string you copied into line 44 of the simulator code, then execute dotnet run in the terminal.
- You should see that the IoT device simulator is running. A sketch of what such a simulator looks like follows.
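For orientation, a minimal device simulator along these lines can be built on the Microsoft.Azure.Devices.Client NuGet package. This is a sketch under that assumption, not the exact sample file referenced above (its line numbers and telemetry fields will differ); the connection-string constant is where the value you copied gets pasted:

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class SimulatedDevice
{
    // Paste the device's primary connection string from the Azure portal here
    private const string ConnectionString = "<your device primary connection string>";

    static async Task Main()
    {
        using var deviceClient = DeviceClient.CreateFromConnectionString(
            ConnectionString, TransportType.Mqtt);
        var rand = new Random();

        while (true)
        {
            // Build a hypothetical wind-turbine telemetry payload
            string payload = $"{{\"windSpeed\": {10 + rand.NextDouble() * 5:F2}}}";
            using var message = new Message(Encoding.UTF8.GetBytes(payload));

            await deviceClient.SendEventAsync(message);
            Console.WriteLine($"Sent: {payload}");
            await Task.Delay(1000); // one event per second
        }
    }
}
```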
- Visit the Azure portal and navigate to your IoT hub. Under Hub settings, select Message routing.
- Create a Stream Analytics job for the logging route.
- Next, we'll create an Event Hubs namespace.
- After the namespace is created, visit the resource; here you'll create an event hub inside it. Click Add event hub, give it a name, and create your event hub. An equivalent CLI sketch follows.
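As before, the Azure CLI equivalent looks roughly like this (all names are placeholders):

```bash
# Create an Event Hubs namespace, then an event hub inside it
az eventhubs namespace create --name my-eh-namespace --resource-group rg-streaming \
    --location eastus --sku Standard
az eventhubs eventhub create --name logging-hub --resource-group rg-streaming \
    --namespace-name my-eh-namespace
```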
- Navigate to your IoT hub, go to Message routing, and create a new route by clicking Add.
- Create the route, then enter the telemetry code from the provided text file.
- Go to the Stream Analytics job > Inputs > Add stream input > Event Hub.
- After saving, test the input. Then go to Outputs > Add > Power BI and authorize the connection. A sketch of the resulting query follows.
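As with the first job, the query simply wires the input alias to the output alias. A minimal sketch, assuming the aliases eventhub-input and powerbi-output (both placeholders for whatever you named them):

```sql
-- Forward routed telemetry from the event hub input into the Power BI dataset
SELECT
    *
INTO
    [powerbi-output]
FROM
    [eventhub-input]
```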
- Create an IoT hub using the Azure portal: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-create-through-portal
- Message routing in IoT Hub: https://learn.microsoft.com/en-us/azure/iot-hub/how-to-routing-portal?tabs=eventhubs