
HIS_Project: IoT - From Microcontroller to the Cloud

High Integrity Systems project - Frankfurt University of Applied Sciences.

Co-authors:

  • Phuc Hoc Tran - 1235133
  • Jaime Sanchez Cotta - 1430488

This Markdown file was written by Phuc Hoc Tran (1235133).

The aim of the project is to create an application that sends MQTT-SN messages containing artificial sensor measurements (the client is deployed on FIT/IoT-LAB) to the AWS cloud (in this project, we used AWS IoT Core) and visualizes the measurements on AWS SageMaker.


Prerequisites:

  • A registered account on FIT/IoT-LAB
  • The RIOT application
  • An AWS account (either free tier or paid)
  • The Paho MQTT-SN Gateway
  • Bash/shell knowledge
  • C and Python programming knowledge.

Architecture:

1. Overarching idea:

architecture img

  • In this project, the following applications from RIOT are utilized:
    • emcute_mqttsn: the MQTT-SN client, which publishes the sensor data to and subscribes through the MQTT-SN Gateway.
    • Border Router.

2. In detail:

Before arriving at the current architecture, we came up with two other architectures that could not work due to AWS IAM permission restrictions (our accounts are LearnerLab-based, so some limitations prevented us from continuing with the first two). The main functions of each component are:

  1. Sensor: an emulated sensor that generates artificial values following a zigzag pattern. It produces 5 variables: temperature, humidity, wind direction, wind intensity and rain height.
  2. Border Router: used to assign a public (global) IPv6 address to the node within the Grenoble site.
  3. MQTT-SN Gateway: since the brokers running on the EC2 instance and in IoT Core are MQTT brokers, the Gateway is needed to convert MQTT-SN messages into MQTT messages. After being assigned an IP address by the Border Router, the sensor node (the MQTT-SN client) can connect and send messages to the Gateway. Once connected to both the client and the broker on EC2, the Gateway performs the conversion and forwards the messages received from the client to the MQTT broker.
  4. EC2 Instance: a firewall on the FIT/IoT-LAB side prevents any subscriptions and incoming messages, and a direct connection to AWS IoT Core is only possible if the client holds TLS certificates, so an EC2 instance is utilized to ease this complication. By setting up an MQTT bridge, along with the necessary certificates, the bridge starts automatically every time the EC2 instance starts. Therefore, we only need to connect the Gateway to this EC2 instance via its IPv4 address, without providing any further certificates.
  5. IoT Core: acts as an MQTT broker itself. It can also be used as a publisher/subscriber, since IoT Core offers that service in the MQTT test clients section.
  6. IoT Analytics: the professor's requirement was only up to visualizing the data, either text-based or image-based; in other words, successfully sending data from a sensor node to the AWS cloud (either a subscriber on EC2 or IoT Core) would have been more than adequate. However, we took it a step further and tried out other AWS services, including DynamoDB to save the JSON data, and IoT Analytics as a data pipeline that forwards messages received at the IoT Core endpoint to a visualization tool and performs some pre-processing on them. In our case, we add a timestamp to the messages in the IoT Core Message Routing rule before sending them to IoT Analytics.
  7. AWS S3: when data enters IoT Analytics, a pre-processing step extracts the parameters of the originally JSON-based data, together with the timestamp from the previous step, and saves them in CSV format in an S3 bucket. Every message sent by the sensor is stored in this bucket for later visualization.
  8. AWS SageMaker: an obvious overkill just for data visualization. However, after two failed attempts at other methods (querying data from DynamoDB to the local machine and visualizing it on a Flask webpage with Matplotlib, or using AWS QuickSight, a BI tool from AWS, with IoT Analytics as the data pipeline), both hindered by IAM role permission problems, AWS SageMaker was the winning option, albeit not really a suitable one. A JupyterLab notebook instance was created on SageMaker, and we used boto3 + matplotlib to query the data from the S3 bucket and visualize it in the notebook.

Disclaimer: after the final presentation, thanks to our colleagues, we realized that there is an open-source Python library dedicated to retrieving data directly from AWS IoT Core, without having to go through the other AWS services just for visualization. In other words, the architecture could actually be simplified and trimmed down to a great extent. However, if one wishes to explore AWS services, this project can serve as a good practice example.
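For illustration, such a direct route could look like the sketch below. This is not the authors' code: paho-mqtt is merely one library capable of this (the AWS IoT Device SDK for Python is another), and the endpoint, certificate paths and topic are placeholders.

# Hedged sketch: subscribe directly to AWS IoT Core over TLS with paho-mqtt 1.x
# (paho-mqtt 2.x additionally requires a CallbackAPIVersion constructor argument).
import ssl
import paho.mqtt.client as mqtt

ENDPOINT = "xxxxxxxxxxxxxxx-ats.iot.<region>.amazonaws.com"  # device data endpoint
TOPIC = "his_project/his_iot/sensor_data"

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC, qos=1)          # subscribe once the TLS session is up

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())  # e.g. feed this into a plot instead

client = mqtt.Client()
client.tls_set(ca_certs="rootCA.pem", certfile="cert.crt", keyfile="private.key",
               tls_version=ssl.PROTOCOL_TLSv1_2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(ENDPOINT, 8883)              # MQTT-over-TLS port used by IoT Core
client.loop_forever()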


Sensors:

Based on the requirements, we decided to create an emulated sensor whose values follow a zigzag pattern, as follows:

/* Function to create the zigzag pattern: maps x onto a triangle wave
 * oscillating between 0 and vMax with period 2*vMax. */
int posRead = 0;
int vMax = 20;
int arrayAux[41];

int zigZag_val(int x){
    if (x >= 0 && x < vMax) {
        return x;                         /* rising edge */
    }
    else if (x >= vMax && x < 2 * vMax) {
        return 2 * vMax - x;              /* falling edge */
    }
    else if (x >= 2 * vMax) {
        return zigZag_val(x - 2 * vMax);  /* wrap around from the right */
    }
    else {                                /* x < 0: wrap around from the left */
        return zigZag_val(x + 2 * vMax);
    }
}

How it works: the temperature value zigzags between 0 and 20, changing by a step of 2 at each position. The other sensor parameters follow the same pattern but are shifted by a constant offset from the temperature value:

  • +1 for humidity (values range from 1 to 21),
  • -1 for windDirection (-1 to 19),
  • +2 for windIntensity (2 to 22),
  • -2 for rainHeight (-2 to 18).

The values are then generated:

/* Generate the zigzag pattern and read the sensor values at 'position'. */
void gen_sensors_values(t_sensors* sensors, int position){
    int i = 0;
    int x = -2 * vMax;

    /* Sample the zigzag function from -2*vMax to 2*vMax in steps of 2
     * (41 samples, matching the size of arrayAux). */
    while (x <= 2 * vMax) {
        arrayAux[i] = zigZag_val(x);
        x = x + 2;
        i++;
    }

    /* Each parameter is a constant offset of the same zigzag value. */
    sensors->temperature   = arrayAux[position];
    sensors->humidity      = arrayAux[position] + 1;
    sensors->windDirection = arrayAux[position] - 1;
    sensors->windIntensity = arrayAux[position] + 2;
    sensors->rainHeight    = arrayAux[position] - 2;
}

One should first familiarize oneself with this tutorial from FIT/IoT-LAB. Afterwards, the same chain of commands is used (except for deploying the MQTT broker - step 9 in the mentioned tutorial) to initialize the sensor and the Border Router. Then the sensor values are sent to IoT Core via the MQTT-SN Gateway using:

pub_sensor <GatewayIPv6Address> <GatewayPort> [CustomTopic]

The topic is predefined in the function below. If one wishes to publish to another topic, one should first include the corresponding topic forwarding rule in the bridge.conf file deployed on EC2, then add the topic argument to the pub_sensor command.

static int sensors_read(int argc, char **argv){
    emcute_topic_t t;
    ...
    /* Predefined topic: */
    char topic_buf[100] = "his_project/his_iot/sensor_data";
    char *topic = topic_buf;  /* the array decays to a pointer; no cast needed */
    ...

EC2 Instance & AWS IoT Core Setup:

Multiple online sources describe how to set up the EC2 instance and AWS IoT Core. One of them is this really good article from AWS themselves on setting up brokers on both EC2 and AWS IoT Core.

Fortunately, although our accounts are not allowed to use the IAM service, we managed to work around that by simply creating an EC2 instance as the tutorial instructed, manually creating an AWS IoT Core Thing, and attaching a policy to it. Then follow the steps in the article above to set up the brokers.

Afterwards, simply bridge the two brokers by pasting the AWS IoT Core endpoint (found via AWS IoT Core's website/Settings/Device data endpoint, usually in the form xxx-ats.iot.REGION.amazonaws.com) into the address line of bridge.conf:

# ============================================================
# Bridge to AWS IOT
# ============================================================

connection awsiot

## Paste your AWS IoT Core ATS endpoint in the form of xxxxxxxxxxxxxxx-ats.iot.<region>.amazonaws.com:8883

address xxxxxxxxxxxxxxx-ats.iot.<region>.amazonaws.com:8883

# Specifying which topics are bridged and in what fashion
topic awsiot_to_localgateway in 1
topic localgateway_to_awsiot out 1
topic both_directions both 1
## additional line: predefined topic
topic his_project/his_iot/sensor_data both 1
## end additional line

# Setting protocol version explicitly
bridge_protocol_version mqttv311
bridge_insecure false

# Bridge connection name and MQTT client Id, enabling the connection automatically when the broker starts.
cleansession true
clientid bridgeawsiot
start_type automatic
## 2 additional lines: allow receiving all messages sent to port 1884
listener 1884 0.0.0.0
allow_anonymous true
## end 2 additional lines
notifications false
log_type all

# ============================================================
# Certificate based SSL/TLS support
# ============================================================

#Path to the rootCA
bridge_cafile /etc/mosquitto/certs/rootCA.pem

# Path to the PEM encoded client certificate
bridge_certfile /etc/mosquitto/certs/cert.crt

# Path to the PEM encoded client private key
bridge_keyfile /etc/mosquitto/certs/private.key

#END of bridge.conf

Afterwards, on the EC2 terminal (using ssh -i [EC2Keys].pem ubuntu@[EC2IPv4Address]), run these commands to start the bridge:

sudo service mosquitto stop
mosquitto -c /etc/mosquitto/conf.d/bridge.conf

IMPORTANT: note that bridge.conf lives on the EC2 instance, and the 2 additional lines allow anonymous connections on port 1884. The Gateway deployed on FIT/IoT-LAB will use this port to connect to the instance.
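To quickly verify the bridge end to end, one can publish a test message to the EC2 broker on port 1884 and watch it appear in the IoT Core MQTT test client. A minimal sketch with paho-mqtt (not part of the original setup; the address and payload are placeholders):

# Hypothetical smoke test: publish to the anonymous listener on port 1884;
# if the bridge works, the message shows up in AWS IoT Core's MQTT test client.
import json
import paho.mqtt.publish as publish

publish.single(
    topic="his_project/his_iot/sensor_data",   # topic bridged in bridge.conf
    payload=json.dumps({"temperature": 10, "humidity": 11}),
    hostname="<EC2IPv4Address>",               # placeholder for the instance
    port=1884,                                 # anonymous listener, see above
)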


MQTT-SN Gateway Setup:

1. Initial setup:

Based on the co-author's experience, setting up the latest Paho was quite cumbersome and somewhat difficult, since the maintainers of the Paho MQTT-SN package changed the naming convention in the latest release, while all the online sources use the older version (hence the old naming convention). Therefore, we decided to opt for the old version for this project. Simply enter these commands in the CLI:

wget --progress=dot:giga --no-check-certificate -O paho.mqtt-sn.embedded-c.zip https://github.com/eclipse/paho.mqtt-sn.embedded-c/archive/f2dcda358f21e264de57b47b00ab6165bab4da18.zip
unzip paho.mqtt-sn.embedded-c.zip
rm paho.mqtt-sn.embedded-c.zip 
mv paho.mqtt-sn.embedded-c-f2dcda358f21e264de57b47b00ab6165bab4da18 paho.mqtt-sn.embedded-c
cd paho.mqtt-sn.embedded-c/MQTTSNGateway 

To run the Gateway, one must build the application first:

./build.sh udp6
cd bin
./MQTT-SNGateway

2. Custom configuration to deploy on FiT/IoT-Lab:

The Gateway is deployed onto a FIT/IoT-LAB node. Before building the application, make sure that the broker bridge on EC2 has already started and that the IPv4 address of the EC2 instance is known.

Next, we need to retrieve the IPv6 address of the FIT/IoT-LAB node by simply running:

root@node-a8-xx: ip -6 -o addr show eth0
>> 2: eth0    inet6 2001:660:3207:400::66/64 scope global        valid_lft forever preferred_lft forever

Once we have both addresses, locate the gateway.conf file at paho.mqtt-sn.embedded-c/MQTTSNGateway and configure:

# config file of MQTT-SN Gateway
BrokerName= <Paste the EC2 IPv4 Address Here>
BrokerPortNo=1884 # when changing this, also change the bridge.conf file on EC2
BrokerSecurePortNo=8883
...
# UDP6
GatewayUDP6Bind= <Paste the Node IPv6 Address Here> 
GatewayUDP6Port=1888 # Can also be any port you want
GatewayUDP6Broadcast=FF02::1
GatewayUDP6If=wpan0
GatewayUDP6Hops=1
...

Afterwards, build the Gateway and run it as instructed above (whilst ssh'd into the node).


AWS S3:

Before moving on to set up IoT Analytics, we first need to create 3 S3 buckets: one for the IoT Analytics channel, one for the data store, and one for the data set. The steps to recreate them are:

  1. Navigate to the S3 Management Console
  2. Choose Create Bucket
    • Bucket name: give the bucket a unique name (it must be globally unique) and append '-channel' to it.
    • Region: should be kept the same.
  3. Click Next and keep all options default. Click Create bucket to finish the creation.
  4. Repeat steps 1-3 twice to finish creating the required buckets, using the suffixes '-datastore' and '-dataset' to differentiate them.

On the bucketName-datastore bucket, appropriate permissions must be granted for IoT Analytics to access the data store:

  1. Click on the data store bucket (ending with '-datastore')
  2. Navigate to the Permissions tab
  3. Click on Bucket Policy and enter the following JSON policy (be sure to include your S3 bucket name):
{
    "Version": "2012-10-17",
    "Id": "IoTADataStorePolicy",
    "Statement": [
        {
            "Sid": "IoTADataStorePolicyID",
            "Effect": "Allow",
            "Principal": {
                "Service": "iotanalytics.amazonaws.com"
            },
            "Action": [
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts",
                "s3:AbortMultipartUpload",
                "s3:PutObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::<your bucket name here>",
                "arn:aws:s3:::<your bucket name here>/*"
            ]
        }
    ]
}
  4. Click Save
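For reference, the same policy can also be attached from code rather than the console. A minimal boto3 sketch (not part of the original setup; the bucket name is a placeholder):

# Hypothetical boto3 equivalent of the console steps above.
import json
import boto3

BUCKET = "<your bucket name here>"  # the bucket ending with '-datastore'

policy = {
    "Version": "2012-10-17",
    "Id": "IoTADataStorePolicy",
    "Statement": [{
        "Sid": "IoTADataStorePolicyID",
        "Effect": "Allow",
        "Principal": {"Service": "iotanalytics.amazonaws.com"},
        "Action": [
            "s3:GetBucketLocation", "s3:GetObject", "s3:ListBucket",
            "s3:ListBucketMultipartUploads", "s3:ListMultipartUploadParts",
            "s3:AbortMultipartUpload", "s3:PutObject", "s3:DeleteObject",
        ],
        "Resource": [f"arn:aws:s3:::{BUCKET}", f"arn:aws:s3:::{BUCKET}/*"],
    }],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))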

AWS IoT Analytics:

1. IoT Analytics Channel

Next, we will create the IoT Analytics channel that consumes the data received at the IoT Core broker and stores it in the S3 bucket.

  1. Navigate to the AWS IoT Analytics console
  2. Navigate to Channels
  3. Create a new channel
    • ID: sensorStreamDataChannel
    • Choose the storage type: Customer-managed S3 bucket, and choose the S3 channel bucket created in the previous step (see: create S3 buckets)
    • IAM Role: because of our subscription, choose LabRole
  4. Click Next and leave everything blank. Then click Create Channel

2. IoT Analytics Data Store for the pipeline

  1. Navigate to the AWS IoT Analytics console
  2. Navigate to Data stores
  3. Create:
    • ID: sensorStreamDataStore
    • Choose the storage type: Customer-managed S3 bucket -> choose the S3 data store bucket.
    • IAM Role: choose LabRole
  4. Click Next and Create data store

3. IoT Analytics Pipeline

  1. Navigate to the AWS IoT Analytics console
  2. Navigate to Pipelines
  3. Create:
    • ID: sensorStreamDataPipeline
    • Pipeline source: sensorStreamDataChannel
  4. Click Next
  5. Pipeline output: click 'Edit' and choose 'sensorStreamDataStore'
  6. Click Create Pipeline

At this step, the IoT Analytics Pipeline is now set up.
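For reference, the channel, data store and pipeline above can also be created programmatically. A minimal boto3 sketch, assuming the S3 buckets already exist and the LabRole ARN is known (bucket names and the role ARN are placeholders):

# Hypothetical boto3 equivalent of the console steps in sections 1-3 above.
import boto3

iota = boto3.client("iotanalytics")
ROLE_ARN = "arn:aws:iam::<account-id>:role/LabRole"  # placeholder

iota.create_channel(
    channelName="sensorstreamdatachannel",
    channelStorage={"customerManagedS3": {
        "bucket": "<your bucket name here>-channel", "roleArn": ROLE_ARN}},
)

iota.create_datastore(
    datastoreName="sensorstreamdatastore",
    datastoreStorage={"customerManagedS3": {
        "bucket": "<your bucket name here>-datastore", "roleArn": ROLE_ARN}},
)

# The pipeline simply wires the channel to the data store.
iota.create_pipeline(
    pipelineName="sensorstreamdatapipeline",
    pipelineActivities=[
        {"channel": {"name": "source",
                     "channelName": "sensorstreamdatachannel",
                     "next": "sink"}},
        {"datastore": {"name": "sink",
                       "datastoreName": "sensorstreamdatastore"}},
    ],
)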

4. IoT Analytics Data Set

  1. Navigate to the AWS IoT Analytics console
  2. Navigate to Data sets
  3. Choose Create a data set
  4. Select Create SQL
    • ID: sensorStreamDataSet
    • Select data store source: sensorStreamDataStore - this is the S3 bucket containing the data
  5. Click Next
  6. Keep the default SQL statement, which should read SELECT * FROM sensorstreamdatastore and click Next.
  7. Keep all options as default and click Next.
  8. At Set query schedule, choose the Frequency: Every 1 minute. This makes the query run regularly to refresh the data set. Click Next until reaching "Configure dataset content delivery rules".
  9. Click Add rule
  10. Choose Deliver result to S3
    • S3 bucket: select the S3 bucket that ends with '-dataset'.
    • Bucket key expression: output.csv
    • IAM Role: LabRole
  11. Click Create data set to finalize the creation of data set.
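The data set, too, has a programmatic equivalent. A minimal boto3 sketch, assuming the '-dataset' bucket exists; the role ARN, bucket name and cron expression are assumptions, not the console's exact output:

# Hypothetical boto3 equivalent of the data set console steps above.
import boto3

iota = boto3.client("iotanalytics")
ROLE_ARN = "arn:aws:iam::<account-id>:role/LabRole"  # placeholder

iota.create_dataset(
    datasetName="sensorstreamdataset",
    actions=[{
        "actionName": "sqlAction",
        "queryAction": {"sqlQuery": "SELECT * FROM sensorstreamdatastore"},
    }],
    # "Every 1 minute", expressed as a CloudWatch Events cron expression
    triggers=[{"schedule": {"expression": "cron(0/1 * * * ? *)"}}],
    # Deliver each refresh to the '-dataset' bucket as output.csv
    contentDeliveryRules=[{
        "destination": {"s3DestinationConfiguration": {
            "bucket": "<your bucket name here>-dataset",
            "key": "output.csv",
            "roleArn": ROLE_ARN,
        }},
    }],
)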

5. Create Message Routing Rule from IoT Core to IoT Analytics

  1. Navigate to the AWS IoT Core console
  2. Navigate to Message routing
  3. Navigate to Rules
  4. Click Create rule
    • Rule name: sensorDataStreamRule, then click Next
    • SQL Statement: SELECT temperature, humidity, windDirection, windIntensity, rainHeight, timestamp() as time FROM 'his_project/his_iot/sensor_data', make sure the SQL version is 2016-03-23. Afterwards, click Next
    • Rule actions: Choose IoT Analytics (Send a message to IoT Analytics)
    • Channel name: sensorstreamdatachannel
    • IAM Role: LabRole
  5. Review the rule, and click Create rule.

At this point, every received message will be routed to IoT Analytics and saved to the created S3 buckets.
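As with the other resources, the routing rule can also be created from code. A minimal boto3 sketch (the LabRole ARN is a placeholder):

# Hypothetical boto3 equivalent of the Message Routing rule above.
import boto3

iot = boto3.client("iot")
ROLE_ARN = "arn:aws:iam::<account-id>:role/LabRole"  # placeholder

iot.create_topic_rule(
    ruleName="sensorDataStreamRule",
    topicRulePayload={
        "sql": "SELECT temperature, humidity, windDirection, windIntensity, "
               "rainHeight, timestamp() as time "
               "FROM 'his_project/his_iot/sensor_data'",
        "awsIotSqlVersion": "2016-03-23",
        # Send matching messages to the IoT Analytics channel
        "actions": [{"iotAnalytics": {
            "channelName": "sensorstreamdatachannel",
            "roleArn": ROLE_ARN,
        }}],
    },
)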


AWS SageMaker:

1. On AWS IoT Analytics:

  1. Navigate to the AWS IoT Analytics console
  2. Navigate to Notebooks
  3. Click Create Notebook
  4. At Select a template, choose IoTA blank template
  5. To set up the notebook:
    • Notebook name: sensorDataNotebook
    • Select dataset source: sensorstreamdataset
    • Select a notebook instance: we don't have one yet; therefore, click the Create new instance drop-down menu, give it an instance name, an instance type and a role name (again, LabRole), and click Create new instance. Once the new instance is running, simply choose it.
  6. Click Next to review, then click Create notebook.

Now we have a running notebook with which to visualize the sensor data (a bit overkill). Wait until the notebook instance status goes from Pending to InService, then we are good to go.

2. In the Notebook:

  1. On the same console, click Actions -> View in AWS SageMaker -> Open JupyterLab.
  2. Navigate to the IoTAnalytics folder on the left. Then either:
    1. click sensordatanotebook.ipynb and paste in the code from the *.ipynb in this git repo, or
    2. upload the *.ipynb from this git repo to the SageMaker directory and run all the cells to see the visualization!

visualization img
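The actual notebook is in this git repo; its core boils down to something like the sketch below (a minimal reconstruction, assuming the data set delivers output.csv to the '-dataset' bucket; the bucket name is a placeholder, and timestamp() yields epoch milliseconds):

# Minimal sketch of the visualization: read the CSV the data set delivers
# to the '-dataset' bucket and plot every sensor parameter over time.
import io

import boto3
import matplotlib.pyplot as plt
import pandas as pd

obj = boto3.client("s3").get_object(Bucket="<your bucket name here>-dataset",
                                    Key="output.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read())).sort_values("time")

for column in ["temperature", "humidity", "windDirection",
               "windIntensity", "rainHeight"]:
    plt.plot(df["time"], df[column], label=column)

plt.xlabel("time (epoch ms, added by the routing rule)")
plt.ylabel("sensor value")
plt.legend()
plt.show()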


Contribution:

Task/Function                                                    Responsible
Sensor's ZigZag function                                         Jaime Sanchez Cotta
Sensor's publish function                                        Phuc Hoc Tran + Jaime Sanchez Cotta
AWS EC2 and Mosquitto broker setup                               Phuc Hoc Tran
AWS IoT Core, Message Routing rule setup and Mosquitto bridge    Phuc Hoc Tran
Paho's MQTT-SN Gateway setup                                     Phuc Hoc Tran
AWS IoT Analytics setup                                          Phuc Hoc Tran
AWS S3 buckets setup                                             Phuc Hoc Tran
AWS SageMaker setup                                              Phuc Hoc Tran
Sensor data visualization (Python)                               Phuc Hoc Tran
