maker100-robotics-machine-learning-IoT-communication-curriculum
This page is best viewed using the README.md here
The original courses are the maker100 using the Arduino PortentaH7 with LoRa Vision Shield and maker100-eco using the Seeedstudio XIAO-esp32S3-Sense
Created August 2024 by Jeremy Ellis (LinkedIn). Limited consulting is available, as I am still a full-time educator.
Webpage: the dynamic Price-list.html ranges from Economy (~$2,000 USD) to set up the class, to Default (~$7,000 USD) to get started, to Kitchen Sink at about $31,000 USD. The Default is basically what I use.
The 2024 economy version of this course using the seeedstudio $14 USD XIAO-ESP32s3-Sense is at maker100-eco
The original 2021 version of this course using the $114 USD PortentaH7 is at maker100
My YouTube playlist about this curriculum is called Hands on AI
How can a school or university start a general robotics course for all students when there are only a few educators skilled in robotics and machine learning?
- A versatile, passionate educator
- A computer lab equipped with a few 3D printers
- Strong IT support to manage software installations and updates
- An initial robotics lab stocked with sensors, actuators, IoT modules, basic electronics (wires, breadboards, batteries, resistors, capacitors, etc.), and soldering equipment. Roughly $2,000 - $30,000 USD, with a sensible starting point at about $7,000 USD. Check out the estimated price-list.html
- A budget for consumables and a set of new microcontrollers every few years. ~ $500 - $3,000
- A well-crafted, asynchronous, student-friendly robotics, machine learning, and IoT curriculum which is right here on this page.
Robotics is fundamentally about solving technology problems, and students must actively engage in overcoming these challenges. Once all the technology problems are solved and there are no more challenges to face, can it truly be called a robotics curriculum? By introducing new microcontrollers every few years, this curriculum solves that issue and keeps solving technology problems a constant process.
Large Language Models (LLMs) like ChatGPT, coPilot, BingChat, and LLAMA-v2 are revolutionizing most aspects of life and styles of academic instruction. However, the complexity of AI and the datasets these models are trained on makes them difficult to explain to the general public. TinyML, using affordable microcontrollers like Arduinos, offers students a hands-on way to grasp AI concepts. It allows them to train a simplified version of machine learning using small, manageable datasets, such as images they create themselves.
This approach is relatively easy to teach within a Robotics, Machine Learning, and IoT course, providing students with an intuitive understanding of the technology that is rapidly transforming our world.
- Start with a microcontroller that has proven successful for other educators. In my case, I recommend the Seeedstudio #XIAO-ESP32S3-Sense which costs around $14 USD. For 30 students, that totals $420. Yes, each student should have their own microcontroller. Additionally, you'll need USB-C cables, microSD cards, and pin headers.
- Class sets of most equipment aren't necessary. Since the course is asynchronous, students can work at their own pace. This means you may only need a few of the more expensive sensors, like the Pixy2, a Lidar, or soldering equipment. While there are benefits to having class sets of all equipment, I’ve never found it necessary. Plus, it can create a storage mess.
- Demand peer teaching. When a student successfully completes a curricular task, have them teach a few other students how to do it. This reinforces their understanding and builds a collaborative learning environment and really is the only way this course will be successful.
- Students can manage their own work. They can download the curriculum from Github as a zip file, unzip it, and upload it to their own GitHub repositories, allowing them to organize and update their work effectively.
- I have students make very short videos on the school network of each project, with a simple circuit diagram shown in the video. First time educators may just want to keep a running tally of the assignments they have seen working.
- The inexpensive Seeedstudio XIAO-SAMD21 microcontroller board ($7 USD), which comes with pin headers, is a great microcontroller for students to play with when testing new sensors; it runs very similarly to many Arduino boards and is easily auto-detected by the Arduino IDE.
- Note: The educator decides which assignments to do, in what order, and which ones to change, and also decides how many assignments to complete before class time is spent on the final projects.
- Final projects determine the grade. Students can pass with a simple, unique sensor/actuator assignment; A grades can be assigned for multiple sensors and/or multiple actuators, and/or IoT, and/or machine learning. When students have completed these basic assignments they are expected to get together in groups and use their proven skills to attempt a group project.
- I do not teach each assignment in the order presented; I often jump back and forth between simple sensors, simple machine learning, and simple actuators, then back to the main order. Note: For advanced students this course is asynchronous, so they can work ahead and solve issues that the other students will benefit from later.
- No final projects using more than 40 volts, water, or drones without some family safety protocols (such as a parent who is an electrical engineer, etc.)
Note: Any student with previous Arduino experience should breeze through most of the Coding, Sensors and Actuators part of this course!
On this page Quick Links
Machine Learning
Actuators-motors-LEDs etc
IoT-connectivity
-
Base01-Install: Determine the software to install (best to have some software installed before the class starts). A good software installation starting point is: NodeJS, Python, the Arduino Legacy and new IDEs (sometimes these two IDEs don't work well on the same computer; the IT department may choose to make one or both of them "portable"), Pixymon2, OpenMV, Putty, and PlatformIO, which needs VSCode. Note: Good communication with the IT department is essential, as new software will need to be installed during the course, especially if important upgrades are released or a new board needs admin access to fully install.
-
Base02-Equipment: Your computer lab needs basic electronics equipment; it is often best to buy this basic equipment in sets.
-
Base03-Language: Determine the computer language to use. Probably best to work with a few standard languages. I mainly use Arduino C/C++, a subset of regular C++, where every sketch has a setup() and loop() function instead of a main() function. Other choices are: full GNU Make C/C++, MicroPython, Zephyr (RTOS), and many more.
-
Base04-Platform: Probably best to work with a few standard platforms. I mainly use the Arduino Legacy and new IDEs, the Arduino Cloud, and PlatformIO, all using C/C++, and sometimes OpenMV (which uses Python).
-
Base05-Blink: Get the Blink program working using the Arduino IDE and your microcontroller, which means you will need to install the correct board and identify the PORT. You might also need to tap buttons on the microcontroller to put it into boot mode so a program can be uploaded. You often have to reset it to run the program.
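For reference, a minimal Blink sketch (essentially the stock Arduino example) looks like this; LED_BUILTIN is defined for most boards, though on some boards the onboard LED is wired active-LOW so on/off appear swapped.

```cpp
// Minimal Blink sketch: toggles the onboard LED once per second.
// LED_BUILTIN is defined for most boards; on some, the LED is active-LOW,
// so ON and OFF may appear swapped.

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);     // set the LED pin as an output
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);  // LED on (or off if active-LOW)
  delay(500);                       // wait half a second
  digitalWrite(LED_BUILTIN, LOW);   // LED off
  delay(500);
}
```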
-
Base06-Hello: Like the Blink program, except it prints to the Arduino serial monitor. I use a blink program that also shows the analog reading on A0 in the Serial Monitor. my example
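A sketch along these lines (a sketch of the idea, not the linked example) blinks the LED and prints the A0 reading; the 115200 baud rate is an assumption, so match it in the Serial Monitor.

```cpp
// Blink plus a "Hello" message and an analog reading on A0 sent to the
// Serial Monitor.  Open the Serial Monitor at 115200 baud to see the output.

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(115200);             // start serial communication
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);
  delay(500);
  digitalWrite(LED_BUILTIN, LOW);
  delay(500);
  int reading = analogRead(A0);     // raw ADC value on pin A0
  Serial.print("Hello, A0 = ");
  Serial.println(reading);
}
```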
-
Base07-Libraries: Understand libraries, as some examples will not work until one or many libraries have been installed. My students install the "Portenta Pro Community Solutions" library in the Arduino IDE and have a look at the long list of examples that match many of the concepts in this curriculum. I made this library for the PortentaH7, produced by Arduino in 2020; many of the examples need to be slightly changed to work with the XIAO-ESP32S3.
-
Base08-Putty: Putty is a Windows serial monitor program that can view a serial COM port without having to use the Arduino IDE. Note: opening a DOS window or PowerShell window and typing "mode" will show all the serial connections. On Linux or Mac you could use a program called "screen".
Note: Explain VIDEO FLAC as seen below (Variables, In-out, Decisions, Events, Objects, Functions, Loops, Arrays, Classes). Have students write Arduino code that demonstrates each of these abilities in the serial monitor. It is very important for students to try to change and improve their code to learn how it works. I actually do this section as one big assignment, since most of my students have already done a computer programming course. my example
-
Code01-Var: Variables. Make code to show multiple types of variables in the serial monitor.
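One possible minimal sketch, printing a few common variable types:

```cpp
// Show several variable types in the Serial Monitor.
int    myInt   = 42;          // whole number
float  myFloat = 3.14;        // decimal number
bool   myBool  = true;        // true/false
char   myChar  = 'A';         // single character
String myText  = "Maker100";  // text (Arduino String class)

void setup() {
  Serial.begin(115200);
  delay(2000);                // give the Serial Monitor time to open
  Serial.println(myInt);
  Serial.println(myFloat);
  Serial.println(myBool);
  Serial.println(myChar);
  Serial.println(myText);
}

void loop() {}                // nothing to repeat
```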
-
Code02-In-out: Input/Output. Make code to read a variable from the serial monitor (click Send) and print it back to the serial monitor.
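A minimal sketch of the idea, echoing whatever line is typed into the Serial Monitor:

```cpp
// Read a line typed into the Serial Monitor (click Send) and echo it back.
void setup() {
  Serial.begin(115200);
  Serial.println("Type something and click Send:");
}

void loop() {
  if (Serial.available() > 0) {
    String text = Serial.readStringUntil('\n');  // read until the newline
    text.trim();                                 // strip stray \r or spaces
    Serial.print("You sent: ");
    Serial.println(text);
  }
}
```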
-
Code03-if: Decisions (if statements and possibly case statements). Write code to make a decision based on information sent to the program from the serial monitor.
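For example, a sketch that classifies a number sent from the Serial Monitor (the size thresholds are arbitrary):

```cpp
// Make a decision based on a number sent from the Serial Monitor.
void setup() {
  Serial.begin(115200);
  Serial.println("Send a number; I will say if it is small or big:");
}

void loop() {
  if (Serial.available() > 0) {
    String text = Serial.readStringUntil('\n');
    int value = text.toInt();              // convert the text to a number
    if (value < 10) {
      Serial.println("That is a small number");
    } else if (value < 100) {
      Serial.println("That is a medium number");
    } else {
      Serial.println("That is a big number");
    }
  }
}
```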
-
Code04-Events: Events are things that drive code (this idea is borrowed from JavaScript programming). Write a menu and have the code do different things based on the menu decision, such as WASD, where each letter makes something move in a different direction.
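A minimal WASD sketch; here each key only prints a direction, but the same switch could drive motors or LEDs:

```cpp
// WASD menu: each letter "drives" a different action, a bit like an event handler.
void setup() {
  Serial.begin(115200);
  Serial.println("Send w, a, s or d:");
}

void loop() {
  if (Serial.available() > 0) {
    char key = Serial.read();               // read one character
    switch (key) {
      case 'w': Serial.println("forward");  break;
      case 'a': Serial.println("left");     break;
      case 's': Serial.println("backward"); break;
      case 'd': Serial.println("right");    break;
      default:  break;                      // ignore newlines and other keys
    }
  }
}
```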
-
Code05-structs: Objects (structs in some languages like C/C++). Make a struct, a fancy variable that connects a keyword with data, and present the data in the serial monitor.
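A minimal sketch of the idea, using a hypothetical Reading struct:

```cpp
// A struct bundles related data under named fields.
struct Reading {
  const char* sensorName;   // which sensor the value came from
  int         value;        // the raw reading
};

Reading myReading = { "A0-photoresistor", 512 };

void setup() {
  Serial.begin(115200);
  delay(2000);
  Serial.print(myReading.sensorName);
  Serial.print(" = ");
  Serial.println(myReading.value);
}

void loop() {}
```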
-
Code06-Functions: Functions. Write a function that prints to the serial monitor and then call it.
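A minimal sketch with one function called from loop():

```cpp
// Define a function, then call it from loop().
void sayHello(int count) {
  Serial.print("Hello number ");
  Serial.println(count);
}

int counter = 0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  counter = counter + 1;
  sayHello(counter);        // call the function
  delay(1000);
}
```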
-
Code07-Loops: Loops such as for loops (possibly while loops). Using a variable that stores a number, print something that many times.
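One possible sketch:

```cpp
// Print something a variable number of times using a for loop.
int howMany = 5;              // change this number and re-upload

void setup() {
  Serial.begin(115200);
  delay(2000);
  for (int i = 1; i <= howMany; i++) {
    Serial.print("Line ");
    Serial.println(i);
  }
}

void loop() {}
```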
-
Code08-Array: Arrays. Make an array, a fancy variable that numbers each value. A loop can be used to print the whole array to the serial monitor.
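A minimal sketch combining an array with the loop from the previous assignment:

```cpp
// An array numbers each value; a for loop prints the whole array.
int readings[] = { 12, 47, 89, 3, 56 };
int count = sizeof(readings) / sizeof(readings[0]);   // how many items

void setup() {
  Serial.begin(115200);
  delay(2000);
  for (int i = 0; i < count; i++) {
    Serial.print("readings[");
    Serial.print(i);
    Serial.print("] = ");
    Serial.println(readings[i]);
  }
}

void loop() {}
```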
-
Code09-Class: Classes. A. Use a class. B. make a class from scratch and then use it.
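A small sketch for part B, building a toy Blinker class from scratch and then using it:

```cpp
// Part B: make a small class from scratch, then use it.
class Blinker {
  public:
    Blinker(int pin) { ledPin = pin; }            // constructor stores the pin
    void begin()     { pinMode(ledPin, OUTPUT); }
    void flash(int ms) {                          // one on/off flash
      digitalWrite(ledPin, HIGH);
      delay(ms);
      digitalWrite(ledPin, LOW);
      delay(ms);
    }
  private:
    int ledPin;
};

Blinker myLed(LED_BUILTIN);   // create an object from the class

void setup() { myLed.begin(); }
void loop()  { myLed.flash(250); }
```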
-
Code10-SOS: In as few lines as possible, make the onboard LED (LED_BUILTIN) flash an SOS, which is 3 short flashes, 3 long flashes, 3 short flashes. my Example
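One possible solution (not the linked example); the timing values are arbitrary:

```cpp
// SOS on the onboard LED: 3 short, 3 long, 3 short flashes, then a pause.
void flash(int ms) {
  digitalWrite(LED_BUILTIN, HIGH); delay(ms);
  digitalWrite(LED_BUILTIN, LOW);  delay(200);
}

void setup() { pinMode(LED_BUILTIN, OUTPUT); }

void loop() {
  for (int i = 0; i < 3; i++) flash(150);   // S: short flashes
  for (int i = 0; i < 3; i++) flash(500);   // O: long flashes
  for (int i = 0; i < 3; i++) flash(150);   // S: short flashes
  delay(2000);                              // gap before repeating
}
```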
Reminder: all of these assignments need a drawn and checked circuit diagram before you begin connecting wires to the microcontroller.
-
Sense01-Analog: Find a sensor module that has an analog output and get a reading on your microcontroller serial monitor on pin A0; reminder to connect GND and 3V3 if needed.
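A minimal sketch of the idea; the raw range assumes a 12-bit ADC (0-4095) as on the ESP32-S3, while other boards may return 0-1023:

```cpp
// Read an analog sensor module on A0 and print the raw value.
// Wire the module: signal -> A0, VCC -> 3V3, GND -> GND.
void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(A0);   // 0-4095 on the ESP32-S3 (12-bit ADC)
  Serial.println(raw);
  delay(200);
}
```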
-
Sense02-Voltage-Divider: Find a variable resistor sensor (it has two prongs) like a thermistor, photoresistor, or flex sensor and use a voltage divider to get and control the reading on the serial monitor. my example
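A sketch of the idea, assuming the two-prong sensor runs from 3V3 to A0 with a 10 kOhm fixed resistor from A0 to GND (both values are assumptions) and a 12-bit ADC. The math comes from the divider equation Vout = Vcc * Rfixed / (Rsensor + Rfixed), rearranged to estimate the sensor's resistance.

```cpp
// Voltage divider: two-prong sensor (e.g. photoresistor) from 3V3 to A0,
// and a 10 kOhm fixed resistor from A0 to GND.  Estimates the sensor's
// resistance from the ADC reading (assumes a 12-bit ADC, 0-4095).

const float R_FIXED = 10000.0;   // value of the fixed resistor in ohms
const float ADC_MAX = 4095.0;    // 12-bit ADC full scale

void setup() {
  Serial.begin(115200);
}

void loop() {
  float raw = analogRead(A0);
  if (raw > 0) {                 // avoid dividing by zero
    float rSensor = R_FIXED * (ADC_MAX / raw - 1.0);
    Serial.print("raw = ");
    Serial.print(raw, 0);
    Serial.print("  sensor resistance ~ ");
    Serial.print(rSensor, 0);
    Serial.println(" ohms");
  }
  delay(500);
}
```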
-
Sense03-two-prong: Same as above, using serial monitor analog read and a resistor, but use a different 2-prong sensor with the voltage divider. Possible variable resistors are: flex sensor, photoresistor, touch/pressure sensor, rheostat, potentiometer...
-
Sense04-button: Connect a digital sensor like a button to the microcontroller and show on the serial monitor when the button has been pressed.
-
Sense05-led: Actually the first actuator assignment, but connect a resistor and an LED and make the LED blink like the onboard LED_BUILTIN from the BLINK program.
-
Sense06-button-led: Combine the above two assignments to make your first sensor/actuator assignment. This is what most Arduino-style programs are like. Use a button as the sensor and an LED with a series resistor as the actuator to get a visual response and a response on the serial monitor. This is an important assignment as it connects both a sensor and an actuator using a microcontroller. my example
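A sketch of the idea (not the linked example); the pin names D1 and D2 are XIAO-style placeholders for whatever digital pins your board provides:

```cpp
// Button as sensor, LED (with a series resistor) as actuator.
// Button between pin D1 and GND using the internal pull-up;
// LED + resistor from pin D2 to GND.  D1/D2 are assumptions:
// use whichever digital pins your board actually has.

const int BUTTON_PIN = D1;
const int LED_PIN    = D2;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);      // reads LOW when pressed
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // pressed
    digitalWrite(LED_PIN, HIGH);
    Serial.println("Button pressed");
  } else {
    digitalWrite(LED_PIN, LOW);
  }
  delay(50);                              // simple debounce
}
```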
-
Sense07-Accel: Use a 3 (or 6 or 9) axis accelerometer to measure x, y, z and see if the results make sense, knowing that vertical acceleration due to gravity is about 9.8 m/s^2.
-
Sense0-joy-stick: Connect a joystick to your microcontroller and get a reading. This is almost exactly the same as Sense01-Analog, with A0 and 3V3.
-
Sense08-range-finder: Connect a range finder to your microcontroller and determine the distance to an object. Note: The Nicla Vision comes with a time-of-flight sensor that works up to about 4 meters. Typical ranges are 10 cm to 100 cm. my example
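If the range finder is the common HC-SR04 ultrasonic module (an assumption; time-of-flight modules use I2C libraries instead), a sketch looks roughly like this, with placeholder pins and attention needed to 5 V vs 3V3 logic levels:

```cpp
// HC-SR04 style ultrasonic range finder (an assumption -- your module may
// differ).  TRIG and ECHO pin numbers are placeholders; mind 5V vs 3V3 logic.

const int TRIG_PIN = D1;
const int ECHO_PIN = D2;

void setup() {
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);   // 10 us trigger pulse
  digitalWrite(TRIG_PIN, LOW);

  long duration = pulseIn(ECHO_PIN, HIGH, 30000);   // echo time in microseconds
  float distanceCm = duration * 0.0343 / 2.0;       // speed of sound, out and back
  Serial.print("Distance: ");
  Serial.print(distanceCm);
  Serial.println(" cm");
  delay(250);
}
```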
-
Sense09-image-to-sd-card: Put an image from the microcontroller camera onto an SD card module. Note: the XIAO-ESP32S3-Sense has a micro SD card holder onboard the camera (Sense) attachment. my example
-
Sense10-sound-to-sd-card: Record a sound and have it placed in a useable format on the sd card. my example
-
Sense11-video-to-sd-card: Record a video on your micro sd-card. my example
-
Sense12-Pixy2: Use the amazing Pixy2 with an SPI connection to your microcontroller to analyze shaded objects (shades are all colors except black and white). See the Charmed Labs Pixy video here and then my Example.
-
Sense13-GPS: Get a GPS module working. If students can just extract the longitude and latitude, that would be very helpful. my example
-
Sense14-Lidar: Connect a lidar to the microcontroller serial monitor; the information will be a mess, but it proves the lidar works. my example. A better assignment is the lidar-Grayscale-OLED assignment later in the course.
-
ML01-sensecraft: Use a simple way to install machine learning models to your microcontroller such as sensecraft.seeed.cc for the XIAO-ESP32S3-Sense
-
ML02-vision: Use Edgeimpulse.com and your cell phone or another cloud based method to make a vision classification model by taking pictures of pen/pencils labelled "1pen" and things without them labelled "0unknown". The numbers are not needed but really help later when things get more complex. Test your model also from your cell phone. my example
-
ML03-wake-word: Use Edgeimpulse.com and your cell phone or another cloud-based method to make a keyword (wake word) model using sounds such as "Hi Google". Label the recordings appropriately; you may want to record silence and background sounds. my example
-
ML04-motion: Use Edgeimpulse.com and your cell phone or another cloud-based method to make a motion model using a 3-axis accelerometer. Now your labels might be "0still", "1wave", "2punch". my example
-
ML05-FOMO: Use Edgeimpulse.com and your cell phone or another cloud-based method to make a vision Faster Objects, More Objects (FOMO) model; this now needs bounding boxes and a data queue to store the images before you draw labelled boxes around each image. my example
-
ML06-deploy-classification: Use Edgeimpulse.com to deploy the above classification model (deploy means to download the Arduino library with examples for your microcontroller). Note: On Windows computers the first compilation can take 15-25 minutes, so get it compiling early. Also look at the code and see if you can determine where the results get printed. It is a really good idea to try deploying all the EdgeImpulse models to your microcontroller. Deploying to OpenMV is much faster, but it only works on a few Arduino boards.
-
ML07-deploy-wake: Use Edgeimpulse.com to deploy the EdgeImpulse sound wake word model to your microcontroller.
-
ML08-deploy-FOMO: Use Edgeimpulse.com to deploy the EdgeImpulse FOMO vision model to your microcontroller.
-
ML09-regression: Use EdgeImpulse.com to make a vision regression model (numerical size) and deploy the model to your device. my example
-
ML10-anomaly: Use EdgeImpulse.com to make an anomaly detection model with two labels that can rate how different the classification is from the label and deploy it to your microcontroller. my example
-
ML11-sensor-fusion: Use EdgeImpulse.com or another site to merge different senses over time, such as distance and motion as the Nicla Vision is capable of, or, if you have the Nano33BleSense, up to 18 different senses, which is supported by EdgeImpulse. This is a very important part of machine learning, but few schools will have the equipment and ability in 2024 to do it. my example
-
ML12-int8-quantized: Use Edgeimpulse.com to download the int8-quantized version of the vision classification model and upload it to sensecraft.seeed.cc for the XIAO-ESP32S3-Sense. If using a different microcontroller, try other ways to upload your model, possibly deploying a C/C++ model and locally compiling it for your microcontroller.
-
ML13-WebSite-LLM: Make a website that loads a model from Hugging Face or another cloud hub for storing pre-trained machine learning models. my example. Each example is a single-file webpage and can be copied to your storage area.
-
ML14-local-LLM: Download a full chat LLM such as LLAMA-v2 and get it working on your laptop or desktop computer. Be very careful if you pay for data, as some of these files are large. tinyLLM .... tinyLlama .... gpt4All .... github Market Place Models .... hugging face models
Reminder: all of these assignments need a drawn and checked circuit diagram before you begin connecting wires to the microcontroller.
-
Act01-servo: Connect a servo motor to your microcontroller. Reminder that generally the servo red and brown wires go to their own 6-volt battery, not the microcontroller's power pins. Also note the ESP32 microcontrollers use a different servo library than regular Arduino boards. my example
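A sweep sketch assuming the ESP32Servo library from the Library Manager (not the linked example); the signal pin is a placeholder and the pulse range may need tuning for your servo:

```cpp
// Servo sweep assuming the ESP32Servo library (install from the Library
// Manager).  The signal pin is a placeholder; power the servo from its own
// battery pack and share only GND with the microcontroller.

#include <ESP32Servo.h>

Servo myServo;
const int SERVO_PIN = D2;        // XIAO-style pin name; adjust for your board

void setup() {
  ESP32PWM::allocateTimer(0);             // as in the library's Sweep example
  myServo.setPeriodHertz(50);             // standard 50 Hz servo signal
  myServo.attach(SERVO_PIN, 500, 2400);   // pulse range in us; may need tuning
}

void loop() {
  for (int angle = 0; angle <= 180; angle += 5) {   // sweep up
    myServo.write(angle);
    delay(30);
  }
  for (int angle = 180; angle >= 0; angle -= 5) {   // sweep back down
    myServo.write(angle);
    delay(30);
  }
}
```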
-
Act02-PNP-transistor: Connect a motor with its own power supply and control it using a PNP transistor. my example
-
Act03-NPN-transistor: Connect a motor with its own power supply and control it using an NPN transistor. my example
-
Act04-small-DC-motor-driver: Connect a small motor with its own battery supply to a motor driver that is safely connected to your microcontroller. my example
-
Act05-large-motor-driver: Connect a large motor with its own battery supply to a large motor driver that is safely connected to your microcontroller. my example
-
Act06-stepper: Connect a stepper motor with its own power supply to a stepper motor driver and control it safely with your microcontroller. my example
-
Act07-I2C-OLED: Connect a simple black and white OLED to the microcontroller and show that the library for it works and can produce written text. my example
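Assuming a common 128x64 SSD1306 OLED at I2C address 0x3C and the Adafruit GFX + Adafruit SSD1306 libraries (your display and library may differ), a text sketch looks roughly like this:

```cpp
// "Hello" text on a 128x64 SSD1306 I2C OLED (address 0x3C assumed),
// using the Adafruit GFX + Adafruit SSD1306 libraries.

#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

Adafruit_SSD1306 display(128, 64, &Wire, -1);   // -1 = no reset pin

void setup() {
  if (!display.begin(SSD1306_SWITCHCAPVCC, 0x3C)) {
    while (true) delay(10);      // OLED not found -- check wiring/address
  }
  display.clearDisplay();
  display.setTextSize(2);
  display.setTextColor(SSD1306_WHITE);
  display.setCursor(0, 0);
  display.println("Maker100");
  display.println("OLED OK");
  display.display();             // push the buffer to the screen
}

void loop() {}
```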
-
Act08-lidar-and-grayscale-OLED: Connect a grayscale OLED and a lidar detector to the microcontroller and show the entire room. my example
-
Act09-camera-and-grayscale-OLED: Connect a grayscale OLED, with a camera connected to the microcontroller, and show the image. my example
-
Act10-grayscale-OLED: Connect a grayscale OLED to the microcontroller and show text and some basic shapes. my example
-
Act11-color-OLED-or-TFT: Connect a color display, possibly with touch ability, to the microcontroller and show text and basic shapes; if touch is present, demonstrate a touch event. seeedstudio round display ... my TFT example (touch never really worked well for me)
-
Act12-e-ink: Get an e-ink display connected to the microcontroller, showing a different screen every few seconds. my not-working example
-
Act13-PCB-build: Using EasyEDA or some other online or local software, design a simple PCB based on a video tutorial such as the EasyEDA Tutorial 2020. Note: It is challenging to find a simple tutorial for creating PCBs; students with CAD, 3D printing, and animation experience will have some advantages in this assignment. Note: JLCPCB is very fast and inexpensive for making these PCBs if you are OK soldering the components together. The last one I did was about $50 USD for 5 boards with shipping, and it arrived 10 days after ordering.
-
IoT01-WiFi-Webserver: Make your microcontroller into a LOCAL WiFi webserver. Note: Unless your IT department likes you, this webserver will not be connected to the internet. my example
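A minimal local webserver sketch using the ESP32 Arduino core's WiFi and WebServer libraries (a sketch of the idea, not the linked example); the SSID and password are placeholders. Browse to the printed IP address from a device on the same network.

```cpp
// Minimal local webserver using the ESP32 Arduino core's WiFi + WebServer
// libraries.  SSID and password are placeholders.

#include <WiFi.h>
#include <WebServer.h>

const char* SSID     = "your-wifi-name";
const char* PASSWORD = "your-wifi-password";

WebServer server(80);

void handleRoot() {
  server.send(200, "text/html", "<h1>Hello from the XIAO ESP32S3</h1>");
}

void setup() {
  Serial.begin(115200);
  WiFi.begin(SSID, PASSWORD);
  while (WiFi.status() != WL_CONNECTED) {   // wait for the connection
    delay(500);
    Serial.print(".");
  }
  Serial.print("\nOpen http://");
  Serial.println(WiFi.localIP());           // browse to this address
  server.on("/", handleRoot);
  server.begin();
}

void loop() {
  server.handleClient();                    // answer incoming requests
}
```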
-
IoT02-camera-streaming-webserver: Make your camera stream to a local webpage. This is actually a default example program that comes with all ESP32S3 boards; you have to comment out some parts of the code. Look for Examples-->ESP32-->Camera-->cameraWebServer
-
IoT03-sound-streaming: Good luck! My students never got this working.
-
IoT04-BLE: Get the microcontroller to connect to an app like nRF Connect by Nordic: Apple ... Android. my example. BLE coding is very strange; I would suggest getting assistance using coPilot, etc.
-
IoT05-LoRa: If you have two LoRa modules, try to get it so you can text back and forth between them. This is actually quite advanced, and I use an entirely different board. my example using the RAK2270 sticker tracker
-
IoT06-LoRaWan: This is also reasonably advanced; one LoRa module should be able to connect to a LoRaWAN network like TTN or Helium. If you have connectivity, you will also need a cloud connection. my example using the RAK2270 sticker tracker
-
IoT07-ESPNOW: If your microcontroller can chat with others like the ESP32, use the default ESP-NOW example programs to make and test connections between them. ESP-NOW is like WiFi but without using a router that needs a password; it is more like using a radio on a specific channel.
-
IoT08-ethernet-poe: If you have an Ethernet module, try to make a webserver using Ethernet. Ethernet has two huge advantages: 1. no passwords needed; 2. PoE (Power over Ethernet). Some schools will have PoE already set up, and it is a bit of a joy when it works, meaning Ethernet not only gives you web access but also powers your microcontroller. my example, but only for the PortentaH7 with the Ethernet Vision Shield; last year, using the XIAO boards, I did not do this assignment.
-
IoT09-multiplexer: Some microcontrollers do not have enough pins for the final projects so connecting a multiplexer makes some sense. I never got this working but did get the below connectivity working.
-
IoT10-UART: Connect 2 microcontrollers to exchange information using the UART serial protocol, RX criss-crossed with TX. my example
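A sketch of the idea for ESP32-class boards, uploaded to both microcontrollers; the RX/TX pin numbers are placeholders, and remember to also connect GND to GND:

```cpp
// UART link between two boards: upload the same sketch to both, connect
// TX -> RX, RX -> TX and GND -> GND.  The RX/TX pin numbers are placeholders
// for the ESP32 Arduino core's Serial1.begin(baud, config, rxPin, txPin).

const int RX_PIN = D7;
const int TX_PIN = D6;

void setup() {
  Serial.begin(115200);                             // USB serial monitor
  Serial1.begin(9600, SERIAL_8N1, RX_PIN, TX_PIN);  // link to the other board
}

void loop() {
  Serial1.println("ping from this board");          // send to the other board
  while (Serial1.available() > 0) {                 // print anything received
    Serial.write(Serial1.read());
  }
  delay(1000);
}
```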
-
IoT11-I2C: Use the I2C serial protocol to connect and exchange information between 2 microcontrollers. Note: you must pull up the SDA and SCL lines to 3V3 using 4.7 kOhm resistors. The two pins for I2C are called SDA and SCL. my example
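Before attempting controller/peripheral code between two boards, a standard Wire-library I2C scanner is a quick way to confirm the SDA/SCL wiring and pull-up resistors (it reports any responding I2C device, not specifically another microcontroller):

```cpp
// I2C bus scanner: a quick way to confirm SDA/SCL wiring and the pull-up
// resistors before trying controller/peripheral code between two boards.

#include <Wire.h>

void setup() {
  Serial.begin(115200);
  Wire.begin();                  // join the bus as the controller
  delay(2000);
  Serial.println("Scanning I2C bus...");
  for (byte address = 1; address < 127; address++) {
    Wire.beginTransmission(address);
    if (Wire.endTransmission() == 0) {   // a device answered
      Serial.print("Found device at 0x");
      Serial.println(address, HEX);
    }
  }
  Serial.println("Scan done.");
}

void loop() {}
```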
-
IoT12-SPI: Use 2 microcontrollers to connect and exchange information using the SPI protocol (MOSI, MISO, SCK, SS); the pins may also have other labels like POCI, PICO, SCK, SS. Note: this is fairly hard on many microcontrollers, as they typically are the controllers and sensors typically are the peripherals. Good luck getting this one to work.
Note: Be very leery of projects that use other microcontrollers, as the student has most likely just followed an online cookbook. These final projects should come from combinations of the assignments we did this semester, put together in novel ways.
-
Final01-simple: (pass) A simple sensor-and-actuator project, unique to each student, with a circuit diagram (proof of concept).
-
Final02-multi: (possible A or higher) A final project with multiple sensors and/or multiple actuators and/or IoT communication and/or machine learning, with a circuit diagram and a 3D-printed structure (can also be wood, metal, cardboard, etc.) (prototype).
-
Final03-group: (possible A+) Based on previous projects, students get in groups and combine their strengths to make a useful or fun final project, which must include machine learning. The teacher can also suggest students whose strengths may complement each other for an interesting group project. Note: Many students do not have time to finish a group project.
Notes about how to grade students.
Basically, as long as the teacher is clear at the start of the course, any grading method is fine. What I do is:
- Students must finish all mandatory assignments. (When the entire class has difficulty with an assignment, I make it optional until any student can do it or I get it working. In 2024 I never got e-ink working, so it stayed optional, and GPS never worked for latitude and longitude; students got full marks on that assignment if they generated all the GPS data, but I really wanted someone to parse the data for just latitude and longitude.)
- Once the mandatory assignments are complete, they can start their final projects, which must be done in order, easy to hard (advanced students will work on a hard project and never get it finished).
- I encourage students to work on multiple projects as some projects just can't be finished before marks are due.
- Final grades come from final individual projects. Group projects just bump grades up a few percent.
- At any point you should be able to ask a student to reproduce an assignment they have already completed. That way they need to keep good notes and a circuit diagram.
-
The 2024 economy version of this course using the seeedstudio $14 USD XIAO-ESP32s3-Sense is at maker100-eco
-
The original 2021 version of this course using the $114 USD PortentaH7 is at maker100
-
Deprecated 2020 Arduino course here
-
Deprecated 2019 Particle.io course here