Camera Follower Bot is a lightweight, real-time camera tracking system that detects faces and poses, then drives a microcontroller via serial communication to physically track them. This project is a robust, testable, and scriptable rework of the original "camera follower robot" project by Will Cogley (https://willcogley.notion.site/).
It is designed for hobbyists and developers who want to build a camera-based tracking robot using Python, MediaPipe, OpenCV, and a microcontroller such as the Raspberry Pi Pico running MicroPython.
## Table of Contents
- Features
- Usage
- Development
- Contributing
- License
## Features

- Real-time camera processing using MediaPipe / OpenCV
- Simple CLI for running and configuring the processor
- Reconnect/backoff logic for serial communication with the Raspberry Pi Pico running the robot (see the sketch after this list)
- Display of robot logs in the preview window on the host computer
- Advanced logging system with configurable log levels and output destinations
- Unit tests and pytest configuration
- Migrated to the new MediaPipe APIs
- Helper scripts to set up the venv, run tests, and run the processor
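As a rough illustration of the reconnect/backoff feature, the idea looks like the sketch below. It assumes pyserial, and `open_serial_with_backoff` is an illustrative name rather than the project's actual API:

```python
import time

import serial  # pyserial


def open_serial_with_backoff(port: str, baud: int, max_delay: float = 30.0) -> serial.Serial:
    """Keep retrying to open the serial port, doubling the wait between attempts."""
    delay = 0.5
    while True:
        try:
            return serial.Serial(port, baud, timeout=1)
        except serial.SerialException as exc:
            print(f"serial open failed ({exc}); retrying in {delay:.1f}s")
            time.sleep(delay)
            delay = min(delay * 2, max_delay)  # capped exponential backoff
```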
## Usage

Requirements:

- Python 3.8 or newer on the host computer
- MicroPython on the microcontroller
- Camera follower robot mechanics (Will Cogley's design or similar)
- Raspberry Pi Pico or a similar microcontroller running MicroPython to control the robot, connected via a serial port to receive commands
- A webcam connected to the host computer, mounted on the robot and moved by its eye mechanism
Setup:

1. Create the virtual environment and install dependencies (scripted):

   ```sh
   ./scripts/setup.sh
   ```

2. Download the MediaPipe BlazeFace TFLite model and place it into `models/`. See `models/README.txt` for instructions and an example filename (`blaze_face_short_range.tflite`).

3. Run tests (uses the repo venv python):

   ```sh
   ./scripts/test.sh
   ```

4. Connect a microcontroller running MicroPython via USB serial and upload the files in `src/camera_follower_bot/rpi_pico_code/` to the device. Make sure `src/camera_follower_bot/rpi_pico_code/follower_bot.py` is started automatically (e.g. rename it to `main.py` when uploading). Reboot the microcontroller and leave it connected to your computer.

5. Run the camera processor on your computer and point it to the model file:

   ```sh
   ./scripts/run_camera.sh --model-path models/blaze_face_short_range.tflite
   ```

   Check that the camera preview window opens and that the computer connects to the microcontroller (visible once you see the microcontroller's output). If this fails, adjust your settings/parameters (see below). A sketch of what the processor does with the model follows these steps.
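For orientation, face detection with the new MediaPipe Tasks API and OpenCV looks roughly like the sketch below. This is a simplified illustration, not the project's actual processing loop; the model path and camera id simply reuse the defaults from this README:

```python
import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

# Build a face detector from the BlazeFace model downloaded during setup.
detector = vision.FaceDetector.create_from_options(
    vision.FaceDetectorOptions(
        base_options=mp_tasks.BaseOptions(
            model_asset_path="models/blaze_face_short_range.tflite"
        )
    )
)

cap = cv2.VideoCapture(0)  # same device as the --camera-id default
ok, frame_bgr = cap.read()
if ok:
    # MediaPipe expects SRGB input; OpenCV delivers BGR.
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = detector.detect(mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb))
    for detection in result.detections:
        box = detection.bounding_box  # origin_x, origin_y, width, height in pixels
        print(box.origin_x, box.origin_y, box.width, box.height)
cap.release()
```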
The `scripts/run_camera.sh` script accepts the following parameters:

- `--model-path`: Path to a TFLite model file (default: `/models/blaze_face_short_range.tflite`)
- `--serial-port`: Serial device path (default: `/dev/cu.usbmodem101`)
- `--baud`: Serial baud rate (default: 115200)
- `--camera-id`: Camera device id for OpenCV (default: 0)
- `--no-serial`: Run without serial hardware, e.g. useful for testing (default: disabled)
- `--rotate180` / `--no-rotate180`: Rotate the camera image by 180 degrees (default: enabled)
- `--flip` / `--no-flip`: Flip the camera image horizontally (default: enabled)
- `--log-level`: Set the logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL; default: INFO)
- `--log-file`: Path to a log file (default: stdout only)
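For example, to run against a second camera on a different serial port with verbose logging (the device path shown is just an example):

```sh
./scripts/run_camera.sh \
  --model-path models/blaze_face_short_range.tflite \
  --serial-port /dev/ttyACM0 \
  --camera-id 1 \
  --log-level DEBUG
```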
The application uses Python's built-in logging library for all log output. You can control the logging behavior using command-line arguments.
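These options map onto Python's standard logging setup; the following is a minimal sketch of an equivalent configuration, not the project's actual code:

```python
import logging
import sys
from typing import Optional


def configure_logging(level: str = "INFO", log_file: Optional[str] = None) -> None:
    """Log to stdout by default; add a file handler when a log file is given."""
    handlers = [logging.StreamHandler(sys.stdout)]
    if log_file:
        handlers.append(logging.FileHandler(log_file))
    logging.basicConfig(
        level=getattr(logging, level.upper()),
        format="%(asctime)s %(name)s %(levelname)s: %(message)s",
        handlers=handlers,
    )
```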
Any log from the robot is forwarded to the logger on the host machine.
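The forwarding can be as simple as re-logging each line received over serial; a hedged sketch (the function and logger names are assumptions, and `ser` stands for an open pyserial connection):

```python
import logging

logger = logging.getLogger("camera_follower_bot.robot")


def forward_robot_output(ser) -> None:
    """Read one line printed by the robot over serial and re-log it on the host."""
    line = ser.readline().decode(errors="replace").strip()
    if line:
        logger.info("robot: %s", line)
```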
The logging system provides different log levels:
- DEBUG: Detailed information for diagnosing problems (e.g., serial communication details)
- INFO: Confirmation that things are working as expected (default)
- WARNING: Indication that something unexpected happened, but the application is still working
- ERROR: A serious problem that prevented a function from completing
- CRITICAL: A very serious error that may prevent the application from continuing
## Development

- The package lives under `src/camera_follower_bot`. Tests live in `tests/`, and `pytest.ini` adds `src` to `PYTHONPATH`.
- To run tests using the venv python explicitly:

  ```sh
  .venv/bin/python -m pytest
  ```

- To run the main module directly (developer use):

  ```sh
  .venv/bin/python -m camera_follower_bot.run_camera
  ```

## Contributing

Contributions are welcome.
See CONTRIBUTING.md for details and follow the CODE_OF_CONDUCT.md when contributing.
## License

This project is distributed under the MIT license. See LICENSE for details.
