# ur_toolkit

A comprehensive Python toolkit for robot control, computer vision, and workflow automation with Universal Robots.
## Quick Start

```bash
# Clone the repository
git clone https://github.com/AccelerationConsortium/ur_toolkit.git
cd ur_toolkit

# Set up virtual environment with all dependencies
bash setup/setup_venv.sh
```
### Teach Positions

```bash
python scripts/teach_positions.py
```

Interactive tool for teaching and managing robot positions with remote freedrive capability.
### Run Workflows

```bash
# Run sample workflow
python scripts/run_workflow.py

# Run custom workflow
python scripts/run_workflow.py examples/workflows/sample_workflow.yaml
```
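A workflow file is a YAML list of steps executed in order. The sketch below is only an illustration of the idea; the step and field names are assumptions, not the toolkit's exact schema (see `examples/workflows/` for real files):

```yaml
# Illustrative sketch only - action and field names here are assumptions,
# not the toolkit's exact schema; see examples/workflows/ for real files.
name: pick_and_place_demo
steps:
  - action: move_to_position
    position: home              # a position taught with teach_positions.py
  - action: move_to_position
    position: pick_approach
  - action: close_gripper
  - action: move_to_position
    position: place_approach
  - action: open_gripper
```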
Complete robot vision system with UR robot control, camera capture, and AprilTag detection.

## Project Structure

```
ur_toolkit/
├── src/
│   └── ur_toolkit/                # Main Python package
│       ├── camera/                # Camera interface modules
│       ├── robots/                # Robot-specific implementations
│       ├── workflow/              # Workflow execution system
│       ├── visual_servo/          # Visual servoing engine
│       ├── positions/             # Position management
│       ├── camera_calibration/    # Camera calibration tools
│       ├── apriltag_detection.py  # AprilTag detection
│       └── config_manager.py      # Configuration management
├── scripts/                       # Executable CLI scripts
│   ├── teach_positions.py         # Interactive position teaching
│   └── run_workflow.py            # Workflow runner
├── config/                        # Configuration files
│   └── config.yaml                # System configuration
├── examples/                      # Example workflows
│   └── workflows/                 # Sample workflow YAML files
├── docs/                          # Documentation
├── tests/                         # Test files
├── pi_cam_server/                 # Camera server (separate deployment)
└── setup/                         # Setup and installation scripts
```
## Features

- **Interactive Position Teaching** - Remote freedrive with automatic safe offset positioning
- **YAML Workflow System** - Sequential robot operations with step-by-step execution
- **AprilTag Integration** - Computer vision-based positioning and calibration
- **Camera Calibration** - Tools for camera intrinsic calibration
- **Robot Control** - Universal Robots interface with gripper support
- **Visual Servoing** - PID-based iterative pose correction (see the sketch below)
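The toolkit's visual servoing engine lives in `src/ur_toolkit/visual_servo/`. The snippet below is not that implementation; it is a self-contained sketch of the PID-based iterative correction idea, run against a simulated target instead of the real camera/robot loop:

```python
import numpy as np

def pid_correction(error, state, kp=0.5, ki=0.05, kd=0.01, dt=0.1):
    """One PID update: map a 3D position error to a corrective translation."""
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

# Demo loop on a simulated target; in the real engine the error would come
# from an AprilTag detection and the correction would move the robot.
target = np.array([0.30, 0.05, 0.20])    # desired tag position (m)
pose = np.array([0.25, 0.00, 0.30])      # current estimate (m)
state = {"integral": np.zeros(3), "prev_error": np.zeros(3)}

for step in range(50):
    error = target - pose
    if np.linalg.norm(error) < 1e-4:     # converged within 0.1 mm
        break
    pose += pid_correction(error, state) # apply correction (simulated move)

print(f"Converged in {step} steps, residual {np.linalg.norm(target - pose):.2e} m")
```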
## Documentation

See the `docs/` directory for detailed guides:

- Workflow System - Complete workflow usage guide
- Position Teaching - Position teaching workflow
- AprilTag Workflow - Computer vision integration
- Configuration Guide - System setup and config
- Changelog - Project change history
## Requirements

See `setup/requirements.txt` for Python dependencies. Compatible with Universal Robots and Robotiq grippers.
## Usage

### Position Teaching

```bash
# Start interactive position teaching
python scripts/teach_positions.py

# Teach positions for AprilTag ID 5
python scripts/teach_positions.py --tag-id 5
```

### Workflow Execution

```bash
# Execute a specific workflow file
python scripts/run_workflow.py examples/workflows/visual_servo_test.yaml

# Run in step-by-step mode
python scripts/run_workflow.py examples/workflows/sample_workflow.yaml --step-mode
```
## Configuration

Configure the system using the unified configuration file `config/config.yaml`:

```yaml
# Robot Configuration
robot:
  ip_address: "192.168.0.10"    # Your UR robot IP
  default_speed: 0.03
  default_acceleration: 0.08

# Camera Configuration
camera:
  server:
    host: "192.168.1.100"       # Your Pi camera IP
    port: 2222

# AprilTag Configuration
apriltag:
  family: "tag36h11"
  tag_size: 0.023               # 23mm tags
```
See `docs/CONFIGURATION_GUIDE.md` for complete configuration options.
## Pi Camera Server Setup

On your Raspberry Pi, run:

```bash
curl -sSL https://raw.githubusercontent.com/AccelerationConsortium/ur_toolkit/main/pi_cam_server/install.sh | bash
```
This will:
- Clone this repo
- Install all dependencies using system packages
- Set up camera server as systemd service
- Enable auto-start on boot
- Start the service immediately
## Testing

After setting up the virtual environment and configuration:

```bash
# Activate environment (if not already active)
source venv/bin/activate

# Test the camera connection
python tests/test_camera_capture.py

# Test UR robot connection
python tests/test_ur_robot.py --robot-ip 192.168.0.10

# Test AprilTag detection
python tests/test_apriltag_detection.py
```
## API Examples

### Camera Capture

```python
from ur_toolkit.camera.picam.picam import PiCam, PiCamConfig

# Load config and capture photo
config = PiCamConfig.from_yaml("config/config.yaml")
cam = PiCam(config)
photo_path = cam.capture_photo()
if photo_path:
    print(f"Photo saved: {photo_path}")
```
### AprilTag Detection

```python
import cv2

from ur_toolkit.apriltag_detection import AprilTagDetector

# Initialize detector
detector = AprilTagDetector(
    tag_family='tag36h11',
    tag_size=0.023,  # 23mm tags
    camera_calibration_file='camera_calibration/camera_calibration.yaml'
)

# Detect tags in image
image = cv2.imread('photo.jpg')
detections = detector.detect_tags(image)

for detection in detections:
    print(f"Tag {detection['tag_id']} at distance {detection['distance']:.3f}m")
    print(f"Position: {detection['pose']['tvec']}")
    print(f"Orientation: {detection['pose']['rvec']}")
```
### Robot Control

```python
from ur_toolkit.robots.ur.ur_controller import URController

# Connect to robot
robot = URController('192.168.0.10')

# Get current pose
current_pose = robot.get_tcp_pose()
print(f"TCP Position: {current_pose[:3]}")

# Move to new position (relative)
new_pose = current_pose.copy()
new_pose[2] += 0.1  # Move up 10cm
robot.move_to_pose(new_pose)
```
## Camera Calibration

For accurate AprilTag pose estimation:

```bash
# 1. Print the calibration chessboard pattern
# 2. Capture calibration photos
python src/ur_toolkit/camera_calibration/capture_calibration_photos.py

# 3. Calculate camera intrinsics
python src/ur_toolkit/camera_calibration/calculate_camera_intrinsics.py
```
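Under the hood, intrinsics calculation follows OpenCV's standard chessboard workflow. A minimal sketch of that approach (the corner count and photo paths are assumptions; the bundled script may differ in detail):

```python
import glob
import cv2
import numpy as np

# Chessboard inner-corner count (assumed 9x6 here) and its 3D grid points
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob('calibration_photos/*.jpg'):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve for the camera matrix and lens distortion coefficients
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.3f}")
```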
## Development

The project uses a `src/` layout for better packaging and testing:

- Source code is in `src/ur_toolkit/`
- Executable scripts are in `scripts/`
- Configuration is in `config/`
- Examples are in `examples/`
- Tests import from the installed package for accuracy

For development installations:

```bash
pip install -e .
```
See LICENSE file for details.

## 🤖 Robot Vision Workflow

### Complete AprilTag Detection and Robot Control

The standalone robot vision components are laid out as follows:

```
├── ur_robot_interface.py              # RTDE-based UR interface
├── tests/                             # Test scripts
│   ├── test_camera_capture.py         # Basic camera test
│   ├── test_apriltag_detection.py     # AprilTag detection test
│   ├── test_robot_vision.py           # Complete vision system test
│   └── test_ur_robot.py               # UR robot interface test
├── pi_cam_server/                     # Pi camera server
│   ├── camera_server.py               # Main server application
│   ├── camera_config.yaml             # Server configuration
│   ├── setup.sh                       # Pi setup script
│   ├── install.sh                     # One-line installer
│   └── requirements.txt               # Python dependencies
├── camera_calibration/                # Camera calibration workflow
│   ├── capture_calibration_photos.py  # Capture calibration images
│   ├── calculate_camera_intrinsics.py # Calculate intrinsics
│   ├── camera_calibration.yaml        # Generated camera intrinsics
│   ├── Calibration chessboard (US Letter).pdf  # Chessboard pattern
│   ├── QUALITY_GUIDE.md               # Quality metrics guide
│   └── README.md                      # Calibration documentation
├── camera_client_config.yaml          # Client configuration
└── README.md                          # This file
```

#### 1. Camera Server Setup

On your Raspberry Pi:
```bash
curl -sSL https://raw.githubusercontent.com/kelvinchow23/robot_system_tools/master/pi_cam_server/install.sh | bash
```
#### 2. Test Camera Connection

Edit `camera_client_config.yaml` with your Pi's IP address, then test:

```bash
python tests/test_camera_capture.py
```
#### 3. Camera Calibration

```bash
cd camera_calibration
python capture_calibration_photos.py
python calculate_camera_intrinsics.py
```
This creates `camera_calibration.yaml` with camera intrinsic parameters for accurate pose estimation.
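The file stores the camera matrix and lens distortion coefficients. An illustrative sketch of its contents (the values and exact key names in the generated file may differ):

```yaml
# Illustrative contents only - actual key names and values may differ
camera_matrix:                 # 3x3 intrinsic matrix (fx, fy, cx, cy)
  - [1250.0,    0.0, 960.0]
  - [   0.0, 1250.0, 540.0]
  - [   0.0,    0.0,   1.0]
distortion_coefficients: [0.12, -0.25, 0.001, 0.0005, 0.1]  # k1 k2 p1 p2 k3
image_width: 1920
image_height: 1080
```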
#### 4. AprilTag Detection

Test AprilTag detection and pose estimation with your calibrated camera:

```bash
python tests/test_apriltag_detection.py
```
#### 5. Robot Vision Integration

```python
# Complete robot vision workflow
import cv2

from robots.ur.ur_robot_interface import URRobotInterface
from camera.picam.picam import PiCam, PiCamConfig
from apriltag_detection import AprilTagDetector

# Initialize systems
robot = URRobotInterface('192.168.0.10')
camera = PiCam(PiCamConfig.from_yaml('camera_client_config.yaml'))
detector = AprilTagDetector(
    tag_family='tag36h11',
    tag_size=0.023,
    camera_calibration_file='camera_calibration/camera_calibration.yaml'
)

# Capture and analyze
photo_path = camera.capture_photo()
image = cv2.imread(photo_path)
detections = detector.detect_tags(image)

# Use detection results for robot control
for detection in detections:
    print(f"AprilTag {detection['tag_id']} detected")
    print(f"Distance: {detection['distance']:.3f}m")
    print(f"Position: {detection['pose']['tvec']}")
    # Implement your robot control logic here
```
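To act on a detection, the camera-frame tag position has to be expressed in the robot's frame. A minimal sketch of that step, assuming a known camera-to-TCP transform `T_tcp_cam` (obtaining it via hand-eye calibration is outside this example, and the offset values below are placeholders):

```python
import numpy as np

# Assumed 4x4 homogeneous camera-to-TCP transform from a prior hand-eye
# calibration (placeholder values; measure or calibrate for your setup).
T_tcp_cam = np.eye(4)
T_tcp_cam[:3, 3] = [0.0, -0.05, 0.10]   # camera offset from TCP, metres

def tag_position_in_tcp_frame(tvec):
    """Express a camera-frame tag position in the robot TCP frame."""
    p_cam = np.append(np.asarray(tvec, dtype=float).flatten(), 1.0)
    return (T_tcp_cam @ p_cam)[:3]

# Continuing from the detection loop above:
#   p_tcp = tag_position_in_tcp_frame(detection['pose']['tvec'])
print(tag_position_in_tcp_frame([0.02, -0.01, 0.35]))  # tag ~35cm in front
```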
### Network Requirements

- Robot + Laptop: Same subnet (e.g., 192.168.0.x)
- Pi Camera + Laptop: Same subnet (configure in `camera_client_config.yaml`)
- Robot + Pi Camera: Different subnets OK

### AprilTag Notes

- Default: tag36h11 family, 23mm size
- Customize in detection code for your specific tags
- Ensure tags are printed at exact scale for accurate pose estimation
### Dependencies

```bash
# Main dependencies (installed by setup_venv.sh)
pip install opencv-python numpy scipy ur-rtde requests pyyaml pupil-apriltags

# Additional development dependencies
pip install pytest black flake8 mypy
```
### Troubleshooting

**Camera connection:**

- Verify Pi IP address in `camera_client_config.yaml`
- Check network connectivity: `ping <pi-ip>`
- Ensure camera server is running on Pi: `systemctl status camera-server`

**AprilTag detection:**

- Ensure camera is calibrated (`camera_calibration.yaml` exists)
- Verify tag size matches physical measurement
- Check lighting conditions and tag visibility
- Use `--continuous` mode for real-time debugging

**Robot connection:**

- Verify robot IP address
- Check robot is in remote control mode
- Ensure robot safety system is active
- Test with minimal robot movements first
### Additional Documentation

- `camera_calibration/README.md` - Camera calibration guide
- `documentation/ARCHITECTURE.md` - System architecture
- `pi_cam_server/README.md` - Pi camera server setup
See `.github/copilot-instructions.md` for development practices and coding standards.
### Client Configuration

Example `camera_client_config.yaml`:

```yaml
server:
  host: "192.168.1.100"   # Your Pi's IP
  port: 2222

client:
  download_directory: "photos"
  timeout: 10
```
### Service Management

On the Pi:

```bash
# Check status
sudo systemctl status camera-server

# View logs
sudo journalctl -u camera-server -f

# Restart service
sudo systemctl restart camera-server

# Stop/start service
sudo systemctl stop camera-server
sudo systemctl start camera-server
```
### Supported Hardware

- Raspberry Pi Zero 2W ✅ Tested
- Raspberry Pi 5 ✅ Tested
- Pi Camera v1/v2/v3 ✅ All supported
- USB Cameras ✅ Via libcamera
### Protocol

Simple TCP protocol on port 2222 (see the client sketch below):

1. Client connects to Pi server
2. Client sends "CAPTURE" command
3. Server captures photo and returns image data
4. Client saves photo locally
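A minimal Python client for this protocol might look like the sketch below. The exact wire format (length prefix, acknowledgements) is an assumption; `pi_cam_server/camera_server.py` is the authoritative reference:

```python
import socket

# Hypothetical minimal client: connect, request a capture, save the bytes.
# Assumes the server streams the image and closes the connection when done.
with socket.create_connection(("your-pi-ip", 2222), timeout=10) as sock:
    sock.sendall(b"CAPTURE")
    chunks = []
    while True:
        data = sock.recv(65536)
        if not data:              # server closed the connection: image complete
            break
        chunks.append(data)

with open("photo.jpg", "wb") as f:
    f.write(b"".join(chunks))
```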
The system uses systemd for reliability and auto-start.
### Debugging

```bash
# Check service status
sudo systemctl status camera-server

# View error logs
sudo journalctl -u camera-server -n 50

# Test camera hardware
rpicam-still --timeout 1 -o test.jpg

# Test connectivity
ping your-pi-ip

# Check port access
telnet your-pi-ip 2222
```
Common issues:

- Camera not found: Enable camera with `sudo raspi-config`
- Service won't start: Check logs and camera hardware
- Connection refused: Verify Pi IP in `client_config.yaml`
- Permission denied: Ensure setup script ran with proper permissions
## License

MIT License