Argus is an advanced simulation that combines machine learning and real-time 3D physics to create a self-guided missile system. The project demonstrates how computer vision can be integrated into game engines for autonomous target tracking and interception.
Named after Argus Panoptes, the hundred-eyed giant of Greek mythology, the project embodies the concept of constant visual surveillance and intelligent tracking.
- YOLOv8 Object Detection running in real-time on GPU
- Unity Sentis Integration for neural network inference
- Single-class aircraft detection optimized for accuracy
- Real-time bounding box visualization with confidence scores
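As a quick sanity check outside Unity, the trained detector can be previewed with the ultralytics Python API. This is only a sketch: the file names are placeholders, and the in-game inference actually runs through Unity Sentis.

```python
# Preview the single-class "Plane" detector with the ultralytics API.
# File names below are placeholders, not the project's actual paths.
from ultralytics import YOLO

model = YOLO("best.pt")                          # trained single-class weights
results = model("aircraft_frame.jpg", conf=0.9)  # same confidence threshold as in-game

for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()        # corner coordinates in pixels
    print(f"Plane at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), conf={float(box.conf):.2f}")
```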
- Realistic aerodynamic model with lift/drag calculations
- Angle of Attack (AoA) based flight dynamics
- Proportional Navigation guidance system
- Manual plane controls with throttle, pitch, and roll
[Camera Feed] → [640×640 Tensor] → [YOLOv8 Sentis] → [Detections] → [Offset Calculation] → [Proportional Navigation] → [Missile Rotation] → back to [Camera Feed]
- Camera captures the view into a 640×640 render texture
- Sentis runs YOLOv8 inference → detections [x, y, w, h, confidence]
- Target center is computed in screen space
- Offset is calculated: (targetX - screenCenterX, targetY - screenCenterY)
- Missile rotates toward the target: yaw ∝ offsetX, pitch ∝ offsetY
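For reference, the 640×640 pre-processing step looks roughly like this outside Unity. OpenCV/NumPy are used purely for illustration; in the project the frame comes from a Unity render texture before Sentis inference.

```python
# Sketch: turn a captured frame into the 640x640 float tensor a YOLOv8
# ONNX model expects (NCHW layout, RGB, values scaled to [0, 1]).
import cv2
import numpy as np

def frame_to_tensor(frame_bgr: np.ndarray) -> np.ndarray:
    resized = cv2.resize(frame_bgr, (640, 640))             # match model input size
    rgb = cv2.cvtColor(resized, cv2.COLOR_BGR2RGB)          # BGR -> RGB
    chw = rgb.transpose(2, 0, 1).astype(np.float32) / 255   # HWC -> CHW, scale to [0, 1]
    return chw[None, ...]                                    # add batch dim -> (1, 3, 640, 640)
```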
The missile uses a classic guidance law:
- Detection: YOLOv8 finds the target bounding box [x, y, w, h]
- Center Calculation: targetCenter = (x + w/2, y + h/2)
- Screen Offset: offset = targetCenter - screenCenter
- Normalize: offsetNormalized = offset / screenSize × 2 → range [-1, 1]
- Apply Rotation: yaw += offsetX × rotationSpeed × deltaTime; pitch += offsetY × rotationSpeed × deltaTime
This produces a smooth pursuit trajectory that continuously steers the missile toward the target.
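A minimal Python sketch of these steps is shown below. The in-engine version lives in the Unity C# missile controller; the function and variable names here are illustrative, not the project's actual fields.

```python
# Screen-space guidance step: bounding box -> normalized offset -> yaw/pitch update.
def guidance_step(box, screen_w, screen_h, yaw, pitch, rotation_speed, dt):
    x, y, w, h = box                          # YOLO detection in pixels
    target_cx = x + w / 2                     # bounding-box centre
    target_cy = y + h / 2
    offset_x = (target_cx - screen_w / 2) / screen_w * 2   # normalize to [-1, 1]
    offset_y = (target_cy - screen_h / 2) / screen_h * 2
    yaw   += offset_x * rotation_speed * dt   # steer horizontally toward target
    pitch += offset_y * rotation_speed * dt   # steer vertically toward target
    return yaw, pitch
```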
- Uses the Kaggle Military Aircraft Detection Dataset
- All aircraft types merged into a single "Plane" class for maximum accuracy
- Trained on Kaggle GPU (NVIDIA P100) in ~10 minutes
- Architecture: YOLOv8n (nano variant for real-time performance)
- Input Size: 640×640 pixels
- Output: Bounding boxes with confidence scores
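A hedged sketch of the training call is shown below; the dataset YAML name, epoch count, and batch size are assumptions, since the README only documents the ~10-minute run on a P100.

```python
# Sketch of the Kaggle training run (hyperparameters are assumptions).
from ultralytics import YOLO

model = YOLO("yolov8n.pt")     # nano variant, as used by the project
model.train(
    data="aircraft.yaml",      # single merged "Plane" class (placeholder name)
    imgsz=640,
    epochs=50,                 # assumed
    batch=16,                  # assumed
    device=0,                  # Kaggle GPU
)
```

The trained weights are then exported to ONNX for Sentis: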
```python
# main.py - Optimized export for Unity Sentis
from ultralytics import YOLO

model = YOLO("best.pt")  # trained weights (placeholder path)

model.export(
    format='onnx',
    imgsz=640,
    simplify=True,    # simplify the ONNX graph
    opset=15,         # Sentis compatibility
    batch=1,          # fixed batch size
    dynamic=False     # static shapes for performance
)
```

| Metric | Value |
|---|---|
| Input Size | 640×640 RGB |
| Inference Time | ~16-30ms (GPU) |
| FPS | 30-60 |
| Confidence Threshold | 0.9 |
| Architecture | YOLOv8n |
Engine: Unity 2023.2.20f1
| Parameter | Default Value |
|---|---|
| Air Density | 1.225 kg/m³ |
| Wing Area | 16 m² |
| Max Thrust | 190 N |
| Lift Slope (ClAlpha) | 5.5 |
| Induced Drag (k) | 0.04 |
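A minimal sketch of the lift/drag math these parameters feed into, using standard formulas (lift coefficient linear in AoA, drag split into parasitic and induced terms). The parasitic drag coefficient and the "per radian" interpretation of ClAlpha are assumptions; the project's exact aerodynamic model may differ.

```python
import math

# Default parameters from the table above
RHO      = 1.225   # air density, kg/m^3
WING_S   = 16.0    # wing area, m^2
CL_ALPHA = 5.5     # lift-curve slope (per radian, assumed)
K_IND    = 0.04    # induced-drag factor
CD0      = 0.02    # parasitic drag coefficient (assumed; not listed in the table)

def lift_and_drag(speed: float, aoa_deg: float) -> tuple[float, float]:
    """Return (lift, drag) in newtons for a given airspeed (m/s) and AoA (degrees)."""
    alpha = math.radians(aoa_deg)
    q_s = 0.5 * RHO * speed ** 2 * WING_S   # dynamic pressure times wing area
    cl = CL_ALPHA * alpha                   # linear lift region
    cd = CD0 + K_IND * cl ** 2              # parasitic + induced drag
    return q_s * cl, q_s * cd

# Example: 60 m/s at 5 degrees angle of attack
print(lift_and_drag(60.0, 5.0))
```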
- Hybrid Tracking: Combine detection with Kalman filtering for smoother tracking
- Multi-Target: Track and prioritize multiple aircraft
- Better Physics: Add wind resistance, thrust vectoring, fuel consumption
- Advanced UI: HUD with radar, lock indicators, target info
- Multiplayer: Network-based dogfighting simulation

