The (non-)moving object tracking system via two-axis camera motion and an n-joint robotic arm, for Raspberry Pi distributions
| | |
| --- | --- |
| Operating systems | Linux |
| Python versions | Python 3.x (64-bit) |
| Distros | Raspbian |
| Package managers | APT, pip |
| Languages | English |
- Raspberry Pi 2, 3 Model B, B+ or higher
- Raspberry Pi Camera
- n servo motors
  - 2 axes for the pan-tilt motion of T_System's target locking ability
  - n-2 axes for T_System's robotic arm feature (optional)
- All required libraries are installed automatically by the installation scripts. To see these libraries, look here.
Clone the GitHub repository and run `sudo ./install.sh` in the repository directory.

For development mode: `sudo ./install-dev.sh`

If there is a failure, try `sudo -H ./install-dev.sh`
usage: t_system [-h] [--interface {official_stand,augmented,remote_ui,None}]
[--stand-gpios RED-LED GREEN-LED FAN] [--host HOST]
[--port PORT] [--debug] [-l] [-s]
[--detection-model DETECTION_MODEL] [--cascades CASCADES] [-j]
[--encoding-file ENCODING_FILE] [--use-tracking-api]
[--tracker-type {BOOSTING,MIL,KCF,TLD,MEDIANFLOW,GOTURN,MOSSE,CSRT}]
[--camera-rotation CAMERA_ROTATION]
[--resolution WIDTH HEIGHT] [--framerate FRAMERATE]
[--chunk CHUNK] [--rate RATE] [--channels CHANNELS]
[--audio_device_index AUDIO_DEVICE_INDEX]
[--shoot-formats VIDEO AUDIO MERGED] [--shot-format SHOT] [-x]
[--sd-channels SD_CHANNELS] [--arm-name ARM]
[--ls-gpios PAN TILT] [--ls-channels PAN TILT]
[--AI AI | --non-moving-target | --arm-expansion] [-p]
[--ap-wlan AP_WLAN] [--ap-inet AP_INET] [--ap-ip AP_IP]
[--ap-netmask AP_NETMASK] [--ssid SSID] [--password PASSWORD]
[--wlan WLAN] [--inet INET] [--static-ip STATIC_IP]
[--netmask NETMASK] [--country-code COUNTRY_CODE]
[--environment {production,development,testing}]
[--no-emotion] [-S]
[-m {single_rect,rotating_arcs,partial_rect,animation_1,None}]
[-r] [-v] [--version]
{id,remote-ui-authentication,encode-face,self-update,arm,live-stream,r-sync,log}
...
positional arguments:
{id,remote-ui-authentication,encode-face,self-update,arm,live-stream,r-sync,log}
officiate the sub-jobs
id Make identification jobs of T_System.
remote-ui-authentication
Remote UI administrator authority settings of the
secret entry point that is the new network connection
panel.
encode-face Generate encoded data from the dataset folder to
recognize the man T_System is monitoring during
operation.
self-update Update source code of t_system itself via `git pull`
command from the remote git repo.
arm Management jobs of Denavit-Hartenberg transform matrix
models of robotic arms of T_System.
live-stream Make Online Stream jobs of T_System.
r-sync Make remote synchronization jobs of T_System.
log Make logging jobs of T_System.
optional arguments:
-h, --help show this help message and exit
For detailed output, look at Help
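For illustration, the `encode-face` sub-command's job of turning a dataset folder into encoded face data could look like the minimal sketch below. It assumes the `face_recognition` library and a per-person subfolder layout; both are assumptions for the example, not T_System's actual implementation.

```python
import os
import pickle

import face_recognition  # assumption: dlib-based face encoding library


def encode_dataset(dataset_dir: str, output_file: str = "encodings.pickle") -> None:
    """Encode every face found under dataset_dir/<person_name>/ into 128-d vectors."""
    known_encodings, known_names = [], []
    for name in sorted(os.listdir(dataset_dir)):
        person_dir = os.path.join(dataset_dir, name)
        if not os.path.isdir(person_dir):
            continue
        for filename in sorted(os.listdir(person_dir)):
            image = face_recognition.load_image_file(os.path.join(person_dir, filename))
            # An image may contain zero or more faces; keep one encoding per detected face.
            for encoding in face_recognition.face_encodings(image):
                known_encodings.append(encoding)
                known_names.append(name)
    with open(output_file, "wb") as f:
        pickle.dump({"encodings": known_encodings, "names": known_names}, f)


if __name__ == "__main__":
    encode_dataset("dataset")  # hypothetical dataset folder
```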
The standard running command selects one of T_System's user interfaces: `t_system --interface {official_stand,augmented,remote_ui,None}`.

`official_stand`, `augmented` and `remote_ui` are described here, here and here, respectively.

Detailed usage is available inside USAGE.md.
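The `arm` sub-command manages Denavit-Hartenberg (DH) transform-matrix models of the robotic arm. As a reference for what such a model computes, the sketch below chains standard DH link transforms with numpy; the joint parameters are placeholders, not T_System's real arm configuration.

```python
import numpy as np


def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Standard Denavit-Hartenberg link transform as a 4x4 homogeneous matrix."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])


# Placeholder 2-link chain: one (theta, d, a, alpha) tuple per joint, lengths in metres.
links = [
    (np.radians(30), 0.0, 0.10, np.radians(90)),
    (np.radians(45), 0.0, 0.12, 0.0),
]

T = np.eye(4)
for theta, d, a, alpha in links:
    T = T @ dh_transform(theta, d, a, alpha)  # compose base -> end-effector pose

print("End-effector position (x, y, z):", T[:3, 3])
```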
Portable usage interface v0.6
Special thanks to Uğur Özdemir for the awesome design idea of this Stand.
- Raspberry Pi 4 Model B/B+.
- 2 MG995 servo motors and 3 SG90 or MG90S servo motors.
- Body and arm are 1.125 times longer than in the previous version.
- An 8x8 mm, 8 MP micro camera.
- An automatically activatable IR LED for advanced night vision.
- A hardware switch that cuts the electric current directly.
- 2 serially connected batteries for feeding the Raspberry Pi and another 2 serially connected batteries for the servo motors.
- A 12-bit, 16-channel PWM servo driver with I2C communication (see the sketch after this list).
- 2 micro fans (30x30x10 mm) and an aluminium block for lowering the CPU temperature.
- Scans the surrounding networks; if there is no network connection, it becomes an access point and serves the Remote UI internally.
- No control by tapping; access is via the Remote UI from mobile or desktop.

To see the explanation of the old version, go here.
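As a reference for driving the pan-tilt servos through the 16-channel PWM driver mentioned above, here is a minimal sketch. It assumes a PCA9685-compatible board and the Adafruit CircuitPython libraries (`adafruit-circuitpython-pca9685`, `adafruit-circuitpython-motor`); T_System's own servo driver code may differ.

```python
import time

import board
import busio
from adafruit_motor import servo        # assumption: Adafruit CircuitPython library
from adafruit_pca9685 import PCA9685    # assumption: PCA9685-compatible 16-channel driver

# 12-bit, 16-channel PWM driver on the I2C bus; 50 Hz is the usual hobby-servo frequency.
i2c = busio.I2C(board.SCL, board.SDA)
pca = PCA9685(i2c)
pca.frequency = 50

pan = servo.Servo(pca.channels[0])   # channel numbers are placeholders
tilt = servo.Servo(pca.channels[1])

# Sweep the pan axis while the tilt axis holds level.
tilt.angle = 90
for angle in range(0, 181, 10):
    pan.angle = angle
    time.sleep(0.05)

pca.deinit()
```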
The remote control interface (Remote UI) v1.2.7
- Two kinds of control for the arm: 1) axis-based control, moving each axis separately; 2) direction-based control, moving according to a direction (up-down / forward-backward / right-left).
- Create scenarios by specifying arm positions and generating motion paths from them, so the system behaves like a camera dolly.
- Watch the live video stream while creating scenarios and monitor what is being recorded during operation.
- Add, update or delete Wi-Fi connection info.
- Add, update or delete photos of people for recognizing them. Choose one, several or all of them to be recognized during the job.
- Preview or download the video records with a date-based sorting system.
- Start live video streaming on 8 different popular websites, including Facebook, YouTube, Periscope and Twitch.
- Powered by Flask as an embedded framework; available on mobile and desktop (see the sketch below).
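The sketch below shows how a Flask-based Remote UI endpoint can be embedded and served to clients on the local network; the route, host and port are illustrative placeholders, not T_System's actual Remote UI code.

```python
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/api/arm/position", methods=["GET"])  # hypothetical endpoint
def arm_position():
    # A real Remote UI would query the arm controller here; this returns dummy data.
    return jsonify({"pan": 90, "tilt": 45})


if __name__ == "__main__":
    # Serve on all interfaces so mobile and desktop clients on the same network can reach it.
    app.run(host="0.0.0.0", port=5000)
```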
Augmented usage is explained here, in AUGMENTED.md.
Supported distributions: Raspbian (fully supported). Any other Debian-based ARM-architecture distributions are partially supported.
If you want to contribute to T_System, please read this guide.