Robot control with colored object detection
This tutorial covers detecting a green colored object and enabling a robot such as the Roomba or turtlesim to move towards it. Two nodes are used: the first detects the object in the camera feed and publishes the y-coordinate of its centroid on a topic, and the second subscribes to that topic and publishes velocity commands to the robot.
The tutorial involves the following steps:
Step 1: Create a node to detect the green colored object and publish the y-coordinate of its centroid over the coordinates topic
- Create a new python file in your ros_package/scripts folder and name it ball_tracking1.py
- Import the necessary packages (the imutils package is not part of a standard ROS installation and can typically be installed with pip)
#!/usr/bin/env python
import cv2
import numpy as np
import rospy
from std_msgs.msg import Float32
from collections import deque
import argparse
import imutils
import sys
- The following code will initiate the script and call a talker() function that is responsible for detecting the green colored object and publishing the y-coordinate of its centroid. When the node is stopped using CTRL+C, rospy raises a ROSInterruptException, the camera is released and all the OpenCV window frames are destroyed; until then, the talker() function runs in a loop.
if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
    finally:
        # The camera itself is released at the end of talker();
        # here we just make sure no OpenCV windows are left open
        cv2.destroyAllWindows()
- In case you do not have any green object around you, we have provided a video ball_tracking_example.mp4 of a person playing with a green colored ball in the code section of Week 12. Download the video and place it in your ros_package/scripts folder.
- The talker() function will read frames either from the provided video file or from the webcam, depending on your preference. It detects any object whose HSV color value lies between greenLower and greenUpper and draws a circular contour around it in each frame. The y-coordinate of the contour's centroid is then published as a Float32 over the coordinates topic. (A small sketch for finding suitable HSV bounds for other colors follows the code below.)
def talker():
    rospy.init_node('ball_tracking1', anonymous=True)
    pub = rospy.Publisher('coordinates', Float32, queue_size=1)
    rate = rospy.Rate(20)  # 20 Hz
    # Optional command line arguments: a video file to use instead of the
    # webcam, and the maximum length of the tracked-points buffer
    ap = argparse.ArgumentParser()
    ap.add_argument("-v", "--video", help="path to the (optional) video file")
    ap.add_argument("-b", "--buffer", type=int, default=64, help="max buffer size")
    args = vars(ap.parse_args())
    # Lower and upper bounds of the color "green" in HSV space
    greenLower = (29, 86, 6)
    greenUpper = (64, 255, 255)
    pts = deque(maxlen=64)
    # Use the webcam if no video file was supplied
    if not args.get("video", False):
        camera = cv2.VideoCapture(0)
    else:
        camera = cv2.VideoCapture(args["video"])
    while not rospy.is_shutdown():
        (grabbed, frame) = camera.read()
        # Stop when the end of the video file is reached
        if args.get("video") and not grabbed:
            break
        frame = imutils.resize(frame, width=600)
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Keep only the "green" pixels, then clean up the mask
        mask = cv2.inRange(hsv, greenLower, greenUpper)
        mask = cv2.erode(mask, None, iterations=2)
        mask = cv2.dilate(mask, None, iterations=2)
        cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
        center = None
        if len(cnts) > 0:
            # Use the largest contour; compute its enclosing circle and centroid
            c = max(cnts, key=cv2.contourArea)
            ((x, y), radius) = cv2.minEnclosingCircle(c)
            M = cv2.moments(c)
            center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
            # Publish the y-coordinate of the centroid
            pub.publish(int(M["m01"] / M["m00"]))
            rospy.loginfo(int(M["m01"] / M["m00"]))
            rate.sleep()
            if radius > 10:
                cv2.circle(frame, (int(x), int(y)), int(radius), (0, 255, 255), 2)
                cv2.circle(frame, center, 5, (0, 0, 255), -1)
        pts.appendleft(center)
        # Draw the trail of tracked centroids
        for i in xrange(1, len(pts)):
            if pts[i - 1] is None or pts[i] is None:
                continue
            thickness = int(np.sqrt(64 / float(i + 1)) * 2.5)
            cv2.line(frame, pts[i - 1], pts[i], (0, 0, 255), thickness)
        cv2.imshow("Frame", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    # Release the camera once the loop exits
    camera.release()
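- If you want to track an object of a different color, you need new lower and upper HSV bounds. Below is a minimal sketch (not part of the tutorial's code) for checking how a BGR color maps into OpenCV's HSV space; the BGR triple shown is just an example value.
import cv2
import numpy as np

# Example BGR color (pure green); replace it with a pixel sampled from your object
bgr = np.uint8([[[0, 255, 0]]])
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
print(hsv)  # [[[ 60 255 255]]] -> hue 60 falls inside the 29-64 range used above
Pick the lower and upper bounds so that the hue of your object falls between them, with generous saturation and value ranges to tolerate lighting changes.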
Step 2: Create a node to find the y-coordinate value and, based on it, publish a linear speed over the /cmd_vel topic
- Create a new python file in your ros_package/scripts folder and name it y_coordinate.py
- Import the necessary packages and create an object of the Twist class (a short note on the Twist message fields follows the code below)
#!/usr/bin/env python
import rospy
from std_msgs.msg import Float32
from geometry_msgs.msg import Twist
msg=Twist()
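For reference, the Twist message carries two Vector3 fields, linear and angular; only linear.x (the forward speed) is set in this tutorial. A minimal sketch of how the fields are filled in (the values shown are just examples):
from geometry_msgs.msg import Twist

# Twist has two Vector3 fields: linear (x, y, z) and angular (x, y, z)
msg = Twist()
msg.linear.x = 0.1   # forward speed; the only field used in this tutorial
msg.angular.z = 0.0  # turn rate, left at zero here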
- Initiate the python script from the __main__ block by calling the listener() function
if __name__ == '__main__':
    listener()
- The listener() function is responsible for initializing the y_coordinate node and subscribing to the coordinates topic. It passes the y-coordinate data received on the topic to the callback function.
def listener():
    # In ROS, nodes are uniquely named. If two nodes with the same
    # name are launched, the previous one is kicked off. The
    # anonymous=True flag means that rospy will choose a unique
    # name for our node so that multiple listeners can run simultaneously.
    rospy.init_node('y_coordinate', anonymous=True)
    # Subscribe to the y-coordinate values published by ball_tracking1.py
    rospy.Subscriber('coordinates', Float32, callback)
    # spin() simply keeps python from exiting until this node is stopped
    rospy.spin()
- The callback function determines the linear velocity of the iRobot/turtlesim based on the y-coordinate value and publishes it over the turtle1/cmd_vel topic. (A more idiomatic variant that creates the publisher only once is sketched after the code below.)
def callback(data):
    rospy.loginfo(rospy.get_caller_id() + " I heard %f", data.data)
    pub = rospy.Publisher('turtle1/cmd_vel', Twist, queue_size=10)
    # Drive forward while the centroid's y-coordinate is greater than 50 pixels;
    # otherwise stop
    if data.data > 50:
        msg.linear.x = 0.1
    else:
        msg.linear.x = 0
    pub.publish(msg)
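Creating the publisher inside the callback works, but it registers a new publisher on every message. A more idiomatic variant (a sketch, not the tutorial's original code) creates it once inside listener(), after the node is initialized, and reuses it:
# Sketch of an alternative layout for y_coordinate.py
# (assumes the same imports and msg = Twist() as above)
pub = None

def callback(data):
    rospy.loginfo(rospy.get_caller_id() + " I heard %f", data.data)
    msg.linear.x = 0.1 if data.data > 50 else 0
    pub.publish(msg)

def listener():
    global pub
    rospy.init_node('y_coordinate', anonymous=True)
    # Create the cmd_vel publisher once and reuse it in every callback
    pub = rospy.Publisher('turtle1/cmd_vel', Twist, queue_size=10)
    rospy.Subscriber('coordinates', Float32, callback)
    rospy.spin()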
Step 3: Make the iRobot/turtlesim move linearly
- If you have the robot with a camera mounted on it, run the above nodes and launch the create2 and usb_cam drivers
- If you do not have an iRobot Roomba, launch the turtlesim by using the command
$ rosrun turtlesim turtlesim_node
- When you place a green colored object in front of the webcam, the turtlesim/create2 robot will move towards it
- A demonstration video is available here: https://youtu.be/-P3i7L_g1OM
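To summarize, a typical run with turtlesim looks like the following (a sketch, assuming your package is named ros_package as in the steps above and that both scripts have been made executable; run each command in its own terminal):
$ roscore
$ rosrun turtlesim turtlesim_node
$ rosrun ros_package ball_tracking1.py
$ rosrun ros_package y_coordinate.py
If you are using the example video instead of a webcam, pass it to the tracking node with the -v/--video flag, e.g. rosrun ros_package ball_tracking1.py --video <path to ball_tracking_example.mp4>. You can also check that the y-coordinate values are being published with rostopic echo /coordinates.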