You will complete this exercise in pairs.
To get started on this exercise, update the intro_robo class package to get the lab_b_line_follower ROS package and starter code that we'll be using for this activity.
$ cd ~/catkin_ws/src/intro_robo
$ git pull
$ git submodule update --init --recursive
$ cd ~/catkin_ws && catkin_make
$ source devel/setup.bash
Next, launch Bringup on the Turtlebot. In one terminal, run:
$ roscore
In a second terminal, run:
$ ssh pi@IP_OF_TURTLEBOT
$ set_ip LAST_THREE_DIGITS
$ bringup
In a third terminal, run the following commands to start receiving ROS messages from the Raspberry Pi camera:
$ ssh pi@IP_OF_TURTLEBOT
$ set_ip LAST_THREE_DIGITS
$ bringup_cam
bringup_cam is an alias for the command roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch. Note that running bringup_cam does not open a camera feed window; you must run code to see the camera feed.
In a fourth terminal, run the following command to decompress the camera messages:
$ rosrun image_transport republish compressed in:=raspicam_node/image raw out:=camera/rgb/image_raw
Finally, in a fifth terminal, run the ROS node for the line follower:
$ rosrun lab_b_line_follower line_follower.py
The starter code implements a debugging window to help you visualize the center of the computed orange pixels. Once you've correctly identified the center of the orange pixels, your window should look something like the following:
Read through the starter code in line_follower.py with your partner. Discuss it together so that both of you understand what's going on; we encourage you to look up what certain OpenCV functions do to better understand the code. Make sure that you both discuss and understand what the following variables represent and what values they will hold: h, w, d, cx, cy.
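If it helps your discussion, here is a minimal sketch of how values like these are typically derived in an OpenCV line follower. The names and logic here are illustrative assumptions, not necessarily the starter code's exact implementation:

```python
import numpy as np

# Stand-in camera frame: h, w, d are the image's height, width,
# and depth (number of color channels).
image = np.zeros((480, 640, 3), dtype=np.uint8)
h, w, d = image.shape

# Suppose `mask` marks the orange pixels (1 = orange, 0 = not orange).
mask = np.zeros((h, w), dtype=np.uint8)
mask[200:280, 300:340] = 1  # a fake blob standing in for the detected line

# cx, cy: the centroid (center) of the orange pixels, here computed
# directly from the mask's pixel coordinates.
ys, xs = np.nonzero(mask)
cx, cy = int(xs.mean()), int(ys.mean())
print(h, w, d, cx, cy)  # → 480 640 3 319 239
```

The starter code computes the centroid with cv2.moments() instead, but the result is the same idea: cx and cy locate the detected line within the h-by-w image.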
This programming exercise contains 2 main components:
Note that OpenCV represents HSV colors using the ranges H: 0-179, S: 0-255, V: 0-255. This means that any colors defined using the standard HSV ranges (H: 0-360, S: 0.0-1.0, V: 0.0-1.0) will need to have their components normalized accordingly.
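As a quick illustration (a sketch, not part of the starter code), the normalization from standard HSV to OpenCV's ranges looks like:

```python
def hsv_to_opencv(h, s, v):
    """Convert standard HSV (H: 0-360, S: 0.0-1.0, V: 0.0-1.0)
    to OpenCV's ranges (H: 0-179, S: 0-255, V: 0-255)."""
    return (round(h / 2), round(s * 255), round(v * 255))

# e.g., a hue of 30 degrees at full saturation and value:
print(hsv_to_opencv(30, 1.0, 1.0))  # → (15, 255, 255)
```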
One way to find OpenCV's HSV representation of a color is with the cv2.cvtColor function, as demonstrated in the following example (except you'll want to investigate orange instead of green):
import cv2
import numpy as np

green = np.uint8([[[0, 255, 0]]])  # green in BGR
hsv_green = cv2.cvtColor(green, cv2.COLOR_BGR2HSV)
print(hsv_green)  # [[[ 60 255 255]]]
To run your code:
$ rosrun lab_b_line_follower line_follower.py
Once you've successfully implemented your proportional control line follower, it should look something like the following:
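If you're unsure where to start, the core proportional-control step can be sketched as follows. The gain value, the helper name, and the Twist fields shown in the comment are illustrative assumptions, not the starter code's exact values:

```python
# Proportional control sketch: steer in proportion to how far the line's
# centroid (cx) sits from the center of the image (w / 2).
def angular_velocity(cx, w, k_p=1.0 / 1000.0):
    err = cx - w / 2   # positive when the line is right of image center
    return -k_p * err  # steer opposite the error; k_p is a gain you tune

# In the image callback you would then publish something like:
#   twist.linear.x = 0.2                      # constant forward speed
#   twist.angular.z = angular_velocity(cx, w)
#   self.cmd_vel_pub.publish(twist)

print(angular_velocity(420, 640))  # line right of center → turn clockwise
```

The magnitude of k_p controls how sharply the robot corrects; too small and it drifts off the line, too large and it oscillates.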
If you and your partner(s) finish early, feel free to use this time to work independently on your Warmup Project assignment.
cv2.imshow() Being Called Outside the Main Thread
If your camera view shows a black image when running the starter code (which we've observed when using the remote CSIL machines), you'll need to make the following changes to the starter code so that cv2.imshow() runs on the main thread (if it doesn't run on the main thread, the function freezes). Please make the following changes:
In the __init__() function, add the following two lines:
self.latest_image = None
self.new_image_flag = False
In the image_callback()
function, change the following lines:
cv2.imshow("window", image)
cv2.waitKey(3)
to:
self.latest_image = image
self.new_image_flag = True
And finally, in the run()
function, change the following line:
rospy.spin()
to:
rate = rospy.Rate(30)  # set an appropriate rate (e.g., 30 Hz)
while not rospy.is_shutdown():
    if self.new_image_flag:
        cv2.imshow("window", self.latest_image)
        cv2.waitKey(3)
        self.new_image_flag = False
    rate.sleep()
The line-following exercise and code were taken and modified from Gaitech EDU.