Lab B: Robot Line Follower


Today's Lab



Coding Exercise: Line Following with Turtlebot3


You will complete this exercise in pairs.

Getting Started


To get started on this exercise, update the intro_robo class package to get the lab_b_line_follower ROS package and starter code that we'll be using for this activity.

$ cd ~/catkin_ws/src/intro_robo
$ git pull
$ git submodule update --init --recursive
$ cd ~/catkin_ws && catkin_make
$ source devel/setup.bash      

Next, launch Bringup on the Turtlebot. In one terminal, run:

$ roscore

In a second terminal, run:

$ ssh pi@IP_OF_TURTLEBOT
$ set_ip LAST_THREE_DIGITS
$ bringup

In a third terminal, run the following commands to start receiving ROS messages from the Raspberry Pi camera:

$ ssh pi@IP_OF_TURTLEBOT
$ set_ip LAST_THREE_DIGITS
$ bringup_cam
Note: bringup_cam is an alias for the command roslaunch turtlebot3_bringup turtlebot3_rpicamera.launch.

Additionally, running bringup_cam does not open the camera feed window. You must run code to see the camera feed.

In a fourth terminal, run the following command to decompress the camera messages:

$ rosrun image_transport republish compressed in:=raspicam_node/image raw out:=camera/rgb/image_raw
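
If you'd like to sanity-check the feed at this point, a minimal viewer node looks something like the sketch below. This is a hypothetical standalone script (not part of the starter code); it assumes the republished camera/rgb/image_raw topic from the command above:

#!/usr/bin/env python3
# Hypothetical standalone viewer: subscribes to the republished raw image
# topic and displays each frame in an OpenCV window.
import cv2
import cv_bridge
import rospy
from sensor_msgs.msg import Image

bridge = cv_bridge.CvBridge()

def show(msg):
    # Convert the ROS Image message to an OpenCV BGR image and display it.
    image = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    cv2.imshow("camera", image)
    cv2.waitKey(3)

rospy.init_node('camera_viewer')
rospy.Subscriber('camera/rgb/image_raw', Image, show)
rospy.spin()

(If the window stays black, see the cv2.imshow() fix in the Tips section at the bottom of this page.)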

Finally, in a fifth terminal, run the ROS node for the line follower:

$ rosrun lab_b_line_follower line_follower.py

The starter code implements a debugging window that visualizes the computed center of the orange pixels. Once you've correctly identified the center of the orange pixels, your window should look something like the following:

Red dot visualization
Note: If your starter code isn't working, see the Tips section at the bottom of this page for fixes to known problems.

Understanding the Starter Code


Read through the starter code in line_follower.py with your partner and discuss it together so that both of you understand what's going on. We encourage you to look up what the various OpenCV functions do. Make sure you both discuss and understand what the following variables represent and what values they will hold: h, w, d, cx, cy.
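
For reference, these values typically come from the image's shape and from the image moments of a color mask; here is a minimal sketch of that common pattern (the details in line_follower.py may differ, and the HSV bounds below are placeholders you'll need to tune yourself):

import cv2
import numpy as np

def find_center(image):
    # h, w, d: the image's height, width, and depth (number of channels)
    h, w, d = image.shape

    # Convert BGR to HSV and mask the "orange" pixels. Note that OpenCV
    # hues run 0-179, so a standard 0-360 degree hue is halved.
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    lower_orange = np.array([5, 100, 100])    # placeholder lower bound
    upper_orange = np.array([15, 255, 255])   # placeholder upper bound
    mask = cv2.inRange(hsv, lower_orange, upper_orange)

    # cx, cy: the x and y coordinates of the centroid of the orange
    # pixels, computed from the mask's image moments
    M = cv2.moments(mask)
    if M['m00'] > 0:
        cx = int(M['m10'] / M['m00'])
        cy = int(M['m01'] / M['m00'])
        return cx, cy
    return None   # no orange pixels in view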


Implementing the Line Follower


This programming exercise contains two main components:

  1. Defining the range for what you will consider an "orange" pixel in the image feed.
    • This task is best done using the HSV (Hue, Saturation, Value) color model, as it is more intuitive to use for defining color ranges than the additive RGB model. The HSV model is commonly used in color pickers; you may find it helpful to play around with one and observe how the HSV values change, then define a range of colors based on certain cutoffs.
    • OpenCV uses the following ranges for H, S, and V: H: 0-179, S: 0-255, V: 0-255. This means that any colors defined using the standard HSV ranges (H: 0-360, S: 0.0-1.0, V: 0.0-1.0) will need to have their components normalized accordingly.
      • For example, the HSV Wikipedia page contains swatches for certain hues. The standard HSV color \(H = 60^{\circ}, S = \frac{1}{4}, V = \frac{5}{8}\) converts to the OpenCV HSV values of \(H = \big(\frac{60^{\circ}}{360^{\circ}} \cdot 180 - 1\big) = 29, S = \big(\frac{1}{4} \cdot 256 - 1\big) = 63, V = \big(\frac{5}{8} \cdot 256 - 1\big) = 159\)
    • If you want to work in RGB colorspace, this OpenCV documentation recommends using the cv2.cvtColor function, as demonstrated in the following example (except you'll want to investigate orange instead of green):
      import cv2
      import numpy as np

      green = np.uint8([[[0, 255, 0]]])   # a single green pixel in BGR order
      hsv_green = cv2.cvtColor(green, cv2.COLOR_BGR2HSV)
      print(hsv_green)                    # prints [[[ 60 255 255]]]
  2. Implementing proportional control to enable the robot to follow the orange line (see the sketch after this list). This will involve:
    • Setting up a ROS publisher to control the movement of the robot.
    • Computing an "error" term (the difference between the goal and the sensed reality). The most important part here is defining the "goal" and how you compare it against the sensed "reality".
    • Using the error term to determine how the robot moves, in accordance with the principles of proportional control.
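
For the proportional-control component, here is a minimal sketch. It assumes cx and w have been computed as in the starter code, that the robot listens on the standard Turtlebot3 /cmd_vel topic, and that the gain and forward speed are placeholder values you'll need to tune:

import rospy
from geometry_msgs.msg import Twist

# Publisher that controls the movement of the robot.
cmd_vel_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)

def follow_line(cx, w):
    # "Goal": the centroid of the orange pixels sits at the horizontal
    # center of the image (w / 2). "Reality": the centroid is at cx.
    # The error term is the difference between the two.
    error = (w / 2) - cx

    # Proportional control: angular speed is proportional to the error,
    # while the forward speed stays constant.
    k_p = 1.0 / 500.0                # placeholder gain to tune
    twist = Twist()
    twist.linear.x = 0.2             # constant forward speed (m/s)
    twist.angular.z = k_p * error    # turn toward the line
    cmd_vel_pub.publish(twist)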

To run your code:

$ rosrun lab_b_line_follower line_follower.py

Once you've successfully implemented your proportional control line follower, it should look something like the following:

line follower demo

If you and your partner(s) finish early, feel free to use this time to work independently on your Warmup Project assignment.


Tips:



A Fix to the Known Problem of cv2.imshow() Being Called Outside the Main Thread


If your camera view shows a black image when running the starter code (which we've observed when using the remote CSIL machines), you'll need to modify the starter code so that cv2.imshow() runs on the main thread (the call freezes when invoked from any other thread). Please make the following changes:

In the init() function, add the following two lines:

self.latest_image = None
self.new_image_flag = False


In the image_callback() function, change the following lines:

cv2.imshow("window", image)
cv2.waitKey(3)
to:
self.latest_image = image
self.new_image_flag = True


And finally, in the run() function, change the following line:

rospy.spin()
to:
rate = rospy.Rate(30)  # set an appropriate rate (e.g., 30 Hz)
while not rospy.is_shutdown():
    if self.new_image_flag:
        cv2.imshow("window", self.latest_image)
        cv2.waitKey(3)
        self.new_image_flag = False
    rate.sleep()


Acknowledgments


The line-following exercise and code were adapted from Gaitech EDU.