In the previous post, a simple “Hello World” Gazebo robot simulation was parsed into its components. I was interested in the node named “turtlebot3_drive” and I’ve figured out how to go from the rqt_graph diagram shown yesterday to its source code.
- Decide the node /turtlebot3_drive was interesting.
- Look back at the command lines executed and determine it’s most likely launched from
roslaunch turtlebot3_gazebo turtlebot3_simulation.launch
- Look at the launch file by running
rosed turtlebot3_gazebo turtlebot3_simulation.launch
- Look through the XML file to find the node launch element
<node name="$(arg name)_drive" pkg="turtlebot3_gazebo" type="turtlebot3_drive" required="true" output="screen"/>
- Go into the turtlebot3_gazebo package with
roscd turtlebot3_gazebo
- Look at its CMakeLists.txt (catkin packages are built with CMake rather than a hand-written makefile)
- See the executable turtlebot3_drive declared in a line like
add_executable(turtlebot3_drive src/turtlebot3_drive.cpp)
- Look at the source file
rosed turtlebot3_gazebo turtlebot3_drive.cpp
Now we can look at the actual nuts and bolts of a simple ROS control program. I had hoped it would be pretty bare-bones and was happy to find that I was correct!
I had feared the laser rangefinder data parsing code would be super complicated, because the laser scanner looks all around the robot. As it turns out, this simple random walk only looks at distance in three directions inside the laser scanner data callback Turtlebot3Drive::laserScanMsgCallBack(): straight ahead (0 degrees), 30 degrees to one side (30 degrees), and 30 degrees to the other (330 degrees). This particular piece of logic would have worked just as well with three cheap individual distance sensors rather than the sophisticated laser scanner.
The main decision-making is in the
GET_TB3_DIRECTION case of the switch statement inside
Turtlebot3Drive::controlLoop(). It covers three cases: if straight ahead is clear, proceed straight ahead; if there’s an obstacle near the right, turn left; and vice versa for the left.
This is a great simple starting point for experimentation. We could edit this logic, go back to the catkin workspace root and run catkin_make, then see the new code in action inside Gazebo. This feels like the kind of thing I would write for competitions like RoboRodentia, where there’s a fixed scripted task for the robot to perform.
I could stay and play with this for a while, but honestly the motivation is not strong. The attraction of learning ROS is to build on top of the work of others, and to play with recent advances in AI algorithms. Hand-coding robot logic would be an excellent exercise in using the ROS framework, but the result would not be novel or innovative.
Maybe I’ll have the patience to sit down and do my homework later, but for now, it’s off to chasing shiny objects elsewhere in the ROS ecosystem.