Sensors

A real robot may carry several sensors to perceive the world, and many nodes can subscribe to their data and perform processing, but by design the navigation stack only consumes a planar range sensor. The sensor node must therefore publish its data using one of these message types: sensor_msgs/LaserScan or sensor_msgs/PointCloud2.
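To give an idea of what a sensor_msgs/LaserScan carries, the following sketch converts its core fields (angle_min, angle_increment, and the ranges array) into Cartesian points in the sensor frame, the kind of processing a consumer node typically performs. It is plain Python with no ROS dependency; the function name and default limits are illustrative, not part of any ROS API:

```python
import math

def scan_to_points(angle_min, angle_increment, ranges,
                   range_min=0.1, range_max=30.0):
    """Convert planar range readings (as in sensor_msgs/LaserScan)
    into Cartesian (x, y) points in the sensor frame.

    Readings outside [range_min, range_max] are dropped, mirroring how
    LaserScan consumers usually filter out invalid returns.
    """
    points = []
    for i, r in enumerate(ranges):
        if not (range_min <= r <= range_max):
            continue
        # The i-th reading was taken at angle_min + i * angle_increment.
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

In a real node, the arguments would come straight from the corresponding fields of the received LaserScan message inside the subscriber callback.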

We will use the laser mounted on the front of the simulated mobile robot to navigate the Gazebo world. The laser is simulated in Gazebo, and it publishes its data in the hokuyo_link reference frame on the /robot/laser/scan topic. We do not have to configure anything else for the navigation stack to use this laser: the TF tree is already set up in the .urdf file, and the laser publishes its data in the correct format.
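For reference, a simulated laser of this kind is typically declared in the robot description with a Gazebo ray sensor attached to the laser link, loading the gazebo_ros_laser plugin to publish the scans. The sketch below follows that pattern; the numeric values and plugin name attribute are illustrative, while the topic and frame match the ones used in this chapter:

```xml
<gazebo reference="hokuyo_link">
  <sensor type="ray" name="laser">
    <update_rate>10</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>180</samples>
          <min_angle>-1.57</min_angle>
          <max_angle>1.57</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.1</min>
        <max>10.0</max>
      </range>
    </ray>
    <!-- Publishes sensor_msgs/LaserScan on /robot/laser/scan
         in the hokuyo_link frame. -->
    <plugin name="laser_controller" filename="libgazebo_ros_laser.so">
      <topicName>/robot/laser/scan</topicName>
      <frameName>hokuyo_link</frameName>
    </plugin>
  </sensor>
</gazebo>
```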

In the case of a real laser sensor, we would have to develop a driver for it. In Chapter 5, Accessing Sensors and Actuators through ROS, we already discussed how to connect the Hokuyo laser device to ROS.
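Much of such a driver amounts to filling in the LaserScan fields consistently from the device's basic parameters. The helper below derives them as a plain dictionary so the logic can be checked without ROS; the keys follow the sensor_msgs/LaserScan field names, and the parameter values in the comment are illustrative:

```python
import math

def build_scan_metadata(num_readings, angle_min, angle_max, scan_time):
    """Derive the per-reading fields a sensor_msgs/LaserScan expects
    from a device's basic parameters.

    num_readings readings span [angle_min, angle_max], so there are
    num_readings - 1 angular steps between them.
    """
    angle_increment = (angle_max - angle_min) / (num_readings - 1)
    return {
        "angle_min": angle_min,
        "angle_max": angle_max,
        "angle_increment": angle_increment,
        # Time between consecutive readings within one sweep.
        "time_increment": scan_time / num_readings,
        "scan_time": scan_time,
    }

# Example device: 181 readings over 180 degrees, 10 Hz sweep.
meta = build_scan_metadata(181, -math.pi / 2, math.pi / 2, 0.1)
```

In the actual driver node, these values would be copied onto a sensor_msgs/LaserScan message along with the ranges read from the device, and published with roscpp or rospy.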

We can see the laser sensor working in simulation by running the following command:

$ roslaunch chapter7_tutorials gazebo_xacro.launch model:="`rospack find chapter7_tutorials`/urdf/robot_model_04.xacro"

The following screenshot shows the laser sensor in Gazebo:

Laser in Gazebo

The following screenshot shows the visualization of the laser sensor data in RViz:

Laser data visualization in RViz