Inserting Gazebo controllers into URDF

After inserting the links and assigning the joints, we need to add Gazebo controllers to simulate the differential drive and the depth camera; these plugins act as software models of the actual robot hardware. Here is a snippet of the differential drive Gazebo plugin. You can find this code snippet in urdf/chefbot_base_gazebo.urdf.xacro.

    <gazebo>
      <plugin name="kobuki_controller" filename="libgazebo_ros_kobuki.so">
        <publish_tf>1</publish_tf>
        <left_wheel_joint_name>wheel_left_joint</left_wheel_joint_name>
        <right_wheel_joint_name>wheel_right_joint</right_wheel_joint_name>
        <wheel_separation>0.30</wheel_separation>
        <wheel_diameter>0.09</wheel_diameter>
        <torque>18.0</torque>
        <velocity_command_timeout>0.6</velocity_command_timeout>
        <imu_name>imu</imu_name>
      </plugin>
    </gazebo>

In this plugin, we provide the robot's design values, such as the motor torque, wheel diameter, and wheel separation. The differential drive plugin used here is kobuki_controller, the same plugin used in the TurtleBot simulation.
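To see why wheel_separation and wheel_diameter matter, here is a minimal sketch (not part of the plugin itself) of the inverse differential-drive kinematics that a controller like this applies, using the values from the snippet above:

```python
# Inverse differential-drive kinematics: convert a body velocity
# command (v, omega) into left/right wheel angular velocities.
# The two constants match the plugin snippet above.
WHEEL_SEPARATION = 0.30  # meters, distance between the wheel centers
WHEEL_DIAMETER = 0.09    # meters
WHEEL_RADIUS = WHEEL_DIAMETER / 2.0

def wheel_speeds(v, omega):
    """Return (left, right) wheel angular velocities in rad/s for a
    linear velocity v (m/s) and angular velocity omega (rad/s)."""
    v_left = v - omega * WHEEL_SEPARATION / 2.0
    v_right = v + omega * WHEEL_SEPARATION / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Driving straight at 0.2 m/s: both wheels spin at the same rate.
print(wheel_speeds(0.2, 0.0))
# Turning in place at 1 rad/s: the wheels spin in opposite directions.
print(wheel_speeds(0.0, 1.0))
```

A larger wheel_separation makes the wheels move faster for the same turning rate, which is why getting these values wrong in the URDF makes the simulated robot turn at the wrong speed.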

After adding this controller, we need to create a depth sensor plugin for mapping and localization. Here is the code snippet that simulates the Kinect depth sensor. You can find this code snippet in urdf/chefbot_gazebo.urdf.xacro.

    <plugin name="kinect_camera_controller" filename="libgazebo_ros_openni_kinect.so">
      <cameraName>camera</cameraName>
      <alwaysOn>true</alwaysOn>
      <updateRate>10</updateRate>
      <imageTopicName>rgb/image_raw</imageTopicName>
      <depthImageTopicName>depth/image_raw</depthImageTopicName>
      <pointCloudTopicName>depth/points</pointCloudTopicName>
      <cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
      <depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
      <frameName>camera_depth_optical_frame</frameName>
      <baseline>0.1</baseline>
      <distortion_k1>0.0</distortion_k1>
      <distortion_k2>0.0</distortion_k2>
      <distortion_k3>0.0</distortion_k3>
      <distortion_t1>0.0</distortion_t1>
      <distortion_t2>0.0</distortion_t2>
      <pointCloudCutoff>0.4</pointCloudCutoff>
    </plugin>

In the depth sensor plugin, we provide the sensor's design values, such as the topic names, the optical frame, the stereo baseline, and the distortion coefficients, so that the simulated sensor behaves like the real Kinect.
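As an illustration of what the plugin publishes on its point cloud topic, here is a minimal sketch of the pinhole back-projection that turns a depth pixel into a 3D point in the camera's optical frame. The intrinsics (fx, fy, cx, cy) are assumed example values for a 640x480 Kinect-like camera, not values taken from the plugin:

```python
# Pinhole back-projection: turn one depth pixel (u, v, depth) into a
# 3D point (x, y, z) in the camera optical frame, the same camera
# model a simulated depth sensor uses to fill its point cloud.
# fx, fy, cx, cy are assumed example intrinsics, not plugin values.
FX = FY = 525.0          # focal lengths in pixels
CX, CY = 319.5, 239.5    # principal point for a 640x480 image

def depth_pixel_to_point(u, v, depth):
    """Back-project pixel (u, v) with depth in meters to (x, y, z)."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return x, y, depth

# A pixel at the principal point maps straight down the optical axis.
print(depth_pixel_to_point(319.5, 239.5, 1.0))  # → (0.0, 0.0, 1.0)
```

The pointCloudCutoff value in the plugin (0.4) simply discards points whose depth is below that range, mimicking the minimum sensing distance of the real Kinect.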

You can clone the section code using the following command:

    $ git clone https://github.com/qboticslabs/ros_robotics_projects