Chapter 6. Extending the MRDS Visual Simulation Environment

The previous chapter showed how to use the MRDS Visual Simulation Environment, including making simple edits to the environment using the Simulation Editor. The robot entities and environments provided with the MRDS SDK are great, but they only tap a small part of the simulator's potential.

This chapter demonstrates how to add your own custom entities and services to the simulation environment. You will define a new four-wheel-drive robot with a camera and IR distance sensors, along with the services needed to drive the motors and read the sensor values. In the next chapter, you'll use this robot in a simulation of the SRS Robo-Magellan contest.

By the time you complete these two chapters, you will know how to build an entire simulation scenario, complete with special environmental entities, a custom robot, services to interface with the entities, and a high-level orchestration service to control the behavior of the robot. Figure 6-1 shows the Corobot entity defined in this chapter.

Figure 6-1


Simulation DLLs and Types

Before you set out on any great adventure, it pays to know what resources are available to you. When you're writing services that interact with the simulation engine, you need to create classes and use types that are defined in the following DLLs.

RoboticsCommon.DLL

This DLL defines a number of common types and generic contracts that both hardware and simulation services can use. Most simulation services will need to reference this DLL so that they can use at least some of the types defined in the Microsoft.Robotics.PhysicalModel namespace. Some of the types you'll use in this chapter are as follows:

  • Vector2, Vector3, Vector4: Structures that contain two, three, and four floating-point values, respectively. Vectors are typically used to represent 2D, 3D, or homogeneous coordinate vectors. Vector3 and Vector4 are also sometimes used to represent colors.

  • Quaternion: 3D rotations are represented by the physics engine as quaternions. It is beyond the scope of this chapter to completely explain the math behind quaternions, but you can reference the following link to learn more: http://en.wikipedia.org/wiki/Quaternion. The Quaternion type contains four floating-point values: X, Y, Z, and W.

  • Pose: A Pose defines the position and orientation of an entity within the simulation environment. It consists of a Vector3 position and a Quaternion orientation.

  • Matrix: As you might expect, this is a 4 × 4 array of floating-point values that represents the transform for a point in the simulation environment. The XNA library also provides a matrix type, which is used more commonly than the RoboticsCommon Matrix because it supports more built-in operations.

  • ColorValue: This contains four floating-point values (Alpha, Red, Green, and Blue) that range from 0 to 1 to define a color.
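These basic types combine naturally. The sketch below, which assumes a reference to RoboticsCommon.dll, builds a Pose from a Vector3 position and a Quaternion computed with the standard axis-angle formula. The FromAxisAngle helper shown here is illustrative, not part of the PhysicalModel API:

```csharp
// Sketch: composing a Pose from the basic PhysicalModel types.
// Assumes a reference to RoboticsCommon.dll.
using System;
using Microsoft.Robotics.PhysicalModel;

static class PoseExample
{
    // Standard axis-angle-to-quaternion formula, for a rotation of
    // 'angle' radians about a normalized 'axis':
    //   q = (axis * sin(angle/2), cos(angle/2))
    static Quaternion FromAxisAngle(Vector3 axis, float angle)
    {
        float half = angle * 0.5f;
        float s = (float)Math.Sin(half);
        return new Quaternion(axis.X * s, axis.Y * s, axis.Z * s,
                              (float)Math.Cos(half));
    }

    static Pose MakePose()
    {
        // A pose 1 meter above the origin, rotated 90 degrees about Y
        return new Pose(
            new Vector3(0, 1f, 0),
            FromAxisAngle(new Vector3(0, 1f, 0), (float)(Math.PI / 2)));
    }
}
```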

In addition to these basic types, RoboticsCommon.dll also defines a number of generic contracts, which define particular operations but don't necessarily associate any behavior with the operations. It is often useful to write a hardware service and a simulation service that implement a generic contract. The orchestration service that drives the robot can then work properly in simulation and in the real world by making a simple manifest change. Some of the generic contracts defined in RoboticsCommon are as follows:

  • AnalogSensor: A generic analog sensor that returns a floating-point value representing the current state of a continuously varying hardware sensor

  • AnalogSensorArray: Multiple analog sensors

  • Battery: A generic battery contract that enables the system to report on the state of the battery and provide notifications when the battery level falls below a critical threshold.

  • ContactSensor: A generic sensor that has a pressed and unpressed state. This is suitable for bumpers, pushbuttons, and other similar sensors.

  • Drive: A generic two-wheel differential drive that provides operations such as DriveDistance, RotateDegrees, AllStop, and so on. It provides control for two wheels that are driven independently.

  • Encoder: A generic wheel encoder sensor that provides information about the current encoder state

  • Motor: Provides a way to control a generic single motor

  • Sonar: Exposes a generic sonar device, including information about the current distance measurement and angular range and resolution

  • Webcam: Provides a way to retrieve images from a generic camera such as a webcam

You'll be using the AnalogSensor, Drive, and Webcam generic contracts as you develop the Robo-Magellan simulation.
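The manifest-swapping idea works because both the hardware service and the simulation service expose the same generic contract. The following is a hedged sketch of how a simulated sensor service might declare the generic AnalogSensor contract as an alternate contract; the attribute usage and proxy alias follow common DSS conventions and may differ in detail from your SDK version:

```csharp
// Sketch: a simulated IR sensor service exposing the generic
// AnalogSensor contract, so a manifest change can swap it with a
// hardware service implementing the same contract. Names here are
// illustrative assumptions based on DSS conventions.
using Microsoft.Ccr.Core;
using Microsoft.Dss.Core.Attributes;
using Microsoft.Dss.ServiceModel.Dssp;
using Microsoft.Dss.ServiceModel.DsspServiceBase;
using analog = Microsoft.Robotics.Services.AnalogSensor.Proxy;

[Contract("http://schemas.tempuri.org/2007/07/simulatedir.html")]
[AlternateContract(analog.Contract.Identifier)]
public class SimulatedIRService : DsspServiceBase
{
    // State and operations come from the generic contract, so any
    // client written against AnalogSensor works with this service.
    [ServiceState]
    private analog.AnalogSensorState _state =
        new analog.AnalogSensorState();

    [ServicePort("/simulatedir", AllowMultipleInstances = true)]
    private analog.AnalogSensorOperations _mainPort =
        new analog.AnalogSensorOperations();

    public SimulatedIRService(DsspServiceCreationPort creationPort)
        : base(creationPort) { }
}
```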

SimulationCommon.DLL

This DLL defines types that are used only in the simulation environment. The types specific to the simulation engine are contained in the Microsoft.Robotics.Simulation namespace. Some of these types include the following:

  • Entity: The base type for all entities in the simulation environment. This type contains all of the information common to both the simulation engine and the physics engine.

  • EntityState: This type contains information about the entity such as its Pose, Velocity, Angular Velocity, and Name. In addition, it contains a list of all the physics primitives associated with the entity, as well as physical properties such as the mass and density of the object and visual properties such as the default texture, mesh, and rendering effect. Miscellaneous flags are provided to control various aspects of the rendering or physics behavior of the entity.

  • LightEntity: This is a deprecated type, present in MRDS 1.5 only to provide backward compatibility. Lights are now represented by the LightSourceEntity type.

  • SimulationState: This is what is returned from a Get operation on the simulator. It contains information about the Main Camera, and some other information such as the current render mode and whether the physics engine is paused. The most important thing it contains is a list of all the entities in the simulation environment.

The types that the physics engine uses are defined under the namespace Microsoft.Robotics.Simulation.Physics. These types are too numerous to completely list here but the most commonly used types are as follows:

  • BoxShapeProperties, CapsuleShapeProperties, SphereShapeProperties, ConvexMeshShapeProperties, TriangleMeshShapeProperties, HeightFieldShapeProperties, WheelShapeProperties: These types hold state information about each of the shape objects supported by the physics engine. Some of the information is specific to a particular shape, such as the ProcessedMeshResource in the ConvexMeshShapeProperties. Much of the information is common to all or some shapes, such as Dimensions, Radius, LocalPose, etc. These shape properties are covered in more detail in the previous chapter.

  • BoxShape, CapsuleShape, SphereShape, ConvexMeshShape, TriangleMeshShape, HeightFieldShape, WheelShape: These types are the shapes created from their associated shape properties. They contain a reference to the actual physical shape representation in the AGEIA physics engine.

  • UIMath: This type contains static methods that you can use to convert between an Euler angle rotation representation and a quaternion rotation representation. A method is also provided to round a double value to the nearest hundredth.

SimulationEngine.DLL

This DLL contains most of the simulator functionality. From a programmer's perspective, the most important types that it contains are in the Microsoft.Robotics.Simulation.Engine namespace. These are the built-in entity types provided with the simulator, such as CameraEntity, SkyDomeEntity, SingleShapeEntity, and so on. Many of these entities are described in the previous chapter. The full source code for all of these entities is in samples\simulation\entities\entities.cs. You can use this code as an example for your own custom entities.

Note that all of these entities inherit from the VisualEntity class. Some of the members and properties of the VisualEntity class are important to understand when creating new entities and services for the simulation environment.

VisualEntity Methods and Members

Initialize, Update, and Render are three virtual methods on VisualEntity that can be overridden in a subclass to define new behavior for the entity. This section describes these methods, as well as other important methods and member variables on the VisualEntity class.

  • Initialize: This method is called after the entity has been inserted into the simulation environment. In this method, the state values of the entity are used to instantiate run-time objects, such as shapes and meshes, which enable the entity to function in the simulator. It is important to keep all of the code that creates run-time objects in the Initialize function and not in the constructor. When an entity is deserialized from an XML file, such as when it is pasted using the Simulation Editor, the deserializer calls the default constructor for the entity. This is the constructor with no parameters. It then sets all of the state variables according to the XML data and calls Initialize on the entity to instantiate run-time objects. If there are run-time objects that are initialized in a nondefault constructor, the entity will fail to initialize properly when it is deserialized. It is a good idea to enclose most of the code in the Initialize method within a Try/Catch block and to set the value of the InitError field with the text of any errors that are encountered.

  • Update: This method is called once each frame while the physics engine is not processing the frame. This is important because some physics engine functions cannot be called while the physics engine is actively processing the frame. In the Update method, the entity calculates its transformation matrix, which is used in rendering. Custom behavior can be implemented here as well, such as setting the pose of an entity or the axle speed of a wheel, for example.

  • Render: This method is called once each frame after the Update method for all the other entities has completed. This is where the mesh is rendered so that the entity appears on the screen. Several entities override the default behavior for this method to implement specialized rendering effects.

  • State: This class is actually defined as part of the Entity class from which VisualEntity is subclassed, but it is important enough to mention it here. The EntityState contains important information about the entity, such as its name, its pose, its velocity, its physics shapes, and its rendering assets. More information is provided about the EntityState in Chapter 5.

  • InsertEntity, InsertEntityGlobal, RemoveEntity: These methods are used to add and remove child entities. InsertEntity assumes that the pose of the child entity is relative to the parent entity, whereas InsertEntityGlobal assumes that the pose of the child is given in global coordinates. RemoveEntity removes a child entity. An example of a child entity is a camera mounted on a parent robot. The camera doesn't have a physics shape associated with it, so its position is updated in its Update method, enabling it to move with its parent entity. Other entities, such as the BumperArrayEntity, do have a physics shape, so they are moved by the physics engine each frame. To keep them attached to their parent entity, a joint is created between the parent entity and the child entity. This joint is stored in the ParentJoint field of the child entity. Joints are covered in more detail in Chapter 7.

  • PhysicsEntity: This class represents the link between this entity and the physics engine. If the entity has no physics shapes associated with it, then this member will be null. The more common case is that the entity will have one or more physics shapes associated with it and the PhysicsEntity member is initialized with a call to CreateAndInsertPhysicsEntity after all of the physics shapes have been created and added to the State.PhysicsPrimitives list. The PhysicsEntity member contains several methods that affect the entity within the simulation environment, such as SetPose, SetLinearVelocity, SetAngularVelocity, ApplyForce, and ApplyTorque.

  • DeferredTaskQueue: This is a list of tasks that need to be executed during the Update method. The Update method is the only time when the physics engine is guaranteed to not be busy processing a frame. Most of the methods on the PhysicsEntity object cannot be called when the physics engine is busy. When one of these methods needs to be called outside of the Update method, it is added as a task to the DeferredTaskQueue and its execution is deferred until Update runs again.

  • LoadResources: This method is used to load the mesh, texture, and effect resources associated with the entity. It is typically called from the Initialize method and it stores references to the loaded meshes in the Meshes field.
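Several of these members come together in a typical custom entity. The following is a hedged sketch of the Initialize and deferred-task patterns described above; the signatures follow the conventions used in entities.cs, and the SimpleBoxEntity class and Teleport method are illustrative, not SDK types:

```csharp
// Sketch: the Initialize/DeferredTaskQueue pattern for a custom
// entity. Treat this as an outline, not drop-in code.
[DataContract]
public class SimpleBoxEntity : VisualEntity
{
    // Required parameterless constructor, used by deserialization.
    // No run-time objects are created here.
    public SimpleBoxEntity() { }

    public override void Initialize(
        xnagrfx.GraphicsDevice device, PhysicsEngine physicsEngine)
    {
        try
        {
            // Build run-time objects here, never in the constructor,
            // so deserialized copies initialize correctly.
            BoxShape shape = new BoxShape(new BoxShapeProperties(
                10f, new Pose(), new Vector3(0.5f, 0.5f, 0.5f)));
            State.PhysicsPrimitives.Add(shape);
            CreateAndInsertPhysicsEntity(physicsEngine);
            base.Initialize(device, physicsEngine);
        }
        catch (Exception ex)
        {
            // Surface initialization failures to the Simulation Editor
            InitError = ex.ToString();
        }
    }

    // Callable at any time: the pose change is deferred until Update,
    // when the physics engine is guaranteed not to be busy.
    public void Teleport(Pose pose)
    {
        DeferredTaskQueue.Post(
            new Task(() => PhysicsEntity.SetPose(pose)));
    }
}
```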

The SimulationEngine Class

The SimulationEngine class defines a static member called GlobalInstance, which holds a pointer to the single instance of the SimulationEngine. The DssHost environment will not allow multiple instances of the SimulationEngine class to be instantiated. Public properties such as a reference to the graphics device and a reference to the ServiceInfo class associated with the SimulationEngine service can be accessed from GlobalInstance. Public methods such as IntersectRay and IsEntityNameInUse can also be accessed from this reference.

The SimulationEngine class defines another static member called GlobalInstancePort, which contains a reference to the simulation engine port that can be used to insert, update, and remove entities. These two global variables are useful to services that must interact closely with the simulation engine.

SimulationEngine.Proxy.DLL

This DLL is necessary to access the service elements of the simulation engine. Items such as contracts and port definitions should always be accessed from the Proxy DLL.

PhysicsEngine.DLL

This DLL contains the definitions for the objects used in the simulation environment that represent objects in the physics engine. The PhysicsEntity object described in the previous section is defined in this DLL, so your service must add a reference to PhysicsEngine.DLL if you want to call any of the methods on the PhysicsEntity object.

This DLL also defines the PhysicsEngine object. This object represents the physics engine scene that contains all of the entities in the simulation environment. The methods on this object are not typically called by a service directly, but the PhysicsEngine object must be passed into several of the physics engine APIs.

Microsoft.Xna.Framework.DLL

This DLL contains all of the rendering code. It provides a managed DirectX interface and eventually calls the underlying native DirectX DLLs. You must have both DirectX and XNA installed for the simulator to work properly. Some simulation services must use the XNA types, so they must have a reference to the XNA Framework DLL.

In some cases, it is necessary to convert between XNA types and their counterparts defined in RoboticsCommon.DLL. There are several functions defined in the Microsoft.Robotics.Simulation.Engine namespace in the TypeConversion class that make this easy. This class provides a FromXNA method for Vector3, Vector4, Quaternion, and Matrix that converts a type from XNA to its RoboticsCommon counterpart. Similar ToXNA methods are provided to convert in the opposite direction.
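In practice, a conversion round-trip looks like the following sketch, which assumes references to both RoboticsCommon.dll and the XNA Framework DLL:

```csharp
// Sketch: converting between XNA and RoboticsCommon math types with
// the TypeConversion helpers described above.
using Microsoft.Robotics.PhysicalModel;
using Microsoft.Robotics.Simulation.Engine;
using xna = Microsoft.Xna.Framework;

static class ConversionExample
{
    static void Demo()
    {
        // Use XNA's richer math API for the actual computation...
        xna.Vector3 a = new xna.Vector3(1f, 0f, 0f);
        xna.Vector3 b = new xna.Vector3(0f, 1f, 0f);
        xna.Vector3 crossProduct = xna.Vector3.Cross(a, b);

        // ...then convert the result for APIs that expect the
        // RoboticsCommon vector type, and back again if needed.
        Vector3 mrdsVector = TypeConversion.FromXNA(crossProduct);
        xna.Vector3 backAgain = TypeConversion.ToXNA(mrdsVector);
    }
}
```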

There are two namespaces that contain types used in the simulator: Microsoft.Xna.Framework and Microsoft.Xna.Framework.Graphics. Both are discussed in the following sections.

The Microsoft.Xna.Framework Namespace

You will see several of the basic types defined in this namespace used in various simulation services:

  • Vector2, Vector3, Vector4: These are functionally equivalent to the corresponding types found in RoboticsCommon.DLL but these types often have more utility methods associated with them. In addition, some APIs work with the XNA typed vectors and other APIs work with the RoboticsCommon vectors. It is sometimes necessary to convert between XNA types and their RoboticsCommon counterparts.

  • Matrix: This is a standard 4 × 4 matrix, but the XNA version has more utility functions associated with it than does the RoboticsCommon version.

  • Quaternion: The XNA Quaternion equivalent.

The Microsoft.Xna.Framework.Graphics Namespace

This is where most of the graphics-related types are defined. Because there are so many types defined in this namespace, only a few are covered in this section:

  • GraphicsDevice: This object represents the hardware graphics device. It is initialized by the simulator, so it is rare for a service to interact directly with this object. However, if you override the Initialize method, you have to include a reference to the XNA DLL because the GraphicsDevice is passed as a parameter along with a reference to the PhysicsEngine object.

  • Effect, IndexBuffer, VertexBuffer, RenderState, Texture2D, TextureCube: These types are used to represent data that is directly accessed by the graphics hardware. Effects are used to tell the graphics hardware how to draw geometry. IndexBuffers and VertexBuffers hold the geometry to be drawn. RenderStates further specify how the graphics hardware should draw the objects. Texture2D and TextureCube objects hold texture map data.

It is beyond the scope of this book to completely cover the XNA graphics APIs. Numerous books and online articles cover XNA thoroughly. You can find the Microsoft XNA forums at http://forums.xna.com, and the forum that deals specifically with the framework APIs is at http://forums.xna.com/56/ShowForum.aspx.

Using Statements and DLL References

In summary, add the following DLLs as references to your service to enable it to work with the simulator:

Microsoft.Xna.Framework
PhysicsEngine
RoboticsCommon
SimulationCommon
SimulationEngine
SimulationEngine.proxy

Add the following using statements to properly resolve references to objects in these DLLs:

#region Simulation namespaces
using Microsoft.Robotics.Simulation;
using Microsoft.Robotics.Simulation.Engine;
using engineproxy = Microsoft.Robotics.Simulation.Engine.Proxy;
using Microsoft.Robotics.Simulation.Physics;
using Microsoft.Robotics.PhysicalModel;
using xna = Microsoft.Xna.Framework;
using xnagrfx = Microsoft.Xna.Framework.Graphics;
#endregion

Building Your Own SRS Robo-Magellan Simulation

In the following sections, you'll begin building an SRS Robo-Magellan simulation from the ground up. If you are the type of person who likes to skip to the end of the book to see how it all turns out, feel free to go to the Chapter6 directory of the sample software and build and run the solution to see the simulated Corobot in action. Chapter 7 includes a referee and orchestration service to complete the Robo-Magellan simulation.

This chapter describes how to build a custom robot, the CoroWare Corobot, along with its sensors and associated simulation services. You'll also define an environment to test the Corobot. The next chapter describes how to use the robot you've defined in a custom simulation scenario, the simulated SRS Robo-Magellan scenario.

If you prefer to follow the step-by-step implementation of these services, it's recommended that you make a new directory parallel to the Chapter6 directory. You can call it something like MyChapter6.

Simulation Services

A simulation service is a service that creates, manipulates, or reads data from entities in the simulation environment. It typically specifies the SimulationEngine service as a partner, and it references the DLLs listed in the previous sections. A simulation service follows the same rules as other services and it only communicates with the SimulationEngine service through a SimulationEnginePort. It can insert, replace, or delete entities by sending messages to this port. It can also subscribe to a particular entity by name so that it receives a notification when that entity is inserted, replaced, or deleted.

Simulation services that are running on the same node as the SimulationEngine service can take advantage of a shortcut that dramatically improves the performance of setting and retrieving entity data. When a service is running on the same node as the simulator and it receives an insert notification for a particular entity, the body of the insert notification message contains a reference to the actual entity in the simulation environment. This object is "live," meaning its fields are updated as each frame is processed in the simulator. In this case, it is only necessary for a simulation service to receive a single notification when an entity is inserted in the environment, after which it can use the reference to that entity for all subsequent operations. Because simulation services typically need to interact with their associated entities frequently, this can speed things up substantially.
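The notification shortcut described above typically looks like the following sketch. The pattern follows the SDK's simulated sensor services, but the class and method names here are illustrative assumptions:

```csharp
// Sketch: caching a "live" entity reference from an insert
// notification, for a service on the same node as the simulator.
using Microsoft.Ccr.Core;
using Microsoft.Robotics.Simulation.Engine;

public class SimulatedSensorHelper
{
    // Notification port on which the engine delivers entity events
    SimulationEnginePort _notificationTarget =
        new SimulationEnginePort();

    // "Live" reference: its fields are updated every simulator frame
    VisualEntity _entity;

    public void SubscribeForEntity(string entityName,
                                   DispatcherQueue queue)
    {
        // Ask the engine to notify us about the named entity
        EntitySubscribeRequestType request =
            new EntitySubscribeRequestType();
        request.Name = entityName;
        SimulationEngine.GlobalInstancePort.Subscribe(
            request, _notificationTarget);

        // A single insert notification is enough: cache the reference
        // and use it for all subsequent reads and writes.
        Arbiter.Activate(queue,
            Arbiter.Receive<InsertSimulationEntity>(
                false, _notificationTarget,
                ins => _entity = ins.Body));
    }
}
```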

A typical simulation scenario has one or more of these services running on the same node as the simulator and one or more higher-level orchestration services that may or may not be running on the same node. Only the simulation services interact directly with entities in the simulator. The simulation and orchestration services for the Robo-Magellan simulation are shown in Figure 6-2.

Figure 6-2


The SimulationEngine service, the Dashboard service, and the SimulatedWebCam service are provided as part of the SDK. The rest of the services will be developed in this chapter and in Chapter 7.

Notice that neither the Dashboard service nor the SimMagellan service interacts directly with the simulator. These services rely on information passed to them from the simulation services. The orchestration services can run on a different node or even a different machine from the simulator, but the simulation services must run on the same node.

Creating a Simulation Service

The first simulation service you'll create is the Corobot service. This service will add sky, ground, and a few other simulation entities to the simulation environment. Eventually, it will contain the definition for a new robot entity modeled after the CoroWare Corobot.

One way to create a new simulation service is to use the DssNewService utility. Make sure that you have installed the ProMRDS samples from the CD. Open an MRDS command prompt from the Start menu and enter the following commands:

C:\Microsoft Robotics Studio (1.5)>cd ProMRDS
C:\Microsoft Robotics Studio (1.5)\ProMRDS>mkdir MyChapter6
C:\Microsoft Robotics Studio (1.5)\ProMRDS>cd MyChapter6
C:\Microsoft Robotics Studio (1.5)\ProMRDS\MyChapter6>dssnewservice
/Service:"Corobot" /Namespace:"ProMRDS.Simulation.Corobot" /year:"2007" /month:"07"

The /Service parameter specifies the name of the service. The /Namespace parameter specifies the namespace that will be used in the service. The /year and /month options are typically not necessary because they default to the current year and month. They are used to make the contract identifier for the service. In this case, you want the contract identifier to match the Corobot service in the Chapter6 directory so you specify the year and month in which this contract was created.

After executing the command, a directory called Corobot is created with all of the files necessary to build a complete service. The contract identifier associated with the service is as follows:

"http://schemas.tempuri.org/2007/07/corobot.html"

Note that several files have been generated for you:

  • AssemblyInfo.cs: This file identifies the assembly as a Dss service.

  • Corobot.cs: This file contains the service implementation.

  • Corobot.manifest.xml: This is a manifest that can be used to start the Corobot service. Visual Studio is set up to run this manifest when you press F5 to start debugging your service. This is set up in the file Corobot.csproj.user.

  • CorobotTypes.cs: This file contains the contract identifier and the definition of the state associated with the service, and the operations defined on the main port.

Go ahead and build and run the service. At this point, it doesn't do much, but you can open a browser window, navigate to http://localhost:50000, select Service Directory on the left side of the window, and verify that the /corobot service is indeed running. You can even click it to retrieve its state, although you haven't put anything interesting in the service yet.

Now you'll modify the service so that it can interact with the simulator:

  1. Begin by adding the following using statements to the top of Corobot.cs as described in the preceding section:

    #region Simulation namespaces
    using Microsoft.Robotics.Simulation;
    using Microsoft.Robotics.Simulation.Engine;
    using engineproxy = Microsoft.Robotics.Simulation.Engine.Proxy;
    using Microsoft.Robotics.Simulation.Physics;
    using Microsoft.Robotics.PhysicalModel;
    using xna = Microsoft.Xna.Framework;
    using xnagrfx = Microsoft.Xna.Framework.Graphics;
    #endregion
  2. Now add references to the following DLLs. Remember that the XNA DLL has to be handled specially according to the instructions provided in the section "Microsoft.Xna.Framework.DLL," earlier in this chapter.

    Microsoft.Xna.Framework
    PhysicsEngine
    RoboticsCommon
    SimulationCommon
    SimulationEngine
    SimulationEngine.proxy
  3. After you have added a reference to a new external DLL, select the DLL so that its properties are displayed. Set the Copy Local and Specific Version properties to False. Build the service again to ensure that no mistakes were made. Change the Description attribute, which precedes the CorobotService class, to something more descriptive if you like.

  4. At the top of the CorobotService class, add the following partner specification:

    [Partner("Engine",
    Contract = engineproxy.Contract.Identifier,
    CreationPolicy = PartnerCreationPolicy.UseExistingOrCreate)]
    private engineproxy.SimulationEnginePort _engineStub =
        new engineproxy.SimulationEnginePort();

    This identifies the SimulationEngine service as a partner to the Corobot service and instructs DSS to start this service if it isn't already running. At this point, when you run the service, it will at least start up the simulation engine even though the simulation environment is empty.

  5. Now you'll add code to the Start method to set up the simulation environment. The first thing to do is define the Main Camera's initial view. Insert the following code after the call to base.Start. This code defines a CameraView message, sets its EyePosition and LookAtPoint, and then sends it to the SimulationEngine service. This is all pretty straightforward until the last line, where you post the CameraView message to SimulationEngine.GlobalInstancePort. The SimulationEngine service holds a static global pointer to its own main operations port. It is often convenient for other simulation services to interact with the SimulationEngine through this port.

    // MainCamera initial view
    CameraView view = new CameraView();
    view.EyePosition = new Vector3(-1.65f, 1.63f, -0.29f);
    view.LookAtPoint = new Vector3(0, 0, 0);
    SimulationEngine.GlobalInstancePort.Update(view);
  6. Adding a sky to the simulation environment is as easy as adding the following two lines of code:

    // Add a SkyDome.
    SkyDomeEntity sky = new SkyDomeEntity("skydome.dds", "sky_diff.dds");
    SimulationEngine.GlobalInstancePort.Insert(sky);

    Here, you create a new SkyDomeEntity and specify a visual texture of "skydome.dds" and a lighting texture of "sky_diff.dds". When the entity has been created, you insert it into the simulation environment.

  7. Now you'll add a directional light to simulate the sun. Without this light source, entities in the simulation environment are only lit by ambient light and by the light from the SkyDome entity.

    LightSourceEntity sun = new LightSourceEntity();
    sun.State.Name = "Sun";
    sun.Type = LightSourceEntityType.Directional;
    sun.Color = new Vector4(0.8f, 0.8f, 0.8f, 1);
    sun.Direction = new Vector3(0.5f, -.75f, 0.5f);
    SimulationEngine.GlobalInstancePort.Insert(sun);
  8. The next step is to add a ground entity. If you forget to add this entity, your other simulation entities will fall into nothingness as soon as they are inserted in the environment. The ground is a good thing.

    // create a large horizontal plane, at zero elevation.
    HeightFieldEntity ground = new HeightFieldEntity(
        "Ground", // name
        "Gravel.dds", // texture image
        new MaterialProperties("ground",
            0.2f, // restitution
            0.5f, // dynamic friction
            0.5f) // static friction
        );
    SimulationEngine.GlobalInstancePort.Insert(ground);

    Here, you create a HeightFieldEntity with zero-elevation height specified for each height field sample, which gives you a nice flat ground plane. Notice that you're using a gravel texture for the ground plane instead of the infinite carpet texture that most of the MRDS Simulation Tutorials use. There is something vaguely unsettling about a world of infinite carpet. You also specify a material for the ground, consisting of restitution and dynamic and static friction values, as described in Chapter 5.

  9. Finally, add a giant box to the environment. Giant boxes can be handy. This particular giant box is positioned so that its center is exactly 29 inches + 1 meter in the +Z direction from the origin of the simulation world. The origin of all SingleShapeEntities is at their center, so you must specify a Y offset equal to half the height of the box to keep the box from being partially buried in the ground. You might surmise that the reason the box is offset 29 inches + 1 meter in the +Z direction is that you really want its edge to be about 29 inches from the origin. This will be useful later.

Vector3 dimensions = new Vector3(2f, 2f, 2f); // meters
SingleShapeEntity box = new SingleShapeEntity(
    new BoxShape(
        new BoxShapeProperties(
            100, // mass in kilograms
            new Pose(), // relative pose
            dimensions)), // dimensions
    new Vector3(0, 1f, 29f * 2.54f / 100f + 1f)); // initial position

// Name the entity. All entities must have unique names
box.State.Name = "box";

// Insert entity in simulation.
SimulationEngine.GlobalInstancePort.Insert(box);

Let's see—you've got a sky, a ground, and a giant box. Build and run your service and verify that it looks something like what is shown in Figure 6-3.

Figure 6-3


If the scene does not look like you expect, you can start the Simulation Editor by pressing F5 and then examine the list of entities in the upper-right pane. If any of the entities have a red exclamation point next to them, they did not initialize properly for some reason. This is often due to a missing texture or mesh file. Select the entity with a problem and look at the InitError property to find more information about the error.

If your entity fails to show up in the list at all, the simulation engine could not create it. This is often because you have forgotten to specify an entity name or you have duplicated a name already in use. Examine the DssHost window to see whether it contains any error messages that can shed some light on the problem.

Congratulations! You have just created your first simulation environment. Now let's build a robot. If you ran into any trouble building your service or if your results are different, compare your Corobot.cs file with the corresponding file in the Chapter6 directory.

Defining a Custom Robot Entity

Granted, a world of infinite gravel and a giant box is not very exciting, so let's define a new robot entity to move some of that gravel around (Figure 6-4 shows the CoroWare Corobot in the real world). In this section, you'll model a new robot entity similar to the Corobot manufactured by CoroWare. You can find more information about this robot at www.corobot.net.

Figure 6-4


The features of this robot that are most interesting in the simulator are its four-wheel differential drive, its front and rear infrared distance sensors, and its front-mounted camera.

The first step in creating this robot in the simulator is to define a new class that inherits from VisualEntity as follows:

/// <summary>
/// An entity which represents a Corobot. This entity is created facing
/// in the +Z direction meaning that its front has positive Z
/// coordinates. Front and Rear IR sensors are included along with
/// a front-facing camera.
/// </summary>
[DataContract]
[DataMemberConstructor]
public class CorobotEntity : VisualEntity
{
}

The [DataContract] attribute indicates that this entity has state that needs to be serialized when it is sent across nodes. The [DataMemberConstructor] attribute indicates that the nondefault constructor defined in the class should also appear in the generated proxy class. All of the other two-wheel differential drive entities in the simulator inherit from DifferentialDriveEntity. This entity will be different enough that it makes sense to incorporate the functionality of DifferentialDriveEntity without subclassing it.

The next thing you need to do is define constructors for this entity. Every simulation entity must have a default constructor that has no parameters. This constructor is called when the entity is deserialized, which occurs when the entity is inserted into the simulator from a remote node or when an entity is pasted into the environment using the Simulation Editor. It also occurs when an entire scene is loaded into the simulator. Here are the Corobot constructors:

/// <summary>
/// Default constructor used when this entity is deserialized
/// </summary>
public CorobotEntity()
{
}

/// <summary>
/// Initialization constructor used when this entity is built
/// programmatically.
/// </summary>
/// <param name="name">the name of the entity</param>
/// <param name="initialPos">the initial position of the entity</param>
public CorobotEntity(string name, Vector3 initialPos)
{
    base.State.Name = name;
    base.State.Pose.Position = initialPos;
}

The nondefault constructor enables the entity name and initial position to be specified. The only action the constructor takes is to modify its state with the passed parameters. The parameters do not need to be initialized when the default constructor is called during deserialization because the entity state will be restored to the value it had when the entity was serialized.

Next, you need to define some dimensions. Add the following code to the top of the CorobotEntity class:

private static float InchesToMeters(float inches)
{
    return (float)(inches * 2.54 / 100.0);
}

static float mass = 3.63f; // kg
static float chassisClearance = InchesToMeters(1.5f);
static float wheelGap = InchesToMeters(3f / 8f);
static float wheelWidth = InchesToMeters(2.2f);
static float wheelDiameter = InchesToMeters(4.75f);
static float wheelMass = 0.1f; // kg
static float platformClearance = InchesToMeters(5.75f);

static Vector3 platformDimensions = new Vector3(
    InchesToMeters(11.0f),
    InchesToMeters(3.0f),
    InchesToMeters(8.5f));
static Vector3 chassisDimensions = new Vector3(
    platformDimensions.X - 2 * wheelWidth - 2 * wheelGap,
    InchesToMeters(2.5f),
    platformDimensions.Z);
static Vector3 wheelFRPosition = new Vector3(
    chassisDimensions.X / 2.0f + wheelGap + wheelWidth / 2.0f,
    wheelDiameter / 2.0f,
    -InchesToMeters(5.75f - 2.125f));
static Vector3 wheelFLPosition = new Vector3(
    -wheelFRPosition.X,
    wheelFRPosition.Y,
    wheelFRPosition.Z);
static Vector3 wheelRRPosition = new Vector3(
    wheelFRPosition.X,
    wheelFRPosition.Y,
    -wheelFRPosition.Z);
static Vector3 wheelRLPosition = new Vector3(
    -wheelFRPosition.X,
    wheelFRPosition.Y,
    -wheelFRPosition.Z);

The motor box is called the chassis and the upper box containing the processor is called the platform. The height of the chassis above the ground is defined by chassisClearance, and the height of the platform is defined by platformClearance. The width of the chassis is calculated to be just wide enough to allow the outer edge of the wheels to be even with the sides of the platform. The position of the front right wheel is calculated and the positions of the other three wheels are derived from it. This gives us enough information to specify the basic physics shapes that make up the physics model of the Corobot.
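
As a quick sanity check on that derivation, you can mirror the arithmetic in a standalone program. This sketch simply restates the constants above; it is not part of the entity class:

```csharp
using System;

class DimensionCheck
{
    static float InchesToMeters(float inches) => inches * 2.54f / 100f;

    static void Main()
    {
        float wheelWidth = InchesToMeters(2.2f);
        float wheelGap = InchesToMeters(3f / 8f);
        float platformX = InchesToMeters(11.0f);

        // the chassis leaves room for one wheel and one gap on each side
        float chassisX = platformX - 2 * wheelWidth - 2 * wheelGap;

        // the wheel center sits a gap plus half a wheel outside the chassis
        float wheelCenterX = chassisX / 2f + wheelGap + wheelWidth / 2f;
        float wheelOuterEdge = wheelCenterX + wheelWidth / 2f;

        // the outer edge of the wheel lines up with the platform edge
        Console.WriteLine(Math.Abs(wheelOuterEdge - platformX / 2f) < 1e-6f); // prints True
    }
}
```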

The wheels are more than just shapes—they are full entities with their own meshes and physics shapes. You need a place to store all four wheel entities when you create them. Add this code just before the constructors:

// instance variables
WheelEntity _wheelFR;
WheelEntity _wheelFL;
WheelEntity _wheelRR;
WheelEntity _wheelRL;

[Category("Wheels")]
[DataMember]
public WheelEntity FrontRightWheel
{
    get { return _wheelFR; }
    set { _wheelFR = value; }
}

[Category("Wheels")]
[DataMember]
public WheelEntity FrontLeftWheel
{
    get { return _wheelFL; }
    set { _wheelFL = value; }
}

[Category("Wheels")]
[DataMember]
public WheelEntity RearRightWheel
{
    get { return _wheelRR; }
    set { _wheelRR = value; }
}
[Category("Wheels")]
[DataMember]
public WheelEntity RearLeftWheel
{
    get { return _wheelRL; }
    set { _wheelRL = value; }
}

The Category attribute just groups these properties together so that they are better organized in the Simulation Editor view. Each WheelEntity property has the [DataMember] attribute. This attribute tells the proxy generator that this is a property that must be serialized when the entity is sent to a remote node or when it is saved to disk. Only public properties can be marked with this attribute. You'll test the entity later to ensure that it can be properly serialized and deserialized.

These wheel entities and chassis and platform shapes are created in the Initialize method. Add the following code to override the base class Initialize method:

public override void Initialize(
    xnagrfx.GraphicsDevice device,
    PhysicsEngine physicsEngine)
{
    try
    {
        // chassis
        BoxShapeProperties chassisDesc = new BoxShapeProperties(
            "chassis",
            mass / 2.0f,
            new Pose(new Vector3(
                0,
                chassisClearance + chassisDimensions.Y / 2.0f,
                0)),
            chassisDimensions);

        chassisDesc.Material =
            new MaterialProperties("chassisMaterial", 0.0f, 0.5f, 0.5f);

        BoxShape chassis = new BoxShape(chassisDesc);
        chassis.State.Name = "ChassisShape";
        base.State.PhysicsPrimitives.Add(chassis);

        // platform
        BoxShapeProperties platformDesc = new BoxShapeProperties(
            "platform",
            mass / 2.0f,
            new Pose(new Vector3(
                0,
                platformClearance + platformDimensions.Y / 2.0f,
                0)),
            platformDimensions);
        platformDesc.Material = chassisDesc.Material;
        BoxShape platform = new BoxShape(platformDesc);
        platform.State.Name = "PlatformShape";
        base.State.PhysicsPrimitives.Add(platform);

The first thing the Initialize method does is create two box shapes to represent the chassis and the platform. The mass of both objects is assumed to be the same, so the total mass is split between them. They are given the dimensions and position specified. These positions are relative to the origin of the entity. Both box shapes are given the same material definition, which specifies a restitution of 0 and mid-range static and dynamic friction. After each shape is created, it is added to the PhysicsPrimitives list:

        base.CreateAndInsertPhysicsEntity(physicsEngine);
        base.PhysicsEntity.SolverIterationCount = 128;

When the physics entity is created, it passes all of the objects in the PhysicsPrimitives list to the physics engine, which creates its own representation for them based on the attributes specified in the BoxShape objects. The SolverIterationCount determines how many iterations the physics engine will use to resolve constraints on the entity, such as contact points or joints. The higher the count, the more accurate the results become. A value of 128 is probably overkill because the AGEIA documentation states that the default value is 4 and AGEIA developers have never needed to use a value higher than 30. This code follows the example of the DifferentialDriveEntity in the SDK in setting this value to 128:

        // Wheels
        WheelShapeProperties wheelFRprop = new WheelShapeProperties(
            "FrontRightWheel", wheelMass, wheelDiameter / 2.0f);
        WheelShapeProperties wheelFLprop = new WheelShapeProperties(
            "FrontLeftWheel", wheelMass, wheelDiameter / 2.0f);
        WheelShapeProperties wheelRRprop = new WheelShapeProperties(
            "RearRightWheel", wheelMass, wheelDiameter / 2.0f);
        WheelShapeProperties wheelRLprop = new WheelShapeProperties(
            "RearLeftWheel", wheelMass, wheelDiameter / 2.0f);

        wheelFRprop.Flags |= WheelShapeBehavior.OverrideAxleSpeed;
        wheelFLprop.Flags |= WheelShapeBehavior.OverrideAxleSpeed;
        wheelRRprop.Flags |= WheelShapeBehavior.OverrideAxleSpeed;
        wheelRLprop.Flags |= WheelShapeBehavior.OverrideAxleSpeed;

        wheelFRprop.InnerRadius = 0.7f * wheelDiameter / 2.0f;
        wheelFLprop.InnerRadius = 0.7f * wheelDiameter / 2.0f;
        wheelRRprop.InnerRadius = 0.7f * wheelDiameter / 2.0f;
        wheelRLprop.InnerRadius = 0.7f * wheelDiameter / 2.0f;

        wheelFRprop.LocalPose = new Pose(wheelFRPosition);
        wheelFLprop.LocalPose = new Pose(wheelFLPosition);
        wheelRRprop.LocalPose = new Pose(wheelRRPosition);
        wheelRLprop.LocalPose = new Pose(wheelRLPosition);

Next, the WheelShapeProperties for each wheel are specified. The OverrideAxleSpeed flag tells the physics engine not to calculate the axle speed based on motor torque and friction. Instead, you specify the axle speed each frame. This turns out to be a better way to simulate the types of motors that typically drive wheeled robots. The LocalPose of each shape is useful when an entity contains multiple shapes. The LocalPose specifies the position and orientation of each shape relative to the origin of the entity. In the preceding code, each pose is initialized with a position vector, and the orientation part of the pose defaults to the identity quaternion (no rotation).

        _wheelFR = new WheelEntity(wheelFRprop);
        _wheelFR.State.Name = base.State.Name + " FrontRightWheel";
        _wheelFR.Parent = this;
        _wheelFR.Initialize(device, physicsEngine);

        _wheelFL = new WheelEntity(wheelFLprop);
        _wheelFL.State.Name = base.State.Name + " FrontLeftWheel";
        _wheelFL.Parent = this;
        _wheelFL.Initialize(device, physicsEngine);

        _wheelRR = new WheelEntity(wheelRRprop);
        _wheelRR.State.Name = base.State.Name + " RearRightWheel";
        _wheelRR.Parent = this;
        _wheelRR.Initialize(device, physicsEngine);

        _wheelRL = new WheelEntity(wheelRLprop);
        _wheelRL.State.Name = base.State.Name + " RearLeftWheel";
        _wheelRL.Parent = this;
        _wheelRL.Initialize(device, physicsEngine);

After the WheelShapeProperties have been initialized, you can create the WheelEntities. Each WheelEntity is given a name based on the parent entity name.

The parent reference of each WheelEntity is set to the CorobotEntity. The WheelEntity uses this reference in an unusual way. You can see this code in the Initialize method of the WheelEntity in samples\entities\entities.cs. Instead of calling CreateAndInsertPhysicsEntity, the WheelEntity calls InsertShape on its parent's PhysicsEntity. This adds the WheelEntity shape to the set of shapes that makes up the parent. As far as the physics engine is concerned, the wheel shapes are just part of the parent entity. This reduces the amount of computation required by the physics engine because it doesn't have to calculate the interactions between the wheels and the chassis and platform shapes. It assumes that they are rigidly joined.

You must explicitly call the Initialize method for each WheelEntity because these entities have not been inserted into the parent entity as children using the InsertEntity method.

Finally, you call the base.Initialize method, which, among other things, loads any meshes or textures associated with the entity. If no mesh was specified in State.Assets.Mesh, simple meshes are constructed from the shapes in the entity.
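
One caution: the excerpts above opened a try block at the top of Initialize that is never closed in the listings shown. A minimal sketch of how the method might end appears below. The base.Initialize call is the one described in the preceding paragraph, and the catch clause follows the error-reporting pattern used by the SDK's entities (HasBeenInitialized and InitError are assumed to behave as they do in the SDK's VisualEntity):

```csharp
        // load meshes and textures, or build meshes from the shapes
        base.Initialize(device, physicsEngine);
    }
    catch (Exception ex)
    {
        // record the failure so that it appears in the Simulation
        // Editor as the InitError property described earlier
        HasBeenInitialized = false;
        InitError = ex.ToString();
    }
}
```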

Now that you have the Initialize method completed, only one step remains to add the Corobot entity to the simulation environment. Add the following lines of code to the Start method, just after the code that inserts the giant box into the simulation environment:

// create a Corobot
SimulationEngine.GlobalInstancePort.Insert(
    new CorobotEntity("Corobot", new Vector3(0, 0, 0)));

This creates a new CorobotEntity with the name "Corobot" at the simulation origin. Compile and run the service. You should see something similar to what is shown in Figure 6-5.

Figure 6-5


You can test your new entity by going into the Simulation Editor and selecting it by holding down the Ctrl key while pressing the right mouse button with the mouse pointer on the entity. You can examine the fields in the EntityState and each of the WheelEntities to verify they are correct. You can even press Ctrl+X and then Ctrl+V to cut and paste the entity back into the simulator to verify that serialization and deserialization of the entity work properly.

You still need to add some properties to the entity to make it compatible with the DifferentialDriveEntity. The MotorTorqueScaling property scales the speed of the motors to model the gear ratio on the physical robot. The IsEnabled property allows other services to enable or disable the drive. Add the following properties to the CorobotEntity class:

bool _isEnabled;
/// <summary>
/// True if drive mechanism is enabled
/// </summary>
[DataMember]
[Description("True if the drive mechanism is enabled.")]
public bool IsEnabled
{
    get { return _isEnabled; }
    set { _isEnabled = value; }
}

float _motorTorqueScaling;
/// <summary>
/// Scaling factor to apply to motor torque requests
/// </summary>
[DataMember]
[Description("Scaling factor to apply to motor torque requests.")]
public float MotorTorqueScaling
{
    get { return _motorTorqueScaling; }
    set { _motorTorqueScaling = value; }
}

Next, you initialize the MotorTorqueScaling in the nondefault constructor. The value set here is an arbitrary value. You'll reexamine this value later in the section "Tuning MotorTorqueScaling."

_motorTorqueScaling = 20f;

The CurrentHeading property simulates a compass sensor. It returns the heading of the entity based on State.Pose:

public float CurrentHeading
{
    get
    {
        // convert the orientation quaternion to Euler angles (in degrees)
        xna.Vector3 euler = UIMath.QuaternionToEuler(State.Pose.Orientation);
        // heading is the rotation about the Y axis, converted to radians
        return xna.MathHelper.ToRadians(euler.Y);
    }
}
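
To make the heading computation concrete, here is a standalone sketch of the same idea using plain math instead of the UIMath helper: build the quaternion for a 90-degree rotation about the Y axis and recover the yaw angle from it. (This simple recovery works only for pure Y-axis rotations; the UIMath helper handles the general case.)

```csharp
using System;

class HeadingSketch
{
    static void Main()
    {
        // a rotation of angle theta about the Y axis has the quaternion
        // (X, Y, Z, W) = (0, sin(theta / 2), 0, cos(theta / 2))
        double theta = Math.PI / 2; // 90 degrees
        double y = Math.Sin(theta / 2);
        double w = Math.Cos(theta / 2);

        // recover the yaw (rotation about Y) from the quaternion
        double yaw = 2 * Math.Atan2(y, w);

        Console.WriteLine(Math.Abs(yaw - theta) < 1e-12); // prints True
    }
}
```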

The next few methods enable explicit control of the speed of each motor by setting the left and right target velocities. If one of these methods is called, then any current DriveDistance or RotateDegrees commands are terminated with an error:

float _leftTargetVelocity;
float _rightTargetVelocity;
public void SetMotorTorque(float leftWheel, float rightWheel)
{
    ResetRotationAndDistance();
    SetAxleVelocity(
        leftWheel * _motorTorqueScaling,
        rightWheel * _motorTorqueScaling);
}

public void SetVelocity(float value)
{
    ResetRotationAndDistance();
    SetVelocity(value, value);
}

/// <summary>
/// Sets angular velocity on the wheels
/// </summary>
/// <param name="left"></param>
/// <param name="right"></param>
public void SetVelocity(float left, float right)
{
    ResetRotationAndDistance();
    if (_wheelFR == null || _wheelFL == null)
        return;

    left = ValidateWheelVelocity(left);
    right = ValidateWheelVelocity(right);

    // v is in m/sec - convert to an axle speed
    // 2Pi(V/2PiR) = V/R
    SetAxleVelocity(
        left / _wheelFR.Wheel.State.Radius,
        right / _wheelFL.Wheel.State.Radius);
}

private void SetAxleVelocity(float left, float right)
{
    _leftTargetVelocity = left;
    _rightTargetVelocity = right;
}

const float MAX_VELOCITY = 20.0f;
const float MIN_VELOCITY = -MAX_VELOCITY;

float ValidateWheelVelocity(float value)
{
    if (value > MAX_VELOCITY)
        return MAX_VELOCITY;
    if (value < MIN_VELOCITY)
        return MIN_VELOCITY;

    return value;
}

All that remains are the DriveDistance and RotateDegrees methods and their associated helper functions and variables:

Pose _startPoseForDriveDistance;
double _distanceToTravel;
SuccessFailurePort _driveDistancePort = null;

public void DriveDistance(
    float distance,
    float power,
    SuccessFailurePort responsePort)
{
    // reset drivedistance or rotatedegrees commands not yet completed
    ResetRotationAndDistance();

    // keep track of the response port for when we complete the request
    _driveDistancePort = responsePort;

    // handle negative distances
    if (distance < 0)
    {
        distance = -distance;
        power = -power;
    }
    _startPoseForDriveDistance = State.Pose;
    _distanceToTravel = distance;
    SetAxleVelocity(
        power * _motorTorqueScaling,
        power * _motorTorqueScaling);
}

// DriveDistance and RotateDegrees variables
Queue<double> progressPoints = new Queue<double>();
const int averageKernel = 6;
const int decelerateThreshold = 6;
const float twoPI = (float)(2 * Math.PI);

// RotateDegrees variables
double _targetRotation = double.MaxValue;
double _currentRotation = 0;
double _previousHeading = 0;
const float acceptableRotationError = 0.005f;
SuccessFailurePort _rotateDegreesPort = null;

public void RotateDegrees(
    float degrees,
    float power,
    SuccessFailurePort responsePort)
{
    // reset drivedistance or rotatedegrees commands not yet completed
    ResetRotationAndDistance();

    // keep track of the response port for when we complete the request
    _rotateDegreesPort = responsePort;

    _targetRotation = xna.MathHelper.ToRadians(degrees);
    _currentRotation = 0;
    _previousHeading = CurrentHeading;

    if (degrees < 0)
        SetAxleVelocity(
            power * _motorTorqueScaling,
            -power * _motorTorqueScaling);
    else
        SetAxleVelocity(
            -power * _motorTorqueScaling,
            power * _motorTorqueScaling);
}

void ResetRotationAndDistance()
{
    progressPoints.Clear();
    _distanceToTravel = 0;
    _targetRotation = double.MaxValue;
    if (_driveDistancePort != null)
    {
        _driveDistancePort.Post(
            new Exception("Request superseded prior to completion."));
        _driveDistancePort = null;
    }
    if (_rotateDegreesPort != null)
    {
        _rotateDegreesPort.Post(
            new Exception("Request superseded prior to completion."));
        _rotateDegreesPort = null;
    }
}

The Drive Methods

The DifferentialDriveEntity provides several methods that drive its two wheels. Your entity needs to implement these same methods with four wheels in mind. The DifferentialDriveEntity keeps track of a target velocity for each wheel. Each frame, the axle speed of each wheel is adjusted according to this target velocity. The Update method also adjusts the target speed of each wheel if a DriveDistance or RotateDegrees command is currently being executed. It attempts to slow the wheels as the target distance or heading approaches so that the target is not overshot.

Add the following Update method override to the CorobotEntity class:

const float SPEED_DELTA = 0.5f;
public override void Update(FrameUpdate update)
{
    // update state from the physics engine
    PhysicsEntity.UpdateState(true);

This call updates the entity state Pose and Velocity from the physics engine:

    if (_distanceToTravel > 0)
    {
        // DriveDistance update
        double currentDistance =
            Vector3.Length(State.Pose.Position -
                           _startPoseForDriveDistance.Position);
        if (currentDistance >= _distanceToTravel)
        {
            _wheelFR.Wheel.AxleSpeed = 0;
            _wheelFL.Wheel.AxleSpeed = 0;
            _wheelRR.Wheel.AxleSpeed = 0;
            _wheelRL.Wheel.AxleSpeed = 0;
            _leftTargetVelocity = 0;
            _rightTargetVelocity = 0;
            _distanceToTravel = 0;
            // now that we're finished, post a response
            if (_driveDistancePort != null)
            {
                SuccessFailurePort tmp = _driveDistancePort;
                _driveDistancePort = null;
                tmp.Post(new SuccessResult());
            }
        }
        else
        {
            // need to drive further, check if we should slow down
            if (progressPoints.Count >= averageKernel)
            {
                double distanceRemaining =
                    _distanceToTravel - currentDistance;
                double framesToCompletion =
                    distanceRemaining * averageKernel /
                    (currentDistance - progressPoints.Dequeue());
                if (framesToCompletion < decelerateThreshold)
                {
                    _leftTargetVelocity *= 0.5f;
                    _rightTargetVelocity *= 0.5f;
                    progressPoints.Clear();
                }
            }
            progressPoints.Enqueue(currentDistance);
        }
    }

The preceding code handles the behavior of the entity while a DriveDistance command is being executed. Suffice it to say that the code attempts to slow the entity as it approaches the distance goal so that it doesn't overshoot it. This is a better implementation than the one found in the DifferentialDriveEntity, but there is still room for improvement:

    else if (_targetRotation != double.MaxValue)
    {
        // RotateDegrees update
        float currentHeading = CurrentHeading;
        double angleDelta = currentHeading - _previousHeading;
        while (angleDelta > Math.PI)
            angleDelta -= twoPI;
        while (angleDelta <= -Math.PI)
            angleDelta += twoPI;
        _currentRotation += angleDelta;
        _previousHeading = currentHeading;  // for next frame

        float angleError;
        if (_targetRotation < 0)
            angleError = (float)(_currentRotation - _targetRotation);
        else
            angleError = (float)(_targetRotation - _currentRotation);

        if (angleError < acceptableRotationError)
        {
            // current heading is within acceptableError or has overshot
            // end the rotation
            _targetRotation = double.MaxValue;
            _wheelFR.Wheel.AxleSpeed = 0;
            _wheelFL.Wheel.AxleSpeed = 0;
            _wheelRR.Wheel.AxleSpeed = 0;
            _wheelRL.Wheel.AxleSpeed = 0;
            _leftTargetVelocity = 0;
            _rightTargetVelocity = 0;
            // now that we're finished, post a response
            if (_rotateDegreesPort != null)
            {
                SuccessFailurePort tmp = _rotateDegreesPort;
                _rotateDegreesPort = null;
                tmp.Post(new SuccessResult());
            }
        }
        else
        {
            if (angleDelta != 0)
            {
                // need to turn more, check if we should slow down
                if (progressPoints.Count >= averageKernel)
                {
                    double framesToCompletion =
                        Math.Abs(angleError * averageKernel /
                                 (_currentRotation - progressPoints.Dequeue()));
                    if (framesToCompletion < decelerateThreshold)
                    {
                        _leftTargetVelocity *= 0.5f;
                        _rightTargetVelocity *= 0.5f;
                        progressPoints.Clear();
                    }
                }
                progressPoints.Enqueue(_currentRotation);
            }
        }
    }

The preceding code handles the RotateDegrees command. Just like DriveDistance, it attempts to slow the rotation of the entity as the target heading approaches. The next section of code ramps the front wheel axle speeds toward the target velocities:

    float left = _wheelFL.Wheel.AxleSpeed + _leftTargetVelocity;
    float right = _wheelFR.Wheel.AxleSpeed + _rightTargetVelocity;

    if (Math.Abs(left) > 0.1)
    {
        if (left > 0)
            _wheelFL.Wheel.AxleSpeed -= SPEED_DELTA;
        else
            _wheelFL.Wheel.AxleSpeed += SPEED_DELTA;
    }
    if (Math.Abs(right) > 0.1)
    {
        if (right > 0)
            _wheelFR.Wheel.AxleSpeed -= SPEED_DELTA;
        else
            _wheelFR.Wheel.AxleSpeed += SPEED_DELTA;
    }

The AxleSpeed is the negative of the target velocity. When the two are nearly equal, left and right will be close to zero. If they are not nearly equal, then the axle speed is adjusted by SPEED_DELTA.
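
You can watch this ramp converge in isolation with a small standalone trace. This sketch simply replays the adjustment logic outside the entity; it is not part of the chapter's code:

```csharp
using System;

class RampTrace
{
    const float SPEED_DELTA = 0.5f;

    static void Main()
    {
        float axleSpeed = 0f;      // current wheel axle speed
        float targetVelocity = 5f; // stands in for _leftTargetVelocity
        int frames = 0;

        // repeat the per-frame adjustment until the axle speed settles
        while (Math.Abs(axleSpeed + targetVelocity) > 0.1f)
        {
            if (axleSpeed + targetVelocity > 0)
                axleSpeed -= SPEED_DELTA;
            else
                axleSpeed += SPEED_DELTA;
            frames++;
        }

        // the axle speed converges to the negative of the target velocity
        Console.WriteLine($"{axleSpeed} after {frames} frames"); // prints "-5 after 10 frames"
    }
}
```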

Here the four-wheel drive is implemented: the rear wheels are given the same axle speeds as the front wheels:

    // match the rear wheels with the front wheels
    _wheelRL.Wheel.AxleSpeed = _wheelFL.Wheel.AxleSpeed;
    _wheelRR.Wheel.AxleSpeed = _wheelFR.Wheel.AxleSpeed;

Finally, the Update method for the wheel entities is called, along with the base Update method:

    // update entities in fields
    _wheelFL.Update(update);
    _wheelFR.Update(update);
    _wheelRL.Update(update);
    _wheelRR.Update(update);

    // sim engine will update children
    base.Update(update);
}

The SimulatedQuadDifferentialDrive Service

Now you know how to create a custom simulation entity, but let's face it: it's pretty boring. It just sits there doing nothing. Wouldn't it be great to be able to drive it around in the simulation environment? That is the topic of this next section.

A SimulatedDifferentialDrive service is provided with the MRDS SDK. It can be controlled with the SimpleDashboard service and it drives two-wheeled robots that subclass the DifferentialDriveEntity class. However, you have a four-wheeled robot that doesn't use the DifferentialDriveEntity class, so you are going to build a custom service that supports the same generic drive contract, enabling you to use the SimpleDashboard to control it.

Just as you previously used DssNewService.exe to create the Corobot service, you now use it to create your SimulatedQuadDifferentialDrive service. However, there is a twist this time. You're going to tell DssNewService that you want the new service to support the generic drive contract as an alternate contract. This means that your service will support two different ports, each identified with a different contract. The alternate port will look just like the port that the SimulatedDifferentialDrive service supports.

Go back to the MyChapter6 directory and use the following command line to create the SimulatedQuadDifferentialDrive service. Each command-line option to dssnewservice is shown on a separate line here because of word wrap, but you should type the entire command on one line:

C:\Microsoft Robotics Studio (1.5)\ProMRDS\MyChapter6>dssnewservice
/s:SimulatedQuadDifferentialDrive
/i:"Microsoft Robotics Studio (1.5)\bin\RoboticsCommon.dll"
/alt:"http://schemas.microsoft.com/robotics/2006/05/drive.html"
/Namespace:"ProMRDS.Simulation.QuadDifferentialDrive"
/year:"2007" /month:"07"

The /s parameter names the new service. The /alt parameter specifies the contract that you want to implement as an alternate contract, and the /i parameter specifies where that contract is implemented. You use the /year and /month parameters to ensure that the contract for this service will be the same as the SimulatedQuadDifferentialDrive service in the Chapter6 directory.

Take a moment to look at the service code that has been generated. DssNewService has generated a service called SimulatedQuadDifferentialDrive that supports an alternate contract identified by "http://schemas.microsoft.com/robotics/2006/05/drive.html". This is the generic drive contract you specified on the command line. It has defined a _mainPort of type pxdrive.DriveOperations and a service state of type pxdrive.DriveDifferentialTwoWheelState. You're going to shuffle things a bit because you want to support two ports: an alternate port that supports the generic drive contract, and the main port that supports the new SimulatedQuadDifferentialDrive contract.

Begin modifying SimulatedQuadDifferentialDrive.cs by replacing the using statements at the top with the following to prepare you to access the simulator and the Corobot entity you just defined:

using Microsoft.Ccr.Core;
using Microsoft.Dss.Core;
using Microsoft.Dss.Core.Attributes;
using Microsoft.Dss.ServiceModel.Dssp;
using Microsoft.Dss.ServiceModel.DsspServiceBase;
using Microsoft.Dss.Services.SubscriptionManager;
using System;
using W3C.Soap;
using System.Collections.Generic;
using dssphttp = Microsoft.Dss.Core.DsspHttp;
using pxdrive = Microsoft.Robotics.Services.Drive.Proxy;
using xml = System.Xml;
using xna = Microsoft.Xna.Framework;
using submgr = Microsoft.Dss.Services.SubscriptionManager;
using simtypes = Microsoft.Robotics.Simulation;
using simengine = Microsoft.Robotics.Simulation.Engine;
using physics = Microsoft.Robotics.Simulation.Physics;
using corobot = ProMRDS.Simulation.Corobot;
using Microsoft.Robotics.PhysicalModel;

The new project will already have a reference to RoboticsCommon.Proxy.dll because it is referencing the generic drive contract from that DLL. You'll also need to add references to the following DLLs. Don't forget to edit the properties of each DLL you add to set Copy Local and Specific Version to False:

Corobot.Y2007.M07.dll
Microsoft.Xna.Framework.dll
PhysicsEngine.dll
RoboticsCommon.dll
SimulationCommon.dll
SimulationEngine.dll

You may be wondering why you need a reference to RoboticsCommon.dll when the project already has a reference to RoboticsCommon.proxy.dll. The definition of the generic drive contract is drawn from the proxy DLL, and the common types such as Vector3 are drawn from RoboticsCommon.dll.

Now you add a port to receive the generic drive commands:

// Port for receiving generic differential drive commands
[AlternateServicePort(
    AllowMultipleInstances = true,
    AlternateContract = pxdrive.Contract.Identifier)]
private pxdrive.DriveOperations _diffDrivePort =
    new pxdrive.DriveOperations();

Change the definition of _mainPort to support the extended quad drive operations:

/// <summary>
/// Main service port for quad drive commands
/// </summary>
[ServicePort("/simulatedquaddifferentialdrive",
    AllowMultipleInstances = true)]
private QuadDriveOperations _mainPort = new QuadDriveOperations();

You also need to extend the state for this service because you need to keep track of four wheels now, rather than two. Replace the _state declaration with the following:

[InitialStatePartner(
    Optional = true,
    ServiceUri = "SimulatedQuadDifferentialDriveService.Config.xml")]
private DriveDifferentialFourWheelState _state =
    new DriveDifferentialFourWheelState();

You'll also add the following definition of this class in SimulatedQuadDifferentialDriveTypes.cs just after the Contract class definition:

public class DriveDifferentialFourWheelState :
    pxdrive.DriveDifferentialTwoWheelState
{
    private pxmotor.WheeledMotorState _rearLeftWheel;
    private pxmotor.WheeledMotorState _rearRightWheel;
    private Vector3 _position;

    [Description("The rear left wheel's state.")]
    [DataMember]
    public pxmotor.WheeledMotorState RearLeftWheel
    {
        get { return _rearLeftWheel; }
        set { _rearLeftWheel = value; }
    }
    [DataMember]
    [Description("The rear right wheel's state.")]
    public pxmotor.WheeledMotorState RearRightWheel
    {
        get { return _rearRightWheel; }
        set { _rearRightWheel = value; }
    }
    [DataMember]
    [Description("The current position of the entity.")]
    public Vector3 Position
    {
        get { return _position; }
        set { _position = value; }
    }
}

DriveDifferentialFourWheelState inherits from pxdrive.DriveDifferentialTwoWheelState, so all you need to do is add a definition for the rear wheels. You'll also add the current position of the entity, which you'll use as a crude way to simulate a GPS system later.

As long as you're modifying SimulatedQuadDifferentialDriveTypes.cs, you may as well add the definition for QuadDriveOperations, which are the operations supported by the main port. It will support DsspDefaultLookup, DsspDefaultDrop, HttpGet, Get, and a new operation: SetPose. Add this code for the operations after the state definition:

/// <summary>
/// QuadDrive Operations Port
/// </summary>
[ServicePort]
public class QuadDriveOperations :
    PortSet<DsspDefaultLookup, DsspDefaultDrop, HttpGet, Get, SetPose>
{
}

/// <summary>
/// Operation Retrieve Drive State
/// </summary>
[Description("Gets the drive's current state.")]
public class Get :
    Get<GetRequestType, PortSet<DriveDifferentialFourWheelState, Fault>>
{
}

/// <summary>
/// Operation Set Entity Pose
/// </summary>
[Description("Sets the pose of the quadDifferentialDrive entity.")]
public class SetPose :
    Update<SetPoseRequestType, PortSet<DefaultUpdateResponseType, Fault>>
{
}

/// <summary>
/// Set entity pose request
/// </summary>
[DataMemberConstructor]
[DataContract]
public class SetPoseRequestType
{
    Pose _entityPose;

    [DataMember]
    public Pose EntityPose
    {
        get { return _entityPose; }
        set { _entityPose = value; }
    }

    public SetPoseRequestType()
    {
    }
}

Notice that you've defined a new operation, SetPose, which is an Update operation. It has a SetPoseRequestType class as its body, which in turn contains an EntityPose. Before you leave the file, replace the using statements at the top with the following:

using Microsoft.Ccr.Core;
using Microsoft.Dss.Core.Attributes;
using Microsoft.Dss.ServiceModel.Dssp;
using Microsoft.Dss.Core.DsspHttp;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using W3C.Soap;
using pxdrive = Microsoft.Robotics.Services.Drive.Proxy;
using pxmotor = Microsoft.Robotics.Services.Motor.Proxy;
using Microsoft.Robotics.PhysicalModel;

To summarize, you've changed _mainPort to be a QuadDriveOperations port and added another port called _diffDrivePort, which supports the generic drive operations. You also changed the service state to DriveDifferentialFourWheelState, which includes all of the generic drive state as well as two additional wheels and a Position.

Simulation Entity Notifications

All simulation services have one thing in common: They need to manipulate or read data from entities in the simulation environment. This is easy if the service created and inserted the entity, as was the case with the Corobot service. The SimulatedQuadDifferentialDrive service needs to interact with the Corobot entity in the simulation environment, so it needs to request a notification from the SimulationEngine service when the entity it needs is inserted into the environment. It does this by sending a subscribe message to the SimulationEngine service, which includes the entity name and a port to receive the notification. To support this, add the following variables at the top of the SimulatedQuadDifferentialDriveService class definition in SimulatedQuadDifferentialDrive.cs:

#region Simulation Variables
corobot.CorobotEntity _entity;
simengine.SimulationEnginePort _notificationTarget;
#endregion

_entity will eventually hold a reference to the Corobot entity in the simulation environment, and _notificationTarget is the port that will receive the notification.

The subscribe message is sent to the simulator in the Start method even before the service calls base.Start to insert itself into the service directory:

protected override void Start()
{
    if (_state == null)
        CreateDefaultState();

    _notificationTarget = new simengine.SimulationEnginePort();

    // PartnerType.Service is the entity instance name.
    simengine.SimulationEngine.GlobalInstancePort.Subscribe(
        ServiceInfo.PartnerList, _notificationTarget);

    // don't start listening to DSSP operations, other than drop,
    // until notification of entity
    Activate(new Interleave(
        new TeardownReceiverGroup
        (
            Arbiter.Receive<simengine.InsertSimulationEntity>(
                false,
                _notificationTarget,
                InsertEntityNotificationHandlerFirstTime),
            Arbiter.Receive<DsspDefaultDrop>(
                false,
                _mainPort,
                DefaultDropHandler),
            Arbiter.Receive<DsspDefaultDrop>(
                false,
                _diffDrivePort,
                DefaultDropHandler)
        ),
        new ExclusiveReceiverGroup(),
        new ConcurrentReceiverGroup()
    ));
}

As a convenience, the Subscribe method on the simulation engine port takes a PartnerList as a parameter. The name of the entity is contained in the partner list because it is specified as a partner to the SimulatedQuadDifferentialDrive service in the manifest. Update the Manifest element at the top of the Corobot manifest to declare a simcommon namespace prefix, which saves some typing later:

<Manifest
    xmlns="http://schemas.microsoft.com/xw/2004/10/manifest.html"
    xmlns:dssp="http://schemas.microsoft.com/xw/2004/10/dssp.html"
    xmlns:simcommon="http://schemas.microsoft.com/robotics/2006/04/simulation.html"
    >

Add the following lines to the Corobot manifest to start the drive service and to specify the entity named "Corobot" as a partner. This ServiceRecordType is inserted immediately after the </ServiceRecordType> line that ends the Corobot service record. ProMRDSconfigCorobot.manifest.xml provides an example.

<ServiceRecordType>
    <dssp:Contract>http://schemas.tempuri.org/2007/07/simulatedquaddifferentialdrive.html</dssp:Contract>
    <dssp:PartnerList>
        <dssp:Partner>
            <!-- The partner name must match the entity name -->
            <dssp:Service>http://localhost/Corobot</dssp:Service>
            <dssp:Name>simcommon:Entity</dssp:Name>
        </dssp:Partner>
    </dssp:PartnerList>
</ServiceRecordType>

The entity name is in the form of a URI, so an entity name of "Corobot" becomes "http://localhost/Corobot".

When a partner is defined this way in the manifest, it shows up in the PartnerList and the simulation engine will parse the passed PartnerList until it finds an entity name. When it receives the subscribe request, it scans through all of the entities in the environment. If that entity already exists in the environment, then the simulation engine immediately sends a notification to the _notificationTarget port with a reference to the named entity. If the entity does not exist in the environment, then the simulation engine waits until it is inserted before sending the notification. If the entity is never inserted, then no notification is ever sent.

In the Start method, a new Interleave is activated, which registers handlers for the DsspDefaultDrop message on each port and a handler for InsertSimulationEntity notifications on the _notificationTarget port. No other messages are processed, and the service won't even show up in the service directory until it receives a notification from the simulation engine.

The next step is to add the InsertEntityNotificationHandlerFirstTime method, which handles the InsertSimulationEntity notification:

void InsertEntityNotificationHandlerFirstTime(
    simengine.InsertSimulationEntity ins)
{
    // insert ourselves into the directory
    base.Start();

    InsertEntityNotificationHandler(ins);
}

When this handler is called, the service finally lets the rest of the world know that it exists by calling base.Start, and then it calls InsertEntityNotificationHandler. That method handles all subsequent insert notifications because base.Start only needs to be called the first time a notification arrives.

Add the following code for the InsertEntityNotificationHandler method:

void InsertEntityNotificationHandler(
    simengine.InsertSimulationEntity ins)
{
    _entity = (corobot.CorobotEntity)ins.Body;
    _entity.ServiceContract = Contract.Identifier;

You store a reference to the entity in _entity so that it can be used by the other handlers. You also set the ServiceContract property on the entity to be the contract of your service to indicate that it is being controlled by this service.

// create default state based on the physics entity
    xna.Vector3 separation =
        _entity.FrontLeftWheel.Position - _entity.FrontRightWheel.Position;
    _state.DistanceBetweenWheels = separation.Length();

    _state.LeftWheel.MotorState.PowerScalingFactor = _entity.MotorTorqueScaling;
    _state.RightWheel.MotorState.PowerScalingFactor = _entity.MotorTorqueScaling;

Your service state is updated based on the properties of the entity. These values don't change over time so they are copied to the state only at this initialization time.

// enable other handlers now that we are connected
    Activate(new Interleave(
        new TeardownReceiverGroup
        (
            Arbiter.Receive<DsspDefaultDrop>(false, _mainPort, DefaultDropHandler),
            Arbiter.Receive<DsspDefaultDrop>(
                false, _diffDrivePort, DefaultDropHandler)
        ),
        new ExclusiveReceiverGroup
        (
            Arbiter.Receive<SetPose>(true, _mainPort, SetPoseHandler),
            Arbiter.ReceiveWithIterator<pxdrive.DriveDistance>(
                true, _diffDrivePort, DriveDistanceHandler),
            Arbiter.ReceiveWithIterator<pxdrive.RotateDegrees>(
                true, _diffDrivePort, RotateHandler),
            Arbiter.ReceiveWithIterator<pxdrive.SetDrivePower>(
                true, _diffDrivePort, SetPowerHandler),
            Arbiter.ReceiveWithIterator<pxdrive.SetDriveSpeed>(
                true, _diffDrivePort, SetSpeedHandler),
            Arbiter.ReceiveWithIterator<pxdrive.AllStop>(
                true, _diffDrivePort, AllStopHandler),
            Arbiter.Receive<simengine.InsertSimulationEntity>(
                true, _notificationTarget, InsertEntityNotificationHandler),
            Arbiter.Receive<simengine.DeleteSimulationEntity>(
                true, _notificationTarget, DeleteEntityNotificationHandler)
        ),
        new ConcurrentReceiverGroup
        (
            Arbiter.ReceiveWithIterator<dssphttp.HttpGet>(
                true, _mainPort, MainPortHttpGetHandler),
            Arbiter.ReceiveWithIterator<Get>(true, _mainPort, MainPortGetHandler),
            Arbiter.ReceiveWithIterator<dssphttp.HttpGet>(
                true, _diffDrivePort, HttpGetHandler),
            Arbiter.ReceiveWithIterator<pxdrive.Get>(
                true, _diffDrivePort, GetHandler),
            Arbiter.ReceiveWithIterator<pxdrive.Subscribe>(
                true, _diffDrivePort, SubscribeHandler),
            Arbiter.ReceiveWithIterator<pxdrive.ReliableSubscribe>(
                true, _diffDrivePort, ReliableSubscribeHandler),
            Arbiter.ReceiveWithIterator<pxdrive.EnableDrive>(
                true, _diffDrivePort, EnableHandler)
        )
    ));
}

Finally, all of the handlers for _mainPort and _diffDrivePort are activated. Any messages queued up on these ports can now be processed because you have connected to the entity in the simulator. The handlers are also activated for subsequent DeleteSimulationEntity and InsertSimulationEntity notifications. This enables the service to handle the case where entities are deleted from the environment and added again.

The DeleteEntityNotificationHandler sets the _entity reference to null and then disables all handlers except Drop and the first-time insert notification:

void DeleteEntityNotificationHandler(simengine.DeleteSimulationEntity del)
{
    _entity = null;

    // disable other handlers now that we are no longer connected to the entity
    Activate(new Interleave(
        new TeardownReceiverGroup
        (
            Arbiter.Receive<simengine.InsertSimulationEntity>(
                false, _notificationTarget,
                InsertEntityNotificationHandlerFirstTime),
            Arbiter.Receive<DsspDefaultDrop>(false, _mainPort, DefaultDropHandler),
            Arbiter.Receive<DsspDefaultDrop>(
                false, _diffDrivePort, DefaultDropHandler)
        ),
        new ExclusiveReceiverGroup(),
        new ConcurrentReceiverGroup()
    ));
}

Before you add the handler code, add the following methods to initialize and update the state. CreateDefaultState is called from the Start method if no configuration file is present. UpdateStateFromSimulation is called from handlers to update the state before it is broadcast:

void CreateDefaultState()
{
    _state = new DriveDifferentialFourWheelState();
    _state.LeftWheel =
        new Microsoft.Robotics.Services.Motor.Proxy.WheeledMotorState();
    _state.RightWheel =
        new Microsoft.Robotics.Services.Motor.Proxy.WheeledMotorState();
    _state.LeftWheel.MotorState =
        new Microsoft.Robotics.Services.Motor.Proxy.MotorState();
    _state.RightWheel.MotorState =
        new Microsoft.Robotics.Services.Motor.Proxy.MotorState();
    _state.LeftWheel.EncoderState =
        new Microsoft.Robotics.Services.Encoder.Proxy.EncoderState();
    _state.RightWheel.EncoderState =
        new Microsoft.Robotics.Services.Encoder.Proxy.EncoderState();
    _state.RearLeftWheel =
        new Microsoft.Robotics.Services.Motor.Proxy.WheeledMotorState();
    _state.RearRightWheel =
        new Microsoft.Robotics.Services.Motor.Proxy.WheeledMotorState();
    _state.RearLeftWheel.MotorState =
        new Microsoft.Robotics.Services.Motor.Proxy.MotorState();
    _state.RearRightWheel.MotorState =
        new Microsoft.Robotics.Services.Motor.Proxy.MotorState();
}

void UpdateStateFromSimulation()
{
    if (_entity != null)
    {
        _state.TimeStamp = DateTime.Now;
        _state.LeftWheel.MotorState.CurrentPower =
            _entity.FrontLeftWheel.Wheel.MotorTorque;
        _state.RightWheel.MotorState.CurrentPower =
            _entity.FrontRightWheel.Wheel.MotorTorque;
        _state.RearLeftWheel.MotorState.CurrentPower =
            _entity.RearLeftWheel.Wheel.MotorTorque;
        _state.RearRightWheel.MotorState.CurrentPower =
            _entity.RearRightWheel.Wheel.MotorTorque;
        _state.Position = _entity.State.Pose.Position;
    }
}

All that is left now is to implement the message handlers for each port. The following handlers are methods on the SimulatedQuadDifferentialDrive service:

public IEnumerator<ITask> SubscribeHandler(
    pxdrive.Subscribe subscribe)
{
    Activate(Arbiter.Choice(
        SubscribeHelper(
            _subMgrPort,
            subscribe.Body,
            subscribe.ResponsePort),
        delegate(SuccessResult success)
        {
            _subMgrPort.Post(new submgr.Submit(
                subscribe.Body.Subscriber,
                DsspActions.UpdateRequest, _state, null));
        },
        delegate(Exception ex) { LogError(ex); }
    ));

    yield break;
}

public IEnumerator<ITask> ReliableSubscribeHandler(
    pxdrive.ReliableSubscribe subscribe)
{
    Activate(Arbiter.Choice(
        SubscribeHelper(
            _subMgrPort,
            subscribe.Body,
            subscribe.ResponsePort),
        delegate(SuccessResult success)
        {
            _subMgrPort.Post(new submgr.Submit(
                subscribe.Body.Subscriber,
                DsspActions.UpdateRequest, _state, null));
        },
        delegate(Exception ex) { LogError(ex); }
    ));
    yield break;
}

These two handlers use the SubscriptionManager to handle subscribe requests from other services. This is described in more detail in the MRDS documentation in Service Tutorial 4.

public IEnumerator<ITask> MainPortHttpGetHandler(dssphttp.HttpGet get)
{
    UpdateStateFromSimulation();
    get.ResponsePort.Post(new dssphttp.HttpResponseType(_state));
    yield break;
}

public IEnumerator<ITask> MainPortGetHandler(Get get)
{
    UpdateStateFromSimulation();
    get.ResponsePort.Post(_state);
    yield break;
}

These two handlers support Get and HttpGet requests on the main port. The state is updated with the most current information from the Corobot entity and then it is posted as a response to the request.

The following handler supports the SetPose update request:

public void SetPoseHandler(SetPose setPose)
{
    if (_entity == null)
        throw new InvalidOperationException(
            "Simulation entity not registered with service");

    Task<corobot.CorobotEntity, Pose> task = new
        Task<corobot.CorobotEntity,Pose>(
            _entity, setPose.Body.EntityPose, SetPoseDeferred);

    _entity.DeferredTaskQueue.Post(task);
}

void SetPoseDeferred(corobot.CorobotEntity entity, Pose pose)
{
    entity.PhysicsEntity.SetPose(pose);
}

If the service is connected to an entity, the handler creates a task that sets the entity's pose to the pose specified in the request. The task is added to the entity's DeferredTaskQueue so that it executes during the Update method, when the physics engine is not running.
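As a usage sketch, a client holding this service's main operations port could reset the entity near the origin as follows. The port variable name is hypothetical, and the Pose constructor overload taking a single Vector3 is an assumption for illustration:

```csharp
// Hypothetical client-side sketch: move the Corobot entity back near the origin.
// _quadDrivePort is assumed to be a QuadDriveOperations port obtained from a partner.
SetPoseRequestType request = new SetPoseRequestType();
request.EntityPose = new Pose(new Vector3(0, 0.1f, 0));  // slightly above the ground plane
_quadDrivePort.Post(new SetPose(request));
```

Because the pose change is deferred to the entity's Update method, the message returns before the entity has actually moved.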

The next two handlers respond to Get and HttpGet requests on the _diffDrive port:

public IEnumerator<ITask> HttpGetHandler(dssphttp.HttpGet get)
{
    UpdateStateFromSimulation();
    pxdrive.DriveDifferentialTwoWheelState _twoWheelState =
        (pxdrive.DriveDifferentialTwoWheelState)
        ((pxdrive.DriveDifferentialTwoWheelState)_state).Clone();
    get.ResponsePort.Post(new dssphttp.HttpResponseType(
        _twoWheelState));
    yield break;
}

public IEnumerator<ITask> GetHandler(pxdrive.Get get)
{
    UpdateStateFromSimulation();

    pxdrive.DriveDifferentialTwoWheelState _twoWheelState =
        (pxdrive.DriveDifferentialTwoWheelState)
        ((pxdrive.DriveDifferentialTwoWheelState)_state).Clone();
    get.ResponsePort.Post(_twoWheelState);
    yield break;
}

It is not valid to return the service state directly in response to a Get request on the _diffDrivePort. It must first be converted to a pxdrive.DriveDifferentialTwoWheelState by casting it to that type and then calling the Clone method to make a copy. This ensures that services that send a Get message to this port receive the state type they expect.

public IEnumerator<ITask> DriveDistanceHandler(pxdrive.DriveDistance driveDistance)
{
    SuccessFailurePort entityResponse = new SuccessFailurePort();
    _entity.DriveDistance(
        (float)driveDistance.Body.Distance,
        (float)driveDistance.Body.Power,
        entityResponse);

    yield return Arbiter.Choice(entityResponse,
        delegate(SuccessResult s)
        {
            driveDistance.ResponsePort.Post(DefaultUpdateResponseType.Instance);
        },
        delegate(Exception e)
        {
            driveDistance.ResponsePort.Post(new W3C.Soap.Fault());
        });

    yield break;
}

public IEnumerator<ITask> RotateHandler(pxdrive.RotateDegrees rotate)
{
    SuccessFailurePort entityResponse = new SuccessFailurePort();
    _entity.RotateDegrees(
        (float)rotate.Body.Degrees,
        (float)rotate.Body.Power,
        entityResponse);

    yield return Arbiter.Choice(entityResponse,
        delegate(SuccessResult s)
        {
            rotate.ResponsePort.Post(DefaultUpdateResponseType.Instance);
        },
        delegate(Exception e)
        {
            rotate.ResponsePort.Post(new W3C.Soap.Fault());
        });

    yield break;
}

Both of these handlers forward the request to the entity by calling either DriveDistance or RotateDegrees. They wait for the asynchronous response from the entity before posting their own response.

public IEnumerator<ITask> SetPowerHandler(pxdrive.SetDrivePower setPower)
{
    if (_entity == null)
        throw new InvalidOperationException(
            "Simulation entity not registered with service");

    // Call simulation entity method for setting wheel torque
    _entity.SetMotorTorque(
        (float)(setPower.Body.LeftWheelPower),
        (float)(setPower.Body.RightWheelPower));

    UpdateStateFromSimulation();
    setPower.ResponsePort.Post(DefaultUpdateResponseType.Instance);

    // send update notification for entire state
    _subMgrPort.Post(new submgr.Submit(_state, DsspActions.UpdateRequest));
    yield break;
}

public IEnumerator<ITask> SetSpeedHandler(pxdrive.SetDriveSpeed setSpeed)
{
    if (_entity == null)
        throw new InvalidOperationException(
            "Simulation entity not registered with service");

    _entity.SetVelocity(
        (float)setSpeed.Body.LeftWheelSpeed,
        (float)setSpeed.Body.RightWheelSpeed);

    UpdateStateFromSimulation();
    setSpeed.ResponsePort.Post(DefaultUpdateResponseType.Instance);

    // send update notification for entire state
    _subMgrPort.Post(new submgr.Submit(_state, DsspActions.UpdateRequest));
    yield break;
}

The SetPowerHandler sets the motor torque of the wheels and the SetSpeedHandler sets their axle speed; both then send a notification to any subscribed services that the service state has changed.
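For reference, a client of the generic drive contract could exercise these handlers with something like the following sketch, written inside an iterator handler. The helper methods on the port mirror the DriveDistance and RotateDegrees helpers used later in this chapter; treat the exact signatures as assumptions:

```csharp
// Hypothetical client-side sketch: drive forward at half power, then at a fixed wheel speed.
// _drivePort is assumed to be a pxdrive.DriveOperations port partnered with this service.
yield return Arbiter.Choice(
    _drivePort.SetDrivePower(0.5, 0.5),        // left and right power, range -1.0 to 1.0
    delegate(DefaultUpdateResponseType ok) { },
    delegate(Fault fault) { LogError(fault); });

yield return Arbiter.Choice(
    _drivePort.SetDriveSpeed(0.3, 0.3),        // left and right wheel speed
    delegate(DefaultUpdateResponseType ok) { },
    delegate(Fault fault) { LogError(fault); });
```

A service that subscribed to the drive would receive an update notification carrying the full state after each of these calls completes.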

The following handler sets the IsEnabled property on the entity according to the request and sends a notification to subscribers:

public IEnumerator<ITask> EnableHandler(pxdrive.EnableDrive enable)
{
    if (_entity == null)
        throw new InvalidOperationException(
            "Simulation entity not registered with service");

    _state.IsEnabled = enable.Body.Enable;
    _entity.IsEnabled = _state.IsEnabled;

    UpdateStateFromSimulation();
    enable.ResponsePort.Post(DefaultUpdateResponseType.Instance);

    // send update for entire state
    _subMgrPort.Post(new submgr.Submit(_state, DsspActions.UpdateRequest));
    yield break;
}

The AllStopHandler sets the motor torque and axle speed of each wheel to 0 and sends a notification to subscribers:

public IEnumerator<ITask> AllStopHandler(pxdrive.AllStop estop)
{
    if (_entity == null)
        throw new InvalidOperationException(
            "Simulation entity not registered with service");

    _entity.SetMotorTorque(0, 0);
    _entity.SetVelocity(0);

    UpdateStateFromSimulation();
    estop.ResponsePort.Post(DefaultUpdateResponseType.Instance);

    // send update for entire state
    _subMgrPort.Post(new submgr.Submit(_state, DsspActions.UpdateRequest));
    yield break;
}

Testing the SimulatedQuadDifferentialDrive Service

The service is now at a point where you can use it to drive around the simulation environment. Modify the Corobot.manifest.xml file to start the SimpleDashboard service by adding the following ServiceRecord:

<!-- Start the Dashboard service -->
<ServiceRecordType>
    <dssp:Contract>
        http://schemas.microsoft.com/robotics/2006/01/simpledashboard.html
    </dssp:Contract>
</ServiceRecordType>

When you run this manifest, it should start three separate services: the Corobot service, the SimulatedQuadDifferentialDrive service, and the SimpleDashboard service. Once the simulator starts up and the Corobot service inserts a Corobot entity in the environment, the SimulatedQuadDifferentialDrive service should make itself visible in the service directory.

Type localhost into the Machine textbox and press Enter. A service should be displayed in the service list, as shown in Figure 6-6.

Figure 6-6

Figure 6.6. Figure 6-6

The SimpleDashboard service recognizes the SimulatedQuadDifferentialDrive service because it implements the generic Drive contract as an alternate port. Double-click the service and click the Drive button. You should now be able to drive the Corobot entity around the environment by dragging the trackball icon forward and backward. (The trackball icon is the circle with the crossbar in it.) If you have trouble compiling either of the services or driving the Corobot entity, compare the behavior of your service with the corresponding service in the Chapter6 directory to find the problem.

You should also verify that the proper state is returned in response to an HttpGet request to either port on the service. Run a web browser and navigate to http://localhost:50000. Select Service Directory from the left column. You should see something similar to Figure 6-7, which shows the Corobot and SimulatedQuadDifferentialDrive services.

Figure 6-7

Figure 6.7. Figure 6-7

Click the first /simulatedquaddifferentialdrive service listed and verify that DriveDifferentialFourWheelState is displayed. Click the second /simulatedquaddifferentialdrive service listed and verify that DriveDifferentialTwoWheelState is displayed.

Testing DriveDistance and RotateDegrees

The two most complex methods on the Corobot entity are DriveDistance and RotateDegrees, so it is appropriate to expend some effort in testing them. A simple way to do this is to add a short test routine to the SimulatedQuadDifferentialDrive service, which runs after the notification from the simulation engine. This test routine is normally commented out.

Add this line as the last line of InsertEntityNotificationHandlerFirstTime:

SpawnIterator(TestDriveDistanceAndRotateDegrees);

Now add the following method to the SimulatedQuadDifferentialDriveService class:

// Test the DriveDistance and RotateDegrees messages
public IEnumerator<ITask> TestDriveDistanceAndRotateDegrees()
{
    Random rnd = new Random();
    bool success = true;

    // drive in circles
    while (success)
    {
        double distance = rnd.NextDouble() * 1 + 0.5;
        double angle = rnd.NextDouble() * 90 - 45;

        // first leg
        yield return Arbiter.Choice(
            _diffDrivePort.RotateDegrees(angle, 0.2),
            delegate(DefaultUpdateResponseType response) { },
            delegate(Fault f) { success = false; }
        );

        yield return Arbiter.Choice(
            _diffDrivePort.DriveDistance(distance, 0.2),
            delegate(DefaultUpdateResponseType response) { },
            delegate(Fault f) { success = false; }
        );

        // return
        yield return Arbiter.Choice(
            _diffDrivePort.RotateDegrees(180, 0.2),
            delegate(DefaultUpdateResponseType response) { },
            delegate(Fault f) { success = false; }
        );

        yield return Arbiter.Choice(
            _diffDrivePort.DriveDistance(distance, 0.2),
            delegate(DefaultUpdateResponseType response) { },
            delegate(Fault f) { success = false; }
        );
        // reset position
        yield return Arbiter.Choice(
            _diffDrivePort.RotateDegrees(180 - angle, 0.2),
            delegate(DefaultUpdateResponseType response) { },
            delegate(Fault f) { success = false; }
        );
    }
}

This test method sends DriveDistance and RotateDegrees messages to the _diffDrivePort to drive the Corobot a random distance in a random direction. It then turns around and returns to its starting point. Over time, the errors in rotation and driving distance accumulate and the robot will fail to return to its exact starting point. This code is a good example of how the Corobot can be driven from another service. You'll see more examples of this later.

Tuning the Corobot Entity

Once you have a basic robot entity defined, it is a good idea to tune its properties so that it more closely resembles its real-world counterpart. In the following sections, you'll adjust the top speed of the Corobot entity, the grip of its tires, and its appearance. You'll also adjust the model so that its wheels actually turn.

Tuning MotorTorqueScaling

The MotorTorqueScaling property of the entity is a multiplier on the axle speed. When the entity is moved by calling SetMotorTorque, the torque that is passed as a parameter is scaled by MotorTorqueScaling to set the axle speed. A torque value of 1.0 represents the greatest torque that the motor can apply to the axle. Therefore, MotorTorqueScaling determines the speed of the entity when its torque is at the maximum. The CoroWare engineers indicate that the top speed of the Corobot is somewhere around two feet per second. In the initial implementation of the entity, a value of 20 was arbitrarily chosen to initialize MotorTorqueScaling. Now it is time to determine what the actual value should be.

You need to cause the entity to move forward at its maximum speed and somehow measure that speed to determine whether it is close to the rated top speed of two feet per second. The first step is to add the following line as the last line of the Corobot entity's Initialize method:

SetMotorTorque(1, 1);

This will cause the Corobot to move forward at its maximum speed. Change the first few lines of the Update method as follows to measure the speed of the entity:

const float SPEED_DELTA = 0.5f;
Pose startPose;
double totalTime = -1;
public override void Update(FrameUpdate update)
{
    if (totalTime < 0)
    {
        startPose = State.Pose;
        totalTime = update.ElapsedTime;
    }
    else
    {
        double distance = Vector3.Length(
            State.Pose.Position - startPose.Position);
        double speed = distance / totalTime;
        totalTime += update.ElapsedTime;
        Console.WriteLine(speed.ToString());
    }

The first time Update is called, the pose of the entity is stored. On each subsequent call to Update, the total distance traveled is calculated, along with the accumulated time since the first call. The speed is calculated and displayed. The value displayed on the console will eventually converge to the actual speed of the entity.

An initial run with this code yielded a speed of approximately 1.2 meters per second. Two feet per second is equal to 0.61 meters per second, so the entity's top speed with a MotorTorqueScaling value of 20 is about twice as fast as it should be. The new MotorTorqueScaling value can be calculated by multiplying the initial value (20) by the ratio of the desired top speed to the current top speed:

NewMotorTorqueScaling = InitialValue * TopSpeedDesired / TopSpeedMeasured

The proper MotorTorqueScaling factor should be about 10.16. When that value is updated in the CorobotEntity constructor and the test is executed again, the top speed comes very close to 0.61 meters per second. After you have determined the appropriate value for MotorTorqueScaling, you can comment out this code so that it doesn't interfere with normal operation of the entity.
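The arithmetic behind that value can be checked directly; the numbers below are the measurements described above:

```csharp
// Worked example of the MotorTorqueScaling tuning calculation.
float initialScaling = 20.0f;      // value used in the first implementation
float topSpeedDesired = 0.61f;     // two feet per second, expressed in meters per second
float topSpeedMeasured = 1.2f;     // speed observed in the test run with scaling = 20
float newScaling = initialScaling * topSpeedDesired / topSpeedMeasured;  // about 10.16
```

If your own measured top speed differs, plug your value into the same formula rather than copying the constant.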

Tuning the Tire Friction

After driving your Corobot around in the simulation environment, you may notice that the tires grip the ground extremely well. If you drive up to the giant box, the tires grip the ground and the box so tightly that the Corobot flips onto its back. Furthermore, when the Corobot is turning, it shakes and jitters because the tires barely slip. This behavior differs from that of the actual Corobot, so you need to adjust the tire friction.

As discussed in Chapter 5, the simulation environment allows each shape to have a material definition describing its bounciness (restitution) and its static and dynamic friction. The wheel shapes have a more advanced friction model that enables the friction along the longitudinal direction (in the direction of rotation) to be specified separately from the lateral direction (perpendicular to the direction of rotation). The friction model utilizes a spline function to determine the amount of wheel slippage as a function of the amount of force acting on the wheel.

Four parameters specify the spline function: ExtremumSlip, ExtremumValue, AsymptoteSlip, and AsymptoteValue. A fifth factor, StiffnessFactor, acts as a multiplier on the tire forces. Higher values cause the wheel to grip the ground more strongly.

The AGEIA SDK documentation describes the effect of modifying the four spline function parameters. For this simulation, leaving the spline values at the AGEIA defaults and reducing only the lateral StiffnessFactor from its default value of 1000000.0 produces the desired effect.

Add the following code just before the first WheelEntity is created in the Corobot Initialize method:

TireForceFunctionDescription LongitudalFunction =
    new TireForceFunctionDescription();
LongitudalFunction.ExtremumSlip = 1.0f;
LongitudalFunction.ExtremumValue = 0.02f;
LongitudalFunction.AsymptoteSlip = 2.0f;
LongitudalFunction.AsymptoteValue = 0.01f;
LongitudalFunction.StiffnessFactor = 1000000.0f;

TireForceFunctionDescription LateralFunction = new TireForceFunctionDescription();
LateralFunction.ExtremumSlip = 1.0f;
LateralFunction.ExtremumValue = 0.02f;
LateralFunction.AsymptoteSlip = 2.0f;
LateralFunction.AsymptoteValue = 0.01f;
LateralFunction.StiffnessFactor = 100000.0f;

wheelFRprop.TireLongitudalForceFunction = LongitudalFunction;
wheelFLprop.TireLongitudalForceFunction = LongitudalFunction;
wheelRRprop.TireLongitudalForceFunction = LongitudalFunction;
wheelRLprop.TireLongitudalForceFunction = LongitudalFunction;

wheelFRprop.TireLateralForceFunction = LateralFunction;
wheelFLprop.TireLateralForceFunction = LateralFunction;
wheelRRprop.TireLateralForceFunction = LateralFunction;
wheelRLprop.TireLateralForceFunction = LateralFunction;

The LateralFunction reduces the lateral StiffnessFactor by a factor of 10, allowing the tires to slip sideways more easily. This has the desired effect of making the turns smoother.

It is difficult to measure the exact tire friction of the actual robot, so these parameters must typically be tuned by hand until the simulated behavior closely matches the observed real-world behavior.
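To build intuition for what the four spline parameters do, here is a rough piecewise-linear sketch of a tire force curve: force rises to ExtremumValue at ExtremumSlip, decays to AsymptoteValue at AsymptoteSlip, and stays flat beyond that. This is an assumption about the general shape, not the exact smooth spline the AGEIA engine evaluates:

```python
def tire_force(slip, extremum_slip=1.0, extremum_value=0.02,
               asymptote_slip=2.0, asymptote_value=0.01,
               stiffness_factor=1_000_000.0):
    """Rough piecewise-linear sketch of the tire force curve:
    rises to the extremum, decays to the asymptote, then stays flat."""
    if slip <= extremum_slip:
        factor = extremum_value * (slip / extremum_slip)
    elif slip <= asymptote_slip:
        t = (slip - extremum_slip) / (asymptote_slip - extremum_slip)
        factor = extremum_value + t * (asymptote_value - extremum_value)
    else:
        factor = asymptote_value
    return factor * stiffness_factor

# Dropping StiffnessFactor by a factor of 10, as the lateral function does,
# cuts the tire force by the same factor at every slip value.
print(round(tire_force(1.0)))                              # 20000
print(round(tire_force(1.0, stiffness_factor=100_000.0)))  # 2000
```

The sketch makes the roles clear: the four spline parameters shape the curve, while StiffnessFactor simply scales the whole thing up or down.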

Making the Wheels Turn

One of the first things you might have noticed about your Corobot entity is that its tires don't actually turn as the robot moves. This doesn't affect the simulation behavior but it does reduce the visual realism of the scene quite a bit. In addition, you likely noticed that the wheels never turn on the robots in the MRDS simulation tutorials, either. Let's fix that problem now.

You need to define a new WheelEntity that can keep track of its current rotation and adjust the rendering of its mesh accordingly. The following code shows how to create a new entity called RotatingWheelEntity, which inherits from WheelEntity but adds this additional functionality:

[DataContract]
public class RotatingWheelEntity: WheelEntity
{
    const float rotationScale = (float)(-1.0 / (2.0 * Math.PI));
    public float Rotations = 0;

    public RotatingWheelEntity()
    {
    }
    public RotatingWheelEntity(WheelShapeProperties wheelShape)
        : base(wheelShape)
    {
    }

    public override void Initialize(
        Microsoft.Xna.Framework.Graphics.GraphicsDevice device,
        PhysicsEngine physicsEngine)
    {
        base.Initialize(device, physicsEngine);
    }

    public override void Update(FrameUpdate update)
    {
        base.Update(update);

        // set the wheel to the current position
        Wheel.State.LocalPose.Orientation =
            TypeConversion.FromXNA(
                xna.Quaternion.CreateFromAxisAngle(
                    new xna.Vector3(-1, 0, 0),
                    (float)(Rotations * 2 * Math.PI)));

        // update the rotations for the next frame
        Rotations += (float)(Wheel.AxleSpeed *
            update.ElapsedTime * rotationScale);
    }
}

Add the code for this entity outside of the Corobot entity definition but within the same namespace. This entity has a public data member called Rotations. For each frame, the LocalPose of the wheel shape is set according to the current rotation of the wheel. The rotation of the wheel for the next frame is calculated by multiplying the wheel axle speed (in radians per second) by the elapsed time and converting the result from radians to rotations. The Rotations variable serves as an encoder for the current wheel rotation.
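The bookkeeping in Update amounts to integrating the axle speed over time. A small standalone sketch of that arithmetic (Python used purely to illustrate the C# above):

```python
import math

# Matches rotationScale in RotatingWheelEntity: converts radians to rotations
# and flips the sign so a positive axle speed rolls the mesh forward.
ROTATION_SCALE = -1.0 / (2.0 * math.pi)

def update_rotations(rotations, axle_speed_rad_per_s, elapsed_s):
    """Accumulate wheel rotations from the axle speed each frame."""
    return rotations + axle_speed_rad_per_s * elapsed_s * ROTATION_SCALE

# An axle spinning at 2*pi rad/s (one revolution per second) for one second
# accumulates one rotation (negative because of the sign flip).
print(round(update_rotations(0.0, 2.0 * math.pi, 1.0), 6))  # -1.0
```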

The Render function for the WheelEntity is already set up to take the LocalPose of the wheel shape into account when rendering the mesh; all you have left to do is ensure that the Render function for each wheel is called. Add the following override to the Render method in the Corobot class to accomplish this:

public override void Render(
    RenderMode renderMode,
    MatrixTransforms transforms,
    CameraEntity currentCamera)
{
    base.Render(renderMode, transforms, currentCamera);
    _wheelFL.Render(renderMode, transforms, currentCamera);
    _wheelFR.Render(renderMode, transforms, currentCamera);
    _wheelRL.Render(renderMode, transforms, currentCamera);
    _wheelRR.Render(renderMode, transforms, currentCamera);
}

This method first takes care of rendering the Corobot entity by calling the base.Render method. It then takes care of rendering each WheelEntity by explicitly calling the Render method for each wheel. Remember that you must do this explicitly because the WheelEntities are not actually children of the Corobot entity.

Additionally, at this point you need to change the definition of _wheelFR (and the others) to use RotatingWheelEntity instead of WheelEntity. The accessor functions for the wheels must be changed to return a RotatingWheelEntity instead of a WheelEntity. In addition, the initialization of the private wheel members (_wheelFR, etc.) must be changed to create a RotatingWheelEntity. The code changes for the FrontRightWheel are shown here:

RotatingWheelEntity _wheelFR;

[Category("Wheels")]
[DataMember]
public RotatingWheelEntity FrontRightWheel
{
    get { return _wheelFR; }
    set { _wheelFR = value; }
}

_wheelFR = new RotatingWheelEntity(wheelFRprop);

Recompile your Corobot project and run the manifest again. You should now notice the wheels move as the robot moves around. It may be a little difficult to tell if they are moving because each wheel is just a uniform gray disc. You'll fix that soon.

Adding Encoders

Now that each wheel has a concept of its current rotation, it is possible for you to add simulated wheel encoders. The Corobot has encoders on its front wheels. Each encoder has a resolution of 600 ticks per revolution of the wheel. Add the following two properties to access the simulated encoder values on these wheels:

[Category("Wheels")]
[DataMember]
public int FrontRightEncoder
{
    get { return (int)(_wheelFR.Rotations * 600f); }
    set { _wheelFR.Rotations = (float)value / 600f; }
}

[Category("Wheels")]
[DataMember]
public int FrontLeftEncoder
{
    get { return (int)(_wheelFL.Rotations * 600f); }
    set { _wheelFL.Rotations = (float)value / 600f; }
}

These properties will show up in the Simulation Editor when you display the properties of the Corobot entity so you can easily verify that the values are correct.
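The tick conversion in these properties is straightforward; here is a sketch of the round trip, with the C# (int) cast mirrored by Python's truncating int():

```python
TICKS_PER_REV = 600  # encoder resolution from the text

def rotations_to_ticks(rotations):
    # like the C# (int) cast, int() truncates toward zero
    return int(rotations * TICKS_PER_REV)

def ticks_to_rotations(ticks):
    return ticks / TICKS_PER_REV

print(rotations_to_ticks(2.5))   # 1500
print(ticks_to_rotations(300))   # 0.5
```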

The SimulatedQuadDifferentialDrive service provides a way to expose the encoder values in its state. Add the following two lines of code to the UpdateStateFromSimulation method in the SimulatedQuadDifferentialDriveService class:

_state.LeftWheel.EncoderState.CurrentReading = _entity.FrontLeftEncoder;
_state.RightWheel.EncoderState.CurrentReading = _entity.FrontRightEncoder;

Making It Look Real

One might argue that the visual appearance of the entity has little bearing on the usefulness of the simulation. However, one gets tired of looking at a lot of gray boxes driving around. You can build a mesh using a graphics modeling tool such as 3D Studio Max, Maya, or Blender that provides a better visualization of the entity. This mesh displays in the simulator in place of the entity but it has no effect on the entity's physics behavior.

It doesn't matter which modeling tool you use to create the mesh as long as it can export to the Alias .obj format. This is a fairly universal, if somewhat basic, 3D geometry format. Each .obj file is typically accompanied by a .mtl file with the same name that specifies the material characteristics for the geometry.

It is beyond the scope of this book to discuss geometry modeling in any great detail, but here are a few lessons that were learned from building meshes for the Corobot model:

  • Modeling tools: The Maya modeling tool was used to generate the geometry for this model. Other modeling and CAD packages such as 3D Studio Max, SolidWorks, and Blender have been successfully used to build models for the simulation environment. Blender is a reasonable option if money is a concern. You can find more information about Blender at www.blender.org.

  • Realism versus speed: Because you want the wheels to move independently from the robot body, you must make both a wheel mesh and a separate body mesh. There is usually a trade-off between the realism and visual interest of the model and the number of polygons it takes to define the model. Too many polygons make the simulator run more slowly, especially if you have many robots or objects in the scene and you are running on a less powerful graphics card. Always endeavor to keep your simulation running at 30 frames per second or faster; if it is running at 20 fps or slower, it becomes much less usable. Also keep in mind that the AGEIA physics engine becomes less stable if it is asked to continuously simulate steps much larger than 16.6 ms. If your simulator is running at less than 30 frames per second, then each frame represents over 33 ms and you may begin to notice physics behavior problems.

  • Wheel geometry: Because this model has four wheels, the wheel geometry affects the overall look of the model a great deal. For that reason, more polygons were budgeted for the wheels than for any other part of the robot model. The wheels were modeled by defining a wheel hub composed of a cylindrical tube with a disc in the center with five triangular holes. Maya's smoothing feature was used to round the edges of the hub. Finally, a cylindrical tire was modeled to fit over the hub. The resulting tire model is shown in Figure 6-8. The right part of the figure illustrates the number of polygons that were used. The wheel doesn't look exactly like the one on the robot but it is close enough.

    Figure 6-8

    Figure 6.8. Figure 6-8

  • Modeling the robot chassis and platform: The dimensions of the physics model are a good place to start in modeling the visual geometry. The chassis is modeled as a simple box combined with hexagonal cylinders on the front and back to give it the beveled look of the actual robot. The platform is modeled as a simple box sandwiched between two larger thin boxes. The middle box represents the batteries and circuit boards carried by the robot. A couple of cylinders and a box represent the camera in front, and a very simple representation of the IR sensor was also added to the front and back of the chassis.

  • Circuit boards and battery: Because a real Corobot was available, pictures were taken of the circuit boards and battery on the platform and added as a texture map to the platform box. The texture map is shown in Figure 6-9, and Figure 6-10 shows the model with the texture map applied. The simulation environment supports texture maps in a number of formats, including .bmp and .jpg. You can also use the DirectX Texture Tool, available in the DirectX SDK, to make .dds files that support mip-maps. A mip-map is a texture that contains multiple resolutions; the hardware automatically chooses the most appropriate resolution based on the mapping of the texture to the screen. You might have noticed that a .dds texture with mip levels is typically used on the ground plane. If you substitute a regular .bmp or .jpg texture map with only a single resolution, you will notice shimmering pixels in the distance as the camera moves around due to aliasing of the texture map.

    Figure 6-9

    Figure 6.9. Figure 6-9

    Figure 6-10

    Figure 6.10. Figure 6-10

  • Tweaking the simulation environment: Because different modeling packages handle materials and geometry in different ways, you will likely need to iterate several times to make the model look good. For example, some modeling packages define +Z as the upward axis, in which case you would need to rotate your model to make it look right in the simulation environment. Other packages use sophisticated lighting models that don't translate well to the .obj format, so you may have to go back and forth between the modeling tool and the simulation environment, tweaking the lighting parameters until the model looks as expected.

Adding the Custom Mesh to the Corobot Entity

If no mesh is specified for an entity in State.Assets.Mesh, the simulator generates meshes from the physics shapes in the entity. If a mesh is specified, it is used. You need to change your Corobot entity to add these custom meshes. Add the following line for each of the RotatingWheelEntities just prior to the call to Initialize:

_wheelFR = new RotatingWheelEntity(wheelFRprop);
_wheelFR.State.Name = base.State.Name + " FrontRightWheel";
_wheelFR.Parent = this;
_wheelFR.MeshRotation = new Vector3(0, 180, 0);   // flip the wheel mesh
_wheelFR.State.Assets.Mesh = "CorobotWheel.obj";
_wheelFR.Initialize(device, physicsEngine);

The simulator looks in the store\media directory by default, but you can also specify any other directory inside the SDK directory structure.

After adding the wheel mesh to each of the wheels, add the Corobot mesh to the Corobot entity just before base.Initialize is called in its Initialize method:

State.Assets.Mesh = "Corobot.obj";
base.Initialize(device, physicsEngine);

Run the Corobot service. You should now see the custom mesh in place of the old physics model. To display the physics representation or combine the two, select a different rendering mode by pressing F2. Figure 6-11 shows the full Corobot model in the simulation environment.

Figure 6-11

Figure 6.11. Figure 6-11

Adding a Camera

The Corobot has a camera mounted on the front center of its chassis. Fortunately, a camera entity and its associated service are already provided in the SDK so it is fairly simple to add it to your entity.

Adding the Camera Entity

Add the following lines of code to the CorobotEntity nondefault constructor:

CameraEntity frontCam = new CameraEntity(
    320,
    240,
    (float)(30.0 * Math.PI / 180.0));
frontCam.State.Name = name + "_cam";
frontCam.IsRealTimeCamera = true;    // update each frame
frontCam.State.Pose.Position = new Vector3(
    0,
    chassisDimensions.Y / 2.0f + chassisClearance - 0.01f,
    -chassisDimensions.Z / 2.0f);
InsertEntityGlobal(frontCam);

For this code, you first create a CameraEntity with a resolution of 320 by 240 pixels and a field of view of 30 degrees (converted to radians). You give it a name based on its parent entity and set it to be a real-time camera. Because the view from a real-time camera is rendered every frame, be careful not to add too many real-time cameras in a scene because they can greatly increase rendering time.

Finally, you give the camera a pose that positions it just a little below the center of the front of the chassis shape facing forward. Because the camera is a child of the Corobot entity, it follows its parent's orientation and position.
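A quick back-of-the-envelope check on the camera parameters is useful here. The per-pixel figure below assumes the 30-degree field of view spans the image width, which is an assumption made purely for illustration:

```python
import math

width, height = 320, 240          # CameraEntity resolution
fov_rad = math.radians(30.0)      # the constructor takes the angle in radians

aspect = width / height           # 4:3; a differently shaped window distorts
deg_per_pixel = 30.0 / width      # rough angular resolution, assuming the
                                  # field of view is horizontal

print(round(fov_rad, 4))          # 0.5236
print(round(aspect, 3))           # 1.333
print(round(deg_per_pixel, 4))    # 0.0938
```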

Compile and build with these changes and run the Corobot manifest. When the scene is displayed, select the Camera menu item. Both the main camera and the Corobot_cam should appear in the menu. When you select the Corobot_cam, you see the simulation environment from the point of view of that camera. The 320 × 240 image is stretched to fill the display window so it may look a little blocky, and objects may be distorted if the display window has a different aspect ratio. Drive the robot around with the dashboard and verify that the camera view changes.

Adding the SimulatedWebcam Service

Just as the Corobot motors needed a SimulatedQuadDifferentialDrive service to drive them, the camera entity you just added needs a SimulatedWebCam service to retrieve frames from the camera for other services to use.

Like other simulation services, the SimulatedWebCam service is started with an entity name for a partner. It attempts to connect with the entity by making a subscription request to the simulation engine.

Add the following lines to the Corobot.manifest.xml file:

<!-- Start WebCam service -->
<ServiceRecordType>
<dssp:Contract>
http://schemas.microsoft.com/robotics/2006/09/simulatedwebcam.html
</dssp:Contract>
<dssp:Service>http://localhost/Corobot/Cam</dssp:Service>
<dssp:PartnerList>
  <dssp:Partner>
    <dssp:Service>http://localhost/Corobot_cam</dssp:Service>
    <dssp:Name>simcommon:Entity</dssp:Name>
  </dssp:Partner>
</dssp:PartnerList>
</ServiceRecordType>

The service is started by specifying its contract, and the entity name Corobot_cam is added as a partner (in the form of a URI). The service is given a run-time identifier of /Corobot/Cam. This is the identifier that will appear in the service directory.

Verify that the SimulatedWebCam service is working properly by running the manifest. Open a browser window and navigate to http://localhost:50000. Select Service Directory in the left column and then click the entry called /Corobot/Cam. A web page will be displayed showing the current view from the camera. If you select a refresh interval and press the Start button, the web page will be updated with new images at the specified rate.

Adding Infrared Distance Sensors

The Corobot has infrared (IR) distance sensors mounted on the front and rear of its chassis. These sensors serve as virtual bumpers, telling the robot when it is about to collide with an obstacle. In this section, you modify the LaserRangeFinderEntity included with the MRDS SDK to become an IR sensor. You also add a service to read the value of an IR sensor and send notifications to other services when it changes.

The CorobotIREntity

It is difficult to fully simulate all of the properties of an infrared sensor. The main purpose of the sensor is to return an approximate distance value based on the amount of infrared light reflected back to the sensor. The sensor can also be used as a reflectivity sensor because the value it returns changes with the reflectivity of different materials, even if they are at the same distance. This property of the IR sensor is used to advantage in the MRDS Sumo Competition Package: the iRobot Create robots used as sumo players utilize IR sensors mounted on their underside to detect the change in reflectivity at the outer region of the sumo ring so that they don't drive out of it.

The simulation environment does not currently support modeling reflectivity, so this aspect of the sensor is hard to model. The CorobotIREntity that you'll create for the Corobot assumes that every material in the world has the same reflectivity, so it will simply return distance.

The LaserRangeFinderEntity provided in the SDK is a very accurate distance sensor that sweeps a laser horizontally across the environment and measures the light that is reflected back. Simulation Tutorial 2 in the MRDS SDK demonstrates this entity.

You can easily modify this entity to become your CorobotIREntity. Copy the source code for the LaserRangeFinderEntity from samples\simulation\entities\entities.cs into Corobot.cs. Change the name of the entity to CorobotIREntity and include the following member variables:

public class CorobotIREntity: VisualEntity
{
    [DataMember]
    public float DispersionConeAngle = 8f; // in degrees
    [DataMember]
    public float Samples = 3f;  // the number of rays in each direction
    [DataMember]
    public float MaximumRange =
        (30f * 2.54f / 100.0f); // 30 inches converted to meters

    float _elapsedSinceLastScan;
    Port<RaycastResult> _raycastResultsPort;
    RaycastResult _lastResults;
    Port<RaycastResult> _raycastResults = new Port<RaycastResult>();
    RaycastProperties _raycastProperties;
    CachedEffectParameter _timeAttenuationHandle;
    float _appTime;
    Shape _particlePlane;

    /// <summary>
    /// Raycast configuration
    /// </summary>
    public RaycastProperties RaycastProperties
    {
        get { return _raycastProperties; }
        set { _raycastProperties = value; }
    }

    float _distance;
    [DataMember]
    public float Distance
    {
        get { return _distance; }
        set { _distance = value; }
    }

DispersionConeAngle is a new variable that sets the angle across which the infrared rays spread out from the emitter. The Samples variable specifies the number of distance samples to take horizontally and vertically. The MaximumRange variable specifies the farthest distance from which the sensor returns any meaningful data. If objects are farther than this distance, the sensor reports MaximumRange as the distance. You can use the Distance property to retrieve the last reading from the sensor. The rest of the variables are copied directly from the LaserRangeFinderEntity.

The constructors are renamed to match the new class name. State.Assets.Effect is set to "LaserRangeFinder.fx". This effect is used when the laser impact points are rendered.

/// <summary>
/// Default constructor used when this entity is deserialized
/// </summary>
public CorobotIREntity()
{
}

/// <summary>
/// Initialization constructor used when this entity is built programmatically
/// </summary>
/// <param name="initialPos"></param>
public CorobotIREntity(string name, Pose initialPose)
{
    base.State.Name = name;
    base.State.Pose = initialPose;

    // used for rendering impact points
    base.State.Assets.Effect = "LaserRangeFinder.fx";
}

The Initialize method is not substantially different from the LaserRangeFinderEntity:

public override void Initialize(xnagrfx.GraphicsDevice device, PhysicsEngine physicsEngine)
{
    try
    {
        if (Parent == null)
            throw new Exception(
                "This entity must be a child of another entity.");

        // make sure that we take at least 2 samples in each direction
        if (Samples < 2f)
            Samples = 2f;

        _raycastProperties = new RaycastProperties();
        _raycastProperties.StartAngle = -DispersionConeAngle / 2.0f;
        _raycastProperties.EndAngle = DispersionConeAngle / 2.0f;
        _raycastProperties.AngleIncrement =
            DispersionConeAngle / (Samples - 1f);
        _raycastProperties.Range = MaximumRange;
        _raycastProperties.OriginPose = new Pose();

The sensor calculates a distance value by casting rays out into the environment to see where they intersect with physics objects. The _raycastProperties structure specifies the number of rays and the angle over which they are cast.
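With the default DispersionConeAngle of 8 degrees and 3 samples, the properties above produce rays at -4, 0, and +4 degrees. A standalone sketch of that arithmetic:

```python
def ray_angles(cone_angle_deg=8.0, samples=3):
    """Reproduce the StartAngle/EndAngle/AngleIncrement arithmetic
    from the raycast setup above (angles in degrees)."""
    samples = max(samples, 2)  # the entity clamps to at least 2 samples
    start = -cone_angle_deg / 2.0
    increment = cone_angle_deg / (samples - 1)
    return [start + i * increment for i in range(samples)]

print(ray_angles())         # [-4.0, 0.0, 4.0]
print(ray_angles(8.0, 2))   # [-4.0, 4.0]
```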

// set flag so rendering engine renders us last
        Flags |= VisualEntityProperties.UsesAlphaBlending;

        base.Initialize(device, physicsEngine);

The LaserRangeFinder.fx effect is created in base.Initialize. The following code creates the mesh that is used to render the laser impact point. You keep this functionality in your sensor because it is often useful in debugging services to see exactly where the IR sensor is pointing. The HeightFieldShape is created solely for the purpose of constructing a planar mesh that is two centimeters on a side. The mesh is created and added to the Meshes collection. It is given a texture map called particle.bmp that uses transparency to give the rendered plane a circular appearance:

// set up for rendering impact points
        HeightFieldShapeProperties hf = new HeightFieldShapeProperties(
            "height field", 2, 0.02f, 2, 0.02f, 0, 0, 1, 1);
        hf.HeightSamples =
            new HeightFieldSample[hf.RowCount * hf.ColumnCount];
        for (int i = 0; i < hf.HeightSamples.Length; i++)
            hf.HeightSamples[i] = new HeightFieldSample();

        _particlePlane = new Shape(hf);
        _particlePlane.State.Name = "laser impact plane";

        // The mesh is used to render the ray impact points
        // rather than the sensor geometry.
        int index = Meshes.Count;
        Meshes.Add(SimulationEngine.ResourceCache.CreateMesh(
            device, _particlePlane.State));
        Meshes[index].Textures[0] =
            SimulationEngine.ResourceCache.CreateTextureFromFile(
                device, "particle.bmp");

        if (Effect != null)
            _timeAttenuationHandle = Effect.GetParameter(
                "timeAttenuation");

    }
    catch (Exception ex)
    {
        // clean up
        if (PhysicsEntity != null)
            PhysicsEngine.DeleteEntity(PhysicsEntity);

        HasBeenInitialized = false;
        InitError = ex.ToString();
    }
}

This entity is unusual in that its single mesh is used to render impact points, rather than the geometry of the entity. This won't be a problem for the Corobot because the IR sensors are very small compared to the body of the robot and they are fixed to the body, so it isn't necessary to render them as a separate mesh.

Most of the work that the entity does is in the Update method:

public override void Update(FrameUpdate update)
{
    base.Update(update);
    _elapsedSinceLastScan += (float)update.ElapsedTime;
    _appTime = (float)update.ApplicationTime;
// only retrieve raycast results every SCAN_INTERVAL.
    if ((_elapsedSinceLastScan > SCAN_INTERVAL) &&
        (_raycastProperties != null))
    {

It is a fairly expensive operation to cast rays into the physics environment to determine which physics objects they intersect, so this operation is not done every frame. The SCAN_INTERVAL constant, carried over from the LaserRangeFinderEntity, determines how often the distance value is updated.

The position and orientation of the raycast pattern is set according to the Pose of this entity and the Pose of its parent:

_elapsedSinceLastScan = 0;

        _raycastProperties.OriginPose.Orientation =
            TypeConversion.FromXNA(
                TypeConversion.ToXNA(Parent.State.Pose.Orientation) *
                TypeConversion.ToXNA(State.Pose.Orientation));

        _raycastProperties.OriginPose.Position =
            TypeConversion.FromXNA(
                xna.Vector3.Transform(
                    TypeConversion.ToXNA(State.Pose.Position),
                    Parent.World));

You use the PhysicsEngine Raycast2D API to find the intersections of the rays with physics shapes in the environment. The first set of rays is cast in the horizontal plane:

// cast rays on a horizontal plane and again on a vertical plane
        _raycastResultsPort =
            PhysicsEngine.Raycast2D(_raycastProperties);

If the first raycast was successful, then you cast a second set of rays in the vertical plane to form a cross pattern:

_raycastResultsPort.Test(out _lastResults);
        if (_lastResults != null)
        {

You combine the impact points, if any, from both sets of rays and then find the distance to the closest intersection:

RaycastResult verticalResults;

            // rotate the plane by 90 degrees
            _raycastProperties.OriginPose.Orientation =
                TypeConversion.FromXNA(
                    TypeConversion.ToXNA(
                        _raycastProperties.OriginPose.Orientation) *
                        xna.Quaternion.CreateFromAxisAngle(
                            new xna.Vector3(0, 0, 1),
                            (float)Math.PI / 2f));
_raycastResultsPort =
                PhysicsEngine.Raycast2D(_raycastProperties);
            _raycastResultsPort.Test(out verticalResults);

            // combine the results of the second raycast with the first
            if (verticalResults != null)
            {
                foreach (RaycastImpactPoint impact in
                verticalResults.ImpactPoints)
                    _lastResults.ImpactPoints.Add(impact);
            }

The distance to the closest impact point is the distance you return. If there is no intersection, then MaximumRange is returned.

// find the shortest distance to an impact point
            float minRange = MaximumRange * MaximumRange;
            xna.Vector4 origin = new xna.Vector4(
                TypeConversion.ToXNA(
                    _raycastProperties.OriginPose.Position), 1);

            foreach (RaycastImpactPoint impact in
            _lastResults.ImpactPoints)
            {
                xna.Vector3 impactVector = new xna.Vector3(
                    impact.Position.X - origin.X,
                    impact.Position.Y - origin.Y,
                    impact.Position.Z - origin.Z);

                float impactDistanceSquared =
                    impactVector.LengthSquared();
                if (impactDistanceSquared < minRange)
                    minRange = impactDistanceSquared;
            }
            _distance = (float)Math.Sqrt(minRange);
        }
    }
}
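The distance computation at the end of Update can be summarized as follows: track the minimum squared distance, starting from MaximumRange squared so that an empty impact list falls back to MaximumRange, and take a single square root at the end. A standalone sketch:

```python
import math

def min_distance(origin, impact_points, maximum_range):
    """Shortest distance from the sensor origin to any impact point;
    returns maximum_range when no impact is closer."""
    min_sq = maximum_range * maximum_range
    for px, py, pz in impact_points:
        dx, dy, dz = px - origin[0], py - origin[1], pz - origin[2]
        d_sq = dx * dx + dy * dy + dz * dz
        if d_sq < min_sq:
            min_sq = d_sq
    return math.sqrt(min_sq)

MAX_RANGE = 30.0 * 2.54 / 100.0  # 30 inches in meters, as in the entity

print(round(min_distance((0, 0, 0), [(0.3, 0, 0), (0, 0.5, 0)], MAX_RANGE), 2))  # 0.3
print(round(min_distance((0, 0, 0), [], MAX_RANGE), 3))                          # 0.762
```

Comparing squared distances inside the loop avoids a square root per impact point, which matters when both raycasts return many impacts.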

The final two entity methods render the impact points of the rays:

public override void Render(
    RenderMode renderMode,
    MatrixTransforms transforms,
    CameraEntity currentCamera)
{
    if ((int)(Flags & VisualEntityProperties.DisableRendering) > 0)
        return;

Rendering of the impact points is disabled if the DisableRendering flag is set:

if (_lastResults != null)
        RenderResults(renderMode, transforms, currentCamera);
}

void RenderResults(
    RenderMode renderMode,
    MatrixTransforms transforms,
    CameraEntity currentCamera)
{
    _timeAttenuationHandle.SetValue(
        new xna.Vector4(100 * (float)Math.Cos(
            _appTime * (1.0f / SCAN_INTERVAL)), 0, 0, 1));

This sets a value in the effect that causes the impact points to flash on and off. A local transform matrix is built that rotates the impact point mesh so that it faces the camera:

// render impact points as a quad
    xna.Matrix inverseViewRotation = currentCamera.ViewMatrix;
    inverseViewRotation.M41 =
        inverseViewRotation.M42 =
            inverseViewRotation.M43 = 0;
    xna.Matrix.Invert(ref inverseViewRotation, out inverseViewRotation);
    xna.Matrix localTransform = xna.Matrix.CreateFromAxisAngle(
        new xna.Vector3(1, 0, 0),
        (float)-Math.PI / 2) * inverseViewRotation;
    SimulationEngine.GlobalInstance.Device.RenderState.
        DepthBufferWriteEnable = false;

The DepthBuffer is disabled because these impact points should not occlude other objects. The impact point mesh is adjusted to be a little closer to the ray emitter than the exact impact point:

for (int i = 0; i < _lastResults.ImpactPoints.Count; i++)
    {
        xna.Vector3 pos = new
            xna.Vector3(_lastResults.ImpactPoints[i].Position.X,
                        _lastResults.ImpactPoints[i].Position.Y,
                        _lastResults.ImpactPoints[i].Position.Z);

        xna.Vector3 resultDir = pos - Parent.Position;
        resultDir.Normalize();
        localTransform.Translation = pos - .02f * resultDir;
        transforms.World = localTransform;

This helps the impact points to show up clearly instead of being rendered in the same plane as the shape they intersected.

base.Render(renderMode, transforms, Meshes[0]);
    }
    SimulationEngine.GlobalInstance.Device.RenderState.
        DepthBufferWriteEnable = true;
}
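The two-centimeter adjustment is plain vector arithmetic: normalize the direction from the emitter to the impact point, then step back along it. A standalone sketch:

```python
import math

def pull_toward_origin(point, origin, offset=0.02):
    """Move an impact point `offset` meters back toward the ray origin,
    mirroring the 'pos - .02f * resultDir' line in RenderResults."""
    d = [p - o for p, o in zip(point, origin)]
    length = math.sqrt(sum(c * c for c in d))
    return [p - offset * (c / length) for p, c in zip(point, d)]

# An impact one meter straight ahead is drawn 2 cm closer to the sensor.
print(pull_toward_origin([1.0, 0.0, 0.0], [0.0, 0.0, 0.0]))  # [0.98, 0.0, 0.0]
```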

Now that you've completely defined the CorobotIR entity, you want to add two of them to your Corobot—one in the front and one in the rear. Add the following code to the CorobotEntity nondefault constructor:

InsertEntityGlobal(
    new CorobotIREntity(
        name + "_rearIR",
        new Pose(new Vector3(
            0,
            chassisDimensions.Y / 2.0f + chassisClearance,
            chassisDimensions.Z / 2.0f))));

The default orientation for the IR entity is facing toward the +Z direction. That faces toward the rear of the CorobotEntity, so the rear IR sensor is inserted with a default orientation. The coordinates of the position vector place the sensor in the middle of the rear face of the chassis. The position is specified using world coordinates instead of coordinates relative to the parent entity because the InsertEntityGlobal method is used to add the child entity.

The call to insert the front IR entity is essentially the same except that the position coordinates place it in the center of the front face of the chassis shape, just above the camera:

InsertEntityGlobal(
    new CorobotIREntity(
        name + "_frontIR",
        new Pose(new Vector3(
            0,
            chassisDimensions.Y / 2.0f + chassisClearance,
            -chassisDimensions.Z / 2.0f),
        TypeConversion.FromXNA(
            xna.Quaternion.CreateFromAxisAngle(
                new xna.Vector3(0, 1, 0), (float)Math.PI)))));

In addition, the entity is created with a Pose that rotates it 180 degrees around the +Y axis so that it is facing toward the front of the entity.
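You can verify the effect of this rotation numerically. The following Python sketch is a minimal stand-in for `xna.Quaternion.CreateFromAxisAngle` and quaternion rotation (not the MRDS or XNA API); it confirms that a 180-degree turn about +Y maps the sensor's default +Z facing to -Z:

```python
import math

def axis_angle_quat(axis, angle):
    # Unit quaternion (x, y, z, w) for a rotation of 'angle' radians
    # about the unit vector 'axis'.
    s = math.sin(angle / 2.0)
    return (axis[0] * s, axis[1] * s, axis[2] * s, math.cos(angle / 2.0))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def rotate(q, v):
    # Rotate vector v by unit quaternion q:
    # v' = v + 2w (qv x v) + 2 (qv x (qv x v))
    qv, w = q[:3], q[3]
    t = tuple(2.0 * c for c in cross(qv, v))
    u = cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

q = axis_angle_quat((0.0, 1.0, 0.0), math.pi)
print(rotate(q, (0.0, 0.0, 1.0)))  # approximately (0, 0, -1)
```

Expect tiny floating-point residue rather than exact zeros; the rotated vector is (0, 0, -1) to within machine precision.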

With all of the rotations and transformations going on, it is important to test these new sensors to ensure that they are oriented and mounted correctly:

  1. Run the Corobot manifest. You should see the rendered impact marks from the rear IR sensor flashing on and off on the side of the giant box.

  2. Move the Corobot slightly forward. The impact points should disappear. This indicates that the giant box is farther from the sensor than 30 inches.

  3. Move the entity closer to the giant box until the impact points are again visible.

  4. Start the Simulation Editor by pressing F5, expand the Corobot entity in the Entities pane to see its children, and select the Corobot_rearIR entity.

  5. Set the DisableRendering flag in the entity flags. The impact points should disappear. This verifies that the rear IR sensor is actually generating those impacts.

  6. Check the Distance property of the Corobot_rearIR sensor. It should change as the Corobot moves closer to the giant box.

    Repeat these tests with the front IR sensor to verify that it is also working correctly. Figure 6-12 shows how the laser impact points should appear in the scene when the IR sensor is working properly. (Because this book is printed in black and white, the impact points have been enhanced; on the screen they appear red.)

    Figure 6-12

    Figure 6.12. Figure 6-12

Adding a SimulatedIR Service

Now that you have an IR entity, you also need a SimulatedIR service to go along with it. This one will be much simpler than the SimulatedQuadDifferentialDrive. You're going to use the existing AnalogSensor contract defined in RoboticsCommon.dll to reduce the amount of code you need to write. Unlike the SimulatedQuadDifferentialDrive service, you won't need to implement a _mainPort that supports different operations than the alternate contract. This _mainPort will implement only the AnalogSensor operations. This is analogous to subclassing an existing class and overriding methods to change behavior but adding no additional methods or public variables.

Start an MRDS command prompt and change to the ProMRDS\MyChapter6 directory. Use the following command to generate the SimulatedIR service (bold code indicates something you should type):

C:\Microsoft Robotics Studio (1.5)>cd ProMRDS
C:\Microsoft Robotics Studio (1.5)\ProMRDS>cd MyChapter6
C:\Microsoft Robotics Studio (1.5)\ProMRDS\MyChapter6>dssnewservice /Service:"SimulatedIR"
/Namespace:"ProMRDS.Simulation.SimulatedIR"
/alt:"http://schemas.microsoft.com/robotics/2006/06/analogsensor.html"
/i:"\Microsoft Robotics Studio (1.5)\bin\RoboticsCommon.dll" /year:"07" /month:"08"

As you would expect by now, this generates a service called SimulatedIR, which supports the AnalogSensor contract. Open the newly generated simulatedIR.csproj from the command line so that Visual Studio inherits the MRDS command-line environment. Use the following steps to transform this generic service into a SimulatedIR service. Refer to the completed service in the Chapter6 directory as necessary.

  1. Add the using statements and DLL references required for a simulation service just as you did in the section "The SimulatedQuadDifferentialDrive Service" earlier in this chapter. Don't forget to set the CopyLocal and SpecificVersion properties to false for each reference added.

  2. Add the following additional using statement and a reference to the Corobot service:

    using corobot = ProMRDS.Simulation.Corobot;
  3. Change the DisplayName and Description attributes to describe the service.

  4. Add two private class members to handle subscribing to the simulation engine:

    corobot.CorobotIREntity _entity;
    simengine.SimulationEnginePort _notificationTarget;
  5. Change the AllowMultipleInstances parameter of the ServicePort attribute on the _mainPort from false to true. You want multiple instances of this service running because you have multiple IR sensors to support.

    [ServicePort("/simulatedir", AllowMultipleInstances=true)]
  6. Add a SubscriptionManagerPort to handle interactions with the SubscriptionManager service:

    [Partner("SubMgr",
        Contract = submgr.Contract.Identifier,
        CreationPolicy = PartnerCreationPolicy.CreateAlways)]
    private submgr.SubscriptionManagerPort _submgrPort =
        new submgr.SubscriptionManagerPort();
  7. Add the following code to the Start method to subscribe for a partner entity from the SimulationEngine service and to set up a handler for the notification just as you did in the SimulatedQuadDifferentialDrive service:

    protected override void Start()
    {
        _notificationTarget = new simengine.SimulationEnginePort();
    
        // PartnerType.Service is the entity instance name.
        simengine.SimulationEngine.GlobalInstancePort.Subscribe(
            ServiceInfo.PartnerList, _notificationTarget);
    
        // don't start listening to DSSP operations, other than drop,
        // until notification of entity
        Activate(new Interleave(
            new TeardownReceiverGroup
            (
                Arbiter.Receive<simengine.InsertSimulationEntity>(
                    false,
                    _notificationTarget,
                    InsertEntityNotificationHandlerFirstTime),
                Arbiter.Receive<dssp.DsspDefaultDrop>(
                    false,
                    _mainPort,
                    DefaultDropHandler)
            ),
            new ExclusiveReceiverGroup(),
            new ConcurrentReceiverGroup()
        ));
    
        // start notification method
        SpawnIterator<DateTime>(DateTime.Now, CheckForStateChange);
    }
  8. The SpawnIterator call at the end of the Start method is used to start up a method that periodically checks for a change in the reading from the IR sensor and sends a notification to subscribers if necessary. Add this code for that method. When this method runs, it checks whether the Distance property on the entity has changed. If it has, the service updates its state from the entity and then sends a notification to all subscribed services. The method then sets itself to wake up again after 200 ms have elapsed.

    float _previousDistance = 0;
    
    private IEnumerator<ITask> CheckForStateChange(DateTime timeout)
    {
        while (true)
        {
            if (_entity != null)
            {
                if (_entity.Distance != _previousDistance)
                {
                    // send notification of state change
                    UpdateState();
                    base.SendNotification<Replace>(_submgrPort, _state);
                }
                _previousDistance = _entity.Distance;
            }
            yield return Arbiter.Receive(false, TimeoutPort(200), delegate { });
        }
    }
  9. Add the following code to create a default state for the service and to update the service state from the CorobotIR entity. The state is defined as part of the AnalogSensor contract. It includes a raw measurement, which is equal to the Distance property on the entity; a RawMeasurementRange, which reflects the maximum value the sensor can have; and a NormalizedMeasurement value, which is the raw value normalized against the maximum value:

    private void CreateDefaultState()
    {
        _state.HardwareIdentifier = 0;
        _state.NormalizedMeasurement = 0;
        _state.Pose = new Microsoft.Robotics.PhysicalModel.Proxy.Pose();
        _state.RawMeasurement = 0;
        _state.RawMeasurementRange = _entity.MaximumRange;
    }
    void UpdateState()
    {
        // update our state from the entity
        _state.RawMeasurement = _entity.Distance;
        _state.NormalizedMeasurement =
            _state.RawMeasurement / _state.RawMeasurementRange;
        _state.TimeStamp = DateTime.Now;
    }
  10. Add the following code to receive notifications from the SimulationEngine service. These methods are nearly identical to the corresponding methods in the SimulatedQuadDifferentialDrive service:

    void InsertEntityNotificationHandlerFirstTime(simengine.InsertSimulationEntity ins)
    {
        InsertEntityNotificationHandler(ins);
    
        base.Start();
    
        // Add service-specific initialization here.
        MainPortInterleave.CombineWith(
            new Interleave(
                new TeardownReceiverGroup(),
                new ExclusiveReceiverGroup(
                    Arbiter.Receive<simengine.InsertSimulationEntity>(
                        true,
                        _notificationTarget,
                        InsertEntityNotificationHandler),
                    Arbiter.Receive<simengine.DeleteSimulationEntity>(
                        true,
                        _notificationTarget,
                        DeleteEntityNotificationHandler)
                ),
                new ConcurrentReceiverGroup()
            )
        );
    }
    
    void InsertEntityNotificationHandler(simengine.InsertSimulationEntity ins)
    {
        _entity = (corobot.CorobotIREntity)ins.Body;
        _entity.ServiceContract = Contract.Identifier;
    
        CreateDefaultState();
    }
    void DeleteEntityNotificationHandler(simengine.DeleteSimulationEntity del)
    {
        _entity = null;
    }
  11. Add a call to UpdateState before the state is posted to the response port in the GetHandler method:

    [ServiceHandler(ServiceHandlerBehavior.Concurrent)]
    public virtual IEnumerator<ITask> GetHandler(pxanalogsensor.Get get)
    {
        UpdateState();
        get.ResponsePort.Post(_state);
        yield break;
    }
  12. Add a call to SubscribeHelper in the SubscribeHandler to manage subscription requests. Handling subscriptions is described in Service Tutorial 4 in the SDK documentation.

    public virtual IEnumerator<ITask> SubscribeHandler(pxanalogsensor.Subscribe subscribe)
    {
        SubscribeHelper(
            _submgrPort,
            subscribe.Body,
            subscribe.ResponsePort);
        yield break;
    }
  13. Add the following entries to the Corobot.manifest.xml file to start two copies of this new service. In the Corobot service, you gave the name "Corobot_frontIR" to the front IR sensor and "Corobot_rearIR" to the rear sensor. The entity partners associated with each of these services have those same names. You also used the <dssp:Service> attribute to specify a name for the service so that you can distinguish the front IR service from the rear one.

    <!-- Start Front IR service -->
    <ServiceRecordType>
      <dssp:Contract>http://schemas.tempuri.org/2007/08/simulatedir.html</dssp:Contract>
      <dssp:Service>http://localhost/Corobot/FrontIR</dssp:Service>
      <dssp:PartnerList>
        <dssp:Partner>
          <!-- The partner name must match the entity name -->
          <dssp:Service>http://localhost/Corobot_frontIR</dssp:Service>
          <dssp:Name>simcommon:Entity</dssp:Name>
        </dssp:Partner>
      </dssp:PartnerList>
    </ServiceRecordType>
    
    <!-- Start Rear IR service -->
    <ServiceRecordType>
      <dssp:Contract>http://schemas.tempuri.org/2007/08/simulatedir.html</dssp:Contract>
      <dssp:Service>http://localhost/Corobot/RearIR</dssp:Service>
      <dssp:PartnerList>
        <dssp:Partner>
          <!-- The partner name must match the entity name -->
          <dssp:Service>http://localhost/Corobot_rearIR</dssp:Service>
          <dssp:Name>simcommon:Entity</dssp:Name>
        </dssp:Partner>
      </dssp:PartnerList>
    </ServiceRecordType>

    That's it. You should now have a working SimulatedIR service. Verify it by running the Corobot manifest. Start a browser window, navigate to http://localhost:50000, and select Service Directory from the left column. You should see something similar to Figure 6-13. Each service is listed twice because DssHost adds two entries to the service directory for services that support an alternate contract: one for their own contract and one for the alternate contract they support. In this case, both entries refer to the same port. Click each of the SimulatedIR services to see its current state. As you drive the Corobot around the environment, refresh the service state to verify that the RawMeasurement field correctly reflects the distance from that IR sensor to the giant box.

Figure 6-13

Figure 6.13. Figure 6-13
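Stripped of the DSS plumbing, the logic from steps 8 and 9 reduces to "notify only when the reading changes, and report both raw and normalized values." A minimal sketch of that pattern (Python rather than C#, with invented names):

```python
def notifications(readings, max_range):
    """Yield an AnalogSensor-style state only when the raw reading changes,
    mirroring CheckForStateChange's compare-against-_previousDistance test
    and UpdateState's normalization against RawMeasurementRange."""
    previous = None
    for raw in readings:
        if raw != previous:
            yield {"raw": raw, "normalized": raw / max_range}
        previous = raw

# Three polls, but the middle reading is unchanged, so subscribers
# receive only two notifications.
for state in notifications([0.3, 0.3, 0.6], max_range=1.0):
    print(state)
```

The real service does the same comparison every 200 ms; suppressing unchanged readings keeps subscription traffic proportional to how often the world actually changes, not to the polling rate.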

Summary

That completes the basic functionality for the Corobot entity and its associated services. This entity can now be used in a variety of simulation scenarios, such as the Robo-Magellan scenario covered in the next chapter.

This chapter began with an overview of the methods and types provided by the various simulation DLLs, including the characteristics of the VisualEntity type, which is fundamental to creating new simulation entities. The characteristics of a simulation service were also described, including their relationship to orchestration services such as the SimMagellan service.

You created a new simulation service called Corobot that added entities to the simulation environment. This eventually included a model of the Corobot robot. You then defined a SimulatedQuadDifferentialDrive service and used it to drive the Corobot around the environment using the SimpleDashboard service. You tuned the top speed of the entity and the tire friction and then you made the wheels turn as the robot moves. Finally, you added a detailed 3D mesh to the Corobot model to make it look more realistic, and then you added a camera and defined a Simulated IR Distance sensor entity and service so that you could add IR sensors to the front and rear of the Corobot.

The entities and services defined in this chapter provide examples for you to use as you create your own custom entities and their associated services.
