Accessing video streams (depth/IR/RGB) and configuring them

In OpenNI 2, a single class is responsible for giving us access to the output of all video-based sensors (depth/IR/RGB), which makes our work much simpler than in the OpenNI 1.x era, where we needed three different classes to access the sensors. In this recipe we will show you how to access and initialize the depth sensor; accessing the IR and RGB sensors follows the same procedure, which we will discuss further in the How it works… section of this recipe. We will also show you how to select an output video mode for a sensor, and how to ask a device whether an output is supported.

We will not cover other configurable properties of the openni::VideoStream class, such as cropping and mirroring, in this recipe; read Chapter 4, More about Low-level Outputs, for these topics.

Getting ready

Create a project in Visual Studio 2010 and prepare it for working with OpenNI using the Creating a project in Visual Studio 2010 recipe in this chapter.

How to do it...

Have a look at the following steps:

  1. Open your project and then the project's main source code file. Locate this line:
    int _tmain(int argc, _TCHAR* argv[])
    {
  2. Write the following code snippet above the preceding line of code:
    char ReadLastCharOfLine()
    {
      int newChar = 0;
      int lastChar;
      fflush(stdout);
      do
      {
        lastChar = newChar;
        newChar = getchar();
      }
      while ((newChar != '\n') && (newChar != EOF));
      return (char)lastChar;
    }
    
    bool HandleStatus(Status status)
    {
      if (status == STATUS_OK)
        return true;
      printf("ERROR: #%d, %s", status,
        OpenNI::getExtendedError());
      ReadLastCharOfLine();
      return false;
    }
  3. Locate this line again:
    int _tmain(int argc, _TCHAR* argv[])
    {
  4. Write the following code snippet below the preceding line of code:
      Status status = STATUS_OK;
      printf("Scanning machine for devices and loading "
          "modules/drivers ...\n");
      status = OpenNI::initialize();
      if (!HandleStatus(status)) return 1;
      printf("Completed.\n");

      printf("Press ENTER to continue.\n");
      ReadLastCharOfLine();

      printf("Opening any device ...\n");
      Device device;
      status = device.open(ANY_DEVICE);
      if (!HandleStatus(status)) return 1;
      printf("%s Opened, Completed.\n",
        device.getDeviceInfo().getName());

      printf("Press ENTER to continue.\n");
      ReadLastCharOfLine();

      printf("Checking if depth stream is supported ...\n");
      if (!device.hasSensor(SENSOR_DEPTH))
      {
        printf("Depth stream not supported by this device. "
            "Press ENTER to exit.\n");
        ReadLastCharOfLine();
        return 1;
      }

      printf("Asking device to create a depth stream ...\n");
      VideoStream sensor;
      status = sensor.create(device, SENSOR_DEPTH);
      if (!HandleStatus(status)) return 1;
      printf("Completed.\n");

      printf("Changing sensor video mode to 640x480@30fps.\n");
      VideoMode depthVM;
      depthVM.setFps(30);
      depthVM.setResolution(640, 480);
      depthVM.setPixelFormat(PIXEL_FORMAT_DEPTH_1_MM);
      status = sensor.setVideoMode(depthVM);
      if (!HandleStatus(status)) return 1;
      printf("Completed.\n");

      printf("Asking sensor to start receiving data ...\n");
      status = sensor.start();
      if (!HandleStatus(status)) return 1;
      printf("Completed.\n");

      printf("Press ENTER to exit.\n");
      ReadLastCharOfLine();
      sensor.destroy();
      device.close();
      OpenNI::shutdown();
      return 0;

How it works...

First we defined our ReadLastCharOfLine() and HandleStatus() functions, just as we did previously; read the previous recipe for details.

Then, at the beginning of our main function, we used the openni::OpenNI::initialize() method to initialize OpenNI and load modules and drivers. Again, you can read the previous recipe for more information.

Our main code really starts when we define a variable of type openni::Device. Using this variable, we open the first device in OpenNI's list of connected devices. We also check (with the HandleStatus() function) whether this step completed without error, so we can either continue, or print the error to the console and return 1.

  Device device;
  status = device.open(ANY_DEVICE);
  if (!HandleStatus(status)) return 1;

From now on, using the device variable, we can request access to a depth sensor (or any other type we want); but before doing so, it is a good idea to check whether this type of sensor is supported by the device at all.

  if (!device.hasSensor(SENSOR_DEPTH))
  {
    printf("Depth stream not supported by this device. "
        "Press ENTER to exit.\n");
    ReadLastCharOfLine();
    return 1;
  }

Note the SensorType enum in the code; currently there are three types of video sensors that we can use or request:

  • openni::SensorType::SENSOR_COLOR: RGB camera
  • openni::SensorType::SENSOR_DEPTH: Depth data
  • openni::SensorType::SENSOR_IR: IR output from IR camera
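
Building on this enum, a device can be probed for all three sensor types in one loop. This is only a sketch; it assumes an already-opened openni::Device named device, as in the recipe code:

```cpp
// Report which of the three video sensors this device offers.
// Assumes OpenNI.h is included, OpenNI::initialize() succeeded,
// and `device` is an opened openni::Device.
const SensorType types[] = { SENSOR_COLOR, SENSOR_DEPTH, SENSOR_IR };
const char* names[] = { "Color", "Depth", "IR" };
for (int i = 0; i < 3; ++i)
  printf("%s sensor: %s\n", names[i],
    device.hasSensor(types[i]) ? "supported" : "not supported");
```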

Any line after the preceding condition will run only if our desired sensor type is supported by the device. If it is, we can request access to this sensor; to do so, we create a variable of type openni::VideoStream, ask it to initialize for our device's depth stream, and, of course, handle any error that may occur.

  VideoStream sensor;
  status = sensor.create(device, SENSOR_DEPTH);
  if (!HandleStatus(status)) return 1;

This gives us access to the depth sensor with its default settings; but since we want a specific video mode for the output of this sensor, we need to change this configuration.

To do so, we create a variable of type openni::VideoMode, configure it the way we want, and then pass it to our sensor. Again, we must take care of any error in this process.

  VideoMode depthVM;
  depthVM.setFps(30);
  depthVM.setResolution(640,480);
  depthVM.setPixelFormat(PIXEL_FORMAT_DEPTH_1_MM);
  status = sensor.setVideoMode(depthVM);
  if (!HandleStatus(status)) return 1;

This code requests output with a resolution of 640x480 at 30 frames per second and the PIXEL_FORMAT_DEPTH_1_MM pixel format. Read more about pixel formats in the There's more… section.
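
setVideoMode() fails if the device does not offer the requested mode, so it is safer to check first. A sketch using the openni::SensorInfo API, again assuming an already-opened Device named device (it needs real hardware to run):

```cpp
// List every video mode the depth sensor supports, and check whether
// 640x480@30fps with PIXEL_FORMAT_DEPTH_1_MM is among them.
const SensorInfo* info = device.getSensorInfo(SENSOR_DEPTH);
const Array<VideoMode>& modes = info->getSupportedVideoModes();
bool found = false;
for (int i = 0; i < modes.getSize(); ++i)
{
  const VideoMode& vm = modes[i];
  printf("%dx%d @ %d fps, format %d\n", vm.getResolutionX(),
    vm.getResolutionY(), vm.getFps(), (int)vm.getPixelFormat());
  if (vm.getResolutionX() == 640 && vm.getResolutionY() == 480 &&
      vm.getFps() == 30 &&
      vm.getPixelFormat() == PIXEL_FORMAT_DEPTH_1_MM)
    found = true;
}
```

Only call sensor.setVideoMode() when found is true; otherwise pick one of the printed modes instead.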

Once we are done configuring our sensor, we can ask it to start receiving data. This involves requesting that the device start the sensor and send the required data to the machine; there was no real communication with the physical device before this part of our code.

  status = sensor.start();
  if (!HandleStatus(status)) return 1;

The next step would be to read data from the sensor so as to use or display it; we don't cover that topic here. We simply wait for user input and then end our application. Of course, before ending, we ask the sensor and device to release their resources, and then ask openni::OpenNI to shutdown().

  ReadLastCharOfLine();
  sensor.destroy();
  device.close();
  OpenNI::shutdown();
  return 0;

There's more...

Read the How it works... section of this recipe to learn how to use the openni::SensorType enum to select the desired sensor when creating the openni::VideoStream object. You also need to select a supported openni::PixelFormat for the type of sensor you selected; for example, you can't request openni::PixelFormat::PIXEL_FORMAT_DEPTH_1_MM as the format for receiving data from the color sensor. Read the next topic for more information.

You can also define multiple variables of type openni::VideoStream and use more than one sensor at a time. But keep in mind that creating (in other words, requesting) the same sensor twice results in the same underlying object with two wrappers; any change to one will be mirrored in the other.

Also, it is impossible to have both the IR and color streams active at the same time; you must stop one before starting the other. This limitation seems to come from the limited bandwidth of USB 2.0. It is also important to know that depth output is based on the IR CMOS sensor, so you can't have IR and depth active with different resolutions; you must always keep their resolutions the same. The only exception is when using IR at 1280x1024 resolution; in this case, depth must be 640x480 at 30 fps. This exception applies only to the Asus Xtion and PrimeSense sensors.
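
The IR/depth pairing rule above can be encoded as a plain predicate. This is only a sketch of the constraint as stated here, with no OpenNI calls involved, and real devices may behave differently:

```cpp
// Returns true when the given IR and depth modes can be active
// together, per the rule above: resolutions must match, with the
// single exception of IR 1280x1024 paired with depth 640x480@30fps
// (Asus Xtion / PrimeSense only).
bool irDepthModesCompatible(int irW, int irH,
                            int depthW, int depthH, int depthFps)
{
    if (irW == 1280 && irH == 1024)
        return depthW == 640 && depthH == 480 && depthFps == 30;
    return irW == depthW && irH == depthH;
}
```

Running such a check before starting the second stream gives a clearer error than waiting for the driver to refuse the combination.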

Pixel formats

In the current version of OpenNI (2.2) there are ten pixel formats that can be used to read data from video streams; we will describe some of them more specifically in later chapters (when we actually need them to read data from streams), but for now you can get an idea from the following list:

  • openni::PixelFormat::PIXEL_FORMAT_DEPTH_1_MM (depth stream): This is the usual way to read depth data from the stream. The values are per depth pixel, with 1mm accuracy.

  • openni::PixelFormat::PIXEL_FORMAT_DEPTH_100_UM (depth stream): This is the same as PIXEL_FORMAT_DEPTH_1_MM but with 0.1mm (100 micrometre) accuracy. It is not supported by any currently released sensor (at the time of writing of this book).

  • openni::PixelFormat::PIXEL_FORMAT_SHIFT_9_2 (depth stream): This is the value of the displacement between the projected pattern and the device's view of that pattern in the environment. It is the RAW output of the depth stream; the driver uses it to produce PIXEL_FORMAT_DEPTH_1_MM.

  • openni::PixelFormat::PIXEL_FORMAT_SHIFT_9_3 (depth stream): This is the same as PIXEL_FORMAT_SHIFT_9_2 but with more accuracy; the driver uses it to produce PIXEL_FORMAT_DEPTH_100_UM. It is not supported by any currently released sensor (at the time of writing of this book).

  • openni::PixelFormat::PIXEL_FORMAT_RGB888 (color and IR streams): This can be used for the color and IR streams to generate data in a 24-bit bitmap format. It is usually used for the color stream as the main usable output format, and rarely for IR, because Grayscale 16-bit gives a more detailed output when reading from the IR stream, with less bandwidth and memory usage.

  • openni::PixelFormat::PIXEL_FORMAT_YUV422 (color stream): YCbCr (commonly called YUV422) is a way to encode RGB data that reduces redundancy, shrinking an image to 2/3 of its RGB bitmap size. Using this output format, we need less memory to manipulate the data and the device needs less bandwidth to send it. But for displaying to the user this format is not very usable, as a number of calculations are needed to convert it back to RGB values. In YUV422, the byte order is UY1VY2.

  • openni::PixelFormat::PIXEL_FORMAT_GRAY8 (color stream): Grayscale 8-bit stores the average of the three RGB values as one single value. It is usable when we don't need color, so we avoid wasting memory and bandwidth on receiving and manipulating unneeded data.

  • openni::PixelFormat::PIXEL_FORMAT_GRAY16 (IR stream): Grayscale 16-bit is usable only for reading data from the IR stream. It has more detail than Grayscale 8-bit (256 times more levels).

  • openni::PixelFormat::PIXEL_FORMAT_JPEG (not supported yet): This is not supported by any currently released device (at the time of writing of this book). It is expected to be used with the color sensor to receive JPEG data directly from the device.

  • openni::PixelFormat::PIXEL_FORMAT_YUYV (color stream): This is the same as PIXEL_FORMAT_YUV422 but with a different byte order. The byte order of YUYV (also known as YUY2) is Y1UY2V.
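
To make the UY1VY2 byte order concrete, here is a minimal sketch of decoding one YUV422 macro-pixel into two RGB pixels using the common BT.601 full-range equations; the helper names here are ours, not part of OpenNI:

```cpp
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

static uint8_t clamp255(int v)
{
    return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

// BT.601 full-range YCbCr -> RGB for one pixel.
static Rgb yuvToRgb(int y, int u, int v)
{
    Rgb p;
    p.r = clamp255((int)(y + 1.402 * (v - 128)));
    p.g = clamp255((int)(y - 0.344 * (u - 128) - 0.714 * (v - 128)));
    p.b = clamp255((int)(y + 1.772 * (u - 128)));
    return p;
}

// uyvy points to 4 bytes in UY1VY2 order; out receives 2 RGB pixels
// that share the same chroma (U, V) bytes.
void decodeUyvy(const uint8_t* uyvy, Rgb out[2])
{
    out[0] = yuvToRgb(uyvy[1], uyvy[0], uyvy[2]);
    out[1] = yuvToRgb(uyvy[3], uyvy[0], uyvy[2]);
}
```

With both chroma bytes at 128 the chroma terms cancel, so each output channel simply equals its luma byte, which makes the decoder easy to sanity-check.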

Known supported resolutions of each sensor in different devices

The following is a list of known supported resolutions of each sensor along with their fps in different devices:

  • Asus Xtion / PrimeSense Sensor:
      - Depth: 320x240 at 25/30/60 fps; 640x480 at 25/30 fps
      - Image / IR: 320x240 at 25/30/60 fps; 640x480 at 25/30 fps; 1280x1024 at 30 fps
  • Kinect / Kinect for Windows:
      - Depth: 80x60 at 30 fps; 320x240 at 30 fps; 640x480 at 30 fps
      - Image / IR: 640x480 at 30 fps; 1280x960 at 12 fps

See also

  • The Reading and showing a frame from the depth sensor recipe in Chapter 3, Using Low-level Data
  • The Reading and showing a frame from the image sensor (color / IR) recipe in Chapter 3, Using Low-level Data