
Book Description

Learn how to write NIUI-based applications and motion-controlled games

  • Use OpenNI for all your needs from games and application UI to low-level data processing or motion detection
  • Learn more about the Natural Interaction features of OpenNI
  • Useful for both beginners and professionals, covering OpenNI concepts from the most basic to the most advanced
  • Full of illustrations, examples, and tips for understanding different aspects of topics, with clear step-by-step instructions to get different parts of OpenNI working for you

In Detail

The release of the Microsoft Kinect, followed by the PrimeSense Sensor and the Asus Xtion, opened new doors for developers to interact with users, redesign their applications' UIs, and make them environment (context) aware. To do so, developers need a good framework that provides a complete application programming interface (API), and OpenNI is the first choice in this field. This book introduces the new version of OpenNI.

"OpenNI Cookbook" will show you how to start developing a Natural Interaction UI for your applications or games with high level APIs and at the same time access RAW data from different sensors of different hardware supported by OpenNI using low level APIs. It also deals with expanding OpenNI by writing new modules and expanding applications using different OpenNI compatible middleware, including NITE.

"OpenNI Cookbook" favors practical examples over plain theory, giving you a more hands-on experience to help you learn. OpenNI Cookbook starts with information about installing devices and retrieving RAW data from them, and then shows how to use this data in applications. You will learn how to access a device or how to read data from it and show them using OpenGL, or use middleware (especially NITE) to track and recognize users, hands, and guess the skeleton of a person in front of a device, all through examples.You also learn about more advanced aspects such as how to write a simple module or middleware for OpenNI itself.

"OpenNI Cookbook" shows you how to start and experiment with both NIUI designs and OpenNI itself using examples.

Table of Contents

  1. OpenNI Cookbook
    1. Table of Contents
    2. OpenNI Cookbook
    3. Credits
    4. About the Author
    5. About the Reviewers
    6. www.PacktPub.com
      1. Support files, eBooks, discount offers, and more
        1. Why Subscribe?
        2. Free Access for Packt account holders
    7. Preface
      1. What this book covers
      2. What you need for this book
      3. Who this book is for
      4. Conventions
      5. Reader feedback
      6. Customer support
        1. Downloading the example code
        2. Errata
        3. Piracy
        4. Questions
    8. 1. Getting Started
      1. Introduction
        1. Introduction to the "Introduction"
        2. Motion-capture devices and the technologies behind them
        3. What is OpenNI?
        4. What is NiTE?
        5. Developing applications and games with the Natural Interactive User Interface
      2. Downloading and installing OpenNI
        1. How to do it...
        2. How it works...
        3. See also
      3. Downloading and installing NiTE
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      4. Downloading and installing the Microsoft Kinect SDK
        1. How to do it...
        2. How it works...
        3. See also
      5. Connecting Asus Xtion and PrimeSense sensors
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      6. Connecting Microsoft Kinect
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
    9. 2. OpenNI and C++
      1. Introduction
        1. The OpenNI object
        2. The device object
        3. The VideoStream object
          1. Sharing devices between applications
          2. VideoStream paused state
      2. Creating a project in Visual Studio 2010
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
        5. See also
      3. OpenNI class and error handling
        1. Getting ready
        2. How to do it...
        3. How it works...
          1. Defining a method for displaying error message
          2. Possible values of openni::Status
      4. Enumerating a list of connected devices
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
          1. List of known Product IDs and Vendor IDs at the time of writing this book
        5. See also
      5. Accessing video streams (depth/IR/RGB) and configuring them
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
          1. Pixel formats
          2. Known supported list of resolutions of each sensor in different devices
        5. See also
      6. Retrieving a list of supported video modes for depth stream
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      7. Selecting a specific device for accessing depth stream
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      8. Listening to the device connect and disconnect events
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
          1. Device state changed event
          2. Stop listening to events
        5. See also
      9. Opening an already recorded file (ONI file) instead of a device
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
        5. See also
    10. 3. Using Low-level Data
      1. Introduction
        1. VideoFrameRef object
        2. Back to the OpenNI object again
      2. Configuring Visual Studio 2010 to use OpenGL
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
          1. GLUT alternatives
      3. Initializing and preparing OpenGL
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      4. Reading and showing a frame from the image sensor (color/IR)
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      5. Reading and showing a frame from the depth sensor
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
          1. Histogram equalization – better details in the same color space
          2. Wider color space for showing more details
          3. Filling shadows
        5. See also
      6. Controlling the player when opening a device from file
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      7. Recording streams to file (ONI file)
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      8. Event-based reading of data
        1. Getting ready
        2. How to do it...
        3. How it works...
    11. 4. More about Low-level Outputs
      1. Introduction
        1. The openni::Device object
        2. The openni::VideoStream object
        3. The openni::CoordinateConverter class
        4. The openni::CameraSettings object
      2. Cropping and mirroring frames right from the buffer
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      3. Syncing image and depth sensors to read new frames from both streams at the same time
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      4. Overlaying the depth frame over the image frame
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      5. Converting the depth unit to millimetre
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
        5. See also
      6. Retrieving the color of the nearest point without depth over color registration
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      7. Enabling/disabling auto exposure and auto white balance
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
        5. See also
    12. 5. NiTE and User Tracking
      1. Introduction
        1. The nite::NiTE object
        2. The nite::UserTracker object
        3. The nite::UserTrackerFrameRef object
        4. The nite::UserMap object
        5. The nite::UserData object
      2. Getting a list of all the active users
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      3. Identifying and coloring users' pixels in depth map
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      4. Reading users' bounding boxes and center of mass
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. There's more...
        5. See also
      5. Event-based reading of users' data
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
    13. 6. NiTE and Hand Tracking
      1. Introduction
        1. The nite::HandTracker object
        2. The nite::HandTrackerFrameRef object
        3. The nite::HandData object
        4. The nite::GestureData object
        5. Compared to skeleton tracking
      2. Recognizing predefined hand gestures
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      3. Tracking hands
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      4. Finding the related user ID for each hand ID
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      5. Event-based reading of hands' data
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      6. Working sample for controlling the mouse by hand
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
    14. 7. NiTE and Skeleton Tracking
      1. Introduction
        1. The nite::UserTracker object
        2. The nite::PoseData object
        3. The nite::Skeleton object
        4. The nite::SkeletonJoint object
        5. The nite::UserData object
      2. Detecting a user's pose
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      3. Getting a user's skeleton joints and displaying their position in the depth map
        1. Getting ready
        2. How to do it...
        3. How it works...
        4. See also
      4. Designing a simple pong game using skeleton tracking
        1. How it works...
        2. See also
    15. Index