Tracking hand gestures in real time

Hand gestures are analyzed by the HandGestureRecognition class, most importantly by its recognize method. The class starts off with a few parameter initializations, which will be explained and used later:

class HandGestureRecognition:
    def __init__(self):
        # maximum depth deviation for a pixel to be considered
        # within range
        self.abs_depth_dev = 14

        # cut-off angle (deg): everything below this is a convexity
        # defect that belongs to two extended fingers
        self.thresh_deg = 80.0

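To illustrate what abs_depth_dev controls, the following minimal sketch builds a binary mask of pixels whose depth lies within that deviation of a reference depth. Note that this is an illustration of the depth-window idea only, not the class's actual _segment_arm implementation; the function name depth_window_mask and the sample values are hypothetical:

```python
import numpy as np

abs_depth_dev = 14

def depth_window_mask(depth, center_depth, dev=abs_depth_dev):
    """Binary mask (255/0) of pixels whose depth lies within
    +/- dev of the reference depth center_depth."""
    # cast to a signed type so the subtraction cannot wrap around
    diff = np.abs(depth.astype(np.int16) - center_depth)
    return np.where(diff <= dev, 255, 0).astype(np.uint8)

# toy 3x3 depth map; assume the hand sits at depth ~100
depth = np.array([[100, 110, 130],
                  [105, 100, 200],
                  [ 90, 115,  50]], dtype=np.uint8)
mask = depth_window_mask(depth, center_depth=100)
```

Here only the pixels within 14 depth units of the reference value (100) survive in the mask; everything farther away, such as the background at depth 200, is zeroed out.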
The recognize method is where the real magic takes place. This method handles the entire process flow, from the raw grayscale image all the way to a recognized hand gesture. It implements the following procedure:

  1. It extracts the user's hand region by analyzing the depth map (img_gray) and returning a hand region mask (segment):
    def recognize(self, img_gray):
        segment = self._segment_arm(img_gray)
  2. It performs contour analysis on the hand region mask (segment). Then, it returns the largest contour area found in the image (contours) and any convexity defects (defects):
    [contours, defects] = self._find_hull_defects(segment)
  3. Based on the contours found and the convexity defects, it detects the number of extended fingers (num_fingers) in the image. Then, it annotates the output image (img_draw) with contours, defect points, and the number of extended fingers:
    img_draw = cv2.cvtColor(img_gray, cv2.COLOR_GRAY2RGB)
    [num_fingers, img_draw] = self._detect_num_fingers(contours,
            defects, img_draw)
  4. It returns the estimated number of extended fingers (num_fingers), as well as the annotated output image (img_draw):
    return (num_fingers, img_draw)
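To see how the thresh_deg cut-off from the constructor comes into play, the sketch below computes the angle at a convexity defect with the law of cosines: a deep, narrow defect (small angle) typically lies between two extended fingers, while a shallow one does not. This is a standalone illustration, not the class's actual _detect_num_fingers code, and the helper name angle_deg is hypothetical:

```python
import numpy as np

def angle_deg(start, end, far):
    """Angle (in degrees) at the defect point `far`, between the
    segments far->start and far->end, via the law of cosines."""
    a = np.linalg.norm(np.subtract(end, start))   # fingertip-to-fingertip
    b = np.linalg.norm(np.subtract(far, start))   # defect to first tip
    c = np.linalg.norm(np.subtract(far, end))     # defect to second tip
    return np.degrees(np.arccos((b**2 + c**2 - a**2) / (2 * b * c)))

thresh_deg = 80.0

# deep defect between two spread fingers: small angle, below the cut-off
between_fingers = angle_deg((0, 0), (40, 0), (20, 60)) < thresh_deg
# shallow defect along the palm outline: wide angle, above the cut-off
along_palm = angle_deg((0, 0), (40, 0), (20, 10)) < thresh_deg
```

Counting only the defects whose angle falls below thresh_deg is what lets the final step map convexity defects to the number of extended fingers.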