Knowing your mood using the Face API

The Face API allows you to recognize emotions from faces.

Research has shown that there are some key emotions that can be classified as cross-cultural. These are happiness, sadness, surprise, anger, fear, contempt, disgust, and neutral. All of these are detected by the API, which allows your applications to respond in a more personalized way by knowing the user's mood.

We will learn how to recognize emotions from images so that our smart-house application can know our mood.

Getting images from a web camera

Imagine that there are several cameras around your house. The smart-house application can see what your mood is at any time. By knowing this, it can utilize the mood to better predict your needs.

We are going to add web-camera capabilities to our application. If you do not have a web camera, you can follow along, but load images using the techniques we have already seen.

First we need to add a NuGet package to our smart-house application. Search for OpenCvSharp3-AnyCPU and install the package by shimat. This is a package that allows for the processing of images, and is utilized by the next dependency we are going to add.

In the example code provided, there is a project called VideoFrameAnalyzer. This is a project written by Microsoft that allows us to grab frame-by-frame images from a web camera. Using this, we are able to analyze emotions in our application. The use case we will execute is as follows:

[Image: use-case diagram for getting images from a web camera]

In our HomeView.xaml file, add two new buttons. One will be to start the web camera while the other will be to stop it.
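The two buttons might be declared as follows. This is a sketch; the labels and layout are assumptions, while the command names match those we create later in the Initialization function:

```xaml
<Button Content="Start camera"
        Command="{Binding StartCameraCommand}" />
<Button Content="Stop camera"
        Command="{Binding StopCameraCommand}" />
```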

In the corresponding View model, add an ICommand property for each of the two buttons. Also add the following private members:

    private FrameGrabber<CameraResult> _frameGrabber;
    private static readonly ImageEncodingParam[] s_jpegParams = {
        new ImageEncodingParam(ImwriteFlags.JpegQuality, 60)
    };
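The two ICommand properties might look like the following sketch. The property names match those used when we create the commands later, and DelegateCommand is the command implementation used elsewhere in this project:

```csharp
public ICommand StartCameraCommand { get; private set; }
public ICommand StopCameraCommand { get; private set; }
```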

The first one is a FrameGrabber object, which is from the VideoFrameAnalyzer project. The static member is an array of parameters for images, and is used when fetching web camera images. Additionally, we need to add a CameraResult class, which should be within the ViewModel file.

We initialize the EmotionScores to null, as shown in the following code. This is done so that new emotion scores are always assigned from the most recent analysis result:

    internal class CameraResult {
        public EmotionScores EmotionScores { get; set; } = null;
    }

Add an initialization of the _frameGrabber member in the constructor and add the following in the Initialization function:

    _frameGrabber.NewFrameProvided += OnNewFrameProvided;

Each time a new frame is provided from the camera, an event is raised.

When we receive a new frame, we want to create a BitmapImage from it to display it in the UI. Doing so requires us to invoke the action on the current dispatcher, as the event is triggered from a background thread, as shown in the following code:

    private void OnNewFrameProvided(object sender, FrameGrabber<CameraResult>.NewFrameEventArgs e) {          
        Application.Current.Dispatcher.Invoke(() => {
            BitmapSource bitmapSource = e.Frame.Image.ToBitmapSource();

            JpegBitmapEncoder encoder = new JpegBitmapEncoder();
            MemoryStream memoryStream = new MemoryStream();
            BitmapImage image = new BitmapImage();

We get the BitmapSource of the Frame and create some required variables.

Using the encoder we created, we add the bitmapSource and save it to the memoryStream, as follows:

    encoder.Frames.Add(BitmapFrame.Create(bitmapSource));
    encoder.Save(memoryStream);

This memoryStream is then assigned to the BitmapImage we created, as shown in the following code. This is in turn assigned to the ImageSource, which will show the frame in the UI:

    memoryStream.Position = 0;
    image.BeginInit();
    image.CacheOption = BitmapCacheOption.OnLoad;
    image.StreamSource = memoryStream;
    image.EndInit();

    memoryStream.Close();
    ImageSource = image;
    });
}

As this event will be triggered frequently, the UI will receive a fluid stream of images, making it appear as a live video feed.

In our Initialization function, we will also need to create our ICommand for the buttons, as follows:

    StopCameraCommand = new DelegateCommand(StopCamera);
    StartCameraCommand = new DelegateCommand(StartCamera, CanStartCamera);

To be able to start the camera, we need to have selected a person group, and we need to have at least one camera available:

    private bool CanStartCamera(object obj) {
        return _frameGrabber.GetNumCameras() > 0 && SelectedPersonGroup != null;
    }

To start a camera, we need to specify which camera to use and how often we want to trigger an analysis using the following code:

    private async void StartCamera(object obj) {
        _frameGrabber.TriggerAnalysisOnInterval(TimeSpan.FromSeconds(5));
        await _frameGrabber.StartProcessingCameraAsync();
    }

If no camera is specified in StartProcessingCameraAsync, the first one available is chosen by default.

We will get back to the analysis part of this process soon.

To stop the camera, we run the following command:

    private async void StopCamera(object obj) {
        await _frameGrabber.StopProcessingAsync();
    }

Letting the smart house know your mood

We now have a live video feed from the web camera available for our use.

The FrameGrabber class exposes a Func delegate, AnalysisFunction, which is invoked to perform the analysis. We need to create the function to assign to it, which will enable emotions to be recognized.

Create a new function, EmotionAnalysisAsync, that accepts a VideoFrame as a parameter. The return type should be Task<CameraResult> and the function should be marked as async.

The frame we get as a parameter is used to create a MemoryStream containing the current frame, in JPEG format. We will detect a face in this image, specifying that we want the emotion attributes returned, using the following code:

    private async Task<CameraResult> EmotionAnalysisAsync(VideoFrame frame) {
        MemoryStream jpg = frame.Image.ToMemoryStream(".jpg", s_jpegParams);
        try {
            Face[] face = await _faceServiceClient.DetectAsync(jpg, true, false,
                new List<FaceAttributeType> { FaceAttributeType.Emotion });
            EmotionScores emotions = face.FirstOrDefault()?.FaceAttributes?.Emotion;

A successful call will result in an object containing all the emotion scores, as shown in the following code. The scores are what we want to return:

    return new CameraResult {
        EmotionScores = emotions
    };

Catch any exceptions that may be thrown, and return null if one occurs.
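The closing part of the method might look like the following sketch. The exact exception handling is an assumption; the only requirement is that null is returned on failure:

```csharp
    }
    catch (Exception) {
        // If detection fails, or no face was found,
        // return null so this result is ignored.
        return null;
    }
}
```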

In the Initialization function, we need to assign our EmotionAnalysisAsync function to the FrameGrabber's AnalysisFunction. We also need to add an event handler for when a new result is available.

When a new result is obtained, we grab the EmotionScores it contains, as shown in the following code. If it is null, we do not want to do anything further:

    _frameGrabber.NewResultAvailable += OnResultAvailable;
    _frameGrabber.AnalysisFunction = EmotionAnalysisAsync;
    private void OnResultAvailable(object sender, FrameGrabber<CameraResult>.NewResultEventArgs e)
    {
        var analysisResult = e.Analysis.EmotionScores; 
        if (analysisResult == null)
            return;

In the following code, we parse the emotion scores in AnalyseEmotions, which we will look at in a bit:

        string emotion = AnalyseEmotions(analysisResult);

        Application.Current.Dispatcher.Invoke(() => {
            SystemResponse = $"You seem to be {emotion} today.";
        });
    }

Using the result from AnalyseEmotions, we display a string describing the current mood. This must be invoked on the current dispatcher, as the event is triggered on a background thread.

To get the current mood in a readable format, we parse the emotion scores in AnalyseEmotions as follows:

    private string AnalyseEmotions(EmotionScores analysisResult) {
        string emotion = string.Empty;
        var sortedEmotions = analysisResult.ToRankedList();
        string currentEmotion = sortedEmotions.First().Key;

With the EmotionScores we get, we call the ToRankedList function. This returns a list of KeyValuePair objects, each containing an emotion name and the corresponding confidence, sorted from most to least likely. As we only care about the most likely emotion, we select the first one.

With the top emotion score selected, we use a switch statement to find the correct emotion. This is returned and printed to the result, as follows:

        switch (currentEmotion) {
            case "Anger":
                emotion = "angry";
                break;
            case "Contempt":
                emotion = "contempt";
                break;
            case "Disgust":
                emotion = "disgusted";
                break;
            case "Fear":
                emotion = "scared";
                break;
            case "Happiness":
                emotion = "happy";
                break;
            case "Sadness":
                emotion = "sad";
                break;
            case "Surprise":
                emotion = "surprised";
                break;
            case "Neutral":
            default:
                emotion = "neutral";
                break;
        }

        return emotion;
    }
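As a design note, the mapping above could also be expressed as a dictionary lookup, which keeps all the emotion names in one place and makes the list easier to extend. This is a sketch of an alternative, not the approach used in this chapter:

```csharp
private static readonly Dictionary<string, string> s_emotionWords =
    new Dictionary<string, string> {
        { "Anger", "angry" }, { "Contempt", "contempt" },
        { "Disgust", "disgusted" }, { "Fear", "scared" },
        { "Happiness", "happy" }, { "Sadness", "sad" },
        { "Surprise", "surprised" }, { "Neutral", "neutral" }
    };

private string AnalyseEmotions(EmotionScores analysisResult) {
    // Take the highest-ranked emotion and map it to a readable word,
    // falling back to "neutral" for any unknown key.
    string currentEmotion = analysisResult.ToRankedList().First().Key;
    return s_emotionWords.TryGetValue(currentEmotion, out string word)
        ? word : "neutral";
}
```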

The last piece of the puzzle is to make sure that the analysis is executed at a specified interval. This is handled by the following line, which we added to the StartCamera function just before the call to StartProcessingCameraAsync:

    _frameGrabber.TriggerAnalysisOnInterval(TimeSpan.FromSeconds(5));

This will trigger an emotion analysis every five seconds.

When I have a smile on my face, the application now knows that I am happy and can provide further interaction accordingly. If we compile and run the example, we should get results like those shown in the following screenshots:

[Screenshot: the application detecting a happy mood]

As my mood changes to neutral, the application detects this as well:

[Screenshot: the application detecting a neutral mood]