Emotion Detection with CNNs

Until recently, interacting with a computer was not too dissimilar from interacting with, say, a power tool: we picked it up, turned it on, manually controlled it, and then put it down until the next time we required it for that specific task. But now we are seeing signs that this is changing; computers allow more natural forms of interaction and are becoming more ubiquitous, more capable, and more ingrained in our daily lives. They are becoming less like heartless, dumb tools and more like friends, able to entertain us, look out for us, and assist us with our work.

With this shift comes a need for computers to be able to understand our emotional state. For example, you don't want your social robot cracking a joke after you arrive home from work having lost your job (to an AI bot!). This is the domain of affective computing (also referred to as artificial emotional intelligence or emotional AI), a field of computer science that studies systems able to recognize, interpret, process, and simulate human emotions. The first stage is being able to recognize emotional state, which is the topic of this chapter. We will first introduce the data and model we will be using, and then walk through how we approach the problem of expression recognition on the iPhone and how to appropriately preprocess the data for inference.

By the end of this chapter, you will have achieved the following:

  • Built a simple application that will infer your mood in real time using the front camera feed
  • Gained hands-on experience using the Vision framework (a minimal sketch follows this list)
  • Developed a deeper understanding and intuition of how convolutional neural networks (CNNs) work and how they can be applied at the edge
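
To give a flavor of what's ahead, here is a minimal sketch of how the Vision framework can be asked to locate faces in a single camera frame, the step that precedes any emotion inference. The function name detectFaces and the hardcoded .leftMirrored orientation are illustrative assumptions, not the exact code we build later in the chapter:

import Vision
import CoreVideo

// A minimal sketch: find face bounding boxes in one camera frame
// using Vision, before handing the cropped faces to a classifier.
func detectFaces(in pixelBuffer: CVPixelBuffer,
                 completion: @escaping ([VNFaceObservation]) -> Void) {
    // Ask Vision to locate the rectangles of any faces in the frame.
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard error == nil,
              let faces = request.results as? [VNFaceObservation] else {
            completion([])
            return
        }
        completion(faces)
    }

    // Front-camera frames are typically mirrored and rotated; the
    // orientation used here is an assumption and may need adjusting.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .leftMirrored,
                                        options: [:])
    do {
        try handler.perform([request])
    } catch {
        completion([])
    }
}

Each VNFaceObservation carries a bounding box normalized to the image's dimensions, which we can later use to crop out the face region and feed it to a CNN for classification.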

Let's start by introducing the data and model we will be using. 
