Affectiva Redefining Human-Machine Partnership through Emotion AI #emtechdigital

Affdex for Market Research

With facial coding and emotion analytics, advertisers can measure unfiltered, unbiased consumer emotional responses to digital content. All your panelists need is an internet connection and a standard webcam.

SOURCES: Affectiva; live reporting by Brian Wang, Nextbigfuture.com, at EmTech Digital 2019.

Taniya Mishra is the Director of AI Research and Lead Speech Scientist at Affectiva.

Affectiva is used by leading market research firms like Millward Brown, LRW, Added Value and Unruly, and brands like Mars, Kellogg’s and CBS. The technology measures consumer emotional responses to digital content, such as ads and TV programming. Affectiva is used by 25% of the Fortune Global 500 and 1,400+ brands.

Taniya discussed the need to build trust with emotional AI and to have smart regulation of it.

There are applications for emotional AI in education and medicine.

Affectiva Automotive AI

Affectiva Automotive AI is the first multi-modal in-cabin sensing AI that identifies, from face and voice, complex and nuanced emotional and cognitive states of drivers and passengers. This helps improve road safety and the transportation experience.

Using in-cabin cameras and microphones, Affectiva Automotive AI analyzes facial and vocal expressions to identify the expressions, emotions and reactions of the people in a vehicle. Affectiva does not send any facial or vocal data to the cloud; it is all processed locally. The algorithms are built using deep learning, computer vision, speech science, and massive amounts of real-world data collected from people driving or riding in cars.
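The local-only processing described above can be sketched as a simple loop in which raw frames never leave the device and only derived metric scores are retained. This is an illustrative Python sketch under stated assumptions, not Affectiva's actual SDK: `FrameMetrics`, `analyze_frame`, and `process_cabin_stream` are hypothetical names, and the stub model stands in for a real on-device deep-learning network.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class FrameMetrics:
    """Derived, non-identifying scores kept on-device; raw pixels are discarded."""
    joy: float
    anger: float
    eye_closure: float


def analyze_frame(frame: bytes) -> FrameMetrics:
    # Stand-in for an on-device deep-learning model. A real system would run
    # a trained network here; the raw frame never leaves this function and
    # is never uploaded anywhere.
    score = (sum(frame) % 100) / 100.0
    return FrameMetrics(joy=score, anger=1.0 - score, eye_closure=0.0)


def process_cabin_stream(frames: Iterable[bytes]) -> List[FrameMetrics]:
    # Only the derived metrics accumulate; frames are processed and dropped.
    return [analyze_frame(f) for f in frames]
```

The design choice this illustrates is the privacy boundary: downstream consumers see only the small `FrameMetrics` records, never the camera or microphone data itself.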

Affectiva Automotive AI includes a subset of facial metrics from our Emotion SDK that are relevant for automotive use cases. These metrics are developed to work in in-cabin environments, supporting different camera positions and head angles. We have also added new vocal metrics.

Metrics in Affectiva Automotive AI

  • Tracking of all in-cabin occupants
  • Three facial emotions: Joy, Anger, and Surprise
  • Facial based valence: overall positivity or negativity
  • Four facial markers for drowsiness: Eye Closure, Yawning, Blink, and Blink Rate
  • Head pose estimation: Head Pitch, Head Yaw, Head Roll
  • Eight facial expressions: Smile, Eye Widen, Brow Raise, Brow Furrow, Cheek Raise, Mouth Open, Upper Lip Raise, and Nose Wrinkle
  • Two vocal emotions: Anger and Laughter
  • Vocal expression of arousal: the degree of alertness, excitement, or engagement
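To illustrate how the drowsiness markers listed above might be combined downstream, here is a hedged sketch: a simple rule that flags a drowsy driver when sustained eye closure, or rapid blinking together with yawning, is detected. The `is_drowsy` function and its thresholds are hypothetical illustrations, not Affectiva's published logic.

```python
def is_drowsy(eye_closure: float, yawning: float, blink_rate: float,
              closure_threshold: float = 0.7,
              blink_rate_threshold: float = 30.0) -> bool:
    """Hypothetical rule combining drowsiness markers.

    eye_closure and yawning are confidence scores in [0, 1];
    blink_rate is blinks per minute. All thresholds are illustrative.
    """
    sustained_closure = eye_closure > closure_threshold
    rapid_blinking = blink_rate > blink_rate_threshold
    return sustained_closure or (rapid_blinking and yawning > 0.5)
```

For example, a driver with eyes mostly closed (`eye_closure=0.9`) would be flagged, while occasional blinking with eyes open would not.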

