“An MIT Media Lab team is teaching computers how to read human facial expressions, a substantial computing challenge that humans take for granted. Help advance this research by watching the videos below in front of a webcam; you’ll then see an analysis of your own smile and will be able to compare it to others. This new app is the world’s first cloud-based technology for reading facial expressions. It began as an effort to help people on the autism spectrum who have difficulty reading emotion, and it is now being commercialized to help businesses understand their customers.”
“Marketers recognize that emotion drives brand loyalty and purchase decisions. Yet traditional ways of measuring emotional response – surveys and focus groups – create a gap by requiring viewers to think about and articulate how they feel. Neuroscience provides insight into how the mind works, but it typically requires bulky equipment and lab settings that limit and influence the experience. MIT spinoff Affectiva has some of the best and brightest emotion experts behind the science of the Affdex platform, providing the most accurate measurement available today. This ongoing investment in research and development is focused not just on measuring, but also on predicting… which ads will really work to drive sales and build brands.
Affdex reads emotional states such as surprise, dislike and attention from facial expressions captured by a webcam. It employs advanced computer vision and machine learning techniques to recognize spontaneous facial expressions and automate their analysis, and it applies scientific methods to interpret viewers’ emotional responses quickly and at scale.”
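The Affdex pipeline itself is proprietary, but the general approach described above (extract facial features from each video frame, then feed them to a trained classifier that outputs an emotion label) can be sketched with scikit-learn. Everything below is illustrative: the feature names, emotion labels, prototype values, and synthetic training data are assumptions for demonstration, not Affdex internals.

```python
# Illustrative sketch only: a classifier mapping hypothetical facial-geometry
# features to emotion labels, loosely analogous to the final stage of an
# expression-reading pipeline (after face detection and feature extraction).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-frame features: [mouth_open, brow_raise, lip_corner_pull].
# Each emotion gets a cluster of noisy samples around an invented prototype.
prototypes = {
    "surprise":  [0.9, 0.9, 0.1],   # open mouth, raised brows
    "dislike":   [0.1, 0.2, 0.0],   # closed mouth, slight brow action
    "attention": [0.2, 0.6, 0.3],
    "smile":     [0.3, 0.1, 0.9],   # strong lip-corner pull
}
X, y = [], []
for label, proto in prototypes.items():
    X.append(rng.normal(loc=proto, scale=0.05, size=(50, 3)))
    y += [label] * 50
X = np.vstack(X)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Classify one new "frame" whose features sit near the smile prototype.
frame = [[0.32, 0.12, 0.88]]
print(clf.predict(frame)[0])  # prints "smile"
```

A real system would replace the synthetic features with measurements derived from face tracking on webcam frames, and would typically predict graded intensities per expression rather than a single hard label.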
How does Affectiva measure the emotional connection people have with advertising and brands? Try our Affdex demo to find out!