AI and your emotions.
Emotion AI is a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions. It is also known as affective computing or artificial emotional intelligence.
Enter Realeyes. The company uses computer vision and machine learning to measure how people feel as they view content online through their webcams, helping big brands such as AT&T, Mars, Hershey's, and Coca-Cola gauge human emotions on desktop computers and mobile devices.
The London-based startup uses computer vision to read a person's emotional responses while they watch a video as short as six seconds, then applies predictive analytics to map those readings back to the video and provide feedback on its effectiveness. It has raised $16.2 million in funding, which it plans to use to expand its engineering and business development teams. The company also uses AI to analyze the words in written survey responses, complementing the data gathered from facial coding.
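To make the idea concrete, here is a minimal sketch of how per-frame emotion readings from a short clip might be aggregated into summary feedback. The emotion names, score values, and aggregation rule are all illustrative assumptions, not Realeyes's actual pipeline.

```python
# Illustrative sketch only: the emotion labels, scores, and aggregation
# rule are hypothetical, not Realeyes's actual method.
from statistics import mean

def summarize_clip(frame_scores):
    """Aggregate per-frame emotion scores (0.0-1.0) for a short clip.

    frame_scores: list of dicts, one per sampled frame, e.g.
    {"happiness": 0.7, "surprise": 0.2}.
    Returns the mean and peak score per emotion across the clip.
    """
    emotions = {}
    for frame in frame_scores:
        for emotion, score in frame.items():
            emotions.setdefault(emotion, []).append(score)
    return {
        emotion: {"mean": round(mean(scores), 3), "peak": max(scores)}
        for emotion, scores in emotions.items()
    }

# A six-second clip, sampled once per second (hypothetical data).
clip = [
    {"happiness": 0.2, "surprise": 0.1},
    {"happiness": 0.4, "surprise": 0.3},
    {"happiness": 0.6, "surprise": 0.2},
    {"happiness": 0.8, "surprise": 0.1},
    {"happiness": 0.7, "surprise": 0.1},
    {"happiness": 0.5, "surprise": 0.0},
]
print(summarize_clip(clip))
```

In a real system the per-frame scores would come from a facial-coding model; the point here is only that even a six-second clip yields a time series dense enough to summarize.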
Advertising company GlassView has also partnered with Realeyes. That Realeyes can draw conclusions from even short clips is notable: video ads that run too long tend to turn viewers off, so marketers are looking for shorter, more engaging formats to offset that problem.
But emotion AI raises concerns, too.
So, do we really want our emotions to be machine-readable? How can we know that this data will be used in a way that will benefit citizens? Would we be happy for our employers to profile us at work, and perhaps make judgments on our stress management and overall competence? What about insurance companies using data accrued on our bodies and emotional state?