Artificial intelligence is doing what we all thought it couldn’t do: developing and engaging with human emotions. If you’re skeptical, just wait until you hear about the recent developments in AI technology. While some of this technology does give cause for concern, you may be surprised by the many ways it could be useful in society. The question is… do the benefits outweigh the costs?
At SXSW, we learned about the “Emotion Economy.” The term refers to an economy that places value on organizations that practice and implement empathy, and that leverages technology built around it.
So, how can AI foster empathy? A discussion is emerging around how AI can contribute to an economy built on empathy. In general, when people hear “artificial intelligence,” they may picture stone-cold robots, devoid of the human ability to relate to and empathize with others. Much of the mistrust in AI stems from this perceived absence of emotion. The name itself refers to something “artificial,” which hardly conjures images of empathy.
But what if AI could understand human emotions and actually engage with them? Dr. Rana el Kaliouby went on a journey to find an answer to this question. Here are some interesting highlights on her findings:
- When you dissect how humans communicate, only 10% is based on our choice of words. The remaining 90% is split between vocal tone, voice inflection, body language and facial expressions.
- Humans have the natural ability to read other people and adjust their communication based on the listener’s nonverbal signals.
These findings help explain why, in virtual settings, it can be much harder to adjust your communication and adapt to your audience. We can all relate to that, given all the Zoom calls we’ve participated in this past year, and the fatigue and disconnect that sometimes come with them.
AI is now learning how to read people better based on things other than word choice. How does AI capture and read non-verbal communication?
The answer is found in the study of “emotion science” and facial expressions. Humans have 45 facial muscles, which combine to produce thousands of expressions that signal different mental states. Facial coding is a system created to score those facial expressions. Camera sensors can capture this data, providing a way to quantify a wide array of emotions.
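To make the idea of facial coding concrete, here is a minimal sketch of how detected muscle movements might be turned into emotion scores. The movement names, emotion labels and weights below are invented for illustration; this is not Affectiva’s actual model.

```python
# Hypothetical facial-coding sketch: map facial-movement intensities
# (0.0 to 1.0), as a camera-based detector might report them, into
# coarse emotion scores. Names and weights are illustrative only.

ACTION_WEIGHTS = {
    "brow_raise": {"surprise": 0.6},
    "brow_furrow": {"confusion": 0.5, "anger": 0.4},
    "smile": {"joy": 0.9},
    "lip_press": {"frustration": 0.5},
}

def score_emotions(observed):
    """Combine observed movement intensities into emotion scores."""
    scores = {}
    for movement, intensity in observed.items():
        for emotion, weight in ACTION_WEIGHTS.get(movement, {}).items():
            scores[emotion] = scores.get(emotion, 0.0) + weight * intensity
    return scores

frame = {"smile": 0.8, "brow_raise": 0.5}
print(score_emotions(frame))  # joy and surprise both register
```

A real system would score each video frame this way and track how the scores change over time, rather than looking at a single snapshot.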
Tests and Uses
Children and adults with autism often struggle to understand others’ facial expressions, and many tend to avoid looking at faces because of this. Affectiva pitched the idea of AI-powered Google Glass-style wearables that read the expressions of the person you’re interacting with in real time, acting almost like a live coach for people with autism. Testing showed incredible results: children wearing the glasses engaged with people more and made more face-to-face contact.
Entering the Marketing Industry
Leveraging AI technology to better understand and engage your consumers is where the advertising industry is headed.
Affectiva also tested their technology on advertising campaigns. When a user scrolled past an ad on their mobile device, they were asked for permission to turn their camera on. AI was then able to capture the user’s moment-by-moment responses and expressions. Did they find the ad funny? Sentimental? Confusing? With the data compiled from every viewer’s reactions, agencies got a better understanding of how their ad was performing and what viewers thought about it.
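A rough sketch of how those moment-by-moment readings could be rolled up into an ad-performance curve follows. The data shape (one list of per-second scores per viewer) and the emotion label are assumptions for illustration, not Affectiva’s actual reporting format.

```python
# Aggregate moment-by-moment emotion readings from many viewers into a
# per-second average, so an agency can see where an ad lands or loses
# people. Input shape is hypothetical: one score list per viewer.

def summarize(viewers, emotion):
    """Average one emotion across all viewers at each second of the ad."""
    n_seconds = min(len(v[emotion]) for v in viewers)
    return [
        sum(v[emotion][t] for v in viewers) / len(viewers)
        for t in range(n_seconds)
    ]

viewers = [
    {"joy": [0.1, 0.6, 0.9]},  # viewer 1 warms up to the ad
    {"joy": [0.3, 0.4, 0.7]},  # viewer 2 reacts more evenly
]
print(summarize(viewers, "joy"))  # rising curve: the ad builds
```

The peaks and dips in a curve like this are what tell an agency which moments of the ad are working.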
Emotion AI has huge implications for tracking consumer behavior. It’s important to remember that emotions are tied to every piece of content. When sharing content, we are trying to evoke an emotion that ultimately leads to decision making, such as purchase intent.
As marketers, it’s important to be empathetic in your message and find a way to tie it back to your brand. AI can help with this.
Emotion AI can help capture moments of confusion, frustration, curiosity or interest as it monitors the reactions of a user navigating a website. This data can be incredibly valuable to agencies that specialize in UI design and web development. AI can essentially provide insight on any user experience.
- Call Centers
AI can capture voice inflections. This can be helpful for call centers and customer service departments. AI can read your tone and therefore determine which call center agent to transfer you to. For example, a frustrated caller may be sent to a top-performing agent or management based on their tone of voice.
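The routing idea described above can be sketched as a simple rule: a frustration score derived from voice-tone analysis decides which queue a caller lands in. The score scale and thresholds below are invented for illustration.

```python
# Hypothetical call-routing rule: a frustration score from voice-tone
# analysis (0.0 calm, 1.0 very frustrated) picks an agent tier.
# Thresholds are assumptions, not a real call-center configuration.

def route_call(frustration):
    if frustration >= 0.8:
        return "manager"        # escalate the most frustrated callers
    if frustration >= 0.5:
        return "top_performer"  # frustrated callers get the best agents
    return "general_queue"      # calm callers join the normal queue

print(route_call(0.9))  # manager
print(route_call(0.6))  # top_performer
print(route_call(0.2))  # general_queue
```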
- Automotive

Applying empathy to technology via AI will change so many interfaces. Think about the automotive industry for starters. With AI built into the interface of our vehicles, it could detect the driver’s state. Is the driver tense, tired, drowsy? Are they distracted by their phone? Are they under the influence? With this information fed into the vehicle’s system, many accidents could be prevented.
Not only does this have safety implications but it also can provide customer value. The environment in the car can be personalized based on who is in the car. Imagine the personalization that can be done with the help of AI for a teenage driver vs an elderly driver.
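As a sketch of the safety side, the signals above could feed a simple alerting rule. The signal names, units and thresholds here are assumptions for illustration, not a real automotive system.

```python
# Hypothetical in-cabin monitoring sketch: combine camera-derived
# driver signals into a list of alerts. Signal names and thresholds
# are invented for demonstration.

def assess_driver(state):
    """Return alerts for a dict of detected driver signals."""
    alerts = []
    if state.get("drowsiness", 0.0) > 0.7:
        alerts.append("suggest a rest stop")
    if state.get("eyes_off_road_seconds", 0.0) > 2.0:
        alerts.append("attention warning")
    if state.get("impairment_suspected", False):
        alerts.append("restrict driving and notify")
    return alerts

print(assess_driver({"drowsiness": 0.8, "eyes_off_road_seconds": 3.0}))
```

The personalization use case would build on the same signals, swapping alerts for comfort adjustments keyed to who is in the car.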
- Home Robots
Home robots and devices like Alexa can be taken to the next level, becoming truly conversational and even persuading you to change your behavior. With emotion AI, these devices have the potential to sway your decisions based on past behavior, conversations and body language. For example, say you tell your device to play some sad music while your facial expression reads as distressed or grieving: the AI could offer some encouragement, mimicking a true human conversation, or it could suggest songs that have cheered you up in the past. Emotion AI can make these robots seem a bit more “human.”
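The decision rule in that example can be sketched directly. Everything here is hypothetical: the mood labels, the response types, and the idea of keeping a small history of songs that previously lifted the user’s mood.

```python
# Hypothetical home-robot response rule for the scenario above: if the
# request and the detected expression both look low, respond with
# encouragement or a previously effective cheerful song.

def respond(request_mood, detected_emotion, uplifting_history):
    """Pick a response given the request, the user's expression, and
    a history of songs that previously lifted this user's mood."""
    if request_mood == "sad" and detected_emotion in ("distressed", "grieving"):
        if uplifting_history:
            return ("suggest", uplifting_history[-1])  # most recent lift
        return ("encourage", None)                     # no history yet
    return ("play_requested", None)                    # honor the request

print(respond("sad", "distressed", ["upbeat_song"]))  # ('suggest', 'upbeat_song')
```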
- Healthcare

Emotion AI technology also has the ability to detect signs of stress and mental health struggles from body language. This data could be a huge driving force in the healthcare industry. Soft skills are very important in telehealth, which is largely where healthcare is headed, and this evaluative technology could apply not just to patients but to providers as well. If a doctor looks down at papers during a virtual appointment, the AI reading the interaction may rate them as less empathetic. This could ultimately lead to strict monitoring of doctors, even crossing into legal implications.
Obviously, there is huge potential for this type of technology. AI that understands emotion may be able to motivate and persuade us to change our behavior. People may be coaxed to be healthier, more active, and make safer choices. However, consideration will need to be given to unintended consequences, and Affectiva and other companies must think through the ethical dilemmas. Affectiva has made an effort to prioritize thoughtful regulation, not waiting for legislation but acting as a steward in designing this technology the right way.
Emotion AI is reimagining how human-to-human connection can be studied, replicated and applied. It’s not always comfortable to think about, but it’s likely not going away. The possibilities that have already been demonstrated have some amazing benefits that can likely be built upon. One thing is certain: effective communication and consistently interpreting emotions correctly are things not even humans have mastered. Perhaps artificial intelligence will help us understand ourselves better!
Dr. Rana el Kaliouby, Co-Founder & Chief Science Officer, Affectiva