The Amazing Potential of Augmented Reality Innovations from Facebook Reality Labs

 


AR is Everywhere

Augmented Reality (AR) is already being used in so many ways that you've most likely seen and used it without even realizing it. For example, Target, along with other retailers, has an AR feature in its app that lets you see what a product would look like in your bedroom or on your back patio. Or maybe you've used the famous story filters on social media apps like Snapchat and Instagram. The Pokémon Go game is another common example.

 

The Future of AR

The concept of AR has far surpassed filters and product placement, though. Facebook Reality Labs (FRL) is working on developing natural, intuitive ways to interact using AR glasses. The goal is for AR glasses to see the world as we see it and let us interact in that world using signals from our brain and contextually aware Artificial Intelligence (AI). FRL states that "Two of the most critical elements are contextually-aware AI that understands your commands and actions as well as the context and environment around you, and technology to let you communicate with the system effortlessly — an approach we call ultra-low-friction input."

The first step down this path is collecting that "ultra-low-friction input" via a wristband. The wristband picks up nerve signals traveling from the brain, pairs them with contextualized AI, and thereby lets you manipulate your environment as you see it through AR glasses. It works through electromyography (EMG), which "uses sensors to translate electrical motor nerve signals that travel through the wrist, to the hand, into digital commands that you can use to control the functions of a device."
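
To make the idea concrete, here is a minimal sketch of what such an EMG-to-command pipeline could look like. The sampling rate, threshold, gesture name, and signal processing are illustrative assumptions, not FRL's actual design:

```python
import numpy as np

SAMPLE_RATE_HZ = 1000   # assumed sensor sampling rate
CLICK_THRESHOLD = 0.6   # assumed activation level for a "pinch"

def envelope(raw_emg: np.ndarray, window_ms: int = 50) -> np.ndarray:
    """Rectify the signal and smooth it with a moving average,
    a common first step when estimating muscle activation."""
    window = int(SAMPLE_RATE_HZ * window_ms / 1000)
    kernel = np.ones(window) / window
    return np.convolve(np.abs(raw_emg), kernel, mode="same")

def to_command(raw_emg: np.ndarray) -> str:
    """Map a short EMG window to a digital command via a simple threshold."""
    return "click" if envelope(raw_emg).max() > CLICK_THRESHOLD else "idle"

# Simulate 200 ms of sensor noise with a brief burst of muscle activity
# (the "millimeter finger motion" the article describes).
rng = np.random.default_rng(0)
emg = rng.normal(0.0, 0.05, 200)
emg[80:120] += rng.normal(0.0, 2.0, 40)
print(to_command(emg))  # expected: "click"
```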

 

How does this translate to our daily life?

Yes, this all sounds very technical and advanced, so how does it translate to our daily lives? Here is one example: imagine you are walking in your favorite park when the clouds start to roll in and you wonder what the forecast is. With the motion of a finger you can "press" an invisible button in the sky to display a screen showing the weather forecast; with a swipe you can see the radar; and with a wave of your hand it all goes away.

Another example is in the workplace. Imagine HR directors sending employees papers to sign that appear as augmented-reality documents on their desks, signed with the movement of a finger. The wrist signals this technology reads are so clear that EMG can register a finger motion of just a millimeter.

 

What does this mean for advertisers?

There are big things coming in the AR world. Not only may our workplaces change, but the way we market products and services could take on a new AR dimension. Will ad placement find its way into AR technology? Will there be new ground to explore in web development? Responsive web design, for example, may face new challenges if people can browse the web through AR glasses.

It is exciting to think of all the possibilities. One thing is for sure: you will need a strategy and plan in place as new technology is introduced. You want a marketing team that is thinking ahead and ready to explore new ways to advertise, engage with consumers, and produce results. Gray Digital Group is passionate about purposeful digital marketing. We prioritize strategy and innovative ideas. Let us know how we can help you in this exciting new world!

 


Sources:

Facebook Reality Labs

The Emotional Journey of Artificial Intelligence & Empathy


 
Artificial intelligence is doing what we all thought it couldn't: developing and engaging with human emotions. If you're skeptical, just wait until you hear about the recent developments in AI technology. While some of this technology gives cause for concern, you may be surprised by the many ways it could be useful in society. The question is: do the benefits outweigh the costs?
 


Emotion Economy

 

At SXSW, we learned about the "Emotion Economy." The term refers to an economy that places value on organizations that practice, implement, and value empathy, and that leverages technology built around empathy.

So, how can AI foster empathy? A discussion is emerging about how AI can contribute to an economy built on empathy. In general, when people hear "Artificial Intelligence" they may picture stone-cold robots, void of the human capacity to relate and empathize. Much of the mistrust in AI stems from this perceived absence of emotion; the name itself refers to something "artificial," which hardly conjures images of feeling.

But what if AI could understand human emotions and actually engage with them? Dr. Rana el Kaliouby went on a journey to answer this question. Here are some interesting highlights from her findings:

 

  • When you dissect how humans communicate, only 10% is based on the words we choose. The remaining 90% is split between vocals, voice inflections, body language, and facial expressions.

 

  • Humans have the natural ability to read other people and adjust their communication based on the listener's nonverbal signals.

 

These findings help explain why, in virtual settings, it can be much harder to adjust your communication and adapt to your audience. We can all relate to that given all the Zoom calls we've participated in this past year, and the fatigue and disconnect that sometimes come with them.

 

AI is now learning how to read people based on more than word choice. How does AI capture and read nonverbal communication?

 

The answer is found in the study of "emotion science" and facial expressions. There are 45 facial muscles that combine to create thousands of expressions, each conveying different mental states. Facial coding is a system created to score those expressions, and camera sensors can capture this data, making it possible to quantify an array of emotions.
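
As a toy illustration of facial coding, the sketch below scores one frame's detected muscle movements ("action units") against simple emotion templates. The action-unit names, weights, and emotion labels are made-up stand-ins, not Affectiva's actual model:

```python
# Toy facial-coding scorer: combine detected facial muscle movements
# ("action units", activations in 0..1 from a camera-based detector)
# into per-emotion scores. All names and weights are illustrative.
ACTION_UNIT_WEIGHTS = {
    "joy":       {"cheek_raiser": 0.5, "lip_corner_puller": 0.5},
    "surprise":  {"brow_raiser": 0.6, "jaw_drop": 0.4},
    "confusion": {"brow_lowerer": 0.7, "lip_press": 0.3},
}

def score_frame(detected_aus: dict) -> dict:
    """Weighted sum of action-unit activations for each emotion template."""
    return {
        emotion: sum(w * detected_aus.get(au, 0.0) for au, w in weights.items())
        for emotion, weights in ACTION_UNIT_WEIGHTS.items()
    }

# One frame in which the viewer smiles broadly.
scores = score_frame({"cheek_raiser": 0.9, "lip_corner_puller": 0.8})
print(max(scores, key=scores.get))  # expected: "joy"
```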


Tests and Uses

 

Children and adults with autism often struggle to understand others' facial expressions, and many tend to avoid looking at faces because of it. Affectiva pitched the idea of AI-powered Google glasses that read the expressions of the person you're interacting with in real time, acting almost like a live coach for people with autism. Testing showed incredible results: children wearing the glasses engaged with people more and made face-to-face contact.

Entering the Marketing Industry

 

Leveraging AI technology to better understand and engage your consumers is where the advertising industry is headed. 

Affectiva also tested its technology on advertising campaigns. When users scrolled past an ad on their mobile devices, they were asked for permission to turn their cameras on. The AI then captured the moment-by-moment responses and expressions of each user. Did they find the ad funny? Sentimental? Confusing? With reaction data compiled across every viewer, agencies could get a much better understanding of how an ad was performing and what viewers thought about it.
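
The sketch below shows one way such opt-in reaction data could be rolled up into a per-moment summary. The record format, labels, and numbers are invented for illustration; a real emotion-AI platform would expose far richer data:

```python
from collections import defaultdict
from statistics import mean

# Opt-in viewer reactions as (viewer_id, second_into_ad, emotion, intensity)
# tuples, roughly how a camera-based classifier might emit them.
reactions = [
    ("v1", 3, "joy", 0.8), ("v2", 3, "joy", 0.6),
    ("v1", 9, "confusion", 0.7), ("v3", 9, "confusion", 0.5),
]

by_moment = defaultdict(list)
for _viewer, second, emotion, intensity in reactions:
    by_moment[(second, emotion)].append(intensity)

# Average intensity per (second, emotion): which moments land, which confuse.
for (second, emotion), values in sorted(by_moment.items()):
    print(f"t={second:>2}s {emotion:<10} avg={mean(values):.2f} n={len(values)}")
```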

Emotion AI has huge implications for tracking consumer behavior. It's important to remember that emotions are tied to every piece of content: when we share content, we are trying to evoke an emotion that ultimately leads to a decision, such as purchase intent.

As marketers, it’s important to be empathetic in your message and find a way to tie it back to your brand. AI can help with this. 

Emotion AI can capture moments of confusion, frustration, curiosity, or interest as it monitors the reactions of a user navigating a website. That data can be incredibly valuable to agencies that specialize in UI design and web development; in essence, AI can provide insight into any user experience.

Other Applications

  • Call Centers

AI can capture voice inflections, which is helpful for call centers and customer service departments. By reading a caller's tone, AI can determine which agent to transfer them to; a frustrated caller, for example, might be routed to a top-performing agent or to management based on their tone of voice.
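
A hedged sketch of that routing rule, with tone labels and queue names assumed for illustration rather than drawn from any vendor's real API:

```python
def route_call(detected_tone: str) -> str:
    """Pick a queue from the emotion label a voice model might return."""
    if detected_tone in ("frustrated", "angry"):
        return "top_performer_queue"      # de-escalate with the best agents
    if detected_tone == "confused":
        return "support_specialist_queue"
    return "general_queue"

print(route_call("frustrated"))  # expected: "top_performer_queue"
```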

  • Automotive

Applying empathy to technology via AI will change many interfaces. Take the automotive industry: with AI built into a vehicle's interface, it could detect the driver's state. Is the driver tense, tired, or drowsy? Distracted by a phone? Under the influence? With this information fed into the vehicle's systems, many accidents could be prevented.

Not only does this have safety implications, but it can also provide customer value. The in-car environment could be personalized based on who is in the car; imagine the personalization AI could offer a teenage driver versus an elderly one. A speculative sketch of both ideas follows.
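
Every state label, response, and profile below is an assumption for illustration, not any manufacturer's actual system:

```python
# Speculative sketch: map a detected driver state to a vehicle response,
# plus a simple per-driver personalization profile.
SAFETY_RESPONSES = {
    "drowsy":     "suggest a rest stop; increase cabin ventilation",
    "distracted": "sound an attention alert",
    "impaired":   "restrict driving; offer to call for help",
    "tense":      "dim the dashboard; play calming audio",
}

PROFILES = {
    "teen":    {"max_volume": 7, "speed_alerts": True},
    "elderly": {"font_scale": 1.5, "voice_prompts": True},
}

def respond(driver_state: str, driver_profile: str):
    """Return (safety action, cabin personalization) for this driver."""
    return (SAFETY_RESPONSES.get(driver_state, "no action"),
            PROFILES.get(driver_profile, {}))

print(respond("drowsy", "teen"))
```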

  • Home Robots

Home robots and devices like Alexa can be taken to the next level, becoming truly conversational and even persuading you to change your behavior. With emotion AI, these devices have the potential to sway your decisions based on past behavior, conversations, and body language. For example, if you ask your device to play some sad music and your facial expression reads as distressed or grieving, the AI could offer some encouragement, mimicking a true human conversation, or suggest songs that have cheered you up in the past. Emotion AI can make these robots seem a bit more "human."

  • Healthcare

Emotion AI technology also has the ability to detect signs of stress and mental health issues from body language, and that data could be a huge driving force in healthcare. In telehealth, which is largely where healthcare is headed, soft skills are very important, and this evaluative technology could apply not just to patients but to providers as well. If a doctor looks down at papers during a virtual appointment, the AI reading the patient's reactions may register the doctor as less empathetic. That could ultimately lead to strict monitoring of doctors, even crossing into legal territory.


Summary

 

Obviously, there is huge potential for this type of technology. AI that understands emotion may be able to motivate us and persuade us to change our behavior; people may be coaxed to be healthier, more active, and to make safer choices. However, consideration will need to be given to unintended consequences, and Affectiva and other companies must think about the ethical dilemmas. They have made an effort to prioritize thoughtful regulation, not waiting for legislation but instead acting as stewards by designing the technology the right way from the start.

Emotion AI is reimagining how human-to-human connection can be studied, replicated, and applied. It's not always comfortable to think about, but it's likely not going away, and the possibilities already demonstrated carry some amazing benefits that can be built upon. One thing is certain: effective communication and consistently interpreting emotions correctly are things not even humans have mastered. Perhaps Artificial Intelligence will help us understand ourselves better!

 


Sources:

Photo by cottonbro from Pexels

Dr. Rana el Kaliouby, Co-Founder & Chief Science Officer, Affectiva