Understanding Emotional AI
Will the smart car of the future understand you better than your spouse? Will it be more sympathetic? It might, considering the rate at which emotional artificial intelligence technology is expanding.
As AI-based applications become more mainstream and tech companies race to develop advanced machine-learning solutions, it’s no wonder that technology is becoming more human.
Some will find the thought of machines responding to their emotions a bit creepy. (I personally have a fear of Alexa channeling Heath Ledger's Joker from The Dark Knight and asking me, "Why so serious?") However, the advancements this will bring to medicine, science, and even safety should far outweigh the fears. Here's what you need to know.
How emotional AI works
Emotional artificial intelligence, or emotion detection technology, is how computers use vision and voice analysis to interpret human emotions. They do this through biometrics, such as facial scanning and coding.
Consider how much of the day we spend staring into our phones, taking selfies, live-streaming, and video chatting with everyone, and you can start to see how an application could read our facial expressions through scanning and mapping technology. Some of us already use facial recognition to unlock our mobile devices.
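To make that concrete, here's a minimal sketch of the facial half of the pipeline, assuming Python with the OpenCV library. It grabs one frame from the front-facing camera and finds a face with OpenCV's bundled Haar cascade; the `classify_emotion` function is a hypothetical placeholder, since real products feed the cropped face to proprietary trained expression models.

```python
import cv2

def classify_emotion(face_pixels):
    """Hypothetical stand-in for a trained expression classifier.
    Real systems feed the cropped face to a deep-learning model."""
    return "neutral"

# Load OpenCV's bundled frontal-face Haar cascade.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

camera = cv2.VideoCapture(0)  # the front-facing camera
ok, frame = camera.read()
camera.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face = gray[y:y + h, x:x + w]
        print("Detected face, estimated emotion:", classify_emotion(face))
```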
Voice recognition technology is also widespread. With voice-enabled assistants like Siri and Alexa, all we have to do is speak out loud and the devices respond.
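Here's a similarly hedged sketch of the voice side, assuming the librosa audio library. It pulls out two simple acoustic cues (loudness and pitch) that researchers often associate with emotional arousal; the filename and threshold values are made up for illustration.

```python
import librosa
import numpy as np

# Load a speech recording (replace "speech_sample.wav" with your own file).
audio, sr = librosa.load("speech_sample.wav", sr=16000)

# Two simple acoustic cues often associated with emotional arousal:
# loudness (RMS energy) and pitch (fundamental frequency).
rms = librosa.feature.rms(y=audio)[0]
f0 = librosa.yin(audio, fmin=80, fmax=400, sr=sr)

print(f"mean loudness: {np.mean(rms):.4f}")
print(f"mean pitch:    {np.nanmean(f0):.1f} Hz")

# Illustrative (made-up) rule: louder, higher-pitched speech suggests
# higher arousal. Real systems learn these mappings from labeled data.
arousal = "high" if np.mean(rms) > 0.05 and np.nanmean(f0) > 200 else "low"
print("estimated arousal:", arousal)
```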
What about the big data trend? A computer can analyze your online activity dating back years and create a fairly accurate personality profile.
When you combine all of this (front-facing cameras, voice recognition technology, and big data), you start to get an idea of how emotional AI can work.
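One common way to combine them (though certainly not the only one) is late fusion: score each modality separately, then take a weighted average. The emotion labels, scores, and weights below are invented purely for illustration.

```python
# Late-fusion sketch: combine per-modality emotion scores.
# All scores and weights here are invented for illustration.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(face_scores, voice_scores, text_scores,
         weights=(0.5, 0.3, 0.2)):
    """Weighted average of per-modality probability scores."""
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (weights[0] * face_scores[emotion]
                          + weights[1] * voice_scores[emotion]
                          + weights[2] * text_scores[emotion])
    return max(fused, key=fused.get)

# Example: the camera leans neutral, but voice and text lean sad.
face = {"happy": 0.1, "sad": 0.3, "angry": 0.1, "neutral": 0.5}
voice = {"happy": 0.1, "sad": 0.5, "angry": 0.2, "neutral": 0.2}
text = {"happy": 0.1, "sad": 0.6, "angry": 0.1, "neutral": 0.2}

print(fuse(face, voice, text))  # -> "sad"
```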
Companies currently using emotional AI
There are several companies already working with emotional artificial intelligence. Here are a few that are making a lot of headway and changing the way we view AI.
Affectiva analyzes facial responses in video using deep learning algorithms. Thousands of brands already use Affectiva to test their advertising, asking viewers to turn on their front-facing cameras while they watch an ad.
Beyond Verbal analyzes the human voice to determine personality, mood, and more. It uses this information to create decision-making profiles that advertisers and businesses can use for targeting.
Braiq (a play on brake) is a technology for self-driving cars that personalizes the rider experience. Its goal is to have automated vehicles operate according to the rider's preferences. Biometric sensors inside the vehicle collect data on the passenger's responses, and the system analyzes that data to provide the best possible ride.
Nuralogix has developed Transdermal Optical Imaging (TOI™), which uses a video camera to analyze blood flow in a person's face. This helps determine the person's stress level, heart rate, blood pressure, and more. The technology is meant to help people manage stress and hypertension, which can lead to other serious health issues.
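Nuralogix's TOI is proprietary, but the general idea behind camera-based vital signs, known in the research literature as remote photoplethysmography, can be sketched: the pulse causes tiny color changes in facial skin, so averaging the green channel of face video over time and finding the dominant frequency gives a rough heart-rate estimate. The sketch below is a simplified illustration of that idea, not Nuralogix's actual method.

```python
import numpy as np

def estimate_heart_rate(face_frames, fps=30.0):
    """Rough remote-photoplethysmography sketch.

    face_frames: array of shape (num_frames, height, width, 3),
    RGB frames cropped to the face. Blood volume changes show up
    as tiny variations in the green channel.
    """
    # Average green-channel intensity per frame -> a 1-D pulse signal.
    signal = face_frames[:, :, :, 1].mean(axis=(1, 2))
    signal = signal - signal.mean()  # remove the DC offset

    # Find the dominant frequency within a plausible heart-rate band.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)  # 42-240 beats per minute
    dominant = freqs[band][np.argmax(spectrum[band])]
    return dominant * 60.0  # Hz -> beats per minute

# Synthetic demo: 10 seconds of frames pulsing at 1.2 Hz (72 bpm).
t = np.arange(300) / 30.0
frames = np.ones((300, 8, 8, 3)) * 128
frames[:, :, :, 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(f"estimated heart rate: {estimate_heart_rate(frames):.0f} bpm")
```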
Receptiviti is a SaaS product that uses custom AI to help human resources professionals give their employees a better work experience. The company combines AI, natural language processing, machine learning, and proprietary language psychology science to collect information and develop behavioral profiles.
All of these companies are already working with clients all over the world and using emotional AI to enhance their business processes. The concept of emotional AI is no longer science fiction. It’s already here. And it’s here to stay.
The future of emotional AI
If emotional AI is already being used in such extraordinary ways as described above, what does the future hold?
Well, to use a tired (but true) cliché, the sky's the limit when it comes to what this type of artificial intelligence can do. Imagine the cars of the future with built-in sensors that can determine whether a driver is too emotionally upset, sleepy, or drunk to drive. Our Fitbits and smartwatches will be able to sense a heart attack before it happens and call emergency services for us.
You may even come home from a hard day at work, grab a beer out of your smart fridge, slam the door, and have your refrigerator ask if you had a bad day. Then it will proceed to tell you what you could eat or do to make yourself feel better.
So, yes. The smart devices of the future may continue to become more human, and you may eventually feel that nobody understands you quite like Alexa. But just remember, until Alexa can actually smile at you and cause you to get butterflies in your stomach, she’ll never take the place of your loved one. (And I’m sure that as I’m writing this conclusion, someone, somewhere, is working on technology to make Alexa smile.)