Many generative AI models, such as ChatGPT, have proven to be remarkably capable, even outperforming humans on various benchmarks. One new AI model, however, seeks to prove its capabilities on another plane -- emotional intelligence.
Last week, the startup Hume AI announced that, in addition to raising $50 million in a Series B funding round, it was releasing the beta version of its flagship product -- the Empathic Voice Interface (EVI) -- which the company dubbed "the first AI with emotional intelligence."
The model is designed to detect human emotions by listening to a user's voice, combine that signal with what the user is saying, and craft responses that fit the user's emotional needs. For example, if EVI detects that a user is sad, it can offer words of encouragement, as well as some advice.
In addition to detecting a person's emotions, EVI can recognize when a speaker is ending their sentence, stop talking when a human interrupts it, and respond with almost no latency, mimicking the flow of a conversation with another person.
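For readers curious about the mechanics, here is a minimal sketch of that turn-taking behavior -- waiting out a pause to detect the end of a sentence, then yielding the floor the moment the user barges in. It is purely illustrative; the `voice_detector`, `speaker`, and `respond` objects are hypothetical stand-ins, not Hume AI's implementation.

```python
import time

END_OF_TURN_SILENCE = 0.7  # seconds of silence treated as "user finished"

def run_turn(voice_detector, speaker, respond):
    """One conversational turn: listen until the user pauses, then reply."""
    chunks = []

    # 1. Wait for the user to start speaking.
    chunk = voice_detector.read_chunk()          # hypothetical mic read
    while not voice_detector.is_speech(chunk):
        chunk = voice_detector.read_chunk()

    # 2. Collect audio until a sustained pause signals the end of the turn.
    last_voice = time.monotonic()
    while time.monotonic() - last_voice < END_OF_TURN_SILENCE:
        if voice_detector.is_speech(chunk):
            chunks.append(chunk)
            last_voice = time.monotonic()
        chunk = voice_detector.read_chunk()

    # 3. Speak the reply, but stop the moment the user interrupts.
    for audio in respond(chunks):                # streamed reply audio
        if voice_detector.is_speech(voice_detector.read_chunk()):
            speaker.stop()                       # interrupted: yield the floor
            break
        speaker.play(audio)
```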
According to Hume AI, EVI was built on large language models (LLMs) combined with expression measures, a pairing the company calls an empathic large language model (eLLM).
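Hume AI has not published the eLLM's internals, but the description suggests something like the following sketch: expression scores measured from the user's voice are folded into the prompt alongside the transcript, so an ordinary LLM can condition its reply on tone as well as words. The `emotion_scores` dictionary and `complete_fn` callback here are assumptions for illustration, not Hume AI's actual API.

```python
def empathic_reply(transcript: str, emotion_scores: dict[str, float], complete_fn) -> str:
    """Condition an LLM reply on both what was said and how it sounded."""
    # Keep only the strongest detected expressions, e.g.
    # {"sadness": 0.81, "distress": 0.64, "pain": 0.55}
    top = sorted(emotion_scores.items(), key=lambda kv: kv[1], reverse=True)[:3]
    tone = ", ".join(f"{name} ({score:.2f})" for name, score in top)

    prompt = (
        "You are a voice assistant. The user's vocal tone suggests: "
        f"{tone}.\n"
        f'User said: "{transcript}"\n'
        "Reply in a way that fits both the words and the tone."
    )
    return complete_fn(prompt)  # any text-completion backend
```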
You can demo the technology on the Hume AI website, where a preview of EVI is available. I decided to give it a try and was pleasantly surprised.
Getting started is easy. The only requirement: You must give the site access to your microphone. Then you can start chatting, and you will get immediate feedback on the emotions it detects in your voice.
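If you wanted to reproduce that kind of live feedback loop yourself, it might look something like the sketch below, which captures microphone audio with the (real) sounddevice library and prints the strongest detected expression as you speak. The `score_emotions` function is a dummy placeholder for a real expression model or API; none of this is Hume AI's code.

```python
import sounddevice as sd

def score_emotions(audio) -> dict[str, float]:
    """Placeholder: swap in a real expression model or API call."""
    loudness = min(1.0, float((audio ** 2).mean()) ** 0.5 * 10)  # crude proxy
    return {"interest": loudness, "calmness": 1.0 - loudness}

def on_audio(indata, frames, time_info, status):
    scores = score_emotions(indata.copy())
    top = max(scores, key=scores.get)
    print(f"detected: {top} ({scores[top]:.2f})")

# One channel at 16 kHz is typical for speech models (an assumption here).
with sd.InputStream(samplerate=16000, channels=1, callback=on_audio):
    sd.sleep(10_000)  # listen for ten seconds
```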
For the first example, I spoke to it normally, as I would on a Zoom call with a colleague, opening with, "Hi, Hume, how are you?"
I have a bubbly, chirpy personality, and I was happy to see that EVI thought so, too; it detected my expressions as surprise, amusement, and interest.
In addition to sensing my tone, EVI kept the conversation going, asking me more about my day. I tested it again, this time channeling my inner theater kid to do a fake crying voice, and the results differed significantly.
When I said, in my fake crying voice, "How are you? I am having such a hard day," EVI detected sadness, pain, and distress. It responded with encouraging words: "Oh no, sounds like you are going through it today. I am here for you."
Beyond the website preview, EVI is not yet publicly available; however, the company says it will be generally available later this month. If you want to be notified when it launches, you can fill out this form.
Using the chatbot reminded me of my experience testing ElliQ, an assistive social robot meant to provide companionship to seniors who lack regular human interaction at home. Similarly, if you told that robot you were sad or lonely, it would offer encouragement or advice.
I can see eLLMs such as EVI being incorporated into more robots and AI assistants to serve the same purpose as ElliQ, helping humans feel less lonely and more understood. The technology could also help those tools better determine how to assist users and accomplish tasks.