Tech News

An AI model with emotional intelligence? I cried and Hume’s EVI told me he cared

Sabrina Ortiz/ZDNET

Many generative AI models, such as ChatGPT, have proven to be very intelligent, even outperforming humans on various benchmarks. However, this AI model seeks to prove its capabilities on another level: emotional intelligence.

Last week, startup Hume AI announced that in addition to raising $50 million in a Series B funding round, it was launching the beta version of its flagship product – the Empathic Voice Interface (EVI) – which the company dubbed “the first AI with emotional intelligence.”

The model was created to detect human emotions by listening to voices – and combining that knowledge with what users say – to craft responses tailored to the user’s emotional needs. As shown in the demo below, if EVI detects that a user is sad, it can offer them words of encouragement, as well as some advice.

Also: ChatGPT no longer requires a login, but you might still want one. Here’s why

In addition to detecting a person’s emotions, EVI can recognize when a person finishes their sentence, stop speaking when the human interrupts, and generate conversations with almost no latency, mimicking the interaction that would take place with a human.

According to Hume AI, EVI was built on a combination of large language models (LLMs) and expression measures, which the company calls an empathic large language model (eLLM).

You can try the technology on the Hume AI website, where a preview demo of EVI is available. I decided to test it and was pleasantly surprised.

Getting started is easy. The only requirement is giving the site access to your microphone. Then you can start chatting and get immediate feedback on the emotions EVI detects in your voice.

For the first example, I just talked to it normally, as I would on a Zoom call with a colleague. As my first prompt, I said, “Hello, Hume, how are you?”

I have a bubbly, happy personality, and I was glad to see that EVI thought so too; it identified my expressions as surprise, amusement, and interest.

Hume AI EVI Demo

Sabrina Ortiz/ZDNET

In addition to sensing my tone, EVI continued the conversation, asking me more about my day. I tested it again, this time channeling my inner theater kid to put on a fake crying voice, and the results differed significantly.

In response to my fake crying voice saying, “How are you? I’m having such a hard day,” EVI detected sadness, pain, and distress in my voice. It also responded with encouraging words, saying, “Oh no, it sounds like you’re having a hard day. I’m here for you.”

Hume AI EVI Demo

Screenshot by Sabrina Ortiz/ZDNET

Currently, EVI is not publicly available; however, the company says EVI will be generally available later this month. If you would like to be notified when it launches, you can fill out this form.

Also: Analysts say the biggest challenge is the rise in cybersecurity attacks.

Using the chatbot reminded me of my experience testing ElliQ, a social assistive robot intended to provide companionship to lonely seniors who lack human interaction at home. Similarly, if you told that robot you were sad or lonely, it would offer encouragement or advice.

I can see eLLMs like EVI being integrated into more robots and AI assistants to achieve the same goal as ElliQ: helping humans feel less alone and more understood. This capability could also help these tools better determine how to assist users and complete tasks.


