New Affectiva cloud API helps machines understand emotions in human speech

From TechCrunch - September 13, 2017

Affectiva, the startup that spun out of the MIT Media Lab several years ago with tools designed to understand facial emotions, announced a new cloud API today that can detect a range of emotion in human speech.

When we speak, our voices offer subtle and not-so-subtle cues about our emotions. Whether our voices are tight or loud or soft can give valuable clues about our feelings. Humans can sometimes (although not always) detect those emotions, but traditionally computers have not been very good at it.

Alexa isn't terribly funny because the technology doesn't understand humor or tone, and can't tell when you're joking versus asking a genuine question. Using Affectiva's new tech, voice assistants, bots and other devices that operate using artificial intelligence might soon be able to hear and understand our emotions, and be able to derive more meaning from our requests, company CEO and co-founder Dr. Rana el Kaliouby told TechCrunch.
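The underlying idea is that prosodic cues such as loudness and pitch variation carry emotional signal an assistant can branch on. As a purely illustrative sketch (not Affectiva's actual API or models; the function name, features, and thresholds below are invented), a toy rule-based mapper might look like this:

```python
# Illustrative sketch only: a toy rule-based mapper from two simple
# prosody features to coarse emotion labels. Real systems like
# Affectiva's use trained models on far richer acoustic features;
# every name and threshold here is hypothetical.

def classify_tone(loudness_db: float, pitch_variance: float) -> str:
    """Map hypothetical prosody features to a coarse emotion label."""
    if loudness_db > 75 and pitch_variance > 40:
        return "excited"   # loud with highly varied pitch
    if loudness_db > 75:
        return "angry"     # loud but flat delivery
    if pitch_variance > 40:
        return "amused"    # quiet but animated, e.g. joking
    return "neutral"

# An assistant could branch on the label, e.g. treating an "amused"
# utterance as banter rather than a literal query.
print(classify_tone(80, 50))
print(classify_tone(60, 10))
```

The point of the sketch is the control flow, not the feature set: once a tone label exists alongside the transcript, downstream dialogue logic can respond to how something was said, not just what was said.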

"Amazon [and other companies] knows if it wants it to be persuasive to try a product or route, it needs to have a relationship [with you]. To have a relationship, it needs to understand your emotional state, which is what humans do, have a real-time understanding of an emotional state. Are you annoyed, frustrated, confused?" el Kaliouby explained.

Amazon isn't alone. Car makers are interested in knowing your emotional state behind the wheel, and that of your passengers. These factors could have an impact on your safety in the car. Any company could use a better understanding of customers calling into their call centers or dealing with a customer service bot (they would find me often annoyed).

