
Hume AI, a startup and research lab, has raised $50 million in funding to develop an empathic voice interface (EVI) that can detect more than 24 emotions in a person's voice. EVI combines transcription, language modeling, text-to-speech (TTS), expression understanding, and end-of-turn detection. The company aims to integrate EVI into a range of applications, offering a conversational AI with emotional intelligence.
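
To make those pieces concrete, here is a minimal sketch of what a client conversation loop for an EVI-style voice interface could look like: stream audio up a websocket, then read back the transcript, expression scores, the assistant's reply, and an end-of-turn signal. The endpoint URL, query-string API key, and JSON field names are illustrative assumptions, not Hume's documented schema.

```python
# Minimal sketch of a conversation loop for an EVI-style voice interface.
# The endpoint URL, query-string auth, and JSON message fields are illustrative
# assumptions, not Hume's documented schema.
import asyncio
import base64
import json

import websockets  # pip install websockets

EVI_URL = "wss://api.example-evi.ai/v0/chat?api_key=YOUR_API_KEY"  # placeholder


async def converse(audio_chunks):
    """Stream audio to the voice interface and print transcript, expressions, and replies."""
    async with websockets.connect(EVI_URL) as ws:
        # Send each chunk of raw audio as a base64-encoded JSON message.
        for chunk in audio_chunks:
            await ws.send(json.dumps({
                "type": "audio_input",
                "data": base64.b64encode(chunk).decode("ascii"),
            }))

        # Read events until the server signals the end of its turn.
        while True:
            event = json.loads(await ws.recv())
            if event.get("type") == "user_message":
                # Transcription plus prosody-based expression scores (assumed field names).
                print("You said:", event.get("transcript"))
                print("Top expressions:", event.get("expressions", {}))
            elif event.get("type") == "assistant_message":
                print("EVI replied:", event.get("text"))
            elif event.get("type") == "assistant_end":
                break  # end-of-turn detection: the assistant has finished speaking


if __name__ == "__main__":
    # A short buffer of silence stands in for real microphone audio.
    asyncio.run(converse([b"\x00" * 3200]))
```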

A comedic AI from @hume_ai takes a jab at @jason. One of the many capabilities of the new Empathic Voice Interface (EVI). Check out the full episode here: https://t.co/7PGnrOZ2f3 https://t.co/kBUUWREEuk
AI is understanding human emotions! The Measurement API from @hume_ai is designed for real-time emotion and expression analysis. @alancowen demos how it works: https://t.co/RFyn0nIPea
Insane AI demo from @hume_ai's @alancowen! The Empathic Voice Interface (EVI) understands the user's tone of voice and uses vocal signals to guide its language and speech. It is designed to respond better, faster, and more naturally than traditional voice AI. https://t.co/O0GlxPArfE
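
The Measurement API mentioned in the second demo is about scoring emotions and expressions in real time rather than holding a conversation. Below is a rough sketch of how a streaming client for that kind of API might look; the websocket URL, the `models`/`data` payload, and the shape of the returned prosody scores are assumptions for illustration, not the published schema.

```python
# Hypothetical sketch of real-time expression measurement over a websocket.
# Endpoint, payload fields, and response shape are assumptions, not a documented API.
import asyncio
import base64
import json

import websockets  # pip install websockets

STREAM_URL = "wss://api.example-measurement.ai/v0/stream?api_key=YOUR_API_KEY"  # placeholder


async def score_expressions(wav_bytes: bytes) -> list:
    """Send one audio clip and return a list of emotion scores for it."""
    async with websockets.connect(STREAM_URL) as ws:
        await ws.send(json.dumps({
            "models": {"prosody": {}},                     # request vocal-expression scores
            "data": base64.b64encode(wav_bytes).decode(),  # audio payload
        }))
        response = json.loads(await ws.recv())
        # Assumed response shape: {"prosody": {"predictions": [{"emotions": [...]}]}}
        predictions = response.get("prosody", {}).get("predictions", [{}])
        return predictions[0].get("emotions", [])


async def main():
    with open("clip.wav", "rb") as f:
        emotions = await score_expressions(f.read())
    # Print the five highest-scoring emotions, if any were returned.
    for e in sorted(emotions, key=lambda x: x.get("score", 0.0), reverse=True)[:5]:
        print(f"{e.get('name')}: {e.get('score', 0.0):.2f}")


if __name__ == "__main__":
    asyncio.run(main())
```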