Hume AI secures $50 million funding for 'emotional' AI chatbot
Hume AI Inc., an artificial intelligence startup, has raised $50 million in a Series B funding round. The round was led by EQT Ventures, with participation from Union Square Ventures, Metaplanet, Northwell Holdings, Nat Friedman & Daniel Gross, Comcast Ventures, and LG Technology Ventures. The funding comes on the heels of Hume AI's development of the Empathic Voice Interface (EVI), an innovative AI chatbot.
Hume AI's EVI: A leap forward in AI chatbot technology
Hume AI's EVI is a voice interface currently in beta testing. Unlike other AI chatbots, EVI interprets the emotional tone of human speakers to better comprehend their speech. It adjusts its responses based on the user's emotional tone, drawing on data from numerous human interactions to generate an appropriate vocal response almost immediately after the user has finished speaking.
Alan Cowen: The brain behind EVI's development
Alan Cowen, Hume AI's Founder and Chief Scientist, played a crucial role in the development of EVI. His background in semantic space theory, developed during his time at Google LLC, contributed significantly to the technology. Semantic space theory is a computational approach to mapping emotional expression and experience, and it enables EVI to comprehend the nuances of the human voice and provide realistic, engaging voice-first generative AI experiences.
EVI's unique architecture: The empathetic large language model
EVI was developed using a multimodal generative AI that merges standard large language model capabilities with expression-measurement techniques. This architecture, which Hume AI terms an "empathic large language model," or eLLM, allows EVI to adjust its word choice and tone depending on context and the user's emotional expressions. The interface can accurately detect when a speaker is finishing their conversational turn and respond almost immediately, contributing to a more fluid, humanlike conversational interaction.
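Hume AI has not published the internals of EVI's turn detection, but the general problem it solves is deciding when a speaker has finished so a reply can begin without an awkward lag. A minimal, hypothetical sketch of one classic approach, silence-based end-of-turn detection over audio frame energies, looks like this (all names and thresholds are illustrative, not Hume's method):

```python
# Hypothetical sketch: silence-based end-of-turn detection.
# EVI's actual technique is not public; this only illustrates the
# general idea of spotting the pause that ends a conversational turn.

def detect_turn_end(frame_energies, silence_threshold=0.01,
                    min_silence_frames=15):
    """Return the index of the frame where the speaker's turn ends,
    i.e. the start of the first run of quiet frames long enough to
    count as end-of-turn, or None if no such run occurs."""
    run_start = None
    run_length = 0
    for i, energy in enumerate(frame_energies):
        if energy < silence_threshold:
            if run_length == 0:
                run_start = i  # remember where the pause began
            run_length += 1
            if run_length >= min_silence_frames:
                return run_start
        else:
            run_length = 0  # speech resumed; reset the pause counter
    return None

# Example: 20 frames of speech followed by a long pause.
frames = [0.3] * 20 + [0.005] * 20
print(detect_turn_end(frames))  # → 20
```

Real systems layer prosodic and linguistic cues on top of simple energy thresholds, which is presumably where an emotion-aware model like EVI's gains its edge in timing responses naturally.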
Hume AI to release API for enhanced chat experience
Hume AI is set to release its application programming interface (API) in beta next month, enabling developers to integrate EVI into their apps. The API includes the eLLM as well as tools for measuring human emotional expression, which are crucial for facilitating realistic chats. In addition to its empathic conversational capabilities, EVI offers fast and reliable transcription and text-to-speech functionality, making it adaptable to a wide range of scenarios.
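To make the "expression measurement" idea concrete: a developer using such tools would receive per-emotion scores for an utterance and could condition the assistant's reply on them. The sketch below is purely illustrative; the emotion labels, score format, and style mapping are invented for this example and are not Hume AI's API:

```python
# Hypothetical illustration of conditioning a reply style on measured
# expression scores, in the spirit of the empathic behavior described.
# The emotion names and mappings are invented, not Hume AI's schema.

def pick_reply_style(expression_scores):
    """Choose a response style from a dict of emotion -> score in [0, 1]."""
    dominant = max(expression_scores, key=expression_scores.get)
    styles = {
        "frustration": "apologetic and concise",
        "joy": "upbeat, matching the user's energy",
        "confusion": "slow, step-by-step explanation",
    }
    return styles.get(dominant, "neutral and helpful")

scores = {"joy": 0.8, "frustration": 0.1, "confusion": 0.1}
print(pick_reply_style(scores))  # → upbeat, matching the user's energy
```

In practice an eLLM would fold this signal into generation itself (word choice, prosody, timing) rather than applying a coarse post-hoc style switch, but the input/output shape gives a sense of what the measurement tools expose.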
Voice interfaces are the future of AI interaction
Cowen believes that voice interfaces will eventually become the primary way people interact with AI. He stated, "Speech is four times faster than typing; frees up the eyes and hands; and carries more information in its tune, rhythm, and timbre. That's why we built the first AI with emotional intelligence to understand the voice beyond words." He added, "Based on your voice, it can better predict when to speak, what to say, and how to say it."
EQT Ventures expresses confidence in Hume AI
EQT Ventures' Partner Ted Persson expressed his belief in the potential of Hume AI's empathic models. He stated, "We believe that Hume is building the foundational technology needed to create AI that truly understands our wants and needs, and are particularly excited by its plan to deploy it as a universal interface."