Moodies aims to decode emotions from speech on iOS
Beyond Verbal has announced the launch of Moodies, an emotion-decoding app that aims to help users understand the emotions carried in speech. By tapping a button, Moodies users can record their own voice or that of people nearby, and the app then analyzes the raw voice recordings to measure the emotions they contain.
The app was built on 18 years of Beyond Verbal research by physicists and neuropsychologists, who studied more than 70,000 test subjects across more than 30 languages. The company says its “emotional detection engine” helps users understand the mood, attitude and decision-making characteristics of themselves or those around them, and requires voice clips of 15 seconds to work.
“Over the past six months we have envisioned a world where our emotions’ analytics software is embedded into any device or platform. As we continue to reach additional markets and grow as a company, we believe this vision is coming to life,” says Yuval Mor, CEO of Beyond Verbal.
The app stores its mood analysis reports in a running feed of results so users can track their emotions over time, with each reading broken into primary mood, secondary mood and overall “mood group” categories. In other words, users can see the speaker’s dominant emotional state alongside any emotions that may be underlying in their speech. Results can be shared with others via email, Facebook or Twitter.
“We envision Moodies as a front-runner to a new breed of emotionally-aware apps and invite third-party developers to join in and license the technology for their own solutions,” added Mor. “As the industry continues to evolve, we believe it’s only a matter of time until this technology is showcasing on many levels it’s not what you say, but how you say it.”