
SALMONN, The First AI Model that Hears As Humans Do


Author(s): Ignacio de Gregorio

Originally published on Towards AI.

The Path Toward Human-Like Senses Continues

People often underestimate how important hearing is for functioning in our world and, more importantly, as an essential tool for learning.

As Helen Keller, who was herself both blind and deaf, famously said, “Blindness cuts us off from things, but deafness cuts us off from people.”

Therefore, it’s only natural to view hearing as an indispensable requirement if AI is to become the superior ‘being’ that some people predict.

Sadly, current AI systems suck at hearing.

Yes, the new ChatGPT version that leverages OpenAI’s Whisper model understands speech pretty well, and other models capture…
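For a sense of what that speech capability looks like in practice, transcription with Whisper takes only a few lines of Python. This is a minimal sketch, assuming the open-source openai-whisper package is installed and using "audio.mp3" as a placeholder path to a local recording.

```python
# Minimal sketch: transcribing speech with OpenAI's open-source Whisper model.
# Assumes `pip install openai-whisper`; "audio.mp3" is a placeholder file path.
import whisper

# Load a pretrained checkpoint ("base" trades some accuracy for speed).
model = whisper.load_model("base")

# Transcribe the recording; Whisper detects the spoken language automatically.
result = model.transcribe("audio.mp3")
print(result["text"])
```

Transcription, however, only turns speech into text; it says nothing about the rest of the acoustic world, which is exactly the gap this article is about.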


