SALMONN, The First AI Model that Hears As Humans Do
Last Updated on November 10, 2023 by Editorial Team
Author(s): Ignacio de Gregorio
Originally published on Towards AI.
The Path Toward Human-Like Senses Continues
People often underestimate how important hearing is for functioning in our world and, more importantly, as an essential tool for learning.
As the famed Helen Keller once said, "Blindness cuts us off from things, but deafness cuts us off from people." And let's not forget that Keller herself was both blind and deaf.
Therefore, it's only natural to see hearing as an indispensable requirement for AI to become the sought-after superior "being" that some people predict it will become.
Sadly, current AI systems suck at hearing.
Yes, the new ChatGPT version that leverages OpenAI's Whisper model understands speech pretty well, and other models capture… Read the full blog for free on Medium.