
Understanding Attention Mechanisms: The Basis for ChatGPT and LLMs

Author(s): Sai Viswanth

Originally published on Towards AI.

A practical implementation of the attention mechanism on two different sentences.
Photo by Andrew Neel on Unsplash

The surge of large language models behind AI applications such as ChatGPT and GitHub Copilot was made possible by a groundbreaking concept called the attention mechanism, introduced in the 2017 paper Attention Is All You Need.

Long before attention, Recurrent Neural Networks (RNNs) handled entire sentences by processing them word by word, in order, and producing an output suited to the task at hand. For translation, for instance, an RNN would take an English sentence and turn it into Spanish.
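To make that sequential processing concrete, here is a minimal sketch of a vanilla RNN cell in NumPy. The dimensions, random weights, and five-token "sentence" are toy assumptions for illustration, not code from this post:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden = 8, 16                     # assumed toy embedding/hidden sizes
W_xh = rng.normal(scale=0.1, size=(d_in, d_hidden))
W_hh = rng.normal(scale=0.1, size=(d_hidden, d_hidden))
b_h = np.zeros(d_hidden)

sentence = rng.normal(size=(5, d_in))      # stand-in embeddings for 5 words

h = np.zeros(d_hidden)                     # hidden state starts empty
for x_t in sentence:                       # words are consumed strictly in order
    h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)

# `h` now has to summarize the entire sentence. Information from early words
# gets overwritten a little at each step -- the root of the long-range problem.
print(h.shape)                             # (16,)
```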

Ever wonder why we didn’t have ChatGPT or Google Bard-like apps earlier? RNNs had trouble remembering words in long sentences, especially words near the beginning, which was a big problem because a single word can change a sentence’s entire meaning.

Image by the Author

Without the word “don’t,” the sentence means that I would want to go to the office, so if the RNN somehow misses this one word, the output is completely wrong.

To fix this, Long Short-Term Memory (LSTM) networks, an improved version of RNNs, became better at capturing relationships between distant words because they maintain short-term and long-term memory in separate states. Even so, they still struggled with long text sequences.
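As a rough illustration of that idea, here is a single LSTM step in NumPy, with the cell state c carrying long-term memory and the hidden state h carrying short-term output. Again, the sizes and random weights are illustrative assumptions rather than the post’s actual code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
d_in, d_h = 8, 16                                # assumed toy sizes
W = rng.normal(scale=0.1, size=(d_in, 4 * d_h))  # input weights, 4 gates stacked
U = rng.normal(scale=0.1, size=(d_h, 4 * d_h))   # recurrent weights
b = np.zeros(4 * d_h)

def lstm_step(x_t, h, c):
    """One LSTM step: gates decide what to forget, write, and expose."""
    z = x_t @ W + h @ U + b
    f, i, o, g = np.split(z, 4)
    f, i, o, g = sigmoid(f), sigmoid(i), sigmoid(o), np.tanh(g)
    c = f * c + i * g                      # cell state: long-term memory
    h = o * np.tanh(c)                     # hidden state: short-term output
    return h, c

h, c = np.zeros(d_h), np.zeros(d_h)
for x_t in rng.normal(size=(5, d_in)):     # five toy word embeddings
    h, c = lstm_step(x_t, h, c)
```

The forget gate f lets the cell keep a value like “don’t” alive across many steps, but in practice the gates still saturate on very long sequences.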

Then… Read the full blog for free on Medium.
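As a preview of where the post goes next, here is a minimal NumPy sketch of the scaled dot-product self-attention from Attention Is All You Need, Attention(Q, K, V) = softmax(QKᵀ / √d_k)V. The toy embeddings and random projection weights are illustrative assumptions, not the post’s actual implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V -- the core formula from the 2017 paper."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)        # every token scores every other token
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
d_model = 16
X = rng.normal(size=(6, d_model))          # stand-in embeddings for a 6-word sentence

# Self-attention: queries, keys, and values all come from the same sentence.
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d_model, d_model)) for _ in range(3))
out, weights = scaled_dot_product_attention(X @ Wq, X @ Wk, X @ Wv)

# Each row of `weights` sums to 1: every token attends to every other token
# directly, so a word like "don't" can influence the whole sentence no matter
# how far away it sits -- exactly what RNNs and LSTMs struggled with.
print(out.shape, weights.shape)            # (6, 16) (6, 6)
```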


