Google Gemini vs. GPT-4
Last Updated on January 25, 2024 by Editorial Team
Author(s): Tim Cvetko
Originally published on Towards AI.
How Does Google’s New Multimodal Transformer Match Up Against OpenAI’s GPT-4?
On December 6th, 2023, Google announced Gemini, its new multimodal large language model, reporting that it outperforms all existing models on the MMLU benchmark, including GPT-4.
Gemini is a family of highly capable multimodal models developed at Google. Google trained Gemini jointly across image, audio, video, and text data to build a model with strong generalist capabilities across modalities, alongside cutting-edge understanding and reasoning performance in each respective domain.
Source: Deepmind
Who is this article useful for? AI Engineers, Product Builders, etc.
How advanced is this post? Anybody remotely acquainted with LLMs should be able to follow along.
Follow for more of my content: timc102.medium.com
Gemini uses a new architecture that merges a multimodal encoder and decoder. The encoder’s job is to convert different types of data into a common language that the decoder can understand. Then the decoder takes over, generating outputs in different modalities based on the encoded inputs and the task at hand.
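Gemini's exact internals are not public, but the encoder/decoder idea above can be illustrated with a toy sketch: each modality gets its own encoder that projects raw features into a shared embedding space, and the resulting sequences can then be concatenated for a single decoder. Everything here (dimensions, the linear "encoders") is a hypothetical simplification, not Gemini's actual design.

```python
import numpy as np

# Illustrative sketch only: Gemini's real architecture is not public.
# Each modality-specific encoder maps raw inputs into a shared
# d_model-dimensional space that one decoder can consume.

D_MODEL = 64  # shared embedding width (hypothetical)

def make_encoder(input_dim, d_model=D_MODEL, seed=0):
    """Return a toy linear 'encoder' for one modality."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((input_dim, d_model)) / np.sqrt(input_dim)
    return lambda x: x @ W  # (seq_len, input_dim) -> (seq_len, d_model)

# One encoder per modality, each with its own input dimensionality.
text_encoder = make_encoder(input_dim=32, seed=1)
image_encoder = make_encoder(input_dim=128, seed=2)

text_tokens = np.ones((5, 32))     # 5 text tokens
image_patches = np.ones((9, 128))  # 9 image patches

# Both modalities land in the same space, so their sequences can be
# concatenated into one input for the decoder.
shared_seq = np.concatenate(
    [text_encoder(text_tokens), image_encoder(image_patches)], axis=0
)
print(shared_seq.shape)  # (14, 64)
```

The point of the sketch is the "common language" step: once text tokens and image patches live in the same 64-dimensional space, the decoder does not need to know which modality each position came from.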
Source: Interacting with MultiModal AI
Here’s a breakdown of Gemini’s key components:
Multimodal Encoder: This module processes the input data from each modality (e.g., text, image) independently, extracting relevant features and generating individual representations.
Cross-modal Attention Network: This network is the heart of Gemini. It allows the model to learn relationships and dependencies between the…
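A minimal sketch of what a cross-modal attention step could look like, assuming standard scaled dot-product attention where one modality's representations act as queries over another's. This is a generic illustration of the mechanism, not Gemini's actual implementation.

```python
import numpy as np

def cross_modal_attention(queries, keys, values):
    """Scaled dot-product attention: one modality attends to another."""
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)           # (n_q, n_kv)
    # Numerically stable row-wise softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values                            # (n_q, d_v)

rng = np.random.default_rng(0)
text_repr = rng.standard_normal((5, 64))   # 5 encoded text tokens
image_repr = rng.standard_normal((9, 64))  # 9 encoded image patches

# Text tokens query the image representation, pulling in visual context.
fused = cross_modal_attention(text_repr, image_repr, image_repr)
print(fused.shape)  # (5, 64)
```

Each output row is a weighted mixture of image-patch representations, with weights determined by how strongly each text token matches each patch; that is the sense in which the model "learns relationships and dependencies" across modalities.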