
Code Llama 70B 🦙 Is One Step Away From Surpassing GPT-4

Last Updated on February 2, 2024 by Editorial Team

Author(s): Gao Dalie (高達烈)

Originally published on Towards AI.

When one thinks about the development of artificial intelligence models, the two names that initially come to mind are OpenAI and, perhaps, Google. However, they are not the only ones.

In this post, we will delve into the key features and improvements that Code Llama 70B brings to the table.

Meta has released Code Llama 70B, a new code generation model. The new model is one step away from surpassing GPT-4 and is released as open source, free for commercial use. In other words, the most advanced coding AI is now available to anyone.

This is an incredible development, considering that just three years ago…

If you like this topic and you want to support me:

Clap my article 50 times; that will really help me out. 👏
Follow me on Medium and subscribe to get my latest article. 🫶

Code Llama 70B can handle more queries than previous versions, which means developers can feed it more prompts while programming and get more accurate results.

Code Llama 70B achieves 53% accuracy on the HumanEval benchmark, outperforming GPT-3.5 (48.1%) and narrowing the gap with GPT-4 (67%) and Gemini Ultra (74.4%), according to figures published by the respective companies.
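For context on what that 53% means: HumanEval results are typically reported as pass@1, the fraction of hand-written programming problems for which a generated solution passes the reference unit tests. Below is a minimal sketch of the unbiased pass@k estimator popularized by the original HumanEval paper; the sample counts in the usage line are made up for illustration and are not Meta's actual evaluation numbers.

```python
import numpy as np

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator for a single problem.

    n: number of completions sampled for the problem
    c: number of those completions that pass the unit tests
    k: the k in pass@k
    """
    if n - c < k:
        return 1.0
    # 1 - probability that k samples drawn without replacement are all failures
    return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))

# Illustrative numbers only: 200 samples per problem, 106 passing -> pass@1 = 0.53
print(pass_at_k(n=200, c=106, k=1))
```

Averaging this estimate over all 164 HumanEval problems gives the headline score that models such as Code Llama 70B, GPT-3.5, and GPT-4 report.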

Built on Llama 2, Code Llama helps developers create strings of code from… Read the full blog for free on Medium.
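Since the walkthrough above is cut off, here is a minimal sketch of what prompting Code Llama 70B could look like through Hugging Face transformers. The model ID, prompt, and generation settings are illustrative assumptions rather than details from the article, and the full-precision 70B weights require substantial GPU memory (quantized builds are a common workaround).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model ID for the instruction-tuned 70B variant.
model_id = "codellama/CodeLlama-70b-Instruct-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # spread layers across available GPUs
)

# The instruct variant expects a chat-style prompt; the tokenizer's chat
# template takes care of the exact formatting.
messages = [
    {"role": "user",
     "content": "Write a Python function that checks whether a string is a palindrome."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```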


Published via Towards AI
