

Learn AI Together — Towards AI Community Newsletter #6

Author(s): Towards AI Editorial Team

Originally published on Towards AI.

Good morning, AI enthusiasts! I’m thrilled to share some exciting updates in this week’s Learn AI Together newsletter. First, we’re launching the RAG for Production with LlamaIndex and LangChain course. This (completely free) course is the final collaboration between Towards AI, Activeloop, and the Intel Disruptor Initiative. It builds on the success of our previous two courses and goes deeper into advanced retrieval-augmented generation (RAG) techniques. We ran a poll asking how familiar you were with RAG, and most of you wanted to learn more about it. So here it is! Expect more RAG videos on my YouTube channel as well.

Additionally, make sure to catch this week’s What’s AI Podcast episode, where I had the privilege to talk with Jerry Liu, CEO of LlamaIndex, once again about RAG systems, LLMs, and more. It’s a very insightful discussion from a fantastic human who built a well-established company around RAG.

Lastly, our Learn AI Together Community section is full of opportunities. There’s something for every builder, from data science projects to developing AI models.

Join in, collaborate, and learn together!

– Louis-François Bouchard, Towards AI Co-founder & Head of Community

Announcing RAG for Production with LlamaIndex and LangChain Course

This week, we are excited to announce our Retrieval Augmented Generation (RAG) for Production with LlamaIndex and LangChain course, in collaboration with Activeloop and the Intel Disruptor Initiative. This free course is part of our Gen AI 360: Foundational Model Certification program and follows the success of our “LangChain & Vector Databases In Production” and “Training and Fine-tuning LLMs for Production” courses.

The RAG for Production with LlamaIndex and LangChain course provides the theoretical knowledge and practical skills necessary to build advanced RAG products. You will start by learning the essential RAG tool stack: loading, indexing, storing, and querying in LangChain and LlamaIndex. We’ll also demystify the two libraries to help you select the right one for RAG or other LLM applications. You will then move on to more advanced RAG techniques aimed at surfacing and using more relevant information from the dataset, including query expansion, query transformation, reranking, recursive retrieval, optimization, and production tips and techniques with LlamaIndex. Next comes the exciting stuff: building RAG agents in LangChain and LlamaIndex, an introduction to OpenAI Assistants, and other tools and models that can be used in RAG products. We conclude with a summary of RAG evaluation techniques in LlamaIndex, together with an introduction to LangSmith in LangChain.
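To make the retrieve-then-generate loop at the heart of RAG concrete, here is a minimal, self-contained sketch in plain Python. It deliberately avoids the LlamaIndex and LangChain APIs the course teaches: the bag-of-words “embedding”, the toy corpus, and the function names are all illustrative stand-ins, not the course’s actual code. Real systems swap in learned embedding models, a vector database, and an LLM call.

```python
from collections import Counter
import math

# Toy corpus standing in for an indexed document store.
documents = [
    "LlamaIndex provides loaders and indices for retrieval augmented generation.",
    "LangChain chains together prompts, models, and tools for LLM applications.",
    "Vector databases store embeddings for fast similarity search.",
]

def embed(text: str) -> Counter:
    # Bag-of-words "embedding"; production systems use learned embedding models.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank all documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def answer(query: str) -> str:
    # In production, this assembled prompt is sent to an LLM;
    # here we just build it to show the retrieve-then-generate structure.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(answer("vector databases similarity search"))
```

Techniques like query expansion and reranking slot into this same skeleton: they rewrite the query before `retrieve` or reorder its results before the prompt is built.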

Complete all units along with quizzes to get a free certificate here:

What’s AI Weekly

In this week’s What’s AI Podcast episode, Louis-François Bouchard interviewed Jerry Liu, the CEO of LlamaIndex. They explore the complexities and challenges of AI technology, focusing on Retrieval Augmented Generation (RAG), the importance of great documentation, and the potential of emerging multimodal models like Gemini. They also dig into the technical aspects of chunking strategies and data quality in RAG systems. Jerry’s insights into embedding models, fine-tuning for specific domains, and optimizing retrieval processes are invaluable for anyone looking to implement or improve RAG systems in their projects. Jerry makes complex AI concepts accessible to a broad audience, whether you’re deeply embedded in AI or just starting to explore its possibilities. Tune in to the episode on Spotify, Apple Podcasts, or YouTube!

Learn AI Together Community section!

Featured Community post from the Discord

Win2881 worked on a project answering 300+ machine learning interview questions by scraping them from Chip Huyen’s book and prompting GPT-4. It’s pretty useful for anyone preparing for interviews: it provides in-depth solutions, step-by-step explanations of complex problems, and practical insights. Check it out on LinkedIn and share your feedback in the thread here!

AI poll of the week!

Is basic theory essential for leveraging AI? Share the theories with the most real-world relevance in the thread on Discord.

Collaboration Opportunities

The Learn AI Together Discord community is full of collaboration opportunities. If you are excited to dive into applied AI, want a study partner, or even want to find a partner for your passion project, join the collaboration channel! Keep an eye on this section, too — we share cool opportunities every week!

1. Hayley121995 is looking for data scientists and ML enthusiasts to join their AI research projects. It is a paid opportunity. They are looking for someone proficient in data visualization, machine learning, and data labeling. If you are interested, connect with them in the thread.

2. Samyssmile wants to create a Java-based CNN for self-education. They have already created a Multilayer Perceptron and are looking for Java Devs to collaborate on the new library. If this sounds interesting, contact them in the thread.

3. Heptoop is looking for someone interested in developing a UFC fight prediction model. The aim is to predict outcomes and profit from betting. If you’re a strong programmer interested in this project’s scope, reach out in the thread.

Meme of the week!

Meme shared by !BeastBlaze#5835

TAI Curated section

Article of the week

How to Fine-tune EasyOCR to Achieve Better OCR Performance by Eivind Kjosbakken

OCR is a valuable tool to extract text from images. If you are struggling with its accuracy, fine-tuning your OCR engine is the way to go. This tutorial shows you how to fine-tune EasyOCR, a free, open-source OCR engine you can use with Python.

Our must-read articles

1. Do You Understand Me? Human and Machine Intelligence by Lan Chu

This article explores the notion of intelligence and what it means for computer programs to be intelligent. It unpacks interrelated concepts such as understanding, awareness, intelligence, and consciousness, which are often discussed in the context of human cognition. Even though the precise meaning of these terms remains elusive, the article grounds them in their typical usage.

2. Build a Customer Support Bot in 20 Minutes With Tanuki + GPT4 by Martbakler

Build a customer support bot that grasps client needs in 20 minutes with Tanuki + GPT-4. Tanuki, an open-source library, easily integrates advanced AI like GPT-4 into apps, with minimal knowledge required. It lets developers leverage the power of AI without deep technical expertise while keeping outputs accurate and specific to their use case.

3. Top Important LLM Papers for the Week from 04/12 to 10/12 by Youssef Hosni

Large language models (LLMs) have advanced rapidly in recent years. As new generations of models are developed, researchers and engineers need to stay informed on the latest progress. This article summarizes some of the most important LLM papers published during the second week of December. The papers cover various topics shaping the next generation of language models, from model optimization and scaling to reasoning, benchmarking, and enhancing performance.

If you are interested in publishing with Towards AI, check our guidelines and sign up. We will publish your work to our network if it meets our editorial policies and standards.

Think a friend would enjoy this too? Share the newsletter and let them join the conversation.

Join over 80,000 subscribers and thousands of data leaders on the AI newsletter to keep up to date with the latest developments in AI, from research to projects and ideas. If you are building an AI startup, an AI-related product, or a service, we invite you to consider becoming a sponsor.

Published via Towards AI
