Learn AI Together — Towards AI Community Newsletter #17
Last Updated on March 25, 2024 by Editorial Team
Author(s): Towards AI Editorial Team
Originally published on Towards AI.
Good morning, AI enthusiasts! This week, we dive into the industry-specific dimension of AI, starting with AI’s impact on education, followed by our poll on AI’s use in SMEs. Read along to find interesting research, paid collaborations, and practical resources from the TAI team and the Discord community!
What’s AI Weekly
This week in the What’s AI podcast, Louis-François Bouchard and Luis Serrano dive into the transformative impact of AI on education, forecasting a radical shift in how future generations learn and think. They discuss how AI can personalize learning at scale, automate routine tasks, and free up educators to focus on fostering higher-order thinking skills among students. Luis shares many more insights about the impact of AI on education, along with plenty of tips for teachers and anyone who wants to communicate better. Listen to the episode on your favorite streaming platform or YouTube!
-Louis-François Bouchard, Towards AI Co-founder & Head of Community
This issue is brought to you by TruEra:
Are you building LLM apps? Then, you need LLM Observability — the ability to test, evaluate, and monitor your apps.
In the past 18 months, thousands of developers have tinkered with LLM apps, but very few of those experiments ever make it to production. Join TruEra for this webinar on how you can use LLM Observability to create and monitor high-performance GenAI apps…fast!
At the webinar on March 27th, you will learn:
- What is LLM Observability? And how does it help you build better apps faster?
- What kind of testing should you be doing?
- How should you monitor your app in production, and what metrics matter?
Learn AI Together Community section!
Featured Community post from Discord
Xa9ax collaborated on a paper demonstrating how to utilize generative data in a category-only online continual learning (CL) framework. The research proposes a prompt diversification module and a novel sample complexity-guided ensembling technique that strongly improves ID and OOD performance on online CL benchmarks. It also shows that SDXL, DALL-E 2, CogView, and DeepFloyd can vary in generated sample complexity for the same concepts and prompts. Read the paper on arXiv and support a fellow community member. Share your feedback in the Discord thread!
AI poll of the week!
LLMs are reaching commercial readiness and revolutionizing industries at scale. We believe there’s a lot of potential for AI models and tools to help SMEs grow. However, this week’s poll shows that how AI is being leveraged varies widely. We would love to hear what the ‘Something else’ you use is. Share it in the Discord thread!
Collaboration Opportunities
The Learn AI Together Discord community is overflowing with collaboration opportunities. If you are excited to dive into applied AI, want a study partner, or even want to find a partner for your passion project, join the collaboration channel! Keep an eye on this section, too — we share cool opportunities every week!
1. Louis-François Bouchard is looking for AI technical writers and developers for LLM and AI-related topics at Towards AI. This would be a paid opportunity, mostly involving creating LLM-based content and writing tutorials/course projects. If you are interested in such a contracting opportunity, reach out to him in the thread!
2. Drdub_ is mentoring a Google Summer of Code project at Apache UIMA. This particular project is aimed at intermediate C++ coders and involves working on Linux with Docker. If this interests you, apply via the link in the thread!
3. Dykyi_vladk is diving into the world of advanced neural networks and looking for a collaboration partner based in Europe with advanced knowledge of neural networks. If you are proficient in the PyTorch framework and looking for a collaborator, connect in the thread!
Meme of the week!
Meme shared by drdub_
TAI Curated section
Article of the week
Building RAG Application using Gemma 7B LLM & Upstash Vector Database by Youssef Hosni
Retrieval-augmented generation (RAG) provides large language models (LLMs) with additional information from an external knowledge source, allowing them to generate more accurate and contextual answers while reducing hallucinations. This article provides a step-by-step guide to building a complete RAG application using Google’s latest open-source LLM, Gemma 7B, and the Upstash serverless vector database.
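If you want a feel for the core loop before diving into the full tutorial, here is a minimal, illustrative RAG sketch. It is not the article’s code: a small in-memory cosine-similarity search and the all-MiniLM-L6-v2 embedding model stand in for Upstash Vector, and it stops at the augmented prompt that would be sent to Gemma 7B.

```python
# Minimal RAG sketch (illustrative only, not the article's implementation).
# Steps: embed documents, retrieve the most relevant one for a question,
# and build an augmented prompt for the LLM to answer from.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # lightweight embedding model

documents = [
    "Gemma 7B is an open-weights language model released by Google.",
    "Upstash Vector is a serverless vector database accessed over a REST API.",
    "Retrieval-augmented generation grounds LLM answers in external documents.",
]
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the question."""
    q = embedder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q  # cosine similarity, since vectors are normalized
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

question = "What is Upstash Vector?"
context = "\n".join(retrieve(question))
prompt = (
    "Answer the question using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}"
)
print(prompt)  # in the full app, this prompt is passed to Gemma 7B for generation
```

In the article’s version, the retrieval step goes through the Upstash Vector index and the final prompt is answered by Gemma 7B.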
Our must-read articles
1. A Comprehensive Guide to PyTorch Tensors: From Basics to Advanced Operations by Fatma Elik
To master deep learning, a solid grasp of tensor operations, especially tensor multiplication, is essential. This comprehensive guide takes you from PyTorch tensor basics to advanced operations (a short tensor snippet follows after this list).
2. Streamline ML Workflow with MLflow️ — I by ronilpatil
This article explains how to leverage MLflow to track machine learning experiments, register a model, and serve it in production. It also shows how to create a REST endpoint and a Streamlit web app so that users can easily interact with the model (a minimal MLflow tracking sketch also follows below).
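For the PyTorch tensor guide above, here is a tiny illustrative snippet (not taken from the article) contrasting element-wise multiplication, matrix multiplication, and batched matrix multiplication:

```python
# Illustrative PyTorch tensor operations (not from the article).
import torch

a = torch.arange(6, dtype=torch.float32).reshape(2, 3)  # 2x3 matrix
b = torch.ones(3, 2)                                     # 3x2 matrix

elementwise = a * a                                  # element-wise product, shape (2, 3)
matmul = a @ b                                       # matrix multiplication, shape (2, 2)
batched = torch.bmm(a.unsqueeze(0), b.unsqueeze(0))  # batched matmul, shape (1, 2, 2)

print(elementwise.shape, matmul.shape, batched.shape)
```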
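And for the MLflow article, here is a rough sketch of the tracking-and-registration flow it describes. This is not the article’s code; the scikit-learn model, run parameters, and model name are placeholders.

```python
# Minimal MLflow tracking sketch (placeholder model and names, not the article's code).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    *load_iris(return_X_y=True), random_state=42
)

with mlflow.start_run():
    params = {"C": 1.0, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    mlflow.log_params(params)                                   # hyperparameters
    mlflow.log_metric("accuracy", model.score(X_test, y_test))  # evaluation metric
    # Passing registered_model_name also registers the model; this assumes a
    # registry-capable tracking backend (e.g. an MLflow server with a database store).
    mlflow.sklearn.log_model(model, "model", registered_model_name="demo-classifier")
```

A registered model version can then be served with the MLflow CLI, for example `mlflow models serve -m "models:/demo-classifier/1"`, which is the kind of REST endpoint the article pairs with its Streamlit app.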
If you are interested in publishing with Towards AI, check our guidelines and sign up. We will publish your work to our network if it meets our editorial policies and standards.
Think a friend would enjoy this too? Share the newsletter and let them join the conversation.
Join over 80,000 subscribers and thousands of data leaders on the AI newsletter to keep up to date with the latest developments in AI, from research to projects and ideas. If you are building an AI startup, an AI-related product, or a service, we invite you to consider becoming a sponsor.
Published via Towards AI