#38 Back to Basics — RAG, Transformers, ML Optimization, and LLM Evaluation.
Author(s): Towards AI Editorial Team Originally published on Towards AI. Good morning, AI enthusiasts! This week, the community and I are answering some recurring questions about RAG, coding assistants, transformers, machine learning, and more. You will also find fun collaboration opportunities and …
TAI #114: Two Paths to Small LMs? Synthetic Data (Phi 3.5) vs Pruning & Distillation (Llama-3.1-Minitron)
Author(s): Towards AI Editorial Team Originally published on Towards AI. What happened this week in AI by Louie This was a week for small language models (SLMs) with significant releases from Microsoft and NVIDIA. These new models highlight the growing trend towards …
The Curse of Dimensionality: Why More Isn’t Always Better in Machine Learning
Author(s): Souradip Pal Originally published on Towards AI. This member-only story is on us. Upgrade to access all of Medium. In the world of machine learning, you’re often knee-deep in datasets. These datasets could be anything — a collection of housing prices, …
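The teaser above names the curse of dimensionality without showing it. One well-known symptom is distance concentration: as dimensionality grows, the nearest and farthest points look almost equally far away, which undermines distance-based methods like k-NN. A minimal illustrative sketch (not code from the article; the function name is my own) could look like this:

```python
import numpy as np

def distance_spread(dim, n_points=500, seed=0):
    """Relative spread of distances from the origin for uniform random points.

    Returns (max_dist - min_dist) / min_dist. As `dim` grows, distances
    concentrate around a common value and this ratio shrinks toward 0 --
    the "curse of dimensionality" the article's title refers to.
    """
    rng = np.random.default_rng(seed)
    points = rng.uniform(size=(n_points, dim))
    dists = np.linalg.norm(points, axis=1)  # Euclidean distance of each point
    return (dists.max() - dists.min()) / dists.min()

# In 2 dimensions the contrast between near and far points is large;
# in 1000 dimensions it nearly vanishes.
print(distance_spread(2), distance_spread(1000))
```

With 500 uniform points, the 2-D spread is typically orders of magnitude larger than the 1000-D spread, which is why "more features" is not always better.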
Attention Is All You Need: How the Transformer Architecture in NLP Started.
Author(s): Surya Maddula Originally published on Towards AI. This member-only story is on us. Upgrade to access all of Medium. Original Paper: Attention Is All You Need. This was THE paper that introduced the Transformer architecture to NLP. This transformative concept …
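As a quick refresher on the paper's core operation, here is a hedged NumPy sketch of scaled dot-product attention (single head, no masking, no learned projections; function and variable names are illustrative, not the article's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core op from the paper.

    Q, K, V: arrays of shape (seq_len, d_k) / (seq_len, d_k) / (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of each query to each key
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # weighted average of value vectors

# Tiny demo: 3 tokens, 3-dimensional queries/keys/values.
Q = np.eye(3)
K = np.eye(3)
V = np.arange(9.0).reshape(3, 3)
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)
```

Because each row of the attention weights sums to 1, every output row is a convex combination of the value rows.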
#37 GraphRAG, SAM 2, Embeddings, Discord Chatbot, LSTM Project!
Author(s): Towards AI Editorial Team Originally published on Towards AI. Good morning, AI enthusiasts! This week, we dive into applied AI developments, fundamental concepts, real-world discussions, and more. Dive in and enjoy the read! What’s AI Weekly This week in What’s AI, …
TAI #113: Sakana’s AI Scientist — Are LLM Agents Ready To Assist AI Research?
Author(s): Towards AI Editorial Team Originally published on Towards AI. What happened this week in AI by Louie This week, xAI joined the growing crowd of broadly GPT-4 class models, which now includes models from OpenAI, Anthropic, DeepMind, xAI, Meta, Mistral, and …
#36 A Framework for Building Scalable AI Products, Best AI Tools for Marketers, ML Library, and more!
Author(s): Towards AI Editorial Team Originally published on Towards AI. Good morning, AI enthusiasts! This week, we have curated an interesting mix of resources around using AI for businesses, building AI products, and understanding AI models, along with exciting collaboration opportunities. What’s …
TAI #112: Agent Capabilities Advancing; METR Eval and Inference Compute Scaling
Author(s): Towards AI Editorial Team Originally published on Towards AI. What happened this week in AI by Louie This week saw fewer major announcements in AI, but there were still some notable developments. New open-source models were released, including Qwen 2 Math …
Taylor Series in AI.
Author(s): Surya Maddula Originally published on Towards AI. P.S. Read through this article a bit slowly, word by word; you’ll thank me later 😉 Let’s see what the Taylor Series is and how it relates to its applications …
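The Taylor series approximates a function near a point by a polynomial built from its derivatives; in ML it shows up in, e.g., gradient-based optimization and activation approximations. A small standalone sketch (my own illustration, not the article's code) of the partial sums for e^x around 0:

```python
import math

def taylor_exp(x, n_terms):
    """Partial Taylor series for e^x around 0: sum of x^k / k! for k < n_terms."""
    return sum(x**k / math.factorial(k) for k in range(n_terms))

# More terms give a better approximation near the expansion point.
for n in (2, 5, 10):
    print(n, taylor_exp(1.0, n), math.e)
```

With 10 terms the partial sum already matches e to better than one part in a million, illustrating how quickly the series converges close to the expansion point.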
Can Mixture of Experts (MoE) Models Push GenAI to the Next Level?
Author(s): Nick Minaie, PhD Originally published on Towards AI. Having worked in the AI/ML field for many years, I vividly recall the early days of GenAI when creating even simple …