DeepSeek R1 Distilled Models in Ollama: Not What You Think
Author(s): Kshitij Darwhekar Originally published on Towards AI. DeepSeek R1's distilled models in Ollama sound like smaller versions of the original, but are they really? …
10 FAQs on AI Agents: Decoding Googleβs Whitepaper in Simple Terms
Author(s): Kshitij Darwhekar Originally published on Towards AI. This article is part of a new series I'm launching called 10 FAQs. In this series, I aim to break down complex …
Model Distillation: The Key to Efficient AI Deployment
Author(s): Kshitij Darwhekar Originally published on Towards AI. Shrink Your AI, Not Its Power: The Case for Distilled Models. …
AlexNet: The Deep Learning Breakthrough That Changed Computer Vision
Author(s): Kshitij Darwhekar Originally published on Towards AI. This article delves into AlexNet's journey, from its groundbreaking architecture and innovations to its lasting impact on the field of deep learning. Explore the key features, techniques to reduce overfitting, and its legacy in …