Microsoft Phi-2 + Hugging Face + LangChain = Super Tiny Chatbot
Author(s): Gao Dalie Originally published on Towards AI. Today, Microsoft Research released the latest version of its small language model (SLM), Phi-2, which has only 2.7 billion parameters. So, in this post, we will learn what Microsoft Phi-2 is, …
Comparing Dense Attention vs Sparse Sliding Window Attention
Author(s): Building Blocks Originally published on Towards AI. Introduction A little over a month ago, in part 1 of this series, we identified the prevalence of sink tokens. These are a small subset of tokens that accumulate a large portion of the …
Antigranular: How to Access Sensitive Datasets Without Looking At Them
Author(s): Bex T. Originally published on Towards AI. …and developing privacy-first data projects. Image by me with Leonardo AI. Today, open-source data represents the tip of the world's data iceberg. Most of the world's data is below the ocean surface because of privacy …
Computational Linguistics: Detecting AI-Generated Text
Author(s): Matteo Consoli Originally published on Towards AI. AI content indicators: ASL, readability, simplicity, and burstiness. Introduction Every time I read something on Medium or LinkedIn, I can't stop thinking whether it's written by a human or it's a text generated by …
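The ASL and burstiness indicators named in this teaser can be sketched in a few lines of Python. This is a crude illustration, not the article's exact method: the regex sentence splitter and the use of standard deviation of sentence lengths as a burstiness proxy are assumptions.

```python
import re
import statistics

def sentence_stats(text: str):
    # Split into sentences on ., !, ? (a deliberately crude heuristic).
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    asl = sum(lengths) / len(lengths)        # ASL: average sentence length in words
    burstiness = statistics.pstdev(lengths)  # spread of sentence lengths as a burstiness proxy
    return asl, burstiness

asl, burst = sentence_stats(
    "Short one. This sentence is quite a bit longer than the first. Tiny."
)
print(round(asl, 2), round(burst, 2))
```

The intuition behind the proxy: human writing tends to mix short and long sentences (high spread), while generated text is often more uniform (low spread).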
Using Vega-Lite for Data Visualization
Author(s): Angelica Lo Duca Originally published on Towards AI. A tutorial on how to start using Vega-Lite to draw charts. Image by Author. Vega-Lite is a concise JSON representation of a Vega visualization. Vega is a visualization grammar used to represent the elements …
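To make the teaser's point concrete that Vega-Lite is "a concise JSON representation," here is a minimal bar-chart specification built as a Python dict and serialized to JSON. The data values are invented for illustration; the `$schema` URL, `mark`, and `encoding` fields follow the public Vega-Lite v5 JSON schema.

```python
import json

# A minimal Vega-Lite spec: inline data, a bar mark, and x/y encodings.
spec = {
    "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
    "data": {"values": [{"x": "A", "y": 28}, {"x": "B", "y": 55}]},
    "mark": "bar",
    "encoding": {
        "x": {"field": "x", "type": "nominal"},
        "y": {"field": "y", "type": "quantitative"},
    },
}

# Serialize to the JSON form a Vega-Lite renderer consumes.
print(json.dumps(spec, indent=2))
```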
Top Important LLM Papers for the Week from 11/12 to 17/12
Author(s): Youssef Hosni Originally published on Towards AI. Stay Updated with Recent Large Language Models Research Large language models (LLMs) have advanced rapidly in recent years. As new generations of models are developed, researchers and engineers need to stay informed on the …
Mixtral 8x7B, The New World's Best Pound-For-Pound AI
Author(s): Ignacio de Gregorio Originally published on Towards AI. The Power of Combining Experts Just like any other week these days, a new open-source model has come out. But this time, it's different. Emulating one of the core features that turned OpenAI's …
PowerInfer: 11x Speed up LLaMA II Inference On a Local GPU
Author(s): Dr. Mandar Karhade, MD. PhD. Originally published on Towards AI. Some neurons are HOT! Some are cold! A clever way of using a GPU-CPU hybrid interface to achieve impressive speeds! In the last article, we saw that a clever compiler, quantization, …
How Do Inherently Interpretable AI Models Work? - GAMINET
Author(s): Indraneel Dutta Baruah Originally published on Towards AI. Source: interpretable-ml-book. The field of deep learning has grown exponentially, and the recent craze about ChatGPT is proof of that. Models are becoming more and more complex, with deeper layers leading …
Top Important Computer Vision Papers for the Week from 11/12 to 17/12
Author(s): Youssef Hosni Originally published on Towards AI. Stay Updated with Recent Computer Vision Research Every week, several top-tier academic conferences and journals showcase innovative research in computer vision, presenting exciting breakthroughs in various subfields such as image recognition, vision model optimization, …
Dense X Retrieval Technique in Langchain and LlamaIndex
Author(s): Eduardo Muñoz Originally published on Towards AI. Picture by nadi borodina from Unsplash. Introduction Last December 12th, 2023, the research paper "Dense X Retrieval: What Retrieval Granularity Should We Use?" [1] by Tong Chen, Hongwei Wang, Sihao Chen, Wenhao Yu, Kaixin Ma, …
Raw Data to Interactive Maps in the Blink of an Eye with GPT4 Prompting
Author(s): John Loewen, PhD Originally published on Towards AI. Interactive UN population projection mapping with Python Plotly. Dall-E 2 image: impressionist-style painting of a global choropleth map with deep-blue oceans. ChatGPT is a terrific tool for generating data cleaning and visualization code …
What should I cook for dinner?
Author(s): Renu Gehring Originally published on Towards AI. Can recommender systems help? Photo by Malte Helmhold on Unsplash It is a weeknight, and the witching hour is approaching. Kids are hungry. You are in your kitchen, brainstorming what you should make for …
Mixtral-8x7B + GPT-3 + LLAMA2 70B = The Winner
Author(s): Gao Dalie Originally published on Towards AI. While everyone's focused on the release of Google Gemini, the open-source Mixtral 8x7B model has quietly been released in the background. So, in this article, we're diving into some of the latest AI developments …
Hands-On LangChain for LLM Applications Development: Documents Loading
Author(s): Youssef Hosni Originally published on Towards AI. In order to create an application where you can chat with your data, you first have to load your data into a format that can be worked with. That's where LangChain document loaders …