

20x Savings on OpenAI Bills with This Simple Method


Last Updated on January 14, 2024 by Editorial Team

Author(s): Dr. Mandar Karhade, MD, PhD

Originally published on Towards AI.

LLMLingua uses a small language model, such as GPT-2 small or LLaMA-2-7B, to decrease prompt size by up to 20x

TLDR:

If you want to:

- 💰 Save cost by reducing both prompt and generation lengths
- 📝 Extend context support beyond the token limits of models and APIs
- 📜 Mitigate the “lost in the middle” issue and boost overall performance
- 🕵️ Retain knowledge from the original prompt during in-context learning (ICL) and reasoning tasks without redundant tokens
- 📜 Increase the throughput of your deployments by shrinking the KV-cache footprint
- 🪃 Decompress the compressed prompt using GPT-4 so that no information is lost

…then you should read this article.

Photo by Kenny Eliason on Unsplash

With advancements in techniques such as chain-of-thought (CoT) prompting and in-context learning (ICL), the prompts fed to LLMs are becoming increasingly lengthy, sometimes exceeding tens of thousands of tokens.

To accelerate model inference and reduce cost, the paper presents LLMLingua, a coarse-to-fine prompt compression method. It combines three components: a budget controller that maintains semantic integrity under high compression ratios, a token-level iterative compression algorithm that models the interdependence between compressed contents, and an instruction-tuning-based method that aligns the distribution of the small compressor model with that of the target LLM.
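To make the core idea concrete: LLMLingua uses a small language model to score how informative each token is (via perplexity) and drops the least informative tokens until a budget is met. The sketch below is illustrative only, not the actual LLMLingua implementation — it swaps the small LM for a toy unigram surprisal model, and the names `compress_prompt` and `keep_ratio` are made up for this example:

```python
import math
from collections import Counter

def surprisal_scores(tokens, counts, total):
    # Toy stand-in for a small LM: surprisal = -log P(token) under a
    # unigram frequency model. LLMLingua instead uses a real small LM
    # (e.g., GPT-2 small) to estimate per-token perplexity.
    return [-math.log(counts[t] / total) for t in tokens]

def compress_prompt(text, keep_ratio=0.5):
    """Keep only the highest-surprisal tokens, preserving original order."""
    tokens = text.split()
    counts = Counter(tokens)
    scores = surprisal_scores(tokens, counts, len(tokens))
    budget = max(1, int(len(tokens) * keep_ratio))
    # Rank token positions by informativeness, keep the top `budget`,
    # then restore the original left-to-right order.
    keep = sorted(sorted(range(len(tokens)), key=lambda i: -scores[i])[:budget])
    return " ".join(tokens[i] for i in keep)
```

Frequent filler tokens score low and get pruned first, while rare (information-carrying) tokens survive — the same intuition that lets a strong model like GPT-4 reconstruct the meaning from the compressed prompt.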

Compressing the prompt in this way has not been shown to decrease accuracy, while it improves the model's efficiency in terms of the hardware and cost budget per query… Read the full blog for free on Medium.


Published via Towards AI
