Run OpenAI Swarms at Near-Zero Cost
Author(s): Thomas Reid
Originally published on Towards AI.
Implementing OpenAI Agentic workflows using GROQ
I posted an article a few months ago on the new swarm technology that OpenAI had just introduced. I'll leave a link to that story at the end of this article if you want to check it out. You'll probably get more out of this article if you read that one first.
Anyway, swarm introduced two core concepts:
- Agents. Like OpenAI Assistants, agents perform specific tasks. Each agent knows what it needs to do and has tools to help get its job done. If an agent can't complete a task alone, it can hand the task off to another agent via a handoff.
- Handoffs. Handoffs allow agents to pass control to other agents better suited to the current task.
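The agent/handoff idea can be sketched in plain Python. This is a conceptual illustration only, not the actual swarm library API; the `Agent` class, the handler functions, and the `run` loop below are all hypothetical names I've chosen for the sketch:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch of agents and handoffs -- NOT the real OpenAI swarm
# API. Each agent has instructions and a handler that returns either a
# final string answer or another Agent, which signals a handoff.

@dataclass
class Agent:
    name: str
    instructions: str
    handle: Callable[["Agent", str], object]

def triage_handle(agent, task):
    # The triage agent can't process refunds itself, so it hands off.
    if "refund" in task:
        return billing_agent
    return f"{agent.name}: handled '{task}'"

def billing_handle(agent, task):
    return f"{agent.name}: processed refund for '{task}'"

triage_agent = Agent("Triage", "Route requests to the right agent", triage_handle)
billing_agent = Agent("Billing", "Handle refunds and payments", billing_handle)

def run(agent: Agent, task: str) -> str:
    # Follow handoffs until some agent returns a final answer.
    while True:
        result = agent.handle(agent, task)
        if isinstance(result, Agent):
            agent = result  # handoff: control passes to the new agent
        else:
            return result

print(run(triage_agent, "refund for order 123"))
# -> Billing: processed refund for 'refund for order 123'
```

The key design point is that a handoff is just a return value: a handler that returns another agent transfers control, and the loop keeps going until an agent produces a final answer.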
The downside of using Swarms on OpenAI is cost: with many agents making LLM calls throughout a run, the token charges can add up quickly.
Luckily, many third-party LLM providers and enablers implement OpenAI-compatible API calls. This means that we can emulate OpenAI functionality using them.
And because many third-party providers offer free or very low-cost LLM usage, running agentic swarms can become very cheap.
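In practice, "OpenAI-compatible" means the provider accepts the same request shape as OpenAI, just at a different base URL and with a different API key. A minimal stdlib-only sketch of what that request looks like (the model name is an illustrative assumption; check your provider's docs, and note this builds the request without sending it):

```python
import json
import urllib.request

# Sketch: an OpenAI-compatible provider accepts the same chat-completions
# payload as OpenAI -- only the base URL and API key change.

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Switching providers is just a different base URL:
req = build_chat_request(
    "https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
    "YOUR_API_KEY",
    "llama-3.1-8b-instant",            # example model name; assumption
    "Hello!",
)
print(req.full_url)
# -> https://api.groq.com/openai/v1/chat/completions
```

Because the wire format is identical, the official `openai` client can also be pointed at such a provider by setting its base URL and API key, without changing the rest of your code.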
One such provider is GROQ. I've written…
Published via Towards AI