Supercharge FastAPI with Redis
Author(s): ronilpatil
Originally published on Towards AI.
Table of Contents
- Introduction
- The Challenge
- Cache Mechanism
- Caching with DiskCache
- Caching with Redis
- Redis on Upstash
- GitHub Repository
- Conclusion
Introduction
In the fast-paced world of web applications, speed and efficiency are critical, especially when you're integrating AI models. FastAPI, well known for its high performance, already helps streamline API development. But as request volume grows and response times become vital, relying solely on FastAPI might not be enough. That's where caching comes in. By introducing caching, we can drastically reduce the load on our AI models, improve response times, and enhance the overall user experience. In this blog, we'll explore how to supercharge a FastAPI application using Redis for efficient caching.
The Challenge
When we deploy a machine learning model to production, clients usually communicate with it via an API, and FastAPI is a lightweight, easy way to build one. Suppose I build an API and multiple users hit it concurrently for predictions. When the application handles repeated requests with the same inputs, it triggers redundant computations, which is particularly costly with resource-intensive deep-learning models. This inefficiency significantly impacts response times, ultimately hurting the product's performance and user experience.
Cache Mechanism
A cache is temporary storage that holds data so it can be accessed quickly. Instead of repeatedly fetching the same data from the original source or an API, the data is stored in the cache, which is much faster to retrieve. This speeds up responses and reduces both the number of calls to the source and overall latency.
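The idea above can be sketched in a few lines of Python. This is a minimal in-process illustration, not a production cache; `slow_fetch` is a hypothetical stand-in for any expensive call (a database query, an external API, or model inference):

```python
import time

# A plain dict acting as the cache: key -> computed result.
_cache: dict[str, str] = {}

def slow_fetch(key: str) -> str:
    """Hypothetical expensive operation (DB, API, or model inference)."""
    time.sleep(0.1)  # simulate slow work
    return key.upper()

def cached_fetch(key: str) -> str:
    if key not in _cache:            # cache miss: do the slow work once
        _cache[key] = slow_fetch(key)
    return _cache[key]               # cache hit: near-instant lookup
```

The first call for a given key pays the full cost; every repeat call is served straight from memory.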
Caching with DiskCache
DiskCache is a powerful and lightweight caching library for Python that provides a simple and efficient way to store cached data in non-volatile storage like a disk or SSD.
By default, DiskCache utilizes SQLite as its backend storage for efficient caching, but it also supports various other storage options.
Code Snippet
Output
Caching with Redis
Redis is an in-memory database that runs entirely in our machine's RAM. Since accessing data from RAM is much faster than from disk, it's commonly used as a cache. It's designed for speed and capable of handling millions of requests per second, making it a favorite for high-performance applications. Unlike traditional databases that store data on disk, Redis keeps data in memory, allowing lightning-fast access.
But wait, you might think: since the data lives in RAM, won't it all be lost if the system crashes or reboots?
Don't worry: Redis offers multiple ways to persist data to disk, ensuring it can be recovered after a crash or reboot:
- RDB (Redis Database Backup): Redis creates snapshots of the dataset at specified intervals, e.g., every few minutes or after a certain number of writes. This provides a point-in-time backup of the data, so if a failure happens, we can restore the last saved snapshot. It's great for fast recovery, but we might lose any data written since the last snapshot. But that's OK; it's like the old saying, "You can't have it all."
- AOF (Append-Only File): With AOF, Redis logs every write operation to a file. This method offers better durability than RDB because, in the event of a crash, Redis can replay the log to reconstruct the dataset with minimal data loss. We can configure it to log every write immediately, giving almost real-time persistence. We can reduce data loss, but again, it comes at the cost of performance.
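Both modes are controlled from `redis.conf`. The values below are illustrative, not recommendations; tune them to your own durability/performance trade-off:

```
# redis.conf — persistence settings (illustrative values)
save 900 1            # RDB: snapshot if at least 1 write in 900 seconds
save 300 10           # RDB: snapshot if at least 10 writes in 300 seconds
appendonly yes        # enable AOF logging of every write
appendfsync everysec  # fsync the AOF once per second (balanced durability)
```

Setting `appendfsync always` minimizes data loss at the cost of write throughput, while `no` leaves flushing to the OS.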
There are other ways available, but I won't go into detail here. So, let's go ahead and implement it.
Code Snippet
Output [Client-Server]
Output [Redis server]
Note: Redis may not be installed on your system; follow the official Redis installation guide for your platform (or a tutorial of your choice) to set it up.
Redis on Upstash
Upstash is a fully managed, serverless Redis platform. It offers Redis as a service with pay-per-request pricing, meaning you only pay for what you use, making it ideal for applications with variable or low traffic. Let's use it in our API.
Code Snippet
Output [Client-Server]
Output [Upstash Redis Dashboard]
GitHub Repository
Develop & Dockerize ML Microservices using FastAPI + Docker – ronylpatil/FastAPI (github.com)
Conclusion
By utilizing asynchronous programming with FastAPI and the robust caching capabilities of Redis, you can create fast, reliable applications that provide a superior user experience. As you continue to explore and implement this architecture, consider various optimization strategies and best practices to maximize the benefits of both technologies.
If this blog has sparked your curiosity or ignited new ideas, follow me on Medium, GitHub & connect on LinkedIn, and let's keep the curiosity alive.
Your questions, feedback, and perspectives are not just welcomed but celebrated. Feel free to reach out with any queries or share your thoughts.
Thank you🙌, and
Keep pushing boundaries🚀