Cachebox
The fastest memoizing and caching Python library written in Rust.
Releases | Benchmarks | Issues
What does it do?
You can easily perform powerful caching operations in Python as fast as possible, which can make your application significantly faster. It is ideal for optimizing large-scale and complex applications with efficient, low-overhead caching.
Key Features:
- 🚀 Extremely fast (10-50x faster than other caching libraries -- benchmarks)
- 📊 Minimal memory footprint (50% of standard dictionary memory usage)
- 🔥 Full-featured and user-friendly
- 🧶 Completely thread-safe
- 🔧 Tested and correct
- 🦀 Written in Rust for maximum performance
- 🤝 Compatible with Python 3.9+ (PyPy and CPython)
- 📦 Supports 7 advanced caching algorithms
Page Contents
- ❓ When do I need caching and cachebox?
- 🌟 Why cachebox?
- 🔧 Installation
- 💡 Preview
- 🎓 Getting started
- ✏️ Incompatible changes
- 📌 Tips & Notes
When do I need caching and cachebox?
- 📈 Frequent Data Access: If you need to access the same data multiple times, caching can help reduce the number of database queries or API calls, improving performance.
- 💎 Expensive Operations: If you have operations that are computationally expensive, caching can help reduce the number of times these operations need to be performed.
- 🚗 High Traffic Scenarios: If your application handles high traffic, caching can help reduce the load on your server by reducing the number of requests that need to be processed.
- #️⃣ Web Page Rendering: If you are rendering web pages, caching can help reduce the time it takes to generate the page by caching the results of expensive rendering operations. Caching HTML pages can speed up the delivery of static content.
- 🚧 Rate Limiting: If you have a rate limiting system in place, caching can help reduce the number of requests that need to be processed by the rate limiter. Caching can also help you manage rate limits imposed by third-party APIs by reducing the number of requests sent.
- 🤖 Machine Learning Models: If your application frequently makes predictions using the same input data, caching the results can save computation time.
Why cachebox?
- ⚡ Rust: It uses the Rust language for high performance.
- 🧮 SwissTable: It uses Google's high-performance SwissTable hash map (credit to hashbrown).
- ✨ Low memory usage: It has very low memory usage.
- ⭐ Zero Dependency: As we said, cachebox is written in Rust, so you don't have to install any other dependencies.
- 🧶 Thread safe: It's completely thread-safe and uses locks to prevent problems.
- 👌 Easy To Use: You only need to import it and choose a cache implementation to use. It will behave like a dictionary.
- 🚫 Avoids Cache Stampede: It avoids cache stampede by using a distributed lock system.
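A cache stampede happens when many callers miss on the same key at once and all recompute the same expensive value. The following is a minimal pure-Python sketch of the per-key-lock idea (illustrative only; `StampedeSafeCache` is a hypothetical class, not cachebox's internal Rust implementation):

```python
import threading

class StampedeSafeCache:
    """Illustrative sketch: a per-key lock so that when many threads
    miss on the same key simultaneously, only one computes the value."""

    def __init__(self):
        self._data = {}
        self._locks = {}
        self._guard = threading.Lock()
        self.compute_calls = 0  # counter, for demonstration only

    def get_or_compute(self, key, compute):
        with self._guard:
            lock = self._locks.setdefault(key, threading.Lock())
        with lock:  # only one thread per key runs the computation
            if key not in self._data:
                self.compute_calls += 1
                self._data[key] = compute()
            return self._data[key]

cache = StampedeSafeCache()
threads = [
    threading.Thread(target=cache.get_or_compute, args=("k", lambda: 42))
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert cache.compute_calls == 1  # the expensive computation ran once
```

All eight threads serialize on the same per-key lock, so the first one populates the cache and the rest see a hit.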
Installation
cachebox is installable via pip:
pip3 install -U cachebox
[!WARNING]
The new version v5 has some incompatibilities with v4. For more info see Incompatible changes.
Examples
The simplest example of cachebox could look like this:
import cachebox
# Like functools.lru_cache: if maxsize is set to 0, the cache can grow without bound.
@cachebox.cached(cachebox.FIFOCache(maxsize=128))
def factorial(number: int) -> int:
fact = 1
for num in range(2, number + 1):
fact *= num
return fact
assert factorial(5) == 120
assert len(factorial.cache) == 1
# coroutines are also supported (`client` is assumed to be an HTTP client, e.g. httpx.AsyncClient)
@cachebox.cached(cachebox.LRUCache(maxsize=128))
async def make_request(method: str, url: str) -> dict:
response = await client.request(method, url)
return response.json()
Unlike functools.lru_cache and other caching libraries, cachebox can copy dict, list, and set objects.
@cachebox.cached(cachebox.LRUCache(maxsize=128))
def make_dict(name: str, age: int) -> dict:
return {"name": name, "age": age}
d = make_dict("cachebox", 10)
assert d == {"name": "cachebox", "age": 10}
d["new-key"] = "new-value"
d2 = make_dict("cachebox", 10)
# `d2` will be `{"name": "cachebox", "age": 10, "new-key": "new-value"}` if you use other libraries
assert d2 == {"name": "cachebox", "age": 10}
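The protection shown above amounts to copying mutable results before handing them out. Here is a hand-rolled sketch of the same idea in plain Python (`cached_make_dict` is a hypothetical name for illustration, not part of cachebox):

```python
import copy

_cache = {}

def cached_make_dict(name: str, age: int) -> dict:
    key = (name, age)
    if key not in _cache:
        _cache[key] = {"name": name, "age": age}
    # hand out a shallow copy so callers can't mutate the cached value
    return copy.copy(_cache[key])

d = cached_make_dict("cachebox", 10)
d["new-key"] = "new-value"  # mutates only the returned copy
assert cached_make_dict("cachebox", 10) == {"name": "cachebox", "age": 10}
```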
You can use cache algorithms without the cached decorator -- just import the cache algorithm you want and use it like a dictionary.
from cachebox import FIFOCache
cache = FIFOCache(maxsize=128)
cache["key"] = "value"
assert cache["key"] == "value"
# You can also use `cache.get(key, default)`
assert cache.get("key") == "value"
Getting started
There are 3 useful functions:
- cached: a decorator that helps you cache your functions and calculations with a lot of options.
- cachedmethod: the same as cached, but designed for class methods.
- is_cached: checks whether a function/method is cached by cachebox or not.
And 9 classes:
- BaseCacheImpl: base-class for all classes.
- Cache: A simple cache that has no algorithm; this is only a hashmap.
- FIFOCache: the FIFO cache will remove the element that has been in the cache the longest.
- RRCache: the RR cache will remove a random element to free up space when necessary.
- LRUCache: the LRU cache will remove the element in the cache that has not been accessed in the longest time.
- LFUCache: the LFU cache will remove the element in the cache that has been accessed the least often, regardless of time.
- TTLCache: the TTL cache will automatically remove the element in the cache that has expired.
- VTTLCache: like TTLCache, but each key can have its own time-to-live; expired elements are removed when needed.
- Frozen: you can use this class for freezing your caches.
You only need to import the classes you want and can work with them like regular dictionaries (except for VTTLCache, which has some differences).
The examples below will introduce you to these different features. All the methods in the examples are common across all classes (exceptions are noted where applicable).
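To make the eviction policies above concrete, here is an illustrative pure-Python LRU cache built on `collections.OrderedDict` -- a sketch of the policy only, not cachebox's Rust implementation:

```python
from collections import OrderedDict

class TinyLRU:
    """Illustrative LRU policy: reading a key marks it recently used,
    so inserting beyond maxsize evicts the least recently used key."""

    def __init__(self, maxsize: int):
        self.maxsize = maxsize
        self._d = OrderedDict()

    def __setitem__(self, key, value):
        if key in self._d:
            self._d.move_to_end(key)
        self._d[key] = value
        if len(self._d) > self.maxsize:
            self._d.popitem(last=False)  # evict least recently used

    def __getitem__(self, key):
        self._d.move_to_end(key)  # mark as recently used
        return self._d[key]

    def __contains__(self, key):
        return key in self._d

cache = TinyLRU(maxsize=2)
cache["a"] = 1
cache["b"] = 2
_ = cache["a"]    # "a" is now the most recently used
cache["c"] = 3    # evicts "b", the least recently used
assert "b" not in cache and "a" in cache and "c" in cache
```

A FIFO policy would instead evict "a" here, since it only considers insertion order, never access order.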
cached (🎀 decorator)
Decorator to wrap a function with a memoizing callable that saves results in a cache.
Parameters:
- cache: Specifies a cache that handles and stores the results. If `None` or `dict`, `FIFOCache` will be used.
- key_maker: Specifies a function that will be called with the same positional and keyword arguments as the wrapped function itself. It has to return a suitable cache key (must be hashable).
- clear_reuse: The wrapped function has a function named `clear_cache` that uses the cache's `clear` method to clear the cache. This parameter will be passed to the cache's `clear` method.
- callback: Every time the `cache` is used, callback is also called. The callback arguments are: event number (see `EVENT_MISS` or `EVENT_HIT` variables), key, and then result.
- copy_level: The wrapped function always copies the result of your function and then returns it. This parameter specifies how the result is copied before returning it: `0` means "never copy", `1` means "only copy `dict`, `list`, and `set` results", and `2` means "always copy the results". Defaults to 1.
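A key_maker receives the call's `args` tuple and `kwds` dict and must return something hashable. One common pattern (plain Python; `normalized_key` is an illustrative name, not a cachebox API) is to normalize keyword order so equivalent calls share a key:

```python
def normalized_key(args: tuple, kwds: dict):
    # Sort keyword items so foo(a=1, b=2) and foo(b=2, a=1)
    # map to the same hashable cache key.
    return args + tuple(sorted(kwds.items()))

assert normalized_key((1, 2), {"a": 1, "b": 2}) == normalized_key((1, 2), {"b": 2, "a": 1})
assert isinstance(hash(normalized_key((), {"x": 1})), int)  # key is hashable
```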
A simple example:
import cachebox
@cachebox.cached(cachebox.LRUCache(128))
def sum_as_string(a, b):
return str(a+b)
assert sum_as_string(1, 2) == "3"
assert len(sum_as_string.cache) == 1
sum_as_string.cache_clear()
assert len(sum_as_string.cache) == 0
A key_maker example:
import cachebox
def simple_key_maker(args: tuple, kwds: dict):
return args[0].path
# Async methods are supported
@cachebox.cached(cachebox.LRUCache(128), key_maker=simple_key_maker)
async def request_handler(request: Request):
return Response("hello man")
A typed key_maker example using a predefined key function:
import cachebox
@cachebox.cached(cachebox.LRUCache(128), key_maker=cachebox.make_typed_key)
def sum_as_string(a, b):
return str(a+b)
sum_as_string(1.0, 1)
sum_as_string(1, 1)
print(len(sum_as_string.cache)) # 2
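`make_typed_key` distinguishes arguments by type as well as value, which is why the two calls above occupy two cache entries. A rough sketch of the idea in plain Python (illustrative only, not cachebox's actual implementation):

```python
def typed_key(args: tuple, kwds: dict):
    # Append each argument's type so 1 (int) and 1.0 (float)
    # produce different cache keys even though 1 == 1.0.
    key = args + tuple(sorted(kwds.items()))
    key += tuple(type(a) for a in args)
    return key

assert typed_key((1,), {}) != typed_key((1.0,), {})  # int vs float differ
assert typed_key((1,), {}) == typed_key((1,), {})
```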
You have the option to manage caches with the `.cache` attribute, as shown in the previous examples.
There are more attributes and methods you can use:
import cachebox
@cachebox.cached(cachebox.LRUCache(0))
def sum_as_string(a, b):
return str(a+b)
print(sum_as_string.cache)
# LRUCache(0 / 9223372036854775807, capacity=0)
print(sum_as_string.cache_info())
# CacheInfo(hits=0, misses=0, maxsize=9223372036854775807, length=0, ...)