Changelog · Releases · API Reference
The fastest caching Python library written in Rust
Note
The new version v3 has some incompatibilities with v2; for more info, please see Incompatible changes.
What does it do?
It lets you perform caching operations in Python easily and as fast as possible.
Features:
- 🚀 5-20x faster than other caching libraries ...
- 📊 Very low memory usage (1/3 of dictionary) ...
- 🦀 Written in Rust
- 🤝 Supports Python 3.8 and above
- 📦 Over 7 cache algorithms are supported
- 🧶 Completely thread-safe (uses RwLock)
Installing:
Install it from PyPI:
pip3 install -U cachebox
- ⁉️ When do I need caching?
- 🤷‍♂️ Why cachebox?
- 🎓 Examples
- 💡 Incompatible changes
- ⁉️ Frequently Asked Questions
- ⏱️ Benchmark
There are some situations where you may need caching to improve your application's speed:
- Sometimes you have functions that take a long time to execute, and you need to call them repeatedly.
- Sometimes you need to temporarily store data in memory for a short period.
- When dealing with remote APIs, instead of making frequent API calls, store the responses in a cache (see the sketch after this list).
- Caching query results from databases can enhance performance.
- and more ...
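For instance, here is a minimal sketch of the remote-API case, using the cached decorator and TTLCache shown later on this page (fetch_user, the simulated delay, and the 60-second TTL are illustrative choices, not part of cachebox):
import time
import cachebox

@cachebox.cached(cachebox.TTLCache(maxsize=100, ttl=60))
def fetch_user(user_id):
    time.sleep(1)  # stands in for a slow remote API call
    return {"id": user_id, "name": f"user-{user_id}"}

fetch_user(1)  # slow: performs the "API call"
fetch_user(1)  # fast: served from the cache for the next 60 seconds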
Why cachebox?
- Rust - cachebox is written in Rust, which gives it high performance.
- SwissTable - It uses Google's high-performance SwissTable hash map.
- Low memory usage - It has very low memory usage.
- Zero-Dependency - As mentioned, cachebox is written in Rust, so you don't have to install any other dependencies.
- Thread-safe - It's completely thread-safe and uses read-write locks to avoid data races.
- Easy-To-Use - You only need to import it, choose an implementation, and use it like a dictionary (a short sketch follows this list).
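As a rough sketch of that dictionary-like usage (item assignment and the in / len operators are assumed here to mirror the lookups shown in the examples below):
import cachebox

cache = cachebox.LRUCache(maxsize=3)
cache["a"] = 1               # item assignment, like a dict (assumed to mirror cache["a"] lookups)
cache.insert("b", 2)         # or the insert() method used in the examples below
print("a" in cache)          # membership test -> True
print(len(cache))            # number of stored items -> 2
print(cache.get("missing"))  # None, like dict.get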
Tip
See API Reference for more examples and references
Decorators example:
import cachebox

@cachebox.cached(cachebox.FIFOCache(maxsize=128))
def factorial(n):
    return n * factorial(n-1) if n else 1

# Like functools.lru_cache, if maxsize is set to 0, the cache can grow without bound.
@cachebox.cached(cachebox.LRUCache(maxsize=0))
def count_vowels(sentence):
    return sum(sentence.count(vowel) for vowel in 'AEIOUaeiou')

# Async functions are also supported
@cachebox.cached(cachebox.TTLCache(maxsize=20, ttl=5))
async def get_coin_price(coin):
    return await client.get_coin_price(coin)

class Application:
    # Use cachedmethod for methods
    @cachebox.cachedmethod(cachebox.LFUCache(maxsize=20))
    def send(self, request):
        self._send_request(request)
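A quick usage note for the decorated functions above: calling one again with the same arguments is answered from its cache instead of being recomputed, so the second call below should be much cheaper.
print(count_vowels("hello world"))  # computed on the first call
print(count_vowels("hello world"))  # answered from the cache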
Implementations example:
import cachebox
import time
cache = cachebox.TTLCache(maxsize=5, ttl=3)
cache.insert("key", "value")
print(cache["key"]) # Output: value
time.sleep(3)
print(cache.get("key")) # Output: None
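A similar sketch for LRUCache (assuming, as the algorithm's name implies, that the least recently used entry is evicted once maxsize is exceeded):
import cachebox

cache = cachebox.LRUCache(maxsize=2)
cache.insert("a", 1)
cache.insert("b", 2)
cache["a"]            # touch "a" so it becomes the most recently used
cache.insert("c", 3)  # over maxsize, so the least recently used key ("b") is evicted
print("a" in cache)   # Output: True
print("b" in cache)   # Output: False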
These are changes that are not compatible with the previous version:
When you pass 0 as maxsize, the value of sys.maxsize is now used automatically.
import cachebox, sys
c = cachebox.Cache(0)
# In previous version:
assert c.maxsize == 0
# In new version:
assert c.maxsize == sys.maxsize
In the previous version, you could modify the cache after obtaining an iterator from it without causing an error; now you cannot modify the cache while using the iterator.
import cachebox
c = cachebox.Cache(0, {i: i for i in range(100)})

for (key, value) in c.items():
    # This raises RuntimeError: don't modify the cache while iterating over it
    del c[key]
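If you do need to delete entries, one workaround is to snapshot the keys first so the loop runs over a plain list instead of the live cache (this assumes a dict-like keys() method):
# Snapshot the keys, then modify the cache freely
for key in list(c.keys()):
    del c[key]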
In the previous version, we couldn't use type hints the way we can with dict; now we can:
import cachebox
# In the previous version this raised an exception; now it's OK.
c: cachebox.Cache[int, str] = cachebox.Cache(0)
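Since subscription now works, the same generic form can also appear in function signatures; a small sketch (pick_first is a hypothetical helper, not a cachebox API):
from typing import Optional
import cachebox

def pick_first(cache: cachebox.Cache[int, str]) -> Optional[str]:
    # hypothetical helper: return the value stored under key 0, if any
    return cache.get(0)

c: cachebox.Cache[int, str] = cachebox.Cache(0)
c.insert(0, "zero")
print(pick_first(c))  # Output: zero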
In previous versions, some caches such as FIFOCache could return ordered iterators; now all of them return unordered iterators only.
import cachebox
c = cachebox.FIFOCache(20)
for i in range(10):
    c.insert(i, i)

# Iteration order is no longer guaranteed, even for FIFOCache
for item in c.items():
    print(item)
# (5, 5)
# (9, 9)
# (0, 0)
# ...
Can we set maxsize to zero?
Yes, if you pass zero as maxsize, it means there is no limit on the number of items.
How do I migrate from cachetools to cachebox?
cachebox syntax is very similar to cachetools. Just change these:
# If you pass infinity to a cache implementation, change it to zero.
cachetools.Cache(math.inf) -> cachebox.Cache(0)
# If you use `isinstance` for cachetools classes, change those.
isinstance(cache, cachetools.Cache) -> isinstance(cache, cachebox.BaseCacheImpl)
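For decorated functions, the change is usually just the module and class names; a hedged before/after sketch with an LRU cache (get_user is illustrative, and keyword arguments beyond the cache object may differ between the two libraries):
# cachetools
@cachetools.cached(cachetools.LRUCache(maxsize=128))
def get_user(user_id): ...

# cachebox
@cachebox.cached(cachebox.LRUCache(maxsize=128))
def get_user(user_id): ...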
cachebox is provided under the MIT license. See LICENSE.
TODO List:
- Rewrite all cache algorithms and use the low-level hashmap API
- Change the hashing system
- Improve tests
- Rewrite the stub file (.pyi)
- Rewrite README.md
- Write an API reference
- Add new functions such as cached_property
- Add possible methods to implementations
- Make better type hints for cached and cachedmethod if possible