
Simple redis cache for Python functions
pip install python-redis-cache
from redis import StrictRedis
from redis_cache import RedisCache
client = StrictRedis(host="redis", decode_responses=True)
cache = RedisCache(redis_client=client)
@cache.cache()
def my_func(arg1, arg2):
    return some_expensive_operation()
# Use the function
my_func(1, 2)
# Call it again with the same arguments and it will use cache
my_func(1, 2)
# Invalidate a single value
my_func.invalidate(1, 2)
# Invalidate all values for function
my_func.invalidate_all()
# Create the redis cache
cache = RedisCache(redis_client, prefix="rc", serializer=dumps, deserializer=loads, key_serializer=None, support_cluster=True, exception_handler=None)
# Cache decorator to go on functions, see above
cache.cache(ttl=..., limit=..., namespace=...) -> Callable[[Callable], Callable]
# Get multiple values from the cache
cache.mget([{"fn": my_func, "args": [1,2], "kwargs": {}}, ...]) -> List[Any]
# Cached function API
# Returns a cached value, if it exists in cache. Saves value in cache if it doesn't exist
cached_func(*args, **kwargs)
# Invalidates a single value
cached_func.invalidate(*args, **kwargs)
# Invalidates all values for cached function
cached_func.invalidate_all()
namespace
- Defaults to f'{function.__module__}.{function.__qualname__}'
exception_handler
- exception_handler(exception: Exception, function: Callable, args: List, kwargs: Dict) -> Any
- If using this handler, reraise the exception inside the handler to stop execution of the function. All return results will be used, even if None.
- If no handler is defined, the exception is raised and the original function is not called.
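For example, a minimal sketch of a handler that logs cache failures and reraises (the logger setup and the handler name are illustrative, not part of the library):
import logging

from redis import StrictRedis
from redis_cache import RedisCache

logger = logging.getLogger(__name__)

def log_and_reraise(exception, function, args, kwargs):
    # Log the cache failure, then reraise to stop execution of the cached function,
    # as described above. Per the notes above, any value returned here instead
    # would be used as the result, even None.
    logger.error("Cache error in %s: %s", function.__qualname__, exception)
    raise exception

client = StrictRedis(host="redis", decode_responses=True)
cache = RedisCache(redis_client=client, exception_handler=log_and_reraise)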
support_cluster=False
- Disables the { prefix on the keys. This is NOT recommended. See below for more info.
Arguments and return types must be JSON serializable by default. You can override the serializer, but be careful with Pickle: make sure you understand the security risks, and never use Pickle with untrusted values.
https://security.stackexchange.com/questions/183966/safely-load-a-pickle-file
The decode_responses parameter must be False in the redis client if you use pickle.
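If you do switch to the Pickle serializer for trusted values only, a minimal configuration sketch (assuming pickle's dumps/loads as the serializer pair) looks like this:
from pickle import dumps, loads
from redis import StrictRedis
from redis_cache import RedisCache

# decode_responses must be False because pickled values are bytes
client = StrictRedis(host="redis", decode_responses=False)
cache = RedisCache(redis_client=client, serializer=dumps, deserializer=loads)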
The key names by default are as follows:
from base64 import b64encode
key = f"{{rc:{fn.__module__}.{fn.__qualname__}}}:{b64encode(function_args).decode('utf-8')}"
The cache key names start with {
, which can be confusing, but is required for redis clusters to place the keys
in the correct slots.
NOTE: It is NOT recommended to use any of the options below. The key name generation by default handles all use cases.
prefix
- The string to prefix the redis keys with
cache = RedisCache(redis_client, prefix="custom_prefix")
# Changes keys to the following
key = f"{{custom_prefix:{fn.__module__}.{fn.__qualname__}}}:{b64encode(function_args).decode('utf-8')}"
namespace
- The name of the cache function
cache = RedisCache(redis_client)
@cache.cache(namespace="custom_func_name")
def my_func(arg1, arg2):
    pass
# Changes keys to the following
key = f"{{rc:custom_func_name}}:{b64encode(function_args).decode('utf-8')}"
key_serializer or serializer
- The way function arguments are serialized
def custom_key_serializer(fn_args):
    ## Do something with fn_args and return a string. For instance
    return my_custom_serializer(fn_args)
cache = RedisCache(redis_client, key_serializer=custom_key_serializer)
# Changes keys to the following
key = f"{{rc:{fn.__module__}.{fn.__qualname__}}}:{b64encode(custom_serialized_args).decode('utf-8')}"
support_cluster=False
- This will disable the { prefix on the keys. This option is NOT recommended because the library will then no longer work with redis clusters. Teams often start without cluster mode and later migrate to a cluster, and this option makes that migration require a lot of work. If you know for sure you will never use a redis cluster, you can enable this option; if you are unsure, don't. There is no benefit.
cache = RedisCache(redis_client, support_cluster=False)
# Changes keys to the following
key = f"rc:{fn.__module__}.{fn.__qualname__}:{b64encode(custom_serialized_args).decode('utf-8')}"
Caching instance/class methods may require a little refactoring. This is because the self
/cls
cannot be
serialized to JSON without custom serializers. The best way to handle caching class methods is to make a
more specific static method to cache (or global function). For instance:
Don't do this:
class MyClass:
    @cache.cache()
    def my_func(self, arg1, arg2):
        return self.some_arg + arg1 + arg2
Do this instead:
class MyClass:
    def my_func(self, arg1, arg2):
        return self.my_cached_method(self.some_arg, arg1, arg2)

    @staticmethod
    @cache.cache()
    def my_cached_method(some_arg, arg1, arg2):
        return some_arg + arg1 + arg2
If you aren't using self
/cls
in the method, you can use the @staticmethod
decorator to make it a static method.
If you must use self
/cls
in your cached method and can't use the options suggested above, you will need to create
a custom JSON key serializer for the self
/cls
object or you can use the Pickle serializer (which isn't recommended).
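If you do go the Pickle route for a method, a rough sketch (trusted values only; the class and names here are illustrative, and the client needs decode_responses=False as noted earlier):
from pickle import dumps, loads
from redis import StrictRedis
from redis_cache import RedisCache

# decode_responses must be False when using pickle (pickled values are bytes)
pickle_client = StrictRedis(host="redis", decode_responses=False)
pickle_cache = RedisCache(redis_client=pickle_client, serializer=dumps, deserializer=loads)

class MyClass:
    def __init__(self, some_arg):
        self.some_arg = some_arg

    @pickle_cache.cache()
    def my_func(self, arg1, arg2):
        # self is part of the function arguments serialized for the cache key,
        # so instances must be picklable
        return self.some_arg + arg1 + arg2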