Cachier
#######

|PyPI-Status| |Downloads| |PyPI-Versions| |Build-Status| |Codecov| |Codefactor| |LICENCE|

Persistent, stale-free, local and cross-machine caching for Python functions.

.. code-block:: python

   from cachier import cachier
   import datetime

   @cachier(stale_after=datetime.timedelta(days=3))
   def foo(arg1, arg2):
       """foo now has a persistent cache, triggering recalculation for values stored more than 3 days."""
       return {'arg1': arg1, 'arg2': arg2}

.. role:: python(code)
   :language: python

.. contents::

.. section-numbering::

Installation
============

Install cachier with:

.. code-block:: bash

   pip install cachier

For the latest version supporting Python 2.7 please use:

.. code-block:: bash

   pip install 'cachier==1.2.8'

Features
========

Current features
----------------

  • Pure Python.

  • Compatible with Python 3.10+ (Python 2.7 was discontinued in version 1.2.8).

  • Supported and tested on `Linux, OS X and Windows <https://travis-ci.org/shaypal5/cachier>`_.

  • A simple interface.

  • Defining "shelf life" for cached values.

  • Support for both sync and async functions.

  • Multiple caching backends (cores):

    • Local caching using pickle files.
    • In-memory caching using the memory core.
    • Cross-machine caching using MongoDB.
    • SQL-based caching using SQLAlchemy-supported databases.
    • Redis-based caching for high-performance scenarios.
    • S3-based caching for cross-machine object storage backends.
  • Thread-safety.

  • Per-call max age: Specify a maximum age for cached values per call.

Cachier is NOT:

  • Meant as a transient cache. Python's ``functools.lru_cache`` is better for that.
  • Especially fast. It is meant to replace function calls that take more than... a second, say (overhead is around 1 millisecond).

Future features
---------------

  • Multi-core caching.
  • `Cache replacement policies <https://en.wikipedia.org/wiki/Cache_replacement_policies>`_

Use
===

Cachier provides a decorator which you can wrap around your functions to give them a persistent cache. The positional and keyword arguments to the wrapped function must be hashable (i.e. Python's immutable built-in objects, not mutable containers). Also, notice that since objects which are instances of user-defined classes are hashable but all compare unequal (their hash value is their id), equal objects across different sessions will not yield identical keys.
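As a plain-Python illustration (independent of cachier itself) of which values satisfy this hashability requirement:

```python
# Immutable built-ins are hashable and therefore valid cachier arguments;
# mutable containers like lists and dicts are not.
hashable_args = (1, "text", 3.5, (1, 2), frozenset({1, 2}))
for arg in hashable_args:
    hash(arg)  # succeeds for every immutable built-in

try:
    hash([1, 2])  # lists are mutable, hence unhashable
    list_is_hashable = True
except TypeError:
    list_is_hashable = False
```

For unhashable arguments, see "Working with unhashable arguments" below.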

Setting up a Cache
------------------

You can add a default, pickle-based, persistent cache to your function - meaning it will last across different Python kernels calling the wrapped function - by decorating it with the cachier decorator (notice the ()!).

.. code-block:: python

   from cachier import cachier

   @cachier()
   def foo(arg1, arg2):
       """Your function now has a persistent cache mapped by argument values!"""
       return {'arg1': arg1, 'arg2': arg2}

Class and object methods can also be cached. Cachier will automatically ignore the ``self`` parameter when determining the cache key for an object method. This means that methods will be cached across all instances of an object, which may not be what you want. Because this is a common source of bugs, ``@cachier`` raises a ``TypeError`` by default when applied to an instance method (a function whose first parameter is named ``self``). This error is raised when ``@cachier`` is applied (at class definition time), not when the method is called. To opt in to cross-instance cache sharing, pass ``allow_non_static_methods=True``.

.. code-block:: python

   from cachier import cachier

   class Foo():
       @staticmethod
       @cachier()
       def good_static_usage(arg_1, arg_2):
           return arg_1 + arg_2

       # Instance method does not depend on object's internal state, so good to cache
       @cachier(allow_non_static_methods=True)
       def good_usage_1(self, arg_1, arg_2):
           return arg_1 + arg_2

       # Instance method is calling external service, probably okay to cache
       @cachier(allow_non_static_methods=True)
       def good_usage_2(self, arg_1, arg_2):
           result = self.call_api(arg_1, arg_2)
           return result

       # Instance method relies on object attribute, NOT good to cache
       # @cachier() would raise TypeError here -- this is intentional
       @cachier()
       def bad_usage(self, arg_1, arg_2):
           return arg_1 + arg_2 + self.arg_3

Resetting a Cache
-----------------

The Cachier wrapper adds a ``clear_cache()`` function to each wrapped function. To reset the cache of the wrapped function, simply call this method:

.. code-block:: python

   foo.clear_cache()

General Configuration
=====================

Global Defaults
---------------

Settings can be globally configured across all Cachier wrappers through the use of the ``set_default_params`` function. This function takes the same keyword parameters as the ones defined in the decorator, which can be passed all at once or with multiple calls. Parameters given directly to a decorator take precedence over any values set by this function.

The following parameters will only be applied to decorators defined after ``set_default_params`` is called:

* ``hash_func``
* ``backend``
* ``mongetter``
* ``cache_dir``
* ``pickle_reload``
* ``separate_files``
* ``entry_size_limit``
* ``allow_non_static_methods``

These parameters can be changed at any time, and they will apply to all decorators:

* ``allow_none``
* ``caching_enabled``
* ``stale_after``
* ``next_time``
* ``wait_for_calc_timeout``
* ``cleanup_stale``
* ``cleanup_interval``

The current defaults can be fetched by calling ``get_default_params``.

Threads Limit
~~~~~~~~~~~~~

To limit the number of threads Cachier is allowed to spawn, set the ``CACHIER_MAX_WORKERS`` environment variable to the desired number. The default is 8, so to enable Cachier to spawn even more threads, you'll have to set a higher limit explicitly.
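For example, in a POSIX shell the limit can be raised before launching the Python process:

```shell
# Allow cachier to spawn up to 16 worker threads (default is 8).
export CACHIER_MAX_WORKERS=16
echo "$CACHIER_MAX_WORKERS"
```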


Global Enable/Disable
---------------------

Caching can be turned off across all decorators by calling ``disable_caching``, and then re-activated by calling ``enable_caching``.

These functions are convenience wrappers around the ``caching_enabled`` default setting.


Cache Shelf Life
----------------

Setting Shelf Life
~~~~~~~~~~~~~~~~~~

You can set any duration as the shelf life of cached return values of a function by providing a corresponding ``timedelta`` object to the ``stale_after`` parameter:

.. code-block:: python

   import datetime

   @cachier(stale_after=datetime.timedelta(weeks=2))
   def bar(arg1, arg2):
       return {'arg1': arg1, 'arg2': arg2}

Now when a cached value matching the given arguments is found, the time of its calculation is checked; if more than ``stale_after`` time has passed since then, the function will be run again for the same arguments and the new value will be cached and returned.

This is useful for lengthy calculations that depend on a dynamic data source.

Fuzzy Shelf Life
~~~~~~~~~~~~~~~~

Sometimes you may want your function to trigger a calculation when it encounters a stale result, but still not wait on it if it's not that critical. In that case, you can set ``next_time`` to ``True`` to have your function trigger a recalculation **in a separate thread**, but return the currently cached stale value:

.. code-block:: python

  @cachier(next_time=True)

Further function calls made while the calculation is being performed will not trigger redundant calculations.
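The behaviour can be pictured with a small stand-alone sketch (plain Python, not cachier's actual implementation): the stale value is returned immediately, while a background thread refreshes the cache.

```python
import threading

# Toy cache holding one stale entry; _recalculate stands in for the
# (possibly slow) wrapped function.
cache = {"value": "stale-result", "fresh": False}

def _recalculate():
    cache["value"] = "fresh-result"
    cache["fresh"] = True

def get_value():
    current = cache["value"]  # capture before any refresh starts
    if not cache["fresh"]:
        # next_time=True semantics: trigger recalculation in a separate
        # thread, but do not wait for it.
        worker = threading.Thread(target=_recalculate)
        worker.start()
        cache["worker"] = worker
    return current

first = get_value()      # returns the stale value without blocking
cache["worker"].join()   # (only for this demo: wait for the refresh)
second = get_value()     # the refreshed value is now served
```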

Automatic Cleanup of Stale Values
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Setting ``cleanup_stale=True`` on a decorator will spawn a background thread that periodically removes stale cache entries. The interval between cleanup runs is controlled by ``cleanup_interval`` and defaults to one day.

.. code-block:: python

   from datetime import timedelta

   @cachier(stale_after=timedelta(seconds=30), cleanup_stale=True)
   def compute():
       ...

Working with unhashable arguments
---------------------------------

As mentioned above, the positional and keyword arguments to the wrapped function must be hashable (i.e. Python's immutable built-in objects, not mutable containers). To get around this limitation, the ``hash_func`` parameter of the cachier decorator can be provided with a callable that gets the args and kwargs of the decorated function and returns a hash key for them.

.. code-block:: python

   def calculate_hash(args, kwds):
       key = ...  # compute a hash key here based on arguments
       return key

   @cachier(hash_func=calculate_hash)
   def calculate_super_complex_stuff(custom_obj):
       # amazing code goes here
       ...

See here for an example:

`Question: How to work with unhashable arguments <https://github.com/python-cachier/cachier/issues/91>`_
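For instance, one hypothetical ``hash_func`` (an illustration, not part of cachier's API) could serialize the arguments with ``pickle`` and digest the bytes, so dicts and lists can still form a stable cache key:

```python
import hashlib
import pickle

def calculate_hash(args, kwds):
    # Sort keyword arguments so their order does not change the key.
    payload = (args, sorted(kwds.items()))
    return hashlib.sha256(pickle.dumps(payload)).hexdigest()

# Identical (even unhashable) arguments produce identical keys...
key_a = calculate_hash(({"a": 1},), {"xs": [1, 2]})
key_b = calculate_hash(({"a": 1},), {"xs": [1, 2]})
# ...while different arguments produce a different key.
key_c = calculate_hash(({"a": 2},), {"xs": [1, 2]})
```

Note that pickle-based keys are only stable within compatible Python versions; any deterministic serialization works here.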

Precaching values
-----------------

If you want to load a value into the cache without calling the underlying function, this can be done with the ``precache_value`` function.

.. code-block:: python

   @cachier()
   def add(arg1, arg2):
       return arg1 + arg2

   add.precache_value(2, 2, value_to_cache=5)

   result = add(2, 2)
   print(result)  # prints 5

Per-function call arguments
---------------------------

Cachier also accepts several keyword arguments in calls of the wrapped function, rather than in the decorator call, allowing you to modify its behaviour for a specific call.

Max Age (max_age)
~~~~~~~~~~~~~~~~~

You can specify a maximum allowed age for a cached value on a per-call basis using the ``max_age`` keyword argument. If the cached value is older than this threshold, a recalculation is triggered. This works in addition to the ``stale_after`` parameter set at the decorator level; the strictest (smallest) threshold is enforced.

.. code-block:: python

  from datetime import timedelta
  from cachier import cachier

  @cachier(stale_after=timedelta(days=3))
  def add(a, b):
      return a + b

  # Use a per-call max age:
  result = add(1, 2, max_age=timedelta(seconds=10))  # Only use cache if value is <10s old

**How it works:**

- The effective max age threshold is the minimum of ``stale_after`` (from the decorator) and ``max_age`` (from the call).
- If the cached value is older than this effective threshold, the function is recalculated and the cache is updated.
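The rule can be sketched in plain Python (assumed semantics based on the description above, not cachier's internal code):

```python
from datetime import datetime, timedelta

def is_stale(entry_time, now, stale_after, max_age=None):
    # The stricter (smaller) of the two thresholds wins.
    threshold = stale_after if max_age is None else min(stale_after, max_age)
    return now - entry_time > threshold

now = datetime(2024, 1, 1, 12, 0, 0)
entry_time = now - timedelta(seconds=30)  # cached 30 seconds ago

# With only the decorator-level stale_after of 3 days, the entry is fresh.
fresh_without_max_age = not is_stale(entry_time, now, timedelta(days=3))
# A per-call max_age of 10 seconds overrides it and forces recalculation.
stale_with_max_age = is_stale(entry_time, now, timedelta(days=3),
                              max_age=timedelta(seconds=10))
```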
