Generating cache key components

Issue #30 resolved
Wichert Akkerman
created an issue

Please note that this is intended as a discussion topic, not a bug report.

A common issue I run into is that I have expensive functions which take objects as parameters. The default function key generator will call str() on those to build a cache key, which is not useful for objects that lack a __str__: the default str() from Python will include the memory address, which will always change, making the generated key useless. To improve the situation I can see three different approaches:
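The failure mode can be reproduced without dogpile.cache at all; a minimal sketch with a hypothetical `Widget` class:

```python
class Widget:
    pass

a, b = Widget(), Widget()

# With no __str__ defined, str() falls back to object.__repr__,
# which embeds the instance's memory address:
print(str(a))   # e.g. <__main__.Widget object at 0x7f3a2c1d5e80>

# Two objects representing the same value still produce different
# key components, so cache lookups never hit:
print(str(a) == str(b))  # False
```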

Use a wrapper function to fake a simpler API

This is perhaps a slightly insane approach, but it was taken from production code:

def _expensive_function(self, object_id):
    # The cached function takes only primitive arguments, so the
    # default key generator produces a stable key; the object is
    # re-loaded from its primary key inside the function.
    request = self
    obj = Session().query(klass).get(object_id)
    ...

def expensive_function(request, obj):
    return _expensive_function(request, obj.id)

Add a __str__ everywhere

This is a simple solution but has two downsides: it would require changes in many places, and it may conflict with other users of __str__; you may need str(obj) to produce a string for a template toolkit or for logging, which would not be suitable as a cache key. Perhaps dogpile.cache should have used repr() instead of str() to build the cache key?
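The conflict is easy to see with a hypothetical `Article` class: once __str__ is repurposed as a key component, every other consumer of str() gets the same terse output.

```python
class Article:
    def __init__(self, article_id, title):
        self.id = article_id
        self.title = title

    # A stable, value-based key component -- but this string now also
    # shows up in templates and log messages, where a human-readable
    # rendering (e.g. the title) may have been wanted instead.
    def __str__(self):
        return 'Article:%d' % self.id

a = Article(7, 'first draft')
b = Article(7, 'second draft')
print(str(a) == str(b))  # True: same identity, same key component
```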

An object-to-key-component hook

This is my current approach. I use a special method to build a cache key component for an object. Here is a version suitable for use in applications that use WebOb requests and SQLAlchemy:

def my_converter(param):
    if isinstance(param, BaseObject):
        klass = param.__class__
        keys = (key.key for key in class_mapper(klass).primary_key)
        key = ' '.join('%s=%s' % (key, getattr(param, key, None)) for key in keys)
        return '<%s %s>' % (klass.__name__, key)
    elif isinstance(param, BaseRequest):
        return '<Request app=%s>' % param.application_url
    else:
        return compat.string_type(param)

I can imagine other implementations using zope.component-based adapters to do this as well. The downside to this approach is that you have to make a copy of the standard function_key_generator to use it. It would be nice if there were a way to pass that in. Something like this:

def function_key_generator(namespace, fn, convertor=str):

region = make_region(function_key_generator=
        functools.partial(function_key_generator, convertor=my_converter))
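To make the proposal concrete, here is a self-contained sketch. The key generator below is modeled loosely on dogpile.cache's default one (simplified, not the actual implementation), and `my_converter` is a trimmed-down stand-in for the SQLAlchemy-aware converter above:

```python
import functools

def function_key_generator(namespace, fn, convertor=str):
    """Build cache keys from call arguments, with the object-to-string
    step made pluggable via ``convertor``."""
    if namespace is None:
        namespace = '%s:%s' % (fn.__module__, fn.__name__)
    else:
        namespace = '%s:%s|%s' % (fn.__module__, fn.__name__, namespace)

    def generate_key(*args, **kw):
        if kw:
            raise ValueError("keyword arguments not supported")
        return namespace + '|' + ' '.join(convertor(a) for a in args)
    return generate_key

def my_converter(param):
    # Hypothetical converter: prefer a stable identifying attribute
    # over the default address-bearing str().
    if hasattr(param, 'id'):
        return '<%s id=%s>' % (param.__class__.__name__, param.id)
    return str(param)

# Bind the converter once with partial(), so make_region() sees the
# two-argument (namespace, fn) API it expects:
keygen = functools.partial(function_key_generator, convertor=my_converter)
```

With this, two distinct instances carrying the same id produce the same key, which is exactly what the default str()-based generator fails to do.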

Comments (2)

  1. Michael Bayer repo owner

I like the concept of extra arguments to function_key_generator. I was thinking maybe we could build a "make_function_key_generator()" function, but I see that partial() is a nice way to get there also, if we document it.

the rationale behind function_key_generator is that it makes the task of generating the key from a function call pluggable. So in this case I don't think this is too much to have to be plugged in each time. I was a little skeptical that we just want an across-the-board "converter" like that, which necessarily would be doing a lot of isinstance() checks, though for comparison I looked at Python's json.dumps() and it seems to be doing the same thing with "default".
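The json.dumps() parallel mentioned above: its `default` parameter is a per-object fallback hook that is called for any value the encoder cannot serialize natively, which is structurally the same shape as the proposed converter.

```python
import json
import datetime

def default(obj):
    # Called by json.dumps() only for objects it can't serialize
    # natively -- the same per-object hook shape as the proposed
    # cache-key converter.
    if isinstance(obj, datetime.date):
        return obj.isoformat()
    raise TypeError('not serializable: %r' % obj)

print(json.dumps({'when': datetime.date(2012, 1, 1)}, default=default))
# {"when": "2012-01-01"}
```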
