Using dbm locking one cannot create values with nested creators

Issue #5 resolved

Sometimes it is desirable to cache not only the final result of a computation but the intermediate steps as well. That way you can invalidate some of the intermediate steps along with the final cache; recomputing the final result is then faster because it can still use the remaining cached values.




from dogpile.cache import make_region

region = make_region()
region.configure(
    'dogpile.cache.dbm',
    arguments=dict(filename='file.dbm')
)

def create_small_thing():
    def small_thing():
        return 'small '
    return region.get_or_create('small', creator=small_thing)

def create_big_thing():
    def big_thing():
        return create_small_thing() * 10
    return region.get_or_create('big', creator=big_thing)



The above code freezes: once the lock is acquired for the "big" value, the nested get_or_create() call can no longer store the "small" value.
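The mechanism can be illustrated with plain threading primitives. The sketch below is stdlib-only and is not dogpile's actual lock implementation; it just shows why a single non-re-entrant lock hangs on nested acquisition, while a re-entrant lock completes, mirroring the nested creator pattern above.

```python
import threading

# The dbm backend guards all keys with one non-re-entrant lock, so a
# creator that itself calls get_or_create() tries to take the same lock
# twice and blocks forever.  A re-entrant lock lets the owning thread
# re-acquire it, so the nested pattern completes.
lock = threading.RLock()  # swap in threading.Lock() to reproduce the hang

def create_small_thing():
    with lock:                    # inner (nested) acquisition
        return 'small '

def create_big_thing():
    with lock:                    # outer acquisition
        return create_small_thing() * 10

result = create_big_thing()
print(result)
```

With threading.Lock() in place of RLock(), create_big_thing() never returns, which is the same same-holder re-acquisition problem the dbm file lock runs into here.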

Comments (5)

  1. Michael Bayer repo owner

    still thinking, sorry! the dbm backend's lock is across all keys; other backends' locks don't work that way. so maybe a reentrant proxy chosen by the backend depending on the type of lock.
