Mike Bayer committed 6c95329

whitespace removal


Files changed (12)

 replacement to the `Beaker <http://beaker.groovie.org>`_ caching system, the internals
 of which are written by the same author.   All the ideas of Beaker which "work"
 are re-implemented in dogpile.cache in a more efficient and succinct manner,
-and all the cruft (Beaker's internals were first written in 2005) relegated 
+and all the cruft (Beaker's internals were first written in 2005) relegated
 to the trash heap.
 
 Features
 * A standard get/set/delete API as well as a function decorator API is
   provided.
 * The mechanics of key generation are fully customizable.   The function
-  decorator API features a pluggable "key generator" to customize how 
+  decorator API features a pluggable "key generator" to customize how
   cache keys are made to correspond to function calls, and an optional
-  "key mangler" feature provides for pluggable mangling of keys 
+  "key mangler" feature provides for pluggable mangling of keys
   (such as encoding, SHA-1 hashing) as desired for each region.
 * The dogpile lock, first developed as the core engine behind the Beaker
-  caching system, here vastly simplified, improved, and better tested.   
+  caching system, here vastly simplified, improved, and better tested.
   Some key performance
   issues that were intrinsic to Beaker's architecture, particularly that
   values would frequently be "double-fetched" from the cache, have been fixed.
  A lock tailored towards the backend is an optional addition, else dogpile uses
   a regular thread mutex. New backends can be registered with dogpile.cache
   directly or made available via setuptools entry points.
-* Included backends feature three memcached backends (python-memcached, pylibmc, 
+* Included backends feature three memcached backends (python-memcached, pylibmc,
   bmemcached), a Redis backend, a backend based on Python's
-  anydbm, and a plain dictionary backend.  
+  anydbm, and a plain dictionary backend.
 * Space for third party plugins, including the first which provides the
   dogpile.cache engine to Mako templates.
 * Python 3 compatible in place - no 2to3 required.
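
As a concrete illustration of the "key mangler" feature above, a SHA-1 mangler can be as small as the following (dogpile.cache ships its own ``sha1_mangle_key`` helper; this standalone version only sketches the idea):

```python
import hashlib

def sha1_mangle_key(key):
    """Coerce an arbitrary key string into a fixed-length SHA-1 hex digest."""
    return hashlib.sha1(key.encode("utf-8")).hexdigest()

# Every mangled key comes out 40 hex characters long, regardless of
# how long the generated cache key was.
mangled = sha1_mangle_key("user_fn|some|very|long|argument|list")
```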
 --------
 
 dogpile.cache features a single public usage object known as the ``CacheRegion``.
-This object then refers to a particular ``CacheBackend``.   Typical usage 
+This object then refers to a particular ``CacheBackend``.   Typical usage
 generates a region using ``make_region()``, which can then be used at the
 module level to decorate functions, or used directly in code with a traditional
 get/set interface.   Configuration of the backend is applied to the region
-using ``configure()`` or ``configure_from_config()``, allowing deferred 
+using ``configure()`` or ``configure_from_config()``, allowing deferred
 config-file based configuration to occur after modules have been imported::
 
     from dogpile.cache import make_region
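
A typical completion of this pattern looks roughly like the following (the memory backend and the settings shown are placeholders, not a recommendation):

```python
from dogpile.cache import make_region

region = make_region().configure(
    'dogpile.cache.memory',     # placeholder backend
    expiration_time=3600,
)

@region.cache_on_arguments()
def load_user(user_id):
    # creation function: only called when the key is absent or expired
    return {"id": user_id}
```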
 Documentation
 -------------
 
-See dogpile.cache's full documentation at 
+See dogpile.cache's full documentation at
 `dogpile.cache documentation <http://dogpilecache.readthedocs.org>`_.
 
 

docs/build/front.rst

 Community
 =========
 
-dogpile.cache is developed by `Mike Bayer <http://techspot.zzzeek.org>`_, and is 
+dogpile.cache is developed by `Mike Bayer <http://techspot.zzzeek.org>`_, and is
 loosely associated with the `Pylons Project <http://www.pylonsproject.org/>`_.
 As dogpile.cache's usage increases, it is anticipated that the Pylons mailing list and IRC channel
 will become the primary channels for support.
 
 Bugs and feature enhancements to dogpile.cache should be reported on the `Bitbucket
 issue tracker
-<https://bitbucket.org/zzzeek/dogpile.cache/issues?status=new&status=open>`_.   If you're not sure 
+<https://bitbucket.org/zzzeek/dogpile.cache/issues?status=new&status=open>`_.   If you're not sure
 that a particular issue is specific to either dogpile.cache or `dogpile.core <https://bitbucket.org/zzzeek/dogpile.core>`_, posting to the dogpile.cache
 tracker is likely the better place to post first.
 

docs/build/index.rst

 Welcome to dogpile.cache's documentation!
 ==========================================
 
-`dogpile.cache <http://bitbucket.org/zzzeek/dogpile.cache>`_ provides a simple 
+`dogpile.cache <http://bitbucket.org/zzzeek/dogpile.cache>`_ provides a simple
 caching pattern based on the `dogpile.core <http://pypi.python.org/pypi/dogpile.core>`_
 locking system, including rudimentary backends. It effectively completes the
-replacement of `Beaker <http://beaker.groovie.org>`_ as far as caching (though **not** HTTP sessions) 
-is concerned, providing an open-ended, simple, and higher-performing pattern to configure and use 
+replacement of `Beaker <http://beaker.groovie.org>`_ as far as caching (though **not** HTTP sessions)
+is concerned, providing an open-ended, simple, and higher-performing pattern to configure and use
 cache backends. New backends are very easy to create
 and use; users are encouraged to adapt the provided backends for their own
 needs, as high volume caching requires lots of tweaks and adjustments specific

docs/build/usage.rst

 Overview
 ========
 
-At the time of this writing, popular key/value servers include 
+At the time of this writing, popular key/value servers include
 `Memcached <http://memcached.org>`_, `Redis <http://redis.io/>`_, and `Riak <http://wiki.basho.com/>`_.
 While these tools all have different usage focuses, they all have in common that the storage model
 is based on the retrieval of a value based on a key; as such, they are all potentially
 suitable for caching, particularly Memcached which is first and foremost designed for
-caching.   
+caching.
 
 With a caching system in mind, dogpile.cache provides an interface to a particular Python API
-targeted at that system.  
+targeted at that system.
 
 A dogpile.cache configuration consists of the following components:
 
   :meth:`~.CacheBackend.get`, :meth:`~.CacheBackend.set` and :meth:`~.CacheBackend.delete`.
   The actual kind of :class:`.CacheBackend` in use for a particular :class:`.CacheRegion`
   is determined by the underlying Python API being used to talk to the cache, such
-  as Pylibmc.  The :class:`.CacheBackend` is instantiated behind the scenes and 
+  as Pylibmc.  The :class:`.CacheBackend` is instantiated behind the scenes and
   not directly accessed by applications under normal circumstances.
 * Value generation functions.   These are user-defined functions that generate
   new values to be placed in the cache.   While dogpile.cache offers the usual
   "set" approach of placing data into the cache, the usual mode of usage is to only instruct
-  it to "get" a value, passing it a *creation function* which will be used to 
+  it to "get" a value, passing it a *creation function* which will be used to
   generate a new value if and only if one is needed.   This "get-or-create" pattern
-  is the entire key to the "Dogpile" system, which coordinates a single value creation 
+  is the entire key to the "Dogpile" system, which coordinates a single value creation
   operation among many concurrent get operations for a particular key, eliminating
   the issue of an expired value being redundantly re-generated by many workers simultaneously.
 
 
 .. sidebar:: pylibmc
 
-    In this section, we're illustrating Memcached usage 
-    using the `pylibmc <http://pypi.python.org/pypi/pylibmc>`_ backend, which is a high performing 
+    In this section, we're illustrating Memcached usage
+    using the `pylibmc <http://pypi.python.org/pypi/pylibmc>`_ backend, which is a high performing
     Python library for Memcached.  It can be compared to the `python-memcached <http://pypi.python.org/pypi/python-memcached>`_
     client, which is also an excellent product.  Pylibmc is written against Memcached's native API
-    so is markedly faster, though might be considered to have rougher edges.   The API is actually a bit 
+    so is markedly faster, though might be considered to have rougher edges.   The API is actually a bit
     more verbose to allow for correct multithreaded usage.
 
 
 Above, we create a :class:`.CacheRegion` using the :func:`.make_region` function, then
-apply the backend configuration via the :meth:`.CacheRegion.configure` method, which returns the 
+apply the backend configuration via the :meth:`.CacheRegion.configure` method, which returns the
 region.  The name of the backend is the only argument required by :meth:`.CacheRegion.configure`
-itself, in this case ``dogpile.cache.pylibmc``.  However, in this specific case, the ``pylibmc`` 
+itself, in this case ``dogpile.cache.pylibmc``.  However, in this specific case, the ``pylibmc``
 backend also requires that the URL of the memcached server be passed within the ``arguments`` dictionary.
 
 The configuration is separated into two sections.  Upon construction via :func:`.make_region`,
 the region is available immediately; the arguments passed to :meth:`.CacheRegion.configure` are typically loaded from a configuration file and therefore
 not necessarily available until runtime, hence the two-step configuration process.
 
-Key arguments passed to :meth:`.CacheRegion.configure` include *expiration_time*, which is the expiration 
+Key arguments passed to :meth:`.CacheRegion.configure` include *expiration_time*, which is the expiration
 time passed to the Dogpile lock, and *arguments*, which are arguments used directly
 by the backend - in this case we are using arguments that are passed directly
 to the pylibmc module.
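
Putting both argument groups together, such a configuration might read as follows (the server address and timings are placeholders):

```python
from dogpile.cache import make_region

region = make_region().configure(
    'dogpile.cache.pylibmc',
    expiration_time=3600,              # passed to the dogpile lock
    arguments={
        'url': ["127.0.0.1"],          # consumed by the pylibmc backend
    }
)
```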
             self.cache.pop(key)
 
 Then make sure the class is available underneath the entrypoint
-``dogpile.cache``.  If we did this in a ``setup.py`` file, it would be 
+``dogpile.cache``.  If we did this in a ``setup.py`` file, it would be
 in ``setup()`` as::
 
     entry_points="""
       dictionary = mypackage.mybackend:DictionaryBackend
       """
 
-Alternatively, if we want to register the plugin in the same process 
+Alternatively, if we want to register the plugin in the same process
 space without bothering to install anything, we can use ``register_backend``::
 
     from dogpile.cache import register_backend
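
The registration call itself then looks roughly like this (the module and class names are the hypothetical ones from the entrypoint example):

```python
from dogpile.cache import register_backend

# name, module path, class name -- mirrors the entrypoint declaration
register_backend("dictionary", "mypackage.mybackend", "DictionaryBackend")
```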
 
 Where "payload" is the thing being cached, and "metadata" is information
 we store in the cache - a dictionary which currently has just the "creation time"
-and a "version identifier" as key/values.  If the cache backend requires serialization, 
+and a "version identifier" as key/values.  If the cache backend requires serialization,
 pickle or similar can be used on the tuple - the "metadata" portion will always
 be a small and easily serializable Python structure.
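
The tuple-plus-pickle arrangement can be exercised directly; the metadata keys below simply mirror the description above and are not an exact copy of dogpile.cache internals:

```python
import pickle
import time

payload = {"user_id": 5, "name": "some user"}
metadata = {"ct": time.time(), "v": 1}   # creation time + version identifier

cached_value = (payload, metadata)

# a backend that requires serialization can pickle the whole tuple
serialized = pickle.dumps(cached_value)
restored_payload, restored_metadata = pickle.loads(serialized)
```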
 

dogpile/cache/api.py

 
 class NoValue(object):
     """Describe a missing cache value.
-    
+
     The :attr:`.NO_VALUE` module global
     should be used.
-    
+
     """
     @property
     def payload(self):
             return False
 
 NO_VALUE = NoValue()
-"""Value returned from ``get()`` that describes 
+"""Value returned from ``get()`` that describes
 a key not present."""
 
 class CachedValue(tuple):
     """Represent a value stored in the cache.
-    
+
     :class:`.CachedValue` is a two-tuple of
     ``(payload, metadata)``, where ``metadata``
     is dogpile.cache's tracking information (
     currently the creation time).  The metadata
-    and tuple structure is pickleable, if 
+    and tuple structure is pickleable, if
     the backend requires serialization.
-    
+
     """
     payload = property(operator.itemgetter(0))
     """Named accessor for the payload."""
     """Base class for backend implementations."""
 
     key_mangler = None
-    """Key mangling function.  
-    
+    """Key mangling function.
+
     May be None, or otherwise declared
     as an ordinary instance method.
 
 
     def __init__(self, arguments): #pragma NO COVERAGE
         """Construct a new :class:`.CacheBackend`.
-        
+
         Subclasses should override this to
         handle the given arguments.
-        
+
         :param arguments: The ``arguments`` parameter
         passed to :meth:`.CacheRegion.configure`.
-         
+
         """
         raise NotImplementedError()
 
         prefix_len = len(prefix)
         return cls(
                 dict(
-                    (key[prefix_len:], config_dict[key]) 
-                    for key in config_dict 
+                    (key[prefix_len:], config_dict[key])
+                    for key in config_dict
                     if key.startswith(prefix)
                 )
             )
 
         This object need only provide an ``acquire()``
         and ``release()`` method.
-        
+
         May return ``None``, in which case the dogpile
         lock will use a regular ``threading.Lock``
-        object to mutex concurrent threads for 
+        object to mutex concurrent threads for
         value creation.   The default implementation
         returns ``None``.
-        
+
         Different backends may want to provide various
         kinds of "mutex" objects, such as those which
         link to lock files, distributed mutexes,
         memcached semaphores, etc.  Whatever
         kind of system is best suited for the scope
         and behavior of the caching backend.
-        
+
         A mutex that takes the key into account will
         allow multiple regenerate operations across
         keys to proceed simultaneously, while a mutex
         that does not will serialize regenerate operations
         to just one at a time across all keys in the region.
         The latter approach, or a variant that involves
-        a modulus of the given key's hash value, 
+        a modulus of the given key's hash value,
         can be used as a means of throttling the total
         number of value recreation operations that may
         proceed at one time.
-        
+
         """
         return None
 
     def get(self, key): #pragma NO COVERAGE
         """Retrieve a value from the cache.
-        
+
         The returned value should be an instance of
         :class:`.CachedValue`, or ``NO_VALUE`` if
         not present.
-        
+
         """
         raise NotImplementedError()
 
     def set(self, key, value): #pragma NO COVERAGE
         """Set a value in the cache.
-        
+
         The key will be whatever was passed
         to the registry, processed by the
         "key mangling" function, if any.
         The value will always be an instance
         of :class:`.CachedValue`.
-        
+
         """
         raise NotImplementedError()
 
     def delete(self, key): #pragma NO COVERAGE
         """Delete a value from the cache.
-        
+
         The key will be whatever was passed
         to the registry, processed by the
         "key mangling" function, if any.
-        
+
         The behavior here should be idempotent,
         that is, can be called any number of times
         regardless of whether or not the

dogpile/cache/backends/file.py

 
     DBM access is provided using the Python ``anydbm`` module,
     which selects a platform-specific dbm module to use.
-    This may be made to be more configurable in a future 
+    This may be made to be more configurable in a future
     release.
-    
+
     Note that different dbm modules have different behaviors.
-    Some dbm implementations handle their own locking, while 
+    Some dbm implementations handle their own locking, while
     others don't.  The :class:`.DBMBackend` uses a read/write
     lockfile by default, which is compatible even with those
     DBM implementations for which this is unnecessary,
     though the behavior can be disabled.
 
     The DBM backend by default makes use of two lockfiles.
-    One is in order to protect the DBM file itself from 
+    One is in order to protect the DBM file itself from
     concurrent writes, the other is to coordinate
     value creation (i.e. the dogpile lock).  By default,
-    these lockfiles use the ``flock()`` system call 
-    for locking; this is only available on Unix 
+    these lockfiles use the ``flock()`` system call
+    for locking; this is only available on Unix
     platforms.
-    
+
     Currently, the dogpile lock is against the entire
     DBM file, not per key.   This means there can
     only be one "creator" job running at a time
     per dbm file.
-    
-    A future improvement might be to have the dogpile lock 
-    using a filename that's based on a modulus of the key. 
-    Locking on a filename that uniquely corresponds to the 
+
+    A future improvement might be to have the dogpile lock
+    using a filename that's based on a modulus of the key.
+    Locking on a filename that uniquely corresponds to the
     key is problematic, since it's not generally safe to
-    delete lockfiles as the application runs, implying an 
-    unlimited number of key-based files would need to be 
+    delete lockfiles as the application runs, implying an
+    unlimited number of key-based files would need to be
     created and never deleted.
-    
-    Parameters to the ``arguments`` dictionary are 
+
+    Parameters to the ``arguments`` dictionary are
     below.
 
-    :param filename: path of the filename in which to 
+    :param filename: path of the filename in which to
      create the DBM file.  Note that some dbm backends
      will change this name to have additional suffixes.
     :param rw_lockfile: the name of the file to use for
      read/write locking.  If omitted, a default name
-     is used by appending the suffix ".rw.lock" to the 
+     is used by appending the suffix ".rw.lock" to the
      DBM filename.  If False, then no lock is used.
     :param dogpile_lockfile: the name of the file to use
-     for value creation, i.e. the dogpile lock.  If 
-     omitted, a default name is used by appending the 
-     suffix ".dogpile.lock" to the DBM filename. If 
+     for value creation, i.e. the dogpile lock.  If
+     omitted, a default name is used by appending the
+     suffix ".dogpile.lock" to the DBM filename. If
      False, then dogpile.cache uses the default dogpile
      lock, a plain thread-based mutex.
 
         dir_, filename = os.path.split(self.filename)
 
         self._rw_lock = self._init_lock(
-                                arguments.get('rw_lockfile'), 
+                                arguments.get('rw_lockfile'),
                                 ".rw.lock", dir_, filename)
         self._dogpile_lock = self._init_lock(
-                                arguments.get('dogpile_lockfile'), 
-                                ".dogpile.lock", 
+                                arguments.get('dogpile_lockfile'),
+                                ".dogpile.lock",
                                 dir_, filename,
                                 util.KeyReentrantMutex.factory)
 
 
     def get_mutex(self, key):
         # using one dogpile for the whole file.   Other ways
-        # to do this might be using a set of files keyed to a 
+        # to do this might be using a set of files keyed to a
         # hash/modulus of the key.   the issue is it's never
-        # really safe to delete a lockfile as this can 
+        # really safe to delete a lockfile as this can
         # break other processes trying to get at the file
         # at the same time - so handling unlimited keys
         # can't imply unlimited filenames
     @contextmanager
     def _dbm_file(self, write):
         with self._use_rw_lock(write):
-            dbm = self.dbmmodule.open(self.filename, 
+            dbm = self.dbmmodule.open(self.filename,
                                 "w" if write else "r")
             yield dbm
             dbm.close()
 
 class FileLock(object):
     """Use lockfiles to coordinate read/write access to a file.
-    
-    Only works on Unix systems, using 
+
+    Only works on Unix systems, using
     `fcntl.flock() <http://docs.python.org/library/fcntl.html>`_.
-    
+
     """
 
     def __init__(self, filename):
         except IOError:
             os.close(fileno)
             if not wait:
-                # this is typically 
+                # this is typically
                 # "[Errno 35] Resource temporarily unavailable",
                 # because of LOCK_NB
                 return False
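
The non-blocking acquire described in that comment can be demonstrated with the stdlib directly (Unix only; the temp-file path is a throwaway):

```python
import fcntl
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.lock")
fileno = os.open(path, os.O_CREAT | os.O_RDWR)
acquired = False
try:
    # LOCK_NB makes flock() raise immediately instead of blocking
    # when the lock is already held elsewhere.
    fcntl.flock(fileno, fcntl.LOCK_EX | fcntl.LOCK_NB)
    acquired = True
except IOError:
    # typically "[Errno 35] Resource temporarily unavailable"
    pass
finally:
    if acquired:
        fcntl.flock(fileno, fcntl.LOCK_UN)
    os.close(fileno)
```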

dogpile/cache/backends/memcached.py

 
      .. note::
 
-         This parameter is **different** from Dogpile's own 
+         This parameter is **different** from Dogpile's own
          ``expiration_time``, which is the number of seconds after
-         which Dogpile will consider the value to be expired. 
-         When Dogpile considers a value to be expired, 
+         which Dogpile will consider the value to be expired.
+         When Dogpile considers a value to be expired,
          it **continues to use the value** until generation
-         of a new value is complete, when using 
+         of a new value is complete, when using
          :meth:`.CacheRegion.get_or_create`.
          Therefore, if you are setting ``memcached_expire_time``, you'll
-         want to make sure it is greater than ``expiration_time`` 
+         want to make sure it is greater than ``expiration_time``
          by at least enough seconds for new values to be generated,
-         else the value won't be available during a regeneration, 
-         forcing all threads to wait for a regeneration each time 
+         else the value won't be available during a regeneration,
+         forcing all threads to wait for a regeneration each time
          a value expires.
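
Concretely, the guideline above amounts to leaving regeneration headroom between the two settings; the numbers below are placeholders:

```python
from dogpile.cache import make_region

region = make_region().configure(
    'dogpile.cache.pylibmc',
    expiration_time=600,                    # dogpile treats values as stale after 10 minutes
    arguments={
        'url': ["127.0.0.1"],
        'memcached_expire_time': 690,       # memcached drops them 90 seconds later,
                                            # leaving time for regeneration
    }
)
```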
 
    The :class:`.GenericMemcachedBackend` uses a ``threading.local()`` to maintain a distinct client per thread,
     as most modern memcached clients do not appear to be inherently
     threadsafe.
 
-    In particular, ``threading.local()`` has the advantage over pylibmc's 
-    built-in thread pool in that it automatically discards objects 
+    In particular, ``threading.local()`` has the advantage over pylibmc's
+    built-in thread pool in that it automatically discards objects
     associated with a particular thread when that thread ends.
 
     """
         # using a plain threading.local here.   threading.local
         # automatically deletes the __dict__ when a thread ends,
         # so the idea is that this is superior to pylibmc's
-        # own ThreadMappedPool which doesn't handle this 
+        # own ThreadMappedPool which doesn't handle this
         # automatically.
         self.url = util.to_list(arguments['url'])
         self.distributed_lock = arguments.get('distributed_lock', False)
         self.client.delete(key)
 
 class MemcacheArgs(object):
-    """Mixin which provides support for the 'time' argument to set(), 
+    """Mixin which provides support for the 'time' argument to set(),
    and the 'min_compress_len' argument to other methods.
-    
+
     """
     def __init__(self, arguments):
         self.min_compress_len = arguments.get('min_compress_len', 0)
         super(MemcacheArgs, self).__init__(arguments)
 
 class PylibmcBackend(MemcacheArgs, GenericMemcachedBackend):
-    """A backend for the 
-    `pylibmc <http://sendapatch.se/projects/pylibmc/index.html>`_ 
+    """A backend for the
+    `pylibmc <http://sendapatch.se/projects/pylibmc/index.html>`_
     memcached client.
 
     A configuration illustrating several of the optional
             }
         )
 
-    Arguments accepted here include those of 
-    :class:`.GenericMemcachedBackend`, as well as 
+    Arguments accepted here include those of
+    :class:`.GenericMemcachedBackend`, as well as
     those below.
 
     :param binary: sets the ``binary`` flag understood by
      ``pylibmc.Client``.
     :param behaviors: a dictionary which will be passed to
      ``pylibmc.Client`` as the ``behaviors`` parameter.
-    :param min_compres_len: Integer, will be passed as the 
+    :param min_compress_len: Integer, will be passed as the
      ``min_compress_len`` parameter to the ``pylibmc.Client.set``
      method.
 
 class MemcachedBackend(MemcacheArgs, GenericMemcachedBackend):
     """A backend using the standard `Python-memcached <http://www.tummy.com/Community/software/python-memcached/>`_
     library.
-    
+
     Example::
-    
+
         from dogpile.cache import make_region
 
         region = make_region().configure(
         return memcache.Client(self.url)
 
 class BMemcachedBackend(GenericMemcachedBackend):
-    """A backend for the 
-    `python-binary-memcached <https://github.com/jaysonsantos/python-binary-memcached>`_ 
+    """A backend for the
+    `python-binary-memcached <https://github.com/jaysonsantos/python-binary-memcached>`_
     memcached client.
 
     This is a pure Python memcached client which
             }
         )
 
-    Arguments which can be passed to the ``arguments`` 
+    Arguments which can be passed to the ``arguments``
     dictionary include:
 
-    :param username: optional username, will be used for 
+    :param username: optional username, will be used for
      SASL authentication.
     :param password: optional password, will be used for
      SASL authentication.
         import bmemcached
 
         class RepairBMemcachedAPI(bmemcached.Client):
-            """Repairs BMemcached's non-standard method 
-            signatures, which was fixed in BMemcached 
-            ef206ed4473fec3b639e.   
+            """Repairs BMemcached's non-standard method
+            signatures, which were fixed in BMemcached
+            ef206ed4473fec3b639e.
 
             """
 

dogpile/cache/backends/memory.py

     There is no size management, and values which
     are placed into the dictionary will remain
     until explicitly removed.   Note that
-    Dogpile's expiration of items is based on 
-    timestamps and does not remove them from 
+    Dogpile's expiration of items is based on
+    timestamps and does not remove them from
     the cache.
 
     E.g.::
-    
+
         from dogpile.cache import make_region
 
         region = make_region().configure(
             'dogpile.cache.memory'
         )
-        
-    
+
+
     To use a Python dictionary of your choosing,
     it can be passed in with the ``cache_dict``
     argument::
-    
+
         my_dictionary = {}
         region = make_region().configure(
             'dogpile.cache.memory',
                 "cache_dict":my_dictionary
             }
         )
-    
-    
+
+
     """
     def __init__(self, arguments):
         self._cache = arguments.pop("cache_dict", {})

dogpile/cache/backends/redis.py

 __all__ = 'RedisBackend', 'RedisLock'
 
 class RedisBackend(CacheBackend):
-    """A `Redis <http://redis.io/>`_ backend, using the 
+    """A `Redis <http://redis.io/>`_ backend, using the
     `redis-py <http://pypi.python.org/pypi/redis/>`_ backend.
 
     Example configuration::

dogpile/cache/plugins/mako_cache.py

 Mako Integration
 ----------------
 
-dogpile.cache includes a `Mako <http://www.makotemplates.org>`_ plugin that replaces `Beaker <http://beaker.groovie.org>`_ 
+dogpile.cache includes a `Mako <http://www.makotemplates.org>`_ plugin that replaces `Beaker <http://beaker.groovie.org>`_
 as the cache backend.
 Setup a Mako template lookup using the "dogpile.cache" cache implementation
 and a region dictionary::
 
     my_regions = {
         "local":make_region().configure(
-                    "dogpile.cache.dbm", 
+                    "dogpile.cache.dbm",
                     expiration_time=360,
                     arguments={"filename":"file.dbm"}
                 ),
         "memcached":make_region().configure(
-                    "dogpile.cache.pylibmc", 
+                    "dogpile.cache.pylibmc",
                     expiration_time=3600,
                     arguments={"url":["127.0.0.1"]}
                 )
 
     def put(self, key, value, **kw):
         self._get_region(**kw).put(key, value)
- 
+
     def get(self, key, **kw):
         return self._get_region(**kw).get(key)
- 
+
     def invalidate(self, key, **kw):
         self._get_region(**kw).delete(key)

dogpile/cache/region.py

 
 class CacheRegion(object):
     """A front end to a particular cache backend.
-    
+
     :param name: Optional, a string name for the region.
      This isn't used internally
     but can be accessed via the ``.name`` attribute, helpful
      for configuring a region from a config file.
-    :param function_key_generator:  Optional.  A 
-     function that will produce a "cache key" given 
+    :param function_key_generator:  Optional.  A
+     function that will produce a "cache key" given
      a data creation function and arguments, when using
      the :meth:`.CacheRegion.cache_on_arguments` method.
      The structure of this function
-     should be two levels: given the data creation function, 
-     return a new function that generates the key based on 
+     should be two levels: given the data creation function,
+     return a new function that generates the key based on
      the given arguments.  Such as::
 
         def my_key_generator(namespace, fn):
                 "filename":"file.dbm"
             }
         )
-     
-     The ``namespace`` is that passed to 
+
+     The ``namespace`` is that passed to
      :meth:`.CacheRegion.cache_on_arguments`.  It's not consulted
      outside this function, so in fact can be of any form.
-     For example, it can be passed as a tuple, used to specify 
+     For example, it can be passed as a tuple, used to specify
      arguments to pluck from \**kw::
-     
+
         def my_key_generator(namespace, fn):
             def generate_key(*arg, **kw):
                 return ":".join(
-                        [kw[k] for k in namespace] + 
+                        [kw[k] for k in namespace] +
                         [str(x) for x in arg]
                     )
-        
+
      Where the decorator might be used as::
-     
+
         @my_region.cache_on_arguments(namespace=('x', 'y'))
         def my_function(a, b, **kw):
             return my_data()
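
Filled out as a standalone sketch (calling the generator directly rather than through the decorator machinery, which dogpile.cache supplies):

```python
def my_key_generator(namespace, fn):
    def generate_key(*arg, **kw):
        # namespace is a tuple naming which **kw values participate
        return ":".join(
            [kw[k] for k in namespace] +
            [str(x) for x in arg]
        )
    return generate_key

# simulate what the decorator does with namespace=('x', 'y')
keygen = my_key_generator(('x', 'y'), None)
key = keygen(1, 2, x="foo", y="bar")
```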
      keys before passing to the backend.  Defaults to ``None``,
      in which case the key mangling function recommended by
      the cache backend will be used.    A typical mangler
-     is the SHA1 mangler found at :func:`.sha1_mangle_key` 
+     is the SHA1 mangler found at :func:`.sha1_mangle_key`
      which coerces keys into a SHA1
      hash, so that the string length is fixed.  To
      disable all key mangling, set to ``False``.
-    
+
     """
 
     def __init__(self,
-            name=None, 
+            name=None,
             function_key_generator=function_key_generator,
             key_mangler=None
 
             _config_prefix=None
         ):
         """Configure a :class:`.CacheRegion`.
-        
-        The :class:`.CacheRegion` itself 
+
+        The :class:`.CacheRegion` itself
         is returned.
-        
-        :param backend:   Required.  This is the name of the 
-         :class:`.CacheBackend` to use, and is resolved by loading 
+
+        :param backend:   Required.  This is the name of the
+         :class:`.CacheBackend` to use, and is resolved by loading
          the class from the ``dogpile.cache`` entrypoint.
 
-        :param expiration_time:   Optional.  The expiration time passed 
+        :param expiration_time:   Optional.  The expiration time passed
          to the dogpile system.  The :meth:`.CacheRegion.get_or_create`
-         method as well as the :meth:`.CacheRegion.cache_on_arguments` 
-         decorator (though note:  **not** the :meth:`.CacheRegion.get` 
+         method as well as the :meth:`.CacheRegion.cache_on_arguments`
+         decorator (though note:  **not** the :meth:`.CacheRegion.get`
          method) will call upon the value creation function after this
          time period has passed since the last generation.
-         
-         .. note:: 
-         
-            The expiration_time is only stored *locally*, 
-            within the :class:`.CacheRegion` instance, and not 
-            in the cache itself.  Only the *creation time* of a 
-            particular value is stored along with it in the cache. 
+
+         .. note::
+
+            The expiration_time is only stored *locally*,
+            within the :class:`.CacheRegion` instance, and not
+            in the cache itself.  Only the *creation time* of a
+            particular value is stored along with it in the cache.
             This means an individual :class:`.CacheRegion` can
             be modified to have a different expiration time, which
             will have an immediate effect on data in the cache without
             the need to set new values.
-            
 
-        :param arguments:   Optional.  The structure here is passed 
-         directly to the constructor of the :class:`.CacheBackend` 
+
+        :param arguments:   Optional.  The structure here is passed
+         directly to the constructor of the :class:`.CacheBackend`
          in use, though is typically a dictionary.
-         
+
         """
         if "backend" in self.__dict__:
             raise Exception(
                     "This region is already "
-                    "configured with backend: %s" 
+                    "configured with backend: %s"
                     % self.backend)
         backend_cls = _backend_loader.load(backend)
         if _config_argument_dict:
         if expiration_time is None:
             expiration_time = self.expiration_time
         return Dogpile(
-                expiration_time, 
+                expiration_time,
                 lock=self.backend.get_mutex(identifier)
             )
 
     def invalidate(self):
         """Invalidate this :class:`.CacheRegion`.
-        
+
         Invalidation works by setting a current timestamp
         (using ``time.time()``)
-        representing the "minimum creation time" for 
+        representing the "minimum creation time" for
         a value.  Any retrieved value whose creation
-        time is prior to this timestamp 
+        time is prior to this timestamp
         is considered to be stale.  It does not
         affect the data in the cache in any way, and is also
         local to this instance of :class:`.CacheRegion`.
-        
+
         Once set, the invalidation time is honored by
-        the :meth:`.CacheRegion.get_or_create` and 
+        the :meth:`.CacheRegion.get_or_create` and
         :meth:`.CacheRegion.get` methods.
-        
+
         .. versionadded:: 0.3.0
-        
+
         """
         self._invalidated = time.time()
 
     def configure_from_config(self, config_dict, prefix):
-        """Configure from a configuration dictionary 
+        """Configure from a configuration dictionary
         and a prefix.
-        
+
         Example::
-        
+
             local_region = make_region()
             memcached_region = make_region()
 
                 "cache.memcached.arguments.url":"127.0.0.1, 10.0.0.1",
             }
             local_region.configure_from_config(myconfig, "cache.local.")
-            memcached_region.configure_from_config(myconfig, 
+            memcached_region.configure_from_config(myconfig,
                                                 "cache.memcached.")
 
         """
         time supplied by the ``expiration_time`` argument,
         is tested against the creation time of the retrieved
         value versus the current time (as reported by ``time.time()``).
-        If stale, the cached value is ignored and the ``NO_VALUE`` 
+        If stale, the cached value is ignored and the ``NO_VALUE``
         token is returned.  Passing the flag ``ignore_expiration=True``
         bypasses the expiration time check.
-        
+
         .. versionchanged:: 0.3.0
            :meth:`.CacheRegion.get` now checks the value's creation time
            against the expiration time, rather than returning
            the value unconditionally.
-        
+
         The method also interprets the cached value in terms
-        of the current "invalidation" time as set by 
+        of the current "invalidation" time as set by
         the :meth:`.invalidate` method.   If a value is present,
-        but its creation time is older than the current 
+        but its creation time is older than the current
         invalidation time, the ``NO_VALUE`` token is returned.
         Passing the flag ``ignore_expiration=True`` bypasses
         the invalidation time check.
-        
-        .. versionadded:: 0.3.0 
+
+        .. versionadded:: 0.3.0
            Support for the :meth:`.CacheRegion.invalidate`
            method.
-        
+
         :param key: Key to be retrieved. While it's typical for a key to be a
          string, it is ultimately passed directly down to the cache backend,
          before being optionally processed by the key_mangler function, so can
          be of any type recognized by the backend or by the key_mangler
          function, if present.
 
-        :param expiration_time: Optional expiration time value 
+        :param expiration_time: Optional expiration time value
          which will supersede that configured on the :class:`.CacheRegion`
          itself.
-         
+
          .. versionadded:: 0.3.0
-         
+
         :param ignore_expiration: if ``True``, the value is returned
          from the cache if present, regardless of configured
          expiration times or whether or not :meth:`.invalidate`
          was called.
-         
+
          .. versionadded:: 0.3.0
 
         """
         return value.payload
 
     def get_or_create(self, key, creator, expiration_time=None):
-        """Return a cached value based on the given key.  
-        
+        """Return a cached value based on the given key.
+
         If the value does not exist or is considered to be expired
-        based on its creation time, the given 
+        based on its creation time, the given
         creation function may or may not be used to recreate the value
         and persist the newly generated value in the cache.
-        
+
         Whether or not the function is used depends on if the
         *dogpile lock* can be acquired or not.  If it can't, it means
         a different thread or process is already running a creation
         function for this key against the cache.  When the dogpile
         lock cannot be acquired, the method will block if no
         previous value is available, until the lock is released and
-        a new value available.  If a previous value 
+        a new value is available.  If a previous value
         is available, that value is returned immediately without blocking.
-        
-        If the :meth:`.invalidate` method has been called, and 
+
+        If the :meth:`.invalidate` method has been called, and
         the retrieved value's timestamp is older than the invalidation
         timestamp, the value is unconditionally prevented from
-        being returned.  The method will attempt to acquire the dogpile 
+        being returned.  The method will attempt to acquire the dogpile
         lock to generate a new value, or will wait
         until the lock is released to return the new value.
 
         :param expiration_time: optional expiration time which will override
          the expiration time already configured on this :class:`.CacheRegion`
          if not None.   To set no expiration, use the value -1.
-         
+
          .. note::
-         
+
             the expiration_time argument here is **not guaranteed** to be
             effective if multiple concurrent threads are accessing the same
             key via :meth:`get_or_create` using different values
             for ``expiration_time`` - the first thread within a cluster
             of concurrent usages establishes the expiration time within a
             :class:`.Dogpile` instance for the duration of those usages.
-            It is advised that all access to a particular key within a particular 
+            It is advised that all access to a particular key within a particular
             :class:`.CacheRegion` use the **same** value for ``expiration_time``.
-            Sticking with the default expiration time configured for 
+            Sticking with the default expiration time configured for
             the :class:`.CacheRegion` as a whole is expected to be the
             usual mode of operation.
 
         See also:
-        
+
         :meth:`.CacheRegion.cache_on_arguments` - applies :meth:`.get_or_create`
         to any function using a decorator.
-         
+
         """
         if self.key_mangler:
             key = self.key_mangler(key)
             value = self.backend.get(key)
             if value is NO_VALUE or \
                 value.metadata['v'] != value_version or \
-                    (self._invalidated and 
+                    (self._invalidated and
                     value.metadata["ct"] < self._invalidated):
                 raise NeedRegenerationException()
             return value.payload, value.metadata["ct"]
             return value.payload, value.metadata["ct"]
 
         dogpile = self.dogpile_registry.get(key, expiration_time)
-        with dogpile.acquire(gen_value, 
+        with dogpile.acquire(gen_value,
                     value_and_created_fn=get_value) as value:
             return value
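
The control flow documented for ``get_or_create`` above — return an unexpired value if present, otherwise let one caller regenerate under a lock while concurrent callers either wait or receive the previous value — can be sketched with only the standard library. This is an illustrative stand-in, not dogpile.cache's actual implementation: the real dogpile lock is per-key and backend-pluggable, while the ``TinyRegion`` name, a single ``threading.Lock``, and the dict store here are all assumptions for brevity.

```python
import threading
import time

NO_VALUE = object()  # sentinel for "no cached value", analogous to the NO_VALUE token


class TinyRegion:
    """Sketch of get_or_create's flow: serve an unexpired value, else let
    exactly one caller run the creation function under a lock."""

    def __init__(self, expiration_time):
        self.expiration_time = expiration_time
        self._store = {}              # key -> (payload, creation_time)
        self._lock = threading.Lock()

    def get_or_create(self, key, creator):
        value = self._store.get(key, NO_VALUE)
        if value is not NO_VALUE and time.time() - value[1] < self.expiration_time:
            return value[0]           # fresh value: no lock, no blocking
        with self._lock:
            # double-check: another thread may have regenerated while we waited
            value = self._store.get(key, NO_VALUE)
            if value is not NO_VALUE and time.time() - value[1] < self.expiration_time:
                return value[0]
            payload = creator()       # only one thread reaches the creation function
            self._store[key] = (payload, time.time())
            return payload
```

A second call with the same key returns the cached payload without invoking the creation function again, until ``expiration_time`` elapses.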
 
     def _value(self, value):
         """Return a :class:`.CachedValue` given a value."""
         return CachedValue(value, {
-                            "ct":time.time(), 
+                            "ct":time.time(),
                             "v":value_version
                         })
 
     def delete(self, key):
         """Remove a value from the cache.
 
-        This operation is idempotent (can be called multiple times, or on a 
+        This operation is idempotent (can be called multiple times, or on a
         non-existent key, safely).
         """
 
         self.backend.delete(key)
 
     def cache_on_arguments(self, namespace=None, expiration_time=None):
-        """A function decorator that will cache the return 
-        value of the function using a key derived from the 
+        """A function decorator that will cache the return
+        value of the function using a key derived from the
         function itself and its arguments.
-        
+
         The decorator internally makes use of the
         :meth:`.CacheRegion.get_or_create` method to access the
-        cache and conditionally call the function.  See that 
+        cache and conditionally call the function.  See that
         method for additional behavioral details.
-        
+
         E.g.::
-        
+
             @someregion.cache_on_arguments()
             def generate_something(x, y):
                 return somedatabase.query(x, y)
-                
+
         The decorated function can then be called normally, where
         data will be pulled from the cache region unless a new
         value is needed::
-        
+
             result = generate_something(5, 6)
-        
+
         The function is also given an attribute ``invalidate``, which
         provides for invalidation of the value.  Pass to ``invalidate()``
         the same arguments you'd pass to the function itself to represent
         a particular value::
-        
+
             generate_something.invalidate(5, 6)
 
         The default key generation will use the name
         of the function, the module name for the function,
         the arguments passed, as well as an optional "namespace"
         parameter in order to generate a cache key.
-        
+
         Given a function ``one`` inside the module
         ``myapp.tools``::
-        
+
             @region.cache_on_arguments(namespace="foo")
             def one(a, b):
                 return a + b
 
         Above, calling ``one(3, 4)`` will produce a
         cache key as follows::
-        
+
             myapp.tools:one|foo|3, 4
-        
+
         The key generator will ignore an initial argument
         of ``self`` or ``cls``, making the decorator suitable
         (with caveats) for use with instance or class methods.
         Given the example::
-        
+
             class MyClass(object):
                 @region.cache_on_arguments(namespace="foo")
                 def one(self, a, b):
                     return a + b
 
-        The cache key above for ``MyClass().one(3, 4)`` will 
+        The cache key above for ``MyClass().one(3, 4)`` will
         again produce the same cache key of ``myapp.tools:one|foo|3, 4`` -
         the name ``self`` is skipped.
-        
+
         The ``namespace`` parameter is optional, and is used
         normally to disambiguate two functions of the same
         name within the same module, as can occur when decorating
         instance or class methods as below::
-            
+
             class MyClass(object):
                 @region.cache_on_arguments(namespace='MC')
                 def somemethod(self, x, y):
                 @region.cache_on_arguments(namespace='MOC')
                 def somemethod(self, x, y):
                     ""
-                    
+
         Above, the ``namespace`` parameter disambiguates
         between ``somemethod`` on ``MyClass`` and ``MyOtherClass``.
         Python class declaration mechanics otherwise prevent
         The function key generation can be entirely replaced
         on a per-region basis using the ``function_key_generator``
         argument present on :func:`.make_region` and
-        :class:`.CacheRegion`. If defaults to 
+        :class:`.CacheRegion`. It defaults to
         :func:`.function_key_generator`.
 
         :param namespace: optional string argument which will be
          established as part of the cache key.   This may be needed
          to disambiguate functions of the same name within the same
          source file, such as those
-         associated with classes - note that the decorator itself 
+         associated with classes - note that the decorator itself
          can't see the parent class on a function as the class is
          being declared.
         :param expiration_time: if not None, will override the normal
 
 def make_region(*arg, **kw):
     """Instantiate a new :class:`.CacheRegion`.
-    
+
     Currently, :func:`.make_region` is a passthrough
     to :class:`.CacheRegion`.  See that class for
     constructor arguments.
-    
+
     """
     return CacheRegion(*arg, **kw)
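
The decorator behavior documented for ``cache_on_arguments`` above — a key derived from module, function name, an optional namespace, and the arguments, plus an ``invalidate()`` attribute on the decorated function — can be sketched with a plain dictionary standing in for a backend. Everything below (the ``region`` dict, the ``make_key`` helper) is illustrative only, not dogpile.cache's internals, and there is no dogpile lock here:

```python
import time


def cache_on_arguments(region, namespace=None, expiration_time=None):
    """Toy version of the documented decorator: dict-backed, no dogpile lock."""
    def decorator(fn):
        def make_key(*args):
            # module:function|namespace|args -- mirrors the documented key format,
            # e.g. myapp.tools:one|foo|3, 4
            return "%s:%s|%s|%s" % (
                fn.__module__, fn.__name__, namespace or "",
                ", ".join(str(a) for a in args))

        def wrapper(*args):
            key = make_key(*args)
            entry = region.get(key)
            if entry is not None:
                value, created = entry
                if expiration_time is None or time.time() - created < expiration_time:
                    return value      # cache hit, still fresh
            value = fn(*args)         # cache miss or expired: call the function
            region[key] = (value, time.time())
            return value

        def invalidate(*args):
            # same arguments as the function itself, per the docstring above
            region.pop(make_key(*args), None)

        wrapper.invalidate = invalidate
        wrapper.make_key = make_key
        return wrapper
    return decorator


region = {}  # plain dictionary "backend"


@cache_on_arguments(region, namespace="foo")
def one(a, b):
    return a + b
```

After the first ``one(3, 4)`` call the result is served from ``region``; ``one.invalidate(3, 4)`` removes that single entry so the next call recomputes.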
 

dogpile/cache/util.py

             # unit test entrypoint stuff, let me know.
             import pkg_resources
             for impl in pkg_resources.iter_entry_points(
-                                self.group, 
+                                self.group,
                                 name):
                 self.impls[name] = impl.load
                 return impl.load()
             else:
                 raise Exception(
-                        "Can't load plugin %s %s" % 
+                        "Can't load plugin %s %s" %
                         (self.group, name))
 
     def register(self, name, modulepath, objname):
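
The ``util.py`` snippet above shows the two halves of the plugin loader: backends registered directly by module path and object name, and backends discovered via setuptools entry points. A minimal sketch of the direct-registration half, using ``importlib`` instead of ``pkg_resources`` and omitting the entry-point scan entirely (the class shape here is an approximation, not the library's exact code):

```python
import importlib


class PluginLoader:
    """Sketch of the loader pattern: named backends registered as
    "module path + object name", imported lazily on load()."""

    def __init__(self, group):
        self.group = group
        self.impls = {}  # name -> zero-argument callable returning the object

    def register(self, name, modulepath, objname):
        def load():
            mod = importlib.import_module(modulepath)
            return getattr(mod, objname)
        self.impls[name] = load

    def load(self, name):
        if name in self.impls:
            return self.impls[name]()
        raise Exception(
            "Can't load plugin %s %s" %
            (self.group, name))


loader = PluginLoader("dogpile.cache")
# register a stand-in "backend" by module path and object name
loader.register("ordered_dict", "collections", "OrderedDict")
```

Lazy loading matters here: the backend module (and its third-party dependency, e.g. pylibmc or redis) is only imported when ``load()`` is first called for that name.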