Commits

Mike Bayer committed b9fffed

- deprecate Dogpile, and rework all the docs
- fix SyncReaderDogpile

  • Parent commits f79e120


Files changed (6)

File docs/build/api.rst

 Dogpile
 =======
 
-.. autoclass:: Dogpile
-    :members:
-
-.. autoclass:: SyncReaderDogpile
+.. autoclass:: Lock
     :members:
 
 .. autoclass:: NeedRegenerationException
     :members:
 
-NameRegistry
-=============
-
-.. autoclass:: NameRegistry
-    :members:
-
 Utilities
 ==========
 
 .. autoclass:: ReadWriteMutex
     :members:
 
+.. autoclass:: NameRegistry
+    :members:
 
+Legacy API
+===========
+
+.. autoclass:: Dogpile
+    :members:
+
+.. autoclass:: SyncReaderDogpile
+    :members:
+
+

File docs/build/usage.rst

 Usage Guide
 ===========
 
-At its core, dogpile.core provides a locking interface around a "value creation" function.
+dogpile.core provides a locking interface around a "value creation" and
+"value retrieval" pair of functions.
 
-The interface supports several levels of usage, starting from
-one that is very rudimentary, then providing more intricate 
-usage patterns to deal with certain scenarios.  The documentation here will attempt to 
-provide examples that use successively more and more of these features, as 
-we approach how a fully featured caching system might be constructed around
-dogpile.core.
+The primary interface is the :class:`.Lock` object, which allows the
+creation function to be invoked by only one thread and/or process at
+a time, deferring all other threads/processes to the "value retrieval"
+function until the creating thread has completed.
 
 Do I Need to Learn the dogpile.core API Directly?
 =================================================
It's anticipated that most users of dogpile.core will be using it indirectly
via the `dogpile.cache <http://bitbucket.org/zzzeek/dogpile.cache>`_ caching
front-end.  If you fall into this category, then the short answer is no.
 
-dogpile.core provides core internals to the 
+dogpile.core provides core internals to the
 `dogpile.cache <http://bitbucket.org/zzzeek/dogpile.cache>`_
 package, which provides a simple-to-use caching API, rudimental
-backends for Memcached and others, and easy hooks to add new backends.  
+backends for Memcached and others, and easy hooks to add new backends.
 Users of dogpile.cache
don't need to know or access dogpile.core's APIs directly, though a rough
understanding of the general idea is always helpful.
 
-Using the core dogpile.core APIs described here directly implies you're building your own 
-resource-usage system outside, or in addition to, the one 
+Using the core dogpile.core APIs described here directly implies you're building your own
+resource-usage system outside, or in addition to, the one
 `dogpile.cache <http://bitbucket.org/zzzeek/dogpile.cache>`_ provides.
 
 Rudimentary Usage
 ==================
 
-A simple example::
+The primary API dogpile provides is the :class:`.Lock` object.   This object
+is constructed with functions that provide mutexing, value creation, and
+value retrieval.
 
-    from dogpile.core import Dogpile
+.. versionchanged:: 0.4.0
+    The :class:`.Dogpile` class is no longer the primary API of dogpile,
+    replaced by the more straightforward :class:`.Lock` object.
 
-    # store a reference to a "resource", some 
-    # object that is expensive to create.
-    the_resource = [None]
+An example usage is as follows::
 
-    def some_creation_function():
-        # create the resource here
-        the_resource[0] = create_some_resource()
+  from dogpile.core import Lock, NeedRegenerationException
+  import threading
+  import time
 
-    def use_the_resource():
+  # store a reference to a "resource", some
+  # object that is expensive to create.
+  the_resource = [None]
+
+  def some_creation_function():
+      # call a value creation function
+      value = create_some_resource()
+
+      # get creationtime using time.time()
+      creationtime = time.time()
+
+      # keep track of the value and creation time in the "cache"
+      the_resource[0] = tup = (value, creationtime)
+
+      # return the tuple of (value, creationtime)
+      return tup
+
+  def retrieve_resource():
+      # function that retrieves the resource and
+      # creation time.
+
+      # if no resource, then raise NeedRegenerationException
+      if the_resource[0] is None:
+          raise NeedRegenerationException()
+
+      # else return the tuple of (value, creationtime)
+      return the_resource[0]
+
+  mutex = threading.Lock()
+
+  with Lock(mutex, some_creation_function, retrieve_resource, 3600) as value:
         # some function that uses
         # the resource.  Won't reach
         # here until some_creation_function()
         # has completed at least once.
-        the_resource[0].do_something()
-
-    # create Dogpile with 3600 second
-    # expiry time
-    dogpile = Dogpile(3600)
-
-    with dogpile.acquire(some_creation_function):
-        use_the_resource()
+        value.do_something()
 
 Above, ``some_creation_function()`` will be called
-when :meth:`.Dogpile.acquire` is first called.  The 
-remainder of the ``with`` block then proceeds.   Concurrent threads which 
-call :meth:`.Dogpile.acquire` during this initial period
+when :class:`.Lock` is first invoked as a context manager.   The value returned by this
+function is then passed into the ``with`` block, where it can be used
+by application code.  Concurrent threads which
+call :class:`.Lock` during this initial period
 will be blocked until ``some_creation_function()`` completes.
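
The blocking behavior during this initial period can be demonstrated with
just the mutex itself.  The following is a self-contained sketch (not
dogpile.core internals) showing how a plain ``threading.Lock`` defers a
second thread until the creating thread releases it:

```python
import threading
import time

mutex = threading.Lock()
creator_has_lock = threading.Event()
events = []

def creator_thread():
    with mutex:
        creator_has_lock.set()
        events.append("create-start")
        time.sleep(0.05)          # simulate a slow creation function
        events.append("create-done")

def waiter_thread():
    creator_has_lock.wait()       # don't contend until the creator holds the mutex
    with mutex:                   # blocks here until creation completes
        events.append("waiter-ran")

t1 = threading.Thread(target=creator_thread)
t2 = threading.Thread(target=waiter_thread)
t1.start()
t2.start()
t1.join()
t2.join()
```

The ``events`` list always ends up as ``["create-start", "create-done",
"waiter-ran"]``, since the waiter cannot acquire the mutex until the
creator is finished.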
 
 Once the creation function has completed successfully the first time,
-new calls to :meth:`.Dogpile.acquire` will call ``some_creation_function()`` 
-each time the "expiretime" has been reached, allowing only a single
-thread to call the function.  Concurrent threads
-which call :meth:`.Dogpile.acquire` during this period will
-fall through, and not be blocked.  It is expected that
-the "stale" version of the resource remain available at this
-time while the new one is generated.
+new calls to :class:`.Lock` will call ``retrieve_resource()``
+in order to get the current cached value as well as its creation
+time; if the creation time is older than the current time minus
+an expiration time of 3600, then ``some_creation_function()``
+will be called again, but only by one thread/process, using the given
+mutex object as a source of synchronization.  Concurrent threads/processes
+which call :class:`.Lock` during this period will fall through,
+and not be blocked; instead, the "stale" value just returned by
+``retrieve_resource()`` will continue to be returned until the creation
+function has finished.
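
The expiration comparison described above can be sketched as a small helper
(``is_expired`` is a hypothetical name used here for illustration; it is not
part of the dogpile.core API):

```python
import time

def is_expired(createdtime, expiretime, now=None):
    # a value is "expired" when its age exceeds expiretime seconds;
    # expiretime=None means the value never expires
    if now is None:
        now = time.time()
    return expiretime is not None and now - createdtime > expiretime

# created 4000 seconds ago, 3600 second window: expired
assert is_expired(createdtime=1000.0, expiretime=3600, now=5000.0)
# created 100 seconds ago: still fresh
assert not is_expired(createdtime=4900.0, expiretime=3600, now=5000.0)
```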
 
-By default, :class:`.Dogpile` uses Python's ``threading.Lock()`` 
-to synchronize among threads within a process.  This can 
-be altered to support any kind of locking as we'll see in a 
-later section.
-
-Using a Value Function with a Cache Backend
-=============================================
-
-The dogpile lock includes a more intricate mode of usage to optimize the
-usage of a cache like Memcached.   The difficulties :class:`.Dogpile` addresses
-in this mode are:
+The :class:`.Lock` API is designed to work with simple cache backends
+like Memcached.   It addresses such issues as:
 
 * Values can disappear from the cache at any time, before our expiration
-  time is reached. :class:`.Dogpile` needs to be made aware of this and possibly 
-  call the creation function ahead of schedule.
-* There's no function in a Memcached-like system to "check" for a key without 
-  actually retrieving it.  If we need to "check" for a key each time, 
-  we'd like to use that value instead of calling it twice.
-* If we did end up generating the value on this get, we should return 
-  that value instead of doing a cache round-trip.
-
-To use this mode, the steps are as follows:
-
-* Create the :class:`.Dogpile` lock with ``init=True``, to skip the initial
-  "force" of the creation function.   This is assuming you'd like to
-  rely upon the "check the value" function for the initial generation.
-  Leave it at False if you'd like the application to regenerate the
-  value unconditionally when the :class:`.Dogpile` lock is first created
-  (i.e. typically application startup).
-* The "creation" function should return the value it creates.
-* An additional "getter" function is passed to ``acquire()`` which
-  should return the value to be passed to the context block.  If
-  the value isn't available, raise ``NeedRegenerationException``.
-
-Example::
-
-    from dogpile.core import Dogpile, NeedRegenerationException
-
-    def get_value_from_cache():
-        value = my_cache.get("some key")
-        if value is None:
-            raise NeedRegenerationException()
-        return value
-
-    def create_and_cache_value():
-        value = my_expensive_resource.create_value()
-        my_cache.put("some key", value)
-        return value
-
-    dogpile = Dogpile(3600, init=True)
-
-    with dogpile.acquire(create_and_cache_value, get_value_from_cache) as value:
-        return value
-
-Note that ``get_value_from_cache()`` should not raise :class:`.NeedRegenerationException`
-a second time directly after ``create_and_cache_value()`` has been called.
+  time is reached.  The :class:`.NeedRegenerationException` class is used
+  to alert the :class:`.Lock` object that a value needs regeneration ahead
+  of the usual expiration time.
+* There's no function in a Memcached-like system to "check" for a key without
+  actually retrieving it.  The usage of the ``retrieve_resource()`` function
+  allows that we check for an existing key and also return the existing value,
+  if any, at the same time, without the need for two separate round trips.
+* The "creation" function used by :class:`.Lock` is expected to store the
+  newly created value in the cache, as well as to return it.   This is also
+  more efficient than using two separate round trips to separately store,
+  and re-retrieve, the object.
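
The creation/retrieval contract in the points above can be illustrated with a
plain dictionary standing in for the cache.  ``NeedRegenerationException`` is
redefined locally so the sketch is self-contained; in real code it is imported
from ``dogpile.core``:

```python
import time

class NeedRegenerationException(Exception):
    """Local stand-in for dogpile.core's NeedRegenerationException."""

cache = {}   # a dict standing in for a Memcached client

def creator():
    # generate the value, store it together with its creation time,
    # and return the (value, createdtime) tuple - one round trip total
    value = "generated value"
    cache["some_key"] = tup = (value, time.time())
    return tup

def retriever():
    # check for the key and fetch its value in a single operation;
    # a missing key signals that regeneration is needed
    try:
        return cache["some_key"]
    except KeyError:
        raise NeedRegenerationException()
```

If "some_key" disappears from the cache, ``retriever()`` raises the exception
and a single thread is elected to run ``creator()`` again, ahead of the usual
expiration time.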
 
 .. _caching_decorator:
 
The following example will
illustrate how to approximate Beaker's "cache decoration"
 function, to decorate any function and store the value in
 Memcached.  We create a Python decorator function called ``cached()`` which
-will provide caching for the output of a single function.  It's given 
+will provide caching for the output of a single function.  It's given
 the "key" which we'd like to use in Memcached, and internally it makes
-usage of its own :class:`.Dogpile` object that is dedicated to managing
-this one function/key::
+usage of :class:`.Lock`, along with a thread based mutex (we'll see a distributed mutex
+in the next section)::
 
     import pylibmc
+    import threading
+    from dogpile.core import Lock, NeedRegenerationException
+    import time
+
     mc_pool = pylibmc.ThreadMappedPool(pylibmc.Client("localhost"))
-
-    from dogpile.core import Dogpile, NeedRegenerationException
+    mutex = threading.Lock()
 
     def cached(key, expiration_time):
         """A decorator that will cache the return value of a function
 
         def get_value():
              with mc_pool.reserve() as mc:
-                value = mc.get(key)
-                if value is None:
-                    raise NeedRegenerationException()
-                return value
-
-        dogpile = Dogpile(expiration_time, init=True)
-
-        def decorate(fn):
-            def gen_cached():
-                value = fn()
-                with mc_pool.reserve() as mc:
-                    mc.put(key, value)
-                return value
-
-            def invoke():
-                with dogpile.acquire(gen_cached, get_value) as value:
-                    return value
-            return invoke
-
-        return decorate
-
-Above we can decorate any function as::
-
-    @cached("some key", 3600)
-    def generate_my_expensive_value():
-        return slow_database.lookup("stuff")
-
-The :class:`.Dogpile` lock will ensure that only one thread at a time performs ``slow_database.lookup()``,
-and only every 3600 seconds, unless Memcached has removed the value in which case it will
-be called again as needed.
-
-In particular, dogpile.core's system allows us to call the memcached get() function at most
-once per access, instead of Beaker's system which calls it twice, and doesn't make us call
-get() when we just created the value.
-
-.. _scaling_on_keys:
-
-Scaling dogpile.core against Many Keys
-=======================================
-
-The patterns so far have illustrated how to use a single, persistently held
-:class:`.Dogpile` object which maintains a thread-based lock for the lifespan
-of some particular value.  The :class:`.Dogpile` also is responsible for
-maintaining the last known "creation time" of the value; this is available
-from a given :class:`.Dogpile` object from the :attr:`.Dogpile.createdtime`
-attribute.
-
-For an application that may deal with an arbitrary
-number of cache keys retrieved from a remote service, this approach must be 
-revised so that we don't need to store a :class:`.Dogpile` object for every
-possible key in our application's memory.
-
-The two challenges here are:
-
-* We need to create new :class:`.Dogpile` objects as needed, ideally
-  sharing the object for a given key with all concurrent threads,
-  but then not hold onto it afterwards.
-* Since we aren't holding the :class:`.Dogpile` persistently, we 
-  need to store the last known "creation time" of the value somewhere
-  else, i.e. in the cache itself, and ensure :class:`.Dogpile` uses 
-  it.
-
-The approach is another one derived from Beaker, where we will use a *registry*
-that can provide a unique :class:`.Dogpile` object given a particular key,
-ensuring that all concurrent threads use the same object, but then releasing
-the object to the Python garbage collector when this usage is complete.
-The :class:`.NameRegistry` object provides this functionality, again
-constructed around the notion of a creation function that is only invoked
-as needed.   We also will instruct the :meth:`.Dogpile.acquire` method
-to use a "creation time" value that we retrieve from the cache, via
-the ``value_and_created_fn`` parameter, which supercedes the
-``value_fn`` we used earlier.  ``value_and_created_fn`` expects a function that will return a tuple
-of ``(value, created_at)``, where it's assumed both have been retrieved from
-the cache backend::
-
-    import pylibmc
-    import time
-    from dogpile.core import Dogpile, NeedRegenerationException, NameRegistry
-
-    mc_pool = pylibmc.ThreadMappedPool(pylibmc.Client("localhost"))
-
-    def create_dogpile(key, expiration_time):
-        return Dogpile(expiration_time)
-
-    dogpile_registry = NameRegistry(create_dogpile)
-
-    def get_or_create(key, expiration_time, creation_function):
-        def get_value():
-             with mc_pool.reserve() as mc:
                 value_plus_time = mc.get(key)
                 if value_plus_time is None:
                     raise NeedRegenerationException()
                 # (value, createdtime)
                 return value_plus_time
 
-        def gen_cached():
-            value = creation_function()
-            with mc_pool.reserve() as mc:
-                # create a tuple
-                # (value, createdtime)
-                value_plus_time = (value, time.time())
-                mc.put(key, value_plus_time)
-            return value_plus_time
+        def decorate(fn):
+            def gen_cached():
+                value = fn()
+                with mc_pool.reserve() as mc:
+                    # create a tuple
+                    # (value, createdtime)
+                    value_plus_time = (value, time.time())
+                    mc.put(key, value_plus_time)
+                return value_plus_time
 
-        dogpile = dogpile_registry.get(key, expiration_time)
+            def invoke():
+                with Lock(mutex, gen_cached, get_value, expiration_time) as value:
+                    return value
+            return invoke
 
-        with dogpile.acquire(gen_cached, value_and_created_fn=get_value) as value:
-            return value
+        return decorate
 
+Using the above, we can decorate any function as::
 
-Stepping through the above code:
+    @cached("some key", 3600)
+    def generate_my_expensive_value():
+        return slow_database.lookup("stuff")
 
-* After the imports, we set up the memcached backend using the ``pylibmc`` library's
-  recommended pattern for thread-safe access.
-* We create a Python function that will, given a cache key and an expiration time,
-  produce a :class:`.Dogpile` object which will produce the dogpile mutex on an
-  as-needed basis.   The function here doesn't actually need the key, even though
-  the :class:`.NameRegistry` will be passing it in.  Later, we'll see the scenario
-  for which we'll need this value.
-* We construct a :class:`.NameRegistry`, using our dogpile creator function, that
-  will generate for us new :class:`.Dogpile` locks for individual keys as needed.
-* We define the ``get_or_create()`` function.  This function will accept the cache
-  key, an expiration time value, and a function that is used to create a new value 
-  if one does not exist or the current value is expired.
-* The ``get_or_create()`` function defines two callables, ``get_value()`` and 
-  ``gen_cached()``.   These two functions are exactly analogous to the the
-  functions of the same name in :ref:`caching_decorator` - ``get_value()``
-  retrieves the value from the cache, raising :class:`.NeedRegenerationException`
-  if not present; ``gen_cached()`` calls the creation function to generate a new 
-  value, stores it in the cache, and returns it.  The only difference here is that
-  instead of storing and retrieving the value alone from the cache, the value is 
-  stored along with its creation time; when we make a new value, we set this
-  to ``time.time()``.  While the value and creation time pair are stored here 
-  as a tuple, it doesn't actually matter how the two are persisted; 
-  only that the tuple value is returned from both functions.
-* We acquire a new or existing :class:`.Dogpile` object from the registry using
-  :meth:`.NameRegistry.get`.   We pass the identifying key as well as the expiration
-  time.   A new :class:`.Dogpile` is created for the given key if one does not 
-  exist.  If a :class:`.Dogpile` lock already exists in memory for the given key,
-  we get that one back.
-* We then call :meth:`.Dogpile.acquire` as we did in the previous cache examples,
-  except we use the ``value_and_created_fn`` keyword for our ``get_value()`` 
-  function.  :class:`.Dogpile` uses the "created time" value we pull from our 
-  cache to determine when the value was last created.
+The :class:`.Lock` object will ensure that only one thread at a time performs ``slow_database.lookup()``,
+and only every 3600 seconds, unless Memcached has removed the value, in which case it will
+be called again as needed.
 
-An example usage of the completed function::
+In particular, dogpile.core's system allows us to call the memcached get() function at most
+once per access, instead of Beaker's system which calls it twice, and doesn't make us call
+get() when we just created the value.
 
-    import urllib2
-
-    def get_some_value(key):
-        """retrieve a datafile from a slow site based on the given key."""
-        def get_data():
-            return urllib2.urlopen(
-                        "http://someslowsite.com/some_important_datafile_%s.json" % key
-                    ).read()
-        return get_or_create(key, 3600, get_data)
-
-    my_data = get_some_value("somekey")
 
 Using a File or Distributed Lock with Dogpile
 ==============================================
 
+The examples thus far use a ``threading.Lock()`` object for synchronization.
+If our application uses multiple processes, we will want to coordinate creation
+operations not just among threads, but across processes, via a mutex that
+other processes can access.
 
-The final twist on the caching pattern is to fix the issue of the Dogpile mutex
-itself being local to the current process.   When a handful of threads all go 
-to access some key in our cache, they will access the same :class:`.Dogpile` object
-which internally can synchronize their activity using a Python ``threading.Lock``.
-But in this example we're talking to a Memcached cache.  What if we have many 
-servers which all access this cache?  We'd like all of these servers to coordinate
-together so that we don't just prevent the dogpile problem within a single process,
-we prevent it across all servers.
-
-To accomplish this, we need an object that can coordinate processes.   In this example
+In this example
 we'll use a file-based lock as provided by the `lockfile <http://pypi.python.org/pypi/lockfile>`_
 package, which uses a unix-symlink concept to provide a filesystem-level lock (which also
 has been made threadsafe).  Another strategy may base itself directly off the Unix ``os.flock()``
-call, and still another approach is to lock within Memcached itself, using a recipe 
+call, or use an NFS-safe file lock like `flufl.lock <http://pypi.python.org/pypi/flufl.lock>`_,
+and still another approach is to lock against a cache server, using a recipe
 such as that described at `Using Memcached as a Distributed Locking Service <http://www.regexprn.com/2010/05/using-memcached-as-distributed-locking.html>`_.
-The type of lock chosen here is based on a tradeoff between global availability
-and reliable performance.  The file-based lock will perform more reliably than the
-memcached lock, but may be difficult to make accessible to multiple servers (with NFS 
-being the most likely option, which would eliminate the possibility of the ``os.flock()``
-call).  The memcached lock on the other hand will provide the perfect scope, being available
-from the same memcached server that the cached value itself comes from; however the lock may
-vanish in some cases, which means we still could get a cache-regeneration pileup in that case.
 
 What all of these locking schemes have in common is that unlike the Python ``threading.Lock``
 object, they all need access to an actual key which acts as the symbol that all processes
-will coordinate upon.   This is where the ``key`` argument to our ``create_dogpile()``
-function introduced in :ref:`scaling_on_keys` comes in.   The example can remain
-the same, except for the changes below to just that function::
will coordinate upon.   So here, we will also need to create the "mutex" we
pass to :class:`.Lock`, deriving its name from the ``key`` argument::
 
     import lockfile
     import os
     from hashlib import sha1
 
     # ... other imports and setup from the previous example
 
-    def create_dogpile(key, expiration_time):
+    def cached(key, expiration_time):
+        """A decorator that will cache the return value of a function
+        in memcached given a key."""
+
         lock_path = os.path.join("/tmp", "%s.lock" % sha1(key).hexdigest())
-        return Dogpile(
-                    expiration_time,
-                    lock=lockfile.FileLock(path)
-                    )
 
-    # ... everything else from the previous example
+        # ... get_value() from the previous example goes here
 
-Where above,the only change is the ``lock`` argument passed to the constructor of
-:class:`.Dogpile`.   For a given key "some_key", we generate a hex digest of it
+        def decorate(fn):
+            # ... gen_cached() from the previous example goes here
+
+            # create an ad-hoc FileLock
+            mutex = lockfile.FileLock(lock_path)
+
+            def invoke():
+                with Lock(mutex, gen_cached, get_value, expiration_time) as value:
+                    return value
+            return invoke
+
+        return decorate
+
+Above, we create the ``mutex`` argument each time using a new ``lockfile.FileLock()``
+object.   For a given key "some_key", we generate a hex digest of it
first as a quick way to remove any filesystem-unfriendly characters; we then use
-``lockfile.FileLock()`` to create a lock against the file 
-``/tmp/53def077a4264bd3183d4eb21b1f56f883e1b572.lock``.   Any number of :class:`.Dogpile`
-objects in various processes will now coordinate with each other, using this common 
+``lockfile.FileLock()`` to create a lock against the file
+``/tmp/53def077a4264bd3183d4eb21b1f56f883e1b572.lock``.   Any number of :class:`.Lock`
+objects in various processes will now coordinate with each other, using this common
 filename as the "baton" against which creation of a new value proceeds.
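
The digest-based path derivation can be sketched as follows (Python 3 syntax,
where the key must be encoded to bytes before hashing; ``lock_path_for_key``
is a hypothetical helper name):

```python
import os
from hashlib import sha1

def lock_path_for_key(key):
    # hash the key so that arbitrary strings become filesystem-safe names
    digest = sha1(key.encode("utf-8")).hexdigest()
    return os.path.join("/tmp", "%s.lock" % digest)

# every key maps to /tmp/<40 hex characters>.lock
path = lock_path_for_key("some_key")
```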
 
-Locking the "write" phase against the "readers"
-================================================
-
-A less prominent feature of Dogpile ported from Beaker is the
-ability to provide a mutex against the actual resource being read
-and created, so that the creation function can perform
-certain tasks only after all reader threads have finished.
-The example of this is when the creation function has prepared a new
-datafile to replace the old one, and would like to switch in the
-new file only when other threads have finished using it.
-
-To enable this feature, use :class:`.SyncReaderDogpile`.
-:meth:`.SyncReaderDogpile.acquire_write_lock` then provides a safe-write lock
-for the critical section where readers should be blocked::
-
-
-    from dogpile.core import SyncReaderDogpile
-
-    dogpile = SyncReaderDogpile(3600)
-
-    def some_creation_function(dogpile):
-        create_expensive_datafile()
-        with dogpile.acquire_write_lock():
-            replace_old_datafile_with_new()
-
-    # usage:
-    with dogpile.acquire(some_creation_function):
-        read_datafile()
-
-With the above pattern, :class:`.SyncReaderDogpile` will
-allow concurrent readers to read from the current version 
-of the datafile as 
-the ``create_expensive_datafile()`` function proceeds with its
-job of generating the information for a new version.
-When the data is ready to be written,  the 
-:meth:`.SyncReaderDogpile.acquire_write_lock` call will 
-block until all current readers of the datafile have completed
-(that is, they've finished their own :meth:`.Dogpile.acquire` 
-blocks).   The ``some_creation_function()`` function
-then proceeds, as new readers are blocked until
-this function finishes its work of 
-rewriting the datafile.
-
-Note that the :class:`.SyncReaderDogpile` approach is useful
-for when working with a resource that itself does not support concurent
-access while being written, namely flat files, possibly some forms of DBM file.
-It is **not** needed when dealing with a datasource that already
-provides a high level of concurrency, such as a relational database,
-Memcached, or NoSQL store.   Currently, the :class:`.SyncReaderDogpile` object
-only synchronizes within the current process among multiple threads;
-it won't at this time protect from concurrent access by multiple 
-processes.   Beaker did support this behavior however using lock files,
-and this functionality may be re-added in a future release.
-
-

File dogpile/core/__init__.py

-from .dogpile import Dogpile, SyncReaderDogpile, NeedRegenerationException, Lock
+from .dogpile import NeedRegenerationException, Lock
 from .nameregistry import NameRegistry
 from .readwrite_lock import ReadWriteMutex
+from .legacy import Dogpile, SyncReaderDogpile
 
-__all__ = 'Dogpile', 'SyncReaderDogpile', 'NeedRegenerationException', 'NameRegistry', 'ReadWriteMutex', 'Lock'
+__all__ = [
+        'Dogpile', 'SyncReaderDogpile', 'NeedRegenerationException',
+        'NameRegistry', 'ReadWriteMutex', 'Lock']
 
 __version__ = '0.4.0'
 

File dogpile/core/dogpile.py

-from .util import threading
 import time
 import logging
-from .readwrite_lock import ReadWriteMutex
 
 log = logging.getLogger(__name__)
 
 NOT_REGENERATED = object()
 
 class Lock(object):
+    """Dogpile lock class.
+
+    Provides an interface around an arbitrary mutex
+    that allows one thread/process to be elected as
+    the creator of a new value, while other threads/processes
+    continue to return the previous version
+    of that value.
+
+    .. versionadded:: 0.4.0
+        The :class:`.Lock` class was added as a single-use object
+        representing the dogpile API without dependence on
+        any shared state between multiple instances.
+
+    :param mutex: A mutex object that provides ``acquire()``
+     and ``release()`` methods.
+    :param creator: Callable which returns a tuple of the form
+     (new_value, creation_time).  "new_value" should be a newly
+     generated value representing completed state.  "creation_time"
+     should be a floating point time value which is relative
+     to Python's ``time.time()`` call, representing the time
+     at which the value was created.  This time value should
+     be associated with the created value.
+    :param value_and_created_fn: Callable which returns
+     a tuple of the form (existing_value, creation_time).  This
+     should return what the last call to the ``creator()``
+     callable has returned, i.e. the value and its creation time,
+     which would typically be retrieved from a cache.  If the
+     value is not available, the :class:`.NeedRegenerationException`
+     exception should be raised.
+    :param expiretime: Expiration time in seconds.  Set to
+     ``None`` for never expires.  This timestamp is compared
+     to the creation_time result and ``time.time()`` to determine if
+     the value returned by value_and_created_fn is "expired".
+
+
+    """
+
     def __init__(self,
             mutex,
             creator,
     def __exit__(self, type, value, traceback):
         pass
 
-class Dogpile(object):
-    """Dogpile lock class.
-
-    Provides an interface around an arbitrary mutex
-    that allows one thread/process to be elected as
-    the creator of a new value, while other threads/processes
-    continue to return the previous version
-    of that value.
-
-    :param expiretime: Expiration time in seconds.  Set to
-     ``None`` for never expires.
-    :param init: if True, set the 'createdtime' to the
-     current time.
-    :param lock: a mutex object that provides
-     ``acquire()`` and ``release()`` methods.
-
-    """
-    def __init__(self, expiretime, init=False, lock=None):
-        """Construct a new :class:`.Dogpile`.
-
-        """
-        if lock:
-            self.dogpilelock = lock
-        else:
-            self.dogpilelock = threading.Lock()
-
-        self.expiretime = expiretime
-        if init:
-            self.createdtime = time.time()
-
-    createdtime = -1
-    """The last known 'creation time' of the value,
-    stored as an epoch (i.e. from ``time.time()``).
-
-    If the value here is -1, it is assumed the value
-    should recreate immediately.
-
-    """
-
-    def acquire(self, creator,
-                        value_fn=None,
-                        value_and_created_fn=None):
-        """Acquire the lock, returning a context manager.
-
-        :param creator: Creation function, used if this thread
-         is chosen to create a new value.
-
-        :param value_fn: Optional function that returns
-         the value from some datasource.  Will be returned
-         if regeneration is not needed.
-
-        :param value_and_created_fn: Like value_fn, but returns a tuple
-         of (value, createdtime).  The returned createdtime
-         will replace the "createdtime" value on this dogpile
-         lock.   This option removes the need for the dogpile lock
-         itself to remain persistent across usages; another
-         dogpile can come along later and pick up where the
-         previous one left off.
-
-        """
-
-        if value_and_created_fn is None:
-            if value_fn is None:
-                def value_and_created_fn():
-                    return None, self.createdtime
-            else:
-                def value_and_created_fn():
-                    return value_fn(), self.createdtime
-
-            def creator_wrapper():
-                value = creator()
-                self.createdtime = time.time()
-                return value, self.createdtime
-        else:
-            def creator_wrapper():
-                value = creator()
-                self.createdtime = time.time()
-                return value
-
-        return Lock(
-            self.dogpilelock,
-            creator_wrapper,
-            value_and_created_fn,
-            self.expiretime
-        )
-
-    @property
-    def is_expired(self):
-        """Return true if the expiration time is reached, or no
-        value is available."""
-
-        return not self.has_value or \
-            (
-                self.expiretime is not None and
-                time.time() - self.createdtime > self.expiretime
-            )
-
-    @property
-    def has_value(self):
-        """Return true if the creation function has proceeded
-        at least once."""
-        return self.createdtime > 0
-
-
-class SyncReaderDogpile(Dogpile):
-    """Provide a read-write lock function on top of the :class:`.Dogpile`
-    class.
-
-    """
-    def __init__(self, *args, **kw):
-        super(SyncReaderDogpile, self).__init__(*args, **kw)
-        self.readwritelock = ReadWriteMutex()
-
-    def acquire_write_lock(self):
-        """Return the "write" lock context manager.
-
-        This will provide a section that is mutexed against
-        all readers/writers for the dogpile-maintained value.
-
-        """
-
-        dogpile = self
-        class Lock(object):
-            def __enter__(self):
-                dogpile.readwritelock.acquire_write_lock()
-            def __exit__(self, type, value, traceback):
-                dogpile.readwritelock.release_write_lock()
-        return Lock()
-
-
-    def _enter(self, *arg, **kw):
-        value = super(SyncReaderDogpile, self)._enter(*arg, **kw)
-        self.readwritelock.acquire_read_lock()
-        return value
-
-    def _exit(self):
-        self.readwritelock.release_read_lock()

File dogpile/core/legacy.py

+from .util import threading
+from .readwrite_lock import ReadWriteMutex
+from .dogpile import Lock
+import time
+import contextlib
+
+class Dogpile(object):
+    """Dogpile lock class.
+
+    .. deprecated:: 0.4.0
+        The :class:`.Lock` object provides the full
+        API of the :class:`.Dogpile` object in a single,
+        consistent way, rather than multiple modes of usage
+        which didn't work well in the majority of cases.
+        :class:`.Dogpile` is now a wrapper around the :class:`.Lock` object,
+        preserving dogpile.core's original usage pattern.
+        That pattern began as something simple, but proved
+        not to be of general use in real-world caching environments without
+        several extra complicating factors; the :class:`.Lock`
+        object presents the "real-world" API more succinctly,
+        and also fixes a cross-process concurrency issue.
+
+    :param expiretime: Expiration time in seconds.  Set to
+     ``None`` for never expires.
+    :param init: if True, set the 'createdtime' to the
+     current time.
+    :param lock: a mutex object that provides
+     ``acquire()`` and ``release()`` methods.
+
+    """
+    def __init__(self, expiretime, init=False, lock=None):
+        """Construct a new :class:`.Dogpile`.
+
+        """
+        if lock:
+            self.dogpilelock = lock
+        else:
+            self.dogpilelock = threading.Lock()
+
+        self.expiretime = expiretime
+        if init:
+            self.createdtime = time.time()
+
+    createdtime = -1
+    """The last known 'creation time' of the value,
+    stored as an epoch (i.e. from ``time.time()``).
+
+    If the value here is -1, it is assumed the value
+    should recreate immediately.
+
+    """
+
+    def acquire(self, creator,
+                        value_fn=None,
+                        value_and_created_fn=None):
+        """Acquire the lock, returning a context manager.
+
+        :param creator: Creation function, used if this thread
+         is chosen to create a new value.
+
+        :param value_fn: Optional function that returns
+         the value from some datasource.  Will be returned
+         if regeneration is not needed.
+
+        :param value_and_created_fn: Like value_fn, but returns a tuple
+         of (value, createdtime).  The returned createdtime
+         will replace the "createdtime" value on this dogpile
+         lock.   This option removes the need for the dogpile lock
+         itself to remain persistent across usages; another
+         dogpile can come along later and pick up where the
+         previous one left off.
+
+        """
+
+        if value_and_created_fn is None:
+            if value_fn is None:
+                def value_and_created_fn():
+                    return None, self.createdtime
+            else:
+                def value_and_created_fn():
+                    return value_fn(), self.createdtime
+
+            def creator_wrapper():
+                value = creator()
+                self.createdtime = time.time()
+                return value, self.createdtime
+        else:
+            def creator_wrapper():
+                value = creator()
+                self.createdtime = time.time()
+                return value
+
+        return Lock(
+            self.dogpilelock,
+            creator_wrapper,
+            value_and_created_fn,
+            self.expiretime
+        )
+
+    @property
+    def is_expired(self):
+        """Return true if the expiration time is reached, or no
+        value is available."""
+
+        return not self.has_value or \
+            (
+                self.expiretime is not None and
+                time.time() - self.createdtime > self.expiretime
+            )
+
+    @property
+    def has_value(self):
+        """Return true if the creation function has completed
+        at least once."""
+        return self.createdtime > 0
+
+
+class SyncReaderDogpile(Dogpile):
+    """Provide a read-write lock on top of the :class:`.Dogpile`
+    class.
+
+    .. deprecated:: 0.4.0
+        The :class:`.ReadWriteMutex` object can be used directly.
+
+    """
+    def __init__(self, *args, **kw):
+        super(SyncReaderDogpile, self).__init__(*args, **kw)
+        self.readwritelock = ReadWriteMutex()
+
+    @contextlib.contextmanager
+    def acquire_write_lock(self):
+        """Return the "write" lock context manager.
+
+        This will provide a section that is mutexed against
+        all readers/writers for the dogpile-maintained value.
+
+        """
+
+        self.readwritelock.acquire_write_lock()
+        try:
+            yield
+        finally:
+            self.readwritelock.release_write_lock()
+
+    @contextlib.contextmanager
+    def acquire(self, *arg, **kw):
+        with super(SyncReaderDogpile, self).acquire(*arg, **kw):
+            self.readwritelock.acquire_read_lock()
+            try:
+                yield
+            finally:
+                self.readwritelock.release_read_lock()

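As the deprecation note above describes, ``Dogpile.acquire()`` now simply composes a creation function and a "value + creation time" retrieval function into a :class:`.Lock`. The sketch below illustrates that pattern standalone; ``MiniLock`` is a simplified, hypothetical stand-in for illustration only (the real :class:`.Lock` is used as a context manager and additionally handles the stale-value and :class:`.NeedRegenerationException` cases):

```python
import threading
import time

class MiniLock:
    """Hypothetical, simplified sketch of the dogpile pattern: one
    thread regenerates an expired value while others keep using it."""

    def __init__(self, mutex, creator, value_and_created_fn, expiretime):
        self.mutex = mutex
        self.creator = creator                            # returns (value, createdtime)
        self.value_and_created_fn = value_and_created_fn  # returns (value, createdtime)
        self.expiretime = expiretime

    def get(self):
        value, createdtime = self.value_and_created_fn()
        expired = createdtime < 0 or (
            self.expiretime is not None
            and time.time() - createdtime > self.expiretime
        )
        # non-blocking acquire: only the thread that wins the race
        # regenerates; everyone else returns the existing value
        if expired and self.mutex.acquire(False):
            try:
                value, createdtime = self.creator()
            finally:
                self.mutex.release()
        return value

# a toy datasource standing in for a cache backend
state = {"value": None, "createdtime": -1}

def creator():
    state["value"] = "payload"
    state["createdtime"] = time.time()
    return state["value"], state["createdtime"]

def value_and_created_fn():
    return state["value"], state["createdtime"]

lock = MiniLock(threading.Lock(), creator, value_and_created_fn, expiretime=30)
value = lock.get()   # first call: this thread creates the value
```

Note how, as in ``Dogpile.acquire()`` above, the "creation time" travels with the value through ``value_and_created_fn`` rather than living only on the lock object, which is what allows a fresh lock to pick up where a previous one left off.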
File dogpile/core/nameregistry.py

 
 class NameRegistry(object):
     """Generates and returns an object, keeping it as a
-    singleton for a certain identifier for as long as its 
+    singleton for a certain identifier for as long as it is
     strongly referenced.
-    
+
+    .. note::
+
+        The :class:`.NameRegistry` exists here to support
+        certain usage patterns of the deprecated
+        :class:`.Dogpile` object.  It is still potentially
+        useful in other cases, however.
+
     e.g.::
-    
+
         class MyFoo(object):
             "some important object."
             def __init__(self, identifier):
 
         # thread 1:
         my_foo = registry.get("foo1")
-        
+
         # thread 2
         my_foo = registry.get("foo1")
-    
+
     Above, ``my_foo`` in both thread #1 and #2 will
     be *the same object*.   The constructor for
-    ``MyFoo`` will be called once, passing the 
+    ``MyFoo`` will be called once, passing the
     identifier ``foo1`` as the argument.
-    
-    When thread 1 and thread 2 both complete or 
+
+    When thread 1 and thread 2 both complete or
     otherwise delete references to ``my_foo``, the
-    object is *removed* from the :class:`.NameRegistry` as 
+    object is *removed* from the :class:`.NameRegistry` as
     a result of Python garbage collection.
-    
+
     :class:`.NameRegistry` is a utility object that
     is used to maintain new :class:`.Dogpile` objects
     against a certain key, for as long as that particular key
 
     def __init__(self, creator):
         """Create a new :class:`.NameRegistry`.
-        
-         
+
+
         """
         self._values = weakref.WeakValueDictionary()
         self._mutex = threading.RLock()
 
     def get(self, identifier, *args, **kw):
         """Get and possibly create the value.
-        
+
         :param identifier: Hash key for the value.
          If the creation function is called, this identifier
          will also be passed to the creation function.
         :param \*args, \**kw: Additional arguments which will
-         also be passed to the creation function if it is 
+         also be passed to the creation function if it is
          called.
-        
+
         """
         try:
             if identifier in self._values:
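The weak-reference singleton pattern that :class:`.NameRegistry` implements can be sketched standalone as follows; ``MiniRegistry`` is a simplified, hypothetical illustration (the real class additionally guards creation with an ``RLock`` and re-checks the dictionary after acquiring it, as the ``get()`` hunk above begins to show):

```python
import weakref

class MiniRegistry:
    """Hypothetical sketch: values live in a WeakValueDictionary, so an
    entry disappears once no caller holds a strong reference to it."""

    def __init__(self, creator):
        self.creator = creator
        self._values = weakref.WeakValueDictionary()

    def get(self, identifier, *args, **kw):
        try:
            return self._values[identifier]
        except KeyError:
            # not present (or already garbage collected); create anew,
            # passing the identifier through to the creation function
            value = self.creator(identifier, *args, **kw)
            self._values[identifier] = value
            return value

class MyFoo:
    "some important object."
    def __init__(self, identifier):
        self.identifier = identifier

registry = MiniRegistry(MyFoo)
a = registry.get("foo1")
b = registry.get("foo1")   # same object as long as "a" is referenced
```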