Commits

Jean-Tiare Le Bigot committed 679e912

update readmes, getting started, status and extending guides

Files changed (5)

 ``range_key``. All other fields are optional.
 
 **DynamoDB** is really awesome but is terribly slooooow with management tasks.
-This makes it completly unusable in test environements
+This makes it completely unusable in test environments.
 
-**ddbmock** brings a nice, tiny, in-memory implementation of DynamoDB along with
-much better and detailed error messages. Among its niceties, it features a double
-entry point:
+**ddbmock** brings a nice, tiny, in-memory (optionally sqlite) implementation of
+DynamoDB along with much better and more detailed error messages. Among its niceties,
+it features a double entry point:
 
 - regular network-based entry-point with 1:1 correspondence with stock DynamoDB
 - **embedded entry-point** with seamless boto integration, ideal to avoid spinning yet another server.
 
+**ddbmock** is *not* intended for production use. It *will* **lose** your
+data. You've been warned! I currently recommend the "boto extension" mode for
+unit-tests and the "server" mode for functional tests.
+
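+As an illustration of the embedded entry point, here is a minimal sketch. It
+assumes ``connect_boto_patch`` is the ``ddbmock`` helper that patches
+``boto.dynamodb.layer1``; adapt the import to your installed version::
+
+    from ddbmock import connect_boto_patch
+
+    # patch boto.dynamodb.layer1 and get a regular boto connection object back
+    db = connect_boto_patch()
+
+    # from here on, boto talks to the in-memory mock instead of AWS
+    db.list_tables()
+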
 Installation
 ============
 
     $ hg clone ssh://hg@bitbucket.org/Ludia/dynamodb-mock
     $ pip install nose nosexcover coverage mock webtests boto
     $ python setup.py develop
-    $ nosetests # --no-coverage to run boto integration tests too
+    $ nosetests # --no-skip to run boto integration tests too
 
 
-What is/will ddbmock be useful for ?
-====================================
+What is ddbmock useful for?
+===========================
 
 - running unit tests FAST. DONE
 - running functional tests FAST. DONE
 - experiment with DynamoDB API. DONE
-- plan Throughput usage. WIP (low/mid level foundation done)
-- plan actual storage space requirements. DONE (describe table returns accurate size !)
-- perform simulations with accurate limitations. Even some undocumented "features" are accurate :)
+- plan throughput usage. DONE
+- plan disk space requirements. DONE (describe table returns accurate size!)
+- perform simulations with accurate limitations.
 
 Current status
 ==============
 
-ddbmock is an experimental project and is currently under heavy development. It
-also may be discontinued at *any* time.
-
+- pass all boto integration tests
 - support full table life-cycle
 - support full item life-cycle
 - support for all item limitations
-- accurate size and date reporting
-- ``Query``, ``Scan``, ``BatchGetItem`` and ``BatchWriteItem`` support is preliminary
+- accurate size and throughput reporting
+- ``Scan``, ``BatchGetItem`` and ``BatchWriteItem`` still lack ``ExclusiveStartKey``
 - no limits on concurrent table operations
 - no limits on request/response size or item count
 
     # Done ! just use it wherever in your project as usual.
     db.list_tables() # get list of tables (empty at this stage)
 
+Note: to clean the patches made in ``boto.dynamodb.layer1``, you can call
+``clean_boto_patch()`` from the same module.
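+
+For instance, a unit test can pair both calls in ``setUp``/``tearDown``. This is
+only an illustrative sketch; it assumes both ``connect_boto_patch`` and
+``clean_boto_patch`` are importable from ``ddbmock``::
+
+    import unittest
+    from ddbmock import connect_boto_patch, clean_boto_patch
+
+    class DynamoDBTestCase(unittest.TestCase):
+        def setUp(self):
+            # divert boto.dynamodb.layer1 calls to the in-memory mock
+            self.db = connect_boto_patch()
+
+        def tearDown(self):
+            # remove the patches and restore the original boto code
+            clean_boto_patch()
+
+        def test_list_tables(self):
+            # the patched connection behaves like a regular boto DynamoDB one
+            self.db.list_tables()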
 
 Requirements
 ============

docs/_include/intro.rst

 either be run as a stand alone server or as a regular library helping you to
 build lightning fast unit and functional tests :)
 
-**ddbmock** does *not* intend to be production ready. It *will* **loose** you
+**ddbmock** is *not* intended for production use. It *will* **lose** your
 data. You've been warned! I currently recommend the "boto extension" mode for
 unit-tests and the "server" mode for functional tests.

docs/pages/extending.rst

             `-- pyramid => just make sure that all methods are supported
 
 
+Get the source, Luke
+====================
+
+::
+
+    $ hg clone ssh://hg@bitbucket.org/Ludia/dynamodb-mock
+    $ pip install nose nosexcover coverage mock webtests boto
+    $ python setup.py develop
+    $ nosetests # --no-skip to run boto integration tests too
+
+
 Adding a custom method
 ======================
 

docs/pages/getting_started.rst

 
     # Done ! just use it wherever in your project as usual.
     db.list_tables() # get list of tables (empty at this stage)
+
+Note: to clean the patches made in ``boto.dynamodb.layer1``, you can call
+``clean_boto_patch()`` from the same module.

docs/pages/status.rst

 Current Status
 ##############
 
-This documents reflects ddbmock status as of 3/10/12. It may be outdated.
+This document reflects ddbmock status as of 5/11/2012. It may be outdated.
+
+Some items are marked as "WONTFIX". These are throttling related: the goal of
+ddbmock is to help you with tests and planning, not to get in your way.
 
 Methods support
 ===============
 - ``PutItem`` DONE
 - ``DeleteItem`` DONE
 - ``UpdateItem`` ALMOST
-- ``BatchGetItem`` WIP
-- ``BatchWriteItem`` WIP
+- ``BatchGetItem`` DONE*
+- ``BatchWriteItem`` DONE*
 - ``Query`` DONE
-- ``Scan`` WIP
+- ``Scan`` DONE*
 
-There is basically no support for ``ExclusiveStartKey``, and their associated
-features at all in ddbmock. This affects all "WIP" functions. ``Query`` is the
+There is basically no support for ``ExclusiveStartKey`` or its associated
+features in ddbmock. This affects all "*" operations. ``Query`` is the
 only exception.
 
 ``UpdateItem`` has a different behavior when the target item did not exist prior
 Request rate
 ------------
 
-- Throttle read  operations when provisioned throughput exceeded. TODO (?)
-- Throttle write operations when provisioned throughput exceeded. TODO (?)
+- Throttle read  operations when provisioned throughput exceeded. WONTFIX
+- Throttle write operations when provisioned throughput exceeded. WONTFIX
+- Throughput usage logging for planning purposes. DONE
 - Maximum throughput is 10,000. DONE
 - Minimum throughput is 1. DONE
 - Report accurate throughput. DONE
 ---------------
 
 - No more than 256 tables. DONE
-- No more than 10 ``CREATING`` tables. TODO
-- No more than 10 ``DELETING`` tables. TODO
-- No more than 1  ``UPDATING`` table.  TODO
+- No more than 10 ``CREATING`` tables. WONTFIX
+- No more than 10 ``DELETING`` tables. WONTFIX
+- No more than 10 ``UPDATING`` tables. WONTFIX
 
 - No more than 1 Throughput decrease/calendar day. DONE
 - No more than *2 Throughput increase/update. DONE
-- At least 10% change per update. DONE
 
 Types and items Limitations
 ===========================