Commits

Anonymous committed f20218f Draft Merge

sync with main repo

  • Parent commits 8cd259a, ed33d8e

Files changed (19)

 
 *.pyc
 *.egg
+*.orig
 dynamodb_mapper.egg-info
 
 build
 ebfa3e966160a3cb027438100f5270126a5ce535 1.6.1
 34934595a8c35bfe8e044a1d4f48294db4de5881 1.6.2
 739db7a1b75fd3ebe599468051821714db2e304a 1.6.3
+10624a205ab3850f6ccc0b9591a1b2cd85e04713 1.7.0
 Additions
 ---------
 
+- migration engine - single object
 - method ``ConnectionBorg.set_region`` to specify Amazon's region (thanks kimscheibel)
 - method ``DynamoDBModel.from_db_dict`` which additionally saves ``_raw_data``
 - ``raise_on_conflict`` on ``DynamoDBModel.save``, defaults to ``False``
 - rename ``from_dict`` to ``_from_db_dict``. Should not be used anymore
 - transactions may create new Items (side effect of ``raise_on_conflict`` refactor)
 - fix bug #13 in dates de-serialization. (thanks Merwok)
+- open only one shared boto connection per process instead of one per thread (boto is thread-safe)
+- re-implement ``get_batch`` to rely on boto's new generator. Fixes the 100-Item limitation and paging.
+- minimum boto version is now 2.6.0
 
 Removal
 -------
 - ``allow_overwrite`` feature was not needed with ``raise_on_conflict``
 - ``to_db_dict`` and ``from_dict`` are no longer public
 - ``ThroughputError``. Throughput checks are delegated to Amazon's API (thanks kimscheibel)
+- ``ConnectionBorg.new_batch_list`` is not needed anymore with boto>=2.6.0
 
 Upgrade
 -------
 Requirements
 ============
 
-The documentation currently assumes that you're running Boto 2.3.0 or later.
-If you're not, then the API for query and scan changes. You will have to supply
-raw condition dicts, as is done in boto itself.
+ - Boto = 2.6.0
+ - AWS account
 
-Also note that Boto 2.3.1 or later is required for autoincrement_int hash keys.
-Earlier versions will fail.
-
-We assume you've correctly set your Boto credentials.
+ We assume you've correctly set your Boto credentials.
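+
+ For instance (a minimal sketch; ``aws_access_key_id`` and
+ ``aws_secret_access_key`` are placeholders for your own keys):
+
+ ::
+
+     from dynamodb_mapper.model import ConnectionBorg
+
+     # Explicitly set the credentials; skip this if Boto's usual configuration
+     # (environment variables or ~/.boto) already provides them
+     ConnectionBorg().set_credentials(aws_access_key_id, aws_secret_access_key)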
 
 Example usage
 =============

docs/api/migration.rst

+.. _migrations:
+
+##########
+Migrations
+##########
+
+As development goes on, the application's data schema evolves. As this is
+NoSQL, there is no notion of "column", hence no way to update a whole table at
+a time. In a sense, this is a good thing: migrations may be done lazily, with
+no need to lock the database for hours.
+
+The migration module aims to provide simple tools for the most common migration
+scenarios.
+
+Migration concepts
+==================
+
+Migration involves 2 steps:
+
+ 1. detecting the current version
+ 2. if need be, perform operations
+
+Version detection will **always** be performed, as long as a ``Migration`` class
+is associated with the ``DynamoDBModel``, to make sure the object is up to date.
+
+The version is detected by running ``check_N`` successively on the raw boto data.
+``N`` is a revision integer. Revision numbers do not need to be consecutive and
+are sorted in natural decreasing order. This means that ``N=11`` is considered
+bigger than ``N=2``.
+
+ - If ``check_N`` returns ``True``, the detected version will be ``N``.
+ - If ``check_N`` returns ``False``, go on with the next lower version.
+ - If no ``check_N`` succeeds, :py:class:`~.VersionError` is raised.
+
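+For instance, a hypothetical migrator defining ``check_1``, ``check_2`` and
+``check_11`` runs its detectors in the order ``check_11``, ``check_2``,
+``check_1``; if ``check_2`` is the first to return ``True``, the detected
+version is ``2``.
+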
+Migration in itself is performed by successively running ``migrate_to_N`` on the
+raw boto data. This enables you to run incremental migrations: the first
+migrator run is the first one with ``N > current_version``. Revision numbers
+need not be consecutive, nor have ``check_N`` equivalents. For instance, an
+object detected at version ``1`` with migrators ``migrate_to_2`` and
+``migrate_to_11`` will have both run, in that order.
+
+If your lowest possible version is ``n``, you need a ``check_n`` but no
+``migrate_to_n``, as there is no lower version to migrate from. On the contrary,
+the latest revision needs both a migrator and a version checker. The migrator
+will be needed to update older objects, while the version checker will ensure
+the Item is at the latest revision; if it returns ``True``, no migration will
+be performed.
+
+At the end of the process, the version is assumed to be the latest. No additional
+check will be performed. The migrated object needs to be saved manually.
+
+When will the migration be useful?
+----------------------------------
+
+Non null field is added
+    - **detection**: no field in raw_data
+    - **migration**: add the field in raw_data
+    - Note: this is of no use if empty values are allowed, as there is no distinction between empty and non-existing values in boto
+Renamed field
+    - **detection**: old field name in raw_data
+    - **migration**: insert a new field with the old value and ``del`` the old field in raw_data.
+Deleted field
+    - **detection**: old field still exists in raw data
+    - **migration**: ``del`` old field from raw data
+Type change
+    - **detection**: converting the raw data field to the expected type fails.
+    - **migration**: perform the type conversion manually and serialize it back *before* returning the other data (see the sketch below)
+
+
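+A hedged sketch of such a type-change migrator, assuming a hypothetical
+``score`` field that used to be stored as a string and is now a number:
+
+::
+
+    class ScoreMigration(Migration):
+        # Version 1 stored the score as a string
+        def check_1(self, raw_data):
+            return u"score" in raw_data and isinstance(raw_data[u"score"], unicode)
+
+        # Latest revision stores the score as a number
+        def check_2(self, raw_data):
+            return u"score" in raw_data and isinstance(raw_data[u"score"], (int, long))
+
+        def migrate_to_2(self, raw_data):
+            # Convert *before* returning the raw data so that de-serialization
+            # sees the expected type
+            raw_data[u"score"] = int(raw_data[u"score"])
+            return raw_data
+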
+When will it be of no use?
+--------------------------
+
+Table rename
+    You need to manually fall-back to the old table.
+Field migration between table
+    You still need some high level magic.
+
+For complex use cases, you may consider freezing your application and running
+an EMR job on it.
+
+Use case: Rename field 'mail' to 'email'
+========================================
+
+Migration engine
+----------------
+
+::
+
+    from dynamodb_mapper.migration import Migration
+
+    class UserMigration(Migration):
+        # Is it at least compatible with the first revision?
+        def check_1(self, raw_data):
+            field_count = 0
+            field_count += u"id" in raw_data and isinstance(raw_data[u"id"], unicode)
+            field_count += u"energy" in raw_data and isinstance(raw_data[u"energy"], int)
+            field_count += u"mail" in raw_data and isinstance(raw_data[u"mail"], unicode)
+
+            return field_count == len(raw_data)
+
+        # No migrator to version 1: it cannot be older than version 1!
+
+        # Is the object up to date?
+        def check_2(self, raw_data):
+            field_count = 0
+            field_count += u"id" in raw_data and isinstance(raw_data[u"id"], unicode)
+            field_count += u"energy" in raw_data and isinstance(raw_data[u"energy"], int)
+            field_count += u"email" in raw_data and isinstance(raw_data[u"email"], unicode)
+
+            return field_count == len(raw_data)
+
+        # migrate from previous revision (1) to this one (the latest)
+        def migrate_to_2(self, raw_data):
+            raw_data[u"email"] = raw_data[u"mail"]
+            del raw_data[u"mail"]
+            return raw_data
+
+Enable migrations in model
+--------------------------
+
+::
+
+    from dynamodb_mapper.model import DynamoDBModel
+
+    class User(DynamoDBModel):
+        __table__ = "user"
+        __hash_key__ = "id"
+        __migrator__ = UserMigration # Single line to add !
+        __schema__ = {
+            "id": unicode,
+            "energy": int,
+            "email": unicode
+        }
+
+Example run
+-----------
+
+Let's say you have an object at revision 1 in the db. It will look like this:
+
+::
+
+    raw_data_version_1 = {
+        u"id": u"Jackson",
+        u"energy": 6742348,
+        u"mail": u"jackson@tldr-ludia.com",
+    }
+
+Now, migrate it:
+
+>>> jackson = User.get(u"Jackson")
+>>> # jackson is already migrated; let's check it
+>>> print jackson.email
+jackson@tldr-ludia.com
+>>> # save the migrated object; goes fine as long as there is no concurrent access
+>>> jackson.save(raise_on_conflict=True)
+
+``raise_on_conflict`` integration
+=================================
+
+Internally, ``raise_on_conflict`` relies on the raw data dict from boto to
+detect conflicts. This dict is stored in the model instance *before* the
+migration engine is triggered, so the ``raise_on_conflict`` feature keeps
+working as expected.
+
+This behavior guarantees that :ref:`transactions` work as expected, even when
+dealing with migrated objects.
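+
+For illustration (mirroring the project's test suite; ``_raw_data`` and
+``_to_db_dict`` are internal helpers):
+
+::
+
+    user = User._from_db_dict(raw_data_version_1)
+    # _raw_data still holds the original dict => conflict detection works
+    assert user._raw_data == raw_data_version_1
+    # the in-memory data, however, is at the latest revision
+    assert u"email" in user._to_db_dict()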
+
+Related exceptions
+==================
+
+VersionError
+------------
+
+.. autoclass:: dynamodb_mapper.migration.VersionError

docs/api/query.rst

 please see `Amazon's developer guide
 <http://docs.amazonwebservices.com/amazondynamodb/latest/developerguide/BestPractices.html#ScanQueryConsiderationBestPractices>`_
 
+To retrieve the results of :py:meth:`~.DynamoDBModel.get_batch`,
+:py:meth:`~.DynamoDBModel.query` and :py:meth:`~.DynamoDBModel.scan`, just loop
+over the results. Technically, they all rely on high-level generators
+abstracting the query chunking logic.
+
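+For instance (a minimal sketch, assuming the ``User`` model from the
+:ref:`migrations` example):
+
+::
+
+    # chunking and paging are handled transparently by the generator
+    for user in User.scan():
+        print user.email
+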
 All querying methods persist the original raw object for
 :ref:`raise_on_conflict <saving>` and transactions.
 
 # The short X.Y version.
 version = '1.7'
 # The full version, including alpha/beta/rc tags.
-release = '1.7.0.dev'
+release = '1.7.0'
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
    api/query
    api/alter
    api/transaction
+   api/migration
 
    pages/changelog
 

docs/pages/overview.rst

 Requirements
 ============
 
-The documentation currently assumes that you're running Boto 2.3.0 or later.
-If you're not, then the API for query and scan changes. You will have to supply
-raw condition dicts, as is done in boto itself.
-
-Also note that Boto 2.3.1 or later is required for autoincrement_int hash keys.
-Earlier versions will fail.
+ - Boto = 2.6.0
+ - AWS account
 
 Features
 ========
 - default values
 - Multi-target transaction support with auto-retry (new in 1.6.0)
 - Sub-transactions (new in 1.6.2)
+- Single object migration engine (new in 1.7.0)
 - Auto-inc hash_key
 - Protection against the 'lost update' syndrome (refactored in 1.7.0)
+- Full low-level chunking abstraction for ``scan``, ``query`` and ``get_batch``
 - New table creation
 - Framework agnostic
 - Log all successful database access

docs/raw_api/migration.rst

+###############
+Migration class
+###############
+
+.. currentmodule:: dynamodb_mapper.migration
+
+Class definition
+================
+
+.. autoclass:: Migration
+
+Constructors
+------------
+
+.. automethod:: Migration.__init__
+
+Call
+----
+
+.. automethod:: Migration.__call__

dynamodb_mapper/migration.py

+from __future__ import absolute_import
+
+# Extract the revision number from a method name like "check_11" or "migrate_to_2"
+version_key = lambda name: int(name.split('_')[-1])
+
+# Exception
+class VersionError(Exception):
+    """VersionError is raised when the schema version could not be identified.
+    It may mean that the Python schema is *older* than the DB item or simply is
+    not the right type.
+    """
+
+class Migration(object):
+    """
+    Migration engine base class. It defines all the magic to make things easy to
+    use. To migrate a raw schema, only instantiate the migrator and "call" the
+    instance with the raw_dict.
+
+    >>> migrator = DataMigration()
+    >>> migrated_raw_data = migrator(raw_data)
+
+    Migrators must derive from this class and implement methods of the form
+    - ``check_N(raw_data)`` -> returns True if raw_data is compatible with version N
+    - ``migrate_to_N(raw_data)`` -> migrate from previous version to this one
+
+    ``check_N`` are all called in decreasing order. The first to return True
+    determines the version.
+
+    All migrator functions are called successively for every revision strictly
+    greater than the current version N.
+
+    N version numbers do not need to be consecutive and are sorted in natural
+    order.
+    """
+    _detectors = None
+    _migrators = None
+
+    def __init__(self, model):
+        """
+        Gather all the version detectors and migrators on the first call. They are
+        then cached for all further instances.
+
+        :param model: model class this migrator handles
+        """
+        cls = type(self)
+
+        self.model = model
+
+        # Did we already gather all this?
+        if cls._migrators is not None:
+            return
+
+        _detectors = []
+        _migrators = []
+
+        for key in dir(self):
+            obj = getattr(self, key)
+            if not callable(obj):
+                continue
+            if key.startswith("check_"):
+                _detectors.append(key)
+            if key.startswith("migrate_to_"):
+                _migrators.append(key)
+
+        cls._detectors = sorted(_detectors, key=version_key, reverse=True)
+        cls._migrators = sorted(_migrators, key=version_key)
+
+    def _detect_version(self, raw_data):
+        """
+        Detect the current schema version of this raw_data dict by calling ``check_N``
+        in natural decreasing order. If the version checker returns ``True``, we
+        consider this the first compatible revision and return ``N`` as the
+        current revision number.
+
+        :param raw_data: Raw boto dict to migrate to latest version
+
+        :return: current revision number
+
+        :raises VersionError: when no check succeeded
+        """
+        for detector in type(self)._detectors:
+            if getattr(self, detector)(raw_data):
+                return version_key(detector)
+        raise VersionError()
+
+    def _do_migration(self, current_version, raw_data):
+        """
+        Run the migration engine. All ``migrate_to_N`` are called successively
+        in natural ascending order as long as ``N > current_version``.
+
+        :param current_version: Current version of the raw_data as detected by :py:meth:`~.Migration._detect_version`
+
+        :param raw_data: Raw boto dict to migrate to latest version
+
+        :return: Up to date raw boto dict
+        """
+        for migrator in type(self)._migrators:
+            if version_key(migrator) > current_version:
+                raw_data = getattr(self, migrator)(raw_data)
+        return raw_data
+
+    def __call__(self, raw_data):
+        """
+        Trigger the 2-step migration engine:
+
+          1. detect the current version
+          2. successively apply every migrator newer than the detected version
+
+        :param raw_data: Raw boto dict to migrate to latest version
+
+        :return: Up to date raw boto dict
+
+        :raises VersionError: when no check succeeded
+        """
+        current_version = self._detect_version(raw_data)
+        return self._do_migration(current_version, raw_data)

dynamodb_mapper/model.py

 
 import json
 import logging
-import threading
 from datetime import datetime, timedelta, tzinfo
 
 import boto
 from boto.exception import DynamoDBResponseError
 from boto.dynamodb.exceptions import DynamoDBConditionalCheckFailedError
 
-
 log = logging.getLogger(__name__)
 dblog = logging.getLogger(__name__+".database-access")
 
     """
     if default is not None:
         # If default is callable(function, constructor, ...), try to dereference it
-        if hasattr(default, '__call__'):
+        if callable(default):
             # Might raise a "TypeError" if arguments were needed
             default = default()
        # Check default value consistency
         "_aws_access_key_id": None,
         "_aws_secret_access_key": None,
         "_region": None,
-        # {thread_id: connection} mapping
-        "_connections": {},
+        "_connection": None,
     }
 
     def __init__(self):
         self.__dict__ = self._shared_state
 
     def _get_connection(self):
-        """Return the DynamoDB connection for the current thread, establishing
-        it if required.
+        """Return the DynamoDB connection for the mapper
         """
-        current_thread = threading.current_thread()
-        thread_id = current_thread.ident
-        try:
-            return self._connections[thread_id]
-        except KeyError:
-            log.debug("Creating DynamoDB connection for thread %s.", current_thread)
-            self._connections[thread_id] = boto.connect_dynamodb(
+        if self._connection is None:
+            self._connection = boto.connect_dynamodb(
                 aws_access_key_id=self._aws_access_key_id,
                 aws_secret_access_key=self._aws_secret_access_key,
                 region=self._region,
             )
-            return self._connections[thread_id]
+        return self._connection
 
     def _create_autoincrement_magic_item(self, table):
         item = table.new_item(hash_key=MAGIC_KEY, attrs={
         """Return the table with the requested name."""
         return self._get_connection().get_table(name)
 
-    def new_batch_list(self):
-        """Create a new batch list."""
-        return self._get_connection().new_batch_list()
-
 
 class DynamoDBModel(object):
     """Abstract base class for all models that use DynamoDB as their storage
           corresponding to the type will be used.
           "defaulter" may either be a scalar value or a callable with no
           arguments.
+      - ``__migrator__``: :py:class:`~.Migration` handler attached to this model
 
     To redefine serialization/deserialization semantics (e.g. to have more
     complex schemas, like auto-serialized JSON data structures), override the
     __hash_key__ = None
     __range_key__ = None
     __schema__ = None
+    __migrator__ = None
     __defaults__ = {}
 
     def __init__(self, **kwargs):
         Objects created and initialized with this method are considered as not
         coming from the DB.
         """
-        defaults = type(self).__defaults__
-        schema = type(self).__schema__
+        cls = type(self)
+        defaults = cls.__defaults__
+        schema = cls.__schema__
 
         self._raw_data = {}
 
                 default = defaults[name] if name in defaults else None
             setattr(self, name, _get_default_value(type_, default, name=name))
 
+        # instantiate the migrator only once per model *after* initialization
+        if isinstance(cls.__migrator__, type):
+            cls.__migrator__ = cls.__migrator__(cls)
+
     @classmethod
     def _from_db_dict(cls, raw_data):
         """Build an instance from a dict-like mapping, according to the class's
        schema. Objects created with this method are considered as coming from
         the DB. The initial state is persisted in ``self._raw_data``.
+        If a ``__migrator__`` has been declared, migration is triggered on a copy
+        of the raw data.
 
         Default values are used for anything that's missing from the dict
         (see DynamoDBModel class docstring).
         instance = cls()
         instance._raw_data = raw_data
 
+        # If a migrator is registered, trigger it on a (shallow) copy of the
+        # raw data so that ``_raw_data`` keeps the original, as-stored values
+        if cls.__migrator__ is not None:
+            raw_data = cls.__migrator__(dict(raw_data))
+
+        # de-serialize data
         for (name, type_) in cls.__schema__.iteritems():
             # Set the value if we got one from DynamoDB. Otherwise, stick with the default
             value = _dynamodb_to_python(type_, raw_data.get(name)) # de-serialize
 
         get_batch *always* performs eventually consistent reads.
 
-        Please not that a batch can *not* read more than 100 items at once.
-
         Objects loaded by this method are marked as coming from the DB. Hence
         their initial state is saved in ``self._raw_data``.
 
         :param keys: iterable of keys. ex ``[(hash1, range1), (hash2, range2)]``
 
         """
-        if len(keys) > 100:
-            raise ValueError("Too many items to read in a single batch. Maximum is 100.")
+        table = ConnectionBorg().get_table(cls.__table__)
 
-        borg = ConnectionBorg()
-        table = borg.get_table(cls.__table__)
         # Convert all the keys to DynamoDB values.
         if cls.__range_key__:
             dynamo_keys = [
         else:
             dynamo_keys = map(_python_to_dynamodb, keys)
 
-        batch_list = borg.new_batch_list()
-        batch_list.add_batch(table, dynamo_keys)
-
-        res = batch_list.submit()
+        res = table.batch_get_item(dynamo_keys)
 
         dblog.debug("Sent a batch get on table %s", cls.__table__)
 
-        return [
-            cls._from_db_dict(d) for d in res[u"Responses"][cls.__table__][u"Items"]
-        ]
+        return [cls._from_db_dict(d) for d in res]
 
     @classmethod
     def query(cls, hash_key_value, range_key_condition=None, consistent_read=False, reverse=False, limit=None):

dynamodb_mapper/tests/test_migrations.py

+from __future__ import absolute_import
+
+import unittest
+
+from dynamodb_mapper.model import DynamoDBModel
+from dynamodb_mapper.migration import Migration, VersionError
+
+# test case: field rename
+class UserMigration(Migration):
+    def check_1(self, raw_data):
+        field_count = 0
+        field_count += u"id" in raw_data and isinstance(raw_data[u"id"], unicode)
+        field_count += u"energy" in raw_data and isinstance(raw_data[u"energy"], int)
+        field_count += u"mail" in raw_data and isinstance(raw_data[u"mail"], unicode)
+
+        return field_count == len(raw_data)
+
+    # No migrator for version 1, of course!
+
+    def check_2(self, raw_data):
+        # Stub. This is to check checker sorting only
+        return False
+
+    def migrate_to_2(self, raw_data):
+        # Stub. This is to check migrator sorting only
+        return raw_data
+
+    def check_11(self, raw_data):
+        field_count = 0
+        field_count += u"id" in raw_data and isinstance(raw_data[u"id"], unicode)
+        field_count += u"energy" in raw_data and isinstance(raw_data[u"energy"], int)
+        field_count += u"email" in raw_data and isinstance(raw_data[u"email"], unicode)
+
+        return field_count == len(raw_data)
+
+    def migrate_to_11(self, raw_data):
+        raw_data[u"email"] = raw_data[u"mail"]
+        del raw_data[u"mail"]
+        return raw_data
+
+class User(DynamoDBModel):
+    __table__ = "user"
+    __hash_key__ = "id"
+    __migrator__ = UserMigration
+    __schema__ = {
+        "id": unicode,
+        "energy": int,
+        "email": unicode
+    }
+
+class TestMigration(unittest.TestCase):
+    def test_init(self):
+        # check migrator list + natural order sort
+        m = UserMigration(User)
+        self.assertEqual(m._detectors, ['check_11', 'check_2', 'check_1'])
+        self.assertEqual(m._migrators, ['migrate_to_2', 'migrate_to_11'])
+
+    def test_version_detection_error(self):
+        raw_data_version_error_type = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"mail": "jackson@tldr-ludia.com",
+        }
+
+        raw_data_version_error_field = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"e-mail": u"jackson@tldr-ludia.com",
+        }
+
+        m = UserMigration(User)
+        self.assertRaises(
+            VersionError,
+            m._detect_version,
+            raw_data_version_error_type,
+        )
+        self.assertRaises(
+            VersionError,
+            m._detect_version,
+            raw_data_version_error_field,
+        )
+
+    def test_version_detection(self):
+        raw_data_version_1_regular = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"mail": u"jackson@tldr-ludia.com",
+        }
+
+        # Version 1 without mail is detected as 11: the revisions only differ on the mail field
+        raw_data_version_1_no_mail = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+        }
+
+        m = UserMigration(User)
+        self.assertEqual(m._detect_version(raw_data_version_1_regular), 1)
+        self.assertEqual(m._detect_version(raw_data_version_1_no_mail), 11)
+
+    def test_migration(self):
+        raw_data_version_1 = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"mail": u"jackson@tldr-ludia.com",
+        }
+
+        raw_data_version_11 = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"email": u"jackson@tldr-ludia.com",
+        }
+
+        m = UserMigration(User)
+        self.assertEqual(m._do_migration(1, raw_data_version_1), raw_data_version_11)
+        self.assertEqual(m._detect_version(raw_data_version_11), 11)
+
+    def test_auto_migration(self):
+        raw_data_version_1 = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"mail": u"jackson@tldr-ludia.com",
+        }
+
+        raw_data_version_11 = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"email": u"jackson@tldr-ludia.com",
+        }
+
+        m = UserMigration(User)
+        self.assertEqual(m(raw_data_version_1), raw_data_version_11)
+
+    # more a functional test than a unit test...
+    def test_real_model_migration(self):
+        raw_data_version_1 = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"mail": u"jackson@tldr-ludia.com",
+        }
+
+        raw_data_version_11 = {
+            u"id": u"Jackson",
+            u"energy": 6742348,
+            u"email": u"jackson@tldr-ludia.com",
+        }
+
+        user = User._from_db_dict(raw_data_version_1)
+        # Raw data is still the original => needed for raise_on_conflict
+        self.assertEqual(user._raw_data, raw_data_version_1)
+        # Data is at the latest revision => consistency
+        self.assertEqual(user._to_db_dict(), raw_data_version_11)
+        # check the migrator engine is persisted (cache)
+        assert isinstance(User.__migrator__, UserMigration)

dynamodb_mapper/tests/test_model.py

 from dynamodb_mapper.model import (ConnectionBorg, DynamoDBModel,
     autoincrement_int, MaxRetriesExceededError, MAX_RETRIES,
     ConflictError, _python_to_dynamodb, _dynamodb_to_python, utc_tz,
-    _get_default_value, SchemaError, MAGIC_KEY, OverwriteError, InvalidRegionError)
+    _get_default_value, SchemaError, MAGIC_KEY, OverwriteError, InvalidRegionError,)
 from boto.exception import DynamoDBResponseError
 from boto.dynamodb.exceptions import DynamoDBConditionalCheckFailedError
 import datetime
 
 class TestConnectionBorg(unittest.TestCase):
     def setUp(self):
-        ConnectionBorg()._connections.clear()
         ConnectionBorg._shared_state = {
             "_aws_access_key_id": None,
             "_aws_secret_access_key": None,
             "_region": None,
-            "_connections": {},
+            "_connection": None,
         }
 
     def tearDown(self):
-        ConnectionBorg()._connections.clear()
         ConnectionBorg._shared_state = {
             "_aws_access_key_id": None,
             "_aws_secret_access_key": None,
             "_region": None,
-            "_connections": {},
+            "_connection": None,
         }
 
     def test_borgness(self):
 
         borg1.set_credentials(aws_access_key_id, aws_secret_access_key)
 
-        self.assertEquals(borg1._aws_access_key_id, aws_access_key_id)
-        self.assertEquals(borg1._aws_secret_access_key, aws_secret_access_key)
+        self.assertEqual(borg1._aws_access_key_id, aws_access_key_id)
+        self.assertEqual(borg1._aws_secret_access_key, aws_secret_access_key)
 
-        self.assertEquals(borg2._aws_access_key_id, aws_access_key_id)
-        self.assertEquals(borg2._aws_secret_access_key, aws_secret_access_key)
+        self.assertEqual(borg2._aws_access_key_id, aws_access_key_id)
+        self.assertEqual(borg2._aws_secret_access_key, aws_secret_access_key)
+
+    @mock.patch("dynamodb_mapper.model.boto")
+    def test_get_connection(self, m_boto):
+        CONNECTION = "JJG"
+        REGION = "La bas, tout est neuf et tout est sauvage."
+        KEY_ID = "foo"
+        KEY_VALUE = "bar"
+
+        m_boto.connect_dynamodb.return_value = "JJG"
+
+        borg1 = ConnectionBorg()
+        borg2 = ConnectionBorg()
+
+        borg1._region = REGION
+        borg1._aws_access_key_id = KEY_ID
+        borg1._aws_secret_access_key = KEY_VALUE
+
+        self.assertIsNone(borg1._connection)
+        self.assertEqual(CONNECTION, borg2._get_connection())
+        self.assertEqual(CONNECTION, borg1._get_connection())
+        self.assertEqual(CONNECTION, borg1._connection)
+
+        m_boto.connect_dynamodb.assert_called_once_with(
+                aws_access_key_id=KEY_ID,
+                aws_secret_access_key=KEY_VALUE,
+                region=REGION,
+                )
 
     @mock.patch("dynamodb_mapper.model.boto")
     def test_set_region_valid(self, m_boto):
             region=m_region,
         )
 
-    @mock.patch("dynamodb_mapper.model.boto")
-    def test_new_batch_list_nominal(self, m_boto):
-        m_new_batch_list = m_boto.connect_dynamodb.return_value.new_batch_list
-        ConnectionBorg().new_batch_list()
-
-        m_new_batch_list.assert_called_once()
-
-
     def test_create_table_no_name(self):
         self.assertRaises(
             SchemaError,
 
 class TestTypeConversions(unittest.TestCase):
     def test_python_to_dynamodb_number(self):
-        self.assertEquals(0, _python_to_dynamodb(0))
-        self.assertEquals(0.0, _python_to_dynamodb(0.0))
-        self.assertEquals(10, _python_to_dynamodb(10))
-        self.assertEquals(10.0, _python_to_dynamodb(10.0))
-        self.assertEquals(0, _python_to_dynamodb(False))
-        self.assertEquals(1, _python_to_dynamodb(True))
+        self.assertEqual(0, _python_to_dynamodb(0))
+        self.assertEqual(0.0, _python_to_dynamodb(0.0))
+        self.assertEqual(10, _python_to_dynamodb(10))
+        self.assertEqual(10.0, _python_to_dynamodb(10.0))
+        self.assertEqual(0, _python_to_dynamodb(False))
+        self.assertEqual(1, _python_to_dynamodb(True))
 
     def test_dynamodb_to_python_number(self):
-        self.assertEquals(0, _dynamodb_to_python(int, 0))
-        self.assertEquals(0.0, _dynamodb_to_python(float, 0.0))
-        self.assertEquals(10, _dynamodb_to_python(int, 10))
-        self.assertEquals(10.0, _dynamodb_to_python(float, 10.0))
-        self.assertEquals(False, _dynamodb_to_python(bool, 0))
-        self.assertEquals(True, _dynamodb_to_python(bool, 1))
+        self.assertEqual(0, _dynamodb_to_python(int, 0))
+        self.assertEqual(0.0, _dynamodb_to_python(float, 0.0))
+        self.assertEqual(10, _dynamodb_to_python(int, 10))
+        self.assertEqual(10.0, _dynamodb_to_python(float, 10.0))
+        self.assertEqual(False, _dynamodb_to_python(bool, 0))
+        self.assertEqual(True, _dynamodb_to_python(bool, 1))
 
     def test_python_to_dynamodb_unicode(self):
-        self.assertEquals(u"hello", _python_to_dynamodb(u"hello"))
+        self.assertEqual(u"hello", _python_to_dynamodb(u"hello"))
         # Empty strings become missing attributes
         self.assertIsNone(_python_to_dynamodb(u""))
 
     def test_dynamodb_to_python_unicode(self):
-        self.assertEquals(u"hello", _dynamodb_to_python(unicode, u"hello"))
+        self.assertEqual(u"hello", _dynamodb_to_python(unicode, u"hello"))
         # Missing attributes become None
         self.assertIsNone(_dynamodb_to_python(unicode, None))
 
     def test_python_to_dynamodb_set(self):
-        self.assertEquals(set([1, 2, 3]), _python_to_dynamodb(set([1, 2, 3])))
-        self.assertEquals(
+        self.assertEqual(set([1, 2, 3]), _python_to_dynamodb(set([1, 2, 3])))
+        self.assertEqual(
             set(["foo", "bar", "baz"]),
             _python_to_dynamodb(set(["foo", "bar", "baz"])))
         # Empty sets become missing attributes
         self.assertIsNone(_python_to_dynamodb(set()))
 
     def test_dynamodb_to_python_set(self):
-        self.assertEquals(set([1, 2, 3]), _dynamodb_to_python(set, set([1, 2, 3])))
-        self.assertEquals(
+        self.assertEqual(set([1, 2, 3]), _dynamodb_to_python(set, set([1, 2, 3])))
+        self.assertEqual(
             set(["foo", "bar", "baz"]),
             _dynamodb_to_python(set, set(["foo", "bar", "baz"])))
         # Empty sets become None
             {'name': 'punch', 'damage': 5}
         ]
 
-        self.assertEquals(
+        self.assertEqual(
             json.dumps(attacks, sort_keys=True),
             _python_to_dynamodb(attacks))
-        self.assertEquals("[]", _python_to_dynamodb([]))
+        self.assertEqual("[]", _python_to_dynamodb([]))
 
     def test_dynamodb_to_python_list(self):
         attacks = [
             {'name': 'punch', 'damage': 5}
         ]
 
-        self.assertEquals(
+        self.assertEqual(
             attacks,
             _dynamodb_to_python(list, json.dumps(attacks, sort_keys=True)))
-        self.assertEquals([], _dynamodb_to_python(list, "[]"))
+        self.assertEqual([], _dynamodb_to_python(list, "[]"))
 
     def test_python_to_dynamodb_dict(self):
         monsters = {
             "cyberdemon": []
         }
 
-        self.assertEquals(
+        self.assertEqual(
             json.dumps(monsters, sort_keys=True),
             _python_to_dynamodb(monsters))
-        self.assertEquals("{}", _python_to_dynamodb({}))
+        self.assertEqual("{}", _python_to_dynamodb({}))
 
     def test_dynamodb_to_python_dict(self):
         monsters = {
             "cyberdemon": []
         }
 
-        self.assertEquals(
+        self.assertEqual(
             monsters,
             _dynamodb_to_python(dict, json.dumps(monsters, sort_keys=True)))
-        self.assertEquals({}, _dynamodb_to_python(dict, "{}"))
+        self.assertEqual({}, _dynamodb_to_python(dict, "{}"))
 
     def test_dynamodb_to_python_datetime(self):
-        self.assertEquals(
+        self.assertEqual(
             datetime.datetime(2012, 05, 31, 12, 0, 0, 42, tzinfo=utc_tz),
             _dynamodb_to_python(datetime.datetime,
                                 "2012-05-31T12:00:00.000042+00:00"))
-        self.assertEquals(
+        self.assertEqual(
             datetime.datetime(2010, 11, 1, 4, 0, 0, 13, tzinfo=utc_tz),
             _dynamodb_to_python(datetime.datetime,
                                 "2010-11-01T04:00:00.000013Z"))
             _dynamodb_to_python, datetime.datetime, "2012-05-31T12:00:00.000000")
 
     def test_python_to_dynamodb_datetime(self):
-        self.assertEquals(
+        self.assertEqual(
             "2012-05-31T12:00:00.000000+00:00",
             _python_to_dynamodb(datetime.datetime(2012, 05, 31, 12, 0, 0, tzinfo=utc_tz)))
 
     def test_datetime(self, m_datetime):
         # Plain default date should be now
         m_datetime.now.return_value = now = datetime.datetime.now(tz=utc_tz)
-        self.assertEquals(
+        self.assertEqual(
             now,
             _get_default_value(m_datetime))
-        self.assertEquals(utc_tz.tzname(now), "UTC")
+        self.assertEqual(utc_tz.tzname(now), "UTC")
 
         # default date with supplied fallback
         fallback = datetime.datetime(2012, 05, 31, 12, 0, 0, tzinfo=utc_tz)
-        self.assertEquals(
+        self.assertEqual(
             fallback,
             _get_default_value(datetime.datetime, fallback))
 
     def test_int(self):
         # Plain default int
-        self.assertEquals(0, _get_default_value(int))
+        self.assertEqual(0, _get_default_value(int))
         # Default int with fallback
-        self.assertEquals(42, _get_default_value(int, 42))
+        self.assertEqual(42, _get_default_value(int, 42))
         # Default int with callable fallback
-        self.assertEquals(42, _get_default_value(int, return_42))
+        self.assertEqual(42, _get_default_value(int, return_42))
 
     def test_incompatible_types_crash_constant(self):
         # Default unicode with int fallback should fail
 
 class TestDynamoDBModel(unittest.TestCase):
     def setUp(self):
-        ConnectionBorg()._connections.clear()
+        ConnectionBorg()._connection = None
 
     def tearDown(self):
-        ConnectionBorg()._connections.clear()
+        ConnectionBorg()._connection = None
 
     def test_to_json_dict(self):
         testdate = datetime.datetime(2012, 5, 31, 12, 0, 0, tzinfo=utc_tz)
         d.date = testdate
 
         d_dict = d.to_json_dict()
-        self.assertEquals(d_dict["id"], 42)
-        self.assertEquals(d_dict["set"], sorted(testSet))
-        self.assertEquals(d_dict["date"], testdate.astimezone(utc_tz).isoformat())
+        self.assertEqual(d_dict["id"], 42)
+        self.assertEqual(d_dict["set"], sorted(testSet))
+        self.assertEqual(d_dict["date"], testdate.astimezone(utc_tz).isoformat())
 
     def test_build_default_values(self):
         d = DoomEpisode()
-        self.assertEquals(d.id, 0)
-        self.assertEquals(d.name, u"")
+        self.assertEqual(d.id, 0)
+        self.assertEqual(d.name, u"")
 
         # Using to_json_dict because _to_db_dict omits empty values
         d_dict = d.to_json_dict()
-        self.assertEquals(d_dict["id"], d.id)
-        self.assertEquals(d_dict["name"], d.name)
+        self.assertEqual(d_dict["id"], d.id)
+        self.assertEqual(d_dict["name"], d.name)
 
     @mock.patch("dynamodb_mapper.model._get_default_value")
     def test_build_default_values_with_defaults(self, m_defaulter):
         d = PlayerStrength()
 
         m_defaulter.assert_called_with(unicode, u"weak", name='strength')
-        self.assertEquals(d.strength, u"weak")
+        self.assertEqual(d.strength, u"weak")
 
     def test_init_from_args(self):
         d = DoomEpisode(id=1, name=u"Knee-deep in the Dead")
-        self.assertEquals({}, d._raw_data)
-        self.assertEquals(1, d.id)
-        self.assertEquals(u"Knee-deep in the Dead", d.name)
+        self.assertEqual({}, d._raw_data)
+        self.assertEqual(1, d.id)
+        self.assertEqual(u"Knee-deep in the Dead", d.name)
 
     def test_build_from_dict(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
         d_dict = {"id": 1, "name": "Knee-deep in the Dead"}
         d = DoomEpisode._from_db_dict(d_dict)
-        self.assertEquals(d_dict, d._raw_data)
-        self.assertEquals(d_dict["id"], d.id)
-        self.assertEquals(d_dict["name"], d.name)
+        self.assertEqual(d_dict, d._raw_data)
+        self.assertEqual(d_dict["id"], d.id)
+        self.assertEqual(d_dict["name"], d.name)
 
     def test_build_from_dict_missing_attrs(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
         d = DoomCampaign._from_db_dict({})
-        self.assertEquals(d.id, 0)
-        self.assertEquals(d.name, u"")
-        self.assertEquals(d.cheats, set())
+        self.assertEqual(d.id, 0)
+        self.assertEqual(d.name, u"")
+        self.assertEqual(d.cheats, set())
 
     def test_build_from_dict_json_list(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
             "attacks": json.dumps(attacks, sort_keys=True)
         })
 
-        self.assertEquals(m.attacks, attacks)
+        self.assertEqual(m.attacks, attacks)
 
         # Test default values
         m2 = DoomMonster._from_db_dict({"id": 1})
-        self.assertEquals(m2.attacks, [])
+        self.assertEqual(m2.attacks, [])
 
     def test_build_from_dict_json_dict(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
             "map_id": 1,
             "monsters": json.dumps(monsters, sort_keys=True)
         })
-        self.assertEquals(e.monsters, monsters)
+        self.assertEqual(e.monsters, monsters)
 
         e2 = DoomMonsterMap._from_db_dict({"map_id": 1})
-        self.assertEquals(e2.monsters, {})
+        self.assertEqual(e2.monsters, {})
 
     def test_build_from_db_dict(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
         d_dict = {"id": 1, "name": "Knee-deep in the Dead"}
         d = DoomEpisode._from_db_dict(d_dict)
-        self.assertEquals(d_dict, d._raw_data)
+        self.assertEqual(d_dict, d._raw_data)
 
     def test_build_from_db_dict_json_dict(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
         }
 
         e = DoomMonsterMap._from_db_dict(raw_dict)
-        self.assertEquals(e.monsters, monsters)
-        self.assertEquals(raw_dict["map_id"], e._raw_data["map_id"])
-        self.assertEquals(raw_dict["monsters"], e._raw_data["monsters"])
+        self.assertEqual(e.monsters, monsters)
+        self.assertEqual(raw_dict["map_id"], e._raw_data["map_id"])
+        self.assertEqual(raw_dict["monsters"], e._raw_data["monsters"])
 
     def test_to_db_dict_json_list(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
 
         d = m._to_db_dict()
 
-        self.assertEquals(d["id"], 1)
-        self.assertEquals(d["attacks"], json.dumps(attacks, sort_keys=True))
+        self.assertEqual(d["id"], 1)
+        self.assertEqual(d["attacks"], json.dumps(attacks, sort_keys=True))
 
     def test_to_db_dict_json_dict(self):
         #FIXME: can it be removed as this feature is implicitly tested elsewhere ?
 
         d = e._to_db_dict()
 
-        self.assertEquals(d["map_id"], 1)
-        self.assertEquals(d["monsters"], json.dumps(monsters, sort_keys=True))
+        self.assertEqual(d["map_id"], 1)
+        self.assertEqual(d["monsters"], json.dumps(monsters, sort_keys=True))
 
 
     @mock.patch("dynamodb_mapper.model.Item")
         m_table.get_item.return_value = {"datetime": d_text}
 
         p = Patch.get(d)
-        self.assertEquals(d_text, p._raw_data["datetime"])
+        self.assertEqual(d_text, p._raw_data["datetime"])
         m_table.get_item.assert_called_once_with(
             hash_key=d_text, range_key=None, consistent_read=False)
 
         }
 
         r = GameReport.get(players, d)
-        self.assertEquals(d_text, r._raw_data["datetime"])
-        self.assertEquals(players_text, r._raw_data["player_ids"])
+        self.assertEqual(d_text, r._raw_data["datetime"])
+        self.assertEqual(players_text, r._raw_data["player_ids"])
         m_table.get_item.assert_called_once_with(
             hash_key=players_text, range_key=d_text, consistent_read=False)
 
         l.save()
 
         m_item.return_value.__setitem__.assert_called_with("id", 3)
-        self.assertEquals(m_item.return_value.__setitem__.call_count, 2)
+        self.assertEqual(m_item.return_value.__setitem__.call_count, 2)
 
     @mock.patch("dynamodb_mapper.model.Item")
     @mock.patch("dynamodb_mapper.model.boto")
         l.text = "Everybody's dead, Dave."
         self.assertRaises(MaxRetriesExceededError, l.save)
 
-        self.assertEquals(m_item_instance.put.call_count, MAX_RETRIES)
+        self.assertEqual(m_item_instance.put.call_count, MAX_RETRIES)
 
     @mock.patch("dynamodb_mapper.model.Item")
     @mock.patch("dynamodb_mapper.model.boto")
 
     @mock.patch("dynamodb_mapper.model.boto")
     def test_get_batch_hash_key(self, m_boto):
-        m_batch_list = m_boto.connect_dynamodb.return_value.new_batch_list.return_value
+        # paging and generator behavior are covered by the real db tests
+        KEYS = range(2)
+        DATA = [{
+                    u"id": 0,
+                    u"name": u"The level with cute cats",
+                },
+                {
+                    u"id": 1,
+                    u"name": u"The level with horrible monsters",
+                }]
+
         m_table = m_boto.connect_dynamodb.return_value.get_table.return_value
-        # Mock empty answer as we only observe the way the request is build
-        # FIXME: make sure _raw_data is filled
-        m_batch_list.submit.return_value = {
-            u"Responses": {
-                DoomEpisode.__table__: {
-                    u"Items": []
-                }
-            }
-        }
+        m_table.batch_get_item.return_value = DATA
 
-        DoomEpisode.get_batch(range(2));
-        m_batch_list.add_batch.assert_called_with(m_table, [0,1])
-        assert m_batch_list.submit.called
+        res = DoomEpisode.get_batch(KEYS)
+
+        m_table.batch_get_item.assert_called_with(KEYS)
+        self.assertEqual(2, len(res))
+        self.assertEqual(DATA[0][u'id'], res[0].id)
+        self.assertEqual(DATA[0][u'name'], res[0].name)
+        self.assertEqual(DATA[1][u'id'], res[1].id)
+        self.assertEqual(DATA[1][u'name'], res[1].name)
 
     @mock.patch("dynamodb_mapper.model.boto")
     def test_get_batch_hash_range_key(self, m_boto):
-        m_batch_list = m_boto.connect_dynamodb.return_value.new_batch_list.return_value
+        KEYS = [(1, u"The level with cute cats"),
+                (2, u"The level with horrible monsters")]
+        DATA = [{
+                    u"episode_id": 1,
+                    u"name": u"The level with cute cats",
+                    u"world": u"Heaven",
+                },
+                {
+                    u"episode_id": 2,
+                    u"name": u"The level with horrible monsters",
+                    u"world": u"Hell",
+                }]
+
         m_table = m_boto.connect_dynamodb.return_value.get_table.return_value
-        # Mock empty answer as we only observe the way the request is build
-        # FIXME: make sure _raw_data is filled
-        m_batch_list.submit.return_value = {
-            u"Responses": {
-                DoomMap.__table__: {
-                    u"Items": []
-                }
-            }
-        }
+        m_table.batch_get_item.return_value = DATA
 
-        DoomMap.get_batch([(0, u"level1"), (1, u"level2")]);
-        m_batch_list.add_batch.assert_called_with(m_table, [(0, u'level1'), (1, u'level2')])
-        assert m_batch_list.submit.called
+        res = DoomMap.get_batch(KEYS)
 
-    def test_get_batch_over_100(self):
-        self.assertRaises(
-            ValueError,
-            DoomEpisode.get_batch,
-            range(101)
-        )
+        m_table.batch_get_item.assert_called_with(KEYS)
+        self.assertEqual(2, len(res))
+        self.assertEqual(DATA[0][u'episode_id'], res[0].episode_id)
+        self.assertEqual(DATA[0][u'name'], res[0].name)
+        self.assertEqual(DATA[0][u'world'], res[0].world)
+        self.assertEqual(DATA[1][u'episode_id'], res[1].episode_id)
+        self.assertEqual(DATA[1][u'name'], res[1].name)
+        self.assertEqual(DATA[1][u'world'], res[1].world)
 
     @mock.patch("dynamodb_mapper.model.ConnectionBorg.get_table")
     def test_query(self, m_get_table):
 
         d.delete()
 
-        self.assertEquals(d._raw_data, {})
+        self.assertEqual(d._raw_data, {})
         m_item.assert_called_once_with(m_table, dynamodb_date, None)
         assert m_item_instance.delete.called
 
 
         d.delete()
 
-        self.assertEquals(d._raw_data, {})
+        self.assertEqual(d._raw_data, {})
         m_item.assert_called_once_with(m_table, 42, "level super geek")
         assert m_item_instance.delete.called
 
             raise_on_conflict=True
         )
 
-        self.assertEquals(d._raw_data, {}) #still "new"
+        self.assertEqual(d._raw_data, {}) #still "new"
 
     @mock.patch("dynamodb_mapper.model.boto")
     @mock.patch("dynamodb_mapper.model.Item")
             u'world': False
         })
 
-        self.assertEquals(d._raw_data, raw_data) #no change
+        self.assertEqual(d._raw_data, raw_data) #no change

dynamodb_mapper/tests/test_transactions.py

 
 class EnergyPurchase(Transaction):
     """A sample transaction with sub transactions"""
-    __table__ = "energyPurchase"
+    __table__ = "energy_purchase"
 
     def _get_transactors(self):
         return [] # Stub
         m_user_save.assert_called()
         m_transaction_save.assert_called()
 
-        self.assertEquals(m_user_instance.energy, 0)
-        self.assertEquals(t.user_id, USER_ID)
+        self.assertEqual(m_user_instance.energy, 0)
+        self.assertEqual(t.user_id, USER_ID)
 
     @mock.patch("dynamodb_mapper.model.DynamoDBModel.save")
     @mock.patch.object(User, "save")
         t.commit()
 
         m_user_save.assert_called()
-        self.assertEquals(m_transaction_save.call_count, 0)
+        self.assertEqual(m_transaction_save.call_count, 0)
 
-        self.assertEquals(m_user_instance.energy, 0)
-        self.assertEquals(t.status, "done")
+        self.assertEqual(m_user_instance.energy, 0)
+        self.assertEqual(t.status, "done")
 
     @mock.patch("dynamodb_mapper.transactions.Transaction.save")
     @mock.patch("dynamodb_mapper.model.DynamoDBModel.save")
 
         self.assertRaises(UniverseDestroyedError, t.commit)
 
-        self.assertEquals(m_save.call_count, 0)
+        self.assertEqual(m_save.call_count, 0)
         m_transaction_save.assert_called()
-        self.assertEquals(t.status, "pending")
+        self.assertEqual(t.status, "pending")
 
     @mock.patch("dynamodb_mapper.transactions.Transaction.save")
     @mock.patch.object(User, "save")
         self.assertRaises(TargetNotFoundError, t.commit)
 
         # The transaction fails at the first step: no save
-        self.assertEquals(m_user_save.call_count, 0)
+        self.assertEqual(m_user_save.call_count, 0)
         self.assertFalse(m_transaction_save.called)
-        self.assertEquals(t.status, "pending")
+        self.assertEqual(t.status, "pending")
 
     @mock.patch("dynamodb_mapper.transactions.Transaction.save")
     @mock.patch.object(User, "save")
 
         self.assertRaises(InsufficientEnergyError, t.commit)
 
-        self.assertEquals(m_user_instance.energy, ENERGY)
-        self.assertEquals(m_user_save.call_count, 0)
+        self.assertEqual(m_user_instance.energy, ENERGY)
+        self.assertEqual(m_user_save.call_count, 0)
         self.assertFalse(m_transaction_save.called)
-        self.assertEquals(t.status, "pending")
+        self.assertEqual(t.status, "pending")
 
     @mock.patch("dynamodb_mapper.transactions.Transaction.save")
     @mock.patch.object(User, "save")
 
         t.commit()
         m_transaction_save.assert_called()
-        self.assertEquals(t.status, "done")
+        self.assertEqual(t.status, "done")
         self.assertEqual(m_user_save.call_count, failed_tries)
 
     @mock.patch("dynamodb_mapper.transactions.Transaction.save")
         t = UserEnergyTransaction._from_db_dict({"user_id": USER_ID, "energy": ENERGY})
 
         self.assertRaises(MaxRetriesExceededError, t.commit)
-        self.assertEquals(m_user_save.call_count, Transaction.MAX_RETRIES)
+        self.assertEqual(m_user_save.call_count, Transaction.MAX_RETRIES)
         m_transaction_save.assert_called()
-        self.assertEquals(t.status, "pending")
+        self.assertEqual(t.status, "pending")
 
     def test_get_2_transactors(self):
         t = collectRewards()
         transactors = t._get_transactors()
 
-        self.assertEquals(len(transactors), 2)
+        self.assertEqual(len(transactors), 2)
 
     def test_legacy_get_transactors(self):
         t = UserEnergyTransaction()
         transactors = t._get_transactors()
 
-        self.assertEquals(len(transactors), 1)
-        self.assertEquals(transactors[0][0], t._get_target)
-        self.assertEquals(transactors[0][1], t._alter_target)
+        self.assertEqual(len(transactors), 1)
+        self.assertEqual(transactors[0][0], t._get_target)
+        self.assertEqual(transactors[0][1], t._alter_target)
 
     @mock.patch("dynamodb_mapper.transactions.Transaction.save")
     @mock.patch("dynamodb_mapper.model.Item")
 
         #Check "allow_overwrite=False" was generated correctly
         m_item_instance.put.assert_called_with({u'user_id': False, u'name': False})
-        self.assertEquals(m_item_instance.put.call_count, 2)
+        self.assertEqual(m_item_instance.put.call_count, 2)
 
-        self.assertEquals(t.status, "done")
+        self.assertEqual(t.status, "done")
 
     @mock.patch("dynamodb_mapper.model.DynamoDBModel.save")
     @mock.patch.object(UserEnergyTransaction, "commit")

dynamodb_mapper/transactions.py

 
 class Transaction(DynamoDBModel):
     """Abstract base class for transactions. A transaction may involve multiple
-    targets and needs to be fully successful to be marked as "DONE".
+    targets and needs to be fully successful to be marked as done.
 
     This class gracefully handles concurrent modifications and auto-retries but
     embeds no tool to rollback.
     Transactions may register ``subtransactions``. This field is a list of
     ``Transaction``. Sub-transactions are played after the main transactors
 
-    Transactions status may be persisted for tracability, further analysis...
+    Transactions status may be persisted for traceability, further analysis...
     for this purpose, a minimal schema is embedded in this base class. When
     deriving, you MUST keep
 
         - ``datetime`` field as rangekey
         - ``status`` field
 
-    The hash key field may be changed to pick a ore relevant name or change its
+    The hash key field may be changed to pick a more relevant name or change its
     type. In any case, you are responsible of setting its value. For example, if
     collecting rewards for a player, you may wish to keep track of related
     transactions by user_id hence set requester_id to user_id
         """Set up preconditions and parameters for the transaction.
 
         This method is only run once, regardless of how many retries happen.
-        You should override it to fetch all the *unchanging* information you
-        need from the database to run the transaction (e.g. the cost of a Bingo
-        card, or the contents of a reward).
+        You should override it to fill the ``subtransactions`` array or to
+        fetch, from the database, all the data that will not be changed by the
+        transaction (e.g. the cost of a Bingo card, or the contents of a
+        reward).
         """
 
     def _get_transactors(self):
         raising :exc:`ConflictError`.
 
        Will succeed iff no attributes of the object returned by getter have been
-        modified before ou save method to prevent accidental overwrites.
+        modified before our save method to prevent accidental overwrites.
 
         :param getter: getter as defined in :py:meth:`_get_transactors`
         :param setter: setter as defined in :py:meth:`_get_transactors`
             tries += 1
             try:
                 fn()
-                # Nothing was raised: we're done !
+                # Nothing was raised: we're done!
                 break
             except exc_class as e:
                 log.debug(
             raise MaxRetriesExceededError()
 
     def commit(self):
-        """ Run the transaction and, if needed, store its states to the database
+        """ Run the transaction and, if needed, store its state to the database
 
             - set up preconditions and parameters (:meth:`_setup` -- only called
               once no matter what).

real_dynamodb_tests.py

         doommap.save()
         # make sure it is there
         doommap2 = DoomMap.get(1,1, consistent_read=True)
-        self.assertEquals(doommap._raw_data, doommap2._raw_data)
+        self.assertEqual(doommap._raw_data, doommap2._raw_data)
         # delete it
         doommap2.delete()
         # make sure it's gone
         )
         # make sure the doommap2 survived
         doommap3 = DoomMap.get(1, 1, consistent_read=True)
-        self.assertEquals(doommap3._raw_data, doommap2._raw_data)
+        self.assertEqual(doommap3._raw_data, doommap2._raw_data)
         # delete it
         doommap2.delete()
 
         doommap2.save()
         # make sure it is there
         doommap3 = DoomMap.get(4, 2, consistent_read=True)
-        self.assertEquals(doommap3._raw_data, doommap2._raw_data)
+        self.assertEqual(doommap3._raw_data, doommap2._raw_data)
         # delete them
         doommap.delete()
         doommap3.delete()
 
+    def test_get_bunch_batch(self):
+        bunch = 120
+        # insert a bunch of objects
+        for i in range(bunch):
+            DoomMap(episode=i, map=i*i, name=u"secret level"+str(i)).save()
+        # get them all back
+        keys = [(i, i*i) for i in range(bunch)]
+        data = list(DoomMap.get_batch(keys))
+
+        self.assertEqual(bunch, len(data))
+
 if __name__ == '__main__':
     init_table()
     unittest.main()
 [metadata]
 name = dynamodb-mapper
-version = 1.7.0.dev
+version = 1.7.0
 summary = Object mapper for Amazon DynamoDB
 description-file = README.rst
 author = Max Noel
     License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)
 requires-dist =
     Sphinx
-    boto
+    boto == 2.6.0
 
 [files]
 packages =
     # d2to1 bootstrap
     'd2to1',
 
-    # Run time deps. Application needs them to run
-    'boto',
-
     # Testing dependencies (the application doesn't need them to *run*)
     'nose',
     'nosexcover',