Commits

Mikhail Denisenko committed 31c84cc Merge

merge default

  • Parent commits d1969fe, 68c851a
  • Branches django-1.7-support

Files changed (29)

 7d4d6f703102dcfc889afb4a112e73849b7960e7 1.1
 ac1b813ab651669cf90c3865fd90969ee502e006 1.2rc1
 15eb3a9182be95bffaf34e57201779d3bcb13dcc 1.2
+4943234e92cacb76451331982302626c282e95ce 1.3
+2790f8f287e0849c46a92dab5000e0450578d42c 1.3.1
 popd
 env\scripts\pip install -e ./pytds
 
-env\bin\pip install pytz --use-mirrors
+env\bin\pip install pytz==2013d --use-mirrors
 
 set COMPUTERNAME=%HOST%
 python tests\test_main\manage.py test --noinput
 if [ $BACKEND = sqlserver.pymssql ]; then
     python env/bin/pip install cython hg+https://denisenkom@code.google.com/r/denisenkom-pymssql/ --use-mirrors
 fi
-python env/bin/pip install pytz --use-mirrors
+python env/bin/pip install pytz==2013d --use-mirrors
 export COMPUTERNAME=$HOST
 python tests/test_main/manage.py test --noinput
 #python tests/test_regex_compare/manage.py test --noinput

File django_settings.py

 
 options = {'use_mars': True,
            'allow_nulls_in_unique_constraints': False,  # sqlserver doesn't fully support multiple nulls in unique constraint
-           'extra_params': 'MARS Connection=True;',
+           'extra_params': 'DataTypeCompatibility=80;MARS Connection=True;',
+           'use_legacy_date_fields': False,
            }
 
 DATABASE_HOST = os.environ['HOST'] + '\\' + os.environ.get('SQLINSTANCE', '')

File docs/changelog.txt

 Changelog
 =========
 
-v1.3 (in development)
+v1.4 (In Development)
 ---------------------
 
+- Support for Django v1.3 has been removed.
+- Corrected DB-API 2 testing documentation.
+- Fixed issue with slicing logic that could prevent the compiler from finding
+  and mapping column aliases properly.
+- Improved the "return ID from insert" logic so it can properly extract the
+  column data type from user defined fields with custom data type strings.
+- Fixed case for identifiers in introspection. Thanks Mikhail Denisenko.
+- Added option :setting:`use_legacy_date_fields` (defaults to ``True``) to control whether
+  ``DatabaseCreation.data_types`` uses the Microsoft preferred date data types added with
+  SQL Server 2008. :issue:`31`
+- Improved accuracy of field type guessing with inspectdb. See :ref:`introspecting-custom-fields`
+- Fixed issue with identity insert using a cursor to the wrong database in a multi-database environment. Thanks Mikhail
+  Denisenko.
+- Fixed constraint checking. :issue:`35` Thanks Mikhail Denisenko.
+- Enabled ``can_introspect_autofield`` database feature. :djangoticket:`21097`
+- Any date related field should now return from the database as the appropriate Python type, instead of always being a
+  datetime.
+- Backend now supports doing date lookups using a string. E.g. ``Article.objects.filter(pub_date__startswith='2005')``
+- ``check_constraints`` will now check all disabled and enabled constraints. This change was made to match the behavior
+  tested by ``backends.FkConstraintsTests.test_check_constraints``.
+
+v1.3.1
+------
+
+- Ensure Django knows to re-enable constraints. :issue:`29`
+
+v1.3
+----
+
 - Backend now supports returning the ID from an insert without needing an additional query. This is disabled
   for SQL Server 2000 (assuming that version still works with this backend). :issue:`17`
 

File docs/datatypes.txt

 Dates and Times
 ---------------
 
-Earliest Year
-~~~~~~~~~~~~~
+When using `Django-mssql` with SQL Server 2005, all of the date related fields
+only support the `datetime` data type. Continued use of this legacy data type
+can be enabled with the :setting:`use_legacy_date_fields` option, or by using
+the fields ``LegacyDateField``, ``LegacyDateTimeField``, and ``LegacyTimeField``
+from ``sqlserver_ado.fields``.
 
-SQL Server provides two datetime types: `datetime and smalldatetime`_.
+To allow migrating specific apps or only some of your models to the new date
+and time data types, the model fields ``DateField``, ``DateTimeField``, and
+``TimeField`` in ``sqlserver_ado.fields`` use the new data types regardless of
+the :setting:`use_legacy_date_fields` option.
 
-.. _`datetime and smalldatetime`: http://msdn.microsoft.com/en-us/library/ms187819.aspx
+.. code:: python
 
-This backend always uses datetime, which supports dates back to 
-January 1, 1753.
+	from django.db import models
+	from sqlserver_ado.fields import DateField, DateTimeField, TimeField
+	from sqlserver_ado.fields import LegacyDateField, LegacyDateTimeField, LegacyTimeField
 
-Django previously only supported dates back to 1900, but recent revisions 
-7946 + 7950 changed this behavior. If you need to store historical data in 
-datetime fields, then SQL Server 2005 will have a problem. 
-(I'd welcome bugs/repro cases/patches on this issue.)
+	class MyModel(models.Model):
+		# when use_legacy_date_fields is False, models.*Field will behave like these
+		a_real_date = DateField() # date data type
+		a_datetime2 = DateTimeField() # datetime2 data type
+		a_real_time = TimeField() # time data type
+
+		# when use_legacy_date_fields is True, models.*Field will behave like these
+		a_date = LegacyDateField() # datetime data type
+		a_datetime = LegacyDateTimeField() # datetime data type
+		a_time = LegacyTimeField() # datetime data type
+
+datetime limitations
+~~~~~~~~~~~~~~~~~~~~
+
+With SQL Server 2005, only the `datetime`_ data type is usable with Django. This
+data type cannot represent the full range or precision of Python datetime
+values: fractional seconds are rounded to increments of .000, .003, or .007
+seconds, and the earliest supported date value is January 1, 1753.
+
+.. _`datetime`: http://msdn.microsoft.com/en-us/library/ms187819.aspx
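
The rounding behaviour can be pictured with a small helper. This is only an
illustration; the helper name and the 1/300-second tick rule it uses are
assumptions, not part of the backend:

.. code:: python

    import datetime

    def round_to_legacy_datetime(value):
        # Legacy datetime stores fractional seconds as 1/300-second ticks,
        # which is why values come back ending in .000, .003, or .007.
        ticks = int(round(value.microsecond * 300 / 1000000.0))
        carry, ticks = divmod(ticks, 300)
        milliseconds = int(round(ticks * 1000 / 300.0))
        return (value.replace(microsecond=milliseconds * 1000)
                + datetime.timedelta(seconds=carry))

    # 123456 microseconds is roughly 37 ticks, which comes back as .123
    print(round_to_legacy_datetime(
        datetime.datetime(2013, 9, 18, 13, 1, 59, 123456)))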
 
 SQL Server 2008 introduces a datetime2_ type, with support for fractional 
-seconds and the full range of Python datetime dates. To use this time, edit 
-your local creation.py/introspection.py files to use this datatype instead.
+seconds and the full range of Python datetime dates. To use this type, either
+set the :setting:`use_legacy_date_fields` option to ``False`` or use
+``sqlserver_ado.fields.DateTimeField`` with your models.
 
 .. _datetime2: http://msdn.microsoft.com/en-us/library/ms180878(SQL.100).aspx
 
-Bare Times
-~~~~~~~~~~
-
-SQL Server 2005 doesn't have a bare "time" datatype, only a datetime. SQL 
-Server 2008 introduces a time type, but this isn't used by this backend.
 
 bigint
 ------
     ``BigForeignKey`` for things to work as expected.
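
A brief sketch of how these fields fit together (the model names below are
placeholders; ``BigAutoField`` and ``BigForeignKey`` come from
``sqlserver_ado.fields``):

.. code:: python

    from django.db import models
    from sqlserver_ado.fields import BigAutoField, BigForeignKey

    class Account(models.Model):
        # bigint IDENTITY primary key
        id = BigAutoField(primary_key=True)

    class Transaction(models.Model):
        # BigForeignKey creates a bigint column that matches the
        # BigAutoField it points to.
        account = BigForeignKey(Account)
        amount = models.DecimalField(max_digits=19, decimal_places=4)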
 
 
+money
+-----
+
+The ``money`` and ``smallmoney`` data types will be introspected as 
+``DecimalField`` with the appropriate values for ``max_digits`` and 
+``decimal_places``. This does not mean that they are expected to work without
+issue.
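
As a sketch, ``inspectdb`` output for a table with ``money`` and ``smallmoney``
columns might resemble the following; the exact ``max_digits`` values shown are
assumptions, while ``decimal_places`` is forced to 4 by the introspection code:

.. code:: python

    from django.db import models

    class Invoice(models.Model):
        total = models.DecimalField(max_digits=19, decimal_places=4)  # money
        tax = models.DecimalField(max_digits=10, decimal_places=4)    # smallmoney

        class Meta:
            db_table = 'invoice'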
+
 Unsupported Types
 -----------------
 
 smaller than what Django expects from similar types:
 
 - smalldatetime
-- smallmoney
 - tinyint
 - real

File docs/known-issues.txt

 the database column so it does not include a '%' (percent) character, or change ``DEBUG = False`` 
 in your settings when you run ``manage.py inspectdb``.
 
+.. _introspecting-custom-fields:
+
+Introspecting custom fields
+---------------------------
+
+Some datatypes will be mapped to a custom model field provided by 
+`Django-mssql`. If any of these fields are used, it will be necessary to add 
+``import sqlserver_ado.fields`` to the top of the models.py file. If using a
+version of Django prior to 1.7, it will be necessary to also remove the 
+"models." prefix from any of these custom fields. :djangoticket:`21090`
+
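For example, a model generated by ``inspectdb`` may need to be adjusted along
these lines on Django versions prior to 1.7 (the model and column names below
are placeholders):

.. code:: python

    from django.db import models
    import sqlserver_ado.fields

    class BigTable(models.Model):
        # older versions of inspectdb emit
        # "models.sqlserver_ado.fields.BigAutoField(...)"; drop the
        # "models." prefix so the custom field resolves correctly.
        id = sqlserver_ado.fields.BigAutoField(primary_key=True)
        name = models.CharField(max_length=50)

        class Meta:
            db_table = 'big_table'
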
 .. _azure-clustered-indices:
 
 Azure requires clustered indices

File docs/settings.txt

     SQL server maintains the datatype of the values used in ``AVG``. The
     average of an ``int`` column will be an ``int``. With this option set
     to ``True``, ``AVG([1,2])`` == 1, not 1.5.
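
A quick sketch of the difference, mirroring the aggregate tests added in this
commit (``AmountTable`` is a hypothetical model holding the values 100, 101,
and 102):

.. code:: python

    from django.db.models import Avg

    # with the AVG cast disabled, the result keeps the int data type
    AmountTable.objects.aggregate(Avg('amount'))   # {'amount__avg': 101}

    # with the default behaviour, AVG is cast to float
    AmountTable.objects.aggregate(Avg('amount'))   # {'amount__avg': 101.0}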
+
+.. setting:: use_legacy_date_fields
+
+use_legacy_date_fields
+~~~~~~~~~~~~~~~~~~~~~~
+
+Default: ``True``
+
+This setting alters which data types are used for the ``DateField``, 
+``DateTimeField``, and ``TimeField`` fields. When ``True``, the fields will all 
+use the ``datetime`` data type. When ``False``, they will use ``date``, 
+``datetime``, and ``time`` data types.
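
The option is set in the database ``OPTIONS`` dict, as in the sketch below; the
engine path and connection details are placeholders (see ``tests/dbsettings.py``
for a fuller example):

.. code:: python

    DATABASES = {
        'default': {
            'ENGINE': 'sqlserver_ado',   # or whichever backend module is in use
            'NAME': 'my_database',
            'HOST': r'localhost\sqlexpress',
            'OPTIONS': {
                'use_legacy_date_fields': False,
            },
        },
    }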
+
+
+
+.. versionadded:: 1.4
         'Topic :: Database',
     ],
     zip_safe=False,
-    install_requires=[
-        'six',
-    ],
 )

File sqlserver/ado/__init__.py

-import patches
+from __future__ import absolute_import
+from . import patches

File sqlserver/ado/ado_consts.py

 adVarWChar                    = 0xCA
 adVariant                     = 0xC
 adWChar                       = 0x82
-# Additional constants used by introspection but not ADO itself
-#AUTO_FIELD_MARKER = -1000
 
 adTypeNames = {
     adBSTR: 'adBSTR',

File sqlserver/ado/base.py

 import sys
 from django.db import utils
 from django.core.exceptions import ImproperlyConfigured, ValidationError
-import six
 from . import dbapi as Database
 adodb = Database
 

File sqlserver/ado/dbapi.py

 import time
 import datetime
 import re
-import six
+from django.utils import six
 
 try:
     import decimal
             p.NumericScale = 0
             p.Precision = digit_count + exponent
 
+    elif isinstance(value, datetime.datetime):
+        if getattr(settings, 'USE_TZ', False) and value.tzinfo:
+            value = value.astimezone(timezone.utc)
+        p.Type = adBSTR
+        s = value.isoformat(' ')
+        p.Value = s
+        p.Size = len(s)
+
     elif isinstance(value, datetime.time):
-            if getattr(settings, 'USE_TZ', False) and value.tzinfo:
-                value = value.astimezone(timezone.utc)
-            p.Value = datetime.datetime(1,1,1, value.hour, value.minute, value.second)
-    elif isinstance(value, datetime.datetime):
-            if getattr(settings, 'USE_TZ', False) and value.tzinfo:
-                value = value.astimezone(timezone.utc)
-            p.Value = value
+        if getattr(settings, 'USE_TZ', False) and value.tzinfo:
+            value = value.astimezone(timezone.utc)
+        p.Type = adBSTR
+        s = value.isoformat()
+        p.Value = s
+        p.Size = len(s)
+
+    elif isinstance(value, datetime.date):
+        p.Type = adBSTR
+        s = value.isoformat()
+        p.Value = s
+        p.Size = len(s)
+
     else:
         # For any other type, set the value and let pythoncom do the right thing.
         p.Value = value
     def rollback(self):
         """Abort a pending transaction."""
         self.messages = []
-        if not self.supportsTransactions:
-            self._raiseConnectionError(NotSupportedError, None)
+        #if not self.supportsTransactions:
+        #    self._raiseConnectionError(NotSupportedError, None)
+        with self.cursor() as cursor:
+            cursor.execute("select @@TRANCOUNT")
+            trancount, = cursor.fetchone()
+        if trancount == 0:
+            return
 
         self.adoConn.RollbackTrans()
         if not(self.adoConn.Attributes & adXactAbortRetaining):

File sqlserver/ado/introspection.py

+from __future__ import absolute_import
 from ..introspection import BaseSqlDatabaseIntrospection
-import ado_consts
+
+from . import ado_consts
+
+AUTO_FIELD_MARKER = -1000
+BIG_AUTO_FIELD_MARKER = -1001
+MONEY_FIELD_MARKER = -1002
 
 class DatabaseIntrospection(BaseSqlDatabaseIntrospection):
+    def _get_table_field_type_map(self, cursor, table_name):
+        """
+        Return a dict mapping field name to data type. DB-API cursor description 
+        interprets the date columns as chars.
+        """
+        cursor.execute('SELECT [COLUMN_NAME], [DATA_TYPE] FROM INFORMATION_SCHEMA.COLUMNS WHERE [TABLE_NAME] LIKE \'%s\'' % table_name)
+        results = dict(cursor.fetchall())
+        return results
+
+    def _datatype_to_ado_type(self, datatype):
+        """
+        Map datatype name to ado type.
+        """
+        return {
+            'bigint': ado_consts.adBigInt,
+            'binary': ado_consts.adBinary,
+            'bit': ado_consts.adBoolean,
+            'char': ado_consts.adChar,
+            'date': ado_consts.adDBDate,
+            'datetime': ado_consts.adDBTimeStamp,
+            'datetime2': ado_consts.adDBTimeStamp,
+            'datetimeoffset': ado_consts.adDBTimeStamp,
+            'decimal': ado_consts.adDecimal,
+            'float': ado_consts.adDouble,
+            'image': ado_consts.adVarBinary,
+            'int': ado_consts.adInteger,
+            'money': MONEY_FIELD_MARKER,
+            'numeric': ado_consts.adNumeric,
+            'nchar': ado_consts.adWChar,
+            'ntext': ado_consts.adLongVarWChar,
+            'nvarchar': ado_consts.adVarWChar,
+            'smalldatetime': ado_consts.adDBTimeStamp,
+            'smallint': ado_consts.adSmallInt,
+            'smallmoney': MONEY_FIELD_MARKER,
+            'text': ado_consts.adLongVarChar,
+            'time': ado_consts.adDBTime,
+            'tinyint': ado_consts.adTinyInt,
+            'varbinary': ado_consts.adVarBinary,
+            'varchar': ado_consts.adVarChar,
+        }.get(datatype.lower(), None)
+
+    def get_table_description(self, cursor, table_name, identity_check=True):
+        """Return a description of the table, with DB-API cursor.description interface.
+
+        The 'identity_check' parameter has been added to the function argspec.
+        If set to True, the function will check each of the table's fields for the
+        IDENTITY property (the IDENTITY property is the MSSQL equivalent to an AutoField).
+
+        When a field is found with an IDENTITY property, it is given the custom type code
+        AUTO_FIELD_MARKER (or BIG_AUTO_FIELD_MARKER for bigint identity columns), which is
+        then mapped through the data_types_reverse dict.
+        """
+        table_field_type_map = self._get_table_field_type_map(cursor, table_name)
+
+        cursor.execute("SELECT * FROM [%s] where 1=0" % (table_name))
+        columns = cursor.description
+
+        items = list()
+        for column in columns:
+            column = list(column) # Convert tuple to list
+            # fix data type
+            column[1] = self._datatype_to_ado_type(table_field_type_map.get(column[0]))
+
+            if identity_check and self._is_auto_field(cursor, table_name, column[0]):
+                if column[1] == ado_consts.adBigInt:
+                    column[1] = BIG_AUTO_FIELD_MARKER
+                else:
+                    column[1] = AUTO_FIELD_MARKER
+
+            if column[1] == MONEY_FIELD_MARKER:
+                # force decimal_places=4 to match data type. Cursor description thinks this column is a string
+                column[5] = 4
+            items.append(column)
+        return items
+
     data_types_reverse = {
-        #ado_consts.AUTO_FIELD_MARKER: 'AutoField',
+        AUTO_FIELD_MARKER: 'IntegerField',
+        BIG_AUTO_FIELD_MARKER: 'sqlserver_ado.fields.BigAutoField',
+        MONEY_FIELD_MARKER: 'DecimalField',
         ado_consts.adBoolean: 'BooleanField',
         ado_consts.adChar: 'CharField',
         ado_consts.adWChar: 'CharField',
         ado_consts.adDecimal: 'DecimalField',
         ado_consts.adNumeric: 'DecimalField',
+        ado_consts.adDate: 'DateField',
+        ado_consts.adDBDate: 'DateField',
+        ado_consts.adDBTime: 'TimeField',
         ado_consts.adDBTimeStamp: 'DateTimeField',
         ado_consts.adDouble: 'FloatField',
         ado_consts.adSingle: 'FloatField',
         ado_consts.adInteger: 'IntegerField',
         ado_consts.adBigInt: 'BigIntegerField',
-        ado_consts.adSmallInt: 'IntegerField',
-        ado_consts.adTinyInt: 'IntegerField',
+        ado_consts.adSmallInt: 'SmallIntegerField',
+        ado_consts.adTinyInt: 'SmallIntegerField',
         ado_consts.adVarChar: 'CharField',
         ado_consts.adVarWChar: 'CharField',
         ado_consts.adLongVarWChar: 'TextField',

File sqlserver/base.py

 from django.db.backends import BaseDatabaseWrapper, BaseDatabaseFeatures, BaseDatabaseValidation, BaseDatabaseClient
 from django.db.backends.signals import connection_created
 from django.utils.functional import cached_property
-import six
+from django.utils import six
 import sys
 
 try:
 try:
     from .schema import DatabaseSchemaEditor
 except ImportError:
-    pass
+    DatabaseSchemaEditor = None
 
 
 class DatabaseFeatures(BaseDatabaseFeatures):
     has_bulk_insert = False
 
     supports_timezones = False
-    supports_sequence_reset = False
 
     can_return_id_from_insert = True
 
-    supports_regex_backreferencing = False
-
-    # Disable test modeltests.lookup.tests.LookupTests.test_lookup_date_as_str
-    supports_date_lookup_using_string = False
-
     supports_tablespaces = True
 
     ignores_nulls_in_unique_constraints = False
     allows_group_by_pk = False
     allows_group_by_ordinal = False
     supports_microsecond_precision = False
+    allow_sliced_subqueries = False
+
+    can_introspect_autofield = True
+
     supports_subqueries_in_group_by = False
-    allow_sliced_subqueries = False
+
     uses_savepoints = True
     supports_paramstyle_pyformat = False
     supports_transactions = True
     def __init__(self, *args, **kwargs):
         super(SqlServerBaseWrapper, self).__init__(*args, **kwargs)
 
+        try:
+            self.command_timeout = int(self.settings_dict.get('COMMAND_TIMEOUT', 30))
+        except ValueError:
+            self.command_timeout = 30
+
+        options = self.settings_dict.get('OPTIONS', {})
+        try:
+            self.cast_avg_to_float = not bool(options.get('disable_avg_cast', False))
+        except ValueError:
+            self.cast_avg_to_float = False
+
+        USE_LEGACY_DATE_FIELDS_DEFAULT = True
+        try:
+            self.use_legacy_date_fields = bool(options.get('use_legacy_date_fields', USE_LEGACY_DATE_FIELDS_DEFAULT))
+        except ValueError:
+            self.use_legacy_date_fields = USE_LEGACY_DATE_FIELDS_DEFAULT
+
         self.features = DatabaseFeatures(self)
         autocommit = self.settings_dict["OPTIONS"].get("autocommit", False)
         self.features.uses_autocommit = autocommit
         self.creation = DatabaseCreation(self)
         self.validation = BaseDatabaseValidation(self)
 
-        try:
-            self.command_timeout = int(self.settings_dict.get('COMMAND_TIMEOUT', 30))
-        except ValueError:
-            self.command_timeout = 30
-
-        try:
-            options = self.settings_dict.get('OPTIONS', {})
-            self.cast_avg_to_float = not bool(options.get('disable_avg_cast', False))
-        except ValueError:
-            self.cast_avg_to_float = False
-
-        self.ops.features = self.features
-        self.ops.is_sql2000 = self.is_sql2000
-        self.ops.is_sql2005 = self.is_sql2005
-        self.ops.is_sql2008 = self.is_sql2008
-
     if not hasattr(BaseDatabaseWrapper, '_cursor'):
         # support for django 1.5 and previous
         class CursorWrapper(object):
             def __exit__(self, type, value, traceback):
                 self.cursor.close()
 
-            def execute(self, sql, params = ()):
+            def execute(self, sql, params=()):
                 try:
                     return self.cursor.execute(sql, params)
                 except self.database.IntegrityError as e:
             # only pytds support new sql server date types
             supports_new_date_types = self._is_sql2008_and_up(conn)
             self.features.supports_microsecond_precision = supports_new_date_types
-            if supports_new_date_types:
-                self.creation._patch_for_sql2008_and_up()
+            if not supports_new_date_types:
+                self.creation._enable_legacy_date_fields()
         if self.settings_dict["OPTIONS"].get("allow_nulls_in_unique_constraints", True):
             self.features.ignores_nulls_in_unique_constraints = self._is_sql2008_and_up(conn)
             if self._is_sql2008_and_up(conn):
         pass
 
     def schema_editor(self):
-        "Returns a new instance of this backend's SchemaEditor"
+        """Returns a new instance of this backend's SchemaEditor"""
         return DatabaseSchemaEditor(self)

File sqlserver/compiler.py

+from __future__ import absolute_import
+
+try:
+    from itertools import zip_longest
+except ImportError:
+    from itertools import izip_longest as zip_longest
+
+import django
 from django.db.models.sql import compiler
-import datetime
 import re
 from itertools import chain, repeat
 
-from contextlib import contextmanager
-
 # query_class returns the base class to use for Django queries.
 # The custom 'SqlServerQuery' class derives from django.db.models.sql.query.Query
 # which is passed in as "QueryClass" by Django itself.
 # Pattern used in column aliasing to find sub-select placeholders
 _re_col_placeholder = re.compile(r'\{_placeholder_(\d+)\}')
 
+
 def _break(s, find):
     """Break a string s into the part before the substring to find, 
     and the part including and after the substring."""
     i = s.find(find)
     return s[:i], s[i:]
 
+
 def _get_order_limit_offset(sql):
     return _re_order_limit_offset.search(sql).groups()
-    
+
+
 def _remove_order_limit_offset(sql):
-    return _re_order_limit_offset.sub('',sql).split(None, 1)[1]
+    return _re_order_limit_offset.sub('', sql).split(None, 1)[1]
 
-@contextmanager
-def prevent_ordering_query(compiler_):
-    try:
-        setattr(query, '_mssql_ordering_not_allowed', True)
-        yield
-    finally:
-        delattr(query, '_mssql_ordering_not_allowed')
 
 class SQLCompiler(compiler.SQLCompiler):
+    def __pad_fields_with_aggregates(self, fields):
+        """
+        Fix for Django ticket #21126 backported to Django 1.5-1.5.4
+        """
+        if django.VERSION[:2] == (1, 5) and django.VERSION[2] <= 5:
+            if not bool(self.query.aggregate_select):
+                return fields
+            aggregate_start = len(self.query.extra_select) + len(self.query.select)
+            aggregate_end = aggregate_start + len(self.query.aggregate_select)
+
+            # pad None in to fields for aggregates
+            return fields[:aggregate_start] + [
+                None for x in range(0, aggregate_end - aggregate_start)
+            ] + fields[aggregate_start:]
+        return fields
+
     def resolve_columns(self, row, fields=()):
+        fields = self.__pad_fields_with_aggregates(fields)
+
         # If the results are sliced, the resultset will have an initial 
         # "row number" column. Remove this column before the ORM sees it.
         if getattr(self, '_using_row_number', False):
             row = row[1:]
-       
         values = []
         index_extra_select = len(self.query.extra_select)
-        for value, field in zip(row[index_extra_select:], chain(fields, repeat(None))):
+        for value, field in zip_longest(row[index_extra_select:], fields):
+            # print '\tfield=%s\tvalue=%s' % (repr(field), repr(value))
             if field:
-                if isinstance(value, datetime.datetime):
-                    internal_type = field.get_internal_type()
-                    if internal_type == 'DateField':
-                        value = value.date()
-                    elif internal_type == 'TimeField':
-                        value = value.time()
+                internal_type = field.get_internal_type()
+                if internal_type in self.connection.ops._convert_values_map:
+                    value = self.connection.ops._convert_values_map[internal_type].to_python(value)
             values.append(value)
-
         return row[:index_extra_select] + tuple(values)
 
     def _fix_aggregates(self):
             # The ORDER BY clause is invalid in views, inline functions, 
             # derived tables, subqueries, and common table expressions, 
             # unless TOP or FOR XML is also specified.
-            self.query._mssql_ordering_not_allowed = with_col_aliases
-            result = super(SQLCompiler, self).as_sql(with_limits, with_col_aliases)
-            # remove in case query is every reused
-            delattr(self.query, '_mssql_ordering_not_allowed')            
+            try:
+                setattr(self.query, '_mssql_ordering_not_allowed', with_col_aliases)
+                result = super(SQLCompiler, self).as_sql(with_limits, with_col_aliases)
+            finally:
+                # remove in case the query is ever reused
+                delattr(self.query, '_mssql_ordering_not_allowed')
             return result
 
         raw_sql, fields = super(SQLCompiler, self).as_sql(False, with_col_aliases)
             else:
                 col = x
             f.append('{0}.{1}'.format(inner_table_name, col.strip()))
-        
-        
+
         # inject a subselect to get around OVER requiring ORDER BY to come from FROM
         inner_select = '{fields} FROM ( SELECT {inner} ) AS {inner_as}'.format(
             fields=', '.join(f),

File sqlserver/creation.py

-# This dictionary maps Field objects to their associated Server Server column
-# types, as strings. Column-type strings can contain format strings; they'll
-# be interpolated against the values of Field.__dict__.
+from __future__ import absolute_import
+
 from django.conf import settings
 from django.db.backends.creation import BaseDatabaseCreation, TEST_DATABASE_PREFIX
 from django.db.utils import load_backend
-import sys
-import  six
+from django.utils import six
 
 class DatabaseCreation(BaseDatabaseCreation):
+    # This dictionary maps Field objects to their associated SQL Server column
+    # types, as strings. Column-type strings can contain format strings; they'll
+    # be interpolated against the values of Field.__dict__.
     data_types = {
         'AutoField':                    'int IDENTITY (1, 1)',
         'BigAutoField':                 'bigint IDENTITY (1, 1)',
         'BooleanField':                 'bit',
         'CharField':                    'nvarchar(%(max_length)s)',
         'CommaSeparatedIntegerField':   'nvarchar(%(max_length)s)',
-        'DateField':                    'datetime',
-        'DateTimeField':                'datetime',
+        'DateField':                    'date',
+        'DateTimeField':                'datetime2',
+        'DateTimeOffsetField':          'datetimeoffset',
         'DecimalField':                 'decimal(%(max_digits)s, %(decimal_places)s)',
         'FileField':                    'nvarchar(%(max_length)s)',
         'FilePathField':                'nvarchar(%(max_length)s)',
         'FloatField':                   'double precision',
+        'GenericIPAddressField':        'nvarchar(39)',
         'IntegerField':                 'int',
         'IPAddressField':               'nvarchar(15)',
-        'GenericIPAddressField':        'nvarchar(39)',
+        'LegacyDateField':              'datetime',
+        'LegacyDateTimeField':          'datetime',
+        'LegacyTimeField':              'time',
+        'NewDateField':                 'date',
+        'NewDateTimeField':             'datetime2',
+        'NewTimeField':                 'time',
         'NullBooleanField':             'bit',
         'OneToOneField':                'int',
         'PositiveIntegerField':         'int CHECK ([%(column)s] >= 0)',
         'SlugField':                    'nvarchar(%(max_length)s)',
         'SmallIntegerField':            'smallint',
         'TextField':                    'nvarchar(max)',
-        'TimeField':                    'datetime',
+        'TimeField':                    'time',
         'BinaryField':                  'varbinary(max)',
     }
-    _sql2008_date_types = {
-        'DateField': 'date',
-        'DateTimeField': 'datetime2(6)',
-        'TimeField': 'time(6)',
-        }
 
-    def _patch_for_sql2008_and_up(self):
-        self.data_types.update(self._sql2008_date_types)
+    def __init__(self, *args, **kwargs):
+        super(DatabaseCreation, self).__init__(*args, **kwargs)
+
+        if self.connection.use_legacy_date_fields:
+            self.data_types.update({
+                'DateField': 'datetime',
+                'DateTimeField': 'datetime',
+                'TimeField': 'datetime',
+            })
 
     def _create_master_connection(self):
         """
         try:
             super(DatabaseCreation, self)._create_test_db(verbosity, autoclobber)
             self.install_regex(test_database_name)
+        except Exception as e:
+            if 'Choose a different database name.' in str(e):
+                six.print_('Database "%s" could not be created because it already exists.' % test_database_name)
+            else:
+                raise
         finally:
             # set thing back
             self.connection.close()
 
         try:
             super(DatabaseCreation, self)._destroy_test_db(test_database_name, verbosity)
+        except Exception as e:
+            if 'it is currently in use' in str(e):
+                six.print_('Cannot drop database %s because it is in use' % test_database_name)
+            else:
+                raise
         finally:
             self.connection = old_wrapper
 

File sqlserver/fields.py

 """This module provides SQL Server specific fields for Django models."""
-from django.db.models import AutoField, ForeignKey, BigIntegerField
+import datetime
+from django.db import models
 from django.forms import ValidationError
+from django.utils import timezone
 from django.utils.translation import ugettext_lazy as _
+from django.utils import six
+
 
 __all__ = (
     'BigAutoField',
     'BigForeignKey',
     'BigIntegerField',
+    'DateField',
+    'DateTimeField',
+    'DateTimeOffsetField',
+    'LegacyTimeField',
+    'LegacyDateField',
+    'LegacyDateTimeField',
+    'TimeField',
 )
 
-class BigAutoField(AutoField):
+class BigAutoField(models.AutoField):
     """A bigint IDENTITY field"""
     def get_internal_type(self):
         return "BigAutoField"
             return None
         return int(value)
 
-class BigForeignKey(ForeignKey):
+class BigForeignKey(models.ForeignKey):
     """A ForeignKey field that points to a BigAutoField or BigIntegerField"""
     def db_type(self, connection=None):
         try:
-            return BigIntegerField().db_type(connection=connection)
+            return models.BigIntegerField().db_type(connection=connection)
         except AttributeError:
-            return BigIntegerField().db_type()
+            return models.BigIntegerField().db_type()
+
+BigIntegerField = models.BigIntegerField
+
+def convert_microsoft_date_to_isoformat(value):
+    if isinstance(value, six.string_types):
+        value = value.replace(' +', '+').replace(' -', '-')
+    return value
+
+class DateField(models.DateField):
+    """
+    A DateField backed by a 'date' database field.
+    """
+    def get_internal_type(self):
+        return 'NewDateField'
+
+    def to_python(self, value):
+        val = super(DateField, self).to_python(convert_microsoft_date_to_isoformat(value))
+        if isinstance(val, datetime.datetime):
+            val = val.date()
+        return val
+ 
+class DateTimeField(models.DateTimeField):
+    """
+    A DateTimeField backed by a 'datetime2' database field.
+    """
+    def get_internal_type(self):
+        return 'NewDateTimeField'
+
+    def to_python(self, value):
+        from django.conf import settings
+        result = super(DateTimeField, self).to_python(convert_microsoft_date_to_isoformat(value))
+        if result:
+            if getattr(settings, 'USE_TZ', False):
+                if timezone.is_aware(result):
+                    result = result.astimezone(timezone.utc)
+                else:
+                    result = result.replace(tzinfo=timezone.utc)
+            else:
+                if timezone.is_aware(result):
+                    result = result.astimezone(timezone.utc).replace(tzinfo=None)
+        return result
+
+    def get_db_prep_value(self, value, connection, prepared=False):
+        if not prepared:
+            value = self.get_prep_value(value)
+        return connection.ops.value_to_db_datetime(value)
+
+class DateTimeOffsetField(models.DateTimeField):
+    """
+    A DateTimeOffsetField backed by a 'datetimeoffset' database field.
+    """
+    def get_internal_type(self):
+        return 'DateTimeOffsetField'
+
+    def to_python(self, value):
+        return super(DateTimeOffsetField, self).to_python(convert_microsoft_date_to_isoformat(value))
+
+    def get_db_prep_value(self, value, connection, prepared=False):
+        if not prepared:
+            value = self.get_prep_value(value)
+        if value is None:
+            return None
+        return value.isoformat(' ')
+
+class TimeField(models.TimeField):
+    """
+    A TimeField backed by a 'time' database field.
+    """
+    def get_internal_type(self):
+        return 'NewTimeField'
+
+    def to_python(self, value):
+        return super(TimeField, self).to_python(convert_microsoft_date_to_isoformat(value))
+
+    def get_db_prep_value(self, value, connection, prepared=False):
+        if not prepared:
+            value = self.get_prep_value(value)
+        return connection.ops.value_to_db_time(value)
+
+class LegacyDateField(models.DateField):
+    """
+    A DateField that is backed by a 'datetime' database field.
+    """
+    def get_internal_type(self):
+        return 'LegacyDateField'
+
+    def to_python(self, value):
+        val = super(LegacyDateField, self).to_python(convert_microsoft_date_to_isoformat(value))
+        if isinstance(val, datetime.datetime):
+            val = val.date()
+        return val
+
+class LegacyDateTimeField(models.DateTimeField):
+    """
+    A DateTimeField that is backed by a 'datetime' database field.
+    """
+    def get_internal_type(self):
+        return 'LegacyDateTimeField'
+
+    def to_python(self, value):
+        return super(LegacyDateTimeField, self).to_python(convert_microsoft_date_to_isoformat(value))
+
+    def get_db_prep_value(self, value, connection, prepared=False):
+        if not prepared:
+            value = self.get_prep_value(value)
+        return connection.ops.value_to_db_datetime(value)
+
+class LegacyTimeField(models.TimeField):
+    """
+    A TimeField that is backed by a 'datetime' database field.
+    """
+    def get_internal_type(self):
+        return 'LegacyTimeField'
+
+    def to_python(self, value):
+        val = super(LegacyTimeField, self).to_python(convert_microsoft_date_to_isoformat(value))
+        if isinstance(val, datetime.datetime):
+            val = val.time()
+        return val
+  
+    def get_db_prep_value(self, value, connection, prepared=False):
+        if not prepared:
+            value = self.get_prep_value(value)
+        return connection.ops.value_to_db_time(value)

File sqlserver/introspection.py

+from __future__ import absolute_import
+
 from django.db.backends import BaseDatabaseIntrospection
 
+AUTO_FIELD_MARKER = -1000
+BIG_AUTO_FIELD_MARKER = -1001
+MONEY_FIELD_MARKER = -1002
+
 class BaseSqlDatabaseIntrospection(BaseDatabaseIntrospection):
     def get_table_list(self, cursor):
         "Return a list of table and view names in the current database."
         cursor.execute(sql)
         return cursor.fetchone()[0]
 
+    def _get_table_field_type_map(self, cursor, table_name):
+        """
+        Return a dict mapping field name to data type. DB-API cursor description 
+        interprets the date columns as chars.
+        """
+        cursor.execute('SELECT [COLUMN_NAME], [DATA_TYPE] FROM INFORMATION_SCHEMA.COLUMNS WHERE [TABLE_NAME] LIKE \'%s\'' % table_name)
+        results = dict(cursor.fetchall())
+        return results
+
     def get_table_description(self, cursor, table_name, identity_check=True):
         """Return a description of the table, with DB-API cursor.description interface.
 
         items = list()
         for column in columns:
             column = list(column) # Convert tuple to list
-            #if identity_check and self._is_auto_field(cursor, table_name, column[0]):
-            #    column[1] = 'AUTO_FIELD_MARKER'
             items.append(column)
         return items
 
 	join sys.indexes IX on IX.object_id = T.object_id and IX.index_id = IC.index_id
 where
 	T.name = %s
-	--and (IX.is_unique=1 or IX.is_primary_key=1)
     -- Omit multi-column keys
 	and not exists (
 		select *

File sqlserver/operations.py

+from __future__ import absolute_import
+
 import datetime
-import time
 import django
 import django.core.exceptions
 from django.conf import settings
 
 from django.utils import timezone
 
+from . import fields as mssql_fields
+
+def total_seconds(td):
+    if hasattr(td, 'total_seconds'):
+        return td.total_seconds()
+    else:
+        return td.days * 24 * 60 * 60 + td.seconds
+
+
 class DatabaseOperations(BaseDatabaseOperations):
     compiler_module = "sqlserver.compiler"
 
+    _convert_values_map = {
+        # custom fields
+        'DateTimeOffsetField':  mssql_fields.DateTimeOffsetField(),
+        'LegacyDateField':      mssql_fields.LegacyDateField(),
+        'LegacyDateTimeField':  mssql_fields.LegacyDateTimeField(),
+        'LegacyTimeField':      mssql_fields.LegacyTimeField(),
+        'NewDateField':         mssql_fields.DateField(),
+        'NewDateTimeField':     mssql_fields.DateTimeField(),
+        'NewTimeField':         mssql_fields.TimeField(),
+    }
+
+    def __init__(self, *args, **kwargs):
+        super(DatabaseOperations, self).__init__(*args, **kwargs)
+
+        if self.connection.use_legacy_date_fields:
+            self._convert_values_map.update({
+                'DateField':        self._convert_values_map['LegacyDateField'],
+                'DateTimeField':    self._convert_values_map['LegacyDateTimeField'],
+                'TimeField':        self._convert_values_map['LegacyTimeField'],
+            })
+        else:
+            self._convert_values_map.update({
+                'DateField':        self._convert_values_map['NewDateField'],
+                'DateTimeField':    self._convert_values_map['NewDateTimeField'],
+                'TimeField':        self._convert_values_map['NewTimeField'],
+            })
+
     def cache_key_culling_sql(self):
         return """
             SELECT [cache_key]
                                            "but it isn't installed.")
             tz = pytz.timezone(tzname)
             td = tz.utcoffset(datetime.datetime(2000, 1, 1))
-            total_minutes = td.total_seconds() // 60
+            total_minutes = total_seconds(td) // 60
             hours, minutes = divmod(total_minutes, 60)
             tzoffset = "%+03d:%02d" % (hours, minutes)
             field_name = "CAST(SWITCHOFFSET(TODATETIMEOFFSET(%s, '+00:00'), '%s') AS DATETIME2)" % (field_name, tzoffset)
                                            "but it isn't installed.")
             tz = pytz.timezone(tzname)
             td = tz.utcoffset(datetime.datetime(2000, 1, 1))
-            total_minutes = td.total_seconds() // 60
+            total_minutes = total_seconds(td) // 60
             hours, minutes = divmod(total_minutes, 60)
             tzoffset = "%+03d:%02d" % (hours, minutes)
             field_name = "CAST(SWITCHOFFSET(TODATETIMEOFFSET(%s, '+00:00'), '%s') AS DATETIME2)" % (field_name, tzoffset)
         if value is None:
             return None
 
-        if timezone.is_aware(value):
+        if timezone.is_aware(value):# and not self.connection.features.supports_timezones:
             if getattr(settings, 'USE_TZ', False):
                 value = value.astimezone(timezone.utc).replace(tzinfo=None)
             else:
                 raise ValueError("SQL Server backend does not support timezone-aware datetimes.")
 
-        if not self.features.supports_microsecond_precision:
+        if not self.connection.features.supports_microsecond_precision:
             value = value.replace(microsecond=0)
         return value
 
             return value
 
         if timezone.is_aware(value):
-            raise ValueError("SQL Server backend does not support timezone-aware times.")
+            if not getattr(settings, 'USE_TZ', False) and hasattr(value, 'astimezone'):
+                value = timezone.make_naive(value, timezone.utc)
+            else:
+                raise ValueError("SQL Server backend does not support timezone-aware times.")
 
         # MS SQL 2005 doesn't support microseconds
         #...but it also doesn't really suport bare times
         if value is None:
             return None
 
-        if not self.features.supports_microsecond_precision:
+        if not self.connection.features.supports_microsecond_precision:
             value = value.replace(microsecond=0)
         return value
 
         `value` is an int, containing the looked-up year.
         """
         first = datetime.datetime(value, 1, 1)
-        if self.features.supports_microsecond_precision:
+        if self.connection.features.supports_microsecond_precision:
             second = datetime.datetime(value, 12, 31, 23, 59, 59, 999999)
         else:
             second = datetime.datetime(value, 12, 31, 23, 59, 59, 997)
         all the objects to be inserted.
         """
         return min(len(objs), 1000)
+        
+    def convert_values(self, value, field):
+        """
+        MSSQL needs help with date fields that might come out as strings.
+        """
+        if field:
+            internal_type = field.get_internal_type()
+            if internal_type in self._convert_values_map:
+                value = self._convert_values_map[internal_type].to_python(value)
+            else:
+                value = super(DatabaseOperations, self).convert_values(value, field)
+        return value
 
     def bulk_insert_sql(self, fields, num_values):
         """

File sqlserver/pymssql/base.py

 import sys
-import six
+from django.utils import six
 from django.conf import settings
 from django.db import utils
 from django.db.backends.signals import connection_created

File sqlserver/pytds/base.py

 try:
     import pytds
 except ImportError:
+    pytds = None
     raise Exception('pytds is not available, run pip install python-tds to fix this')
 
-from sqlserver.base import (
-    SqlServerBaseWrapper,
-    )
+from sqlserver.base import SqlServerBaseWrapper
 
 from .introspection import DatabaseIntrospection
 
 Database = pytds
 
+
 def utc_tzinfo_factory(offset):
     if offset != 0:
         raise AssertionError("database connection isn't set to UTC")
     return utc
 
+
 class DatabaseWrapper(SqlServerBaseWrapper):
     Database = pytds
 
             'use_mars': options.get('use_mars', False),
             'load_balancer': options.get('load_balancer', None),
             'use_tz': utc if getattr(settings, 'USE_TZ', False) else None,
-            }
+        }
         return conn_params
 
     def _get_new_connection(self, conn_params):

File sqlserver/pytds/introspection.py

 from ..introspection import BaseSqlDatabaseIntrospection
 import pytds
 
+
 class DatabaseIntrospection(BaseSqlDatabaseIntrospection):
     data_types_reverse = {
         #'AUTO_FIELD_MARKER': 'AutoField',
         pytds.SYBFLT8: 'FloatField',
         pytds.SYBINT4: 'IntegerField',
         pytds.SYBINT8: 'BigIntegerField',
-        pytds.SYBINT2: 'IntegerField',
-        pytds.SYBINT1: 'IntegerField',
+        pytds.SYBINT2: 'SmallIntegerField',
+        pytds.SYBINT1: 'SmallIntegerField',
         pytds.XSYBVARCHAR: 'CharField',
         pytds.XSYBNVARCHAR: 'CharField',
         pytds.SYBTEXT: 'TextField',

File sqlserver/schema.py

-import six
+from django.utils import six
 from django.db.backends.schema import BaseDatabaseSchemaEditor
 from django.db.models.fields.related import ManyToManyField
 

File tests/dbsettings.py

         'OPTIONS' : {
             'provider': 'SQLNCLI10',
             'extra_params': 'DataTypeCompatibility=80;MARS Connection=True;',
+            'use_legacy_date_fields': False,
         },
     }
 }

File tests/test_main/aggregates/models.py

     Avg, Count, Max, Min, StdDev, Sum, Variance)
 
 class AmountTable(models.Model):
-    """Test that basic aggregates work.
-
-    >>> AmountTable(amount=100).save()
-    >>> AmountTable(amount=101).save()
-    >>> AmountTable(amount=102).save()
-    >>> len(list(AmountTable.objects.all()))
-    3
-
-    >>> AmountTable.objects.aggregate(Avg('amount'))
-    {'amount__avg': 101.0}
-
-    >>> AmountTable.objects.aggregate(Max('amount'))
-    {'amount__max': 102}
-
-    >>> AmountTable.objects.aggregate(Min('amount'))
-    {'amount__min': 100}
-
-    >>> AmountTable.objects.aggregate(Sum('amount'))
-    {'amount__sum': 303}
-    """
     amount = models.IntegerField()
 
 class Player(models.Model):
     avatar = models.CharField(max_length=40)
     
 class Bet(models.Model):
-    """Test cross-table aggregates
-    
-    >>> p1 = Player(name="Adam Vandenberg")
-    >>> p1.save()
-    
-    >>> p2 = Player(name="Joe Betsalot")
-    >>> p2.save()
-    
-    >>> GamerCard(player=p1).save()
-    >>> GamerCard(player=p2).save()
-    
-    >>> Bet(player=p1, amount="100.00").save()
-    >>> Bet(player=p1, amount="200.00").save()
-    >>> Bet(player=p1, amount="300.00").save()
-    >>> Bet(player=p1, amount="400.00").save()
-    >>> Bet(player=p1, amount="500.00").save()
-    
-    >>> Bet(player=p2, amount="1000.00").save()
-    >>> Bet(player=p2, amount="2000.00").save()
-    >>> Bet(player=p2, amount="3000.00").save()
-    >>> Bet(player=p2, amount="4000.00").save()
-    >>> Bet(player=p2, amount="5000.00").save()
-    
-    >>> p = list(Player.objects.annotate(Count('bet'), avg_bet=Avg('bet__amount')).order_by('name'))
-    >>> import six
-    
-    >>> six.text_type(p[0].name)
-    u'Adam Vandenberg'
-    >>> p[0].bet__count
-    5
-    >>> p[0].avg_bet
-    300
-    
-    >>> six.text_type(p[1].name)
-    u'Joe Betsalot'
-    >>> p[1].bet__count
-    5
-    >>> p[1].avg_bet
-    3000
-    
-    >>> p = Player.objects.annotate(bets=Count('bet'), avg_bet=Avg('bet__amount')).values()
-    """
     player = models.ForeignKey(Player)
     amount = models.DecimalField(max_digits=10, decimal_places=2)

File tests/test_main/aggregates/tests.py

+from __future__ import absolute_import
+
+from operator import attrgetter
+
+from django.db import connection
+from django.test import TestCase, skipUnlessDBFeature
+
+from .models import *
+
+class BasicAggregateTest(TestCase):
+    def setUp(self):
+        AmountTable.objects.create(amount=100)
+        AmountTable.objects.create(amount=101)
+        AmountTable.objects.create(amount=102)
+
+    def test_avg_disable_avg_cast(self):
+        try:
+            old_val = connection.cast_avg_to_float
+            connection.cast_avg_to_float = False
+
+            self.assertEqual(AmountTable.objects.aggregate(Avg('amount')), {'amount__avg': 101})
+        finally:
+            connection.cast_avg_to_float = old_val
+
+    def test_avg_cast_avg_to_float(self):
+        try:
+            old_val = connection.cast_avg_to_float
+            connection.cast_avg_to_float = True
+
+            self.assertEqual(AmountTable.objects.aggregate(Avg('amount')), {'amount__avg': 101.0})
+        finally:
+            connection.cast_avg_to_float = old_val
+
+    def test_max(self):
+        self.assertEqual(AmountTable.objects.aggregate(Max('amount')), {'amount__max': 102})
+
+    def test_min(self):
+        self.assertEqual(AmountTable.objects.aggregate(Min('amount')), {'amount__min': 100})
+
+    def test_sum(self):
+        self.assertEqual(AmountTable.objects.aggregate(Sum('amount')), {'amount__sum': 303})
+
+class CrossTableAggregateTest(TestCase):
+    def setUp(self):
+        p1 = Player.objects.create(name='player 1')
+        p2 = Player.objects.create(name='player 2')
+
+        GamerCard.objects.create(player=p1)
+        GamerCard.objects.create(player=p2)
+
+        Bet.objects.create(player=p1, amount="100.00")
+        Bet.objects.create(player=p1, amount="200.00")
+        Bet.objects.create(player=p1, amount="300.00")
+        Bet.objects.create(player=p1, amount="400.00")
+        Bet.objects.create(player=p1, amount="500.00")
+
+        Bet.objects.create(player=p2, amount="1000.00")
+        Bet.objects.create(player=p2, amount="2000.00")
+        Bet.objects.create(player=p2, amount="3000.00")
+        Bet.objects.create(player=p2, amount="4000.00")
+        Bet.objects.create(player=p2, amount="5000.00")
+
+
+    def test_cross_table_aggregates(self):
+        p = Player.objects.annotate(Count('bet'), avg_bet=Avg('bet__amount')).order_by('name')
+
+        self.assertEqual('player 1', p[0].name)
+        self.assertEqual(5, p[0].bet__count)
+        self.assertEqual(300, p[0].avg_bet)
+
+        self.assertEqual('player 2', p[1].name)
+        self.assertEqual(5, p[1].bet__count)
+        self.assertEqual(3000, p[1].avg_bet)
+
+    def test_cross_table_aggregates_values(self):
+        p = Player.objects.annotate(bets=Count('bet'), avg_bet=Avg('bet__amount')).values()
+
+        self.assertEqual(2, len(p))
+
+        self.assertEqual(p[0], {'avg_bet': 300.0, 'bets': 5, u'id': 3, 'name': u'player 1'})
+        self.assertEqual(p[1], {'avg_bet': 3000.0, 'bets': 5, u'id': 4, 'name': u'player 2'})

File tests/test_main/regressiontests/models.py

+from __future__ import absolute_import
+
 import datetime
 import decimal
 from django.db import models, IntegrityError
 from django.test import TestCase
 from django.core.paginator import Paginator
-from sqlserver.fields import BigAutoField, BigIntegerField, BigForeignKey
+from sqlserver.fields import *
+
 
 class Bug19Table(models.Model):
     """ A simple model for testing string comparisons.
 
 class StringTable(models.Model):
     name = models.CharField(max_length=50)
+
+
+class LegacyDateTimeTable(models.Model):
+    val = LegacyDateTimeField()
+
+class LegacyDateTable(models.Model):
+    val = LegacyDateField()
+
+class LegacyTimeTable(models.Model):
+    val = LegacyTimeField()
+
+class DateTable(models.Model):
+    val = DateField()
+
+class DateTimeTable(models.Model):
+    val = DateTimeField()
+
+class TimeTable(models.Model):
+    val = TimeField()
+
+class DateTimeOffsetTable(models.Model):
+    val = DateTimeOffsetField()

File tests/test_main/regressiontests/tests.py

 import sys
 import datetime
 import decimal
+from operator import attrgetter
+import time
 from django.core.exceptions import ImproperlyConfigured
 from django.db import models, connection
 from django.utils import unittest
 from django.test import TestCase
 from django.utils.safestring import mark_safe
-import six
-from six.moves import xrange
+from django.utils import six
+from django.utils.six.moves import xrange
 
-from regressiontests.models import Bug69Table1, Bug69Table2, Bug70Table, Bug93Table, IntegerIdTable, StringTable
+from regressiontests.models import *
 
 class Bug38Table(models.Model):
     d = models.DecimalField(max_digits=5, decimal_places=2)
         obj = StringTable(name=mark_safe(u'string'))
         obj.save()
         self.assertEqual(six.text_type(obj.name), six.text_type(StringTable.objects.get(pk=obj.id).name))
+
+class DateTestCase(TestCase):
+    def _test(self, cls, val):
+        cls.objects.create(val=val)
+        self.assertQuerysetEqual(
+            cls.objects.all(),
+            [val],
+            attrgetter('val')
+        )
+
+    def test_legacy_date(self):
+        self._test(LegacyDateTable, datetime.date(1901, 1, 1))
+
+    #def test_legacy_datetime(self):
+    #    self._test(LegacyDateTimeTable, datetime.datetime(1901, 1, 1, 1, 1, 1, 123000))
+
+    def test_legacy_time(self):
+        self._test(LegacyTimeTable, datetime.time(13, 13, 59, 123000))
+
+    def test_new_date(self):
+        self._test(DateTable, datetime.date(2013, 9, 18))
+
+    #def test_new_datetime(self):
+    #    self._test(DateTimeTable, datetime.datetime(2013, 9, 18, 13, 1, 59, 123456))
+
+    def test_new_time(self):
+        self._test(TimeTable, datetime.time(13, 13, 59, 123456))
+
+    def test_datetimeoffset(self):
+        from django.utils import timezone
+        val = timezone.make_aware(datetime.datetime.now(), timezone.LocalTimezone())
+        self._test(DateTimeOffsetTable, val)