Commits

Hajime Nakagami committed ecb7bf3 Merge

merge from default

  • Parent commits 2d87677, 73f1123
  • Branches cymysql

Files changed (21)

File doc/build/changelog/changelog_08.rst

 
     .. change::
       :tags: bug, orm
-      :tickets: 2697
-
-      Fixed bug whereby ORM would run the wrong kind of
-      query when refreshing an inheritance-mapped class
-      where the superclass was mapped to a non-Table
-      object, like a custom join() or a select(),
-      running a query that assumed a hierarchy that's
-      mapped to individual Table-per-class.
-
-    .. change::
-      :tags: bug, orm
-
-      Fixed `__repr__()` on mapper property constructs
-      to work before the object is initialized, so
-      that Sphinx builds with recent Sphinx versions
-      can read them.
-
-    .. change::
-      :tags: bug, sql, postgresql
-
-      The _Binary base type now converts values through
-      the bytes() callable when run on Python 3; in particular
-      psycopg2 2.5 with Python 3.3 seems to now be returning
-      the "memoryview" type, so this is converted to bytes
-      before return.
-
-    .. change::
-      :tags: bug, sql
-      :tickets: 2695
-
-      Improvements to Connection auto-invalidation
-      handling.  If a non-disconnect error occurs,
-      but leads to a delayed disconnect error within error
-      handling (happens with MySQL), the disconnect condition
-      is detected.  The Connection can now also be closed
-      when in an invalid state, meaning it will raise "closed"
-      on next usage, and additionally the "close with result"
-      feature will work even if the autorollback in an error
-      handling routine fails and regardless of whether the
-      condition is a disconnect or not.
+      :tickets: 2708
+
+      Improved the behavior of instance management regarding
+      the creation of strong references within the Session;
+      an object will no longer have an internal reference cycle
+      created if it's in the transient state or moves into the
+      detached state - the strong ref is created only when the
+      object is attached to a Session and is removed when the
+      object is detached.  This makes it somewhat safer for an
+      object to have a `__del__()` method, even though this is
+      not recommended, as relationships with backrefs produce
+      cycles too.  A warning has been added when a class with
+      a `__del__()` method is mapped.
 
     .. change::
       :tags: bug, sql
       an anonymous label which isn't generally user-friendly.
 
     .. change::
+      :tags: bug, mysql
+      :pullreq: 54
+
+      Updated a regexp to correctly extract error code on
+      Google App Engine v1.7.5 and newer.  Courtesy
+      Dan Ring.
+
+    .. change::
+      :tags: bug, examples
+
+      Fixed a long-standing bug in the caching example, where
+      the limit/offset parameter values wouldn't be taken into
+      account when computing the cache key.  The
+      _key_from_query() function has been simplified to work
+      directly from the final compiled statement in order to get
+      at both the full statement as well as the fully processed
+      parameter list.
+
+    .. change::
+      :tags: bug, mssql
+      :tickets: 2355
+
+      Part of a longer series of fixes needed for pyodbc+
+      mssql, a CAST to NVARCHAR(max) has been added to the bound
+      parameter for the table name and schema name in all information schema
+      queries to avoid the issue of comparing NVARCHAR to NTEXT,
+      which seems to be rejected by the ODBC driver in some cases,
+      such as FreeTDS (0.91 only?) plus unicode bound parameters being passed.
+      The issue seems to be specific to the SQL Server information
+      schema tables and the workaround is harmless for those cases
+      where the problem doesn't exist in the first place.
+
+    .. change::
+      :tags: bug, sql
+      :tickets: 2691
+
+      Fixed bug where disconnect detect on error would
+      raise an attribute error if the error were being
+      raised after the Connection object had already
+      been closed.
+
+    .. change::
+      :tags: bug, sql
+      :tickets: 2703
+
+      Reworked internal exception raises that emit
+      a rollback() before re-raising, so that the stack
+      trace is captured from sys.exc_info() before entering
+      the rollback.  This ensures the traceback is preserved
+      when using coroutine frameworks which may have switched
+      contexts before the rollback function returns.
+
+    .. change::
+      :tags: bug, orm
+      :tickets: 2697
+
+      Fixed bug whereby ORM would run the wrong kind of
+      query when refreshing an inheritance-mapped class
+      where the superclass was mapped to a non-Table
+      object, like a custom join() or a select(),
+      running a query that assumed a hierarchy that's
+      mapped to individual Table-per-class.
+
+    .. change::
+      :tags: bug, orm
+
+      Fixed `__repr__()` on mapper property constructs
+      to work before the object is initialized, so
+      that Sphinx builds with recent Sphinx versions
+      can read them.
+
+    .. change::
+      :tags: bug, sql, postgresql
+
+      The _Binary base type now converts values through
+      the bytes() callable when run on Python 3; in particular
+      psycopg2 2.5 with Python 3.3 seems to now be returning
+      the "memoryview" type, so this is converted to bytes
+      before return.
+
+    .. change::
+      :tags: bug, sql
+      :tickets: 2695
+
+      Improvements to Connection auto-invalidation
+      handling.  If a non-disconnect error occurs,
+      but leads to a delayed disconnect error within error
+      handling (happens with MySQL), the disconnect condition
+      is detected.  The Connection can now also be closed
+      when in an invalid state, meaning it will raise "closed"
+      on next usage, and additionally the "close with result"
+      feature will work even if the autorollback in an error
+      handling routine fails and regardless of whether the
+      condition is a disconnect or not.
+
+
+    .. change::
       :tags: bug, orm, declarative
       :tickets: 2656
 

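A rough sketch of the Connection auto-invalidation behavior described in the bug/sql entry for ticket 2695 above; the engine URL and the failing statement are placeholders, not part of this changeset:

    from sqlalchemy import create_engine

    engine = create_engine("mysql://scott:tiger@localhost/test")   # hypothetical URL
    conn = engine.connect()
    try:
        # an error here can turn into a delayed disconnect while the
        # error-handling autorollback runs; that condition is now detected
        conn.execute("SELECT some_broken_function()")
    except Exception:
        # a Connection left in an invalid state can now be closed cleanly;
        # any later use raises a "closed" error instead of failing obscurely
        conn.close()
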
File doc/build/dialects/mysql.rst

 
 .. automodule:: sqlalchemy.dialects.mysql.mysqlconnector
 
+cymysql
+------------
+
+.. automodule:: sqlalchemy.dialects.mysql.cymysql
+
 Google App Engine
 -----------------------
 

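The newly documented cymysql dialect is selected through the driver portion of the MySQL URL; a minimal sketch, with placeholder credentials and database name:

    from sqlalchemy import create_engine

    # "cymysql" after the "+" picks the CyMySQL DBAPI
    engine = create_engine("mysql+cymysql://scott:tiger@localhost/test")
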
File examples/dogpile_caching/caching_query.py

 
     """
 
-    v = []
-    def visit_bindparam(bind):
-
-        if bind.key in query._params:
-            value = query._params[bind.key]
-        elif bind.callable:
-            value = bind.callable()
-        else:
-            value = bind.value
-
-        v.append(unicode(value))
-
     stmt = query.statement
-    visitors.traverse(stmt, {}, {'bindparam': visit_bindparam})
+    compiled = stmt.compile()
+    params = compiled.params
 
     # here we return the key as a long string.  our "key mangler"
     # set up with the region will boil it down to an md5.
-    return " ".join([unicode(stmt)] + v)
+    return " ".join(
+                    [unicode(compiled)] +
+                    [unicode(params[k]) for k in sorted(params)])
 
 class FromCache(MapperOption):
     """Specifies that a Query should load results from a cache."""
 
     propagate_to_loaders = True
 
-    def __init__(self, attribute, region="default"):
-        self.region = region
-        self.cls_ = attribute.property.parent.class_
-        self.key = attribute.property.key
-
-    def process_query_conditionally(self, query):
-        if query._current_path:
-            mapper, key = query._current_path[-2:]
-            if issubclass(mapper.class_, self.cls_) and \
-                key == self.key:
-                query._cache_region = self
-
-class RelationshipCache(MapperOption):
-    """Specifies that a Query as called within a "lazy load"
-       should load results from a cache."""
-
-    propagate_to_loaders = True
-
     def __init__(self, attribute, region="default", cache_key=None):
         """Construct a new RelationshipCache.
 

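The reworked _key_from_query() above builds the cache key from the fully compiled statement, so limit/offset and other post-processed parameters are included. A standalone sketch of the same idea; the User mapping is made up for illustration:

    from sqlalchemy import Column, Integer, String
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import Session

    Base = declarative_base()

    class User(Base):
        __tablename__ = 'user'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))

    session = Session()
    query = session.query(User).filter(User.name == 'ed').limit(5).offset(10)

    compiled = query.statement.compile()
    params = compiled.params

    # the key covers the rendered SQL plus every bound value, including
    # the limit/offset values the old bindparam traversal missed
    cache_key = " ".join(
        [unicode(compiled)] +
        [unicode(params[k]) for k in sorted(params)])
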
File lib/sqlalchemy/dialects/mssql/information_schema.py

 
 from ... import Table, MetaData, Column
 from ...types import String, Unicode, Integer, TypeDecorator
+from ... import cast
 
 ischema = MetaData()
 
         # end Py2K
         return value
 
+    def bind_expression(self, bindvalue):
+        return cast(bindvalue, Unicode)
+
 schemata = Table("SCHEMATA", ischema,
     Column("CATALOG_NAME", CoerceUnicode, key="catalog_name"),
     Column("SCHEMA_NAME", CoerceUnicode, key="schema_name"),

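The bind_expression() hook added above wraps every bound parameter of that type in a CAST at compile time, which is how the NVARCHAR-vs-NTEXT comparison is avoided. A generic sketch of the pattern; the class name is illustrative only:

    from sqlalchemy import cast
    from sqlalchemy.types import TypeDecorator, Unicode

    class CastingUnicode(TypeDecorator):
        """CAST bound parameters of this type to the unicode string type."""
        impl = Unicode

        def bind_expression(self, bindvalue):
            # the expression returned here replaces the plain bind
            # parameter in the compiled statement
            return cast(bindvalue, Unicode)
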
File lib/sqlalchemy/dialects/mysql/gaerdbms.py

         return [], opts
 
     def _extract_error_code(self, exception):
-        match = re.compile(r"^(\d+):").match(str(exception))
+        match = re.compile(r"^(\d+):|^\((\d+),").match(str(exception))
         # The rdbms api will wrap then re-raise some types of errors
         # making this regex return no matches.
-        code = match.group(1) if match else None
+        code = match.group(1) or match.group(2) if match else None
         if code:
             return int(code)
 

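The widened regexp above accepts both error-string formats seen from the rdbms API; a quick illustration, with sample messages that only approximate the real driver output:

    import re

    pattern = re.compile(r"^(\d+):|^\((\d+),")

    for msg in ("1045: Access denied",          # older "code: message" form
                "(1045, 'Access denied')"):     # newer tuple-style form
        match = pattern.match(msg)
        code = match.group(1) or match.group(2) if match else None
        print int(code)                         # -> 1045 in both cases
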
File lib/sqlalchemy/engine/base.py

             self.engine.dialect.do_begin(self.connection)
         except Exception, e:
             self._handle_dbapi_exception(e, None, None, None, None)
-            raise
 
     def _rollback_impl(self):
         if self._has_events:
                 self.__transaction = None
             except Exception, e:
                 self._handle_dbapi_exception(e, None, None, None, None)
-                raise
         else:
             self.__transaction = None
 
             self.__transaction = None
         except Exception, e:
             self._handle_dbapi_exception(e, None, None, None, None)
-            raise
 
     def _savepoint_impl(self, name=None):
         if self._has_events:
                                 dialect, self, conn)
         except Exception, e:
             self._handle_dbapi_exception(e, None, None, None, None)
-            raise
 
         ret = ctx._exec_default(default, None)
         if self.should_close_with_result:
             self._handle_dbapi_exception(e,
                         str(statement), parameters,
                         None, None)
-            raise
 
         if context.compiled:
             context.pre_exec()
                                 parameters,
                                 cursor,
                                 context)
-            raise
 
         if self._has_events:
             self.dispatch.after_cursor_execute(self, cursor,
                                 parameters,
                                 cursor,
                                 None)
-            raise
 
     def _safe_close_cursor(self, cursor):
         """Close the given cursor, catching exceptions
                                     cursor,
                                     context):
 
+        exc_info = sys.exc_info()
+
         if not self._is_disconnect:
             self._is_disconnect = isinstance(e, self.dialect.dbapi.Error) and \
                 not self.closed and \
                 self.dialect.is_disconnect(e, self.__connection, cursor)
 
         if self._reentrant_error:
-            # Py3K
-            #raise exc.DBAPIError.instance(statement, parameters, e,
-            #                               self.dialect.dbapi.Error) from e
-            # Py2K
-            raise exc.DBAPIError.instance(statement,
+            util.raise_from_cause(
+                        exc.DBAPIError.instance(statement,
                                             parameters,
                                             e,
-                                            self.dialect.dbapi.Error), \
-                                            None, sys.exc_info()[2]
-            # end Py2K
+                                            self.dialect.dbapi.Error),
+                        exc_info
+                        )
         self._reentrant_error = True
         try:
             # non-DBAPI error - if we already got a context,
                     self._safe_close_cursor(cursor)
                 self._autorollback()
 
-            if not should_wrap:
-                return
+            if should_wrap:
+                util.raise_from_cause(
+                                    exc.DBAPIError.instance(
+                                        statement,
+                                        parameters,
+                                        e,
+                                        self.dialect.dbapi.Error,
+                                        connection_invalidated=self._is_disconnect),
+                                    exc_info
+                                )
 
-            # Py3K
-            #raise exc.DBAPIError.instance(
-            #                        statement,
-            #                        parameters,
-            #                        e,
-            #                        self.dialect.dbapi.Error,
-            #                        connection_invalidated=self._is_disconnect) \
-            #                        from e
-            # Py2K
-            raise exc.DBAPIError.instance(
-                                    statement,
-                                    parameters,
-                                    e,
-                                    self.dialect.dbapi.Error,
-                                    connection_invalidated=self._is_disconnect), \
-                                    None, sys.exc_info()[2]
-            # end Py2K
+            util.reraise(*exc_info)
 
         finally:
             del self._reentrant_error
             trans.commit()
             return ret
         except:
-            trans.rollback()
-            raise
+            with util.safe_reraise():
+                trans.rollback()
 
     def run_callable(self, callable_, *args, **kwargs):
         """Given a callable object or function, execute it, passing
             try:
                 self.commit()
             except:
-                self.rollback()
-                raise
+                with util.safe_reraise():
+                    self.rollback()
         else:
             self.rollback()
 
         try:
             trans = conn.begin()
         except:
-            conn.close()
-            raise
+            with util.safe_reraise():
+                conn.close()
         return Engine._trans_ctx(conn, trans, close_with_result)
 
     def transaction(self, callable_, *args, **kwargs):

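The bare "raise" statements that used to follow each _handle_dbapi_exception() call are gone because the handler now always terminates by re-raising on its own, via util.raise_from_cause() for wrapped DBAPIError instances or util.reraise(*exc_info) otherwise. A condensed sketch of that calling pattern, with a stand-in handler:

    import sys
    from sqlalchemy import util

    def _failing_dbapi_call():
        raise RuntimeError("boom")      # stand-in for a DBAPI-level error

    def _handle_error(exc_info):
        # stand-in for Connection._handle_dbapi_exception(); the real
        # method wraps and re-raises on its own, so callers no longer
        # need a trailing bare "raise"
        util.reraise(*exc_info)

    try:
        _failing_dbapi_call()
    except Exception:
        _handle_error(sys.exc_info())
        # never reached: the handler always re-raises
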
File lib/sqlalchemy/engine/default.py

             except Exception, e:
                 self.root_connection._handle_dbapi_exception(
                                 e, None, None, None, self)
-                raise
         else:
             inputsizes = {}
             for key in self.compiled.bind_names.values():
             except Exception, e:
                 self.root_connection._handle_dbapi_exception(
                                 e, None, None, None, self)
-                raise
 
     def _exec_default(self, default, type_):
         if default.is_sequence:

File lib/sqlalchemy/engine/result.py

         except Exception, e:
             self.connection._handle_dbapi_exception(
                               e, None, None, self.cursor, self.context)
-            raise
 
     @property
     def lastrowid(self):
             self.connection._handle_dbapi_exception(
                                  e, None, None,
                                  self._saved_cursor, self.context)
-            raise
 
     @property
     def returns_rows(self):
             self.connection._handle_dbapi_exception(
                                     e, None, None,
                                     self.cursor, self.context)
-            raise
 
     def fetchmany(self, size=None):
         """Fetch many rows, just like DB-API
             self.connection._handle_dbapi_exception(
                                     e, None, None,
                                     self.cursor, self.context)
-            raise
 
     def fetchone(self):
         """Fetch one row, just like DB-API ``cursor.fetchone()``.
             self.connection._handle_dbapi_exception(
                                     e, None, None,
                                     self.cursor, self.context)
-            raise
 
     def first(self):
         """Fetch the first row and then close the result set unconditionally.
             self.connection._handle_dbapi_exception(
                                     e, None, None,
                                     self.cursor, self.context)
-            raise
 
         try:
             if row is not None:

File lib/sqlalchemy/orm/instrumentation.py

         self.manage()
         self._instrument_init()
 
+        if '__del__' in class_.__dict__:
+            util.warn("__del__() method on class %s will "
+                        "cause unreachable cycles and memory leaks, "
+                        "as SQLAlchemy instrumentation often creates "
+                        "reference cycles.  Please remove this method." %
+                        class_)
+
     dispatch = event.dispatcher(events.InstanceEvents)
 
     @property

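With the check above, mapping a class that defines __del__() now emits a warning; a minimal sketch using a made-up table:

    from sqlalchemy import Column, Integer, MetaData, Table
    from sqlalchemy.orm import mapper

    metadata = MetaData()
    foo = Table('foo', metadata, Column('id', Integer, primary_key=True))

    class Foo(object):
        def __del__(self):
            pass

    # emits SAWarning: __del__() method on class ... will cause
    # unreachable cycles and memory leaks ...
    mapper(Foo, foo)
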
File lib/sqlalchemy/orm/mapper.py

             del self._configure_failed
 
         if not self.non_primary and \
+            self.class_manager is not None and \
             self.class_manager.is_mapped and \
-            self.class_manager.mapper is self:
+                self.class_manager.mapper is self:
             instrumentation.unregister_class(self.class_)
 
     def _configure_pks(self):

File lib/sqlalchemy/orm/session.py

 #
 # This module is part of SQLAlchemy and is released under
 # the MIT License: http://www.opensource.org/licenses/mit-license.php
-
 """Provides the Session class and related utilities."""
 
+from __future__ import with_statement
+
 import weakref
 from .. import util, sql, engine, exc as sa_exc, event
 from ..sql import util as sql_util, expression
                 for t in set(self._connections.values()):
                     t[1].prepare()
             except:
-                self.rollback()
-                raise
+                with util.safe_reraise():
+                    self.rollback()
 
         self._state = PREPARED
 
             try:
                 self.commit()
             except:
-                self.rollback()
-                raise
+                with util.safe_reraise():
+                    self.rollback()
         else:
             self.rollback()
 
 
     def _before_attach(self, state):
         if state.session_id != self.hash_key and \
-            self.dispatch.before_attach:
+                self.dispatch.before_attach:
             self.dispatch.before_attach(self, state.obj())
 
     def _attach(self, state, include_before=False):
         if state.key and \
             state.key in self.identity_map and \
-            not self.identity_map.contains_state(state):
+                not self.identity_map.contains_state(state):
             raise sa_exc.InvalidRequestError("Can't attach instance "
                     "%s; another instance with key %s is already "
                     "present in this session."
 
         if state.session_id != self.hash_key:
             if include_before and \
-                self.dispatch.before_attach:
+                    self.dispatch.before_attach:
                 self.dispatch.before_attach(self, state.obj())
             state.session_id = self.hash_key
+            if state.modified and state._strong_obj is None:
+                state._strong_obj = state.obj()
             if self.dispatch.after_attach:
                 self.dispatch.after_attach(self, state.obj())
 
             transaction.commit()
 
         except:
-            transaction.rollback(_capture_exception=True)
-            raise
+            with util.safe_reraise():
+                transaction.rollback(_capture_exception=True)
 
     def is_modified(self, instance, include_collections=True,
                             passive=True):

File lib/sqlalchemy/orm/state.py

         return bool(self.key)
 
     def _detach(self):
-        self.session_id = None
+        self.session_id = self._strong_obj = None
 
     def _dispose(self):
         self._detach()
             instance_dict.discard(self)
 
         self.callables = {}
-        self.session_id = None
+        self.session_id = self._strong_obj = None
         del self.obj
 
     def obj(self):
         self.expired = state.get('expired', False)
         self.callables = state.get('callables', {})
 
-        if self.modified:
-            self._strong_obj = inst
-
         self.__dict__.update([
             (k, state[k]) for k in (
                 'key', 'load_options',
             modified_set.discard(self)
 
         self.modified = False
+        self._strong_obj = None
 
         self.committed_state.clear()
 
         for key in self.manager:
             impl = self.manager[key].impl
             if impl.accepts_scalar_loader and \
-                (impl.expire_missing or key in dict_):
+                    (impl.expire_missing or key in dict_):
                 self.callables[key] = self
             old = dict_.pop(key, None)
             if impl.collection and old is not None:
 
             self.committed_state[attr.key] = previous
 
-        # the "or not self.modified" is defensive at
-        # this point.  The assertion below is expected
-        # to be True:
         # assert self._strong_obj is None or self.modified
 
-        if self._strong_obj is None or not self.modified:
+        if (self.session_id and self._strong_obj is None) \
+                or not self.modified:
             instance_dict = self._instance_dict()
             if instance_dict:
                 instance_dict._modified.add(self)
 
-            self._strong_obj = self.obj()
-            if self._strong_obj is None:
+            # only create _strong_obj link if attached
+            # to a session
+
+            inst = self.obj()
+            if self.session_id:
+                self._strong_obj = inst
+
+            if inst is None:
                 raise orm_exc.ObjectDereferencedError(
                         "Can't emit change event for attribute '%s' - "
                         "parent object of type %s has been garbage "
         this step if a value was not populated in state.dict.
 
         """
-        class_manager = self.manager
         for key in keys:
             self.committed_state.pop(key, None)
 

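Together with the session.py changes above, the effect is that the strong reference for a dirty object exists only while the object is attached to a Session; a condensed sketch mirroring the new tests further below, with the User mapping assumed from those fixtures:

    from sqlalchemy.orm import Session
    from sqlalchemy.orm.attributes import instance_state

    u1 = User()
    u1.name = 'ed'                                   # transient and modified
    assert instance_state(u1)._strong_obj is None

    sess = Session()
    sess.add(u1)                                     # pending: strong ref created
    assert instance_state(u1)._strong_obj is not None

    sess.expunge(u1)                                 # detached: strong ref removed
    assert instance_state(u1)._strong_obj is None
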
File lib/sqlalchemy/orm/strategies.py

         # to look only for significant columns
         q = orig_query._clone().correlate(None)
 
-        # TODO: why does polymporphic etc. require hardcoding
-        # into _adapt_col_list ?  Does query.add_columns(...) work
-        # with polymorphic loading ?
-        if entity_mapper.isa(leftmost_mapper):
+        # set a real "from" if not present, as this is more
+        # accurate than just going off of the column expression
+        if not q._from_obj and entity_mapper.isa(leftmost_mapper):
             q._set_select_from(entity_mapper)
+
+        # select from the identity columns of the outer
         q._set_entities(q._adapt_col_list(leftmost_attr))
 
         if q._order_by is False:

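The guard above calls _set_select_from() only when the query has no explicit FROM of its own, which the new test_correct_subquery_existingfrom test below exercises. Roughly, a query that already establishes its FROM via join() keeps it in the subquery; sess and the Engineer/Person/Paperwork classes here come from those test fixtures:

    from sqlalchemy.orm import subqueryload

    q = sess.query(Engineer).\
            filter(Engineer.primary_language == 'java').\
            join(Engineer.paperwork).\
            filter(Paperwork.description == "tps report #2").\
            options(subqueryload(Person.paperwork))
    # the subquery for Person.paperwork reuses the joined FROM above
    # instead of overwriting it with select_from(Engineer)
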
File lib/sqlalchemy/util/__init__.py

 
 from .compat import callable, cmp, reduce,  \
     threading, py3k, py3k_warning, jython, pypy, cpython, win32, set_types, \
-    pickle, dottedgetter, parse_qsl, namedtuple, next, WeakSet
+    pickle, dottedgetter, parse_qsl, namedtuple, next, WeakSet, reraise, \
+    raise_from_cause
 
 from ._collections import KeyedTuple, ImmutableContainer, immutabledict, \
     Properties, OrderedProperties, ImmutableProperties, OrderedDict, \
     duck_type_collection, assert_arg_type, symbol, dictlike_iteritems,\
     classproperty, set_creation_order, warn_exception, warn, NoneType,\
     constructor_copy, methods_equivalent, chop_traceback, asint,\
-    generic_repr, counter, PluginLoader, hybridmethod
+    generic_repr, counter, PluginLoader, hybridmethod, safe_reraise
 
 from .deprecations import warn_deprecated, warn_pending_deprecation, \
     deprecated, pending_deprecation

File lib/sqlalchemy/util/compat.py

     def b(s):
         return s
 
+
+if py3k:
+    def reraise(tp, value, tb=None, cause=None):
+        if cause is not None:
+            value.__cause__ = cause
+        if value.__traceback__ is not tb:
+            raise value.with_traceback(tb)
+        raise value
+
+    def raise_from_cause(exception, exc_info):
+        exc_type, exc_value, exc_tb = exc_info
+        reraise(type(exception), exception, tb=exc_tb, cause=exc_value)
+else:
+    exec("def reraise(tp, value, tb=None, cause=None):\n"
+            "    raise tp, value, tb\n")
+
+    def raise_from_cause(exception, exc_info):
+        # not as nice as that of Py3K, but at least preserves
+        # the code line where the issue occurred
+        exc_type, exc_value, exc_tb = exc_info
+        reraise(type(exception), exception, tb=exc_tb)
+
+
+
+

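A small usage sketch of the new compat helpers: on Py3K raise_from_cause() chains the original exception via __cause__, while on Py2K it re-raises with the original traceback. The exception class here is arbitrary:

    import sys
    from sqlalchemy.util import raise_from_cause

    class WrapperError(Exception):
        pass

    try:
        1 / 0
    except ZeroDivisionError:
        exc_info = sys.exc_info()
        # the ZeroDivisionError's traceback is attached to the new
        # exception; on Py3K it also becomes WrapperError.__cause__
        raise_from_cause(WrapperError("wrapped"), exc_info)
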
File lib/sqlalchemy/util/langhelpers.py

 from functools import update_wrapper
 from .. import exc
 import hashlib
+from . import compat
 
 def md5_hex(x):
     # Py3K
     m.update(x)
     return m.hexdigest()
 
+class safe_reraise(object):
+    """Reraise an exception after invoking some
+    handler code.
+
+    Stores the existing exception info before
+    invoking so that it is maintained across a potential
+    coroutine context switch.
+
+    e.g.::
+
+        try:
+            sess.commit()
+        except:
+            with safe_reraise():
+                sess.rollback()
+
+    """
+
+    def __enter__(self):
+        self._exc_info = sys.exc_info()
+
+    def __exit__(self, type_, value, traceback):
+        # see #2703 for notes
+        if type_ is None:
+            exc_type, exc_value, exc_tb = self._exc_info
+            self._exc_info = None   # remove potential circular references
+            compat.reraise(exc_type, exc_value, exc_tb)
+        else:
+            self._exc_info = None   # remove potential circular references
+            compat.reraise(type_, value, traceback)
+
 def decode_slice(slc):
     """decode a slice object as sent to __getitem__.
 

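safe_reraise() captures sys.exc_info() on entry, so the original exception and traceback survive even if the cleanup call switches coroutine context before returning. A standalone sketch; cleanup() stands in for something like trans.rollback():

    from sqlalchemy.util import safe_reraise

    def cleanup():
        # stand-in for trans.rollback(); under a coroutine framework
        # this call might switch context before it returns
        pass

    def do_work():
        raise ValueError("original failure")

    try:
        try:
            do_work()
        except Exception:
            with safe_reraise():
                cleanup()
    except ValueError, err:
        print err       # the original exception, re-raised with its traceback intact
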
File test/dialect/test_mssql.py

                 engine.execute(tbl.delete())
 
 class MonkeyPatchedBinaryTest(fixtures.TestBase):
-    __only_on__ = 'mssql'
+    __only_on__ = 'mssql+pymssql'
 
     def test_unicode(self):
         module = __import__('pymssql')

File test/engine/test_reconnect.py

     def __init__(self, parent):
         self.explode = parent.explode
         self.description = ()
+        self.closed = False
     def execute(self, *args, **kwargs):
         if self.explode == 'execute':
             raise MockDisconnect("Lost the DB connection on execute")
         elif self.explode in ('rollback', 'rollback_no_disconnect'):
             raise MockError(
                 "something broke on execute but we didn't lose the connection")
+        elif args and "select" in args[0]:
+            self.description = [('foo', None, None, None, None, None)]
         else:
             return
+    def fetchall(self):
+        if self.closed:
+            raise MockError("cursor closed")
+        return []
+    def fetchone(self):
+        if self.closed:
+            raise MockError("cursor closed")
+        return None
     def close(self):
-        pass
+        self.closed = True
 
 db, dbapi = None, None
 class MockReconnectTest(fixtures.TestBase):
             conn.execute, select([1])
         )
 
+    def test_check_disconnect_no_cursor(self):
+        conn = db.connect()
+        result = conn.execute("select 1")
+        result.cursor.close()
+        conn.close()
+        assert_raises_message(
+            tsa.exc.DBAPIError,
+            "cursor closed",
+            list, result
+        )
+
 class CursorErrTest(fixtures.TestBase):
 
     def setup(self):

File test/orm/test_instrumentation.py

         # C is not mapped in the current implementation
         assert_raises(sa.orm.exc.UnmappedClassError, class_mapper, C)
 
+    def test_del_warning(self):
+        class A(object):
+            def __del__(self):
+                pass
+
+        assert_raises_message(
+            sa.exc.SAWarning,
+            r"__del__\(\) method on class "
+            "<class 'test.orm.test_instrumentation.A'> will cause "
+            "unreachable cycles and memory leaks, as SQLAlchemy "
+            "instrumentation often creates reference cycles.  "
+            "Please remove this method.",
+            mapper, A, self.fixture()
+        )
 
 class OnLoadTest(fixtures.ORMTest):
     """Check that Events.load is not hit in regular attributes operations."""

File test/orm/test_session.py

             assert sa.orm.attributes.instance_state(a).session_id is None
 
 
+class NoCyclesOnTransientDetachedTest(_fixtures.FixtureTest):
+    """Test the instance_state._strong_obj link that it
+    is present only on persistent/pending objects and never
+    transient/detached.
+
+    """
+    run_inserts = None
+
+    def setup(self):
+        mapper(self.classes.User, self.tables.users)
+
+    def _assert_modified(self, u1):
+        assert sa.orm.attributes.instance_state(u1).modified
+
+    def _assert_not_modified(self, u1):
+        assert not sa.orm.attributes.instance_state(u1).modified
+
+    def _assert_cycle(self, u1):
+        assert sa.orm.attributes.instance_state(u1)._strong_obj is not None
+
+    def _assert_no_cycle(self, u1):
+        assert sa.orm.attributes.instance_state(u1)._strong_obj is None
+
+    def _persistent_fixture(self):
+        User = self.classes.User
+        u1 = User()
+        u1.name = "ed"
+        sess = Session()
+        sess.add(u1)
+        sess.flush()
+        return sess, u1
+
+    def test_transient(self):
+        User = self.classes.User
+        u1 = User()
+        u1.name = 'ed'
+        self._assert_no_cycle(u1)
+        self._assert_modified(u1)
+
+    def test_transient_to_pending(self):
+        User = self.classes.User
+        u1 = User()
+        u1.name = 'ed'
+        self._assert_modified(u1)
+        self._assert_no_cycle(u1)
+        sess = Session()
+        sess.add(u1)
+        self._assert_cycle(u1)
+        sess.flush()
+        self._assert_no_cycle(u1)
+        self._assert_not_modified(u1)
+
+    def test_dirty_persistent_to_detached_via_expunge(self):
+        sess, u1 = self._persistent_fixture()
+        u1.name = 'edchanged'
+        self._assert_cycle(u1)
+        sess.expunge(u1)
+        self._assert_no_cycle(u1)
+
+    def test_dirty_persistent_to_detached_via_close(self):
+        sess, u1 = self._persistent_fixture()
+        u1.name = 'edchanged'
+        self._assert_cycle(u1)
+        sess.close()
+        self._assert_no_cycle(u1)
+
+    def test_clean_persistent_to_detached_via_close(self):
+        sess, u1 = self._persistent_fixture()
+        self._assert_no_cycle(u1)
+        self._assert_not_modified(u1)
+        sess.close()
+        u1.name = 'edchanged'
+        self._assert_modified(u1)
+        self._assert_no_cycle(u1)
+
+    def test_detached_to_dirty_deleted(self):
+        sess, u1 = self._persistent_fixture()
+        sess.expunge(u1)
+        u1.name = 'edchanged'
+        self._assert_no_cycle(u1)
+        sess.delete(u1)
+        self._assert_cycle(u1)
+
+    def test_detached_to_dirty_persistent(self):
+        sess, u1 = self._persistent_fixture()
+        sess.expunge(u1)
+        u1.name = 'edchanged'
+        self._assert_modified(u1)
+        self._assert_no_cycle(u1)
+        sess.add(u1)
+        self._assert_cycle(u1)
+        self._assert_modified(u1)
+
+    def test_detached_to_clean_persistent(self):
+        sess, u1 = self._persistent_fixture()
+        sess.expunge(u1)
+        self._assert_no_cycle(u1)
+        self._assert_not_modified(u1)
+        sess.add(u1)
+        self._assert_no_cycle(u1)
+        self._assert_not_modified(u1)
+
+    def test_move_persistent_clean(self):
+        sess, u1 = self._persistent_fixture()
+        sess.close()
+        s2 = Session()
+        s2.add(u1)
+        self._assert_no_cycle(u1)
+        self._assert_not_modified(u1)
+
+    def test_move_persistent_dirty(self):
+        sess, u1 = self._persistent_fixture()
+        u1.name = 'edchanged'
+        self._assert_cycle(u1)
+        self._assert_modified(u1)
+        sess.close()
+        self._assert_no_cycle(u1)
+        s2 = Session()
+        s2.add(u1)
+        self._assert_cycle(u1)
+        self._assert_modified(u1)
+
+    @testing.requires.predictable_gc
+    def test_move_gc_session_persistent_dirty(self):
+        sess, u1 = self._persistent_fixture()
+        u1.name = 'edchanged'
+        self._assert_cycle(u1)
+        self._assert_modified(u1)
+        del sess
+        gc_collect()
+        self._assert_cycle(u1)
+        s2 = Session()
+        s2.add(u1)
+        self._assert_cycle(u1)
+        self._assert_modified(u1)
+
+    def test_persistent_dirty_to_expired(self):
+        sess, u1 = self._persistent_fixture()
+        u1.name = 'edchanged'
+        self._assert_cycle(u1)
+        self._assert_modified(u1)
+        sess.expire(u1)
+        self._assert_no_cycle(u1)
+        self._assert_not_modified(u1)
 
 class WeakIdentityMapTest(_fixtures.FixtureTest):
     run_inserts = None

File test/orm/test_subquery_relations.py

         sess.add_all([e1, e2])
         sess.flush()
 
-    def test_correct_subquery(self):
+    def test_correct_subquery_nofrom(self):
         sess = create_session()
         # use Person.paperwork here just to give the least
         # amount of context
                 )
         )
 
+    def test_correct_subquery_existingfrom(self):
+        sess = create_session()
+        # use Person.paperwork here just to give the least
+        # amount of context
+        q = sess.query(Engineer).\
+                filter(Engineer.primary_language == 'java').\
+                join(Engineer.paperwork).\
+                filter(Paperwork.description == "tps report #2").\
+                options(subqueryload(Person.paperwork))
+        def go():
+            eq_(q.one().paperwork,
+                    [Paperwork(description="tps report #1"),
+                    Paperwork(description="tps report #2")],
+
+                )
+        self.assert_sql_execution(
+            testing.db,
+            go,
+            CompiledSQL(
+                "SELECT people.person_id AS people_person_id, "
+                "people.name AS people_name, people.type AS people_type, "
+                "engineers.engineer_id AS engineers_engineer_id, "
+                "engineers.primary_language AS engineers_primary_language "
+                "FROM people JOIN engineers "
+                    "ON people.person_id = engineers.engineer_id "
+                    "JOIN paperwork ON people.person_id = paperwork.person_id "
+                "WHERE engineers.primary_language = :primary_language_1 "
+                "AND paperwork.description = :description_1",
+                {"primary_language_1": "java",
+                    "description_1": "tps report #2"}
+            ),
+            CompiledSQL(
+                "SELECT paperwork.paperwork_id AS paperwork_paperwork_id, "
+                "paperwork.description AS paperwork_description, "
+                "paperwork.person_id AS paperwork_person_id, "
+                "anon_1.people_person_id AS anon_1_people_person_id "
+                "FROM (SELECT people.person_id AS people_person_id "
+                "FROM people JOIN engineers ON people.person_id = "
+                "engineers.engineer_id JOIN paperwork "
+                "ON people.person_id = paperwork.person_id "
+                "WHERE engineers.primary_language = :primary_language_1 AND "
+                "paperwork.description = :description_1) AS anon_1 "
+                "JOIN paperwork ON anon_1.people_person_id = "
+                "paperwork.person_id "
+                "ORDER BY anon_1.people_person_id, paperwork.paperwork_id",
+                {"primary_language_1": "java",
+                    "description_1": "tps report #2"}
+            )
+        )
+
+
 
 
 class SelfReferentialTest(fixtures.MappedTest):