yield_per working with oursql but not with mysqldb

Issue #2928 resolved
Former user created an issue

I have a large query which uses all of my system's memory unless I use yield_per.

While using the mysqldb library in the sqlalchemy connection string and watching top, the query will eat up all the system's memory (and get killed) with or without yield_per. However, when I use oursql, the memory usage stays steady while I use yield_per (as expected).

You can reproduce this by using the mysqldb library with a query such as query(table).yield_per(5) on an extremely large dataset, on a server with a small amount of RAM.
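As a rough, library-agnostic sketch of the difference being reported: a pre-buffering DBAPI materializes the entire result set before the ORM ever sees a row, whereas yield_per only helps when rows can be consumed in small batches. The snippet below uses stdlib sqlite3 (not MySQLdb or oursql), and the helper name `iter_chunked` is made up for illustration; it is not part of any of these libraries.

```python
import sqlite3

# Build a small table standing in for the "extremely large dataset".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t (n) VALUES (?)", [(i,) for i in range(1000)])

cur = conn.execute("SELECT n FROM t ORDER BY n")

# Pre-buffering (what a client-side-buffered DBAPI effectively does):
# cur.fetchall() would hold every row in memory at once.

def iter_chunked(cursor, size=5):
    """Yield rows a few at a time, analogous to Query.yield_per(5):
    only `size` rows are held in Python memory per batch."""
    while True:
        batch = cursor.fetchmany(size)
        if not batch:
            break
        for row in batch:
            yield row

total = sum(n for (n,) in iter_chunked(cur))
print(total)  # 499500
```

The point of the sketch: if the driver itself buffers the whole result set (as MySQLdb does), chunked iteration on top of it cannot reduce peak memory; it only helps when the driver streams rows, as oursql and psycopg2 (with server-side cursors) can.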

Comments (1)

  1. Mike Bayer repo owner

    this is MySQLdb's behavior, not SQLAlchemy's. The docs are pretty clear http://docs.sqlalchemy.org/en/rel_0_9/orm/query.html?highlight=yield_per#sqlalchemy.orm.query.Query.yield_per:

    Also note that while yield_per() will set the stream_results execution option to True, currently this is only understood by psycopg2 dialect which will stream results using server side cursors instead of pre-buffer all rows for this query. Other DBAPIs pre-buffer all rows before making them available.
    

    OurSQL is another DBAPI that has the advantage of being able to stream rows.
