Memory leak in Pickled type with PostgreSQL engine

Issue #857 resolved
Former user created an issue

Hi. I have memory leaks in my Pylons application, which uses SQLAlchemy 0.4.

I think I've found the reason, but I have no idea how to fix it. The memory leak is caused by a list of PostgreSQL warnings which grows without bound. It contains strings with content like 'WARNING: nonstandard escape usage' or something similar.

Attaching a test case which consumes memory. To run it, just set the proper connect string at the beginning of the file.
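
The attached file isn't reproduced here, but a minimal sketch of that kind of test case might look like this (the Sample class and picklecolumn name are taken from the comments below; the connect string and the 0.4-era mapper/sessionmaker calls are assumptions):

    from sqlalchemy import create_engine, MetaData, Table, Column, Integer, PickleType
    from sqlalchemy.orm import mapper, sessionmaker

    # set the proper connect string here
    engine = create_engine('postgres://user:password@localhost/test')
    metadata = MetaData(engine)
    sample_table = Table('sample', metadata,
        Column('id', Integer, primary_key=True),
        Column('picklecolumn', PickleType))
    metadata.create_all()

    class Sample(object):
        pass

    mapper(Sample, sample_table)
    Session = sessionmaker(bind=engine)

    # seed one row whose pickled value is a mutable list
    session = Session()
    sample = Sample()
    sample.picklecolumn = [0]
    session.save(sample)  # 0.4-era API; later versions use session.add()
    session.commit()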

Comments (2)

  1. Mike Bayer repo owner

    How do I see the warnings? (I don't see any.) How do you know they are the source of the leak?

    afaict this is a bug in psycopg2. Logging the gc counts like this:

    import gc
    
    # Session and Sample come from the attached test case
    counter = 0
    while True:
        try:
            session = Session()
            record = session.query(Sample).first()
            # mutate the pickled list so every commit writes the column
            record.picklecolumn[0] = counter
            counter += 1
            session.commit()
            print "NEW OBJECTS:", len(gc.get_objects())
        finally:
            session.clear()
            del session
    

    reveals no growth in the total number of Python objects allocated, so it's not a SQLAlchemy issue. (While we could say "don't issue the warning!", that's not really a fix; warning messages shouldn't leak memory.) But yeah, memory is definitely growing. You should construct a psycopg2-only test case and put a ticket on psycopg2's site.
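
    A minimal sketch of what such a psycopg2-only test case could look like, assuming a statement the server answers with a warning (the DSN and the escape-warning trick below are assumptions; any statement that provokes a server warning will do):

    import psycopg2
    
    conn = psycopg2.connect('dbname=test user=test')  # set your own DSN
    cur = conn.cursor()
    counter = 0
    while True:
        # a backslash in a non-E'' string literal makes PostgreSQL 8.x
        # servers with escape_string_warning enabled emit
        # "WARNING: nonstandard use of escape in a string literal"
        cur.execute(r"SELECT 'a\nb'")
        cur.fetchall()
        counter += 1
        if counter % 1000 == 0:
            # conn.notices is psycopg2's list of server messages; on the
            # psycopg2 of that era it grew without bound (newer releases
            # cap it), so watch its length and the process RSS
            print len(conn.notices)

    If memory keeps growing here with no ORM in the picture, that confirms the leak lives below SQLAlchemy.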

  2. Former user Account Deleted

    I wrote a debug middleware which printed all objects that were constantly growing, and I kept seeing this line:

    (['WARNING:...', 'WARNING:...', ...], 389)
    

    The second number and the number of list items were constantly growing, but I didn't find any lists in the psycopg2 core; there's only a property for the last warning.
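
    For reference, a rough sketch of that kind of debug middleware, assuming it diffs the lengths of live lists between requests (the class name and WSGI wiring here are illustrative, not the original code):

    import gc
    
    class LeakWatch(object):
        # WSGI middleware: report lists that keep growing across requests
        def __init__(self, app):
            self.app = app
            self.seen = {}  # id(list) -> last observed length
    
        def __call__(self, environ, start_response):
            response = self.app(environ, start_response)
            for obj in gc.get_objects():
                if isinstance(obj, list):
                    last = self.seen.get(id(obj), 0)
                    if len(obj) > last:
                        self.seen[id(obj)] = len(obj)
                        # id() reuse makes this heuristic, but steady
                        # growers stand out; only report repeat growth
                        if last:
                            print (obj[:2], len(obj))
            return response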

    As these strings are not objects tracked by the garbage collector, the number returned by gc.get_objects() really does stay constant even while the list grows.
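
    The untracked-strings point is easy to verify in isolation (a standalone sketch; the warning text is just an example):

    import gc
    
    notices = []  # stands in for psycopg2's per-connection warning list
    baseline = len(gc.get_objects())
    for i in xrange(10000):
        notices.append('WARNING: nonstandard use of escape #%d' % i)
    # prints ~0: plain strings are not tracked by the cyclic collector,
    # so they never appear in gc.get_objects()
    print len(gc.get_objects()) - baseline
    print len(notices)  # 10000: the memory is still pinned, though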
