'str' object has no attribute 'read' on Oracle LONG columns

Issue #2620 resolved
Marcin Kasperski created an issue

While performing queries on UnicodeText columns, or on reflected LONG columns, I get the following error (tested on 0.7.8; I first saw it after upgrading to 0.7.8, IIRC from 0.6.1, but I am not sure):

Traceback (most recent call last):
  File "sqlalchemy_long_bug.py", line 35, in <module>
    print item
  File "/home/marcink/.virtualenvs/ias/lib/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-i686.egg/sqlalchemy/engine/base.py", line 2680, in __repr__
    return repr(tuple(self))
  File "/home/marcink/.virtualenvs/ias/lib/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-i686.egg/sqlalchemy/dialects/oracle/cx_oracle.py", line 218, in process
    return value.read()
AttributeError: 'str' object has no attribute 'read'

My platform: Ubuntu 10.04, Python 2.6.5, SQLAlchemy 0.7.8, cx_Oracle 5.1.2, Oracle 10g, NLS_LANG=polish_poland.EE8ISO8859P2 (but NLS_LANG does not seem to matter).

Debugging shows that _LOBMixin tries to call value.read() when the value is in fact already a plain string holding the exact column value. My bet is that there is some disagreement between cx_Oracle and the SQLAlchemy cx_oracle dialect about output types.

Forcing the column type to {{{UnicodeText}}} (instead of relying on reflection) does not help; the same error occurs.

Forcing the column type to {{{String}}} helps.
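That workaround can be sketched without abandoning reflection: SQLAlchemy allows an explicitly passed Column to override a reflected one when `autoload=True` is used. The sketch below omits the engine so it runs offline; against a live database you would add `autoload=True, autoload_with=engine` to the `Table()` call. Table and column names are taken from the example script below.

```python
import sqlalchemy

metadata = sqlalchemy.MetaData()

# Declaring the LONG column explicitly as String keeps the cx_oracle
# dialect from attaching its LOB result processor to it.  With
# reflection, passing this same Column alongside autoload=True
# overrides just that one reflected column.
slb_test = sqlalchemy.Table(
    "slb_test", metadata,
    sqlalchemy.Column("id", sqlalchemy.Integer),
    sqlalchemy.Column("value", sqlalchemy.String),
)
```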

Example script:

# -*- coding: utf-8 -*-

"""
Before running this script, execute the following SQL:

    CREATE TABLE slb_test(id INTEGER, value LONG);

    INSERT INTO slb_test VALUES (1, 'aaaaa');
    INSERT INTO slb_test VALUES (2, 'bbbbbb');
    COMMIT;

and fix DB_URI below.

"""

DB_URI = "oracle://scott:tiger@my.db.machine:5521/DB"

import sqlalchemy
import sqlalchemy.orm

engine = sqlalchemy.create_engine(DB_URI)
metadata = sqlalchemy.MetaData(bind=engine)
Session = sqlalchemy.orm.sessionmaker(bind=engine, autoflush=True, autocommit=True)

slb_table = sqlalchemy.Table("slb_test", metadata, autoload=True, autoload_with=engine)

session = Session()

query = slb_table.select().order_by(slb_table.c.id)
for item in session.execute(query):
    print item
session.close()

Obviously, adding a check for the value type in _LOBMixin would help, but most likely there is some deeper problem underneath.
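A minimal sketch of such a defensive check, as a standalone function (the name and shape are hypothetical, not SQLAlchemy API):

```python
def maybe_read(value):
    # cx_Oracle may hand back either a LOB-like object (with a .read()
    # method) or a plain string, depending on column type and
    # environment; fall back to the value itself when there is
    # nothing to read.
    if hasattr(value, "read"):
        return value.read()
    return value
```

A result processor built on this would tolerate both behaviors instead of failing with AttributeError.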

Comments (9)

  1. Marcin Kasperski reporter

    To be exact: my server is Oracle 10.2.0.5.0 (64bit), my client (and libraries) is Oracle XE 10.2.0 (32 bit).

  2. Mike Bayer repo owner

    will have to test locally to see if the issue is consistent; other users are using LONG, so not sure why they've not reported this.

  3. Mike Bayer repo owner

    also the solution would be to not apply _LOBMixin to LONG, unless there are in fact cases where cx_oracle changes its mind.

  4. Marcin Kasperski reporter

    The problem may be dependent on various environment settings (database encoding, client locale, Oracle client and server versions, etc.). I can only say that I observe it in my current environment (and that slightly defensive programming, like a _MaybeLOB object which .read()'s only after checking the type, would make my scripts somewhat happier).

  5. Marcin Kasperski reporter

    For the sake of experiment, I made a similar test, but this time filled the table with really long strings ("x" * 256000). Now another problem appears, which seems related:

      File "/home/marcink/.virtualenvs/ias/lib/python2.6/site-packages/SQLAlchemy-0.7.8-py2.6-linux-i686.egg/sqlalchemy/engine/base.py", line 3157, in _fetchone_impl
        return self.cursor.fetchone()
    sqlalchemy.exc.DatabaseError: (DatabaseError) column at array pos 0 fetched with error: 1406 None None
    

    Note that ORA-01406 is "fetched column value was truncated" (and IIRC happens when the allocated output buffer is not big enough).

    So maybe the whole problem lies at the other end: for some reason either SQLAlchemy or cx_Oracle improperly describes the result column, allocating normal string output space for this column instead of LOB space.
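    That theory can be probed from the cx_Oracle side: cx_Oracle 5.x lets you install an output type handler on a connection to control how fetch buffers are described. The sketch below is hypothetical (the function names and the 10**6 buffer size are illustrative, not from this issue); it forces an oversized string buffer for LONG columns so the fetch does not fail with ORA-01406.

    ```python
    def make_long_handler(long_string_type):
        # Build a cx_Oracle-style output type handler: for LONG columns,
        # allocate an oversized string variable per row in the fetch
        # array instead of the default (too small) buffer.
        def handler(cursor, name, default_type, size, precision, scale):
            if default_type == long_string_type:
                return cursor.var(long_string_type, 10 ** 6,
                                  arraysize=cursor.arraysize)
            return None  # other columns keep cx_Oracle's default buffers
        return handler

    # Installed (assuming cx_Oracle is importable) roughly as:
    #   connection.outputtypehandler = make_long_handler(cx_Oracle.LONG_STRING)
    ```

    If the truncation disappears with such a handler in place, that would point at the buffer description rather than at _LOBMixin itself.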
