Cannot insert in table when column keys are unicode-coded. (sqlsoup)
I'm using SqlSoup to reflect a PostgreSQL database.
This bug may be related to #821.
The database uses UTF-8 as its default encoding.
My goal is to use soup.table.insert(**dict) to insert a bunch of new rows.
The table looks like this:
Table('horse', MetaData(Engine(postgres://localhost/minitagedb)), Column(u'horse_id',
PGInteger(), table=<horse>, primary_key=True, nullable=False,
server_default=DefaultClause(<sqlalchemy.sql.expression._TextClause object at
0x8c38f8c>, for_update=False)), Column(u'horse_rev', PGInteger(), table=<horse>),
Column(u'horse_color_id', PGInteger(), ForeignKey(u'horse_color.horse_color_id'),
table=<horse>, nullable=False), Column(u'breed', PGString(length=255,
convert_unicode=False, assert_unicode=None), table=<horse>), Column(u'birthdate',
PGDate(), table=<horse>), Column(u'father_id', PGInteger(),
ForeignKey(u'horse.horse_id'), table=<horse>), Column(u'mother_id', PGInteger(),
ForeignKey(u'horse.horse_id'), table=<horse>), Column(u'breeder_id', PGInteger(),
ForeignKey(u'breeder.breeder_id'), table=<horse>), Column(u'birth_origin',
PGString(length=4, convert_unicode=False, assert_unicode=None), table=<horse>),
schema=None)
The problem is that you cannot pass unicode keywords to that ''insert'' method:
(Pdb) print h
{u'birth_origin': None, u'breeder_id': 1, u'horse_id': 1000, u'mother_id': None,
u'breed': 'Dapsens Paint', u'birthdate': datetime.date(1986, 7, 15),
u'father_id': None, u'horse_color_id': 1, u'horse_rev': 1}
(Pdb) soup.horse_revision.insert(**h)
*** TypeError: insert() keywords must be strings
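The TypeError above is Python 2's ** expansion rejecting unicode keys. A minimal sketch of a workaround (assuming all keys are ASCII-only, as stated later in this thread) is to re-encode the keys before expanding; `ascii_keys` is a hypothetical helper, not part of SqlSoup:

```python
def ascii_keys(row):
    # Convert unicode keys to plain str so the dict can be
    # expanded with ** on Python 2 (a no-op on Python 3).
    return dict((str(k), v) for k, v in row.items())

h = {u'horse_id': 1000, u'breed': u'Dapsens Paint'}
# soup.horse.insert(**ascii_keys(h))  # mirrors the call from the report
```

This sidesteps the language limitation without touching SqlSoup, as long as no identifier actually contains non-ASCII characters.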
For now, I work around it like this:
def insert(table, row):
    i = table.insert()
    i.__dict__.update(row)
Then it's trivial to do:
insert(soup.something, dict(id=0, value='a'))
I also tried to insert with the h dict's keys not unicode-encoded, but that does not work either.
Comments (8)
-
repo owner -
Account Deleted The database uses only ASCII identifiers and the model is in English, so the problem is either in the configuration (on my side!?) or a bug in unicode handling.
As I am not yet a SQLAlchemy expert, I don't know how to determine the origin of the problem without reading its code completely. It is either: * in the backend configuration * in SA itself.
All I can see is that the tables may not be loaded correctly and all columns end up with unicode identifiers. Moreover, if I do not apply the patch I gave on
#821, I can't even load some tables. And I repeat, those tables don't have any accented chars. One subsidiary question could be:
Is that normal that a column is reflected like that:
Column(u'COLUMN_NAME', ...),
Or should the normal value be:
Column('COLUMN_NAME', ...),
If the first case is normal, then how do I use soup.TABLE.insert to pass unicode keyword arguments matching the column identifiers?
-
Account Deleted - assigned issue to
-
Account Deleted The problem is with python itself -- you can't pass a unicode keyword argument, so **{dict with unicode keys} errors out.
Not sure what workaround would not make the API super ugly. :(
-
Account Deleted Maybe change the method to take an optional dictionary argument that replaces kwargs, since you can put unicode keys in a dictionary:
def insert(cls, row=None, **kwargs):
    if row is None:
        row = kwargs
    o = cls()
    o.__dict__.update(row)
    return o
-
repo owner - changed title to Cannot insert in table when column keys are unicode-coded. (sqlsoup)
-
repo owner - changed status to wontfix
this is really outside the scope of SqlSoup, which is only a quick and dirty convenience tool. The insert() operation is nothing more than instantiating a mapped class with the ORM, and the limitations are the same as those of using an __init__() method on your mapped object, or setattr() to set attributes. If you're looking to slice and dice dictionaries from result sets into parameters, use the SQL expression language. The latest SQLsoup will accept them via the newly added execute() call.
-
repo owner - removed milestone
Removing milestone: 0.5.xx (automated comment)
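The SQL expression language route suggested above can be sketched as follows. This is a self-contained example, not the reporter's actual schema: it uses an in-memory SQLite database as a stand-in for the PostgreSQL one, and a cut-down "horse" table. Dict parameters, unicode keys included, are passed straight to execute():

```python
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

engine = create_engine("sqlite://")  # in-memory stand-in for the postgres db
metadata = MetaData()
horse = Table(
    "horse", metadata,
    Column("horse_id", Integer, primary_key=True),
    Column("breed", String(255)),
)
metadata.create_all(engine)

# Dict keys may be unicode here; no ** expansion is involved.
row = {u"horse_id": 1000, u"breed": u"Dapsens Paint"}
with engine.begin() as conn:
    conn.execute(horse.insert(), row)
    result = conn.execute(horse.select()).fetchall()
```

Because the parameters travel as a dict rather than keyword arguments, the Python 2 "keywords must be strings" restriction never applies.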
the issue here depends on whether you actually require non-ASCII identifier names or not. If not, then a simple conversion step to ASCII would be needed.