Default value doesn't work in SQLAlchemy + PostgreSQL + aiopg + psycopg2
I've found an unexpected behavior in SQLAlchemy. I'm using the following versions:
- SQLAlchemy (0.9.8)
- psycopg2 (2.5.4)
- PostgreSQL (9.3.5)
- aiopg (0.5.1)
This is the table definition for the example:
import asyncio
from aiopg.sa import create_engine
from sqlalchemy import (
    MetaData,
    Column,
    Integer,
    Table,
    String,
)

metadata = MetaData()

users = Table('users', metadata,
              Column('id_user', Integer, primary_key=True, nullable=False),
              Column('name', String(20), unique=True),
              Column('age', Integer, nullable=False, default=0),
              )
If I now execute a simple insert into the table, populating only id_user and name, the age column should be filled in automatically, right? Let's see...
@asyncio.coroutine
def go():
    engine = yield from create_engine('postgresql://USER@localhost/DB')
    data = {'id_user': 1, 'name': 'Jimmy'}
    stmt = users.insert().values(data)
    with (yield from engine) as conn:
        result = yield from conn.execute(stmt)

loop = asyncio.get_event_loop()
loop.run_until_complete(go())
This is the resulting statement and the corresponding error:
INSERT INTO users (id_user, name, age) VALUES (1, 'Jimmy', null);
psycopg2.IntegrityError: null value in column "age" violates not-null constraint
Since the age column was not provided, I was expecting something like this:
INSERT INTO users (id_user, name) VALUES (1, 'Jimmy');
Or, if the default flag actually works, this:
INSERT INTO users (id_user, name, age) VALUES (1, 'Jimmy', 0);
Could you shed some light on this?
Comments (7)
-
reporter -
repo owner - changed status to invalid
works for me:
from sqlalchemy import (
    MetaData, Column, Integer, Table, String, select
)

metadata = MetaData()

users = Table('users', metadata,
              Column('id_user', Integer, primary_key=True, nullable=False),
              Column('name', String(20), unique=True),
              Column('age', Integer, nullable=False, default=0),
              )

data = {'id_user': 1, 'name': 'Jimmy'}
stmt = users.insert().values(data)

from sqlalchemy import create_engine
e = create_engine("postgresql://scott:tiger@localhost/test", echo=True)

with e.begin() as conn:
    metadata.drop_all(conn)
    metadata.create_all(conn)
    conn.execute(stmt)

with e.begin() as conn:
    assert conn.scalar(select([users.c.age])) == 0
output, the "0" is clearly included in the INSERT:
CREATE TABLE users (
    id_user SERIAL NOT NULL,
    name VARCHAR(20),
    age INTEGER NOT NULL,
    PRIMARY KEY (id_user),
    UNIQUE (name)
)

2015-03-06 10:10:37,418 INFO sqlalchemy.engine.base.Engine {}
2015-03-06 10:10:37,422 INFO sqlalchemy.engine.base.Engine INSERT INTO users (id_user, name, age) VALUES (%(id_user)s, %(name)s, %(age)s)
2015-03-06 10:10:37,422 INFO sqlalchemy.engine.base.Engine {'age': 0, 'id_user': 1, 'name': 'Jimmy'}
2015-03-06 10:10:37,422 INFO sqlalchemy.engine.base.Engine COMMIT
2015-03-06 10:10:37,424 INFO sqlalchemy.engine.base.Engine BEGIN (implicit)
2015-03-06 10:10:37,424 INFO sqlalchemy.engine.base.Engine SELECT users.age FROM users
2015-03-06 10:10:37,424 INFO sqlalchemy.engine.base.Engine {}
2015-03-06 10:10:37,424 INFO sqlalchemy.engine.base.Engine COMMIT
if you're still having problems, please provide a reproducing test case that I can run, thanks.
-
reporter - changed title to Default value doesn't work in SQLAlchemy + PostgreSQL + aiopg + psycopg2
- edited description
Sorry Mike, I forgot to mention that I was using aiopg for asynchronous transactions. I've added the full test scenario. Thanks
-
reporter Forget it, Mike! It has been confirmed as an aiopg bug.
-
repo owner that is not at all surprising; aiopg implements an extremely limited subset of Core. I'm kind of surprised how minimal it is. Also it is slower than threads, see my speed tests at http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/. :)
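(Editorial note, not part of the original thread: if the failure mode above is understood correctly, Column(default=0) is a client-side default that SQLAlchemy applies during its own statement-execution phase, so a driver layer that compiles and executes statements itself can skip that step and send NULL, as seen here. A server_default, by contrast, is rendered into the CREATE TABLE DDL and applied by the database no matter which driver performs the INSERT. A minimal sketch of that server-side behavior, using the stdlib sqlite3 module as a stand-in for PostgreSQL:)

```python
import sqlite3

# In-memory database standing in for PostgreSQL.
conn = sqlite3.connect(":memory:")

# The DEFAULT 0 lives in the DDL itself (roughly what SQLAlchemy's
# server_default=text("0") would render), so the database fills in
# the value with no cooperation needed from the driver.
conn.execute(
    "CREATE TABLE users ("
    " id_user INTEGER PRIMARY KEY,"
    " name VARCHAR(20) UNIQUE,"
    " age INTEGER NOT NULL DEFAULT 0)"
)

# The INSERT omits the age column entirely.
conn.execute("INSERT INTO users (id_user, name) VALUES (1, 'Jimmy')")

age = conn.execute("SELECT age FROM users WHERE id_user = 1").fetchone()[0]
print(age)  # the server-side default was applied
```

With a client-side default the driver must participate; with a server default the omitted column simply never appears in the INSERT, which is why this pattern sidesteps the aiopg limitation.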
-
reporter Thanks Mike, the post is actually quite thorough. You just smashed my hopes in the aiopg library; my biggest disappointment since I realized that Santa was the town drunk :'(
The project is still very young, though; I hope it will speed up as time goes on.
-
reporter It's a pity that your blog doesn't accept comments... you could probably get interesting feedback on this topic.