make schema collection members publicly immutable
(This seems to work with postgresql, only issue with oracle.)
Any time an identifier needs quoting, SQLAlchemy seems to consistently omit the table name prefix from the labeled column. This can result in ORA-00918: column ambiguously defined
Example output:
SELECT salespersons.user_name AS salespersons_user_name,
salespersons.departmentid AS salespersons_departmentid,
salespersons.bonusid AS salespersons_bonusid,
salespersons.siteid AS salespersons_siteid,
"group" AS "group",
users_1.password AS users_1_password,
users_1.firstname || ' ' || users_1.lastname AS anon_1,
users_1.user_name AS users_1_user_name,
users_1.firstname AS users_1_firstname,
If you need a script, let me know, but I think I've identified the criteria to recreate this issue.
Thanks, Kent
Comments (21)
-
repo owner -
repo owner - changed milestone to 0.6.5
yup.... sorry, the rule still stands: need a test case. I just took the time and it works fine:
from sqlalchemy import *
from sqlalchemy.dialects import oracle

m = MetaData()
t = Table("t", m,
    Column("x", Integer),
    Column("group", Integer)
)
print t.select().where(t.c.group==5).order_by(t.c.group).compile(dialect=oracle.dialect())

SELECT t.x, t."group"
FROM t
WHERE t."group" = :group_1 ORDER BY t."group"
-
repo owner use_labels=True, also OK:
SELECT t.x AS t_x, t."group" AS t_group
FROM t
WHERE t."group" = :group_1 ORDER BY t."group"
-
Account Deleted Oops, sorry I made you take the time. I assumed so because it seemed to always happen for me. I'll get a script in tomorrow.
-
Account Deleted (Embarrassed at how poor my information was...)
* Please rename this ticket -> "table.columns.extend() doesn't set up column metadata the same as table.append_column()"
* Also, this has nothing to do with Oracle
Using table.append_column() appends a column correctly. Using table.columns.extend() causes the table name to be missing from the rendered SQL.
from sqlalchemy import *
from sqlalchemy.orm import *

m = MetaData()
table_a = Table('a', m,
    Column('id', Integer, primary_key=True)
)

### this works
#table_a.append_column(Column("cola", Integer))
#table_a.append_column(Column("colb", Integer))

### this doesn't
table_a.columns.extend([
    Column("cola", Integer),
    Column("colb", Integer)
])

class A(object):
    pass

mapper(A, table_a)
print Session().query(A)
append_column() output:
SELECT a.id AS a_id, a.cola AS a_cola, a.colb AS a_colb FROM a
columns.extend output:
SELECT a.id AS a_id, cola AS cola, colb AS colb FROM a
''Can columns.extend() be made to invoke append_column() for each item?''
-
Account Deleted The workaround is obviously simple, so no real problem if you don't want to get to this, but semantically, I think
table.columns.extend(col_list)
''should'' do the same as
map(table.append_column, col_list)
, do you agree? -
repo owner Replying to guest:
The workaround is obviously simple, so no real problem if you don't want to get to this, but semantically, I think
table.columns.extend(col_list)
''should'' do the same as map(table.append_column, col_list)
, do you agree?

Perhaps, though I'd prefer there's only, you know, "One Way To Do It". The column collection object used by Table would need some instrumentation to do the "right thing", and it just complicates matters. I.e. table.columns.extend(...) has to fire off some kind of "columns are being added" event, which means that when Table first builds itself, ".columns" has to either be some different collection which it can append to without that event, or things are reorganized so that Table internally uses those "append" events to set up its bookkeeping - in which case append_column() and all that just gets deprecated. Except then, there are all kinds of other collections... table.constraints, column.foreign_keys, etc., all kinds of stuff. I'd rather approach the whole issue holistically - either everything in schema is mutable via direct collection access, or none of it. Due to the resources required for development and testing, I'd lean very strongly towards "none".
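The instrumentation described above can be sketched in plain Python. This is an illustration only, not SQLAlchemy's actual classes; InstrumentedColumnCollection, FakeTable, and on_column_added are hypothetical names. The point is that extend() becomes repeated append(), and each append fires an event back to the owning table so it can do its bookkeeping:

```python
class InstrumentedColumnCollection:
    """A collection that notifies its owner whenever a column is added."""

    def __init__(self, owner):
        self._owner = owner
        self._data = []

    def append(self, col):
        # notify the owner before storing, so it can do its bookkeeping
        self._owner.on_column_added(col)
        self._data.append(col)

    def extend(self, cols):
        # extend() is just repeated append(), so every item fires the event
        for col in cols:
            self.append(col)


class FakeTable:
    """Stand-in for Table; on_column_added() mimics append_column() bookkeeping."""

    def __init__(self, name):
        self.name = name
        self._keyed = {}
        self.columns = InstrumentedColumnCollection(self)

    def on_column_added(self, col):
        # record that this column now belongs to this table
        self._keyed[col] = self


t = FakeTable("a")
t.columns.extend(["cola", "colb"])
print(sorted(t._keyed))  # both columns saw the "added" event
```

As the comment above notes, the cost of this design is that every mutable collection on Table and Column would need the same treatment for the API to stay consistent.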
-
repo owner - changed title to table.columns.extend() doesn't set up column metadata the same as table.append_column()
-
Account Deleted Replying to zzzeek:
Replying to guest:
The workaround is obviously simple, so no real problem if you don't want to get to this, but semantically, I think
table.columns.extend(col_list)
''should'' do the same as map(table.append_column, col_list)
, do you agree?

Perhaps, though I'd prefer there's only, you know, "One Way To Do It". The column collection object used by Table would need some instrumentation to do the "right thing", and it just complicates matters. I.e. table.columns.extend(...) has to fire off some kind of "columns are being added" event, which means that when Table first builds itself, ".columns" has to either be some different collection which it can append to without that event, or things are reorganized so that Table internally uses those "append" events to set up its bookkeeping - in which case append_column() and all that just gets deprecated. Except then, there are all kinds of other collections... table.constraints, column.foreign_keys, etc., all kinds of stuff. I'd rather approach the whole issue holistically - either everything in schema is mutable via direct collection access, or none of it. Due to the resources required for development and testing, I'd lean very strongly towards "none".
There are some very nice use cases for migration and database upgrading that require mutability. (So please don't go all or nothing, though it seems ideal.)
Is there a way, in this case, to just raise NotImplementedError from ColumnCollection.extend()?
I thought this might be a "half hour" type fix because I assumed you could explicitly override the extend functionality, but if that isn't the case, I figured out a nice workaround, just call map(table.append_column, col_list), which seems to work fine...
-
repo owner Yeah I just don't like adding little ad-hoc mutabilities to various collections - it makes the API that much harder to learn. Every commit has to move things towards further consistency.
-
Account Deleted Replying to zzzeek:
I'd rather approach the whole issue holistically - either everything in schema is mutable via direct collection access, or none of it. Due to resources required for development and testing, I'd lean very strongly towards "none".
I misread this. I thought you were saying
"''either everything in schema is mutable, or none of it.''"
but you really said
"''either everything in schema is mutable via direct collection access, or none of it.''"
And I was worried you wanted to do away with the ability to change metadata after its "definition", which is precisely what we need for database migration and upgrading.
But you were only saying not supporting this "''via direct collection access''", correct? I can appreciate that...
-
repo owner yes via direct collection access - since that implies that all collection operations on all collections work as expected in the larger scheme. which is a huge undertaking. hence coarse grained methods like "append_column()" instead.
-
Account Deleted You can go ahead and close this ticket...
-
repo owner - changed component to sql
- changed title to make schema collection members publicly immutable
- changed milestone to blue sky
- marked as enhancement
So, I'll attach here a patch that starts to make the public "columns" collection raise on mutation; of course, a few of the more intricate tests involving generative mutation start to fail... but the idea would be along these lines. This would occur for the columns, primary_key, foreign_keys, and indexes - all public collections, across schema.py and expression.py.
diff -r 611fb77186d7379a4ab8378326cbbb380c6e8cd6 lib/sqlalchemy/schema.py
--- a/lib/sqlalchemy/schema.py	Sat Sep 18 11:40:25 2010 -0400
+++ b/lib/sqlalchemy/schema.py	Sat Sep 18 12:17:57 2010 -0400
@@ -930,7 +930,7 @@
                 nullable = self.nullable, quote=self.quote,
                 _proxies=[self], *fk)
         c.table = selectable
-        selectable.columns.add(c)
+        selectable._columns.add(c)
         if self.primary_key:
             selectable.primary_key.add(c)
         for fn in c._table_events:
diff -r 611fb77186d7379a4ab8378326cbbb380c6e8cd6 lib/sqlalchemy/sql/expression.py
--- a/lib/sqlalchemy/sql/expression.py	Sat Sep 18 11:40:25 2010 -0400
+++ b/lib/sqlalchemy/sql/expression.py	Sat Sep 18 12:17:57 2010 -0400
@@ -1862,7 +1862,7 @@
                 type_=getattr(self, 'type', None))
         co.proxies = [self]
-        selectable.columns[name] = co
+        selectable._columns[name] = co
         return co

     def compare(self, other, use_proxies=False, equivalents=None, **kw):
@@ -1910,6 +1910,7 @@
         return _generated_label('%%(%d %s)s' % (id(self),
                         getattr(self, 'name', 'anon')))

+
 class ColumnCollection(util.OrderedProperties):
     """An ordered dictionary that stores a list of ColumnElement
     instances.
@@ -1925,6 +1926,11 @@

     def __str__(self):
         return repr([str(c) for c in self])
+
+    def _immutable(self):
+        """Return an immutable proxy to this :class:`ColumnCollection`."""
+
+        return FrozenColumnCollection(self)

     def replace(self, column):
         """add the given column to this collection, removing unaliased
@@ -2002,6 +2008,17 @@
         return col in util.column_set(self)

+class FrozenColumnCollection(ColumnCollection):
+    def __init__(self, collection):
+        self.__dict__['_data'] = collection.__dict__['_data']
+
+    @property
+    def _blocked_attribute(obj):
+        raise AttributeError, "A frozendict cannot be modified."
+
+    __delitem__ = __setitem__ = clear = _blocked_attribute
+    pop = popitem = setdefault = extend = update = _blocked_attribute
+
 class ColumnSet(util.ordered_column_set):
     def contains_column(self, col):
         return col in self
@@ -2226,7 +2243,11 @@
         self._export_columns()
         return self._foreign_keys

-    columns = property(attrgetter('_columns'), doc=_columns.__doc__)
+
+    @util.memoized_property
+    def columns(self):
+        return self._columns._immutable()
+
     primary_key = property(attrgetter('_primary_key'),
                            doc=_primary_key.__doc__)
     foreign_keys = property(attrgetter('_foreign_keys'),
@@ -3445,7 +3466,7 @@
             )
         c.proxies = [self]
         if attach:
-            selectable.columns[c.name] = c
+            selectable._columns[c.name] = c
         return c

 class TableClause(_Immutable, FromClause):
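The patch's FrozenColumnCollection idea can be illustrated with a small self-contained sketch in modern Python (FrozenView is a hypothetical name, not the class from the patch): a read-only proxy that shares the underlying storage, so reads always see the live collection, while every mutating method raises.

```python
class FrozenView:
    """Read-only proxy over a dict; reads pass through, mutators raise."""

    def __init__(self, data):
        # share, don't copy: changes made by the owner remain visible
        self._data = data

    def __getitem__(self, key):
        return self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)

    def _blocked(self, *args, **kw):
        raise TypeError("this collection is immutable; "
                        "mutate through the owning object instead")

    # every mutating entry point routes to the same blocker,
    # mirroring the patch's _blocked_attribute trick
    __setitem__ = __delitem__ = _blocked
    clear = pop = popitem = setdefault = update = _blocked


cols = {"id": "Integer", "group": "Integer"}
view = FrozenView(cols)
print(view["group"])       # reads work through the proxy
cols["name"] = "String"    # mutations made by the owner are visible
print(len(view))           # -> 3
```

Sharing `_data` rather than copying it is what makes this cheap enough to hand out from a memoized property, as the patch does.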
-
Account Deleted Darn, I'm feeling I should have kept my mouth shut, because another portion of our project extensively uses table.constraints.add(), table.constraints.remove(), table.columns.add() and table.columns.remove().
It works fine in that setting because it is only transiently removing constraints or columns from the collections while it builds the views that sqlalchemy thinks are tables... long story short: this is a different use case for changing the collections, and it is working perfectly fine as is.
For example, I need to remove a few columns and constraints ''for a short time only'', knowing I'll add them back after building views out of the mutated table. All I want is:
table.columns.remove()
and later:
table.columns.add()
In this case, I don't need/want table.columns.add() to behave like table.append_column(), because that logic was already applied before I removed the columns.
(Ideally, I'd have just made a copy of each table, but this was a work-around since
table.tometadata()
doesn't copy constraints, it only copies columns.)
Anyway, now that I've shot myself in the foot, I'll still be able to invoke
table._columns.add() and table._columns.remove()
(underscored), correct? -
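The "remove for a short time, then add back" pattern described above could be wrapped in a context manager so the restore step can't be forgotten. This is a hypothetical helper sketched over a plain set, not SQLAlchemy's collections; temporarily_removed is an invented name:

```python
from contextlib import contextmanager

@contextmanager
def temporarily_removed(collection, items):
    """Remove items from a set-like collection, restoring them on exit.

    Illustrates the 'remove columns/constraints while building views,
    then add them back' pattern; the finally clause guarantees the
    restore even if the body raises.
    """
    for item in items:
        collection.remove(item)
    try:
        yield collection
    finally:
        for item in items:
            collection.add(item)


cols = {"id", "cola", "colb"}
with temporarily_removed(cols, ["cola", "colb"]) as c:
    print(sorted(c))   # only "id" remains inside the block
print(sorted(cols))    # everything restored afterwards
```

With the collections frozen as proposed, the same helper would have to call the private `_columns.add()`/`_columns.remove()` instead, which is exactly the concern raised here.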
repo owner tometadata() should copy everything. It definitely copies constraints and as of the most recent tip it copies indexes too.
-
repo owner - changed milestone to 0.7.0
attached is another patch that works for the .columns collection and simplifies things slightly, so it's an improvement for 0.7. Related is
#1893
, where I want to stick the "immutable" collection on MetaData too. -
repo owner - changed component to schema
-
repo owner the patch has been adapted to the latest trunk with some tests in http://bitbucket.org/zzzeek/ticket_1917
-
repo owner - changed status to resolved
diff:@2336b1cebfcb2f304e09cbc2a0e8bb3fb3a9ceeb:00a46457c1154975e17109c9e292e5d5bd6b6468
-
repo owner - removed milestone
Removing milestone: 0.7.0 (automated comment)
what's the full SQL, i.e. the FROM clause, etc.?