41 files changed, 1486 insertions(+), 677 deletions(-)
diff --git a/doc/build/changelog/changelog_07.rst b/doc/build/changelog/changelog_07.rst index 7b28f5f23..092e8af40 100644 --- a/doc/build/changelog/changelog_07.rst +++ b/doc/build/changelog/changelog_07.rst @@ -3,10 +3,44 @@ 0.7 Changelog ============== - + .. changelog:: :version: 0.7.10 - :released: + :released: + + .. change:: + :tags: oracle, bug + :tickets: 2620 + + The Oracle LONG type, while an unbounded text type, does not appear + to use the cx_Oracle.LOB type when result rows are returned, + so the dialect has been repaired to exclude LONG from + having cx_Oracle.LOB filtering applied. + + .. change:: + :tags: oracle, bug + :tickets: 2611 + + Repaired the usage of ``.prepare()`` in conjunction with + cx_Oracle so that a return value of ``False`` will result + in no call to ``connection.commit()``, hence avoiding + "no transaction" errors. Two-phase transactions have + now been shown to work in a rudimental fashion with + SQLAlchemy and cx_oracle, however are subject to caveats + observed with the driver; check the documentation + for details. + + .. change:: + :tags: orm, bug + :tickets: 2624 + + The :class:`.MutableComposite` type did not allow for the + :meth:`.MutableBase.coerce` method to be used, even though + the code seemed to indicate this intent, so this now works + and a brief example is added. As a side-effect, + the mechanics of this event handler have been changed so that + new :class:`.MutableComposite` types no longer add per-type + global event handlers. Also in 0.8.0b2. .. change:: :tags: orm, bug @@ -52,7 +86,7 @@ .. change:: :tags: orm, bug - :tickets: + :tickets: Fixed bug mostly local to new AbstractConcreteBase helper where the "type" @@ -68,7 +102,7 @@ .. change:: :tags: orm, bug - :tickets: + :tickets: A warning is emitted when lazy='dynamic' is combined with uselist=False. This is an @@ -76,7 +110,7 @@ .. 
change:: :tags: orm, bug - :tickets: + :tickets: Fixed bug whereby user error in related-object assignment could cause recursion overflow if the @@ -140,7 +174,7 @@ .. change:: :tags: bug, sql - :tickets: + :tickets: Fixed more un-intuitivenesses in CTEs which prevented referring to a CTE in a union @@ -261,7 +295,7 @@ .. change:: :tags: engine, bug - :tickets: + :tickets: Fixed the repr() of Enum to include the "name" and "native_enum" flags. Helps @@ -294,7 +328,7 @@ .. change:: :tags: sqlite, feature - :tickets: + :tickets: Added support for the localtimestamp() SQL function implemented in SQLite, courtesy @@ -320,7 +354,7 @@ .. change:: :tags: bug, mysql - :tickets: + :tickets: Updated mysqlconnector interface to use updated "client flag" and "charset" APIs, @@ -347,7 +381,7 @@ .. change:: :tags: mssql, bug - :tickets: + :tickets: Fixed bug where reflection of primary key constraint would double up columns if the same constraint/table @@ -385,7 +419,7 @@ .. change:: :tags: orm, feature - :tickets: + :tickets: The 'objects' argument to flush() is no longer deprecated, as some @@ -511,7 +545,7 @@ .. change:: :tags: orm, feature - :tickets: + :tickets: Added new flag to @validates include_removes. When True, collection @@ -548,7 +582,7 @@ .. change:: :tags: orm, bug - :tickets: + :tickets: Fixed bug in relationship comparisons whereby calling unimplemented methods like @@ -558,7 +592,7 @@ .. change:: :tags: bug, sql - :tickets: + :tickets: Removed warning when Index is created with no columns; while this might not be what @@ -568,7 +602,7 @@ .. change:: :tags: feature, sql - :tickets: + :tickets: Added new connection event dbapi_error(). Is called for all DBAPI-level @@ -578,7 +612,7 @@ .. change:: :tags: bug, sql - :tickets: + :tickets: If conn.begin() fails when calling "with engine.begin()", the newly acquired @@ -593,7 +627,7 @@ .. 
change:: :tags: mssql, feature - :tickets: + :tickets: Added interim create_engine flag supports_unicode_binds to PyODBC dialect, @@ -603,7 +637,7 @@ .. change:: :tags: mssql, bug - :tickets: + :tickets: Repaired the use_scope_identity create_engine() flag when using the pyodbc @@ -657,7 +691,7 @@ .. change:: :tags: bug, mysql - :tickets: + :tickets: Fixed bug whereby get_view_names() for "information_schema" schema would fail @@ -724,7 +758,7 @@ .. change:: :tags: orm, feature - :tickets: + :tickets: Added "no_autoflush" context manager to Session, used with with: @@ -797,7 +831,7 @@ .. change:: :tags: orm, bug - :tickets: + :tickets: Improved the "declarative reflection" example to support single-table inheritance, @@ -932,7 +966,7 @@ .. change:: :tags: engine, bug - :tickets: + :tickets: Added execution_options() call to MockConnection (i.e., that used with @@ -952,19 +986,19 @@ .. change:: :tags: engine, feature - :tickets: + :tickets: Added some decent context managers to Engine, Connection: - + with engine.begin() as conn: <work with conn in a transaction> - + and: - + with engine.connect() as conn: <work with conn> - + Both close out the connection when done, commit or rollback transaction with errors on engine.begin(). @@ -1031,7 +1065,7 @@ .. change:: :tags: examples, bug - :tickets: + :tickets: Altered _params_from_query() function in Beaker example to pull bindparams from the @@ -1076,7 +1110,7 @@ .. change:: :tags: orm, feature - :tickets: + :tickets: Added "class_registry" argument to declarative_base(). Allows two or more declarative @@ -1084,7 +1118,7 @@ .. change:: :tags: orm, feature - :tickets: + :tickets: query.filter() accepts multiple criteria which will join via AND, i.e. @@ -1115,14 +1149,14 @@ .. change:: :tags: orm, bug - :tickets: + :tickets: Fixed bug whereby hybrid_property didn't work as a kw arg in any(), has(). .. 
change:: :tags: orm - :tickets: + :tickets: Fixed regression from 0.6 whereby if "load_on_pending" relationship() flag were used @@ -1200,7 +1234,7 @@ .. change:: :tags: feature, sql - :tickets: + :tickets: Added "false()" and "true()" expression constructs to sqlalchemy.sql namespace, though @@ -1283,7 +1317,7 @@ .. change:: :tags: sqlite, bug - :tickets: + :tickets: removed an erroneous "raise" in the SQLite dialect when getting table names @@ -1544,17 +1578,17 @@ polymorphic_on now accepts many new kinds of values: - + - standalone expressions that aren't otherwise mapped - column_property() objects - string names of any column_property() or attribute name of a mapped Column - + The docs include an example using the case() construct, which is likely to be a common constructed used here. and part of - + Standalone expressions in polymorphic_on propagate to single-table inheritance subclasses so that they are used in the @@ -1571,7 +1605,7 @@ .. change:: :tags: orm, feature - :tickets: + :tickets: Added new value for Column autoincrement called "ignore_fk", can be used to force autoincrement @@ -1581,7 +1615,7 @@ .. change:: :tags: orm, bug - :tickets: + :tickets: Fixed bug in get_history() when referring to a composite attribute that has no value; @@ -1662,13 +1696,13 @@ .. change:: :tags: feature, schema - :tickets: + :tickets: Added new support for remote "schemas": .. change:: :tags: schema - :tickets: + :tickets: MetaData() accepts "schema" and "quote_schema" arguments, which will be applied to the same-named @@ -1678,13 +1712,13 @@ .. change:: :tags: schema - :tickets: + :tickets: Sequence accepts "quote_schema" argument .. change:: :tags: schema - :tickets: + :tickets: tometadata() for Table will use the "schema" of the incoming MetaData for the new Table @@ -1692,7 +1726,7 @@ .. change:: :tags: schema - :tickets: + :tickets: Added CreateSchema and DropSchema DDL constructs - these accept just the string @@ -1700,7 +1734,7 @@ .. 
change:: :tags: schema - :tickets: + :tickets: When using default "schema" with MetaData, ForeignKey will also assume the "default" schema @@ -1734,7 +1768,7 @@ .. change:: :tags: bug, schema - :tickets: + :tickets: Fixed bug whereby TypeDecorator would return a stale value for _type_affinity, when @@ -1743,7 +1777,7 @@ .. change:: :tags: bug, schema - :tickets: + :tickets: Fixed bug whereby "order_by='foreign_key'" option to Inspector.get_table_names @@ -1781,7 +1815,7 @@ .. change:: :tags: postgresql, feature - :tickets: + :tickets: Added create_type constructor argument to pg.ENUM. When False, no CREATE/DROP or @@ -1836,14 +1870,14 @@ .. change:: :tags: bug, mysql - :tickets: + :tickets: Unicode adjustments allow latest pymysql (post 0.4) to pass 100% on Python 2. .. change:: :tags: ext, feature - :tickets: + :tickets: Added an example to the hybrid docs of a "transformer" - a hybrid that returns a @@ -1855,7 +1889,7 @@ .. change:: :tags: ext, bug - :tickets: + :tickets: the @compiles decorator raises an informative error message when no "default" @@ -1864,7 +1898,7 @@ .. change:: :tags: examples, bug - :tickets: + :tickets: Fixed bug in history_meta.py example where the "unique" flag was not removed from a @@ -1916,7 +1950,7 @@ .. change:: :tags: orm - :tickets: + :tickets: added "adapt_on_names" boolean flag to orm.aliased() construct. Allows an aliased() construct @@ -1927,7 +1961,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Added new flag expire_on_flush=False to column_property(), marks those properties that would otherwise be considered @@ -1959,7 +1993,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Fixed a variety of synonym()-related regressions from 0.6: @@ -2013,7 +2047,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Query will convert an OFFSET of zero when slicing into None, so that needless OFFSET @@ -2021,7 +2055,7 @@ .. 
change:: :tags: orm - :tickets: + :tickets: Repaired edge case where mapper would fail to fully update internal state when a relationship @@ -2034,7 +2068,7 @@ Fixed bug whereby if __eq__() was redefined, a relationship many-to-one lazyload - would hit the __eq__() and fail. + would hit the __eq__() and fail. Does not apply to 0.6.9. .. change:: @@ -2048,7 +2082,7 @@ .. change:: :tags: orm - :tickets: + :tickets: New event hook, MapperEvents.after_configured(). Called after a configure() step has completed and @@ -2183,7 +2217,7 @@ .. change:: :tags: engine - :tickets: + :tickets: Added optional "sa_pool_key" argument to pool.manage(dbapi).connect() so that serialization @@ -2222,7 +2256,7 @@ .. change:: :tags: sqlite - :tickets: + :tickets: Ensured that the same ValueError is raised for illegal date/time/datetime string parsed from @@ -2265,7 +2299,7 @@ .. change:: :tags: postgresql - :tickets: + :tickets: Use an atomic counter as the "random number" source for server side cursor names; @@ -2395,7 +2429,7 @@ Adjusted dictlike-polymorphic.py example to apply the CAST such that it works on - PG, other databases. + PG, other databases. Also in 0.6.9. .. changelog:: @@ -2450,7 +2484,7 @@ from a joined-inh structure to itself on relationship() with join condition on the child table would convert the lead entity into the - joined one inappropriately. + joined one inappropriately. Also in 0.6.9. .. change:: @@ -2495,7 +2529,7 @@ sorting of persistent + pending objects during flush would produce an illegal comparison, if the persistent object primary key - is not a single integer. + is not a single integer. Also in 0.6.9 .. change:: @@ -2505,7 +2539,7 @@ Fixed bug whereby the source clause used by query.join() would be inconsistent if against a column expression that combined - multiple entities together. + multiple entities together. Also in 0.6.9 .. 
change:: @@ -2518,7 +2552,7 @@ as SQLA should never consult these, the methods would be consulted if the class was part of a "composite" (i.e. non-single-entity) - result set. + result set. Also in 0.6.9. .. change:: @@ -2553,7 +2587,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Added the same "columns-only" check to mapper.polymorphic_on as used when @@ -2612,7 +2646,7 @@ .. change:: :tags: schema - :tickets: + :tickets: Added an informative error message when ForeignKeyConstraint refers to a column name in @@ -2630,7 +2664,7 @@ .. change:: :tags: schema - :tickets: + :tickets: Fixed bug where "autoincrement" detection on Table would fail if the type had no "affinity" @@ -2649,7 +2683,7 @@ .. change:: :tags: engine - :tickets: + :tickets: Context manager provided by Connection.begin() will issue rollback() if the commit() fails, @@ -2664,7 +2698,7 @@ .. change:: :tags: engine - :tickets: + :tickets: Added mixin class sqlalchemy.ext.DontWrapMixin. User-defined exceptions of this type are never @@ -2674,7 +2708,7 @@ .. change:: :tags: engine - :tickets: + :tickets: StatementException wrapping will display the original exception class in the message. @@ -2719,7 +2753,7 @@ .. change:: :tags: mssql - :tickets: + :tickets: Adjusted the pyodbc dialect such that bound values are passed as bytes and not unicode @@ -2749,7 +2783,7 @@ :tickets: 2220 repaired the oracle.RAW type which did not - generate the correct DDL. + generate the correct DDL. Also in 0.6.9. .. change:: @@ -2760,7 +2794,7 @@ .. change:: :tags: oracle - :tickets: + :tickets: Fixed bug in the mutable extension whereby if the same type were used twice in one @@ -2769,7 +2803,7 @@ .. change:: :tags: oracle - :tickets: + :tickets: Fixed bug in the mutable extension whereby if None or a non-corresponding type were set, @@ -2779,7 +2813,7 @@ .. change:: :tags: examples - :tickets: + :tickets: Repaired the examples/versioning test runner to not rely upon SQLAlchemy test libs, @@ -2789,7 +2823,7 @@ .. 
change:: :tags: examples - :tickets: + :tickets: Tweak to examples/versioning to pick the correct foreign key in a multi-level @@ -2797,7 +2831,7 @@ .. change:: :tags: examples - :tickets: + :tickets: Fixed the attribute shard example to check for bind param callable correctly in 0.7 @@ -2859,7 +2893,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Fixed bug whereby metadata.reflect(bind) would close a Connection passed as a @@ -2867,7 +2901,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Streamlined the process by which a Select determines what's in it's '.c' collection. @@ -2879,7 +2913,7 @@ .. change:: :tags: engine - :tickets: + :tickets: Deprecate schema/SQL-oriented methods on Connection/Engine that were never well known @@ -2921,7 +2955,7 @@ .. change:: :tags: mysql - :tickets: + :tickets: Unit tests pass 100% on MySQL installed on windows. @@ -2947,7 +2981,7 @@ .. change:: :tags: mysql - :tickets: + :tickets: supports_sane_rowcount will be set to False if using MySQLdb and the DBAPI doesn't provide @@ -2958,8 +2992,8 @@ :released: Fri May 20 2011 .. change:: - :tags: - :tickets: + :tags: + :tickets: This section documents those changes from 0.7b4 to 0.7.0. For an overview of what's new in @@ -3007,7 +3041,7 @@ for joined-inh subclass related to itself, or joined-inh subclass related to a subclass of that with no cols in the sub-sub class - in the join condition. + in the join condition. Also in 0.6.8. .. change:: @@ -3045,7 +3079,7 @@ .. change:: :tags: orm - :tickets: + :tickets: added Query.with_session() method, switches Query to use a different session. @@ -3088,7 +3122,7 @@ .. change:: :tags: orm - :tickets: + :tickets: polymorphic_union() renders the columns in their original table order, as according to the first @@ -3102,7 +3136,7 @@ Fixed bug whereby mapper mapped to an anonymous alias would fail if logging were used, due to - unescaped % sign in the alias name. + unescaped % sign in the alias name. Also in 0.6.8. .. 
change:: @@ -3118,7 +3152,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Changed the handling in determination of join conditions such that foreign key errors are @@ -3131,7 +3165,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Some improvements to error handling inside of the execute procedure to ensure auto-close @@ -3140,7 +3174,7 @@ .. change:: :tags: sql - :tickets: + :tickets: metadata.reflect() and reflection.Inspector() had some reliance on GC to close connections @@ -3165,7 +3199,7 @@ .. change:: :tags: postgresql - :tickets: + :tickets: Fixed the psycopg2_version parsing in the psycopg2 dialect. @@ -3198,7 +3232,7 @@ .. change:: :tags: examples - :tickets: + :tickets: removed the ancient "polymorphic association" examples and replaced with an updated set of @@ -3220,7 +3254,7 @@ .. change:: :tags: general - :tickets: + :tickets: Changes to the format of CHANGES, this file. The format changes have been applied to @@ -3228,7 +3262,7 @@ .. change:: :tags: general - :tickets: + :tickets: The "-declarative" changes will now be listed directly under the "-orm" section, as these @@ -3236,7 +3270,7 @@ .. change:: :tags: general - :tickets: + :tickets: The 0.5 series changes have been moved to the file CHANGES_PRE_06 which replaces @@ -3244,7 +3278,7 @@ .. change:: :tags: general - :tickets: + :tickets: The changelog for 0.6.7 and subsequent within the 0.6 series is now listed only in the @@ -3279,7 +3313,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Still more wording adjustments when a query option can't find the target entity. Explain that the @@ -3294,7 +3328,7 @@ the back-referenced collection wouldn't properly handle add/removes with no net change. Thanks to Richard Murri for the - test case + patch. + test case + patch. (also in 0.6.7). .. change:: @@ -3352,7 +3386,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Restored the "catchall" constructor on the base TypeEngine class, with a deprecation warning. 
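The "catchall constructor ... with a deprecation warning" entry above describes a common backwards-compatibility pattern. A minimal pure-Python sketch of that pattern (illustrative only; the class body here is hypothetical, not SQLAlchemy's actual implementation):

```python
import warnings

class TypeEngine(object):
    """Sketch of a "catchall" constructor: unknown keyword arguments
    are tolerated for backwards compatibility, but each use emits a
    DeprecationWarning.  (Illustrative only, not SQLAlchemy's code.)"""

    def __init__(self, **kwargs):
        if kwargs:
            warnings.warn(
                "Passing arguments %r to TypeEngine() is deprecated" %
                sorted(kwargs),
                DeprecationWarning,
                stacklevel=2,
            )

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    TypeEngine(length=50)          # old-style kwarg still accepted...

assert len(caught) == 1            # ...but it warned exactly once
assert caught[0].category is DeprecationWarning
```

This keeps old call sites working for a release cycle while steering users toward the supported signature.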
@@ -3375,12 +3409,12 @@ The limit/offset keywords to select() as well as the value passed to select.limit()/offset() - will be coerced to integer. + will be coerced to integer. (also in 0.6.7) .. change:: :tags: sql - :tickets: + :tickets: fixed bug where "from" clause gathering from an over() clause would be an itertools.chain() and @@ -3465,7 +3499,7 @@ .. change:: :tags: pool - :tickets: + :tickets: The "pool.manage" feature doesn't use pickle anymore to hash the arguments for each pool. @@ -3476,12 +3510,12 @@ Fixed bug where reflection of foreign key created as "REFERENCES <tablename>" without - col name would fail. + col name would fail. (also in 0.6.7) .. change:: :tags: postgresql - :tickets: + :tickets: Psycopg2 for Python 3 is now supported. @@ -3532,7 +3566,7 @@ .. change:: :tags: general - :tickets: + :tickets: Lots of fixes to unit tests when run under Pypy (courtesy Alex Gaynor). @@ -3543,11 +3577,11 @@ Changed the underlying approach to query.count(). query.count() is now in all cases exactly: - + query. from_self(func.count(literal_column('1'))). scalar() - + That is, "select count(1) from (<full query>)". This produces a subquery in all cases, but vastly simplifies all the guessing count() @@ -3619,7 +3653,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Added a fully descriptive error message for the case where Column is subclassed and _make_proxy() @@ -3640,7 +3674,7 @@ .. change:: :tags: sql - :tickets: + :tickets: To help with the "column_reflect" event being used with specific Table objects instead of all instances @@ -3703,7 +3737,7 @@ :tickets: 2073 Fixed the BIT type to allow a "length" parameter, "varying" - parameter. Reflection also fixed. + parameter. Reflection also fixed. (also in 0.6.7) .. change:: @@ -3714,7 +3748,7 @@ typically when using the Inspector interface, to use sys.sql_modules instead of the information schema, thereby allowing views definitions longer than 4000 - characters to be fully returned. 
+ characters to be fully returned. (also in 0.6.7) .. change:: @@ -3741,7 +3775,7 @@ .. change:: :tags: examples - :tickets: + :tickets: Updated the association, association proxy examples to use declarative, added a new example @@ -3753,7 +3787,7 @@ :tickets: 2090 The Beaker caching example allows a "query_cls" argument - to the query_callable() function. + to the query_callable() function. (also in 0.6.7) .. changelog:: @@ -3852,7 +3886,7 @@ .. change:: :tags: examples - :tickets: + :tickets: Beaker example now takes into account 'limit' and 'offset', bind params within embedded @@ -3864,8 +3898,8 @@ :released: Sat Feb 12 2011 .. change:: - :tags: - :tickets: + :tags: + :tickets: Detailed descriptions of each change below are described at: @@ -3892,7 +3926,7 @@ .. change:: :tags: general - :tickets: + :tickets: The "sqlalchemy.exceptions" alias in sys.modules has been removed. Base SQLA exceptions are @@ -3921,7 +3955,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Mutation Event Extension, supercedes "mutable=True" @@ -4029,7 +4063,7 @@ .. change:: :tags: orm - :tickets: + :tickets: Query.join(), Query.outerjoin(), eagerload(), eagerload_all(), others no longer allow lists @@ -4038,7 +4072,7 @@ .. change:: :tags: orm - :tickets: + :tickets: ScopedSession.mapper is removed (deprecated since 0.5). @@ -4062,7 +4096,7 @@ .. change:: :tags: orm - :tickets: + :tickets: The "name" field used in orm.aliased() now renders in the resulting SQL statement. @@ -4106,7 +4140,7 @@ .. change:: :tags: orm - :tickets: + :tickets: the value of "passive" as passed to attributes.get_history() should be one of the @@ -4126,7 +4160,7 @@ A warning is emitted when a joined-table inheriting mapper has no primary keys on the locally mapped table - (but has pks on the superclass table). + (but has pks on the superclass table). (also in 0.6.7) .. change:: @@ -4195,7 +4229,7 @@ .. change:: :tags: sql - :tickets: + :tickets: select.prefix_with() accepts multiple expressions (i.e. 
*expr), 'prefix' keyword argument to select() @@ -4203,7 +4237,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Passing a string to the `distinct` keyword argument of `select()` for the purpose of emitting special @@ -4257,7 +4291,7 @@ .. change:: :tags: sql - :tickets: + :tickets: The Index() construct can be created inline with a Table definition, using strings as column names, as an alternative @@ -4306,7 +4340,7 @@ .. change:: :tags: sql - :tickets: + :tickets: Bind parameters present in the "columns clause" of a select are now auto-labeled like other "anonymous" clauses, @@ -4315,7 +4349,7 @@ .. change:: :tags: sql - :tickets: + :tickets: TypeDecorator is present in the "sqlalchemy" import space. @@ -4406,7 +4440,7 @@ specified, so that the default length, normally '1' as per SQL server documentation, is instead 'unbounded'. This also occurs for the VARBINARY type.. - + This behavior makes these types more closely compatible with Postgresql's VARCHAR type which is similarly unbounded when no length is specified. @@ -4423,7 +4457,7 @@ :tickets: 2047 oursql dialect accepts the same "ssl" arguments in - create_engine() as that of MySQLdb. + create_engine() as that of MySQLdb. (also in 0.6.7) .. change:: diff --git a/doc/build/changelog/changelog_08.rst b/doc/build/changelog/changelog_08.rst index 4d36cf3b4..79597366c 100644 --- a/doc/build/changelog/changelog_08.rst +++ b/doc/build/changelog/changelog_08.rst @@ -7,8 +7,70 @@ :version: 0.8.0b2 .. change:: + :tags: oracle, bug + :tickets: 2620 + + The Oracle LONG type, while an unbounded text type, does not appear + to use the cx_Oracle.LOB type when result rows are returned, + so the dialect has been repaired to exclude LONG from + having cx_Oracle.LOB filtering applied. Also in 0.7.10. + + .. 
change:: + :tags: oracle, bug + :tickets: 2611 + + Repaired the usage of ``.prepare()`` in conjunction with + cx_Oracle so that a return value of ``False`` will result + in no call to ``connection.commit()``, hence avoiding + "no transaction" errors. Two-phase transactions have + now been shown to work in a rudimental fashion with + SQLAlchemy and cx_oracle, however are subject to caveats + observed with the driver; check the documentation + for details. Also in 0.7.10. + + .. change:: + :tags: sql, bug + :tickets: 2618 + + The :class:`.DECIMAL` type now honors the "precision" and + "scale" arguments when rendering DDL. + + .. change:: + :tags: orm, bug + :tickets: 2624 + + The :class:`.MutableComposite` type did not allow for the + :meth:`.MutableBase.coerce` method to be used, even though + the code seemed to indicate this intent, so this now works + and a brief example is added. As a side-effect, + the mechanics of this event handler have been changed so that + new :class:`.MutableComposite` types no longer add per-type + global event handlers. Also in 0.7.10. + + .. change:: + :tags: sql, bug + :tickets: 2621 + + Made an adjustment to the "boolean", (i.e. ``__nonzero__``) + evaluation of binary expressions, i.e. ``x1 == x2``, such + that the "auto-grouping" applied by :class:`.BinaryExpression` + in some cases won't get in the way of this comparison. + Previously, an expression like:: + + expr1 = mycolumn > 2 + bool(expr1 == expr1) + + Would evaulate as ``False``, even though this is an identity + comparison, because ``mycolumn > 2`` would be "grouped" before + being placed into the :class:`.BinaryExpression`, thus changing + its identity. :class:`.BinaryExpression` now keeps track + of the "original" objects passed in. + Additionally the ``__nonzero__`` method now only returns if + the operator is ``==`` or ``!=`` - all others raise ``TypeError``. + + .. 
change:: :tags: firebird, bug - :ticket: 2622 + :tickets: 2622 Added missing import for "fdb" to the experimental "firebird+fdb" dialect. @@ -21,17 +83,22 @@ .. change:: :tags: orm, bug - :ticket: 2614 - - Added a new exception to detect the case where two - subclasses are being loaded using with_polymorphic(), - each subclass contains a relationship attribute of the same - name, and eager loading is being applied to one or both. - This is an ongoing bug which can't be immediately fixed, - so since the results are usually wrong in any case it raises an - exception for now. 0.7 has the same issue, so an exception - raise here probably means the code was returning the wrong - data in 0.7. + :tickets: 2614 + + A second overhaul of aliasing/internal pathing mechanics + now allows two subclasses to have different relationships + of the same name, supported with subquery or joined eager + loading on both simultaneously when a full polymorphic + load is used. + + .. change:: + :tags: orm, bug + :tickets: 2617 + + Fixed bug whereby a multi-hop subqueryload within + a particular with_polymorphic load would produce a KeyError. + Takes advantage of the same internal pathing overhaul + as :ticket:`2614`. .. change:: :tags: sql, bug diff --git a/doc/build/core/connections.rst b/doc/build/core/connections.rst index de12a4689..d6726015a 100644 --- a/doc/build/core/connections.rst +++ b/doc/build/core/connections.rst @@ -42,7 +42,8 @@ is achieved via the usage of :class:`.NullPool`) does not have this requirement. The engine can be used directly to issue SQL to the database. 
The most generic -way is first procure a connection resource, which you get via the :class:`connect` method:: +way is first procure a connection resource, which you get via the +:meth:`.Engine.connect` method:: connection = engine.connect() result = connection.execute("select username from users") @@ -325,7 +326,7 @@ Overall, the usage of "bound metadata" has three general effects: on behalf of a particular mapped class, though the :class:`.Session` also features its own explicit system of establishing complex :class:`.Engine`/ mapped class configurations. -* The :meth:`.MetaData.create_all`, :meth:`.Metadata.drop_all`, :meth:`.Table.create`, +* The :meth:`.MetaData.create_all`, :meth:`.MetaData.drop_all`, :meth:`.Table.create`, :meth:`.Table.drop`, and "autoload" features all make usage of the bound :class:`.Engine` automatically without the need to pass it explicitly. @@ -416,7 +417,7 @@ connectionless execution:: db.rollback() Explicit execution can be mixed with connectionless execution by -using the :class:`.Engine.connect` method to acquire a :class:`.Connection` +using the :meth:`.Engine.connect` method to acquire a :class:`.Connection` that is not part of the threadlocal scope:: db.begin() diff --git a/doc/build/core/schema.rst b/doc/build/core/schema.rst index 310da23a6..759cc4f33 100644 --- a/doc/build/core/schema.rst +++ b/doc/build/core/schema.rst @@ -1283,14 +1283,14 @@ constraint will be added via ALTER: ALTER TABLE users DROP CONSTRAINT cst_user_name_length DROP TABLE users{stop} -The real usefulness of the above becomes clearer once we illustrate the :meth:`.DDLEvent.execute_if` -method. This method returns a modified form of the DDL callable which will -filter on criteria before responding to a received event. It accepts a -parameter ``dialect``, which is the string name of a dialect or a tuple of such, -which will limit the execution of the item to just those dialects. 
It also -accepts a ``callable_`` parameter which may reference a Python callable which will -be invoked upon event reception, returning ``True`` or ``False`` indicating if -the event should proceed. +The real usefulness of the above becomes clearer once we illustrate the +:meth:`.DDLElement.execute_if` method. This method returns a modified form of +the DDL callable which will filter on criteria before responding to a +received event. It accepts a parameter ``dialect``, which is the string +name of a dialect or a tuple of such, which will limit the execution of the +item to just those dialects. It also accepts a ``callable_`` parameter which +may reference a Python callable which will be invoked upon event reception, +returning ``True`` or ``False`` indicating if the event should proceed. If our :class:`~sqlalchemy.schema.CheckConstraint` was only supported by Postgresql and not other databases, we could limit its usage to just that dialect:: diff --git a/doc/build/orm/examples.rst b/doc/build/orm/examples.rst index 03d69cf4c..e0c87dadf 100644 --- a/doc/build/orm/examples.rst +++ b/doc/build/orm/examples.rst @@ -3,9 +3,14 @@ Examples ======== -The SQLAlchemy distribution includes a variety of code examples illustrating a select set of patterns, some typical and some not so typical. All are runnable and can be found in the ``/examples`` directory of the distribution. Each example contains a README in its ``__init__.py`` file, each of which are listed below. - -Additional SQLAlchemy examples, some user contributed, are available on the wiki at `<http://www.sqlalchemy.org/trac/wiki/UsageRecipes>`_. +The SQLAlchemy distribution includes a variety of code examples illustrating +a select set of patterns, some typical and some not so typical. All are +runnable and can be found in the ``/examples`` directory of the +distribution. Each example contains a README in its ``__init__.py`` file, +each of which are listed below. 
+ +Additional SQLAlchemy examples, some user contributed, are available on the +wiki at `<http://www.sqlalchemy.org/trac/wiki/UsageRecipes>`_. .. _examples_adjacencylist: diff --git a/doc/build/orm/extensions/mutable.rst b/doc/build/orm/extensions/mutable.rst index 259055980..ba3e10542 100644 --- a/doc/build/orm/extensions/mutable.rst +++ b/doc/build/orm/extensions/mutable.rst @@ -9,7 +9,7 @@ API Reference ------------- .. autoclass:: MutableBase - :members: _parents + :members: _parents, coerce .. autoclass:: Mutable :show-inheritance: diff --git a/examples/custom_attributes/__init__.py b/examples/custom_attributes/__init__.py index 6f7613e5c..b28e97d95 100644 --- a/examples/custom_attributes/__init__.py +++ b/examples/custom_attributes/__init__.py @@ -1,8 +1,19 @@ """ -Two examples illustrating modifications to SQLAlchemy's attribute management system. +Two examples illustrating modifications to SQLAlchemy's attribute management +system. -``listen_for_events.py`` illustrates the usage of :class:`~sqlalchemy.orm.interfaces.AttributeExtension` to intercept attribute events. It additionally illustrates a way to automatically attach these listeners to all class attributes using a :class:`~sqlalchemy.orm.interfaces.InstrumentationManager`. +``listen_for_events.py`` illustrates the usage of +:class:`~sqlalchemy.orm.interfaces.AttributeExtension` to intercept attribute +events. It additionally illustrates a way to automatically attach these +listeners to all class attributes using a +:class:`.InstrumentationManager`. -``custom_management.py`` illustrates much deeper usage of :class:`~sqlalchemy.orm.interfaces.InstrumentationManager` as well as collection adaptation, to completely change the underlying method used to store state on an object. This example was developed to illustrate techniques which would be used by other third party object instrumentation systems to interact with SQLAlchemy's event system and is only intended for very intricate framework integrations. 
+``custom_management.py`` illustrates much deeper usage of +:class:`.InstrumentationManager` as well as +collection adaptation, to completely change the underlying method used to +store state on an object. This example was developed to illustrate +techniques which would be used by other third party object instrumentation +systems to interact with SQLAlchemy's event system and is only intended for +very intricate framework integrations. """
\ No newline at end of file diff --git a/lib/sqlalchemy/dialects/oracle/cx_oracle.py b/lib/sqlalchemy/dialects/oracle/cx_oracle.py index bee730800..c5d9e8a89 100644 --- a/lib/sqlalchemy/dialects/oracle/cx_oracle.py +++ b/lib/sqlalchemy/dialects/oracle/cx_oracle.py @@ -80,8 +80,40 @@ To disable this processing, pass ``auto_convert_lobs=False`` to :func:`create_en Two Phase Transaction Support ----------------------------- -Two Phase transactions are implemented using XA transactions. Success has been reported -with this feature but it should be regarded as experimental. +Two Phase transactions are implemented using XA transactions, and are known +to work in a rudimental fashion with recent versions of cx_Oracle +as of SQLAlchemy 0.8.0b2, 0.7.10. However, the mechanism is not yet +considered to be robust and should still be regarded as experimental. + +In particular, the cx_Oracle DBAPI as recently as 5.1.2 has a bug regarding +two phase which prevents +a particular DBAPI connection from being consistently usable in both +prepared transactions as well as traditional DBAPI usage patterns; therefore +once a particular connection is used via :meth:`.Connection.begin_twophase`, +all subsequent usages of the underlying DBAPI connection must be within +the context of prepared transactions. + +The default behavior of :class:`.Engine` is to maintain a pool of DBAPI +connections. Therefore, due to the above glitch, a DBAPI connection that has +been used in a two-phase operation, and is then returned to the pool, will +not be usable in a non-two-phase context. To avoid this situation, +the application can make one of several choices: + +* Disable connection pooling using :class:`.NullPool` + +* Ensure that the particular :class:`.Engine` in use is only used + for two-phase operations. A :class:`.Engine` bound to an ORM + :class:`.Session` which includes ``twophase=True`` will consistently + use the two-phase transaction style.
+ +* For ad-hoc two-phase operations without disabling pooling, the DBAPI + connection in use can be evicted from the connection pool using the + :meth:`.Connection.detach` method. + +.. versionchanged:: 0.8.0b2,0.7.10 + Support for cx_oracle prepared transactions has been implemented + and tested. + Precision Numerics ------------------ @@ -150,8 +182,9 @@ a period "." as the decimal character. """ -from .base import OracleCompiler, OracleDialect, \ - RESERVED_WORDS, OracleExecutionContext +from __future__ import absolute_import + +from .base import OracleCompiler, OracleDialect, OracleExecutionContext from . import base as oracle from ...engine import result as _result from sqlalchemy import types as sqltypes, util, exc, processors @@ -270,6 +303,13 @@ class _OracleText(_LOBMixin, sqltypes.Text): return dbapi.CLOB +class _OracleLong(oracle.LONG): + # a raw LONG is a text type, but does *not* + # get the LobMixin with cx_oracle. + + def get_dbapi_type(self, dbapi): + return dbapi.LONG_STRING + class _OracleString(_NativeUnicodeMixin, sqltypes.String): pass @@ -499,6 +539,10 @@ class OracleDialect_cx_oracle(OracleDialect): sqltypes.UnicodeText: _OracleUnicodeText, sqltypes.CHAR: _OracleChar, + # a raw LONG is a text type, but does *not* + # get the LobMixin with cx_oracle. + oracle.LONG: _OracleLong, + # this is only needed for OUT parameters. # it would be nice if we could not use it otherwise.
sqltypes.Integer: _OracleInteger, @@ -779,15 +823,23 @@ class OracleDialect_cx_oracle(OracleDialect): connection.connection.begin(*xid) def do_prepare_twophase(self, connection, xid): - connection.connection.prepare() + result = connection.connection.prepare() + connection.info['cx_oracle_prepared'] = result - def do_rollback_twophase(self, connection, xid, is_prepared=True, recover=False): + def do_rollback_twophase(self, connection, xid, is_prepared=True, + recover=False): self.do_rollback(connection.connection) - def do_commit_twophase(self, connection, xid, is_prepared=True, recover=False): - self.do_commit(connection.connection) + def do_commit_twophase(self, connection, xid, is_prepared=True, + recover=False): + if not is_prepared: + self.do_commit(connection.connection) + else: + oci_prepared = connection.info['cx_oracle_prepared'] + if oci_prepared: + self.do_commit(connection.connection) def do_recover_twophase(self, connection): - pass + connection.info.pop('cx_oracle_prepared', None) dialect = OracleDialect_cx_oracle diff --git a/lib/sqlalchemy/engine/base.py b/lib/sqlalchemy/engine/base.py index 626fee8c6..e3b09e63a 100644 --- a/lib/sqlalchemy/engine/base.py +++ b/lib/sqlalchemy/engine/base.py @@ -121,7 +121,7 @@ class Connection(Connectable): Note that any key/value can be passed to :meth:`.Connection.execution_options`, and it will be stored in the - ``_execution_options`` dictionary of the :class:`.Connnection`. It + ``_execution_options`` dictionary of the :class:`.Connection`. It is suitable for usage by end-user schemes to communicate with event listeners, for example. @@ -1295,8 +1295,8 @@ class TwoPhaseTransaction(Transaction): class Engine(Connectable, log.Identified): """ Connects a :class:`~sqlalchemy.pool.Pool` and - :class:`~sqlalchemy.engine.base.Dialect` together to provide a source - of database connectivity and behavior. 
+ :class:`~sqlalchemy.engine.interfaces.Dialect` together to provide a + source of database connectivity and behavior. An :class:`.Engine` object is instantiated publicly using the :func:`~sqlalchemy.create_engine` function. @@ -1426,15 +1426,15 @@ class Engine(Connectable, log.Identified): @property def name(self): - """String name of the :class:`~sqlalchemy.engine.Dialect` in use by - this ``Engine``.""" + """String name of the :class:`~sqlalchemy.engine.interfaces.Dialect` + in use by this :class:`Engine`.""" return self.dialect.name @property def driver(self): - """Driver name of the :class:`~sqlalchemy.engine.Dialect` in use by - this ``Engine``.""" + """Driver name of the :class:`~sqlalchemy.engine.interfaces.Dialect` + in use by this :class:`Engine`.""" return self.dialect.driver diff --git a/lib/sqlalchemy/engine/reflection.py b/lib/sqlalchemy/engine/reflection.py index 8367d8761..f1681d616 100644 --- a/lib/sqlalchemy/engine/reflection.py +++ b/lib/sqlalchemy/engine/reflection.py @@ -55,7 +55,7 @@ class Inspector(object): """Performs database schema inspection. The Inspector acts as a proxy to the reflection methods of the - :class:`~sqlalchemy.engine.base.Dialect`, providing a + :class:`~sqlalchemy.engine.interfaces.Dialect`, providing a consistent interface as well as caching support for previously fetched metadata. @@ -72,7 +72,7 @@ class Inspector(object): engine = create_engine('...') insp = Inspector.from_engine(engine) - Where above, the :class:`~sqlalchemy.engine.base.Dialect` may opt + Where above, the :class:`~sqlalchemy.engine.interfaces.Dialect` may opt to return an :class:`.Inspector` subclass that provides additional methods specific to the dialect's target database. @@ -81,13 +81,13 @@ class Inspector(object): def __init__(self, bind): """Initialize a new :class:`.Inspector`. 
- :param bind: a :class:`~sqlalchemy.engine.base.Connectable`, + :param bind: a :class:`~sqlalchemy.engine.Connectable`, which is typically an instance of :class:`~sqlalchemy.engine.Engine` or :class:`~sqlalchemy.engine.Connection`. For a dialect-specific instance of :class:`.Inspector`, see - :meth:`Inspector.from_engine` + :meth:`.Inspector.from_engine` """ # this might not be a connection, it could be an engine. @@ -111,16 +111,16 @@ class Inspector(object): """Construct a new dialect-specific Inspector object from the given engine or connection. - :param bind: a :class:`~sqlalchemy.engine.base.Connectable`, + :param bind: a :class:`~sqlalchemy.engine.Connectable`, which is typically an instance of :class:`~sqlalchemy.engine.Engine` or :class:`~sqlalchemy.engine.Connection`. This method differs from a direct constructor call of :class:`.Inspector` in that the - :class:`~sqlalchemy.engine.base.Dialect` is given a chance to provide - a dialect-specific :class:`.Inspector` instance, which may provide - additional methods. + :class:`~sqlalchemy.engine.interfaces.Dialect` is given a chance to + provide a dialect-specific :class:`.Inspector` instance, which may + provide additional methods. See the example at :class:`.Inspector`. diff --git a/lib/sqlalchemy/engine/result.py b/lib/sqlalchemy/engine/result.py index 98b0ea4b2..dcfd5ac31 100644 --- a/lib/sqlalchemy/engine/result.py +++ b/lib/sqlalchemy/engine/result.py @@ -557,7 +557,7 @@ class ResultProxy(object): supports "returning" and the insert statement executed with the "implicit returning" enabled. - Raises :class:`.InvalidRequestError` if the executed + Raises :class:`~sqlalchemy.exc.InvalidRequestError` if the executed statement is not a compiled expression construct or is not an insert() construct. @@ -583,7 +583,7 @@ class ResultProxy(object): """Return the collection of updated parameters from this execution.
- Raises :class:`.InvalidRequestError` if the executed + Raises :class:`~sqlalchemy.exc.InvalidRequestError` if the executed statement is not a compiled expression construct or is not an update() construct. @@ -605,7 +605,7 @@ class ResultProxy(object): """Return the collection of inserted parameters from this execution. - Raises :class:`.InvalidRequestError` if the executed + Raises :class:`~sqlalchemy.exc.InvalidRequestError` if the executed statement is not a compiled expression construct or is not an insert() construct. @@ -639,7 +639,7 @@ class ResultProxy(object): See :class:`.ExecutionContext` for details. - Raises :class:`.InvalidRequestError` if the executed + Raises :class:`~sqlalchemy.exc.InvalidRequestError` if the executed statement is not a compiled expression construct or is not an insert() or update() construct. @@ -661,7 +661,7 @@ class ResultProxy(object): See :class:`.ExecutionContext` for details. - Raises :class:`.InvalidRequestError` if the executed + Raises :class:`~sqlalchemy.exc.InvalidRequestError` if the executed statement is not a compiled expression construct or is not an insert() or update() construct. diff --git a/lib/sqlalchemy/event.py b/lib/sqlalchemy/event.py index bf996ae3c..6453a3987 100644 --- a/lib/sqlalchemy/event.py +++ b/lib/sqlalchemy/event.py @@ -245,6 +245,9 @@ class _DispatchDescriptor(object): self._clslevel = util.defaultdict(list) self._empty_listeners = {} + def _contains(self, cls, evt): + return evt in self._clslevel[cls] + def insert(self, obj, target, propagate): assert isinstance(target, type), \ "Class-level Event targets must be classes." diff --git a/lib/sqlalchemy/ext/compiler.py b/lib/sqlalchemy/ext/compiler.py index 93984d0d1..25de2c0b6 100644 --- a/lib/sqlalchemy/ext/compiler.py +++ b/lib/sqlalchemy/ext/compiler.py @@ -65,11 +65,12 @@ dialect is used. 
Compiling sub-elements of a custom expression construct ======================================================= -The ``compiler`` argument is the :class:`~sqlalchemy.engine.base.Compiled` -object in use. This object can be inspected for any information about the -in-progress compilation, including ``compiler.dialect``, -``compiler.statement`` etc. The :class:`~sqlalchemy.sql.compiler.SQLCompiler` -and :class:`~sqlalchemy.sql.compiler.DDLCompiler` both include a ``process()`` +The ``compiler`` argument is the +:class:`~sqlalchemy.engine.interfaces.Compiled` object in use. This object +can be inspected for any information about the in-progress compilation, +including ``compiler.dialect``, ``compiler.statement`` etc. The +:class:`~sqlalchemy.sql.compiler.SQLCompiler` and +:class:`~sqlalchemy.sql.compiler.DDLCompiler` both include a ``process()`` method which can be used for compilation of embedded attributes:: from sqlalchemy.sql.expression import Executable, ClauseElement diff --git a/lib/sqlalchemy/ext/mutable.py b/lib/sqlalchemy/ext/mutable.py index 36d60d6d5..e290a93e2 100644 --- a/lib/sqlalchemy/ext/mutable.py +++ b/lib/sqlalchemy/ext/mutable.py @@ -302,6 +302,31 @@ will flag the attribute as "dirty" on the parent object:: >>> assert v1 in sess.dirty True +Coercing Mutable Composites +--------------------------- + +The :meth:`.MutableBase.coerce` method is also supported on composite types. +In the case of :class:`.MutableComposite`, the :meth:`.MutableBase.coerce` +method is only called for attribute set operations, not load operations. +Overriding the :meth:`.MutableBase.coerce` method is essentially equivalent +to using a :func:`.validates` validation routine for all attributes which +make use of the custom composite type:: + + class Point(MutableComposite): + # other Point methods + # ... 
+ + def coerce(cls, key, value): + if isinstance(value, tuple): + value = Point(*value) + elif not isinstance(value, Point): + raise ValueError("tuple or Point expected") + return value + +.. versionadded:: 0.7.10,0.8.0b2 + Support for the :meth:`.MutableBase.coerce` method in conjunction with + objects of type :class:`.MutableComposite`. + Supporting Pickling -------------------- @@ -329,7 +354,7 @@ pickling process of the parent's object-relational state so that the """ from ..orm.attributes import flag_modified from .. import event, types -from ..orm import mapper, object_mapper +from ..orm import mapper, object_mapper, Mapper from ..util import memoized_property import weakref @@ -354,9 +379,27 @@ class MutableBase(object): @classmethod def coerce(cls, key, value): - """Given a value, coerce it into this type. + """Given a value, coerce it into the target type. + + Can be overridden by custom subclasses to coerce incoming + data into a particular type. + + By default, raises ``ValueError``. + + This method is called in different scenarios depending on if + the parent class is of type :class:`.Mutable` or of type + :class:`.MutableComposite`. In the case of the former, it is called + for both attribute-set operations as well as during ORM loading + operations. For the latter, it is only called during attribute-set + operations; the mechanics of the :func:`.composite` construct + handle coercion during load operations. + + + :param key: string name of the ORM-mapped attribute being set. + :param value: the incoming value. + :return: the method should return the coerced value, or raise + ``ValueError`` if the coercion cannot be completed. - By default raises ValueError. 
""" if value is None: return None @@ -523,11 +566,6 @@ class Mutable(MutableBase): return sqltype -class _MutableCompositeMeta(type): - def __init__(cls, classname, bases, dict_): - cls._setup_listeners() - return type.__init__(cls, classname, bases, dict_) - class MutableComposite(MutableBase): """Mixin that defines transparent propagation of change @@ -536,16 +574,7 @@ class MutableComposite(MutableBase): See the example in :ref:`mutable_composites` for usage information. - .. warning:: - - The listeners established by the :class:`.MutableComposite` - class are *global* to all mappers, and are *not* garbage - collected. Only use :class:`.MutableComposite` for types that are - permanent to an application, not with ad-hoc types else this will - cause unbounded growth in memory usage. - """ - __metaclass__ = _MutableCompositeMeta def changed(self): """Subclasses should call this method whenever change events occur.""" @@ -558,24 +587,16 @@ class MutableComposite(MutableBase): prop._attribute_keys): setattr(parent, attr_name, value) - @classmethod - def _setup_listeners(cls): - """Associate this wrapper with all future mapped composites - of the given type. - - This is a convenience method that calls ``associate_with_attribute`` - automatically. 
- - """ - - def listen_for_type(mapper, class_): - for prop in mapper.iterate_properties: - if (hasattr(prop, 'composite_class') and - issubclass(prop.composite_class, cls)): - cls._listen_on_attribute( - getattr(class_, prop.key), False, class_) - - event.listen(mapper, 'mapper_configured', listen_for_type) +def _setup_composite_listener(): + def _listen_for_type(mapper, class_): + for prop in mapper.iterate_properties: + if (hasattr(prop, 'composite_class') and + issubclass(prop.composite_class, MutableComposite)): + prop.composite_class._listen_on_attribute( + getattr(class_, prop.key), False, class_) + if not Mapper.dispatch.mapper_configured._contains(Mapper, _listen_for_type): + event.listen(Mapper, 'mapper_configured', _listen_for_type) +_setup_composite_listener() class MutableDict(Mutable, dict): diff --git a/lib/sqlalchemy/ext/serializer.py b/lib/sqlalchemy/ext/serializer.py index 3ed41f48a..c129b0dcc 100644 --- a/lib/sqlalchemy/ext/serializer.py +++ b/lib/sqlalchemy/ext/serializer.py @@ -54,6 +54,7 @@ needed for: from ..orm import class_mapper from ..orm.session import Session from ..orm.mapper import Mapper +from ..orm.interfaces import MapperProperty from ..orm.attributes import QueryableAttribute from .. 
import Table, Column from ..engine import Engine @@ -90,6 +91,9 @@ def Serializer(*args, **kw): id = "attribute:" + key + ":" + b64encode(pickle.dumps(cls)) elif isinstance(obj, Mapper) and not obj.non_primary: id = "mapper:" + b64encode(pickle.dumps(obj.class_)) + elif isinstance(obj, MapperProperty) and not obj.parent.non_primary: + id = "mapperprop:" + b64encode(pickle.dumps(obj.parent.class_)) + \ + ":" + obj.key elif isinstance(obj, Table): id = "table:" + str(obj) elif isinstance(obj, Column) and isinstance(obj.table, Table): @@ -134,6 +138,10 @@ def Deserializer(file, metadata=None, scoped_session=None, engine=None): elif type_ == "mapper": cls = pickle.loads(b64decode(args)) return class_mapper(cls) + elif type_ == "mapperprop": + mapper, keyname = args.split(':') + cls = pickle.loads(b64decode(args)) + return class_mapper(cls).attrs[keyname] elif type_ == "table": return metadata.tables[args] elif type_ == "column": diff --git a/lib/sqlalchemy/orm/deprecated_interfaces.py b/lib/sqlalchemy/orm/deprecated_interfaces.py index bc9b352d4..d5b3d848e 100644 --- a/lib/sqlalchemy/orm/deprecated_interfaces.py +++ b/lib/sqlalchemy/orm/deprecated_interfaces.py @@ -385,7 +385,7 @@ class SessionExtension(object): :class:`.SessionEvents`. 
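The new ``"mapperprop:"`` token in the serializer hunk above identifies a mapped property by its parent class (pickled and base64-encoded, so the payload contains no ``:`` delimiter) plus the property's key. A standalone sketch of just that token scheme, with illustrative names and ``decimal.Decimal`` standing in for a mapped class:

```python
# Standalone sketch of the "mapperprop:<b64-pickled-class>:<key>" token
# scheme; function names are illustrative, not SQLAlchemy's API.
import base64
import pickle
from decimal import Decimal  # stand-in for a mapped class

def serialize_prop(cls, key):
    payload = base64.b64encode(pickle.dumps(cls)).decode("ascii")
    return "mapperprop:" + payload + ":" + key

def deserialize_prop(token):
    # base64 output never contains ":", so two splits recover the parts
    type_, payload, keyname = token.split(":", 2)
    assert type_ == "mapperprop"
    cls = pickle.loads(base64.b64decode(payload))
    return cls, keyname

token = serialize_prop(Decimal, "value")
assert deserialize_prop(token) == (Decimal, "value")
```

In the real Deserializer, the recovered class and key are then resolved through ``class_mapper(cls).attrs[keyname]``.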
Subclasses may be installed into a :class:`.Session` (or - :func:`.sessionmaker`) using the ``extension`` keyword + :class:`.sessionmaker`) using the ``extension`` keyword argument:: from sqlalchemy.orm.interfaces import SessionExtension diff --git a/lib/sqlalchemy/orm/interfaces.py b/lib/sqlalchemy/orm/interfaces.py index c91746da0..55a980b2e 100644 --- a/lib/sqlalchemy/orm/interfaces.py +++ b/lib/sqlalchemy/orm/interfaces.py @@ -219,6 +219,10 @@ class MapperProperty(_MappedAttribute, _InspectionAttr): return operator(self.comparator, value) + def __repr__(self): + return '<%s at 0x%x; %s>' % ( + self.__class__.__name__, + id(self), self.key) class PropComparator(operators.ColumnOperators): """Defines boolean, comparison, and other operators for @@ -413,21 +417,18 @@ class StrategizedProperty(MapperProperty): return None def _get_context_strategy(self, context, path): - # this is essentially performance inlining. - key = ('loaderstrategy', path.reduced_path + (self.key,)) - cls = None - if key in context.attributes: - cls = context.attributes[key] - else: + strategy_cls = path._inlined_get_for(self, context, 'loaderstrategy') + + if not strategy_cls: wc_key = self._wildcard_path if wc_key and wc_key in context.attributes: - cls = context.attributes[wc_key] + strategy_cls = context.attributes[wc_key] - if cls: + if strategy_cls: try: - return self._strategies[cls] + return self._strategies[strategy_cls] except KeyError: - return self.__init_strategy(cls) + return self.__init_strategy(strategy_cls) return self.strategy def _get_strategy(self, cls): @@ -528,10 +529,8 @@ class PropertyOption(MapperOption): def _find_entity_prop_comparator(self, query, token, mapper, raiseerr): if orm_util._is_aliased_class(mapper): searchfor = mapper - isa = False else: searchfor = orm_util._class_to_mapper(mapper) - isa = True for ent in query._mapper_entities: if ent.corresponds_to(searchfor): return ent @@ -600,7 +599,7 @@ class PropertyOption(MapperOption): # exhaust current_path 
before # matching tokens to entities if current_path: - if current_path[1] == token: + if current_path[1].key == token: current_path = current_path[2:] continue else: @@ -634,7 +633,7 @@ class PropertyOption(MapperOption): # matching tokens to entities if current_path: if current_path[0:2] == \ - [token._parententity, prop.key]: + [token._parententity, prop]: current_path = current_path[2:] continue else: @@ -648,6 +647,7 @@ class PropertyOption(MapperOption): raiseerr) if not entity: return no_result + path_element = entity.entity_zero mapper = entity.mapper else: @@ -659,7 +659,7 @@ class PropertyOption(MapperOption): raise sa_exc.ArgumentError("Attribute '%s' does not " "link from element '%s'" % (token, path_element)) - path = path[path_element][prop.key] + path = path[path_element][prop] paths.append(path) @@ -670,7 +670,8 @@ class PropertyOption(MapperOption): if not ext_info.is_aliased_class: ac = orm_util.with_polymorphic( ext_info.mapper.base_mapper, - ext_info.mapper, aliased=True) + ext_info.mapper, aliased=True, + _use_mapper_path=True) ext_info = inspect(ac) path.set(query, "path_with_polymorphic", ext_info) else: diff --git a/lib/sqlalchemy/orm/loading.py b/lib/sqlalchemy/orm/loading.py index a5d156a1f..4bd80c388 100644 --- a/lib/sqlalchemy/orm/loading.py +++ b/lib/sqlalchemy/orm/loading.py @@ -271,6 +271,7 @@ def instance_processor(mapper, context, path, adapter, new_populators = [] existing_populators = [] eager_populators = [] + load_path = context.query._current_path + path \ if context.query._current_path.path \ else path @@ -504,9 +505,12 @@ def _populators(mapper, context, path, row, adapter, delayed_populators = [] pops = (new_populators, existing_populators, delayed_populators, eager_populators) + for prop in mapper._props.itervalues(): + for i, pop in enumerate(prop.create_row_processor( - context, path, + context, + path, mapper, row, adapter)): if pop is not None: pops[i].append((prop.key, pop)) @@ -529,17 +533,10 @@ def 
_configure_subclass_mapper(mapper, context, path, adapter): if sub_mapper is mapper: return None - # replace the tip of the path info with the subclass mapper - # being used, that way accurate "load_path" info is available - # for options invoked during deferred loads, e.g. - # query(Person).options(defer(Engineer.machines, Machine.name)). - # for AliasedClass paths, disregard this step (new in 0.8). return instance_processor( sub_mapper, context, - path.parent[sub_mapper] - if not path.is_aliased_class - else path, + path, adapter, polymorphic_from=mapper) return configure_subclass_mapper diff --git a/lib/sqlalchemy/orm/mapper.py b/lib/sqlalchemy/orm/mapper.py index b89163340..626105b5e 100644 --- a/lib/sqlalchemy/orm/mapper.py +++ b/lib/sqlalchemy/orm/mapper.py @@ -472,7 +472,7 @@ class Mapper(_InspectionAttr): dispatch = event.dispatcher(events.MapperEvents) @util.memoized_property - def _sa_path_registry(self): + def _path_registry(self): return PathRegistry.per_mapper(self) def _configure_inheritance(self): @@ -1403,7 +1403,7 @@ class Mapper(_InspectionAttr): if _new_mappers: configure_mappers() if not self.with_polymorphic: - return [self] + return [] return self._mappers_from_spec(*self.with_polymorphic) @_memoized_configured_property @@ -1458,10 +1458,10 @@ class Mapper(_InspectionAttr): return list(self._iterate_polymorphic_properties( self._with_polymorphic_mappers)) + def _iterate_polymorphic_properties(self, mappers=None): """Return an iterator of MapperProperty objects which will render into a SELECT.""" - if mappers is None: mappers = self._with_polymorphic_mappers diff --git a/lib/sqlalchemy/orm/query.py b/lib/sqlalchemy/orm/query.py index ca334e273..d6847177f 100644 --- a/lib/sqlalchemy/orm/query.py +++ b/lib/sqlalchemy/orm/query.py @@ -157,7 +157,7 @@ class Query(object): ent.setup_entity(*d[entity]) def _mapper_loads_polymorphically_with(self, mapper, adapter): - for m2 in mapper._with_polymorphic_mappers: + for m2 in 
mapper._with_polymorphic_mappers or [mapper]: self._polymorphic_adapters[m2] = adapter for m in m2.iterate_to_root(): self._polymorphic_adapters[m.local_table] = adapter @@ -2744,17 +2744,24 @@ class _MapperEntity(_QueryEntity): self._with_polymorphic = ext_info.with_polymorphic_mappers self._polymorphic_discriminator = \ ext_info.polymorphic_on + self.entity_zero = ext_info if ext_info.is_aliased_class: - self.entity_zero = ext_info.entity - self._label_name = self.entity_zero._sa_label_name + self._label_name = self.entity_zero.name else: - self.entity_zero = self.mapper self._label_name = self.mapper.class_.__name__ - self.path = self.entity_zero._sa_path_registry + self.path = self.entity_zero._path_registry def set_with_polymorphic(self, query, cls_or_mappers, selectable, polymorphic_on): + """Receive an update from a call to query.with_polymorphic(). + + Note the newer style of using a free-standing with_polymorphic() + construct doesn't make use of this method. + + + """ if self.is_aliased_class: + # TODO: invalidrequest ?
raise NotImplementedError( "Can't use with_polymorphic() against " "an Aliased object" @@ -2785,13 +2792,18 @@ class _MapperEntity(_QueryEntity): return self.entity_zero def corresponds_to(self, entity): - entity_info = inspect(entity) - if entity_info.is_aliased_class or self.is_aliased_class: - return entity is self.entity_zero \ - or \ - entity in self._with_polymorphic - else: - return entity.common_parent(self.entity_zero) + if entity.is_aliased_class: + if self.is_aliased_class: + if entity._base_alias is self.entity_zero._base_alias: + return True + return False + elif self.is_aliased_class: + if self.entity_zero._use_mapper_path: + return entity in self._with_polymorphic + else: + return entity is self.entity_zero + + return entity.common_parent(self.entity_zero) def adapt_to_selectable(self, query, sel): query._entities.append(self) @@ -3008,6 +3020,7 @@ class _ColumnEntity(_QueryEntity): if self.entity_zero is None: return False elif _is_aliased_class(entity): + # TODO: polymorphic subclasses ? return entity is self.entity_zero else: return not _is_aliased_class(self.entity_zero) and \ diff --git a/lib/sqlalchemy/orm/strategies.py b/lib/sqlalchemy/orm/strategies.py index 586ec4b4e..05c7ef37b 100644 --- a/lib/sqlalchemy/orm/strategies.py +++ b/lib/sqlalchemy/orm/strategies.py @@ -303,16 +303,6 @@ class AbstractRelationshipLoader(LoaderStrategy): self.uselist = self.parent_property.uselist - def _warn_existing_path(self): - raise sa_exc.InvalidRequestError( - "Eager loading cannot currently function correctly when two or " - "more " - "same-named attributes associated with multiple polymorphic " - "classes " - "of the same base are present. Encountered more than one " - r"eager path for attribute '%s' on mapper '%s'." 
% - (self.key, self.parent.base_mapper, )) - class NoLoader(AbstractRelationshipLoader): """Provide loading behavior for a :class:`.RelationshipProperty` @@ -564,7 +554,7 @@ class LazyLoader(AbstractRelationshipLoader): q = q.autoflush(False) if state.load_path: - q = q._with_current_path(state.load_path[self.key]) + q = q._with_current_path(state.load_path[self.parent_property]) if state.load_options: q = q._conditional_options(*state.load_options) @@ -694,7 +684,7 @@ class SubqueryLoader(AbstractRelationshipLoader): if not context.query._enable_eagerloads: return - path = path[self.key] + path = path[self.parent_property] # build up a path indicating the path from the leftmost # entity to the thing we're subquery loading. @@ -757,22 +747,20 @@ class SubqueryLoader(AbstractRelationshipLoader): # add new query to attributes to be picked up # by create_row_processor - existing = path.replace(context, "subquery", q) - if existing: - self._warn_existing_path() + path.set(context, "subquery", q) def _get_leftmost(self, subq_path): subq_path = subq_path.path subq_mapper = orm_util._class_to_mapper(subq_path[0]) # determine attributes of the leftmost mapper - if self.parent.isa(subq_mapper) and self.key == subq_path[1]: + if self.parent.isa(subq_mapper) and self.parent_property is subq_path[1]: leftmost_mapper, leftmost_prop = \ self.parent, self.parent_property else: leftmost_mapper, leftmost_prop = \ subq_mapper, \ - subq_mapper._props[subq_path[1]] + subq_path[1] leftmost_cols = leftmost_prop.local_columns @@ -805,23 +793,35 @@ class SubqueryLoader(AbstractRelationshipLoader): # the original query now becomes a subquery # which we'll join onto. embed_q = q.with_labels().subquery() - left_alias = orm_util.AliasedClass(leftmost_mapper, embed_q) + left_alias = orm_util.AliasedClass(leftmost_mapper, embed_q, + use_mapper_path=True) return left_alias def _prep_for_joins(self, left_alias, subq_path): - subq_path = subq_path.path - # figure out what's being joined. a.k.a. 
the fun part - to_join = [ - (subq_path[i], subq_path[i + 1]) - for i in xrange(0, len(subq_path), 2) - ] + to_join = [] + pairs = list(subq_path.pairs()) + + for i, (mapper, prop) in enumerate(pairs): + if i > 0: + # look at the previous mapper in the chain - + # if it is as or more specific than this prop's + # mapper, use that instead. + # note we have an assumption here that + # the non-first element is always going to be a mapper, + # not an AliasedClass + + prev_mapper = pairs[i - 1][1].mapper + to_append = prev_mapper if prev_mapper.isa(mapper) else mapper + else: + to_append = mapper + + to_join.append((to_append, prop.key)) # determine the immediate parent class we are joining from, # which needs to be aliased. - if len(to_join) > 1: - info = inspect(subq_path[-2]) + info = inspect(to_join[-1][0]) if len(to_join) < 2: # in the case of a one level eager load, this is the @@ -833,11 +833,13 @@ class SubqueryLoader(AbstractRelationshipLoader): # in the vast majority of cases, and [ticket:2014] # illustrates a case where sub_path[-2] is a subclass # of self.parent - parent_alias = orm_util.AliasedClass(subq_path[-2]) + parent_alias = orm_util.AliasedClass(to_join[-1][0], + use_mapper_path=True) else: # if of_type() were used leading to this relationship, # self.parent is more specific than subq_path[-2] - parent_alias = orm_util.AliasedClass(self.parent) + parent_alias = orm_util.AliasedClass(self.parent, + use_mapper_path=True) local_cols = self.parent_property.local_columns @@ -916,9 +918,10 @@ class SubqueryLoader(AbstractRelationshipLoader): "population - eager loading cannot be applied." 
% self) - path = path[self.key] + path = path[self.parent_property] subq = path.get(context, 'subquery') + if subq is None: return None, None, None @@ -1000,7 +1003,7 @@ class JoinedLoader(AbstractRelationshipLoader): if not context.query._enable_eagerloads: return - path = path[self.key] + path = path[self.parent_property] with_polymorphic = None @@ -1040,6 +1043,7 @@ class JoinedLoader(AbstractRelationshipLoader): with_polymorphic = None path = path[self.mapper] + for value in self.mapper._iterate_polymorphic_properties( mappers=with_polymorphic): value.setup( @@ -1079,7 +1083,8 @@ class JoinedLoader(AbstractRelationshipLoader): if with_poly_info: to_adapt = with_poly_info.entity else: - to_adapt = orm_util.AliasedClass(self.mapper) + to_adapt = orm_util.AliasedClass(self.mapper, + use_mapper_path=True) clauses = orm_util.ORMAdapter( to_adapt, equivalents=self.mapper._equivalent_columns, @@ -1104,9 +1109,8 @@ class JoinedLoader(AbstractRelationshipLoader): ) add_to_collection = context.secondary_columns - existing = path.replace(context, "eager_row_processor", clauses) - if existing: - self._warn_existing_path() + path.set(context, "eager_row_processor", clauses) + return clauses, adapter, add_to_collection, allow_innerjoin def _create_eager_join(self, context, entity, @@ -1154,7 +1158,8 @@ class JoinedLoader(AbstractRelationshipLoader): onclause = getattr( orm_util.AliasedClass( self.parent, - adapter.selectable + adapter.selectable, + use_mapper_path=True ), self.key, self.parent_property ) @@ -1238,7 +1243,7 @@ class JoinedLoader(AbstractRelationshipLoader): "population - eager loading cannot be applied." 
% self) - our_path = path[self.key] + our_path = path[self.parent_property] eager_adapter = self._create_eager_adapter( context, @@ -1391,15 +1396,13 @@ class LoadEagerFromAliasOption(PropertyOption): def process_query_property(self, query, paths): if self.chained: for path in paths[0:-1]: - (root_mapper, propname) = path.path[-2:] - prop = root_mapper._props[propname] + (root_mapper, prop) = path.path[-2:] adapter = query._polymorphic_adapters.get(prop.mapper, None) path.setdefault(query, "user_defined_eager_row_processor", adapter) - root_mapper, propname = paths[-1].path[-2:] - prop = root_mapper._props[propname] + root_mapper, prop = paths[-1].path[-2:] if self.alias is not None: if isinstance(self.alias, basestring): self.alias = prop.target.alias(self.alias) diff --git a/lib/sqlalchemy/orm/util.py b/lib/sqlalchemy/orm/util.py index e5e725138..04838cb64 100644 --- a/lib/sqlalchemy/orm/util.py +++ b/lib/sqlalchemy/orm/util.py @@ -245,6 +245,8 @@ class ORMAdapter(sql_util.ColumnAdapter): else: return None +def _unreduce_path(path): + return PathRegistry.deserialize(path) class PathRegistry(object): """Represent query load paths and registry functions. 
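[editor's note, not part of the patch] The `path[self.key]` → `path[self.parent_property]` changes above switch load-path keying from attribute-name strings to `MapperProperty` objects. A minimal sketch (illustrative names only, not SQLAlchemy internals) of why this matters when two sibling subclasses each define a same-named property, as in the #2614 tests below:

```python
# Illustrative sketch: keying load paths on property *objects* instead of
# attribute-name strings avoids collisions when two sibling subclasses
# each define a property named "related".

class Prop:
    """Stand-in for a MapperProperty; object identity distinguishes instances."""
    def __init__(self, parent, key):
        self.parent = parent
        self.key = key

b_related = Prop("B", "related")
c_related = Prop("C", "related")

# Keying on the string alone conflates the two properties...
by_name = {}
by_name[("A", b_related.key)] = "subquery for B.related"
by_name[("A", c_related.key)] = "subquery for C.related"  # overwrites!

# ...while keying on the property object keeps them distinct.
by_prop = {}
by_prop[("A", b_related)] = "subquery for B.related"
by_prop[("A", c_related)] = "subquery for C.related"

print(len(by_name), len(by_prop))  # 1 2
```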
@@ -277,19 +279,13 @@ class PathRegistry(object): self.path == other.path def set(self, reg, key, value): - reg._attributes[(key, self.reduced_path)] = value - - def replace(self, reg, key, value): - path_key = (key, self.reduced_path) - existing = reg._attributes.get(path_key, None) - reg._attributes[path_key] = value - return existing + reg._attributes[(key, self.path)] = value def setdefault(self, reg, key, value): - reg._attributes.setdefault((key, self.reduced_path), value) + reg._attributes.setdefault((key, self.path), value) def get(self, reg, key, value=None): - key = (key, self.reduced_path) + key = (key, self.path) if key in reg._attributes: return reg._attributes[key] else: @@ -302,17 +298,25 @@ class PathRegistry(object): def length(self): return len(self.path) + def pairs(self): + path = self.path + for i in xrange(0, len(path), 2): + yield path[i], path[i + 1] + def contains_mapper(self, mapper): return mapper in self.path def contains(self, reg, key): - return (key, self.reduced_path) in reg._attributes + return (key, self.path) in reg._attributes + + def __reduce__(self): + return _unreduce_path, (self.serialize(), ) def serialize(self): path = self.path return zip( [m.class_ for m in [path[i] for i in range(0, len(path), 2)]], - [path[i] for i in range(1, len(path), 2)] + [None] + [path[i].key for i in range(1, len(path), 2)] + [None] ) @classmethod @@ -320,7 +324,10 @@ class PathRegistry(object): if path is None: return None - p = tuple(chain(*[(class_mapper(mcls), key) for mcls, key in path])) + p = tuple(chain(*[(class_mapper(mcls), + class_mapper(mcls).attrs[key] + if key is not None else None) + for mcls, key in path])) if p and p[-1] is None: p = p[0:-1] return cls.coerce(p) @@ -337,7 +344,7 @@ class PathRegistry(object): @classmethod def token(cls, token): - return KeyRegistry(cls.root, token) + return TokenRegistry(cls.root, token) def __add__(self, other): return util.reduce( @@ -354,19 +361,36 @@ class RootRegistry(PathRegistry): """ path 
= () - reduced_path = () - def __getitem__(self, mapper): - return mapper._sa_path_registry + def __getitem__(self, entity): + return entity._path_registry PathRegistry.root = RootRegistry() +class TokenRegistry(PathRegistry): + def __init__(self, parent, token): + self.token = token + self.parent = parent + self.path = parent.path + (token,) -class KeyRegistry(PathRegistry): - def __init__(self, parent, key): - self.key = key + def __getitem__(self, entity): + raise NotImplementedError() + +class PropRegistry(PathRegistry): + def __init__(self, parent, prop): + # restate this path in terms of the + # given MapperProperty's parent. + insp = inspection.inspect(parent[-1]) + if not insp.is_aliased_class or insp._use_mapper_path: + parent = parent.parent[prop.parent] + elif insp.is_aliased_class and insp.with_polymorphic_mappers: + if prop.parent is not insp.mapper and \ + prop.parent in insp.with_polymorphic_mappers: + subclass_entity = parent[-1]._entity_for_mapper(prop.parent) + parent = parent.parent[subclass_entity] + + self.prop = prop self.parent = parent - self.path = parent.path + (key,) - self.reduced_path = parent.reduced_path + (key,) + self.path = parent.path + (prop,) def __getitem__(self, entity): if isinstance(entity, (int, slice)): @@ -381,15 +405,11 @@ class EntityRegistry(PathRegistry, dict): is_aliased_class = False def __init__(self, parent, entity): - self.key = reduced_key = entity + self.key = entity self.parent = parent - if hasattr(entity, 'base_mapper'): - reduced_key = entity.base_mapper - else: - self.is_aliased_class = True + self.is_aliased_class = entity.is_aliased_class self.path = parent.path + (entity,) - self.reduced_path = parent.reduced_path + (reduced_key,) def __nonzero__(self): return True @@ -400,8 +420,26 @@ class EntityRegistry(PathRegistry, dict): else: return dict.__getitem__(self, entity) + def _inlined_get_for(self, prop, context, key): + """an inlined version of: + + cls = path[mapperproperty].get(context, key) + + 
Skips the isinstance() check in __getitem__ + and the extra method call for get(). + Used by StrategizedProperty for its + very frequent lookup. + + """ + path = dict.__getitem__(self, prop) + path_key = (key, path.path) + if path_key in context._attributes: + return context._attributes[path_key] + else: + return None + def __missing__(self, key): - self[key] = item = KeyRegistry(self, key) + self[key] = item = PropRegistry(self, key) return item @@ -448,8 +486,11 @@ class AliasedClass(object): def __init__(self, cls, alias=None, name=None, adapt_on_names=False, + # TODO: None for default here? with_polymorphic_mappers=(), - with_polymorphic_discriminator=None): + with_polymorphic_discriminator=None, + base_alias=None, + use_mapper_path=False): mapper = _class_to_mapper(cls) if alias is None: alias = mapper._with_polymorphic_selectable.alias(name=name) @@ -458,11 +499,19 @@ class AliasedClass(object): mapper, alias, name, - with_polymorphic_mappers, + with_polymorphic_mappers + if with_polymorphic_mappers + else mapper.with_polymorphic_mappers, with_polymorphic_discriminator + if with_polymorphic_discriminator is not None + else mapper.polymorphic_on, + base_alias, + use_mapper_path ) + self._setup(self._aliased_insp, adapt_on_names) + def _setup(self, aliased_insp, adapt_on_names): self.__adapt_on_names = adapt_on_names mapper = aliased_insp.mapper @@ -473,18 +522,13 @@ class AliasedClass(object): equivalents=mapper._equivalent_columns, adapt_on_names=self.__adapt_on_names) for poly in aliased_insp.with_polymorphic_mappers: - setattr(self, poly.class_.__name__, - AliasedClass(poly.class_, alias)) + if poly is not mapper: + setattr(self, poly.class_.__name__, + AliasedClass(poly.class_, alias, base_alias=self, + use_mapper_path=self._aliased_insp._use_mapper_path)) - # used to assign a name to the RowTuple object - # returned by Query. 
- self._sa_label_name = aliased_insp.name self.__name__ = 'AliasedClass_%s' % self.__target.__name__ - @util.memoized_property - def _sa_path_registry(self): - return PathRegistry.per_mapper(self) - def __getstate__(self): return { 'mapper': self._aliased_insp.mapper, @@ -494,7 +538,9 @@ class AliasedClass(object): 'with_polymorphic_mappers': self._aliased_insp.with_polymorphic_mappers, 'with_polymorphic_discriminator': - self._aliased_insp.polymorphic_on + self._aliased_insp.polymorphic_on, + 'base_alias': self._aliased_insp._base_alias.entity, + 'use_mapper_path': self._aliased_insp._use_mapper_path } def __setstate__(self, state): @@ -503,8 +549,10 @@ class AliasedClass(object): state['mapper'], state['alias'], state['name'], - state.get('with_polymorphic_mappers'), - state.get('with_polymorphic_discriminator') + state['with_polymorphic_mappers'], + state['with_polymorphic_discriminator'], + state['base_alias'], + state['use_mapper_path'] ) self._setup(self._aliased_insp, state['adapt_on_names']) @@ -521,7 +569,7 @@ class AliasedClass(object): queryattr = attributes.QueryableAttribute( self, key, impl=existing.impl, - parententity=self, + parententity=self._aliased_insp, comparator=comparator) setattr(self, key, queryattr) return queryattr @@ -558,17 +606,7 @@ class AliasedClass(object): id(self), self.__target.__name__) -AliasedInsp = util.namedtuple("AliasedInsp", [ - "entity", - "mapper", - "selectable", - "name", - "with_polymorphic_mappers", - "polymorphic_on" - ]) - - -class AliasedInsp(_InspectionAttr, AliasedInsp): +class AliasedInsp(_InspectionAttr): """Provide an inspection interface for an :class:`.AliasedClass` object. 
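[editor's note, not part of the patch] The new `__reduce__`/`serialize`/`deserialize` trio on `PathRegistry` above follows a standard pickling pattern: hand pickle a module-level reconstruction function plus a serialized form, so the object is rebuilt through `deserialize()` rather than by copying raw state. A self-contained sketch of that pattern (class and function names here are illustrative):

```python
# Minimal sketch of the __reduce__ pickling pattern added to PathRegistry:
# pickle stores a module-level function plus serialized data, and calls
# that function on unpickling to reconstruct the object.

import pickle

def _unreduce_path(serialized):
    return Path.deserialize(serialized)

class Path:
    def __init__(self, elements):
        self.elements = tuple(elements)

    def serialize(self):
        # reduce to plain, picklable values
        return list(self.elements)

    @classmethod
    def deserialize(cls, serialized):
        return cls(serialized)

    def __reduce__(self):
        # (callable, args) -- pickle calls _unreduce_path(*args) to rebuild
        return _unreduce_path, (self.serialize(),)

p = Path(["User", "addresses"])
p2 = pickle.loads(pickle.dumps(p))
print(p2.elements)  # ('User', 'addresses')
```

The same shape lets the real `PathRegistry` serialize mappers down to plain classes and attribute keys, then rebuild full mapper/property paths on the other side.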
@@ -604,6 +642,22 @@ class AliasedInsp(_InspectionAttr, AliasedInsp): """ + def __init__(self, entity, mapper, selectable, name, + with_polymorphic_mappers, polymorphic_on, + _base_alias, _use_mapper_path): + self.entity = entity + self.mapper = mapper + self.selectable = selectable + self.name = name + self.with_polymorphic_mappers = with_polymorphic_mappers + self.polymorphic_on = polymorphic_on + + # a little dance to get serialization to work + self._base_alias = _base_alias._aliased_insp if _base_alias \ + and _base_alias is not entity else self + self._use_mapper_path = _use_mapper_path + + is_aliased_class = True "always returns True" @@ -613,8 +667,29 @@ class AliasedInsp(_InspectionAttr, AliasedInsp): :class:`.AliasedInsp`.""" return self.mapper.class_ + @util.memoized_property + def _path_registry(self): + if self._use_mapper_path: + return self.mapper._path_registry + else: + return PathRegistry.per_mapper(self) + + def _entity_for_mapper(self, mapper): + self_poly = self.with_polymorphic_mappers + if mapper in self_poly: + return getattr(self.entity, mapper.class_.__name__)._aliased_insp + elif mapper.isa(self.mapper): + return self + else: + assert False, "mapper %s doesn't correspond to %s" % (mapper, self) + + def __repr__(self): + return '<AliasedInsp at 0x%x; %s>' % ( + id(self), self.class_.__name__) + inspection._inspects(AliasedClass)(lambda target: target._aliased_insp) +inspection._inspects(AliasedInsp)(lambda target: target) def aliased(element, alias=None, name=None, adapt_on_names=False): @@ -699,7 +774,7 @@ def aliased(element, alias=None, name=None, adapt_on_names=False): def with_polymorphic(base, classes, selectable=False, polymorphic_on=None, aliased=False, - innerjoin=False): + innerjoin=False, _use_mapper_path=False): """Produce an :class:`.AliasedClass` construct which specifies columns for descendant mappers of the given base. 
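[editor's note, not part of the patch] The two `inspection._inspects(...)` registrations above (one for `AliasedClass`, a new one making `AliasedInsp` inspect to itself) use a type-keyed dispatch registry. A rough, self-contained sketch of that mechanism under illustrative names:

```python
# Rough sketch of the inspection dispatch: a registry maps a class to a
# function producing its "inspected" form. Inspecting the aliased object
# yields its companion insp object; inspecting the insp object returns it
# unchanged (the new self-inspection hook).

_registry = {}

def _inspects(cls):
    def decorate(fn):
        _registry[cls] = fn
        return fn
    return decorate

def inspect(subject):
    # walk the MRO so subclasses inherit their base's inspector
    for cls in type(subject).__mro__:
        if cls in _registry:
            return _registry[cls](subject)
    raise TypeError("no inspection system is available for %r" % subject)

class MyInsp:
    def __init__(self, entity):
        self.entity = entity

class MyAliased:
    def __init__(self):
        self._insp = MyInsp(self)

_inspects(MyAliased)(lambda target: target._insp)
_inspects(MyInsp)(lambda target: target)

a = MyAliased()
insp = inspect(a)
print(insp is inspect(insp))  # True
```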
@@ -758,7 +833,8 @@ def with_polymorphic(base, classes, selectable=False, return AliasedClass(base, selectable, with_polymorphic_mappers=mappers, - with_polymorphic_discriminator=polymorphic_on) + with_polymorphic_discriminator=polymorphic_on, + use_mapper_path=_use_mapper_path) def _orm_annotate(element, exclude=None): @@ -1109,6 +1185,7 @@ def _entity_descriptor(entity, key): description = entity entity = insp.c elif insp.is_aliased_class: + entity = insp.entity description = entity elif hasattr(insp, "mapper"): description = entity = insp.mapper.class_ diff --git a/lib/sqlalchemy/schema.py b/lib/sqlalchemy/schema.py index f6a6b83b4..2fb542a43 100644 --- a/lib/sqlalchemy/schema.py +++ b/lib/sqlalchemy/schema.py @@ -146,7 +146,7 @@ class Table(SchemaItem, expression.TableClause): table. The metadata is used as a point of association of this table with other tables which are referenced via foreign key. It also may be used to associate this table with a particular - :class:`~sqlalchemy.engine.base.Connectable`. + :class:`.Connectable`. :param \*args: Additional positional arguments are used primarily to add the list of :class:`.Column` objects contained within this @@ -240,7 +240,7 @@ class Table(SchemaItem, expression.TableClause): This alternate hook to :func:`.event.listen` allows the establishment of a listener function specific to this :class:`.Table` before the "autoload" process begins. Particularly useful for - the :meth:`.events.column_reflect` event:: + the :meth:`.DDLEvents.column_reflect` event:: def listen_for_reflect(table, column_info): "handle the column reflection event" @@ -254,7 +254,7 @@ class Table(SchemaItem, expression.TableClause): ]) :param mustexist: When ``True``, indicates that this Table must already - be present in the given :class:`.MetaData`` collection, else + be present in the given :class:`.MetaData` collection, else an exception is raised. 
:param prefixes: @@ -2600,9 +2600,8 @@ class MetaData(SchemaItem): in this ``MetaData`` no longer exists in the database. :param bind: - A :class:`~sqlalchemy.engine.base.Connectable` used to access the - database; if None, uses the existing bind on this ``MetaData``, if - any. + A :class:`.Connectable` used to access the database; if None, uses + the existing bind on this ``MetaData``, if any. :param schema: Optional, query and reflect tables from an alterate schema. @@ -2689,7 +2688,7 @@ class MetaData(SchemaItem): present in the target database. :param bind: - A :class:`~sqlalchemy.engine.base.Connectable` used to access the + A :class:`.Connectable` used to access the database; if None, uses the existing bind on this ``MetaData``, if any. @@ -2716,7 +2715,7 @@ class MetaData(SchemaItem): the target database. :param bind: - A :class:`~sqlalchemy.engine.base.Connectable` used to access the + A :class:`.Connectable` used to access the database; if None, uses the existing bind on this ``MetaData``, if any. @@ -2858,14 +2857,14 @@ class DDLElement(expression.Executable, _DDLCompiles): """Execute this DDL immediately. Executes the DDL statement in isolation using the supplied - :class:`~sqlalchemy.engine.base.Connectable` or - :class:`~sqlalchemy.engine.base.Connectable` assigned to the ``.bind`` + :class:`.Connectable` or + :class:`.Connectable` assigned to the ``.bind`` property, if not supplied. If the DDL has a conditional ``on`` criteria, it will be invoked with None as the event. :param bind: Optional, an ``Engine`` or ``Connection``. If not supplied, a valid - :class:`~sqlalchemy.engine.base.Connectable` must be present in the + :class:`.Connectable` must be present in the ``.bind`` property. :param target: @@ -3146,7 +3145,7 @@ class DDL(DDLElement): available for use in string substitutions on the DDL statement. :param bind: - Optional. A :class:`~sqlalchemy.engine.base.Connectable`, used by + Optional. 
A :class:`.Connectable`, used by default when ``execute()`` is invoked without a bind argument. diff --git a/lib/sqlalchemy/sql/compiler.py b/lib/sqlalchemy/sql/compiler.py index 3856499fc..b7dc03414 100644 --- a/lib/sqlalchemy/sql/compiler.py +++ b/lib/sqlalchemy/sql/compiler.py @@ -2100,7 +2100,15 @@ class GenericTypeCompiler(engine.TypeCompiler): 'scale': type_.scale} def visit_DECIMAL(self, type_): - return "DECIMAL" + if type_.precision is None: + return "DECIMAL" + elif type_.scale is None: + return "DECIMAL(%(precision)s)" % \ + {'precision': type_.precision} + else: + return "DECIMAL(%(precision)s, %(scale)s)" % \ + {'precision': type_.precision, + 'scale': type_.scale} def visit_INTEGER(self, type_): return "INTEGER" diff --git a/lib/sqlalchemy/sql/expression.py b/lib/sqlalchemy/sql/expression.py index 42d6393ed..d3379bce5 100644 --- a/lib/sqlalchemy/sql/expression.py +++ b/lib/sqlalchemy/sql/expression.py @@ -1866,10 +1866,10 @@ class ClauseElement(Visitable): def compile(self, bind=None, dialect=None, **kw): """Compile this SQL expression. - The return value is a :class:`~sqlalchemy.engine.Compiled` object. + The return value is a :class:`~.Compiled` object. Calling ``str()`` or ``unicode()`` on the returned value will yield a string representation of the result. The - :class:`~sqlalchemy.engine.Compiled` object also can return a + :class:`~.Compiled` object also can return a dictionary of bind parameter names and values using the ``params`` accessor. 
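[editor's note, not part of the patch] The `visit_DECIMAL` hunk above replaces the bare `"DECIMAL"` with three branches: precision and scale are emitted only when present, and scale is never emitted without precision. The rendering rule, extracted as a standalone sketch (the function name is illustrative):

```python
# Sketch of the new DECIMAL rendering rule in GenericTypeCompiler:
# mirror the three branches of the diff -- no precision, precision only,
# or precision plus scale.

def render_decimal(precision=None, scale=None):
    if precision is None:
        return "DECIMAL"
    elif scale is None:
        return "DECIMAL(%(precision)s)" % {'precision': precision}
    else:
        return "DECIMAL(%(precision)s, %(scale)s)" % {
            'precision': precision, 'scale': scale}

print(render_decimal())        # DECIMAL
print(render_decimal(10))      # DECIMAL(10)
print(render_decimal(10, 2))   # DECIMAL(10, 2)
```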
@@ -3723,6 +3723,7 @@ class BinaryExpression(ColumnElement): # refer to BinaryExpression directly and pass strings if isinstance(operator, basestring): operator = operators.custom_op(operator) + self._orig = (left, right) self.left = _literal_as_text(left).self_group(against=operator) self.right = _literal_as_text(right).self_group(against=operator) self.operator = operator @@ -3735,9 +3736,9 @@ class BinaryExpression(ColumnElement): self.modifiers = modifiers def __nonzero__(self): - try: - return self.operator(hash(self.left), hash(self.right)) - except: + if self.operator in (operator.eq, operator.ne): + return self.operator(hash(self._orig[0]), hash(self._orig[1])) + else: raise TypeError("Boolean value of this clause is not defined") @property diff --git a/test/aaa_profiling/test_memusage.py b/test/aaa_profiling/test_memusage.py index fca0635d2..54c7d7ecc 100644 --- a/test/aaa_profiling/test_memusage.py +++ b/test/aaa_profiling/test_memusage.py @@ -426,12 +426,14 @@ class MemUsageTest(EnsureZeroed): metadata = MetaData() a = Table("a", metadata, Column('id', Integer, primary_key=True), + Column('foo', Integer), + Column('bar', Integer) ) m1 = mapper(A, a) @profile_memory() def go(): - ma = aliased(A) - m1._sa_path_registry['foo'][ma]['bar'] + ma = sa.inspect(aliased(A)) + m1._path_registry[m1.attrs.foo][ma][m1.attrs.bar] go() clear_mappers() diff --git a/test/dialect/test_oracle.py b/test/dialect/test_oracle.py index 3e7ebf012..80ab91a91 100644 --- a/test/dialect/test_oracle.py +++ b/test/dialect/test_oracle.py @@ -781,6 +781,76 @@ class ConstraintTest(fixtures.TestBase): onupdate='CASCADE')) assert_raises(exc.SAWarning, bat.create) + +class TwoPhaseTest(fixtures.TablesTest): + """test cx_oracle two phase, which remains in a semi-broken state + so requires a carefully written test.""" + + __only_on__ = 'oracle+cx_oracle' + + @classmethod + def define_tables(cls, metadata): + Table('datatable', metadata, + Column('id', Integer, primary_key=True), + 
Column('data', String(50)) + ) + + def _connection(self): + conn = testing.db.connect() + conn.detach() + return conn + + def _assert_data(self, rows): + eq_( + testing.db.scalar("select count(*) from datatable"), + rows + ) + def test_twophase_prepare_false(self): + conn = self._connection() + for i in xrange(2): + trans = conn.begin_twophase() + conn.execute("select 1 from dual") + trans.prepare() + trans.commit() + conn.close() + self._assert_data(0) + + def test_twophase_prepare_true(self): + conn = self._connection() + for i in xrange(2): + trans = conn.begin_twophase() + conn.execute("insert into datatable (id, data) " + "values (%s, 'somedata')" % i) + trans.prepare() + trans.commit() + conn.close() + self._assert_data(2) + + def test_twophase_rollback(self): + conn = self._connection() + trans = conn.begin_twophase() + conn.execute("insert into datatable (id, data) " + "values (%s, 'somedata')" % 1) + trans.rollback() + + trans = conn.begin_twophase() + conn.execute("insert into datatable (id, data) " + "values (%s, 'somedata')" % 1) + trans.prepare() + trans.commit() + + conn.close() + self._assert_data(1) + + def test_not_prepared(self): + conn = self._connection() + trans = conn.begin_twophase() + conn.execute("insert into datatable (id, data) " + "values (%s, 'somedata')" % 1) + trans.commit() + conn.close() + self._assert_data(1) + class DialectTypesTest(fixtures.TestBase, AssertsCompiledSQL): __dialect__ = oracle.OracleDialect() @@ -1245,6 +1315,21 @@ class TypesTest(fixtures.TestBase): finally: t1.drop() + @testing.provide_metadata + def test_long_type(self): + metadata = self.metadata + + t = Table('t', metadata, + Column('data', oracle.LONG) + ) + metadata.create_all(testing.db) + testing.db.execute(t.insert(), data='xyz') + eq_( + testing.db.scalar(select([t.c.data])), + "xyz" + ) + + def test_longstring(self): metadata = MetaData(testing.db) diff --git a/test/ext/test_mutable.py b/test/ext/test_mutable.py index 916ff9d4b..4516e3ac2 100644 --- 
a/test/ext/test_mutable.py +++ b/test/ext/test_mutable.py @@ -59,35 +59,35 @@ class _MutableDictTestBase(object): assert_raises_message( ValueError, "Attribute 'data' does not accept objects of type", - Foo, data=set([1,2,3]) + Foo, data=set([1, 2, 3]) ) def test_in_place_mutation(self): sess = Session() - f1 = Foo(data={'a':'b'}) + f1 = Foo(data={'a': 'b'}) sess.add(f1) sess.commit() f1.data['a'] = 'c' sess.commit() - eq_(f1.data, {'a':'c'}) + eq_(f1.data, {'a': 'c'}) def test_replace(self): sess = Session() - f1 = Foo(data={'a':'b'}) + f1 = Foo(data={'a': 'b'}) sess.add(f1) sess.flush() - f1.data = {'b':'c'} + f1.data = {'b': 'c'} sess.commit() - eq_(f1.data, {'b':'c'}) + eq_(f1.data, {'b': 'c'}) def test_pickle_parent(self): sess = Session() - f1 = Foo(data={'a':'b'}) + f1 = Foo(data={'a': 'b'}) sess.add(f1) sess.commit() f1.data @@ -102,7 +102,7 @@ class _MutableDictTestBase(object): def test_unrelated_flush(self): sess = Session() - f1 = Foo(data={"a":"b"}, unrelated_data="unrelated") + f1 = Foo(data={"a": "b"}, unrelated_data="unrelated") sess.add(f1) sess.flush() f1.unrelated_data = "unrelated 2" @@ -114,14 +114,14 @@ class _MutableDictTestBase(object): def _test_non_mutable(self): sess = Session() - f1 = Foo(non_mutable_data={'a':'b'}) + f1 = Foo(non_mutable_data={'a': 'b'}) sess.add(f1) sess.commit() f1.non_mutable_data['a'] = 'c' sess.commit() - eq_(f1.non_mutable_data, {'a':'b'}) + eq_(f1.non_mutable_data, {'a': 'b'}) class MutableWithScalarPickleTest(_MutableDictTestBase, fixtures.MappedTest): @classmethod @@ -142,7 +142,7 @@ class MutableWithScalarPickleTest(_MutableDictTestBase, fixtures.MappedTest): class MutableWithScalarJSONTest(_MutableDictTestBase, fixtures.MappedTest): # json introduced in 2.6 - __skip_if__ = lambda : sys.version_info < (2, 6), + __skip_if__ = lambda: sys.version_info < (2, 6), @classmethod def define_tables(cls, metadata): @@ -177,7 +177,6 @@ class MutableWithScalarJSONTest(_MutableDictTestBase, fixtures.MappedTest): class 
MutableAssocWithAttrInheritTest(_MutableDictTestBase, fixtures.MappedTest): @classmethod def define_tables(cls, metadata): - MutableDict = cls._type_fixture() Table('foo', metadata, Column('id', Integer, primary_key=True, test_needs_autoincrement=True), @@ -201,24 +200,24 @@ class MutableAssocWithAttrInheritTest(_MutableDictTestBase, fixtures.MappedTest) def test_in_place_mutation(self): sess = Session() - f1 = SubFoo(data={'a':'b'}) + f1 = SubFoo(data={'a': 'b'}) sess.add(f1) sess.commit() f1.data['a'] = 'c' sess.commit() - eq_(f1.data, {'a':'c'}) + eq_(f1.data, {'a': 'c'}) def test_replace(self): sess = Session() - f1 = SubFoo(data={'a':'b'}) + f1 = SubFoo(data={'a': 'b'}) sess.add(f1) sess.flush() - f1.data = {'b':'c'} + f1.data = {'b': 'c'} sess.commit() - eq_(f1.data, {'b':'c'}) + eq_(f1.data, {'b': 'c'}) class MutableAssociationScalarPickleTest(_MutableDictTestBase, fixtures.MappedTest): @classmethod @@ -235,7 +234,7 @@ class MutableAssociationScalarPickleTest(_MutableDictTestBase, fixtures.MappedTe class MutableAssociationScalarJSONTest(_MutableDictTestBase, fixtures.MappedTest): # json introduced in 2.6 - __skip_if__ = lambda : sys.version_info < (2, 6), + __skip_if__ = lambda: sys.version_info < (2, 6), @classmethod def define_tables(cls, metadata): @@ -259,7 +258,8 @@ class MutableAssociationScalarJSONTest(_MutableDictTestBase, fixtures.MappedTest MutableDict.associate_with(JSONEncodedDict) Table('foo', metadata, - Column('id', Integer, primary_key=True, test_needs_autoincrement=True), + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True), Column('data', JSONEncodedDict), Column('unrelated_data', String(50)) ) @@ -269,12 +269,19 @@ class _CompositeTestBase(object): @classmethod def define_tables(cls, metadata): Table('foo', metadata, - Column('id', Integer, primary_key=True, test_needs_autoincrement=True), + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True), Column('x', Integer), Column('y', Integer), 
Column('unrelated_data', String(50)) ) + def setup(self): + from sqlalchemy.ext import mutable + mutable._setup_composite_listener() + super(_CompositeTestBase, self).setup() + + def teardown(self): # clear out mapper events Mapper.dispatch._clear() @@ -284,7 +291,6 @@ class _CompositeTestBase(object): @classmethod def _type_fixture(cls): - from sqlalchemy.ext.mutable import Mutable from sqlalchemy.ext.mutable import MutableComposite global Point @@ -322,7 +328,7 @@ class MutableCompositesUnpickleTest(_CompositeTestBase, fixtures.MappedTest): cls.Point = cls._type_fixture() mapper(FooWithEq, foo, properties={ - 'data':composite(cls.Point, foo.c.x, foo.c.y) + 'data': composite(cls.Point, foo.c.x, foo.c.y) }) def test_unpickle_modified_eq(self): @@ -339,7 +345,7 @@ class MutableCompositesTest(_CompositeTestBase, fixtures.MappedTest): Point = cls._type_fixture() mapper(Foo, foo, properties={ - 'data':composite(Point, foo.c.x, foo.c.y) + 'data': composite(Point, foo.c.x, foo.c.y) }) def test_in_place_mutation(self): @@ -403,6 +409,71 @@ class MutableCompositesTest(_CompositeTestBase, fixtures.MappedTest): eq_(f1.data.x, 5) +class MutableCompositeCustomCoerceTest(_CompositeTestBase, fixtures.MappedTest): + @classmethod + def _type_fixture(cls): + + from sqlalchemy.ext.mutable import MutableComposite + + global Point + + class Point(MutableComposite): + def __init__(self, x, y): + self.x = x + self.y = y + + @classmethod + def coerce(cls, key, value): + if isinstance(value, tuple): + value = Point(*value) + return value + + def __setattr__(self, key, value): + object.__setattr__(self, key, value) + self.changed() + + def __composite_values__(self): + return self.x, self.y + + def __getstate__(self): + return self.x, self.y + + def __setstate__(self, state): + self.x, self.y = state + + def __eq__(self, other): + return isinstance(other, Point) and \ + other.x == self.x and \ + other.y == self.y + return Point + + + @classmethod + def setup_mappers(cls): + foo = 
cls.tables.foo + + Point = cls._type_fixture() + + mapper(Foo, foo, properties={ + 'data': composite(Point, foo.c.x, foo.c.y) + }) + + def test_custom_coerce(self): + f = Foo() + f.data = (3, 4) + eq_(f.data, Point(3, 4)) + + def test_round_trip_ok(self): + sess = Session() + f = Foo() + f.data = (3, 4) + + sess.add(f) + sess.commit() + + eq_(f.data, Point(3, 4)) + + class MutableInheritedCompositesTest(_CompositeTestBase, fixtures.MappedTest): @classmethod def define_tables(cls, metadata): @@ -423,7 +494,7 @@ class MutableInheritedCompositesTest(_CompositeTestBase, fixtures.MappedTest): Point = cls._type_fixture() mapper(Foo, foo, properties={ - 'data':composite(Point, foo.c.x, foo.c.y) + 'data': composite(Point, foo.c.x, foo.c.y) }) mapper(SubFoo, subfoo, inherits=Foo) diff --git a/test/orm/inheritance/test_polymorphic_rel.py b/test/orm/inheritance/test_polymorphic_rel.py index 8bbde972d..e22848912 100644 --- a/test/orm/inheritance/test_polymorphic_rel.py +++ b/test/orm/inheritance/test_polymorphic_rel.py @@ -496,41 +496,10 @@ class _PolymorphicTestBase(object): .all(), expected) - # TODO: this fails due to the change - # in _configure_subclass_mapper. however we might not - # need it anymore. - def test_polymorphic_option(self): - """ - Test that polymorphic loading sets state.load_path with its - actual mapper on a subclass, and not the superclass mapper. - - This only works for non-aliased mappers. 
- """ - paths = [] - class MyOption(interfaces.MapperOption): - propagate_to_loaders = True - def process_query_conditionally(self, query): - paths.append(query._current_path.path) - - sess = create_session() - names = ['dilbert', 'pointy haired boss'] - dilbert, boss = ( - sess.query(Person) - .options(MyOption()) - .filter(Person.name.in_(names)) - .order_by(Person.name).all()) - - dilbert.machines - boss.paperwork - - eq_(paths, - [(class_mapper(Engineer), 'machines'), - (class_mapper(Boss), 'paperwork')]) def test_subclass_option_pathing(self): from sqlalchemy.orm import defer sess = create_session() - names = ['dilbert', 'pointy haired boss'] dilbert = sess.query(Person).\ options(defer(Engineer.machines, Machine.name)).\ filter(Person.name == 'dilbert').first() @@ -963,7 +932,7 @@ class _PolymorphicTestBase(object): .filter(palias.name.in_(['dilbert', 'wally'])).all(), [e1, e2]) - def test_self_referential(self): + def test_self_referential_one(self): sess = create_session() palias = aliased(Person) expected = [(m1, e1), (m1, e2), (m1, b1)] @@ -975,6 +944,11 @@ class _PolymorphicTestBase(object): .order_by(Person.person_id, palias.person_id).all(), expected) + def test_self_referential_two(self): + sess = create_session() + palias = aliased(Person) + expected = [(m1, e1), (m1, e2), (m1, b1)] + eq_(sess.query(Person, palias) .filter(Person.company_id == palias.company_id) .filter(Person.name == 'dogbert') diff --git a/test/orm/inheritance/test_relationship.py b/test/orm/inheritance/test_relationship.py index a630e2f3d..509d540ef 100644 --- a/test/orm/inheritance/test_relationship.py +++ b/test/orm/inheritance/test_relationship.py @@ -1,6 +1,6 @@ from sqlalchemy.orm import create_session, relationship, mapper, \ contains_eager, joinedload, subqueryload, subqueryload_all,\ - Session, aliased + Session, aliased, with_polymorphic from sqlalchemy import Integer, String, ForeignKey from sqlalchemy.engine import default @@ -717,15 +717,29 @@ class 
EagerToSubclassTest(fixtures.MappedTest): def test_subq_through_related(self): Parent = self.classes.Parent - Sub = self.classes.Sub + Base = self.classes.Base sess = Session() + def go(): eq_(sess.query(Parent) - .options(subqueryload_all(Parent.children, Sub.related)) + .options(subqueryload_all(Parent.children, Base.related)) .order_by(Parent.data).all(), [p1, p2]) self.assert_sql_count(testing.db, go, 3) + def test_subq_through_related_aliased(self): + Parent = self.classes.Parent + Base = self.classes.Base + pa = aliased(Parent) + sess = Session() + + def go(): + eq_(sess.query(pa) + .options(subqueryload_all(pa.children, Base.related)) + .order_by(pa.data).all(), + [p1, p2]) + self.assert_sql_count(testing.db, go, 3) + class SubClassEagerToSubClassTest(fixtures.MappedTest): """Test joinedloads from subclass to subclass mappers""" @@ -876,3 +890,226 @@ class SubClassEagerToSubClassTest(fixtures.MappedTest): [p1, p2]) self.assert_sql_count(testing.db, go, 2) +class SameNamedPropTwoPolymorphicSubClassesTest(fixtures.MappedTest): + """test pathing when two subclasses contain a different property + for the same name, and polymorphic loading is used. 
+ + #2614 + + """ + run_setup_classes = 'once' + run_setup_mappers = 'once' + run_inserts = 'once' + run_deletes = None + + @classmethod + def define_tables(cls, metadata): + Table('a', metadata, + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True), + Column('type', String(10)) + ) + Table('b', metadata, + Column('id', Integer, ForeignKey('a.id'), primary_key=True) + ) + Table('btod', metadata, + Column('bid', Integer, ForeignKey('b.id'), nullable=False), + Column('did', Integer, ForeignKey('d.id'), nullable=False) + ) + Table('c', metadata, + Column('id', Integer, ForeignKey('a.id'), primary_key=True) + ) + Table('ctod', metadata, + Column('cid', Integer, ForeignKey('c.id'), nullable=False), + Column('did', Integer, ForeignKey('d.id'), nullable=False) + ) + Table('d', metadata, + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True) + ) + + @classmethod + def setup_classes(cls): + class A(cls.Comparable): + pass + class B(A): + pass + class C(A): + pass + class D(cls.Comparable): + pass + + @classmethod + def setup_mappers(cls): + A = cls.classes.A + B = cls.classes.B + C = cls.classes.C + D = cls.classes.D + + mapper(A, cls.tables.a, polymorphic_on=cls.tables.a.c.type) + mapper(B, cls.tables.b, inherits=A, polymorphic_identity='b', + properties={ + 'related': relationship(D, secondary=cls.tables.btod) + }) + mapper(C, cls.tables.c, inherits=A, polymorphic_identity='c', + properties={ + 'related': relationship(D, secondary=cls.tables.ctod) + }) + mapper(D, cls.tables.d) + + + @classmethod + def insert_data(cls): + B = cls.classes.B + C = cls.classes.C + D = cls.classes.D + + session = Session() + + d = D() + session.add_all([ + B(related=[d]), + C(related=[d]) + ]) + session.commit() + + def test_free_w_poly_subquery(self): + A = self.classes.A + B = self.classes.B + C = self.classes.C + D = self.classes.D + + session = Session() + d = session.query(D).one() + a_poly = with_polymorphic(A, [B, C]) + def go(): + for a in 
session.query(a_poly).\ + options( + subqueryload(a_poly.B.related), + subqueryload(a_poly.C.related)): + eq_(a.related, [d]) + self.assert_sql_count(testing.db, go, 3) + + def test_fixed_w_poly_subquery(self): + A = self.classes.A + B = self.classes.B + C = self.classes.C + D = self.classes.D + + session = Session() + d = session.query(D).one() + def go(): + for a in session.query(A).with_polymorphic([B, C]).\ + options(subqueryload(B.related), subqueryload(C.related)): + eq_(a.related, [d]) + self.assert_sql_count(testing.db, go, 3) + + def test_free_w_poly_joined(self): + A = self.classes.A + B = self.classes.B + C = self.classes.C + D = self.classes.D + + session = Session() + d = session.query(D).one() + a_poly = with_polymorphic(A, [B, C]) + def go(): + for a in session.query(a_poly).\ + options( + joinedload(a_poly.B.related), + joinedload(a_poly.C.related)): + eq_(a.related, [d]) + self.assert_sql_count(testing.db, go, 1) + + def test_fixed_w_poly_joined(self): + A = self.classes.A + B = self.classes.B + C = self.classes.C + D = self.classes.D + + session = Session() + d = session.query(D).one() + def go(): + for a in session.query(A).with_polymorphic([B, C]).\ + options(joinedload(B.related), joinedload(C.related)): + eq_(a.related, [d]) + self.assert_sql_count(testing.db, go, 1) + + +class SubClassToSubClassFromParentTest(fixtures.MappedTest): + """test #2617 + + """ + run_setup_classes = 'once' + run_setup_mappers = 'once' + run_inserts = 'once' + run_deletes = None + + @classmethod + def define_tables(cls, metadata): + Table('z', metadata, + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True) + ) + Table('a', metadata, + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True), + Column('type', String(10)), + Column('z_id', Integer, ForeignKey('z.id')) + ) + Table('b', metadata, + Column('id', Integer, ForeignKey('a.id'), primary_key=True) + ) + Table('d', metadata, + Column('id', Integer, ForeignKey('a.id'), 
primary_key=True), + Column('b_id', Integer, ForeignKey('b.id')) + ) + + @classmethod + def setup_classes(cls): + class Z(cls.Comparable): + pass + class A(cls.Comparable): + pass + class B(A): + pass + class D(A): + pass + + @classmethod + def setup_mappers(cls): + Z = cls.classes.Z + A = cls.classes.A + B = cls.classes.B + D = cls.classes.D + + mapper(Z, cls.tables.z) + mapper(A, cls.tables.a, polymorphic_on=cls.tables.a.c.type, + with_polymorphic='*', + properties={ + 'zs': relationship(Z, lazy="subquery") + }) + mapper(B, cls.tables.b, inherits=A, polymorphic_identity='b', + properties={ + 'related': relationship(D, lazy="subquery", + primaryjoin=cls.tables.d.c.b_id == + cls.tables.b.c.id) + }) + mapper(D, cls.tables.d, inherits=A, polymorphic_identity='d') + + + @classmethod + def insert_data(cls): + B = cls.classes.B + + session = Session() + session.add(B()) + session.commit() + + def test_2617(self): + A = self.classes.A + session = Session() + def go(): + a1 = session.query(A).first() + eq_(a1.related, []) + self.assert_sql_count(testing.db, go, 3) diff --git a/test/orm/inheritance/test_single.py b/test/orm/inheritance/test_single.py index 08a693b92..de6e55e95 100644 --- a/test/orm/inheritance/test_single.py +++ b/test/orm/inheritance/test_single.py @@ -278,7 +278,8 @@ class RelationshipFromSingleTest(testing.AssertsCompiledSQL, fixtures.MappedTest sess = create_session() context = sess.query(Manager).options(subqueryload('stuff'))._compile_context() - subq = context.attributes[('subquery', (class_mapper(Employee), 'stuff'))] + subq = context.attributes[('subquery', + (class_mapper(Manager), class_mapper(Manager).attrs.stuff))] self.assert_compile(subq, 'SELECT employee_stuff.id AS ' diff --git a/test/orm/test_eager_relations.py b/test/orm/test_eager_relations.py index ae3853e75..2c59491b1 100644 --- a/test/orm/test_eager_relations.py +++ b/test/orm/test_eager_relations.py @@ -311,6 +311,8 @@ class EagerTest(_fixtures.FixtureTest, 
testing.AssertsCompiledSQL): }) mapper(Keyword, keywords) + + for opt, count in [ (( joinedload(User.orders, Order.items), @@ -2662,128 +2664,3 @@ class CyclicalInheritingEagerTestTwo(fixtures.DeclarativeMappedTest, assert len(list(session)) == 3 - -class WarnFor2614TestBase(object): - - @classmethod - def define_tables(cls, metadata): - Table('a', metadata, - Column('id', Integer, primary_key=True), - Column('type', String(50)), - ) - Table('b', metadata, - Column('id', Integer, ForeignKey('a.id'), primary_key=True), - ) - Table('c', metadata, - Column('id', Integer, ForeignKey('a.id'), primary_key=True), - ) - Table('d', metadata, - Column('id', Integer, primary_key=True), - Column('bid', Integer, ForeignKey('b.id')), - Column('cid', Integer, ForeignKey('c.id')), - ) - - def _mapping(self, lazy_b=True, lazy_c=True): - class A(object): - pass - - class B(A): - pass - - class C(A): - pass - - class D(object): - pass - - mapper(A, self.tables.a, polymorphic_on=self.tables.a.c.type) - mapper(B, self.tables.b, inherits=A, polymorphic_identity='b', - properties={ - 'ds': relationship(D, lazy=lazy_b) - }) - mapper(C, self.tables.c, inherits=A, polymorphic_identity='c', - properties={ - 'ds': relationship(D, lazy=lazy_c) - }) - mapper(D, self.tables.d) - - - return A, B, C, D - - def _assert_raises(self, fn): - assert_raises_message( - sa.exc.InvalidRequestError, - "Eager loading cannot currently function correctly when two or more " - r"same\-named attributes associated with multiple polymorphic classes " - "of the same base are present. 
Encountered more than one " - r"eager path for attribute 'ds' on mapper 'Mapper\|A\|a'.", - fn - ) - - def test_poly_both_eager(self): - A, B, C, D = self._mapping(lazy_b=self.eager_name, - lazy_c=self.eager_name) - - session = Session() - self._assert_raises( - session.query(A).with_polymorphic('*').all - ) - - def test_poly_one_eager(self): - A, B, C, D = self._mapping(lazy_b=self.eager_name, lazy_c=True) - - session = Session() - session.query(A).with_polymorphic('*').all() - - def test_poly_both_option(self): - A, B, C, D = self._mapping() - - session = Session() - self._assert_raises( - session.query(A).with_polymorphic('*').options( - self.eager_option(B.ds), self.eager_option(C.ds)).all - ) - - def test_poly_one_option_bs(self): - A, B, C, D = self._mapping() - - session = Session() - - # sucks, can't even do eager() on just one of them, as B.ds - # hits for both - self._assert_raises( - session.query(A).with_polymorphic('*').\ - options(self.eager_option(B.ds)).all - ) - - def test_poly_one_option_cs(self): - A, B, C, D = self._mapping() - - session = Session() - - # sucks, can't even do eager() on just one of them, as B.ds - # hits for both - self._assert_raises( - session.query(A).with_polymorphic('*').\ - options(self.eager_option(C.ds)).all - ) - - def test_single_poly_one_option_bs(self): - A, B, C, D = self._mapping() - - session = Session() - - session.query(A).with_polymorphic(B).\ - options(self.eager_option(B.ds)).all() - - def test_lazy_True(self): - A, B, C, D = self._mapping() - - session = Session() - session.query(A).with_polymorphic('*').all() - -class WarnFor2614Test(WarnFor2614TestBase, fixtures.MappedTest): - eager_name = "joined" - - def eager_option(self, arg): - return joinedload(arg) diff --git a/test/orm/test_inspect.py b/test/orm/test_inspect.py index fc2d2181f..fdf675183 100644 --- a/test/orm/test_inspect.py +++ b/test/orm/test_inspect.py @@ -113,7 +113,7 @@ class TestORMInspection(_fixtures.FixtureTest): def 
test_with_polymorphic(self): User = self.classes.User insp = inspect(User) - eq_(insp.with_polymorphic_mappers, [insp]) + eq_(insp.with_polymorphic_mappers, []) def test_col_property(self): User = self.classes.User @@ -198,7 +198,7 @@ class TestORMInspection(_fixtures.FixtureTest): is_(prop._parentmapper, class_mapper(User)) is_(prop.mapper, class_mapper(Address)) - is_(prop._parententity, ua) + is_(prop._parententity, inspect(ua)) def test_insp_column_prop(self): User = self.classes.User @@ -222,7 +222,7 @@ class TestORMInspection(_fixtures.FixtureTest): assert not hasattr(prop, "mapper") - is_(prop._parententity, ua) + is_(prop._parententity, inspect(ua)) def test_rel_accessors(self): User = self.classes.User diff --git a/test/orm/test_joins.py b/test/orm/test_joins.py index 5ef436b0f..8fd38a680 100644 --- a/test/orm/test_joins.py +++ b/test/orm/test_joins.py @@ -1922,7 +1922,7 @@ class SelfReferentialTest(fixtures.MappedTest, AssertsCompiledSQL): ) - def test_multiple_explicit_entities(self): + def test_multiple_explicit_entities_one(self): Node = self.classes.Node sess = create_session() @@ -1938,6 +1938,13 @@ class SelfReferentialTest(fixtures.MappedTest, AssertsCompiledSQL): (Node(data='n122'), Node(data='n12'), Node(data='n1')) ) + def test_multiple_explicit_entities_two(self): + Node = self.classes.Node + + sess = create_session() + + parent = aliased(Node) + grandparent = aliased(Node) eq_( sess.query(Node, parent, grandparent).\ join(parent, Node.parent).\ @@ -1947,6 +1954,13 @@ class SelfReferentialTest(fixtures.MappedTest, AssertsCompiledSQL): (Node(data='n122'), Node(data='n12'), Node(data='n1')) ) + def test_multiple_explicit_entities_three(self): + Node = self.classes.Node + + sess = create_session() + + parent = aliased(Node) + grandparent = aliased(Node) # same, change order around eq_( sess.query(parent, grandparent, Node).\ @@ -1957,6 +1971,13 @@ class SelfReferentialTest(fixtures.MappedTest, AssertsCompiledSQL): (Node(data='n12'), 
Node(data='n1'), Node(data='n122')) ) + def test_multiple_explicit_entities_four(self): + Node = self.classes.Node + + sess = create_session() + + parent = aliased(Node) + grandparent = aliased(Node) eq_( sess.query(Node, parent, grandparent).\ join(parent, Node.parent).\ @@ -1967,6 +1988,13 @@ class SelfReferentialTest(fixtures.MappedTest, AssertsCompiledSQL): (Node(data='n122'), Node(data='n12'), Node(data='n1')) ) + def test_multiple_explicit_entities_five(self): + Node = self.classes.Node + + sess = create_session() + + parent = aliased(Node) + grandparent = aliased(Node) eq_( sess.query(Node, parent, grandparent).\ join(parent, Node.parent).\ diff --git a/test/orm/test_mapper.py b/test/orm/test_mapper.py index 1d7e0228c..8c5b9cd84 100644 --- a/test/orm/test_mapper.py +++ b/test/orm/test_mapper.py @@ -3278,4 +3278,3 @@ class MagicNamesTest(fixtures.MappedTest): reserved: maps.c.state}) - diff --git a/test/orm/test_query.py b/test/orm/test_query.py index 9bf728de2..9aad19579 100644 --- a/test/orm/test_query.py +++ b/test/orm/test_query.py @@ -2,6 +2,7 @@ import operator from sqlalchemy import MetaData, null, exists, text, union, literal, \ literal_column, func, between, Unicode, desc, and_, bindparam, \ select, distinct, or_, collate +from sqlalchemy import inspect from sqlalchemy import exc as sa_exc, util from sqlalchemy.sql import compiler, table, column from sqlalchemy.sql import expression @@ -2309,6 +2310,9 @@ class OptionsTest(QueryTest): if i % 2 == 0: if isinstance(item, type): item = class_mapper(item) + else: + if isinstance(item, basestring): + item = inspect(r[-1]).mapper.attrs[item] r.append(item) return tuple(r) @@ -2351,7 +2355,8 @@ class OptionsTest(QueryTest): q = sess.query(User) opt = self._option_fixture('email_address', 'id') q = sess.query(Address)._with_current_path( - orm_util.PathRegistry.coerce([class_mapper(User), 'addresses']) + orm_util.PathRegistry.coerce([inspect(User), + inspect(User).attrs.addresses]) ) 
self._assert_path_result(opt, q, []) @@ -2462,13 +2467,13 @@ class OptionsTest(QueryTest): class SubAddr(Address): pass mapper(SubAddr, inherits=Address, properties={ - 'flub':relationship(Dingaling) + 'flub': relationship(Dingaling) }) q = sess.query(Address) opt = self._option_fixture(SubAddr.flub) - self._assert_path_result(opt, q, [(Address, 'flub')]) + self._assert_path_result(opt, q, [(SubAddr, 'flub')]) def test_from_subclass_to_subclass_attr(self): Dingaling, Address = self.classes.Dingaling, self.classes.Address @@ -2477,7 +2482,7 @@ class OptionsTest(QueryTest): class SubAddr(Address): pass mapper(SubAddr, inherits=Address, properties={ - 'flub':relationship(Dingaling) + 'flub': relationship(Dingaling) }) q = sess.query(SubAddr) @@ -2492,13 +2497,14 @@ class OptionsTest(QueryTest): class SubAddr(Address): pass mapper(SubAddr, inherits=Address, properties={ - 'flub':relationship(Dingaling) + 'flub': relationship(Dingaling) }) q = sess.query(Address) opt = self._option_fixture(SubAddr.user) - self._assert_path_result(opt, q, [(Address, 'user')]) + self._assert_path_result(opt, q, + [(Address, inspect(Address).attrs.user)]) def test_of_type(self): User, Address = self.classes.User, self.classes.Address @@ -2511,9 +2517,11 @@ class OptionsTest(QueryTest): q = sess.query(User) opt = self._option_fixture(User.addresses.of_type(SubAddr), SubAddr.user) + u_mapper = inspect(User) + a_mapper = inspect(Address) self._assert_path_result(opt, q, [ - (User, 'addresses'), - (User, 'addresses', SubAddr, 'user') + (u_mapper, u_mapper.attrs.addresses), + (u_mapper, u_mapper.attrs.addresses, a_mapper, a_mapper.attrs.user) ]) def test_of_type_plus_level(self): @@ -2525,15 +2533,17 @@ class OptionsTest(QueryTest): class SubAddr(Address): pass mapper(SubAddr, inherits=Address, properties={ - 'flub':relationship(Dingaling) + 'flub': relationship(Dingaling) }) q = sess.query(User) opt = self._option_fixture(User.addresses.of_type(SubAddr), SubAddr.flub) + u_mapper = 
inspect(User) + sa_mapper = inspect(SubAddr) self._assert_path_result(opt, q, [ - (User, 'addresses'), - (User, 'addresses', SubAddr, 'flub') + (u_mapper, u_mapper.attrs.addresses), + (u_mapper, u_mapper.attrs.addresses, sa_mapper, sa_mapper.attrs.flub) ]) def test_aliased_single(self): @@ -2543,7 +2553,7 @@ class OptionsTest(QueryTest): ualias = aliased(User) q = sess.query(ualias) opt = self._option_fixture(ualias.addresses) - self._assert_path_result(opt, q, [(ualias, 'addresses')]) + self._assert_path_result(opt, q, [(inspect(ualias), 'addresses')]) def test_with_current_aliased_single(self): User, Address = self.classes.User, self.classes.Address @@ -2554,7 +2564,7 @@ class OptionsTest(QueryTest): self._make_path_registry([Address, 'user']) ) opt = self._option_fixture(Address.user, ualias.addresses) - self._assert_path_result(opt, q, [(ualias, 'addresses')]) + self._assert_path_result(opt, q, [(inspect(ualias), 'addresses')]) def test_with_current_aliased_single_nonmatching_option(self): User, Address = self.classes.User, self.classes.Address @@ -2828,8 +2838,8 @@ class OptionsNoPropTest(_fixtures.FixtureTest): cls.tables.addresses, cls.classes.Address, cls.tables.orders, cls.classes.Order) mapper(User, users, properties={ - 'addresses':relationship(Address), - 'orders':relationship(Order) + 'addresses': relationship(Address), + 'orders': relationship(Order) }) mapper(Address, addresses) mapper(Order, orders) @@ -2839,7 +2849,7 @@ class OptionsNoPropTest(_fixtures.FixtureTest): cls.classes.Keyword, cls.classes.Item) mapper(Keyword, keywords, properties={ - "keywords":column_property(keywords.c.name + "some keyword") + "keywords": column_property(keywords.c.name + "some keyword") }) mapper(Item, items, properties=dict(keywords=relationship(Keyword, @@ -2850,7 +2860,7 @@ class OptionsNoPropTest(_fixtures.FixtureTest): q = create_session().query(*entity_list).\ options(joinedload(option)) - key = ('loaderstrategy', (class_mapper(Item), 'keywords')) + key = 
('loaderstrategy', (inspect(Item), inspect(Item).attrs.keywords)) assert key in q._attributes def _assert_eager_with_entity_exception(self, entity_list, options, @@ -2865,3 +2875,4 @@ class OptionsNoPropTest(_fixtures.FixtureTest): assert_raises_message(sa.exc.ArgumentError, message, create_session().query(column).options, joinedload(eager_option)) + diff --git a/test/orm/test_subquery_relations.py b/test/orm/test_subquery_relations.py index 000f4abaf..a4cc830ee 100644 --- a/test/orm/test_subquery_relations.py +++ b/test/orm/test_subquery_relations.py @@ -1226,24 +1226,24 @@ class InheritanceToRelatedTest(fixtures.MappedTest): @classmethod def fixtures(cls): return dict( - foo = [ + foo=[ ('id', 'type', 'related_id'), (1, 'bar', 1), (2, 'bar', 2), (3, 'baz', 1), (4, 'baz', 2), ], - bar = [ + bar=[ ('id', ), (1,), (2,) ], - baz = [ + baz=[ ('id', ), (3,), (4,) ], - related = [ + related=[ ('id', ), (1,), (2,) @@ -1252,7 +1252,7 @@ class InheritanceToRelatedTest(fixtures.MappedTest): @classmethod def setup_mappers(cls): mapper(cls.classes.Foo, cls.tables.foo, properties={ - 'related':relationship(cls.classes.Related) + 'related': relationship(cls.classes.Related) }, polymorphic_on=cls.tables.foo.c.type) mapper(cls.classes.Bar, cls.tables.bar, polymorphic_identity='bar', inherits=cls.classes.Foo) @@ -1260,22 +1260,43 @@ class InheritanceToRelatedTest(fixtures.MappedTest): inherits=cls.classes.Foo) mapper(cls.classes.Related, cls.tables.related) - def test_caches_query_per_base(self): + def test_caches_query_per_base_subq(self): Foo, Bar, Baz, Related = self.classes.Foo, self.classes.Bar, \ self.classes.Baz, self.classes.Related s = Session(testing.db) def go(): eq_( - s.query(Foo).with_polymorphic([Bar, Baz]).order_by(Foo.id).options(subqueryload(Foo.related)).all(), + s.query(Foo).with_polymorphic([Bar, Baz]).\ + order_by(Foo.id).\ + options(subqueryload(Foo.related)).all(), [ - Bar(id=1,related=Related(id=1)), - Bar(id=2,related=Related(id=2)), - 
Baz(id=3,related=Related(id=1)), - Baz(id=4,related=Related(id=2)) + Bar(id=1, related=Related(id=1)), + Bar(id=2, related=Related(id=2)), + Baz(id=3, related=Related(id=1)), + Baz(id=4, related=Related(id=2)) ] ) self.assert_sql_count(testing.db, go, 2) + def test_caches_query_per_base_joined(self): + # technically this should be in test_eager_relations + Foo, Bar, Baz, Related = self.classes.Foo, self.classes.Bar, \ + self.classes.Baz, self.classes.Related + s = Session(testing.db) + def go(): + eq_( + s.query(Foo).with_polymorphic([Bar, Baz]).\ + order_by(Foo.id).\ + options(joinedload(Foo.related)).all(), + [ + Bar(id=1, related=Related(id=1)), + Bar(id=2, related=Related(id=2)), + Baz(id=3, related=Related(id=1)), + Baz(id=4, related=Related(id=2)) + ] + ) + self.assert_sql_count(testing.db, go, 1) + class CyclicalInheritingEagerTestOne(fixtures.MappedTest): @classmethod @@ -1344,14 +1365,13 @@ class CyclicalInheritingEagerTestTwo(fixtures.DeclarativeMappedTest, def test_from_subclass(self): Director = self.classes.Director - PersistentObject = self.classes.PersistentObject - s = create_session() ctx = s.query(Director).options(subqueryload('*'))._compile_context() - q = ctx.attributes[('subquery', (inspect(PersistentObject), 'movies'))] + q = ctx.attributes[('subquery', + (inspect(Director), inspect(Director).attrs.movies))] self.assert_compile(q, "SELECT anon_1.movie_id AS anon_1_movie_id, " "anon_1.persistent_id AS anon_1_persistent_id, " @@ -1384,10 +1404,3 @@ class CyclicalInheritingEagerTestTwo(fixtures.DeclarativeMappedTest, d = session.query(Director).options(subqueryload('*')).first() assert len(list(session)) == 3 -from . 
import test_eager_relations - -class WarnFor2614Test(test_eager_relations.WarnFor2614TestBase, fixtures.MappedTest): - eager_name = "subquery" - - def eager_option(self, arg): - return subqueryload(arg) diff --git a/test/orm/test_utils.py b/test/orm/test_utils.py index afe8fb5c1..b2853a8b8 100644 --- a/test/orm/test_utils.py +++ b/test/orm/test_utils.py @@ -4,7 +4,7 @@ from sqlalchemy import Column from sqlalchemy import Integer from sqlalchemy import MetaData from sqlalchemy import Table -from sqlalchemy.orm import aliased +from sqlalchemy.orm import aliased, with_polymorphic from sqlalchemy.orm import mapper, create_session from sqlalchemy.testing import fixtures from test.orm import _fixtures @@ -228,6 +228,7 @@ class IdentityKeyTest(_fixtures.FixtureTest): key = util.identity_key(User, row=row) eq_(key, (User, (1,))) + class PathRegistryTest(_fixtures.FixtureTest): run_setup_mappers = 'once' run_inserts = None @@ -242,7 +243,7 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) is_( util.RootRegistry()[umapper], - umapper._sa_path_registry + umapper._path_registry ) eq_( util.RootRegistry()[umapper], @@ -255,9 +256,10 @@ class PathRegistryTest(_fixtures.FixtureTest): path = PathRegistry.coerce((umapper,)) eq_( - path['addresses'][amapper]['email_address'], - PathRegistry.coerce((umapper, 'addresses', - amapper, 'email_address')) + path[umapper.attrs.addresses][amapper] + [amapper.attrs.email_address], + PathRegistry.coerce((umapper, umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) ) def test_entity_boolean(self): @@ -267,47 +269,48 @@ class PathRegistryTest(_fixtures.FixtureTest): def test_key_boolean(self): umapper = inspect(self.classes.User) - path = PathRegistry.coerce((umapper, 'addresses')) + path = PathRegistry.coerce((umapper, umapper.attrs.addresses)) is_(bool(path), True) def test_aliased_class(self): User = self.classes.User ua = aliased(User) - path = PathRegistry.coerce((ua, 'addresses')) + 
ua_insp = inspect(ua) + path = PathRegistry.coerce((ua_insp, ua_insp.mapper.attrs.addresses)) assert path.parent.is_aliased_class def test_indexed_entity(self): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - path = PathRegistry.coerce((umapper, 'addresses', - amapper, 'email_address')) + path = PathRegistry.coerce((umapper, umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) is_(path[0], umapper) is_(path[2], amapper) def test_indexed_key(self): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - path = PathRegistry.coerce((umapper, 'addresses', - amapper, 'email_address')) - eq_(path[1], 'addresses') - eq_(path[3], 'email_address') + path = PathRegistry.coerce((umapper, umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) + eq_(path[1], umapper.attrs.addresses) + eq_(path[3], amapper.attrs.email_address) def test_slice(self): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - path = PathRegistry.coerce((umapper, 'addresses', - amapper, 'email_address')) - eq_(path[1:3], ('addresses', amapper)) + path = PathRegistry.coerce((umapper, umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) + eq_(path[1:3], (umapper.attrs.addresses, amapper)) def test_addition(self): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((amapper, 'email_address')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((amapper, amapper.attrs.email_address)) eq_( p1 + p2, - PathRegistry.coerce((umapper, 'addresses', - amapper, 'email_address')) + PathRegistry.coerce((umapper, umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) ) def test_length(self): @@ -315,10 +318,10 @@ class PathRegistryTest(_fixtures.FixtureTest): amapper = inspect(self.classes.Address) pneg1 = PathRegistry.coerce(()) p0 = 
PathRegistry.coerce((umapper,)) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((umapper, 'addresses', - amapper, 'email_address')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((umapper, umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) eq_(len(pneg1), 0) eq_(len(p0), 1) @@ -334,14 +337,17 @@ class PathRegistryTest(_fixtures.FixtureTest): def test_eq(self): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses')) - p3 = PathRegistry.coerce((umapper, 'other')) - p4 = PathRegistry.coerce((amapper, 'addresses')) - p5 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p6 = PathRegistry.coerce((amapper, 'user', umapper, 'addresses')) - p7 = PathRegistry.coerce((amapper, 'user', umapper, 'addresses', - amapper, 'email_address')) + u_alias = inspect(aliased(self.classes.User)) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p3 = PathRegistry.coerce((umapper, umapper.attrs.name)) + p4 = PathRegistry.coerce((u_alias, umapper.attrs.addresses)) + p5 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p6 = PathRegistry.coerce((amapper, amapper.attrs.user, umapper, + umapper.attrs.addresses)) + p7 = PathRegistry.coerce((amapper, amapper.attrs.user, umapper, + umapper.attrs.addresses, + amapper, amapper.attrs.email_address)) is_(p1 == p2, True) is_(p1 == p3, False) @@ -358,7 +364,7 @@ class PathRegistryTest(_fixtures.FixtureTest): def test_contains_mapper(self): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) + p1 = PathRegistry.coerce((umapper, 
umapper.attrs.addresses)) assert p1.contains_mapper(umapper) assert not p1.contains_mapper(amapper) @@ -373,18 +379,18 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((amapper, 'email_address')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((amapper, amapper.attrs.email_address)) eq_( - p1.path, (umapper, 'addresses') + p1.path, (umapper, umapper.attrs.addresses) ) eq_( - p2.path, (umapper, 'addresses', amapper) + p2.path, (umapper, umapper.attrs.addresses, amapper) ) eq_( - p3.path, (amapper, 'email_address') + p3.path, (amapper, amapper.attrs.email_address) ) def test_registry_set(self): @@ -392,9 +398,9 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((amapper, 'email_address')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((amapper, amapper.attrs.email_address)) p1.set(reg, "p1key", "p1value") p2.set(reg, "p2key", "p2value") @@ -413,9 +419,9 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((amapper, 'email_address')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((amapper, 
amapper.attrs.email_address)) reg.update( { ('p1key', p1.path): 'p1value', @@ -435,9 +441,9 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((amapper, 'email_address')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((amapper, amapper.attrs.email_address)) reg.update( { ('p1key', p1.path): 'p1value', @@ -455,8 +461,8 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) reg.update( { ('p1key', p1.path): 'p1value', @@ -481,10 +487,10 @@ class PathRegistryTest(_fixtures.FixtureTest): umapper = inspect(self.classes.User) amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses', amapper, - 'email_address')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((umapper, 'addresses')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper, + amapper.attrs.email_address)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) eq_( p1.serialize(), [(User, "addresses"), (Address, "email_address")] @@ -505,10 +511,10 @@ class PathRegistryTest(_fixtures.FixtureTest): amapper = inspect(self.classes.Address) - p1 = PathRegistry.coerce((umapper, 'addresses', amapper, - 'email_address')) - p2 = PathRegistry.coerce((umapper, 'addresses', amapper)) - p3 = PathRegistry.coerce((umapper, 
'addresses')) + p1 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper, + amapper.attrs.email_address)) + p2 = PathRegistry.coerce((umapper, umapper.attrs.addresses, amapper)) + p3 = PathRegistry.coerce((umapper, umapper.attrs.addresses)) eq_( PathRegistry.deserialize([(User, "addresses"), @@ -523,3 +529,135 @@ class PathRegistryTest(_fixtures.FixtureTest): PathRegistry.deserialize([(User, "addresses")]), p3 ) + +from .inheritance import _poly_fixtures +class PathRegistryInhTest(_poly_fixtures._Polymorphic): + run_setup_mappers = 'once' + run_inserts = None + run_deletes = None + + def test_plain(self): + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + pmapper = inspect(Person) + emapper = inspect(Engineer) + + p1 = PathRegistry.coerce((pmapper, emapper.attrs.machines)) + + # given a mapper and an attribute on a subclass, + # the path converts what you get to be against that subclass + eq_( + p1.path, + (emapper, emapper.attrs.machines) + ) + + def test_plain_compound(self): + Company = _poly_fixtures.Company + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + cmapper = inspect(Company) + pmapper = inspect(Person) + emapper = inspect(Engineer) + + p1 = PathRegistry.coerce((cmapper, cmapper.attrs.employees, + pmapper, emapper.attrs.machines)) + + # given a mapper and an attribute on a subclass, + # the path converts what you get to be against that subclass + eq_( + p1.path, + (cmapper, cmapper.attrs.employees, emapper, emapper.attrs.machines) + ) + + def test_plain_aliased(self): + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + emapper = inspect(Engineer) + + p_alias = aliased(Person) + p_alias = inspect(p_alias) + + p1 = PathRegistry.coerce((p_alias, emapper.attrs.machines)) + # plain AliasedClass - the path keeps that AliasedClass directly + # as is in the path + eq_( + p1.path, + (p_alias, emapper.attrs.machines) + ) + + def test_plain_aliased_compound(self): + Company = 
_poly_fixtures.Company + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + cmapper = inspect(Company) + emapper = inspect(Engineer) + + c_alias = aliased(Company) + p_alias = aliased(Person) + + c_alias = inspect(c_alias) + p_alias = inspect(p_alias) + + p1 = PathRegistry.coerce((c_alias, cmapper.attrs.employees, + p_alias, emapper.attrs.machines)) + # plain AliasedClass - the path keeps that AliasedClass directly + # as is in the path + eq_( + p1.path, + (c_alias, cmapper.attrs.employees, p_alias, emapper.attrs.machines) + ) + + def test_with_poly_sub(self): + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + emapper = inspect(Engineer) + + p_poly = with_polymorphic(Person, [Engineer]) + e_poly = inspect(p_poly.Engineer) + p_poly = inspect(p_poly) + + p1 = PathRegistry.coerce((p_poly, emapper.attrs.machines)) + + # polymorphic AliasedClass - the path uses _entity_for_mapper() + # to get the most specific sub-entity + eq_( + p1.path, + (e_poly, emapper.attrs.machines) + ) + + def test_with_poly_base(self): + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + pmapper = inspect(Person) + emapper = inspect(Engineer) + + p_poly = with_polymorphic(Person, [Engineer]) + p_poly = inspect(p_poly) + + # "name" is actually on Person, not Engineer + p1 = PathRegistry.coerce((p_poly, emapper.attrs.name)) + + # polymorphic AliasedClass - because "name" is on Person, + # we get Person, not Engineer + eq_( + p1.path, + (p_poly, pmapper.attrs.name) + ) + + def test_with_poly_use_mapper(self): + Person = _poly_fixtures.Person + Engineer = _poly_fixtures.Engineer + emapper = inspect(Engineer) + + p_poly = with_polymorphic(Person, [Engineer], _use_mapper_path=True) + p_poly = inspect(p_poly) + + p1 = PathRegistry.coerce((p_poly, emapper.attrs.machines)) + + # polymorphic AliasedClass with the "use_mapper_path" flag - + # the AliasedClass acts just like the base mapper + eq_( + p1.path, + (emapper, emapper.attrs.machines) + ) 
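The `PathRegistry` tests above exercise the idea that a loader path is an interleaved tuple of (entity, attribute) elements which supports concatenation, equality, and use as part of a dictionary cache key. The following is a simplified, hypothetical sketch of that pattern — `Entity`, `Attribute`, and `PathKey` are illustrative stand-ins, not SQLAlchemy's actual classes:

```python
# Illustrative sketch of the path-as-cache-key pattern the PathRegistry
# tests verify; names here are hypothetical, not SQLAlchemy API.

class Entity(object):
    """Stands in for an inspected mapper or aliased entity."""
    def __init__(self, name):
        self.name = name

class Attribute(object):
    """Stands in for a mapper attribute such as umapper.attrs.addresses."""
    def __init__(self, parent, key):
        self.parent = parent
        self.key = key

class PathKey(object):
    """A path is an interleaved (entity, attribute, entity, ...) tuple."""
    def __init__(self, path):
        self.path = tuple(path)

    def __add__(self, other):
        # concatenation produces a longer path, as in test_addition
        return PathKey(self.path + other.path)

    def __eq__(self, other):
        # two paths built from the same elements compare equal
        return isinstance(other, PathKey) and self.path == other.path

    def __hash__(self):
        return hash(self.path)

    def __len__(self):
        return len(self.path)

    def set(self, registry, key, value):
        # the tuple itself participates in the registry key,
        # mirroring test_registry_set
        registry[(key, self.path)] = value

    def get(self, registry, key):
        return registry[(key, self.path)]

user = Entity("User")
addresses = Attribute(user, "addresses")
address = Entity("Address")
email = Attribute(address, "email_address")

p1 = PathKey((user, addresses))
p2 = PathKey((address, email))
p3 = p1 + p2

reg = {}
p3.set(reg, "loader", "subquery")
assert p3.get(reg, "loader") == "subquery"
assert p3 == PathKey((user, addresses, address, email))
assert len(p3) == 4
```

The polymorphic tests (`test_plain`, `test_with_poly_sub`, etc.) then layer one extra rule on top of this: when an attribute belongs to a subclass mapper, coercion rewrites the entity element to the most specific sub-entity before the tuple is formed.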
+
diff --git a/test/sql/test_operators.py b/test/sql/test_operators.py
index 9da9d94c3..45f4978ed 100644
--- a/test/sql/test_operators.py
+++ b/test/sql/test_operators.py
@@ -833,6 +833,58 @@ class ComparisonOperatorTest(fixtures.TestBase, testing.AssertsCompiledSQL):
     def test_comparison_operators_ge(self):
         self._test_comparison_op(operator.ge, '>=', '<=')
 
+class NonZeroTest(fixtures.TestBase):
+    def _raises(self, expr):
+        assert_raises_message(
+            TypeError,
+            "Boolean value of this clause is not defined",
+            bool, expr
+        )
+
+    def _assert_true(self, expr):
+        is_(bool(expr), True)
+
+    def _assert_false(self, expr):
+        is_(bool(expr), False)
+
+    def test_column_identity_eq(self):
+        c1 = column('c1')
+        self._assert_true(c1 == c1)
+
+    def test_column_identity_gt(self):
+        c1 = column('c1')
+        self._raises(c1 > c1)
+
+    def test_column_compare_eq(self):
+        c1, c2 = column('c1'), column('c2')
+        self._assert_false(c1 == c2)
+
+    def test_column_compare_gt(self):
+        c1, c2 = column('c1'), column('c2')
+        self._raises(c1 > c2)
+
+    def test_binary_identity_eq(self):
+        c1 = column('c1')
+        expr = c1 > 5
+        self._assert_true(expr == expr)
+
+    def test_labeled_binary_identity_eq(self):
+        c1 = column('c1')
+        expr = (c1 > 5).label(None)
+        self._assert_true(expr == expr)
+
+    def test_annotated_binary_identity_eq(self):
+        c1 = column('c1')
+        expr1 = (c1 > 5)
+        expr2 = expr1._annotate({"foo": "bar"})
+        self._assert_true(expr1 == expr2)
+
+    def test_labeled_binary_compare_gt(self):
+        c1 = column('c1')
+        expr1 = (c1 > 5).label(None)
+        expr2 = (c1 > 5).label(None)
+        self._assert_false(expr1 == expr2)
+
 class NegationTest(fixtures.TestBase, testing.AssertsCompiledSQL):
     __dialect__ = 'default'
diff --git a/test/sql/test_selectable.py b/test/sql/test_selectable.py
index 65dc65470..a60916b44 100644
--- a/test/sql/test_selectable.py
+++ b/test/sql/test_selectable.py
@@ -1287,7 +1287,9 @@ class AnnotationsTest(fixtures.TestBase):
             t.c.x,
             a,
             s,
-            s2
+            s2,
+            t.c.x > 1,
+            (t.c.x > 1).label(None)
         ]:
             annot = obj._annotate({})
             eq_(set([obj]), set([annot]))
diff --git a/test/sql/test_types.py b/test/sql/test_types.py
index f9ab785ed..8987743d4 100644
--- a/test/sql/test_types.py
+++ b/test/sql/test_types.py
@@ -1306,6 +1306,23 @@ class CompileTest(fixtures.TestBase, AssertsCompiledSQL):
             dialects.mysql.INTEGER(display_width=5),
             "INTEGER(5)",
             allow_dialect_select=True)
 
+    def test_numeric_plain(self):
+        self.assert_compile(types.NUMERIC(), 'NUMERIC')
+
+    def test_numeric_precision(self):
+        self.assert_compile(types.NUMERIC(2), 'NUMERIC(2)')
+
+    def test_numeric_scale(self):
+        self.assert_compile(types.NUMERIC(2, 4), 'NUMERIC(2, 4)')
+
+    def test_decimal_plain(self):
+        self.assert_compile(types.DECIMAL(), 'DECIMAL')
+
+    def test_decimal_precision(self):
+        self.assert_compile(types.DECIMAL(2), 'DECIMAL(2)')
+
+    def test_decimal_scale(self):
+        self.assert_compile(types.DECIMAL(2, 4), 'DECIMAL(2, 4)')