| Commit message | Author | Age | Files | Lines |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
such that they are matched to the received result set positionally,
rather than by name. Originally, this was seen as a way to handle
cases where we had columns returned with difficult-to-predict names,
though in modern use that issue has been overcome by anonymous
labeling. In this version, the approach basically reduces function
call count per-result by a few dozen calls, or more for larger
sets of result columns. The approach still degrades into a modern
version of the old approach if textual elements modify the result
map, or if any discrepancy in size exists between
the compiled set of columns versus what was received, so there's no
issue for partially or fully textual compilation scenarios where these
lists might not line up. fixes #918
- callcounts still need to be adjusted down for this so zoomark
tests won't pass at the moment
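A rough conceptual sketch of the positional matching described above (a hypothetical
helper, not the actual SQLAlchemy internals)::

    def match_columns(compiled_columns, cursor_description):
        # when the compiled columns line up 1:1 with what the cursor
        # returned, pair them positionally and skip per-column name lookups
        if len(compiled_columns) == len(cursor_description):
            return list(zip(compiled_columns, cursor_description))
        # sizes differ (e.g. a partially textual statement): fall back to
        # matching by name, as in the older approach
        by_name = dict((rec[0], rec) for rec in cursor_description)
        return [(col, by_name.get(col.name)) for col in compiled_columns]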
|
|
|
|
|
|
|
|
|
|
|
| |
and parameters are not displayed if None, reducing confusion for
error messages that weren't related to a statement. The full
module and classname for the DBAPI-level exception is displayed,
making it clear that this is a wrapped DBAPI exception. The
statement and parameters themselves are bounded within bracketed
sections to better isolate them from the error message and from
each other.
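Illustratively (not verbatim output), a wrapped error now reads roughly like::

    sqlalchemy.exc.IntegrityError: (psycopg2.IntegrityError) duplicate key value
    violates unique constraint "users_pkey"
    [SQL: 'INSERT INTO users (id, name) VALUES (%(id)s, %(name)s)']
    [parameters: {'id': 1, 'name': 'ed'}]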
fixes #3172
|
| |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
the python 3 merge, now does not expect percent signs (e.g.
as used as the modulus operator and others) to be doubled,
even when using the "pyformat" bound parameter format (this
change is not documented by Mysqlconnector). The dialect now
checks for py2k and for mysqlconnector less than version 2.0
when detecting if the modulus operator should be rendered as
``%%`` or ``%``.
- Unicode SQL is now passed for mysqlconnector version 2.0 and above;
for Py2k and mysqlconnector < 2.0, strings are encoded. Note that mysqlconnector
as of 2.0.1 appears to have a bug with unicode DDL on py2k, so the tests here
are skipped until we observe it's fixed.
- take out profiling on mysqlconnector, callcounts vary too much with
its current development speed
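A hypothetical sketch of the detection described above (names invented, not the
dialect's actual code)::

    import sys

    def should_double_percents(mysqlconnector_version):
        # percent signs (e.g. the modulus operator) are doubled only on
        # Python 2 with mysqlconnector older than 2.0
        py2k = sys.version_info < (3, 0)
        return py2k and mysqlconnector_version < (2, 0)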
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
constructs are now importable from the "from sqlalchemy" namespace,
just like every other Core construct.
- The implicit conversion of strings to :func:`.text` constructs
when passed to most builder methods of :func:`.select` as
well as :class:`.Query` now emits a warning with just the
plain string sent. The textual conversion still proceeds normally,
however. The only methods that accept a string without a warning
are the "label reference" methods like order_by() and group_by();
these methods will now, at compile time, attempt to resolve a single
string argument to a column or label expression present in the
selectable; if none is located, the expression still renders, but
the warning is emitted again. The rationale here is that the implicit
conversion from string to text is more unexpected than not these days,
and it is better that the user give the Core / ORM explicit direction
when passing a raw string.
Core/ORM tutorials have been updated to go more in depth as to how text
is handled.
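A short sketch of the new behavior, assuming a simple table construct::

    from sqlalchemy import select, text
    from sqlalchemy.sql import table, column

    users = table("users", column("id"), column("name"))

    # passing a plain string to where() now emits a warning; wrap it in
    # text() explicitly instead
    stmt = select([users]).where(text("name = 'ed'"))

    # order_by() / group_by() still accept a plain string without a warning;
    # the string is resolved against a column or label in the selectable
    stmt = select([users.c.id, users.c.name.label("n")]).order_by("n")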
fixes #2992
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
on :class:`.Insert`. This helps to fix a bug where an
INSERT...FROM SELECT construct would inadvertently be compiled
as "implicit returning" on supporting backends, which would
cause breakage in the case of an INSERT that inserts zero rows
(as implicit returning expects a row), as well as arbitrary
return data in the case of an INSERT that inserts multiple
rows (e.g. only the first row of many).
A similar change is also applied to an INSERT..VALUES
with multiple parameter sets; implicit RETURNING will no longer emit
for this statement either. As both of these constructs deal
with variable numbers of rows, the
:attr:`.ResultProxy.inserted_primary_key` accessor does not
apply. Previously, there was a documentation note that one
may prefer ``inline=True`` with INSERT..FROM SELECT as some databases
don't support returning and therefore can't do "implicit" returning,
but there's no reason an INSERT...FROM SELECT needs implicit returning
in any case. Regular explicit :meth:`.Insert.returning` should
be used to return variable numbers of result rows if inserted
data is needed.
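A minimal sketch of requesting RETURNING explicitly with INSERT...FROM SELECT,
assuming a backend that supports RETURNING (e.g. PostgreSQL)::

    from sqlalchemy import select
    from sqlalchemy.sql import table, column

    users = table("users", column("id"), column("name"))
    archive = table("users_archive", column("id"), column("name"))

    # implicit RETURNING no longer applies to INSERT..FROM SELECT;
    # ask for returned columns explicitly if they are needed
    stmt = archive.insert().from_select(
        ["id", "name"], select([users.c.id, users.c.name])
    ).returning(archive.c.id)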
fixes #3169
|
| |
|
| |
|
|
|
|
| |
- apply autopep8 + manual fixes to most of test/sql/
|
| |
|
| |
|
| |
|
|
|
|
|
|
| |
number of tests.
- move out logging tests from test_execute to test_logging
|
|
|
|
|
|
| |
suite
and adding some more requirements
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
is currently being supported in addition to nose, and will likely
be preferred to nose going forward. The nose plugin system used
by SQLAlchemy has been split out so that it works under pytest as
well. There are no plans to drop support for nose at the moment
and we hope that the test suite itself can continue to remain as
agnostic of testing platform as possible. See the file
README.unittests.rst for updated information on running tests
with pytest.
The test plugin system has also been enhanced to support running
tests against multiple database URLs at once, by specifying the ``--db``
and/or ``--dburi`` flags multiple times. This does not run the entire test
suite for each database, but instead allows test cases that are specific
to certain backends to make use of that backend as the tests are run.
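For example, an illustrative invocation might look like
``py.test --db sqlite --db postgresql test/sql/``.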
When using pytest as the test runner, the system will also run
specific test suites multiple times, once for each database, particularly
those tests within the "dialect suite". The plan is that the enhanced
system will also be used by Alembic, and allow Alembic to run
migration operation tests against multiple backends in one run, including
third-party backends not included within Alembic itself.
Third party dialects and extensions are also encouraged to standardize
on SQLAlchemy's test suite as a basis; see the file README.dialects.rst
for background on building out from SQLAlchemy's test platform.
|
|
|
|
|
|
|
|
|
|
| |
using a column key of the form ``<tablename>_<columnname>``
matching that of an aliased column in the text would still not
match at the ORM level, which is ultimately due to a core
column-matching issue. Additional rules have been added so that the
column ``_label`` is taken into account when working with a
:class:`.TextAsFrom` construct or with literal columns.
[ticket:2932]
|
|
|
|
|
|
|
|
|
|
| |
oriented row lookups were not matching up to the ad-hoc :class:`.ColumnClause`
objects that :class:`.TextAsFrom` generates, thereby making it not
usable as a target in :meth:`.Query.from_statement`. Also fixed
:meth:`.Query.from_statement` mechanics to not mistake a :class:`.TextAsFrom`
for a :class:`.Select` construct. This bug is also an 0.9 regression
as the :meth:`.Text.columns` method is called to accommodate the
:paramref:`.text.typemap` argument. [ticket:2932]
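A short sketch of the pattern this enables, assuming a mapped ``User`` class and
an ORM ``session``::

    from sqlalchemy import text

    stmt = text(
        "SELECT users.id AS users_id, users.name AS users_name FROM users"
    ).columns(User.id, User.name)

    # the columns of the textual statement now line up with the mapped
    # columns, so it can be used as the target of from_statement()
    result = session.query(User).from_statement(stmt).all()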
|
| |
|
|
|
|
|
| |
for the control is a TypeError for the row, as is raised on py3k when
less/greater operators are used on incompatible types
|
|
|
|
|
|
|
|
|
| |
would lead to ``TypeError`` when compared to non-tuple types as it attempted
to apply tuple() to the "other" object unconditionally. The
full range of Python comparison operators have now been implemented on
:class:`.RowProxy`, using an approach that guarantees a comparison
system that is equivalent to that of a tuple, and the "other" object
is only coerced if it's an instance of RowProxy. [ticket:2924]
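For example (a sketch assuming a connection and a ``users`` table; values are
illustrative)::

    from sqlalchemy import select

    row = conn.execute(select([users.c.id, users.c.name])).first()

    # RowProxy now implements the full set of comparison operators,
    # behaving the same as the equivalent tuple
    row == (1, "ed")
    row < (2, "zzz")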
|
|
|
|
|
|
|
| |
when a pre-DBAPI :class:`.StatementError` were raised within
:meth:`.Connection.execute`, causing encoding errors for
non-ASCII statements. The stringification now remains within
Python unicode thus avoiding encoding errors. [ticket:2871]
|
|
|
|
|
|
| |
tuple is; this is accomplished via ensuring tuple() conversion on
both sides within the ``__eq__()`` method as well as
the addition of a ``__lt__()`` method. [ticket:2848]
|
|
|
|
|
|
| |
https://bitbucket.org/zzzeek/sqlalchemy_informixdb
- remove informix, maxdb, access symbols from tests etc.
|
|
|
|
|
| |
has_table issues are OK. On OSX forget it.
- still some issues with PY3k + pyodbc + decimal values it doesn't expect, not sure
|
| |
|
|
|
|
| |
not happening too well (I need to stick with linux + freetds 0.91, I know)
|
|
|
|
| |
- went through examples/ and cleaned out excess list() calls
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
labeled columns when apply_labels() is used; this mode
produces a SELECT where each column is labeled as in
<tablename>_<columnname>, to remove column name collisions
for a multiple table select. The fix is that if two labels
collide when combined with the table name, i.e.
"foo.bar_id" and "foo_bar.id", anonymous aliasing will be
applied to one of the dupes. This allows the ORM to handle
both columns independently; previously, 0.7
would in some cases silently emit a second SELECT for the
column that was "duped", and in 0.8 an ambiguous column error
would be emitted. The "keys" applied to the .c. collection
of the select() will also be deduped, so that the "column
being replaced" warning will no longer emit for any select()
that specifies use_labels, though the dupe key will be given
an anonymous label which isn't generally user-friendly.
[ticket:2702]
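A sketch of the colliding case using the names from above::

    from sqlalchemy import select
    from sqlalchemy.sql import table, column

    foo = table("foo", column("id"), column("bar_id"))
    foo_bar = table("foo_bar", column("id"))

    # with use_labels, foo.bar_id and foo_bar.id would both produce the
    # label "foo_bar_id"; one of them now receives an anonymous label
    stmt = select([foo, foo_bar]).apply_labels()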
|
| |
|
|
|
|
|
| |
for cursor.lastrowid would not function correctly
in conjunction with :attr:`.ResultProxy.inserted_primary_key`.
|
|
|
|
| |
- start filling in default versions of remaining requirements that are still only in test/
|
|
|
|
| |
- add suite tests for basic explicit Sequence support, result-row column access (tests that name_normalize is set correctly among many other things)
|
|
|
|
|
| |
an INSERT that's used in executemany() as opposed to one which has a VALUES
clause with multiple entries.
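For illustration, assuming a ``users`` table and a connection, the two forms
look like::

    # executemany(): one INSERT statement run against many parameter sets
    conn.execute(users.insert(), [{"name": "a"}, {"name": "b"}])

    # multiple-VALUES: a single INSERT ... VALUES (...), (...) statement
    conn.execute(users.insert().values([{"name": "a"}, {"name": "b"}]))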
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
- update "not supported" messages for empty inserts, mutlivalue inserts
- rework the ValuesBase approach for multiple value sets so that stmt.parameters
does store a list for multiple values; the _has_multiple_parameters flag now indicates
which of the two modes the statement is within. it now raises exceptions if a subsequent
call to values() attempts to call a ValuesBase with one mode in the style of the other
mode; that is, you can't switch a single- or multi- valued ValuesBase to the other mode,
and also if a multiple value is passed simultaneously with a kwargs set.
Added tests for these error conditions
- Calling values() multiple times in multivalue mode now extends the parameter list to
include the new parameter sets.
- add error/test if multiple *args were passed to ValuesBase.values()
- rework the compiler approach for multivalue inserts, back to where
_get_colparams() returns the same list of (column, value) as before, thereby
maintaining the identical number of append() and other calls when multivalue
is not enabled. In the case of multivalue, it makes a last-minute switch to return
a list of lists instead of the single list. As it constructs the additional lists, the inline
defaults and other calculated default parameters of the first parameter
set are copied into the newly generated lists so that these features continue
to function for a multivalue insert. Multivalue inserts now add no additional
function calls to the compilation for regular insert constructs.
- parameter lists for multivalue inserts now include an integer index for all
parameter sets.
- add detailed documentation for ValuesBase.values(), including careful wording
to describe the difference between multiple values and an executemany() call.
- add a test for multivalue insert + returning - it works !
- remove the very old/never used "postgresql_returning"/"firebird_returning" flags.
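As referenced above, a sketch of the two values() modes (assuming a ``users``
table)::

    # in multivalue mode, further values() calls extend the list of sets...
    stmt = users.insert().values([{"name": "a"}]).values([{"name": "b"}])

    # ...while switching modes now raises, e.g. single-value kwargs on a
    # statement already in multivalue mode:
    # users.insert().values([{"name": "a"}]).values(name="b")  # raises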
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Some databases support this syntax for inserts:
INSERT INTO table (id, name) VALUES
('v1', 'v2'),
('v3', 'v4');
which greatly increases INSERT speed.
It is now possible to pass a list of lists/tuples/dictionaries as
the values param to the Insert construct. We convert it to a flat
dictionary so we can continue using bind params. The above query
will be converted to:
INSERT INTO table (id, name) VALUES
(:id, :name),
(:id0, :name0);
Currently only supported on postgresql, mysql and sqlite.
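In Python terms, the equivalent insert would be expressed roughly as::

    from sqlalchemy.sql import table, column

    mytable = table("mytable", column("id"), column("name"))

    stmt = mytable.insert().values([
        {"id": "v1", "name": "v2"},
        {"id": "v3", "name": "v4"},
    ])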
|
|
|
|
|
|
|
|
| |
in conjunction with "schema" for the owning
Table would fail to locate result rows due
to the MSSQL dialect's "schema rendering"
logic's failure to take .key into account.
Also in 0.7.10. [ticket:2607]
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
"tuple" rows that contain
types which aren't hashable, by setting the flag
"hashable=False" on the corresponding TypeEngine object
in use. Custom types that return unhashable types
(typically lists) can set this flag to False.
[ticket:2592]
- [bug] Applying a column expression to a select
statement using a label with or without other
modifying constructs will no longer "target" that
expression to the underlying Column; this affects
ORM operations that rely upon Column targeting
in order to retrieve results. That is, a query
like query(User.id, User.id.label('foo')) will now
track the value of each "User.id" expression separately
instead of munging them together. It is not expected
that any users will be impacted by this; however,
a usage that uses select() in conjunction with
query.from_statement() and attempts to load fully
composed ORM entities may not function as expected
if the select() named Column objects with arbitrary
.label() names, as these will no longer target to
the Column objects mapped by that entity.
[ticket:2591]
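Returning to the ``hashable=False`` flag described first above, a sketch on a
hypothetical custom type that returns lists::

    import json
    from sqlalchemy.types import TypeDecorator, Text

    class JSONList(TypeDecorator):
        impl = Text

        # result values are Python lists, which aren't hashable; disable
        # hash-oriented handling for rows that contain this type
        hashable = False

        def process_result_value(self, value, dialect):
            return json.loads(value) if value is not None else None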
|
| |
|
|
|
|
|
|
|
| |
become an externally usable package but still remains within the main sqlalchemy parent package.
in this system, we use kind of an ugly hack to get the noseplugin imported outside of the
"sqlalchemy" package, while still making it available within sqlalchemy for usage by
third party libraries.
|
|
|
|
|
|
|
|
|
|
|
|
| |
- add some failure cases
- [bug] Firebird now uses strict "ansi bind rules"
so that bound parameters don't render in the
columns clause of a statement - they render
literally instead.
- [bug] Support for passing datetime as date when
using the DateTime type with Firebird; other
dialects support this.
|
| |
|
|
|
|
|
|
| |
- enhancements to test suite including ability to set up a testing engine
for a whole test class, fixes to how noseplugin sets up/tears
down per-class context
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
result-row targeting. It should be possible
to use a select() statement with string-based
columns in it, that is
select(['id', 'name']).select_from('mytable'),
and have this statement be targetable by
Column objects with those names; this is the
mechanism by which
query(MyClass).from_statement(some_statement)
works. At some point the specific case of
using select(['id']), which is equivalent to
select([literal_column('id')]), stopped working
here, so this has been reinstated and of
course tested. [ticket:2558]
|
|
|
|
|
|
|
|
|
|
| |
True by default, if not passed explicitly,
on bindparam() if the "value" or "callable"
parameters are not passed.
This will cause statement execution to check
for the parameter being present in the final
collection of bound parameters, rather than
implicitly assigning None. [ticket:2556]
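A sketch, assuming a ``users`` table and a connection::

    from sqlalchemy import bindparam

    stmt = users.update().values(name=bindparam("new_name"))

    # "new_name" has no value or callable, so it is now "required";
    # executing without supplying it raises rather than assuming None
    conn.execute(stmt, {"new_name": "ed"})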
|
|
|
|
| |
- more oracle fixes
|
|
|
|
| |
[ticket:2461]
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
"ambiguous column error" would fail to
function properly if the given index were
a Column object and not a string.
Note there are still some column-targeting
issues here which are fixed in 0.8.
[ticket:2553]
- find more cases where column targeting is being inaccurate, add
more information to result_map to better differentiate "ambiguous"
results from "present" or "not present". In particular, result_map
is sensitive to dupes, even though no error is raised; the conflicting
columns are added to the "obj" member of the tuple so that the two
are both directly accessible in the result proxy
- handwringing over the damn "name fallback" thing in results. can't
really make it perfect yet
- fix up oracle returning clause. not sure why it's guarding against
labels, remove that for now and see what the bot says.
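Relating to the ambiguous-column and dupe handling above, a sketch assuming
``users`` and ``addresses`` tables that both have an ``id`` column::

    from sqlalchemy import select

    row = conn.execute(select([users.c.id, addresses.c.id])).first()

    # both result columns are named "id", so row["id"] raises the ambiguous
    # column error; each Column object remains individually addressable
    row[users.c.id]
    row[addresses.c.id]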
|
| |
|
| |
|
|
|
|
| |
- remove deprecated 0.7 engine methods
|