- author: Mike Bayer <mike_mp@zzzcomputing.com> (2021-01-10 13:44:14 -0500)
- committer: Mike Bayer <mike_mp@zzzcomputing.com> (2021-01-13 22:10:13 -0500)
- commit: f1e96cb0874927a475d0c111393b7861796dd758 (patch)
- tree: 810f3c43c0d2c6336805ebcf13d86d5cf1226efa /test/engine
- parent: 7f92fdbd8ec479a61c53c11921ce0688ad4dd94b (diff)
- download: sqlalchemy-f1e96cb0874927a475d0c111393b7861796dd758.tar.gz
reinvent xdist hooks in terms of pytest fixtures
To allow the "connection" pytest fixture and others to work
correctly in conjunction with setup/teardown that expects
to be external to the transaction, remove and prevent any usage
of "xdist" style names that are hardcoded by pytest to run
inside of fixtures, even function-level ones. Instead, use
pytest autouse fixtures to implement our own
r"setup|teardown_test(?:_class)?" methods so that we can ensure
function-scoped fixtures are run within them. A new, more
explicit flow is set up within plugin_base and pytestplugin
such that the order of setup/teardown steps, of which there are
now many, is fully documented and controllable. New granularity
has been added to the test teardown phase to distinguish
between "end of the test" when lock-holding structures on
connections should be released to allow for table drops,
vs. "end of the test plus its teardown steps" when we can
perform final cleanup on connections and run assertions
that everything is closed out.
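The lifecycle-name convention above can be illustrated with a small, self-contained sketch. The class and runner below are hypothetical (not the actual plugin_base code), and the regex is a tightened variant of the pattern named above; the point is that only the reserved `setup_test`/`teardown_test` names are honored, and teardown always runs, which is what the autouse fixture guarantees inside pytest:

```python
import re

# reserved lifecycle names; pytest's hardcoded "xdist"-style names
# (setup/teardown, setUp/tearDown) are deliberately not matched
_LIFECYCLE = re.compile(r"^(setup|teardown)_test(_class)?$")


class SomeTest:
    # hypothetical test class, for illustration only
    ran = []

    def setup_test(self):
        self.ran.append("setup_test")

    def test_thing(self):
        self.ran.append("test_thing")

    def teardown_test(self):
        self.ran.append("teardown_test")


def run_test(instance, test_name):
    # mirrors the shape of an autouse fixture with a yield in the middle:
    # setup_test, then the test itself, then teardown_test in a finally
    hooks = {name for name in dir(instance) if _LIFECYCLE.match(name)}
    if "setup_test" in hooks:
        instance.setup_test()
    try:
        getattr(instance, test_name)()
    finally:
        if "teardown_test" in hooks:
            instance.teardown_test()


run_test(SomeTest(), "test_thing")
print(SomeTest.ran)  # ['setup_test', 'test_thing', 'teardown_test']
```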
From there we can remove most of the defensive "tear down everything"
logic inside of engines, which for many years would frequently dispose
of pools over and over again, creating a broken and expensive
connection flow. A quick test shows that running test/sql/ against
a single PostgreSQL engine with the new approach uses 75% fewer new
connections, creating 42 new connections total, vs. 164 new
connections total with the previous system.
As part of this, the new fixtures metadata/connection/future_connection
have been integrated such that they can be combined together
effectively. The fixture_session() and provide_metadata() fixtures
have been improved, including that fixture_session() now strongly
references sessions, which are explicitly torn down before
table drops occur after a test.
Major changes have been made to the
ConnectionKiller such that it now features different "scopes" for
testing engines and will limit its cleanup to those testing
engines corresponding to end of test, end of test class, or
end of test session. The system by which it tracks DBAPI
connections has been reworked; it is ultimately somewhat similar to
how it worked before, but is organized more clearly along
with the proxy-tracking logic. A "testing_engine" fixture
is also added that works as a pytest fixture rather than a
standalone function. The connection cleanup logic should
now be very robust, as we now can use the same global
connection pools for the whole suite without ever disposing
them, while also running a query for PostgreSQL
locks remaining after every test and asserting that there are no open
transactions leaking between tests at all. Additional steps
are added that also accommodate asyncio connections that are not
explicitly closed, as is the case for legacy sync-style
tests as well as the async tests themselves.
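As a rough illustration of the scoping idea (the names below are invented for the sketch; this is not SQLAlchemy's actual ConnectionKiller API), a tracker can hold strong references to DBAPI connections per scope and force-close only the scope that is ending, leaving session-scoped pools untouched:

```python
class ConnectionTracker:
    """Loose sketch of a ConnectionKiller-like registry: strong
    references are kept per scope so connections can be force-closed
    at end of test, end of test class, or end of test session."""

    def __init__(self):
        self.scopes = {"test": set(), "class": set(), "session": set()}

    def add(self, conn, scope="test"):
        self.scopes[scope].add(conn)

    def close_scope(self, scope):
        # release lock-holding structures only for the ending scope
        for conn in self.scopes[scope]:
            conn.close()
        self.scopes[scope].clear()


class FakeDBAPIConnection:
    """Stand-in for a real DBAPI connection."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


tracker = ConnectionTracker()
c1, c2 = FakeDBAPIConnection(), FakeDBAPIConnection()
tracker.add(c1, "test")
tracker.add(c2, "session")

tracker.close_scope("test")   # end of a single test
print(c1.closed, c2.closed)   # True False
```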
As always, hundreds of tests have been further refined to use the
new fixtures where problems with loose connections were identified,
largely as a result of the new PostgreSQL assertions, and
many more tests have moved from legacy patterns to the newest.
An unfortunate discovery during the creation of this system is that
autouse fixtures (including those set up by
@pytest.mark.usefixtures) are not usable at our current scale with pytest
4.6.11 running under Python 2. It's unclear if this is due
to the older version of pytest or to how it implements itself for
Python 2, or whether the issue is CPU slowness or just large
memory use, but collecting the full span of tests takes over
a minute for a single process when any autouse fixtures are in
place, and on CI the jobs simply time out after ten minutes.
So at the moment this patch also reinvents a small version of
"autouse" fixtures when py2k is running, which skips generating
the real fixture and instead uses two global pytest fixtures
(which don't seem to impact performance) to invoke the
"autouse" fixtures ourselves outside of pytest.
This will limit our ability to do more with fixtures
until we can remove py2k support.
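A minimal sketch of the py2k workaround's shape (purely illustrative; the real implementation hangs off two global pytest fixtures): keep a registry of "autouse" generator functions and drive their setup/teardown phases by hand around each test:

```python
# registry of callables standing in for @pytest.fixture(autouse=True)
_autouse_registry = []


def pseudo_autouse(fn):
    """Register a generator-style 'fixture' outside of pytest."""
    _autouse_registry.append(fn)
    return fn


@pseudo_autouse
def around_test(events):
    events.append("fixture-setup")
    yield
    events.append("fixture-teardown")


def run_with_pseudo_autouse(test_fn, events):
    # drive each registered generator up to its yield (setup), run the
    # test, then resume past the yield (teardown), teardown in reverse
    gens = [fn(events) for fn in _autouse_registry]
    for g in gens:
        next(g)
    try:
        test_fn(events)
    finally:
        for g in reversed(gens):
            next(g, None)


log = []
run_with_pseudo_autouse(lambda ev: ev.append("test"), log)
print(log)  # ['fixture-setup', 'test', 'fixture-teardown']
```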
py.test is still observed to be much slower in collection in the
4.6.11 version compared to modern 6.2 versions, so add support for new
TOX_POSTGRESQL_PY2K and TOX_MYSQL_PY2K environment variables that
will run the suite for fewer backends under Python 2. For Python 3,
pin pytest to modern 6.2 versions, where performance for collection
has been greatly improved.
Includes the following improvements:
Fixed bug in asyncio connection pool where ``asyncio.TimeoutError`` would
be raised rather than :class:`.exc.TimeoutError`. Also repaired the
:paramref:`_sa.create_engine.pool_timeout` parameter when set to zero with
the async engine, which previously would ignore the timeout and block
rather than timing out immediately, as is the behavior of the regular
:class:`.QueuePool`.
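The timeout translation described here can be sketched with stdlib asyncio alone. `PoolTimeoutError` is a stand-in name for :class:`.exc.TimeoutError` (an assumption for the sketch), and the never-completing future simulates an exhausted pool; the point is that the asyncio-level timeout is caught and re-raised as the library-level exception instead of leaking through:

```python
import asyncio


class PoolTimeoutError(Exception):
    """Stand-in for the library-level TimeoutError (assumed name)."""


async def wait_for_connection(waiter, timeout):
    # translate asyncio's timeout into the pool-level exception; a short
    # timeout must fail fast rather than blocking indefinitely
    try:
        return await asyncio.wait_for(waiter, timeout)
    except asyncio.TimeoutError as err:
        raise PoolTimeoutError("connection pool checkout timed out") from err


async def main():
    loop = asyncio.get_running_loop()
    never_ready = loop.create_future()  # simulates an exhausted pool
    try:
        await wait_for_connection(never_ready, timeout=0.01)
    except PoolTimeoutError:
        return "timed out"


print(asyncio.run(main()))  # timed out
```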
For asyncio the connection pool will now also not interact
at all with an asyncio connection whose ConnectionFairy is
being garbage collected; a warning that the connection was
not properly closed is emitted and the connection is discarded.
Within the test suite the ConnectionKiller is now maintaining
strong references to all DBAPI connections and ensuring they
are released when tests end, including those whose ConnectionFairy
proxies are GCed.
Identified cx_Oracle.stmtcachesize as a major factor in Oracle
test scalability issues; this can be reset on a per-test basis
rather than setting it to zero across the board. The addition
of this flag has resolved the long-standing Oracle "two task"
error problem.
For SQL Server, changed the temp table style used by the
"suite" tests to be the double-pound-sign, i.e. global,
variety, which is much easier to test generically. There
are already reflection tests that are more finely tuned
to both styles of temp table within the mssql test
suite. Additionally, added an extra step to the
"dropfirst" mechanism for SQL Server that will remove
all foreign key constraints first as some issues were
observed when using this flag when multiple schemas
had not been torn down.
Identified and fixed two subtle failure modes in the
engine: when commit/rollback fails in a begin()
context manager, the connection is now explicitly closed,
and when "initialize()" fails on the first new connection
of a dialect, the transactional state on that connection
is still rolled back.
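The first failure mode can be sketched with a toy context manager (`FakeConnection` and `begin()` below are illustrative stand-ins, not the engine's real code): if the commit or rollback call itself raises, the connection is explicitly closed rather than being returned to the pool in an unknown state.

```python
from contextlib import contextmanager


class FakeConnection:
    """Stand-in connection whose commit can be made to fail."""

    def __init__(self, fail_on_commit=False):
        self.fail_on_commit = fail_on_commit
        self.closed = False

    def commit(self):
        if self.fail_on_commit:
            raise RuntimeError("commit failed")

    def rollback(self):
        pass

    def close(self):
        self.closed = True


@contextmanager
def begin(conn):
    try:
        yield conn
    except Exception:
        try:
            conn.rollback()
        except Exception:
            conn.close()  # rollback itself failed: close explicitly
        raise
    else:
        try:
            conn.commit()
        except Exception:
            conn.close()  # commit itself failed: close explicitly
            raise


conn = FakeConnection(fail_on_commit=True)
try:
    with begin(conn):
        pass  # the "work" succeeds; the commit will fail
except RuntimeError:
    pass
print(conn.closed)  # True
```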
Fixes: #5826
Fixes: #5827
Change-Id: Ib1d05cb8c7cf84f9a4bfd23df397dc23c9329bfe
Diffstat (limited to 'test/engine')
 test/engine/test_ddlevents.py    |   4
 test/engine/test_deprecations.py |  16
 test/engine/test_execute.py      | 364
 test/engine/test_logging.py      |  16
 test/engine/test_pool.py         |  57
 test/engine/test_processors.py   |  10
 test/engine/test_reconnect.py    |  20
 test/engine/test_reflection.py   |   9
 test/engine/test_transaction.py  |  21
9 files changed, 283 insertions, 234 deletions
diff --git a/test/engine/test_ddlevents.py b/test/engine/test_ddlevents.py
index 396b48aa4..baa766d48 100644
--- a/test/engine/test_ddlevents.py
+++ b/test/engine/test_ddlevents.py
@@ -21,7 +21,7 @@ from sqlalchemy.testing.schema import Table
 
 
 class DDLEventTest(fixtures.TestBase):
-    def setup(self):
+    def setup_test(self):
         self.bind = engines.mock_engine()
         self.metadata = MetaData()
         self.table = Table("t", self.metadata, Column("id", Integer))
@@ -374,7 +374,7 @@ class DDLEventTest(fixtures.TestBase):
 
 
 class DDLExecutionTest(fixtures.TestBase):
-    def setup(self):
+    def setup_test(self):
         self.engine = engines.mock_engine()
         self.metadata = MetaData()
         self.users = Table(
diff --git a/test/engine/test_deprecations.py b/test/engine/test_deprecations.py
index a18cf756b..0a2c9abe5 100644
--- a/test/engine/test_deprecations.py
+++ b/test/engine/test_deprecations.py
@@ -965,7 +965,7 @@ class TransactionTest(fixtures.TablesTest):
 
 class HandleInvalidatedOnConnectTest(fixtures.TestBase):
     __requires__ = ("sqlite",)
 
-    def setUp(self):
+    def setup_test(self):
         e = create_engine("sqlite://")
 
         connection = Mock(get_server_version_info=Mock(return_value="5.0"))
@@ -1021,18 +1021,18 @@ def MockDBAPI():  # noqa
 
 
 class PoolTestBase(fixtures.TestBase):
-    def setup(self):
+    def setup_test(self):
         pool.clear_managers()
         self._teardown_conns = []
 
-    def teardown(self):
+    def teardown_test(self):
         for ref in self._teardown_conns:
             conn = ref()
             if conn:
                 conn.close()
 
     @classmethod
-    def teardown_class(cls):
+    def teardown_test_class(cls):
         pool.clear_managers()
 
     def _queuepool_fixture(self, **kw):
@@ -1597,7 +1597,7 @@ class EngineEventsTest(fixtures.TestBase):
     __requires__ = ("ad_hoc_engines",)
     __backend__ = True
 
-    def tearDown(self):
+    def teardown_test(self):
         Engine.dispatch._clear()
         Engine._has_events = False
@@ -1650,6 +1650,7 @@ class EngineEventsTest(fixtures.TestBase):
         event.listen(
             engine, "before_cursor_execute", cursor_execute, retval=True
         )
+
         with testing.expect_deprecated(
             r"The argument signature for the "
             r"\"ConnectionEvents.before_execute\" event listener",
@@ -1676,11 +1677,12 @@ class EngineEventsTest(fixtures.TestBase):
             r"The argument signature for the "
             r"\"ConnectionEvents.after_execute\" event listener",
         ):
-            e1.execute(select(1))
+            result = e1.execute(select(1))
+            result.close()
 
 
 class DDLExecutionTest(fixtures.TestBase):
-    def setup(self):
+    def setup_test(self):
         self.engine = engines.mock_engine()
         self.metadata = MetaData()
         self.users = Table(
diff --git a/test/engine/test_execute.py b/test/engine/test_execute.py
index 21d4e06e0..a1e4ea218 100644
--- a/test/engine/test_execute.py
+++ b/test/engine/test_execute.py
@@ -43,7 +43,6 @@ from sqlalchemy.testing import is_not
 from sqlalchemy.testing import is_true
 from sqlalchemy.testing import mock
 from sqlalchemy.testing.assertsql import CompiledSQL
-from sqlalchemy.testing.engines import testing_engine
 from sqlalchemy.testing.mock import call
 from sqlalchemy.testing.mock import Mock
 from sqlalchemy.testing.mock import patch
@@ -94,13 +93,13 @@ class ExecuteTest(fixtures.TablesTest):
             ).default_from()
         )
 
-        conn = testing.db.connect()
-        result = (
-            conn.execution_options(no_parameters=True)
-            .exec_driver_sql(stmt)
-            .scalar()
-        )
-        eq_(result, "%")
+        with testing.db.connect() as conn:
+            result = (
+                conn.execution_options(no_parameters=True)
+                .exec_driver_sql(stmt)
+                .scalar()
+            )
+            eq_(result, "%")
 
     def test_raw_positional_invalid(self, connection):
         assert_raises_message(
@@ -261,16 +260,15 @@ class ExecuteTest(fixtures.TablesTest):
             (4, "sally"),
         ]
 
-    @testing.engines.close_open_connections
     def test_exception_wrapping_dbapi(self):
-        conn = testing.db.connect()
-        # engine does not have exec_driver_sql
-        assert_raises_message(
-            tsa.exc.DBAPIError,
-            r"not_a_valid_statement",
-            conn.exec_driver_sql,
-            "not_a_valid_statement",
-        )
+        with testing.db.connect() as conn:
+            # engine does not have exec_driver_sql
+            assert_raises_message(
+                tsa.exc.DBAPIError,
+                r"not_a_valid_statement",
+                conn.exec_driver_sql,
+                "not_a_valid_statement",
+            )
 
     @testing.requires.sqlite
     def test_exception_wrapping_non_dbapi_error(self):
@@ -864,12 +862,10 @@ class CompiledCacheTest(fixtures.TestBase):
         ["sqlite", "mysql", "postgresql"],
         "uses blob value that is problematic for some DBAPIs",
     )
-    @testing.provide_metadata
-    def test_cache_noleak_on_statement_values(self, connection):
+    def test_cache_noleak_on_statement_values(self, metadata, connection):
         # This is a non regression test for an object reference leak caused
         # by the compiled_cache.
 
-        metadata = self.metadata
         photo = Table(
             "photo",
             metadata,
@@ -1040,7 +1036,19 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
     __requires__ = ("schemas",)
     __backend__ = True
 
-    def test_create_table(self):
+    @testing.fixture
+    def plain_tables(self, metadata):
+        t1 = Table(
+            "t1", metadata, Column("x", Integer), schema=config.test_schema
+        )
+        t2 = Table(
+            "t2", metadata, Column("x", Integer), schema=config.test_schema
+        )
+        t3 = Table("t3", metadata, Column("x", Integer), schema=None)
+
+        return t1, t2, t3
+
+    def test_create_table(self, plain_tables, connection):
         map_ = {
             None: config.test_schema,
             "foo": config.test_schema,
@@ -1052,18 +1060,16 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
         t2 = Table("t2", metadata, Column("x", Integer), schema="foo")
         t3 = Table("t3", metadata, Column("x", Integer), schema="bar")
 
-        with self.sql_execution_asserter(config.db) as asserter:
-            with config.db.begin() as conn, conn.execution_options(
-                schema_translate_map=map_
-            ) as conn:
+        with self.sql_execution_asserter(connection) as asserter:
+            conn = connection.execution_options(schema_translate_map=map_)
 
-                t1.create(conn)
-                t2.create(conn)
-                t3.create(conn)
+            t1.create(conn)
+            t2.create(conn)
+            t3.create(conn)
 
-                t3.drop(conn)
-                t2.drop(conn)
-                t1.drop(conn)
+            t3.drop(conn)
+            t2.drop(conn)
+            t1.drop(conn)
 
         asserter.assert_(
             CompiledSQL("CREATE TABLE [SCHEMA__none].t1 (x INTEGER)"),
@@ -1074,14 +1080,7 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
             CompiledSQL("DROP TABLE [SCHEMA__none].t1"),
         )
 
-    def _fixture(self):
-        metadata = self.metadata
-        Table("t1", metadata, Column("x", Integer), schema=config.test_schema)
-        Table("t2", metadata, Column("x", Integer), schema=config.test_schema)
-        Table("t3", metadata, Column("x", Integer), schema=None)
-        metadata.create_all(testing.db)
-
-    def test_ddl_hastable(self):
+    def test_ddl_hastable(self, plain_tables, connection):
 
         map_ = {
             None: config.test_schema,
@@ -1094,27 +1093,28 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
         Table("t2", metadata, Column("x", Integer), schema="foo")
         Table("t3", metadata, Column("x", Integer), schema="bar")
 
-        with config.db.begin() as conn:
-            conn = conn.execution_options(schema_translate_map=map_)
-            metadata.create_all(conn)
+        conn = connection.execution_options(schema_translate_map=map_)
+        metadata.create_all(conn)
 
-        insp = inspect(config.db)
+        insp = inspect(connection)
         is_true(insp.has_table("t1", schema=config.test_schema))
         is_true(insp.has_table("t2", schema=config.test_schema))
         is_true(insp.has_table("t3", schema=None))
 
-        with config.db.begin() as conn:
-            conn = conn.execution_options(schema_translate_map=map_)
-            metadata.drop_all(conn)
+        conn = connection.execution_options(schema_translate_map=map_)
+
+        # if this test fails, the tables won't get dropped. so need a
+        # more robust fixture for this
+        metadata.drop_all(conn)
 
-        insp = inspect(config.db)
+        insp = inspect(connection)
         is_false(insp.has_table("t1", schema=config.test_schema))
         is_false(insp.has_table("t2", schema=config.test_schema))
         is_false(insp.has_table("t3", schema=None))
 
-    @testing.provide_metadata
-    def test_option_on_execute(self):
-        self._fixture()
+    def test_option_on_execute(self, plain_tables, connection):
+        # provided by metadata fixture provided by plain_tables fixture
+        self.metadata.create_all(connection)
 
         map_ = {
             None: config.test_schema,
@@ -1127,61 +1127,54 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
         t2 = Table("t2", metadata, Column("x", Integer), schema="foo")
         t3 = Table("t3", metadata, Column("x", Integer), schema="bar")
 
-        with self.sql_execution_asserter(config.db) as asserter:
-            with config.db.begin() as conn:
+        with self.sql_execution_asserter(connection) as asserter:
+            conn = connection
+            execution_options = {"schema_translate_map": map_}
+            conn._execute_20(
+                t1.insert(), {"x": 1}, execution_options=execution_options
+            )
+            conn._execute_20(
+                t2.insert(), {"x": 1}, execution_options=execution_options
+            )
+            conn._execute_20(
+                t3.insert(), {"x": 1}, execution_options=execution_options
+            )
 
-                execution_options = {"schema_translate_map": map_}
-                conn._execute_20(
-                    t1.insert(), {"x": 1}, execution_options=execution_options
-                )
-                conn._execute_20(
-                    t2.insert(), {"x": 1}, execution_options=execution_options
-                )
-                conn._execute_20(
-                    t3.insert(), {"x": 1}, execution_options=execution_options
-                )
+            conn._execute_20(
+                t1.update().values(x=1).where(t1.c.x == 1),
+                execution_options=execution_options,
+            )
+            conn._execute_20(
+                t2.update().values(x=2).where(t2.c.x == 1),
+                execution_options=execution_options,
+            )
+            conn._execute_20(
+                t3.update().values(x=3).where(t3.c.x == 1),
+                execution_options=execution_options,
+            )
 
+            eq_(
                 conn._execute_20(
-                    t1.update().values(x=1).where(t1.c.x == 1),
-                    execution_options=execution_options,
-                )
+                    select(t1.c.x), execution_options=execution_options
+                ).scalar(),
+                1,
+            )
+            eq_(
                 conn._execute_20(
-                    t2.update().values(x=2).where(t2.c.x == 1),
-                    execution_options=execution_options,
-                )
+                    select(t2.c.x), execution_options=execution_options
+                ).scalar(),
+                2,
+            )
+            eq_(
                 conn._execute_20(
-                    t3.update().values(x=3).where(t3.c.x == 1),
-                    execution_options=execution_options,
-                )
-
-                eq_(
-                    conn._execute_20(
-                        select(t1.c.x), execution_options=execution_options
-                    ).scalar(),
-                    1,
-                )
-                eq_(
-                    conn._execute_20(
-                        select(t2.c.x), execution_options=execution_options
-                    ).scalar(),
-                    2,
-                )
-                eq_(
-                    conn._execute_20(
-                        select(t3.c.x), execution_options=execution_options
-                    ).scalar(),
-                    3,
-                )
+                    select(t3.c.x), execution_options=execution_options
+                ).scalar(),
+                3,
+            )
 
-                conn._execute_20(
-                    t1.delete(), execution_options=execution_options
-                )
-                conn._execute_20(
-                    t2.delete(), execution_options=execution_options
-                )
-                conn._execute_20(
-                    t3.delete(), execution_options=execution_options
-                )
+            conn._execute_20(t1.delete(), execution_options=execution_options)
+            conn._execute_20(t2.delete(), execution_options=execution_options)
+            conn._execute_20(t3.delete(), execution_options=execution_options)
 
         asserter.assert_(
             CompiledSQL("INSERT INTO [SCHEMA__none].t1 (x) VALUES (:x)"),
@@ -1207,9 +1200,9 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
             CompiledSQL("DELETE FROM [SCHEMA_bar].t3"),
         )
 
-    @testing.provide_metadata
-    def test_crud(self):
-        self._fixture()
+    def test_crud(self, plain_tables, connection):
+        # provided by metadata fixture provided by plain_tables fixture
+        self.metadata.create_all(connection)
 
         map_ = {
             None: config.test_schema,
@@ -1222,26 +1215,24 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
         t2 = Table("t2", metadata, Column("x", Integer), schema="foo")
         t3 = Table("t3", metadata, Column("x", Integer), schema="bar")
 
-        with self.sql_execution_asserter(config.db) as asserter:
-            with config.db.begin() as conn, conn.execution_options(
-                schema_translate_map=map_
-            ) as conn:
+        with self.sql_execution_asserter(connection) as asserter:
+            conn = connection.execution_options(schema_translate_map=map_)
 
-                conn.execute(t1.insert(), {"x": 1})
-                conn.execute(t2.insert(), {"x": 1})
-                conn.execute(t3.insert(), {"x": 1})
+            conn.execute(t1.insert(), {"x": 1})
+            conn.execute(t2.insert(), {"x": 1})
+            conn.execute(t3.insert(), {"x": 1})
 
-                conn.execute(t1.update().values(x=1).where(t1.c.x == 1))
-                conn.execute(t2.update().values(x=2).where(t2.c.x == 1))
-                conn.execute(t3.update().values(x=3).where(t3.c.x == 1))
+            conn.execute(t1.update().values(x=1).where(t1.c.x == 1))
+            conn.execute(t2.update().values(x=2).where(t2.c.x == 1))
+            conn.execute(t3.update().values(x=3).where(t3.c.x == 1))
 
-                eq_(conn.scalar(select(t1.c.x)), 1)
-                eq_(conn.scalar(select(t2.c.x)), 2)
-                eq_(conn.scalar(select(t3.c.x)), 3)
+            eq_(conn.scalar(select(t1.c.x)), 1)
+            eq_(conn.scalar(select(t2.c.x)), 2)
+            eq_(conn.scalar(select(t3.c.x)), 3)
 
-                conn.execute(t1.delete())
-                conn.execute(t2.delete())
-                conn.execute(t3.delete())
+            conn.execute(t1.delete())
+            conn.execute(t2.delete())
+            conn.execute(t3.delete())
 
         asserter.assert_(
             CompiledSQL("INSERT INTO [SCHEMA__none].t1 (x) VALUES (:x)"),
@@ -1267,9 +1258,10 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
             CompiledSQL("DELETE FROM [SCHEMA_bar].t3"),
         )
 
-    @testing.provide_metadata
-    def test_via_engine(self):
-        self._fixture()
+    def test_via_engine(self, plain_tables, metadata):
+
+        with config.db.begin() as connection:
+            metadata.create_all(connection)
 
         map_ = {
             None: config.test_schema,
@@ -1282,25 +1274,25 @@ class SchemaTranslateTest(fixtures.TestBase, testing.AssertsExecutionResults):
         with self.sql_execution_asserter(config.db) as asserter:
             eng = config.db.execution_options(schema_translate_map=map_)
-            conn = eng.connect()
-            conn.execute(select(t2.c.x))
+            with eng.connect() as conn:
+                conn.execute(select(t2.c.x))
         asserter.assert_(
             CompiledSQL("SELECT [SCHEMA_foo].t2.x FROM [SCHEMA_foo].t2")
         )
 
 
 class ExecutionOptionsTest(fixtures.TestBase):
-    def test_dialect_conn_options(self):
+    def test_dialect_conn_options(self, testing_engine):
         engine = testing_engine("sqlite://", options=dict(_initialize=False))
         engine.dialect = Mock()
-        conn = engine.connect()
-        c2 = conn.execution_options(foo="bar")
-        eq_(
-            engine.dialect.set_connection_execution_options.mock_calls,
-            [call(c2, {"foo": "bar"})],
-        )
+        with engine.connect() as conn:
+            c2 = conn.execution_options(foo="bar")
+            eq_(
+                engine.dialect.set_connection_execution_options.mock_calls,
+                [call(c2, {"foo": "bar"})],
+            )
 
-    def test_dialect_engine_options(self):
+    def test_dialect_engine_options(self, testing_engine):
         engine = testing_engine("sqlite://")
         engine.dialect = Mock()
         e2 = engine.execution_options(foo="bar")
@@ -1319,14 +1311,14 @@ class ExecutionOptionsTest(fixtures.TestBase):
             [call(engine, {"foo": "bar"})],
         )
 
-    def test_propagate_engine_to_connection(self):
+    def test_propagate_engine_to_connection(self, testing_engine):
         engine = testing_engine(
             "sqlite://", options=dict(execution_options={"foo": "bar"})
         )
-        conn = engine.connect()
-        eq_(conn._execution_options, {"foo": "bar"})
+        with engine.connect() as conn:
+            eq_(conn._execution_options, {"foo": "bar"})
 
-    def test_propagate_option_engine_to_connection(self):
+    def test_propagate_option_engine_to_connection(self, testing_engine):
         e1 = testing_engine(
             "sqlite://", options=dict(execution_options={"foo": "bar"})
         )
@@ -1336,27 +1328,30 @@ class ExecutionOptionsTest(fixtures.TestBase):
         eq_(c1._execution_options, {"foo": "bar"})
         eq_(c2._execution_options, {"foo": "bar", "bat": "hoho"})
 
-    def test_get_engine_execution_options(self):
+        c1.close()
+        c2.close()
+
+    def test_get_engine_execution_options(self, testing_engine):
         engine = testing_engine("sqlite://")
         engine.dialect = Mock()
         e2 = engine.execution_options(foo="bar")
 
         eq_(e2.get_execution_options(), {"foo": "bar"})
 
-    def test_get_connection_execution_options(self):
+    def test_get_connection_execution_options(self, testing_engine):
         engine = testing_engine("sqlite://", options=dict(_initialize=False))
         engine.dialect = Mock()
-        conn = engine.connect()
-        c = conn.execution_options(foo="bar")
+        with engine.connect() as conn:
+            c = conn.execution_options(foo="bar")
 
-        eq_(c.get_execution_options(), {"foo": "bar"})
+            eq_(c.get_execution_options(), {"foo": "bar"})
 
 
 class EngineEventsTest(fixtures.TestBase):
     __requires__ = ("ad_hoc_engines",)
     __backend__ = True
 
-    def tearDown(self):
+    def teardown_test(self):
         Engine.dispatch._clear()
         Engine._has_events = False
@@ -1376,7 +1371,7 @@ class EngineEventsTest(fixtures.TestBase):
             ):
                 break
 
-    def test_per_engine_independence(self):
+    def test_per_engine_independence(self, testing_engine):
         e1 = testing_engine(config.db_url)
         e2 = testing_engine(config.db_url)
@@ -1400,7 +1395,7 @@ class EngineEventsTest(fixtures.TestBase):
             conn.execute(s2)
         eq_([arg[1][1] for arg in canary.mock_calls], [s1, s1, s2])
 
-    def test_per_engine_plus_global(self):
+    def test_per_engine_plus_global(self, testing_engine):
         canary = Mock()
         event.listen(Engine, "before_execute", canary.be1)
         e1 = testing_engine(config.db_url)
@@ -1409,8 +1404,6 @@ class EngineEventsTest(fixtures.TestBase):
         event.listen(e1, "before_execute", canary.be2)
         event.listen(Engine, "before_execute", canary.be3)
 
-        e1.connect()
-        e2.connect()
         with e1.connect() as conn:
             conn.execute(select(1))
@@ -1424,7 +1417,7 @@ class EngineEventsTest(fixtures.TestBase):
         eq_(canary.be2.call_count, 1)
         eq_(canary.be3.call_count, 2)
 
-    def test_per_connection_plus_engine(self):
+    def test_per_connection_plus_engine(self, testing_engine):
         canary = Mock()
         e1 = testing_engine(config.db_url)
@@ -1442,9 +1435,14 @@ class EngineEventsTest(fixtures.TestBase):
         eq_(canary.be1.call_count, 2)
         eq_(canary.be2.call_count, 2)
 
-    @testing.combinations((True, False), (True, True), (False, False))
+    @testing.combinations(
+        (True, False),
+        (True, True),
+        (False, False),
+        argnames="mock_out_on_connect, add_our_own_onconnect",
+    )
     def test_insert_connect_is_definitely_first(
-        self, mock_out_on_connect, add_our_own_onconnect
+        self, mock_out_on_connect, add_our_own_onconnect, testing_engine
     ):
         """test issue #5708.
@@ -1478,7 +1476,7 @@ class EngineEventsTest(fixtures.TestBase):
             patcher = util.nullcontext()
 
         with patcher:
-            e1 = create_engine(config.db_url)
+            e1 = testing_engine(config.db_url)
 
         initialize = e1.dialect.initialize
@@ -1559,10 +1557,11 @@ class EngineEventsTest(fixtures.TestBase):
             conn.exec_driver_sql(select1(testing.db))
         eq_(m1.mock_calls, [])
 
-    def test_add_event_after_connect(self):
+    def test_add_event_after_connect(self, testing_engine):
         # new feature as of #2978
+
         canary = Mock()
-        e1 = create_engine(config.db_url)
+        e1 = testing_engine(config.db_url, future=False)
         assert not e1._has_events
 
         conn = e1.connect()
@@ -1575,9 +1574,9 @@ class EngineEventsTest(fixtures.TestBase):
         conn._branch().execute(select(1))
         eq_(canary.be1.call_count, 2)
 
-    def test_force_conn_events_false(self):
+    def test_force_conn_events_false(self, testing_engine):
         canary = Mock()
-        e1 = create_engine(config.db_url)
+        e1 = testing_engine(config.db_url, future=False)
         assert not e1._has_events
 
         event.listen(e1, "before_execute", canary.be1)
@@ -1593,7 +1592,7 @@ class EngineEventsTest(fixtures.TestBase):
         conn._branch().execute(select(1))
         eq_(canary.be1.call_count, 0)
 
-    def test_cursor_events_ctx_execute_scalar(self):
+    def test_cursor_events_ctx_execute_scalar(self, testing_engine):
         canary = Mock()
         e1 = testing_engine(config.db_url)
@@ -1620,7 +1619,7 @@ class EngineEventsTest(fixtures.TestBase):
             [call(conn, ctx.cursor, stmt, ctx.parameters[0], ctx, False)],
         )
 
-    def test_cursor_events_execute(self):
+    def test_cursor_events_execute(self, testing_engine):
         canary = Mock()
         e1 = testing_engine(config.db_url)
@@ -1653,9 +1652,15 @@ class EngineEventsTest(fixtures.TestBase):
         ),
         ((), {"z": 10}, [], {"z": 10}, testing.requires.legacy_engine),
         (({"z": 10},), {}, [], {"z": 10}),
+        argnames="multiparams, params, expected_multiparams, expected_params",
     )
     def test_modify_parameters_from_event_one(
-        self, multiparams, params, expected_multiparams, expected_params
+        self,
+        multiparams,
+        params,
+        expected_multiparams,
+        expected_params,
+        testing_engine,
     ):
         # this is testing both the normalization added to parameters
         # as of I97cb4d06adfcc6b889f10d01cc7775925cffb116 as well as
@@ -1704,7 +1709,9 @@ class EngineEventsTest(fixtures.TestBase):
             [(15,), (19,)],
         )
 
-    def test_modify_parameters_from_event_three(self, connection):
+    def test_modify_parameters_from_event_three(
+        self, connection, testing_engine
+    ):
         def before_execute(
             conn, clauseelement, multiparams, params, execution_options
         ):
@@ -1721,7 +1728,7 @@ class EngineEventsTest(fixtures.TestBase):
         with e1.connect() as conn:
             conn.execute(select(literal("1")))
 
-    def test_argument_format_execute(self):
+    def test_argument_format_execute(self, testing_engine):
         def before_execute(
             conn, clauseelement, multiparams, params, execution_options
         ):
@@ -1956,9 +1963,9 @@ class EngineEventsTest(fixtures.TestBase):
         )
 
     @testing.requires.ad_hoc_engines
-    def test_dispose_event(self):
+    def test_dispose_event(self, testing_engine):
         canary = Mock()
-        eng = create_engine(testing.db.url)
+        eng = testing_engine(testing.db.url)
         event.listen(eng, "engine_disposed", canary)
 
         conn = eng.connect()
@@ -2102,13 +2109,13 @@ class EngineEventsTest(fixtures.TestBase):
         event.listen(engine, "commit", tracker("commit"))
         event.listen(engine, "rollback", tracker("rollback"))
 
-        conn = engine.connect()
-        trans = conn.begin()
-        conn.execute(select(1))
-        trans.rollback()
-        trans = conn.begin()
-        conn.execute(select(1))
-        trans.commit()
+        with engine.connect() as conn:
+            trans = conn.begin()
+            conn.execute(select(1))
+            trans.rollback()
+            trans = conn.begin()
+            conn.execute(select(1))
+            trans.commit()
 
         eq_(
             canary,
@@ -2145,13 +2152,13 @@ class EngineEventsTest(fixtures.TestBase):
         event.listen(engine, "commit", tracker("commit"), named=True)
         event.listen(engine, "rollback", tracker("rollback"), named=True)
 
-        conn = engine.connect()
-        trans = conn.begin()
-        conn.execute(select(1))
-        trans.rollback()
-        trans = conn.begin()
-        conn.execute(select(1))
-        trans.commit()
+        with engine.connect() as conn:
+            trans = conn.begin()
+            conn.execute(select(1))
+            trans.rollback()
+            trans = conn.begin()
+            conn.execute(select(1))
+            trans.commit()
 
         eq_(
             canary,
@@ -2310,7 +2317,7 @@ class HandleErrorTest(fixtures.TestBase):
     __requires__ = ("ad_hoc_engines",)
     __backend__ = True
 
-    def tearDown(self):
+    def teardown_test(self):
         Engine.dispatch._clear()
         Engine._has_events = False
@@ -2742,7 +2749,7 @@ class HandleErrorTest(fixtures.TestBase):
 
 class HandleInvalidatedOnConnectTest(fixtures.TestBase):
     __requires__ = ("sqlite",)
 
-    def setUp(self):
+    def setup_test(self):
         e = create_engine("sqlite://")
 
         connection = Mock(get_server_version_info=Mock(return_value="5.0"))
@@ -3014,6 +3021,9 @@ class HandleInvalidatedOnConnectTest(fixtures.TestBase):
             ],
         )
 
+        c.close()
+        c2.close()
+
 
 class DialectEventTest(fixtures.TestBase):
     @contextmanager
@@ -3370,7 +3380,7 @@ class SetInputSizesTest(fixtures.TablesTest):
         )
 
     @testing.fixture
-    def input_sizes_fixture(self):
+    def input_sizes_fixture(self, testing_engine):
         canary = mock.Mock()
 
         def do_set_input_sizes(cursor, list_of_tuples, context):
diff --git a/test/engine/test_logging.py b/test/engine/test_logging.py
index 29b8132aa..c56589248 100644
--- a/test/engine/test_logging.py
+++ b/test/engine/test_logging.py
@@ -30,7 +30,7 @@ class LogParamsTest(fixtures.TestBase):
     __only_on__ = "sqlite"
     __requires__ = ("ad_hoc_engines",)
 
-    def setup(self):
+    def setup_test(self):
         self.eng = engines.testing_engine(options={"echo": True})
         self.no_param_engine = engines.testing_engine(
             options={"echo": True, "hide_parameters": True}
@@ -44,7 +44,7 @@ class LogParamsTest(fixtures.TestBase):
         for log in [logging.getLogger("sqlalchemy.engine")]:
             log.addHandler(self.buf)
 
-    def teardown(self):
+    def teardown_test(self):
         exec_sql(self.eng, "drop table if exists foo")
         for log in [logging.getLogger("sqlalchemy.engine")]:
             log.removeHandler(self.buf)
@@ -413,14 +413,14 @@ class LogParamsTest(fixtures.TestBase):
 
 
 class PoolLoggingTest(fixtures.TestBase):
-    def setup(self):
+    def setup_test(self):
         self.existing_level = logging.getLogger("sqlalchemy.pool").level
 
         self.buf = logging.handlers.BufferingHandler(100)
         for log in [logging.getLogger("sqlalchemy.pool")]:
             log.addHandler(self.buf)
 
-    def teardown(self):
+    def teardown_test(self):
         for log in [logging.getLogger("sqlalchemy.pool")]:
             log.removeHandler(self.buf)
         logging.getLogger("sqlalchemy.pool").setLevel(self.existing_level)
@@ -528,7 +528,7 @@ class LoggingNameTest(fixtures.TestBase):
         kw.update({"echo": True})
         return engines.testing_engine(options=kw)
 
-    def setup(self):
+    def setup_test(self):
         self.buf = logging.handlers.BufferingHandler(100)
         for log in [
             logging.getLogger("sqlalchemy.engine"),
@@ -536,7 +536,7 @@ class LoggingNameTest(fixtures.TestBase):
         ]:
             log.addHandler(self.buf)
 
-    def teardown(self):
+    def teardown_test(self):
         for log in [
             logging.getLogger("sqlalchemy.engine"),
             logging.getLogger("sqlalchemy.pool"),
@@ -588,13 +588,13 @@ class LoggingNameTest(fixtures.TestBase):
 
 
 class EchoTest(fixtures.TestBase):
     __requires__ = ("ad_hoc_engines",)
 
-    def setup(self):
+    def setup_test(self):
         self.level = logging.getLogger("sqlalchemy.engine").level
         logging.getLogger("sqlalchemy.engine").setLevel(logging.WARN)
         self.buf = logging.handlers.BufferingHandler(100)
         logging.getLogger("sqlalchemy.engine").addHandler(self.buf)
 
-    def teardown(self):
+    def teardown_test(self):
         logging.getLogger("sqlalchemy.engine").removeHandler(self.buf)
         logging.getLogger("sqlalchemy.engine").setLevel(self.level)
diff --git a/test/engine/test_pool.py b/test/engine/test_pool.py
index 550fedb8e..decdce3f9 100644
--- a/test/engine/test_pool.py
+++ b/test/engine/test_pool.py
@@ -17,7 +17,9 @@ from sqlalchemy.testing import eq_
 from sqlalchemy.testing import expect_raises
 from sqlalchemy.testing import fixtures
 from sqlalchemy.testing import is_
+from sqlalchemy.testing import is_none
 from sqlalchemy.testing import is_not
+from sqlalchemy.testing import is_not_none
 from sqlalchemy.testing import is_true
 from sqlalchemy.testing import mock
 from sqlalchemy.testing.engines import testing_engine
@@ -63,18 +65,18 @@ def MockDBAPI():  # noqa
 
 
 class PoolTestBase(fixtures.TestBase):
-    def setup(self):
+    def setup_test(self):
         pool.clear_managers()
         self._teardown_conns = []
 
-    def teardown(self):
+    def teardown_test(self):
         for ref in self._teardown_conns:
             conn = ref()
             if conn:
                 conn.close()
 
     @classmethod
-    def teardown_class(cls):
+    def teardown_test_class(cls):
         pool.clear_managers()
 
     def _with_teardown(self, connection):
@@ -364,10 +366,17 @@ class PoolEventsTest(PoolTestBase):
         p = self._queuepool_fixture()
         canary = []
 
+        @event.listens_for(p, "checkin")
         def checkin(*arg, **kw):
             canary.append("checkin")
 
-        event.listen(p, "checkin", checkin)
+        @event.listens_for(p, "close_detached")
+        def close_detached(*arg, **kw):
+            canary.append("close_detached")
+
+        @event.listens_for(p, "detach")
+        def detach(*arg, **kw):
+            canary.append("detach")
 
         return p, canary
@@ -629,15 +638,35 @@ class PoolEventsTest(PoolTestBase):
 
         assert canary.call_args_list[0][0][0] is dbapi_con
         assert canary.call_args_list[0][0][2] is exc
 
+    @testing.combinations((True, testing.requires.python3), (False,))
     @testing.requires.predictable_gc
-    def test_checkin_event_gc(self):
+    def test_checkin_event_gc(self, detach_gced):
         p, canary = self._checkin_event_fixture()
 
+        if detach_gced:
+            p._is_asyncio = True
+
         c1 = p.connect()
+
+        dbapi_connection = weakref.ref(c1.connection)
+
         eq_(canary, [])
         del c1
         lazy_gc()
-        eq_(canary, ["checkin"])
+
+        if detach_gced:
+            # "close_detached" is not called because for asyncio the
+            # connection
is just lost. + eq_(canary, ["detach"]) + + else: + eq_(canary, ["checkin"]) + + gc_collect() + if detach_gced: + is_none(dbapi_connection()) + else: + is_not_none(dbapi_connection()) def test_checkin_event_on_subsequently_recreated(self): p, canary = self._checkin_event_fixture() @@ -744,7 +773,7 @@ class PoolEventsTest(PoolTestBase): eq_(conn.info["important_flag"], True) conn.close() - def teardown(self): + def teardown_test(self): # TODO: need to get remove() functionality # going pool.Pool.dispatch._clear() @@ -1490,12 +1519,16 @@ class QueuePoolTest(PoolTestBase): self._assert_cleanup_on_pooled_reconnect(dbapi, p) + @testing.combinations((True, testing.requires.python3), (False,)) @testing.requires.predictable_gc - def test_userspace_disconnectionerror_weakref_finalizer(self): + def test_userspace_disconnectionerror_weakref_finalizer(self, detach_gced): dbapi, pool = self._queuepool_dbapi_fixture( pool_size=1, max_overflow=2 ) + if detach_gced: + pool._is_asyncio = True + @event.listens_for(pool, "checkout") def handle_checkout_event(dbapi_con, con_record, con_proxy): if getattr(dbapi_con, "boom") == "yes": @@ -1514,8 +1547,12 @@ class QueuePoolTest(PoolTestBase): del conn gc_collect() - # new connection was reset on return appropriately - eq_(dbapi_conn.mock_calls, [call.rollback()]) + if detach_gced: + # new connection was detached + abandoned on return + eq_(dbapi_conn.mock_calls, []) + else: + # new connection reset and returned to pool + eq_(dbapi_conn.mock_calls, [call.rollback()]) # old connection was just closed - did not get an # erroneous reset on return diff --git a/test/engine/test_processors.py b/test/engine/test_processors.py index 3810de06a..5a4220c82 100644 --- a/test/engine/test_processors.py +++ b/test/engine/test_processors.py @@ -25,7 +25,7 @@ class CBooleanProcessorTest(_BooleanProcessorTest): __requires__ = ("cextensions",) @classmethod - def setup_class(cls): + def setup_test_class(cls): from sqlalchemy import cprocessors cls.module = 
cprocessors @@ -83,7 +83,7 @@ class _DateProcessorTest(fixtures.TestBase): class PyDateProcessorTest(_DateProcessorTest): @classmethod - def setup_class(cls): + def setup_test_class(cls): from sqlalchemy import processors cls.module = type( @@ -100,7 +100,7 @@ class CDateProcessorTest(_DateProcessorTest): __requires__ = ("cextensions",) @classmethod - def setup_class(cls): + def setup_test_class(cls): from sqlalchemy import cprocessors cls.module = cprocessors @@ -185,7 +185,7 @@ class _DistillArgsTest(fixtures.TestBase): class PyDistillArgsTest(_DistillArgsTest): @classmethod - def setup_class(cls): + def setup_test_class(cls): from sqlalchemy.engine import util cls.module = type( @@ -202,7 +202,7 @@ class CDistillArgsTest(_DistillArgsTest): __requires__ = ("cextensions",) @classmethod - def setup_class(cls): + def setup_test_class(cls): from sqlalchemy import cutils as util cls.module = util diff --git a/test/engine/test_reconnect.py b/test/engine/test_reconnect.py index 5fe7f6cc2..7a64b2550 100644 --- a/test/engine/test_reconnect.py +++ b/test/engine/test_reconnect.py @@ -162,7 +162,7 @@ def MockDBAPI(): class PrePingMockTest(fixtures.TestBase): - def setup(self): + def setup_test(self): self.dbapi = MockDBAPI() def _pool_fixture(self, pre_ping, pool_kw=None): @@ -182,7 +182,7 @@ class PrePingMockTest(fixtures.TestBase): ) return _pool - def teardown(self): + def teardown_test(self): self.dbapi.dispose() def test_ping_not_on_first_connect(self): @@ -357,7 +357,7 @@ class PrePingMockTest(fixtures.TestBase): class MockReconnectTest(fixtures.TestBase): - def setup(self): + def setup_test(self): self.dbapi = MockDBAPI() self.db = testing_engine( @@ -373,7 +373,7 @@ class MockReconnectTest(fixtures.TestBase): e, MockDisconnect ) - def teardown(self): + def teardown_test(self): self.dbapi.dispose() def test_reconnect(self): @@ -1004,10 +1004,10 @@ class RealReconnectTest(fixtures.TestBase): __backend__ = True __requires__ = "graceful_disconnects", "ad_hoc_engines" - 
def setup(self): + def setup_test(self): self.engine = engines.reconnecting_engine() - def teardown(self): + def teardown_test(self): self.engine.dispose() def test_reconnect(self): @@ -1336,7 +1336,7 @@ class PrePingRealTest(fixtures.TestBase): class InvalidateDuringResultTest(fixtures.TestBase): __backend__ = True - def setup(self): + def setup_test(self): self.engine = engines.reconnecting_engine() self.meta = MetaData() table = Table( @@ -1353,7 +1353,7 @@ class InvalidateDuringResultTest(fixtures.TestBase): [{"id": i, "name": "row %d" % i} for i in range(1, 100)], ) - def teardown(self): + def teardown_test(self): with self.engine.begin() as conn: self.meta.drop_all(conn) self.engine.dispose() @@ -1470,7 +1470,7 @@ class ReconnectRecipeTest(fixtures.TestBase): __backend__ = True - def setup(self): + def setup_test(self): self.engine = engines.reconnecting_engine( options=dict(future=self.future) ) @@ -1483,7 +1483,7 @@ class ReconnectRecipeTest(fixtures.TestBase): ) self.meta.create_all(self.engine) - def teardown(self): + def teardown_test(self): self.meta.drop_all(self.engine) self.engine.dispose() diff --git a/test/engine/test_reflection.py b/test/engine/test_reflection.py index 658cdd79f..0a46ddeec 100644 --- a/test/engine/test_reflection.py +++ b/test/engine/test_reflection.py @@ -796,7 +796,7 @@ class ReflectionTest(fixtures.TestBase, ComparesTables): assert f1 in b1.constraints assert len(b1.constraints) == 2 - def test_override_keys(self, connection, metadata): + def test_override_keys(self, metadata, connection): """test that columns can be overridden with a 'key', and that ForeignKey targeting during reflection still works.""" @@ -1375,7 +1375,7 @@ class CreateDropTest(fixtures.TablesTest): run_create_tables = None @classmethod - def teardown_class(cls): + def teardown_test_class(cls): # TablesTest is used here without # run_create_tables, so add an explicit drop of whatever is in # metadata @@ -1658,7 +1658,6 @@ class SchemaTest(fixtures.TestBase): 
@testing.requires.schemas @testing.requires.cross_schema_fk_reflection @testing.requires.implicit_default_schema - @testing.provide_metadata def test_blank_schema_arg(self, connection, metadata): Table( @@ -1913,7 +1912,7 @@ class ReverseCasingReflectTest(fixtures.TestBase, AssertsCompiledSQL): __backend__ = True @testing.requires.denormalized_names - def setup(self): + def setup_test(self): with testing.db.begin() as conn: conn.exec_driver_sql( """ @@ -1926,7 +1925,7 @@ class ReverseCasingReflectTest(fixtures.TestBase, AssertsCompiledSQL): ) @testing.requires.denormalized_names - def teardown(self): + def teardown_test(self): with testing.db.begin() as conn: conn.exec_driver_sql("drop table weird_casing") diff --git a/test/engine/test_transaction.py b/test/engine/test_transaction.py index 79126fc5b..47504b60a 100644 --- a/test/engine/test_transaction.py +++ b/test/engine/test_transaction.py @@ -1,6 +1,5 @@ import sys -from sqlalchemy import create_engine from sqlalchemy import event from sqlalchemy import exc from sqlalchemy import func @@ -640,12 +639,12 @@ class AutoRollbackTest(fixtures.TestBase): __backend__ = True @classmethod - def setup_class(cls): + def setup_test_class(cls): global metadata metadata = MetaData() @classmethod - def teardown_class(cls): + def teardown_test_class(cls): metadata.drop_all(testing.db) def test_rollback_deadlock(self): @@ -871,11 +870,13 @@ class IsolationLevelTest(fixtures.TestBase): def test_per_engine(self): # new in 0.9 - eng = create_engine( + eng = testing_engine( testing.db.url, - execution_options={ - "isolation_level": self._non_default_isolation_level() - }, + options=dict( + execution_options={ + "isolation_level": self._non_default_isolation_level() + } + ), ) conn = eng.connect() eq_( @@ -884,7 +885,7 @@ class IsolationLevelTest(fixtures.TestBase): ) def test_per_option_engine(self): - eng = create_engine(testing.db.url).execution_options( + eng = testing_engine(testing.db.url).execution_options( 
isolation_level=self._non_default_isolation_level() ) @@ -895,14 +896,14 @@ class IsolationLevelTest(fixtures.TestBase): ) def test_isolation_level_accessors_connection_default(self): - eng = create_engine(testing.db.url) + eng = testing_engine(testing.db.url) with eng.connect() as conn: eq_(conn.default_isolation_level, self._default_isolation_level()) with eng.connect() as conn: eq_(conn.get_isolation_level(), self._default_isolation_level()) def test_isolation_level_accessors_connection_option_modified(self): - eng = create_engine(testing.db.url) + eng = testing_engine(testing.db.url) with eng.connect() as conn: c2 = conn.execution_options( isolation_level=self._non_default_isolation_level() |
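The reworked `test_checkin_event_gc` above asserts on a weak reference to the raw DBAPI connection after garbage collection. A stdlib-only sketch of that weakref pattern (the class name here is illustrative, not from the diff):

```python
# Sketch of the weakref technique used by the updated test: keep a weak
# reference to the raw connection, drop the last strong reference, force a
# collection, then check whether the object survived.
import gc
import weakref


class FakeDBAPIConnection:  # stand-in for a real DBAPI connection
    pass


conn = FakeDBAPIConnection()
ref = weakref.ref(conn)

assert ref() is not None  # alive while strongly referenced
del conn
gc.collect()
assert ref() is None  # collected once the last strong reference is gone
```

In the test itself, the asyncio (`detach_gced`) branch expects the connection to be collected (`is_none`), while the non-asyncio branch expects the pool's checkin to keep it alive (`is_not_none`).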

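Per the commit message, the renamed `setup_test`/`teardown_test` hooks are invoked through pytest autouse fixtures rather than pytest's hardcoded xdist-style names, so function-scoped fixtures can run around them. A minimal sketch of that mechanism, with assumed names (not SQLAlchemy's actual `pytestplugin` code):

```python
# Hypothetical sketch: an autouse fixture dispatches setup_test/teardown_test,
# placing them inside pytest's function-scoped fixture setup/teardown sequence.
import pytest


class TestBase:
    @pytest.fixture(autouse=True)
    def _setup_teardown_test(self):
        # function-scoped fixtures requested by the test are already set up
        if hasattr(self, "setup_test"):
            self.setup_test()
        yield  # the test body runs here
        # runs before those fixtures are torn down
        if hasattr(self, "teardown_test"):
            self.teardown_test()
```

Because the hook is an ordinary fixture, its ordering relative to fixtures like `connection` and `metadata` is controlled by pytest's fixture resolution instead of hardcoded naming conventions.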