Get rid of database isolation option
As part of removing AIP-44, we remove the database isolation mode
and the pytest markers that skipped some tests.

Part of apache#44436
potiuk committed Nov 28, 2024
1 parent 77912f8 commit 3e09bbf
Showing 212 changed files with 540 additions and 1,287 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -289,7 +289,7 @@ jobs:
runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }}
python-versions-list-as-string: ${{ needs.build-info.outputs.python-versions-list-as-string }}
# generate no providers constraints only in canary builds - they take quite some time to generate
# they are not needed for regular builds, they are only needed to update constraints in canaries
generate-no-providers-constraints: ${{ needs.build-info.outputs.canary-run }}
image-tag: ${{ needs.build-info.outputs.image-tag }}
chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }}
40 changes: 0 additions & 40 deletions Dockerfile.ci
@@ -892,19 +892,6 @@ function environment_initialization() {
fi

RUN_TESTS=${RUN_TESTS:="false"}
if [[ ${DATABASE_ISOLATION=} == "true" ]]; then
    echo "${COLOR_BLUE}Force database isolation configuration:${COLOR_RESET}"
    export AIRFLOW__CORE__DATABASE_ACCESS_ISOLATION=True
    export AIRFLOW__CORE__INTERNAL_API_URL=http://localhost:9080
    # some random secret keys. Setting them as environment variables will make them used in tests and in
    # the internal API server
    export AIRFLOW__CORE__INTERNAL_API_SECRET_KEY="Z27xjUwQTz4txlWZyJzLqg=="
    export AIRFLOW__CORE__FERNET_KEY="l7KBR9aaH2YumhL1InlNf24gTNna8aW2WiwF2s-n_PE="
    if [[ ${START_AIRFLOW=} != "true" ]]; then
        export RUN_TESTS_WITH_DATABASE_ISOLATION="true"
    fi
fi

CI=${CI:="false"}

# Added to have run-tests on path
@@ -1086,33 +1073,6 @@ function check_run_tests() {
python "${IN_CONTAINER_DIR}/remove_arm_packages.py"
fi

if [[ ${DATABASE_ISOLATION=} == "true" ]]; then
    echo "${COLOR_BLUE}Starting internal API server:${COLOR_RESET}"
    # We need to start the internal API server before running tests
    airflow db migrate
    # We set a very large clock grace, allowing tests to run with simulated times in other years
    AIRFLOW__CORE__INTERNAL_API_CLOCK_GRACE=999999999 airflow internal-api >"${AIRFLOW_HOME}/logs/internal-api.log" 2>&1 &
    echo
    echo -n "${COLOR_YELLOW}Waiting for internal API server to listen on 9080. ${COLOR_RESET}"
    echo
    for _ in $(seq 1 40)
    do
        sleep 0.5
        nc -z localhost 9080 && echo && echo "${COLOR_GREEN}Internal API server started!!${COLOR_RESET}" && break
        echo -n "."
    done
    if ! nc -z localhost 9080; then
        echo
        echo "${COLOR_RED}Internal API server did not start in 20 seconds!!${COLOR_RESET}"
        echo
        echo "${COLOR_BLUE}Logs:${COLOR_RESET}"
        echo
        cat "${AIRFLOW_HOME}/logs/internal-api.log"
        echo
        exit 1
    fi
fi

if [[ ${TEST_GROUP:=""} == "system" ]]; then
exec "${IN_CONTAINER_DIR}/run_system_tests.sh" "${@}"
else
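The removed shell code above polls port 9080 with ``nc`` in a fixed-count loop. The same wait-for-port pattern can be sketched in Python; ``wait_for_port`` is a hypothetical helper for illustration, not part of Airflow:

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 20.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means the server is listening.
            with socket.create_connection((host, port), timeout=0.5):
                return True
        except OSError:
            time.sleep(0.5)  # mirror the shell loop's 0.5 s polling interval
    return False
```

Like the 40 x 0.5 s shell loop, the default gives the server roughly 20 seconds to come up before the caller reports failure.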
18 changes: 0 additions & 18 deletions airflow/utils/session.py
@@ -17,7 +17,6 @@
from __future__ import annotations

import contextlib
import os
from collections.abc import Generator
from functools import wraps
from inspect import signature
@@ -26,29 +25,12 @@
from sqlalchemy.orm import Session as SASession

from airflow import settings
from airflow.api_internal.internal_api_call import InternalApiConfig
from airflow.settings import TracebackSession, TracebackSessionForTests
from airflow.typing_compat import ParamSpec


@contextlib.contextmanager
def create_session(scoped: bool = True) -> Generator[SASession, None, None]:
    """Contextmanager that will create and teardown a session."""
    if InternalApiConfig.get_use_internal_api():
        if os.environ.get("RUN_TESTS_WITH_DATABASE_ISOLATION", "false").lower() == "true":
            traceback_session_for_tests = TracebackSessionForTests()
            try:
                yield traceback_session_for_tests
                if traceback_session_for_tests.current_db_session:
                    traceback_session_for_tests.current_db_session.commit()
            except Exception:
                traceback_session_for_tests.current_db_session.rollback()
                raise
            finally:
                traceback_session_for_tests.current_db_session.close()
        else:
            yield TracebackSession()
        return
    if scoped:
        Session = getattr(settings, "Session", None)
    else:
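The removed branch of ``create_session`` follows the standard commit-on-success, rollback-on-error, always-close pattern. A minimal self-contained sketch of that shape, using a ``FakeSession`` stand-in (illustrative, not a SQLAlchemy session):

```python
import contextlib
from collections.abc import Generator


class FakeSession:
    """Stand-in for a SQLAlchemy session, recording what happened to it."""

    def __init__(self) -> None:
        self.committed = False
        self.rolled_back = False
        self.closed = False

    def commit(self) -> None:
        self.committed = True

    def rollback(self) -> None:
        self.rolled_back = True

    def close(self) -> None:
        self.closed = True


@contextlib.contextmanager
def session_scope() -> Generator[FakeSession, None, None]:
    session = FakeSession()
    try:
        yield session
        session.commit()    # commit only if the with-block succeeded
    except Exception:
        session.rollback()  # undo on any error, then re-raise to the caller
        raise
    finally:
        session.close()     # always release the connection
```

Used as ``with session_scope() as session: ...``, a clean exit commits and an exception rolls back; either way the session is closed.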
34 changes: 0 additions & 34 deletions contributing-docs/testing/unit_tests.rst
@@ -691,40 +691,6 @@ You can also use fixture to create object that needs database just like this.
conn = request.getfixturevalue(conn)
...
Running tests with Database isolation
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Running tests with DB isolation is a special case of tests that require the ``internal-api`` component to be
started before the tests can execute. Only selected tests can be run with database isolation
(TODO: add the list) - they simulate untrusted components (dag file processor, triggerer,
worker) running in an environment with no DB configuration, where certain "internal_api" endpoints
are used to communicate with the internal-api component (which can access the DB directly).

In ``database isolation mode`` the test code can access the DB and perform setup/teardown, but code
from the airflow package itself will fail if it accesses the database directly - all DB calls should go
through the internal API component.
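
The "fail loudly on direct DB access" behaviour described above can be enforced by a session object that raises on any use. A minimal sketch of that guard pattern; ``ForbiddenSession`` is an illustrative name, not Airflow's actual ``TracebackSession`` implementation:

```python
class ForbiddenSession:
    """Session stand-in that refuses every operation.

    In isolation mode, handing this object to code that expects a real
    database session turns any direct DB access into a loud error.
    """

    def __getattr__(self, name: str):
        # Any attribute access (query, add, commit, ...) raises immediately.
        raise RuntimeError(
            f"Direct database access ({name}) is not allowed in "
            "database isolation mode - use the internal API instead."
        )
```

Any call such as ``session.query(...)`` or ``session.commit()`` on this object raises ``RuntimeError`` before touching a database.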

When you run ``breeze testing providers-tests --database-isolation`` - the internal API server will be started for
you automatically:

.. code-block:: shell

    breeze testing providers-tests --database-isolation tests/dag_processing/

However, when you want to run the tests interactively, you need to use the ``breeze shell --database-isolation``
command and either use ``tmux`` to split your terminal and run the internal API component in the second
pane, or run it after re-entering the shell with a separate ``breeze exec`` command.

.. code-block:: shell

    breeze shell --database-isolation tests/dag_processing/
    > tmux
    > Ctrl-B "
    > Panel 1: airflow internal-api
    > Panel 2: pytest tests/dag_processing

Running Unit tests
------------------
