CI: fix failing test_containerized.py integration test for containerized code #6707

Merged: 12 commits, Jan 16, 2025
@@ -3,7 +3,7 @@ description: Bash run in Docker image through Singularity
 default_calc_job_plugin: core.arithmetic.add
 computer: localhost
 filepath_executable: /bin/sh
-image_name: docker://alpine:3
-engine_command: singularity exec --bind $PWD:$PWD {image_name}
+image_name: alpine:3
+engine_command: docker run --user 1001:100 -v $PWD:$PWD -w $PWD -i {image_name}
 prepend_text: ' '
 append_text: ' '
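For orientation, the `engine_command` above is a template: AiiDA substitutes `{image_name}` at submission time and prepends the result to the configured executable. A rough sketch (an assumption, not part of the PR) of what the new Docker-based invocation resolves to, leaving out the quoting and stdin/stdout redirection that the scheduler plugin adds around it:

```bash
# Sketch of the resolved engine command for the config above, assuming the
# placeholder substitution behaves as it did for the old Singularity variant.
docker run --user 1001:100 -v $PWD:$PWD -w $PWD -i alpine:3 /bin/sh
```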
@@ -12,5 +12,3 @@ verdi -p test_aiida run ${SYSTEM_TESTS}/test_daemon.py
 verdi -p test_aiida run ${SYSTEM_TESTS}/test_containerized_code.py
 bash ${SYSTEM_TESTS}/test_polish_workchains.sh
 verdi daemon stop
-
-AIIDA_TEST_PROFILE=test_aiida pytest --db-backend psql -m nightly tests/
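The pytest invocation removed here is not dropped; it moves into its own workflow step, shown in the nightly.yml hunk below. To reproduce that step locally, something along these lines should be equivalent, assuming a psql-backed `test_aiida` profile already exists:

```bash
# Local equivalent of the new "Run pytest nightly tests" workflow step
# (assumes a psql-backed test profile named test_aiida is already set up).
export AIIDA_TEST_PROFILE=test_aiida
export AIIDA_WARN_v3=1
pytest --db-backend psql -m nightly tests/
```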
21 changes: 13 additions & 8 deletions .github/workflows/nightly.yml
@@ -27,7 +27,7 @@ jobs:
   nightly-tests:

     if: github.repository == 'aiidateam/aiida-core' # Prevent running the builds on forks as well
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-24.04

     services:
       postgres:
@@ -55,9 +55,6 @@ jobs:

     steps:
       - uses: actions/checkout@v4
-      - uses: eWaterCycle/setup-singularity@v7 # for containerized code test
-        with:
-          singularity-version: 3.8.7

       - name: Install system dependencies
         run: sudo apt update && sudo apt install postgresql
Collaborator commented:

I wonder if we should be installing graphviz here?

Member Author replied:

Why? It seems the nightly pytest runs don't need graphviz.

@@ -72,15 +69,23 @@ jobs:
       - name: Setup environment
         run: .github/workflows/setup.sh

-      - name: Run tests
-        id: tests
-        run: .github/workflows/tests_nightly.sh
+      - name: Run pytest nightly tests
+        id: pytest-tests
+        env:
+          AIIDA_TEST_PROFILE: test_aiida
+          AIIDA_WARN_v3: 1
+        run: |
+          pytest --db-backend psql -m nightly tests/
+
+      - name: Run daemon nightly tests
+        id: daemon-tests
+        run: .github/workflows/daemon_tests.sh

       - name: Slack notification
         # Always run this step (otherwise it would be skipped if any of the previous steps fail) but only if the
         # `install` or `tests` steps failed, and the `SLACK_WEBHOOK` is available. The latter is not the case for
         # pull requests that come from forks. This is a limitation of secrets on GHA
-        if: always() && (steps.install.outcome == 'failure' || steps.tests.outcome == 'failure') && env.SLACK_WEBHOOK != null
+        if: always() && (steps.install.outcome == 'failure' || steps.pytest-tests.outcome == 'failure' || steps.daemon-tests.outcome == 'failure') && env.SLACK_WEBHOOK != null
         uses: rtCamp/action-slack-notify@v2
         env:
           SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
2 changes: 1 addition & 1 deletion .github/workflows/setup.sh
@@ -19,7 +19,7 @@ verdi computer configure core.local localhost --config "${CONFIG}/localhost-conf
 verdi computer test localhost
 verdi code create core.code.installed --non-interactive --config "${CONFIG}/doubler.yaml"
 verdi code create core.code.installed --non-interactive --config "${CONFIG}/add.yaml"
-verdi code create core.code.containerized --non-interactive --config "${CONFIG}/add-singularity.yaml"
+verdi code create core.code.containerized --non-interactive --config "${CONFIG}/add-containerized.yaml"

 # set up slurm-ssh computer
 verdi computer setup --non-interactive --config "${CONFIG}/slurm-ssh.yaml"
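After setup.sh has run with the renamed config, one quick way to confirm that the containerized code was registered is to list the codes on the test profile. The code's label comes from add-containerized.yaml and is not visible in this diff, so the sketch below only lists all codes rather than showing a specific one:

```bash
# Manual sanity check (sketch): list codes registered on the test_aiida profile;
# the containerized code created from add-containerized.yaml should appear here.
verdi -p test_aiida code list
```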