
[WIP] Improvements to speed CI and testing #159

Closed · wants to merge 13 commits

Conversation

eholum (Collaborator) commented Feb 6, 2024

No description provided.

sea-bass (Owner) commented Feb 6, 2024

I think there are a few tests that cannot run in parallel because they clobber each other (namely, the PDDLStream and system integration tests). I wonder if there are pytest decorators that can be thrown in there to force them not to run in parallel?

Weirdly enough, even though there are test failures the CI shows up as green lol...
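
One real possibility with pytest-xdist (which the thread settles on below) is the `xdist_group` marker: tests that share a group name are all scheduled onto the same worker, so they never run concurrently with each other, while everything else still fans out in parallel. The sketch below is illustrative and not from this PR; it assumes the PDDLStream and system integration tests are tagged with `@pytest.mark.xdist_group(name="integration")` in their Python test files, where the group name is made up.

```bash
# Minimal sketch (requires pytest-xdist): run tests across all cores, but keep
# every test tagged with the same xdist_group name on a single worker so the
# clobbering tests execute sequentially relative to each other.
python3 -m pytest -n auto --dist loadgroup
```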

eholum (Collaborator, Author) commented Feb 6, 2024

> Weirdly enough, even though there are test failures the CI shows up as green lol...

Yeah, playing around with pytest-parallel vs pytest-xdist, but now I see kevlened/pytest-parallel#104. Will try xdist in a moment. Will also see if we can at least cache pip deps for pytests... I'm ignoring Docker build caches for the moment.
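
As a sketch of the pip-caching idea (paths and file names below are placeholders, not from this PR): pip already reuses downloaded wheels from its cache directory, so the main trick in CI is persisting that directory between runs, for example with actions/cache keyed on the requirements files.

```bash
# Illustrative only: point pip at a fixed cache directory that the CI job
# restores before installing and saves afterwards (e.g. via actions/cache).
# The requirements file name is a placeholder.
export PIP_CACHE_DIR="${HOME}/.cache/pip"
pip3 install -r requirements.txt
```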

sea-bass (Owner) commented Feb 6, 2024

> > Weirdly enough, even though there are test failures the CI shows up as green lol...
>
> Yeah, playing around with pytest-parallel vs pytest-xdist, but now I see kevlened/pytest-parallel#104. Will try xdist in a moment. Will also see if we can at least cache pip deps for pytests... I'm ignoring Docker build caches for the moment.

The biggest slowness in tests is the PDDLStream installation... so bang for buck wise, caching the Docker builds is gonna be it. But I also don't know how much is possible on free/public repos.
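
For what it's worth, one option that does work on free/public repos is buildx's GitHub Actions cache backend (subject to the per-repo cache size limit). The invocation below is a rough sketch with a made-up image tag, and it assumes buildx is running inside a GitHub Actions job so the `gha` cache backend is reachable; it is not necessarily what this repo ended up doing.

```bash
# Rough sketch: export/import Docker layer caches to the GitHub Actions cache
# backend so the expensive PDDLStream install layer is reused between runs.
# Must run inside a GitHub Actions job with buildx configured; the tag is made up.
docker buildx build \
  --cache-from type=gha \
  --cache-to type=gha,mode=max \
  -t pyrobosim:ci \
  .
```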

eholum (Collaborator, Author) commented Feb 6, 2024

> The biggest slowness in tests is the PDDLStream installation...

You are definitely not wrong... on the plus side, caching pip dependencies might save 1 or 2 seconds. You don't happen to have a Docker Hub account, do you? I think the free tier would probably cover what you need, since the GitHub registry charges for storage.

> Weirdly enough, even though there are test failures the CI shows up as green lol...

Yeah, that confused me. Something else is amiss with CI here, since I reverted that change and broke a unit test and... https://github.com/sea-bass/pyrobosim/actions/runs/7803256591/job/21282622248?pr=159#step:7:53

SUCCESS=0

# Do not swallow errors with tee
set -o pipefail
eholum (Collaborator, Author) commented on this diff:

Since you're piping to tee below, I think you need this: https://unix.stackexchange.com/q/14270

So that:
https://github.com/sea-bass/pyrobosim/pull/159/checks#step:7:130
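
To illustrate the point (a throwaway example, not part of the script): the exit status of a pipeline is normally the status of its last command, so tee masks a pytest failure unless pipefail is set.

```bash
# Throwaway demo of why `set -o pipefail` is needed before piping to tee:
false | tee /tmp/out.log; echo "without pipefail: $?"   # prints 0 (tee succeeded)

set -o pipefail
false | tee /tmp/out.log; echo "with pipefail: $?"      # prints 1 (false's status wins)
```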

@@ -11,6 +11,12 @@
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
TEST_RESULTS_DIR="${SCRIPT_DIR}/results"

# Ensure we run everything for coverage purposes, but ensure failures are returned by the script
SUCCESS=0
eholum (Collaborator, Author) commented on this diff:

We could alternatively just use set -e or something, but I did this in case you want ROS coverage to run even when the pytests fail.
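
A sketch of the intent behind the SUCCESS variable (illustrative; the step names are placeholders, not the actual script): every suite still runs so coverage is always collected, any failure flips the flag, and the script only reports the failure at the very end.

```bash
# Illustrative pattern, not the actual script: run every suite for coverage,
# remember whether anything failed, and return that at the very end.
SUCCESS=0

python3 -m pytest || SUCCESS=1        # Python unit tests (tee/pipefail pipeline in the real script)
run_ros_coverage_step || SUCCESS=1    # placeholder for the ROS coverage step

exit ${SUCCESS}
```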

eholum (Collaborator, Author) commented Feb 7, 2024

Closing in favor of #160, since this doesn't need to be on my fork anymore.

eholum closed this on Feb 7, 2024