Add OpenMPI host injection script #1085
Conversation
PR merged! Moved
@ocaisa I guess that was an accidental close?

@boegel yes, accident, meant to cancel a comment I was working on
@agimenog We have recently split up the software-layer repository. The changes made in this PR should target the new repository, https://github.com/EESSI/software-layer-scripts. Maybe you can coordinate with @pfermi to open a new PR there?
Hi,
We are creating a new PR because we don't have the right permissions to push to the previous one: #963

We have removed the `maxdepth` option so that the script copies all the required libraries. To make it work, run the script as in the following example:

```
sh software-layer-mpi/scripts/mpi_support/install_openmpi_host_injection.sh --mpi-path /support/home/multixs/.local/easybuild/software/OpenMPI/4.1.5-GCC-12.3.0/
```
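For context, the core of the change is simply that the library copy is no longer depth-limited. Below is a minimal sketch of what that step could look like; the `TARGET_DIR` path, the variable names, and the exact `find` invocation are our assumptions for illustration, not the script's actual code:

```bash
#!/bin/bash
# Illustrative sketch only (assumed names, not the actual script code):
# copy every shared library found anywhere under the host MPI prefix,
# instead of limiting the search with -maxdepth 1, so libraries in
# subdirectories (e.g. lib/openmpi/) are picked up as well.
MPI_PATH="/support/home/multixs/.local/easybuild/software/OpenMPI/4.1.5-GCC-12.3.0"
TARGET_DIR="/path/to/host_injections/openmpi/lib"   # placeholder destination

find "${MPI_PATH}/lib" -type f -name 'lib*.so*' | while read -r lib; do
    cp -v "${lib}" "${TARGET_DIR}/"
done
```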
With this, we ran an OSU benchmark and it worked as expected:
```
srun -p short -n 2 --mpi=pmix --nodelist=node001,node002 /support/home/multixs/pull-963/osu-micro-benchmarks-7.3/c/mpi/pt2pt/standard/osu_latency
OSU MPI Latency Test v7.3
Size    Latency (us)
Datatype: MPI_CHAR.
1          40.29
2          38.29
4          36.31
8          32.60
16         32.89
32         32.59
64         32.61
128        36.76
256        43.53
512        51.78
1024       62.63
2048      143.23
```
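(As an extra sanity check on top of the benchmark, one way to confirm which `libmpi` the binary actually resolves at run time is to inspect it with `ldd`, for example:)

```bash
# Show which libmpi the benchmark binary picks up at run time
ldd /support/home/multixs/pull-963/osu-micro-benchmarks-7.3/c/mpi/pt2pt/standard/osu_latency | grep -i libmpi
```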
If the result is okay for you, we can start working on changing the patchelf options so that the script also works with symlinks; a rough sketch of what we have in mind is below.
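To make this concrete, here is a hypothetical sketch (paths and variable names are ours, and this is not the script's current behaviour): copy the real library files and recreate the host's symlink structure next to them, so that names like `libmpi.so` and `libmpi.so.40` resolve inside the injection directory and patchelf only ever touches the copied real files rather than following a symlink into the host installation.

```bash
#!/bin/bash
# Hypothetical symlink-aware variant (illustrative only, assumed names).
MPI_PATH="/support/home/multixs/.local/easybuild/software/OpenMPI/4.1.5-GCC-12.3.0"
TARGET_DIR="/path/to/host_injections/openmpi/lib"   # placeholder destination

find "${MPI_PATH}/lib" -name 'lib*.so*' | while read -r lib; do
    name="$(basename "${lib}")"
    if [ -L "${lib}" ]; then
        # Recreate the symlink, pointing at the name of the resolved real
        # file (which is copied below when find reaches it).
        ln -sf "$(basename "$(readlink -f "${lib}")")" "${TARGET_DIR}/${name}"
    else
        cp -v "${lib}" "${TARGET_DIR}/"
    fi
done
```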
Also, we would like to know if there is a place where we can document how to use the script and how to compile software against the injected MPI libraries.
We would also like to mention that when we tried the same workflow with OpenMPI/5.0.7-GCC-14.2.0, the MPI injection failed; we are currently working on a fix.
Regards,
Arturo.