
Added pytorch framework script to test on gpu nodes #86

Open
Infinity-ops wants to merge 18 commits into IKIM-Essen:main from Infinity-ops:nvidia/DLframeworks

Conversation

@Infinity-ops
Contributor

This branch is for testing DeepLearningExamples from Nvidia on GPU nodes.
The relevant test scripts for a few DL frameworks are added in the g_nodes role.

@enasca
Member

enasca commented Nov 17, 2022

A couple of comments.

  • Let's just remove the ability to run the test from Ansible. Even if we hide it under the "never" tag, Ansible just isn't the right tool for this kind of long-running operation.
  • The target directory should be configurable instead of hardcoded to /usr/local. We should also take into account that the configured path might be on an NFS server with root_squash, therefore it should be possible to execute the copy operation as a different user.
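The two points above could be sketched in shell roughly as follows; the variable names TARGET_DIR and COPY_USER are illustrative assumptions, not names from the PR:

```shell
#!/bin/sh
# Hypothetical sketch: make the target directory configurable instead of
# hardcoding /usr/local, and allow the copy to run as a different user so
# it still works on NFS exports mounted with root_squash (where root's
# writes are remapped and typically denied).
TARGET_DIR="${TARGET_DIR:-/usr/local}"   # overridable default
COPY_USER="${COPY_USER:-root}"           # non-root user for root_squash NFS

# The actual copy would be executed as the configured user, e.g.:
#   runuser -u "$COPY_USER" -- cp -r DeepLearningExamples "$TARGET_DIR"
echo "would copy to $TARGET_DIR as $COPY_USER"
```

Both variables fall back to the current hardcoded behavior when unset, so existing invocations would keep working.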

Review comment threads:
  • ansible/roles/g_nodes/files/SSD_docker.sh (outdated)
  • ansible/roles/g_nodes/files/gpuNet_docker.sh
  • ansible/roles/g_nodes/tasks/main.yml (outdated)
  • ansible/roles/g_nodes/defaults/main.yml (outdated)
@enasca enasca self-requested a review November 21, 2022 10:43
Member

@enasca enasca left a comment


It looks good now but we need to re-test it on a GPU node after these changes.

@enasca enasca self-requested a review November 22, 2022 16:38
Member

@enasca enasca left a comment


Since performing actions as a different user in Ansible is rather fiddly, we could go back to copying the scripts onto each node as root. The scripts should be tweaked so that they don't just download into the current directory; instead, the download location should be configurable but default to a predetermined location on NFS. This NFS path should be world-readable while writable only by a service account. In order to download the datasets, an admin could execute the scripts via runuser -u <service account> <script path>.
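A minimal sketch of that flow, assuming the NFS path and the service-account name (both placeholders here, not decided in this PR):

```shell
#!/bin/sh
# Hypothetical sketch: the download location is configurable but defaults
# to a shared NFS path (placeholder). The path is meant to be world-readable
# while writable only by a service account.
DATA_DIR="${DATA_DIR:-/nfs/datasets/DeepLearningExamples}"  # assumed default
echo "datasets will be downloaded into: $DATA_DIR"

# An admin would populate the datasets by running the test script as the
# service account (account name "dl-data" is a placeholder), e.g.:
#   runuser -u dl-data -- ./SSD_docker.sh
```

Regular users running a test would then only read from DATA_DIR; the write path is exercised solely by the admin via runuser.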

We could add a brief README with instructions on executing a test assuming that the dataset is already available. If the dataset isn't available, the user should notify an admin and ask them to download it.

