
Commit 1735b32

Merge pull request #30 from EESSI/aoc_tweaks
Update some of the text
2 parents: c0cba40 + c1de1e8

File tree: 2 files changed (+33, -41 lines)


isc25/EESSI/abstract.tex

Lines changed: 11 additions & 9 deletions
@@ -1,19 +1,21 @@
 What if there was a way to avoid having to install a broad range of scientific software from scratch on every HPC
 cluster or cloud instance you use or maintain, without compromising on performance?

-Installing scientific software for supercomputers is known to be a tedious and time-consuming task. Especially as the
-HPC user community becomes more diverse, computational science expands rapidly, the diversity of system architectures
-increases the application software stack continues to deepen. Simultaneously, we see a surge in interest in cloud
-computing for scientific computing. Delivering optimised software installations and providing access to these
+Installing scientific software for supercomputers is known to be a tedious and time-consuming task. The application
+software stack continues to deepen as the
+HPC user community becomes more diverse, computational science expands rapidly, and the diversity of system architectures
+increases. Simultaneously, we see a surge in interest in public cloud
+infrastructures for scientific computing. Delivering optimised software installations and providing access to these
 installations in a reliable, user-friendly, and reproducible way is a highly non-trivial task that affects application
 developers, HPC user support teams, and the users themselves.

-This tutorial aims to address these challenges by providing the attendees with the knowledge to stream optimised
-scientific software. For this, the tutorial introduces European Environment for Scientific Software Installations
+This tutorial aims to address these challenges by providing the attendees with the tools to \emph{stream} the optimised
+scientific software they need. The tutorial introduces European Environment for Scientific Software Installations
 (\emph{EESSI}), a collaboration between various European HPC sites \& industry partners, with the common goal of
-creating a shared repository of scientific software installations that can be used on a variety of systems, regardless
-of which flavor/version of Linux distribution or processor architecture is used, or whether it’s a full size HPC
+creating a shared repository of scientific software installations (\emph{not} recipes) that can be used on a variety of
+systems, regardless
+of which flavor/version of Linux distribution or processor architecture is used, or whether it's a full size HPC
 cluster, a cloud environment or a personal workstation.

 We cover the usage of EESSI, different ways to accessing EESSI, how to add software to EESSI, and highlight some more
-advanced features. We will also show how to engage with the community and contribute to the project.
+advanced features. We will also show attendees how to engage with the community and contribute to the project.
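The \emph{stream} wording above refers to EESSI delivering ready-to-run installations to clients on demand. As a minimal sketch of what that looks like for an attendee, assuming a client where the EESSI CernVM-FS repository (software.eessi.io) is mounted; the repository version and module name below are illustrative, not part of this commit:

\begin{verbatim}
# initialise the EESSI environment (version shown is only an example)
source /cvmfs/software.eessi.io/versions/2023.06/init/bash

# installations optimised for the detected CPU are now available as modules
module avail
module load GROMACS
\end{verbatim}

Because EESSI ships installations rather than recipes, nothing is compiled on the client: the matching optimised binaries are downloaded and cached on first use.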

isc25/EESSI/description.tex

Lines changed: 22 additions & 32 deletions
@@ -9,32 +9,22 @@
 \subsection*{Overview and Goals}

 Application developers, HPC sites, and end users %around the world
-spend significant amounts of time on optimised software installations. Surveys conducted at the
-\emph{``Getting Scientific Software Installed''} Birds-of-a-Feather sessions that we (co-)organised at both SC and ISC
-reveal that this (still) involves a significant amount of `manual' effort.
-% (and thus time/manpower).
-In the SC'19 survey,
-%In the most recent survey (SC'19),
-less than half of the respondents consistently automate software installation,
-and only ~25\% automate environment module file generation.
-%and only ~25\% automatically generate environment module files consistently, as opposed to composing them manually.
-%Although the problems that arise with installing scientific software are ubiquitous,
-Despite these ubiquitous problems,
-there is still inadequate collaboration
-between HPC sites to date: less than 30\% of respondents indicated that they
-work together with other HPC sites regarding software installation, even in most recent surveys.
-%Since EasyBuild can help relieve these burdens and foster collaboration, a tutorial introducing this tool is highly relevant to ISC'22 attendees.
-Hence, an EESSI tutorial is very relevant to ISC'25 attendees as this tool helps relieve these burdens and fosters
-collaboration.
-% KH: updated to use survey results from SC19 (Tue Feb 11th 17:43 CET)
-%\comment{MG: Are these numbers still OK? I seem to remember that at SC'19 more people said that they were using Spack or EB.}
-%\comment{KH: 25\% Spack, 15\% EasyBuild, see slide 4 in \url{http://easybuilders.github.io/easybuild/files/SC19_GSSI_BoF/GSSI_SC19_Survey_results.pdf}; w.r.t. collaboration, it's about 33\% if you add "yes" and "sometimes" (see slide 10)}
-
-%The main goal of this tutorial is to facilitate the daily work of its attendees.
-% It will introduce EasyBuild as a tool for providing optimised, reproducible, multi-platform scientific software installations in a consistent, efficient, and user-friendly manner. We will explain the scope of EasyBuild and show attendees how to get started,
-% how to tap into some advanced features,
-% showcase its use on some large-scale systems,
-% and show how to engage with the EasyBuild community.
+spend significant amounts of time on optimised software installations.
+Much of this effort is repeatedly duplicated, not just \emph{between}
+individuals but also by the individuals themselves as they reinstall
+the software they need for all of the computational platforms they have access to.
+%Surveys conducted at the
+%\emph{``Getting Scientific Software Installed''} Birds-of-a-Feather sessions that we (co-)organised at both SC and ISC
+%reveal that this (still) involves a significant amount of `manual' effort.
+% In the SC'19 survey,
+% less than half of the respondents consistently automate software installation,
+% and only ~25\% automate environment module file generation.
+% Despite these ubiquitous problems,
+% there is still inadequate collaboration
+% between HPC sites to date: less than 30\% of respondents indicated that they
+% work together with other HPC sites regarding software installation, even in most recent surveys.
+% Hence, an EESSI tutorial is very relevant to ISC'25 attendees as this tool helps relieve these burdens and fosters
+% collaboration.

 The \textbf{European Environment for Scientific Software Installations
 (EESSI)}\footnote{\href{https://eessi.io}{https://eessi.io}} project is a collaborative project
@@ -62,18 +52,18 @@ \subsection*{Overview and Goals}
 Additional families of general-purpose microprocessors including Arm 64-bit (aarch64) and RISC-V on top of th
 well-established Intel and AMD processors (both x86\_64), and different types of GPUS (NVIDIA, AMD, Intel) are
 increasing the diversity in system architectures. The rapid expansion of computational science beyond traditional
-domains like physics and computational chemistry, including bioinformatis, Machine Learning (ML) and Artificial
+domains like physics and computational chemistry, including bioinformatics, Machine Learning (ML) and Artificial
 Intelligence (AI), etc. leads to a significant growth of the software stack that is used for running scientific
 workloads. The emergence of commercial cloud infrastructure (Amazon EC2, Microsoft Azure, ...) and private cloud
 infrastructure (OpenStack) has competitive advantages over on-premise infrastructure for computational workloads, such
 as near-instant availability, increased flexibility, a broader variety of hardware platforms, and faster access to new
-generations of microprocessors. In addition the manpower that is available in the HPC user support teams that are
-responsible for helping scientists with running the software they require on high-end (and complex) infrastructure like
+generations of microprocessors. However, the manpower that is available in the HPC user support teams that are
+responsible for helping scientists with running the software they require on high-end and complex infrastructure like
 supercomputers (and beyond) is limited. These reasons indicate that there is a strong need for more collaboration on
-building and installing scientific software to avoid duplicate work across computational scientists and HPC user support
+building and installing scientific software to avoid duplicate work among computational scientists and HPC user support
 teams.

-The main goal of EESSI is to provide a collection of scientific software installations that work across a wide range of
+The main goal of EESSI is to provide a collection of scientific software \emph{installations} that work across a wide range of
 different platforms, including HPC clusters, cloud infrastructure, and personal workstations and laptops, without making
 compromises on the performance of that software.

@@ -90,7 +80,7 @@ \subsection*{Overview and Goals}
 of client systems that use the software installations provided by EESSI. It consists of a limited set of libraries
 and tools that are installed in a non-standard filesystem location (a "prefix"), which were built from source for
 the supported CPU families using Gentoo Prefix.
-\item The top layer of EESSI is called the software layer, which contains the actual scientific software applications
+\item The user-facing layer of EESSI is called the software layer, which contains the actual scientific software applications
 and their dependencies. Building, managing, and optimising the software installations included in the software
 layer is done using EasyBuild, a well-established software build and installation framework for managing
 (scientific) software stacks on High-Performance Computing (HPC) systems. Next to installing the software itself,
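As a rough sketch of how the layers described in this hunk map onto what a client sees, the layout below is illustrative (the paths, version, and CPU targets are assumptions, not part of the commit): the compatibility layer is shared per OS and CPU family, while the software layer carries separate EasyBuild-built installation trees per CPU microarchitecture.

\begin{verbatim}
/cvmfs/software.eessi.io/versions/2023.06/
    compat/linux/x86_64/...                          # compatibility layer (Gentoo Prefix)
    software/linux/x86_64/amd/zen2/...               # software layer, built with EasyBuild,
    software/linux/x86_64/intel/skylake_avx512/...   # one optimised tree per CPU target
    software/linux/aarch64/...
\end{verbatim}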
