Add build tools#157

Merged
Zeitsperre merged 7 commits into master from add-build-tools on Apr 15, 2026
Conversation

@Zeitsperre
Collaborator

@Zeitsperre Zeitsperre commented Nov 10, 2025

Overview

This PR fixes #156 by adding relevant build packages to the Conda environment for compilation of C, C++, and Fortran libraries.

Changes

  • Adds:
    • make (build runner for *nix systems)
    • cmake (cross-platform build runner, specific for raven-hydro)
    • c-compiler (metapackage for C)
    • cxx-compiler (metapackage for C++)
    • fortran-compiler (metapackage for FORTRAN)
    • libnetcdf (NetCDF4 C Library)
    • netcdf-fortran (NetCDF4 FORTRAN Library)

Testing Checklist

  • Manually verified whether build tools can be called as a user of PAVICS
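For reproducibility, a rough sketch of that manual check (the tool names are assumptions based on the package list above; this only probes PATH, it does not compile anything):

```shell
# Probe whether the expected build tools are reachable on PATH.
# Prints the resolved path for each tool, or a "not found" note.
for tool in gcc g++ gfortran make cmake; do
  command -v "$tool" || echo "$tool: not found"
done
```

A fuller check would compile a trivial hello-world in each language.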

Related Issue / Discussion

This PR does not require an immediate release. All it does is add the build-essential libraries so that users building and running C/C++/Fortran libraries no longer need to install compilers manually in their Conda environments.

@Zeitsperre Zeitsperre requested review from huard and tlvu November 10, 2025 17:20
@Zeitsperre Zeitsperre self-assigned this Nov 10, 2025
Collaborator

@tlvu tlvu left a comment


LGTM, but can you make sure we can compile CLASSIC as in my branch add-fortran-compiler-for-classic (compare: master...add-fortran-compiler-for-classic)?

Then this PR can replace my branch above.

Don't forget to commit all the extra output files under docker/saved_buildout/ for tracking purposes.

@Zeitsperre
Collaborator Author

@tlvu Sounds good.

Can you verify how you install CLASSIC? When checking the website, I get the impression that we also need to install Singularity (there happens to be a Conda package, but it's very old)?

If we just need a Fortran compiler, this PR already covers it.

@tlvu
Collaborator

tlvu commented Nov 10, 2025

> @tlvu Sounds good.
>
> Can you verify how you install CLASSIC? When checking the website, I get the impression that we also need to install Singularity (there happens to be a Conda package, but it's very old)?
>
> If we just need a Fortran compiler, this PR already covers it.

Did you look at my branch?

$ git clone https://gitlab.com/cccma/classic.git
$ cd classic
$ make
$ bin/CLASSIC_serial
# shows help message

@Zeitsperre
Collaborator Author

Zeitsperre commented Nov 10, 2025

@tlvu It seems like this approach won't work for CLASSIC:

[screenshot]

Despite `netcdf-fortran` being installed (`f90==True`), the CLASSIC code isn't set up to look for it:

[screenshot]

Update: It works if we explicitly set the NetCDF-Fortran include location in the Makefile. I think we can go forward with this as is.

FFLAGS += -I/opt/conda/envs/birdy/include

[screenshot]

@tlvu
Collaborator

tlvu commented Nov 10, 2025

> Update: It works if we explicitly set the NetCDF-Fortran include location in the Makefile. I think we can go forward with this as is.
>
> FFLAGS += -I/opt/conda/envs/birdy/include

So the Makefile has to be modified? Or can we simply set this when invoking make, so that we only need to document it?

@Zeitsperre
Collaborator Author

The Makefile needs to be modified, from what I can tell. For the type of user who would make use of this (advanced Fortran developers), I don't think this even constitutes a problem. It's possible that there is an option to invoke that configuration, but it's not entirely clear to me.

@Zeitsperre
Collaborator Author

@tlvu Do we need to document this? Outside the two Ouranos employees I know of that use CLASSIC, who else would this be supporting?

I'd be more inclined to modify the Makefile for CLASSIC on their repository to add some configurations to support conda instead.

@tlvu
Collaborator

tlvu commented Nov 10, 2025

> It's possible that there is an option to invoke

Trying my luck: `make FFLAGS=-I/opt/conda/envs/birdy/include` (setting the make variable on the fly)? Or `FFLAGS=-I/opt/conda/envs/birdy/include make` (making it an env var when invoking make)?
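For what it's worth, those two invocations behave differently under GNU make; a throwaway Makefile (a sketch, not the CLASSIC Makefile itself) shows the precedence:

```shell
# Minimal Makefile whose only recipe echoes FFLAGS (the \t is a literal tab).
printf 'FFLAGS = -O3\nall:\n\t@echo FFLAGS=$(FFLAGS)\n' > /tmp/demo.mk

make -f /tmp/demo.mk                 # FFLAGS=-O3 (the Makefile's own value)
make -f /tmp/demo.mk FFLAGS=-Iextra  # FFLAGS=-Iextra (command line REPLACES the Makefile value)
FFLAGS=-Iextra make -f /tmp/demo.mk  # FFLAGS=-O3 (env var loses to a Makefile assignment)
```

So `make FFLAGS=...` would clobber all the flags the CLASSIC Makefile sets rather than append to them, and the plain env-var form would be ignored outright.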

@tlvu
Collaborator

tlvu commented Nov 10, 2025

> @tlvu Do we need to document this? Outside the two Ouranos employees I know of that use CLASSIC, who else would this be supporting?
>
> I'd be more inclined to modify the Makefile for CLASSIC on their repository to add some configurations to support conda instead.

Question for @huard, I guess; he asked for this, so he would know the use case.

@Zeitsperre
Collaborator Author

Zeitsperre commented Nov 10, 2025

@tlvu The FFLAGS are supplied by the Makefile directly; there's no option to pass additional FFLAGS (that use case wasn't thought of at the time, I suppose):

ifeq ($(mode), supercomputer)
	# Wrapper to the default Fortran compiler loaded via a module. The following is specific to the Intel compiler.
	# Include/library flags should NOT be specified as the appropriate ones should be available via the loaded module.
	COMPILER = ftn
	# Fortran Flags.
        FFLAGS = -DPARALLEL -r8 -g -O2 -mp1 -xCORE-AVX2 -align array64byte -init=arrays -init=zero -traceback -module $(ODIR)
else ifeq ($(mode), cray)
	# Wrapper to the default Fortran compiler loaded via a module. The following is specific to the Cray compiler.
	# Include/library flags should NOT be specified as the appropriate ones should be available via the loaded module.
	COMPILER = ftn
	# Fortran Flags. Note: -g prevents most optimizations, -rm gives listings.
	FFLAGS = -DPARALLEL -s real64 -e0 -ez -O2 -g
else ifeq ($(mode), ppp)
        # NB: Support for parallel netCDF library must first be enabled prior to running make via:
        #     . ssmuse-sh -x hpco/exp/hdf5-netcdf4/parallel/openmpi-3.1.2/static/intel-19.0.3.199/01
        # Open MPI wrapper to the default Fortran compiler. The following is specific to the Intel compiler.
        # Include/library flags for Open MPI should NOT be specified as the appropriate ones should be available via the wrapper.
        COMPILER = mpif90
        # Fortran Flags.
        FFLAGS = -DPARALLEL -r8 -g -O2 -mp1 -xCORE-AVX2 -align array64byte -init=arrays -init=zero -traceback -module $(ODIR)
        # For debugging:
        #FFLAGS = -DPARALLEL -r8 -g -O1 -mp1 -xCORE-AVX2 -align array64byte -init=arrays -init=zero -traceback -fpe0 -module $(ODIR)
        # Library Flags for netCDF.
        LFLAGS = -lnetcdff -lnetcdf -lhdf5_hl -lhdf5 -lz -lcurl
else ifeq ($(mode), parallel)
        # Parallel compiler.
        COMPILER = mpif90
        # Fortran Flags. The following is specific to the gfortran compiler.
        FFLAGS = -DPARALLEL -O3 -g -fdefault-real-8 -ffree-line-length-none -fbacktrace -ffpe-trap=invalid,zero,overflow -fbounds-check -J$(ODIR)
        # Include Flags.
        IFLAGS =  -I/para_netcdf_hdf-4.6.3/MPI/include
        # Library Flags.
        LFLAGS = -L/para_netcdf_hdf-4.6.3/MPI/lib -lnetcdff -lnetcdf -lhdf5_hl -lhdf5
else
	# Serial compiler.
	COMPILER = gfortran
	mode = serial
	ODIR = objectFiles
	# Fortran Flags.
	FFLAGS = -O3 -g -fdefault-real-8 -ffree-line-length-none -fbacktrace -ffpe-trap=invalid,zero,overflow -fbounds-check -J$(ODIR) #-Wall -Wextra
	# Include Flags.
	IFLAGS = -I/usr/include
	# Library Flags
	LFLAGS = -lnetcdff -ldl -lz -lm
endif

### ADDED BY @Zeitsperre 
FFLAGS += -I/opt/conda/envs/birdy/include

# Create required directory/.gitignore file, if missing.
VOID := $(shell mkdir -p $(ODIR))
#VOID := $(shell [ ! -f $(ODIR)/.gitignore ] && cp objectFiles/.gitignore $(ODIR))

# RECIPES
# Compile object files from .F90 sources
$(ODIR)/%.o: src/%.F90
	$(COMPILER) $(FFLAGS) -c $< -o $@

# Compile object files from .f90 sources
$(ODIR)/%.o: src/%.f90
	$(COMPILER) $(FFLAGS) $(IFLAGS) -c $< -o $@

# Compile object files from .f (Fortran 77) sources
$(ODIR)/%.o: src/%.f
	$(COMPILER) $(FFLAGS) $(IFLAGS) -c $< -o $@

# Properly reference the ODIR for the linking
OBJD = $(patsubst %,$(ODIR)/%,$(OBJ))

# Link objects together and put executable in the bin/ directory
CLASSIC: $(OBJD)
	$(COMPILER) $(FFLAGS) $(IFLAGS) -o bin/CLASSIC_$(mode) $(OBJD) $(LFLAGS)

# "make clean mode=supercomputer" removes all object files in objectFiles_supercomputer
clean:
	rm -f $(ODIR)/*.o $(ODIR)/*.mod bin/CLASSIC_$(mode)

Thankfully, there is a way to set environment variables using conda, so we could technically have something set for the birdy env on activation (see: https://stackoverflow.com/a/62508395). But, to be clear, we would still need to change the Makefile to accept this.
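Concretely, the mechanism from that Stack Overflow answer is the `conda env config vars` subcommand; a sketch (the variable name `CLASSIC_IFLAGS` is hypothetical, and the CLASSIC Makefile would still need to be patched to read it):

```shell
# Persist an env var on the birdy environment; it is re-exported on every activation.
conda env config vars set CLASSIC_IFLAGS=-I/opt/conda/envs/birdy/include -n birdy
conda activate birdy
echo "$CLASSIC_IFLAGS"   # available to make and anything else run inside the env
```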

Collaborator

@tlvu tlvu left a comment


LGTM, as long as @huard is OK with the extra work-around when compiling CLASSIC.

No need to generate output for docker/saved_buildout/; we can do that next time we release a full build.

@Zeitsperre
Collaborator Author

@tlvu Sounds good, I was just about to write the same thing.

The CLASSIC build workflow has gone several years without changes. I think it's fair to ask ECCC for (or provide ECCC with) a pull request to update their Makefile. Adding something as simple as this to the Makefile might work fine:

IFLAGS += -I$(CONDA_PREFIX)/include
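A slightly fuller sketch of such a patch (hypothetical, but the variable names follow the CLASSIC Makefile quoted above; `CONDA_PREFIX` is only defined inside an activated environment, so the guard keeps the serial build working elsewhere):

```make
# Hypothetical CLASSIC Makefile addition: pick up NetCDF headers/libs from an
# activated conda environment, and stay a no-op outside of one.
ifdef CONDA_PREFIX
IFLAGS += -I$(CONDA_PREFIX)/include
LFLAGS += -L$(CONDA_PREFIX)/lib -Wl,-rpath,$(CONDA_PREFIX)/lib
endif
```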

@huard
Contributor

huard commented Feb 10, 2026

When I read https://docs.conda.io/projects/conda-build/en/stable/resources/compiler-tools.html#using-the-compiler-packages,
I get the impression that there is a way to alias the conda compilers with their standard names. But when I try this on the miniconda docker image, I can't get it to work. If this worked, I feel it would be the ideal solution: you activate the environment, and the usual commands simply point to the conda compilers.
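For the record, my understanding (hedged; worth verifying against the linked docs) is that the conda-forge compiler metapackages don't install bare `gcc`/`gfortran` names; their activation scripts instead export `CC`/`CXX`/`FC` pointing at prefixed binaries such as `x86_64-conda-linux-gnu-gcc`. A sketch of how that could be checked, plus one unofficial aliasing workaround:

```shell
conda activate birdy
# The {c,cxx,fortran}-compiler activation scripts export these variables:
echo "CC=$CC CXX=$CXX FC=$FC"
"$FC" --version

# Unofficial workaround: symlink the standard name into the env's bin/.
ln -sf "$(command -v "$FC")" "$CONDA_PREFIX/bin/gfortran"
```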

@Zeitsperre
Collaborator Author

@tlogan2000 FYI

@tlogan2000
Collaborator

> @tlogan2000 FYI

Thanks.
Just a word of caution that I think this can probably be put in place but could eventually be flagged as too big a security risk. We are hoping to have some external support for a security audit of our PAVICS infra sometime in the next year or so. Basically I am trying to say that if this becomes available now there is no guarantee that it will always be available on the platform

Signed-off-by: Trevor James Smith <10819524+Zeitsperre@users.noreply.github.com>
Signed-off-by: Trevor James Smith <10819524+Zeitsperre@users.noreply.github.com>
@Zeitsperre Zeitsperre mentioned this pull request Mar 17, 2026
3 tasks
@Zeitsperre Zeitsperre mentioned this pull request Apr 8, 2026
3 tasks
@Zeitsperre Zeitsperre merged commit 68a209c into master Apr 15, 2026
1 check failed
@Zeitsperre Zeitsperre deleted the add-build-tools branch April 15, 2026 17:14
Zeitsperre added a commit that referenced this pull request Apr 15, 2026
# Overview


This PR prepares the next major update of the Jupyter image of PAVICS
(2026-03).

The goal of this update is to update the Ouranos library offerings as
well as remove several older workarounds and lift several dependency
pins. It also updates the base Python from 3.11 to 3.12.

The most significant change is that `jupyterlab` is now v4.x
(previously v3.x). In this new `jupyterlab`, many workarounds and the
functionality of older/deprecated plugins are now built in. The docker
build step is faster and there is a notable decrease in RAM usage.

The `birdy` ipython kernel has been removed. This kernel was used to run
_some_ notebooks but shared the same basis as the `python3` kernel. Due
to the strict metadata comparisons that `pytest` runs, we previously
needed both kernels, since some notebooks were run with `python3` and
others with `birdy`. The `python3` kernel still targets the `birdy`
environment for "reasons". Notebook output metadata had to be updated in
several projects.

Additionally, the two-step environment update process has been scrapped.
The majority of packages are `conda`-based and the Ouranos offerings
have been working well together for quite some time. This not only
reduces complexity but also speeds up the build time. `micromamba`
(unused in the build process) has been removed.

Other small changes: `launchcontainer` and `launchnotebook` no longer
overwrite `UID` and `GID` (which fails in `zsh`) and instead use
`USER_ID` and `GROUP_ID` for call substitutions. They also no longer use
back-tick command substitution:
```
VARIABLE=`some-command -flag`
```
is now:
```
VARIABLE="$(some-command -flag)"
```


## Changes

- Updates Jupyterlab from v3.x to v4.x (v4.5.6)
- Cleans up a lot of workarounds found within the Dockerfile previously
required for Jupyterlab 3.x
- Removes the `birdy` Jupyter kernel
- Adds two new plugins to Jupyterlab:
  - jupyterlab-logout (a logout button is now present in the top-right)
  - jupyterlab-theme-toggler (a light/dark theme switch is now present in the top-right)
- Build tools (C/C++/Fortran) have been added
- Jupyter env changes:
  - Relevant changes (alphabetical order):
```diff
<   - birdy=0.8.7=pyhd8ed1ab_1
>   - birdy=0.9.1=pyhd8ed1ab_0

<   - bottleneck=1.4.2=py311h9f3472d_0
>   - bottleneck=1.6.0=np2py312hfb8c2c5_3

<   - cf_xarray=0.10.5=pyhd8ed1ab_0
>   - cf_xarray=0.10.11=pyhd8ed1ab_1

>   - cmake=4.2.3=hc85cc9f_1

>   - cxx-compiler=1.11.0=hfcd1e18_0

<   - dash=3.0.3=pyhd8ed1ab_0

<   - dask=2025.4.0=pyhd8ed1ab_0
>   - dask=2026.3.0=pyhc364b38_0

<   - eccc_librmn=20.0.3=0
<   - eccc_libtdpack=1.6.3.1=0
<   - eccc_libvgrid=6.9.3=0
<   - eccc_rpnpy=2.2.0=py311h48b7412_1
<   - eccc_support_libs=2023.12.0=0
>   - eccc_librmn=20.0.4=hdcf949b_2
>   - eccc_libtdpack=1.6.3.1=hdcf949b_2
>   - eccc_libvgrid=6.9.3.1=hdcf949b_2
>   - eccc_rpnpy=2.2.0=py312h738df08_1

<   - figanos==0.4.0 (PyPI)
>   - figanos=0.6.0=pyhd8ed1ab_0

>   - firefox=149.0=h54a6638_0

<   - flask=3.1.0=pyhd8ed1ab_1
>   - flask=3.1.3=pyhcf101f3_1

>   - fortran-compiler=1.11.0=h9bea470_0

>   - gcc=14.3.0=h0dff253_18

<   - geoviews=1.14.0=hd8ed1ab_0
>   - geoviews=1.15.1=pyhd8ed1ab_0

>   - gxx=14.3.0=h76987e4_18

<   - holoviews=1.20.2=pyhd8ed1ab_0
<   - hvplot=0.11.2=pyhd8ed1ab_0
>   - holoviews=1.22.1=pyhd8ed1ab_0
>   - hvplot=0.12.2=pyhd8ed1ab_0

<   - intake=2.0.8=pyhd8ed1ab_0
<   - intake-esgf=2025.6.6=pyhd8ed1ab_0
<   - intake-esm=2025.2.3=pyhd8ed1ab_0
<   - intake-xarray=0.7.0=pyhd8ed1ab_0
>   - intake=2.0.9=pyhd8ed1ab_0
>   - intake-esgf=2026.1.26=pyhd8ed1ab_0
>   - intake-esm=2025.2.3=pyhd8ed1ab_1
>   - intake-xarray=2.0.0=pyhd8ed1ab_1

<   - jupyterlab=3.6.8=pyhd8ed1ab_0
<   - jupyterlab-git=0.44.0=pyhd8ed1ab_0
<   - jupyterlab-topbar=0.6.1=pyhd8ed1ab_3
<   - jupyterlab_pygments=0.3.0=pyhd8ed1ab_0
<   - jupyterlab_server=2.27.3=pyhd8ed1ab_1
<   - jupyterlab_widgets=3.0.14=pyhd8ed1ab_0
<   - jupytext=1.17.0=pyhbbac1ac_0
>   - jupyterlab=4.5.6=pyhd8ed1ab_0
>   - jupyterlab-geojson=3.4.0=pyhd8ed1ab_1
>   - jupyterlab-git=0.52.0=pyhd8ed1ab_0
>   - jupyterlab_pygments=0.3.0=pyhd8ed1ab_2
>   - jupyterlab_server=2.28.0=pyhcf101f3_0
>   - jupyterlab_widgets=3.0.16=pyhcf101f3_1

<   - matplotlib=3.10.1=py311h38be061_0
>   - matplotlib=3.10.8=py312h7900ff3_0

<   - nbconvert=7.16.6=hb482800_0
>   - nbconvert=7.17.0=h14065e2_0

<   - nodejs=22.13.0=hf235a45_0
<   - notebook=6.5.7=pyha770c72_0
>   - nodejs=22.21.1=h4a9c4b4_0
>   - notebook=7.5.5=pyhcf101f3_0

<   - numba=0.61.2=py311h4e1c48f_0
<   - numpy==1.26.4 (PyPI)
>   - numba=0.65.0=py312hd1dde6f_0
>   - numpy=2.2.6=py312h72c5963_0

<   - pandas=2.2.3=py311h7db5c69_3
>   - pandas=2.3.3=py312hf79963d_1

<   - pint=0.24.4=pyhd8ed1ab_1
>   - pint=0.25.3=pyhc364b38_0

<   - pip=25.0.1=pyh8b19718_0
>   - pip=26.0.1=pyh8b19718_0

<   - proj=9.6.0=h0054346_1
>   - proj=9.6.2=h18fbb6c_2

<   - pre-commit=4.2.0=pyha770c72_0

<   - pydantic=2.11.3=pyh3cfb1c2_0
>   - pydantic=2.13.0=pyhcf101f3_0

<   - pyproj=3.7.1=py311h0960b38_1
>   - pyproj=3.7.2=py312h1c88c49_1

<   - pytest=8.3.5=pyhd8ed1ab_0
>   - pytest=9.0.2=pyhcf101f3_0

<   - python=3.11.12=h9e4cc4f_0_cpython
>   - python=3.12.13=hd63d673_0_cpython

<   - raven-hydro=0.4.0=py311h81cb690_0
<   - ravenpy=0.19.0=pyhd8ed1ab_0
>   - raven-hydro=4.12.1=py312h07c4d9f_1
>   - ravenpy=0.21.0=pyhd8ed1ab_2

<   - requests=2.32.3=pyhd8ed1ab_1
>   - requests=2.33.1=pyhcf101f3_0

<   - scikit-image=0.25.2=py311h7db5c69_0
<   - scikit-learn=1.6.1=py311h57cc02b_0
<   - scipy=1.15.2=py311h8f841c2_0
>   - scikit-image=0.26.0=np2py312h4ae17e4_0
>   - scikit-learn=1.8.0=np2py312h3226591_1
>   - scipy=1.17.1=py312h54fa4ab_0

<   - xarray=2025.3.0=pyhd8ed1ab_0
>   - xarray=2025.10.1=pyhcf101f3_1
>   - xarray-spatial=0.9.5=pyhd8ed1ab_0

<   - xclim=0.57.0=pyhd8ed1ab_0
>   - xclim=0.60.0=pyhd8ed1ab_0

<   - xesmf=0.8.9=pyhd8ed1ab_0
>   - xesmf=0.9.2=pyhd8ed1ab_0

>   - xhydro=0.7.1=pyhd8ed1ab_0

<   - xscen=0.12.3=pyhd8ed1ab_0
>   - xscen=0.14.0=pyhd8ed1ab_0

<   - xsdba=0.5.0=pyhd8ed1ab_0
>   - xsdba=0.6.1=pyhd8ed1ab_0

<   - xskillscore=0.0.26=pyhd8ed1ab_1
>   - xskillscore=0.0.29=pyhd8ed1ab_0
```

## Testing Checklist

- [x] Deployed as "beta" image in production for bokeh visualization
performance regression testing.
- [x] Manually tested notebook
https://github.com/Ouranosinc/PAVICS-landing/blob/master/content/notebooks/climate_indicators/PAVICStutorial_ClimateDataAnalysis-5Visualization.ipynb
for bokeh visualization performance.
- [x] Committed the Jenkins build log to this Pull Request:
https://github.com/Ouranosinc/PAVICS-e2e-workflow-tests/blob/release-py312-260415/docker/saved_buildout/jenkins-buildlogs-default.txt

## Related Issue / Discussion

- Matching notebook fixes:
  - pavics-sdi: Ouranosinc/pavics-sdi#378
  - PAVICS-landing: Ouranosinc/PAVICS-landing#125

- Deployment to PAVICS:
bird-house/birdhouse-deploy#683

- Jenkins-config changes for new notebooks:
#162

- Pull Requests that this contribution builds upon:
  - #157 (must be merged once the release is tagged)

- Previous release:
#150

## Additional Information

Full diff of the conda env export:

release-py311-250423-update250730...release-py312-260415

Full new conda env export:

https://github.com/Ouranosinc/PAVICS-e2e-workflow-tests/blob/release-py312-260415/docker/saved_buildout/conda-env-export.yml

DockerHub build log:

https://github.com/Ouranosinc/PAVICS-e2e-workflow-tests/blob/release-py312-260415/docker/saved_buildout/docker-buildlogs.txt
Zeitsperre added a commit to bird-house/birdhouse-deploy that referenced this pull request Apr 15, 2026
## Overview

This PR updates the PAVICS Jupyter image to tag `py312-260415`.

Many of the changes performed within the image ensure that the latest
software offerings are available to users, as well as remove several old
workarounds for Jupyterlab that have been rendered obsolete in recent
years.

These changes also introduce user-accessible build tools installed via
`conda`. This was done to allow users to perform manual installations of
C/C++/Fortran binaries, a change requested by developers and users of
PAVICS.

## Changes

- The base Python version has been updated to Python 3.12.
- The image now includes common build tools
(`x86_64-conda-linux-gnu-{gcc|g++|gfortran}`, `make`, `cmake`) installed
via `conda`. They are exposed to the userspace via their expected tool
names so that common build tools use them by default.
- The "birdy" `ipython` kernel has been removed. There are still two
`conda` environments (`base` and `birdy`), but the kernels offered to
users are now `python` and the `xeus-python` kernels (`xpython` and
`xpython-raw`).
- `jupyterlab` has been upgraded from v3.x to v4.x (v4.5.6). This
allowed for the removal of several workarounds required for legacy
`jupyterlab` plugins. Modern `jupyterlab` plugin support no longer
requires explicit build/install steps.
- Updates to all Ouranos software libraries: `xclim` (v0.60.0), `xsdba`
(v0.6.1), `xscen` (v0.14.0), `figanos` (v0.6.0)
- The latest `ravenpy` (v0.21.0) conda package now requires the `raven`
model (`raven-hydro`) to be installed explicitly. This makes it easier
for users to install custom `raven` binaries in their environments.
- Most `PyPI`-based packages have been removed, with the exception of the
new `jupyterlab` interface plugins (`jupyterlab-logout` and
`jupyterlab-theme-toggler`).

**Non-breaking changes**
- Updates Jupyter image from `py311-250423-update250730` to
`py312-260415`

## Related Issue / Discussion

PAVICS-e2e-workflow-tests Pull Requests:
- Ouranosinc/PAVICS-e2e-workflow-tests#161
- Ouranosinc/PAVICS-e2e-workflow-tests#157 

## CI Operations

<!--
The test suite can be run using a different DACCS config with
``birdhouse_daccs_configs_branch: {branch_name}`` in the PR description.
To globally skip the test suite regardless of the commit message use
``birdhouse_skip_ci`` set to ``true`` in the PR description.

Using ``[<cmd>]`` (with the brackets) where ``<cmd> = skip ci`` in the
commit message will override ``birdhouse_skip_ci`` from the PR
description.
Such commit command can be used to override the PR description behavior
for a specific commit update.
However, a commit message cannot 'force run' a PR which the description
turns off the CI.
To run the CI, the PR should instead be updated with a ``true`` value,
and a running message can be posted in following PR comments to trigger
tests once again.
-->

birdhouse_daccs_configs_branch: master
birdhouse_skip_ci: false


Development

Successfully merging this pull request may close these issues.

Add libraries required to build Raven from source

4 participants