ODS provides CI/CD pipeline support based on OpenShift Pipelines. This introduction will walk you through the essentials, and guide you all the way to more advanced topics. Basic knowledge of Kubernetes concepts and OpenShift is assumed. Estimated reading time is about 15 minutes.
OpenShift Pipelines is a Kubernetes-style CI/CD solution based on Tekton. It builds on the Tekton building blocks and offers tight integration with OpenShift. The main addition over plain Tekton is a UI in the OpenShift console.
Tekton provides a framework to create cloud-native CI/CD pipelines. The building blocks for those pipelines are defined using Kubernetes Custom Resources.
A Tekton pipeline references a series of tasks (a Kubernetes resource named Task). When the pipeline runs, Kubernetes schedules one pod per task. Each task is made up of a series of steps, and each step corresponds to one container in the task pod. At a minimum, a step defines the container image to use and the command or script to run, so a step can do a wide variety of things, such as building artifacts or deploying. The pipeline run is provided a workspace (a Kubernetes PVC volume mounted into the task pods), allowing all tasks to work on the same repository checkout.
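To make this more concrete, the following is a minimal, hand-written Tekton task with a single step. It is purely illustrative and not part of ODS Pipeline: the name, image and script are placeholders, and depending on your cluster the apiVersion may be tekton.dev/v1beta1 instead of tekton.dev/v1.

apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: hello-world
spec:
  workspaces:
  - name: source # the shared checkout, mounted into the step container
  steps:
  - name: say-hello # one step = one container in the task pod
    image: registry.access.redhat.com/ubi9/ubi-minimal
    workingDir: $(workspaces.source.path)
    script: |
      #!/usr/bin/env bash
      echo "Hello from a Tekton step"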
At this stage you know just enough about Tekton to continue with this introduction, but if you want to know more about it, you can read the Tekton docs and/or follow the OpenShift Pipelines tutorial.
In regard to CI/CD, ODS Pipeline provides two things:
- two Tekton tasks, one to start pipeline runs and one to finish pipeline runs
- a pipeline manager that responds to Bitbucket webhook events by triggering pipelines
All pipeline runs have some plumbing to do at the beginning and the end of a pipeline run, such as checking out the source code and setting the Bitbucket build status. To save you from implementing this yourself, two tasks are offered: ods-pipeline-start and ods-pipeline-finish.
Now, it would be cumbersome to manually create a PipelineRun in OpenShift for each pushed commit, referencing the start and finish tasks and any other tasks you want to run to build, package and deploy your application.
To automate this process, ODS Pipeline ships with a pipeline manager. This service automatically starts pipeline runs in response to Bitbucket webhook requests, based on the task definitions stored in the Git repository to which the pipeline corresponds. Further, it automatically injects the start and finish tasks.
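The exact resources are generated by the pipeline manager, so the details may differ, but conceptually an assembled pipeline run could look roughly like the sketch below. Everything here is illustrative: the names are made up, the start and finish tasks are shown with plain task references for brevity, and placing the finish task in a finally block is just one plausible way to ensure it always runs.

apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  generateName: my-repo- # hypothetical name derived from the repository
spec:
  pipelineSpec:
    workspaces:
    - name: shared-workspace
    tasks:
    - name: start
      taskRef:
        name: ods-pipeline-start # injected by the pipeline manager
    - name: build # a task defined in ods.yaml
      runAfter:
      - start
      taskRef:
        name: build
    finally:
    - name: finish
      taskRef:
        name: ods-pipeline-finish # injected; a finally task runs even if earlier tasks fail
  workspaces:
  - name: shared-workspace # the PVC-backed workspace shared by all tasks
    volumeClaimTemplate:
      spec:
        accessModes:
        - ReadWriteOnce
        resources:
          requests:
            storage: 2Gi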
To understand how this works, it is best to trace the flow starting from the repository. Assume you have a repository containing a Go application, and you want to run a pipeline building a container image for it every time you push to Bitbucket. To achieve this in a project, all you need is to have an ods.yaml file in the root of your repository. The ods.yaml file defines the tasks you want to run in the pipeline. Let’s look at an example ods.yaml file for our Go repository:
pipeline:
- tasks:
  - name: build
    taskRef:
      resolver: git
      params:
      - { name: url, value: https://github.com/opendevstack/ods-pipeline-go.git }
      - { name: revision, value: v0.1.2 }
      - { name: pathInRepo, value: tasks/build.yaml }
    workspaces:
    - name: source
      workspace: shared-workspace
  - name: package
    taskRef:
      resolver: git
      params:
      - { name: url, value: https://github.com/opendevstack/ods-pipeline-image.git }
      - { name: revision, value: v0.1.0 }
      - { name: pathInRepo, value: tasks/package.yaml }
    runAfter:
    - build
    workspaces:
    - name: source
      workspace: shared-workspace
  - name: deploy
    taskRef:
      resolver: git
      params:
      - { name: url, value: https://github.com/opendevstack/ods-pipeline-helm.git }
      - { name: revision, value: v0.1.0 }
      - { name: pathInRepo, value: tasks/deploy.yaml }
    runAfter:
    - package
    workspaces:
    - name: source
      workspace: shared-workspace
You can see that it defines three tasks, build, package and deploy, which run sequentially due to the use of runAfter. The referenced tasks are located in Git repositories. In this case, the tasks were created specifically for ODS Pipeline, but you can reference any Tekton task you wish, either by resolving it via the Git resolver or by pointing to Task resources installed in your OpenShift cluster.
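For example, assuming a Task resource named my-lint-task is installed in the cluster (a hypothetical name), it could be referenced by name instead of through the Git resolver, using the same list format as the tasks above:

- name: lint
  taskRef:
    kind: Task
    name: my-lint-task # hypothetical cluster-installed Task
  runAfter:
  - build
  workspaces:
  - name: source
    workspace: shared-workspace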
In order to create pipeline runs based on these task definitions whenever there is a push to Bitbucket, a webhook setting must be created for the repository. This webhook must point to a route connected to the ODS pipeline manager in OpenShift. When the webhook fires, a payload with information about the pushed commit is sent. The ODS pipeline manager first checks the authenticity of the request (did the request really originate from a push in the Bitbucket repository?). Then, it retrieves the ods.yaml file from the Git repository/ref identified in the payload and reads the pipeline configuration. Based on the tasks defined there, it assembles a new Tekton pipeline run. Finally, the ODS pipeline manager starts the pipeline run, passing parameter values extracted from the webhook event payload. The following illustrates this flow:
With the above in place, you do not need to manage pipeline runs manually. Every repository with an ods.yaml file and a webhook configuration automatically manages and triggers pipeline runs based on the defined tasks.
At this stage you know enough to get started using and modifying CI/CD pipelines with ODS Pipeline.