


Creating and updating tasks and pipelines

Duration: 30 minutes


A default set of tasks and pipelines is provided that performs a variety of CI/CD functions. These tasks and pipelines include validating that the application stack is active on the cluster, building application stacks using appsody, pushing the built image to an image repository, and deploying the application to the cluster. These tasks and pipelines operate with the default application stacks and work as-is for new application stacks that you might create. There are also cases where you might want to update the tasks or pipelines, or create new ones. This guide explains the steps that you follow to update tasks or pipelines and how you can update your Kabanero CR to use your new pipeline release.

Setting up a pipelines repo

  1. Clone the pipelines repository.

    git clone
  2. Notice that the default pipelines and tasks are under the pipelines/incubator directory.

     cd pipelines/incubator
  3. Edit the existing tasks, pipelines, or trigger files as needed or add your new tasks and pipelines here. To learn more about pipelines and creating new tasks, see the pipeline tutorial.

Creating a pipelines release from your pipelines repo

The product operator expects all the pipeline artifacts to be packaged in an archive file. The archive file must include a manifest file that lists each file in the archive along with its sha256 hash. The kabanero-pipelines repo contains a set of artifacts under the ci directory that allow you to easily create and publish a release of your pipelines.
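As an illustrative sketch of this layout (the file names below are stand-ins, not the actual repo contents; the real archive is produced by the scripts in the ci directory), each packaged file is recorded in a manifest alongside its sha256 hash:

```shell
# Sketch: bundle pipeline YAML with a manifest of per-file sha256 hashes.
# "pipelines/build-task.yaml" is a placeholder file for illustration only.
mkdir -p pipelines
printf 'kind: Task\n' > pipelines/build-task.yaml
{
  echo 'contents:'
  for f in pipelines/*.yaml; do
    echo "- file: $f"
    echo "  sha256: $(sha256sum "$f" | cut -d' ' -f1)"
  done
} > manifest.yaml
tar -czf default-kabanero-pipelines.tar.gz manifest.yaml pipelines
```

Recording the hash per file lets the operator verify each artifact after it unpacks the archive.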

Creating the pipelines release artifacts locally

You can build your pipelines repo locally and generate the pipeline archive file that is referenced in the Kabanero CR. The archive file can then be hosted at a location of your choosing. To generate the archive file locally:

  1. Run the following command from the root directory of your local copy of the pipelines repo:

     . ./ci/
  2. Locate the archive file under the ci/assets directory.

  3. Upload the archive file to your preferred hosting location and use the URL in the Kabanero CR as described in the next section.

Creating the pipelines release artifacts from your public GitHub pipelines repo using Travis

If your pipelines are hosted on a public GitHub repo, you can set up a Travis build against a release of your pipelines repo, which generates the archive file and attaches it to your release. The kabanero-pipelines repo provides a sample .travis.yml file.

Use the location of the archive file under the release in the Kabanero CR as described in the next section.
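A minimal Travis configuration for this flow might look like the following sketch. The script name and token variable are assumptions, so start from the sample .travis.yml in the kabanero-pipelines repo rather than this fragment:

```yaml
# Assumed sketch only; the kabanero-pipelines repo ships a working sample.
language: bash
script:
  - ./ci/package.sh            # hypothetical name for the packaging script
deploy:
  provider: releases           # attach build assets to the GitHub release
  api_key: $GITHUB_TOKEN       # token supplied through Travis settings
  file_glob: true
  file: ci/assets/*
  skip_cleanup: true
  on:
    tags: true                 # deploy only for tagged releases
```

The `on: tags: true` condition ties the archive upload to release tags, so every tagged release carries its own archive asset.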

Creating the pipelines release artifacts from your GHE pipelines repo using a pipeline on the OpenShift cluster

Use the following steps to trigger a pipeline build of your pipelines repository. The pipeline builds the pipelines and deploys a pipelines-index container into your cluster, which hosts the pipeline archive on an NGINX server.

  1. Log in to your OpenShift cluster.

  2. Navigate to the ci directory of your pipelines repo.

  3. Activate the pipeline.
     oc -n kabanero apply -f tekton/pipelines-build-pipeline.yaml 
  4. Activate the task.
     oc -n kabanero apply -f tekton/pipelines-build-task.yaml 
  5. Configure security constraints for the pipelines-index service account.
     oc -n kabanero adm policy add-scc-to-user privileged -z pipelines-index
  6. Create the pipelines-build-git-resource.yaml file with the following contents. Modify the revision and url properties as needed to point to your pipelines repository and the revision that you want to build.

     apiVersion: tekton.dev/v1alpha1
     kind: PipelineResource
     metadata:
       name: pipelines-build-git-resource
     spec:
       params:
         - name: revision
           value: master
         - name: url
           value: <URL-of-your-pipelines-repo>
       type: git
  7. Activate the pipelines-build-git-resource.yaml file.

     oc -n kabanero apply -f pipelines-build-git-resource.yaml
  8. Create a pipelines-build-pipeline-run.yaml file with the following contents.

     apiVersion: tekton.dev/v1alpha1
     kind: PipelineRun
     metadata:
       name: pipelines-build-pipeline-run
       namespace: kabanero
     spec:
       pipelineRef:
         name: pipelines-build-pipeline
       resources:
         - name: git-source
           resourceRef:
             name: pipelines-build-git-resource
       params:
         - name: deploymentSuffix
           value: latest
       serviceAccountName: pipelines-index
       timeout: 60m
  9. Create a secret for your git account and associate it with the pipelines-index service account. For example:
     oc -n kabanero secrets link pipelines-index basic-user-pass
  10. Trigger the pipeline.
     oc -n kabanero delete --ignore-not-found -f pipelines-build-pipeline-run.yaml
     sleep 5
     oc -n kabanero apply -f pipelines-build-pipeline-run.yaml

    You can track the pipeline execution in the Tekton dashboard or from the CLI:

     oc -n kabanero logs $(oc -n kabanero get pod -o name -l tekton.dev/pipelineRun=pipelines-build-pipeline-run) --all-containers -f

    When the build completes successfully, a pipelines-index-latest container is deployed into your cluster.

  11. Get the route for the pipelines-index-latest pod.

     PIPELINES_URL=$(oc -n kabanero get route pipelines-index-latest --no-headers -o=jsonpath='https://{.status.ingress[0].host}/default-kabanero-pipelines.tar.gz')
     echo $PIPELINES_URL
  12. Use the URL in the Kabanero CR as described in the next section.
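The git secret linked to the pipelines-index service account in step 9 can be sketched as a basic-auth secret annotated for Tekton. The host, username, and password values below are placeholders for your own GHE credentials:

```yaml
# Sketch of a git basic-auth secret for the pipeline. The tekton.dev/git-0
# annotation tells Tekton which git host this credential applies to.
apiVersion: v1
kind: Secret
metadata:
  name: basic-user-pass
  namespace: kabanero
  annotations:
    tekton.dev/git-0: https://github.example.com   # your GHE host
type: kubernetes.io/basic-auth
stringData:
  username: <git-user>
  password: <git-token>
```

Apply it with `oc -n kabanero apply -f` before linking it to the service account as shown in step 9.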

Updating the Kabanero CR to use the new release

Follow the configuring a Kabanero CR instance documentation to configure or deploy a product instance with the pipeline archive URL obtained in the previous step. Then, generate the digest of the pipelines archive contained at this URL and specify it in the Kabanero CR. You can use a command like sha256sum to obtain the digest.
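For example, assuming the archive has already been downloaded (the file name below is a stand-in), the digest can be computed like this:

```shell
# Compute the sha256 digest of the pipelines archive for the Kabanero CR.
# "pipelines.tar.gz" is a placeholder; in practice, download the real
# archive first, e.g.: curl -sL "$PIPELINES_URL" -o pipelines.tar.gz
printf 'example archive contents' > pipelines.tar.gz   # stand-in file
sha256sum pipelines.tar.gz | cut -d' ' -f1             # 64-char hex digest
```

The first field of the `sha256sum` output is the hex digest that goes into the `sha256` property of the Kabanero CR.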

The following example associates the pipelines that are published in the archive with each of the stacks that exist in the stack repository.

apiVersion: kabanero.io/v1alpha2
kind: Kabanero
metadata:
  name: kabanero
  namespace: kabanero
spec:
  version: "0.9.1"
  stacks:
    repositories:
      - name: central
        https:
          url: <URL-of-your-stack-hub-index>
    pipelines:
      - id: default
        sha256: deb5162495e1fe60ab52632f0879f9c9b95e943066590574865138791cbe948f
        https:
          url: <URL-of-your-pipelines-archive>

As an alternative, you can specify the pipelines archive under individual stack sections. This configuration associates the pipelines in the archive with these application stacks.
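For instance, a per-repository association might be sketched as follows (the URL and digest are placeholders for your own values):

```yaml
# Sketch: pipelines scoped to one stack repository instead of the whole
# instance. Replace the placeholder URL and sha256 with your own values.
spec:
  stacks:
    repositories:
      - name: central
        https:
          url: <URL-of-your-stack-hub-index>
        pipelines:
          - id: default
            sha256: <sha256-digest-of-the-archive>
            https:
              url: <URL-of-your-pipelines-archive>
```

With this form, only stacks from that repository use the referenced archive, which is useful when different stack repositories need different pipeline sets.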
