Jenkins – King of Continuous Integration

Fantastic article on how to use Jenkins by Martin Heller in Infoworld, copied below.

How Jenkins works

Jenkins is distributed as a WAR archive and as installer packages for the major operating systems, as a Homebrew package, as a Docker image, and as source code. The source code is mostly Java, with a few Groovy, Ruby, and Antlr files.

You can run the Jenkins WAR standalone or as a servlet in a Java application server such as Tomcat. In either case, it produces a web user interface and accepts calls to its REST API.
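A quick way to confirm the REST API is alive is to request the JSON flavor of the root API endpoint. As a sketch (the host, user name, and API token below are placeholders):

```shell
# Jenkins exposes its remote API under /api; JSON, XML, and Python
# flavors are available. Authenticate with a user name and API token.
curl -u admin:API_TOKEN 'http://localhost:8080/api/json?pretty=true'
```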

When you run Jenkins for the first time, it creates an administrative user with a long random password, which you can paste into its initial webpage to unlock the installation.
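As a sketch, running the WAR standalone and retrieving that initial password might look like this (the JENKINS_HOME location shown is the default for a standalone run; installer packages put it elsewhere):

```shell
# Run the Jenkins WAR standalone; --httpPort sets the UI port
java -jar jenkins.war --httpPort=8080

# In another terminal: the generated admin password is written to disk.
# For a standalone run, JENKINS_HOME defaults to ~/.jenkins
cat ~/.jenkins/secrets/initialAdminPassword
```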

Jenkins plug-ins

Once installed, Jenkins allows you to either accept the default plug-in list or choose your own plug-ins.

Once you have picked your initial set of plug-ins, click the Install button and Jenkins will add them.

The Jenkins main screen displays the current build queue and Executor status, and offers links to create new items (jobs), manage users, view build histories, manage Jenkins, look at your custom views, and manage your credentials.

A new Jenkins item can be any of six types of job plus a folder for organizing items.

There are 18 things you can do from the Manage Jenkins page, including the option to open a command-line interface. At this point, however, we should look at pipelines, which are enhanced workflows that are typically defined by scripts.

Jenkins pipelines

Once you have Jenkins configured, it’s time to create some projects that Jenkins can build for you. While you can use the web UI to create scripts, the current best practice is to create a pipeline script named Jenkinsfile and check it into your repository. A multibranch pipeline is configured through a web form in the Jenkins UI.

Branch sources for this kind of pipeline in my basic Jenkins installation can be Git or Subversion repositories, including GitHub. If you need other kinds of repositories or different online repository services, it’s just a matter of adding the appropriate plug-ins and restarting Jenkins. I tried, but couldn’t think of a source code management system (SCM) that doesn’t already have a Jenkins plug-in listed.

Jenkins pipelines can be declarative or scripted. A declarative pipeline, the simpler of the two, uses Groovy-compatible syntax; if you want, you can start the file with #!groovy to point your code editor in the right direction. A declarative pipeline starts with a pipeline block, defines an agent, and defines stages that include executable steps, as in the three-stage example below.

pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}

pipeline is the mandatory outer block that invokes the Jenkins Pipeline plug-in. The agent directive defines where you want to run the pipeline; any says to use any available agent to run the pipeline or stage. A more specific agent might declare a container to use, for example:

agent {
    docker {
        image 'maven:3-alpine'
        label 'my-defined-label'
        args  '-v /tmp:/tmp'
    }
}

stages contain a sequence of one or more stage directives. In the example above, the three stages are Build, Test, and Deploy.

steps do the actual work. In the example above, the steps just print messages. A more useful build step might look like the following:

pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                sh 'make'
                archiveArtifacts artifacts: '**/target/*.jar', fingerprint: true
            }
        }
    }
}

Here we are invoking make from a shell, and then archiving any produced JAR files to the Jenkins archive.

The post section defines actions that will be run at the end of the pipeline run or stage. You can use a number of post-condition blocks within the post section: always, changed, failure, success, unstable, and aborted.

For example, the Jenkinsfile below always runs JUnit after the Test stage, but only sends an email if the pipeline fails.

pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'make check'
            }
        }
    }
    post {
        always {
            junit '**/target/*.xml'
        }
        failure {
            mail to: 'team@example.com', subject: 'The Pipeline failed :('
        }
    }
}

The declarative pipeline can express most of what you need to define pipelines, and is much easier to learn than the scripted pipeline syntax, which is a Groovy-based DSL. The scripted pipeline is in fact a full-blown programming environment.

For comparison, the following two Jenkinsfiles are essentially equivalent.

Declarative pipeline

pipeline {
    agent { docker 'node:6.3' }
    stages {
        stage('build') {
            steps {
                sh 'npm --version'
            }
        }
    }
}

Scripted pipeline

node('docker') {
    checkout scm
    stage('Build') {
        docker.image('node:6.3').inside {
            sh 'npm --version'
        }
    }
}

Blue Ocean, the Jenkins GUI

If you’d like the latest and greatest Jenkins UI, you can use the Blue Ocean plug-in, which provides a graphical user experience. You can add the Blue Ocean plug-in to your existing Jenkins installation or run a Jenkins/Blue Ocean Docker container. With Blue Ocean installed, your Jenkins main menu will have an extra icon.

You can open Blue Ocean directly if you wish. It’s in the /blue folder on the Jenkins server. Pipeline creation in Blue Ocean is a bit more graphical than in plain Jenkins.

Jenkins Docker

As I mentioned earlier, Jenkins is also distributed as a Docker image. Pipeline creation itself stays simple either way: once you’ve picked the SCM type, you provide a URL and credentials, then create a pipeline from a single repository or scan all repositories in the organization. Every branch with a Jenkinsfile will get a pipeline.

Here I’m running a Blue Ocean Docker image, which came with a few more Git service plug-ins installed than the default list of SCM providers.
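Launching such an image can be as simple as the following sketch (the jenkinsci/blueocean image name and the volume name are the commonly documented choices; adjust to taste):

```shell
# Run Jenkins with Blue Ocean bundled: port 8080 serves the web UI,
# port 50000 accepts inbound build agents, and a named volume
# persists JENKINS_HOME across container restarts.
docker run -d --name jenkins-blueocean \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins-data:/var/jenkins_home \
  jenkinsci/blueocean
```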

Once you have run some pipelines, the Blue Ocean plug-in will display their status. You can zoom in on an individual pipeline to see its stages and steps.

You can also zoom in on branches and activities.

Why use Jenkins?

The Jenkins Pipeline plug-in we’ve been using supports a general continuous integration/continuous delivery (CI/CD) use case, which is probably the most common use for Jenkins. There are specialized considerations for some other use cases.

Java projects were the original raison d’être for Jenkins. We’ve already seen that Jenkins supports building with Maven; it also works with Ant, Gradle, JUnit, Nexus, and Artifactory.
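A minimal Maven build, sketched as a declarative Jenkinsfile (the maven:3-alpine image matches the agent example earlier; the goals and report path are illustrative defaults):

```groovy
pipeline {
    agent { docker { image 'maven:3-alpine' } }
    stages {
        stage('Build') {
            steps {
                // -B keeps Maven output non-interactive for CI logs
                sh 'mvn -B clean verify'
            }
        }
    }
    post {
        always {
            // Publish JUnit results written by the Surefire plugin
            junit '**/target/surefire-reports/*.xml'
        }
    }
}
```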

Android runs a kind of Java, but introduces the issue of how to test on the wide range of Android devices. The Android emulator plug-in allows you to build and test on as many emulated devices as you care to define. The Google Play Publisher plug-in lets you send builds to an alpha channel in Google Play for release or further testing on actual devices.

I’ve shown examples where we specified a Docker container as the agent for a pipeline and where we ran Jenkins and Blue Ocean in a Docker container. Docker containers are very useful in a Jenkins environment for improving speed, scalability, and consistency.

There are two major use cases for Jenkins and GitHub. One is build integration, which can include a service hook to trigger Jenkins on every commit to your GitHub repository. The second is the use of GitHub authentication to control access to Jenkins via OAuth.
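When a GitHub service hook can’t reach your Jenkins server, a pipeline can fall back to polling the repository; the declarative triggers directive supports this. A sketch (the cron spec is an example; 'H' spreads polling load across the hour):

```groovy
pipeline {
    agent any
    triggers {
        // Poll the SCM roughly every five minutes for new commits
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                checkout scm
                echo 'Triggered by a new commit'
            }
        }
    }
}
```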

Jenkins supports many other languages besides Java. For C/C++, there are plug-ins to capture errors and warnings from the console, generate build scripts with CMake, run unit tests, and perform static code analysis. Jenkins has a number of integrations with PHP tools.
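For a C/C++ project driven by CMake, the shell steps inside a stage might be sketched like this (directory and flag choices are assumptions; -S/-B need CMake 3.13+ and --test-dir needs CTest 3.20+):

```groovy
stage('Build') {
    steps {
        // Configure and build out-of-source with CMake
        sh 'cmake -S . -B build'
        sh 'cmake --build build'
        // Run the unit tests registered with CTest
        sh 'ctest --test-dir build --output-on-failure'
    }
}
```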

While Python code doesn’t need to be built (unless you’re using Cython, for instance, or creating a Python wheel for installation) it’s useful that Jenkins integrates with Python testing and reporting tools, such as Nose2 and Pytest, and code quality tools such as Pylint. Similarly, Jenkins integrates with Ruby tools such as Rake, Cucumber, Brakeman, and CI::Reporter.
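A Python test stage along those lines might look like the following sketch (--junitxml is a real pytest flag; the report file name is an assumption):

```groovy
stage('Test') {
    steps {
        // pytest writes JUnit-style XML that the junit step can parse
        sh 'pytest --junitxml=test-results.xml'
    }
    post {
        always {
            junit 'test-results.xml'
        }
    }
}
```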
