Jenkins offers an easy way to set up a continuous integration or continuous delivery (CI/CD) environment for almost any combination of languages and source code repositories using pipelines, in addition to automating other routine development tasks. While Jenkins doesn’t eliminate the need to script individual steps, it does give you a faster and more robust way to integrate your entire build, test, and deployment toolchain than you can easily create yourself.
Before Jenkins, the best a developer could do to avoid breaking the nightly build was to write and test their code carefully on a local machine before committing it. But that meant testing one's changes in isolation, without everyone else's daily commits. There was no firm guarantee that the nightly build would survive the latest commit.
Jenkins, originally Hudson, was a direct response to this limitation.
Hudson and Jenkins
In 2004, Kohsuke Kawaguchi was a Java developer at Sun Microsystems. Kawaguchi was tired of breaking builds in his development work and wanted to find a way to know, before committing code to the repository, whether the code was going to work. So Kawaguchi built an automation server, written in and for Java, to make this possible; he called it Hudson. Hudson became popular at Sun and spread to other companies as open source.
Fast forward to 2011, and a dispute between Oracle (which had acquired Sun) and the independent Hudson open source community led to a fork with a name change, Jenkins. In 2014, Kawaguchi became CTO of CloudBees, which offers continuous delivery products based on Jenkins.
Both forks continued to exist, although Jenkins was much more active. The Jenkins project remains active to this day, while Hudson's website was shut down on January 31, 2020.
In March 2019, the Linux Foundation, along with CloudBees, Google, and other companies, launched a new open source software foundation called the Continuous Delivery Foundation (CDF). The Jenkins contributors decided that their project should join this new foundation. Kawaguchi wrote at the time that nothing significant would change for users.
In January 2020, Kawaguchi announced that he would be moving to his new company, Launchable. He also said that he would be officially stepping away from Jenkins, though he would remain on CDF’s technical oversight committee. His role at CloudBees changed to advisor.
Jenkins Automation
Today, Jenkins is the leading open source automation server, with some 1,600 to 1,800 plugins supporting the automation of all kinds of development tasks. The problem Kawaguchi was originally trying to solve, continuous integration and continuous delivery of Java code (that is, building projects, running tests, doing static code analysis, and deploying), is only one of many processes that people automate with Jenkins. The available plugins span five areas: platforms, user interface, administration, source code management, and, most often, build management.
How Jenkins works
Jenkins is distributed as a WAR file and as installation packages for major operating systems, as a Homebrew package, as a Docker image, and as source code. Jenkins also supports installation and scaling on Kubernetes. The source code is mostly Java, with some Groovy, Ruby, and Antlr files.
You can run the Jenkins WAR standalone or as a servlet in a Java application server such as Tomcat. In either case, it produces a web UI and accepts calls to its REST API.
When you first run Jenkins, it creates an administrative user with a long random password, which you paste into its initial web page to unlock the installation.
Jenkins Plugins
Once installed, Jenkins allows you to accept the default plugin list or choose your own plugins.
Once you’ve chosen your initial set of plugins, click the Install button and Jenkins will add them.
The main Jenkins screen displays the current build queue and executor status, and provides links to create new items (jobs), manage users, view build histories, manage Jenkins, view your custom views, and manage your credentials.
A new Jenkins item can be any of the six job types plus a folder to organize items.
The Manage Jenkins page allows you to do up to 18 different things, including opening a command-line interface. At this point, however, we should look at pipelines, which are enhanced workflows that are typically defined by scripts.
Jenkins pipelines
Once you’ve got Jenkins set up, it’s time to create some projects that Jenkins can build for you. While you can use the web UI to create scripts, the current best practice is to create a pipeline script, named Jenkinsfile, and check it into your repository. The following screenshot shows the configuration web form for a multi-branch pipeline.
As you can see, the branch sources for this type of pipeline in my basic Jenkins installation can be Git or Subversion repositories, including GitHub. If you need other types of repositories or different online repository services, it’s just a matter of adding the appropriate plugins and restarting Jenkins. I’ve tried, but I couldn’t think of a source code management system that doesn’t already have a Jenkins plugin listed.
Jenkins pipelines can be declarative or scripted. A declarative pipeline, the simpler of the two, uses Groovy-compatible syntax, and if you wish you can start the file with #!groovy to point your code editor in the right direction. A declarative pipeline starts with a pipeline block, defines an agent, and defines stages that include executable steps, as in the following three-stage example.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
pipeline is the mandatory outer block that invokes the Jenkins Pipeline plugin. agent defines where you want to run the pipeline. any says to use any available agent to run the pipeline or stage. A more specific agent might declare a container to use, for example:
agent {
    docker {
        image 'maven:3-alpine'
        label 'my-defined-label'
        args '-v /tmp:/tmp'
    }
}
stages contains a sequence of one or more stage directives. In the example above, the three stages are Build, Test, and Deploy.
steps do the actual work. In the example above, the steps only printed messages. A more useful build step might look like this:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make'
                archiveArtifacts artifacts: '**/target/*.jar', fingerprint: true
            }
        }
    }
}
Here we invoke make from a shell and then archive any JAR files produced to the Jenkins archive; the fingerprint option records the archived artifacts so they can be tracked across builds.
The post section defines actions that run at the end of the pipeline run or of a stage. You can use a number of post-condition blocks within the post section: always, changed, failure, success, unstable, and aborted.
For example, the Jenkinsfile below always runs JUnit after the Test stage, but only sends an email if the pipeline fails.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'make check'
            }
        }
    }
    post {
        always {
            junit '**/target/*.xml'
        }
        failure {
            mail to: 'team@example.com', subject: 'The Pipeline failed :('
        }
    }
}
The declarative pipeline can express most of what you need to define pipelines and is much easier to learn than the scripted pipeline syntax, which is a Groovy-based DSL. The scripted pipeline is, in fact, a complete programming environment.
For comparison, the following two Jenkinsfiles are completely equivalent.
Declarative pipeline
pipeline {
    agent { docker 'node:6.3' }
    stages {
        stage('build') {
            steps {
                sh 'npm --version'
            }
        }
    }
}
Scripted pipeline
node('docker') {
    checkout scm
    stage('Build') {
        docker.image('node:6.3').inside {
            sh 'npm --version'
        }
    }
}
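Because a scripted pipeline is ordinary Groovy, you can use the language directly for logic that declarative syntax only approximates. The following is a minimal, hypothetical sketch (the platform list is invented purely for illustration) that generates one stage per platform with a plain Groovy loop:
node {
    // Hypothetical list of build targets, for illustration only.
    def platforms = ['linux', 'windows', 'mac']
    for (p in platforms) {
        // Each loop iteration creates its own stage in the build.
        stage("Build on ${p}") {
            echo "Building for ${p}..."
        }
    }
}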
Blue Ocean, the Jenkins GUI
If you want the latest and greatest Jenkins UI, you can use the Blue Ocean plugin, which provides a graphical user experience. You can add the Blue Ocean plugin to your existing Jenkins installation or run a Jenkins/Blue Ocean Docker container. With Blue Ocean installed, your Jenkins main menu will have an additional icon:
You can open Blue Ocean directly if you like; it lives under /blue on the Jenkins server. Building pipelines in Blue Ocean is a bit more graphical than in plain Jenkins:
Jenkins Docker
As I mentioned earlier, Jenkins is also distributed as a Docker image. There isn't much more to the process: once you've chosen your SCM type, you provide a URL and credentials, then create a pipeline from a single repository or scan all the repositories in an organization. Each branch that contains a Jenkinsfile gets its own pipeline.
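To illustrate how one Jenkinsfile can serve every branch, here is a hedged sketch of a declarative pipeline that builds on all branches but deploys only from a hypothetical main branch, using the when directive:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
        stage('Deploy') {
            // 'main' is a hypothetical branch name; adjust it to your repository.
            when { branch 'main' }
            steps {
                echo 'Deploying from main...'
            }
        }
    }
}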
Here I am running a Blue Ocean Docker image, which came with a few more Git service plugins installed than the default list of SCM providers:
Once you have run a few pipelines, the Blue Ocean plugin will display its status, as shown above. You can zoom in on an individual pipeline to see the stages and steps:
You can also expand the branches (above) and activities (below):
Why use Jenkins?
The Jenkins Pipeline plugin we've been using supports a general continuous integration/continuous delivery (CI/CD) use case, which is probably the most common use for Jenkins. There are specialized considerations for some other use cases.
Java projects were the original raison d'être for Jenkins. We've already seen that Jenkins supports building with Maven; it also works with Ant, Gradle, JUnit, Nexus, and Artifactory.
Android runs a kind of Java, but introduces the problem of how to test on the wide range of Android devices. The Android Emulator plugin allows you to build and test on as many emulated devices as you care to define. The Google Play Android Publisher plugin lets you send builds to an alpha channel in Google Play for release or further testing on actual devices.
I’ve shown examples where we specify a Docker container as the agent for a pipeline and where we run Jenkins and Blue Ocean in a Docker container. Docker containers are very useful in a Jenkins environment to improve speed, scalability, and consistency.
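To make that concrete, here is a hedged sketch (the image tags are illustrative choices, not requirements) of a declarative pipeline that runs each stage in its own container, so every build gets a clean, consistent toolchain:
pipeline {
    agent none
    stages {
        stage('Back end') {
            // maven:3-alpine is the image used earlier; any Maven image would do.
            agent { docker { image 'maven:3-alpine' } }
            steps {
                sh 'mvn --version'
            }
        }
        stage('Front end') {
            // node:16-alpine is an illustrative Node.js image.
            agent { docker { image 'node:16-alpine' } }
            steps {
                sh 'node --version'
            }
        }
    }
}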
There are two main use cases for Jenkins and GitHub together. One is build integration, which can include a service hook that triggers Jenkins on every commit to your GitHub repository. The second is using GitHub authentication to control access to Jenkins via OAuth.
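Triggering can also be expressed in the Jenkinsfile itself. As a rough sketch, the pipeline below uses the built-in pollSCM trigger as a stand-in; with a GitHub push webhook configured, the polling becomes unnecessary because commits notify Jenkins directly:
pipeline {
    agent any
    triggers {
        // Poll the repository roughly every five minutes as a fallback;
        // a GitHub push webhook would make this polling unnecessary.
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}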
Jenkins supports many other languages besides Java. For C/C++, there are plugins to capture errors and warnings from the console, generate build scripts with CMake, run unit tests, and perform static code analysis. Jenkins has a number of integrations with PHP tools.
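As a hedged sketch of a C/C++ job (assuming a project that builds with make and the Warnings Next Generation plugin for the recordIssues step), Jenkins could collect compiler warnings like this:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // The compiler output lands in the build's console log.
                sh 'make'
            }
        }
    }
    post {
        always {
            // Parse GCC warnings from the console log (Warnings NG plugin assumed).
            recordIssues tools: [gcc()]
        }
    }
}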
While Python code doesn't need compiling (unless you're using Cython, for example, or building a Python wheel for installation), it helps that Jenkins integrates with Python testing and reporting tools, such as Nose2 and pytest, and with code-quality tools such as Pylint. Similarly, Jenkins integrates with Ruby tools such as Rake, Cucumber, Brakeman, and CI::Reporter.
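For instance, here is a minimal, hypothetical sketch of a Python test stage: pytest writes JUnit-style XML via its --junitxml flag, and the junit step then publishes the results; the report paths are invented for illustration:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                // Run the test suite and emit JUnit-style XML (path is illustrative).
                sh 'pytest --junitxml=reports/results.xml'
            }
        }
    }
    post {
        always {
            // Publish the test results so Jenkins can track and chart them.
            junit 'reports/*.xml'
        }
    }
}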
Jenkins for CI/CD
Overall, Jenkins offers an easy way to set up a CI/CD environment for virtually any combination of languages and source code repositories using pipelines, as well as automating a number of other routine development tasks. While Jenkins doesn’t eliminate the need to script individual steps, it does give you a faster and more robust way to integrate your entire build, test, and deployment toolchain than you could easily create yourself.