CI / CD Pipeline at Sonatype
Table of Contents
- Overview
- Main Players
- Sonatype CI/CD Jenkins Pipeline
- Master-Snapshot Pipeline
- Master-Release Pipeline
- Putting It All Together
Overview
Continuous Integration (CI) means that developers merge changes back to the main development branch as often as possible. It also means that testing is automated, so we can check that applications are not broken whenever new commits land on the master branch.
Continuous Delivery (CD) is an extension of continuous integration where new changes are released to our customers quickly and effectively. CD means that in addition to having our tests automated, we also have automated our release process and we can deploy our applications at any time.
The IQ Server team has automated the Sonatype CI/CD pipeline and streamlined their development process. The main driver behind this change was moving to a declarative Jenkins pipeline. In this article, we’ll show you the Sonatype Jenkins pipeline setup and how our products (IQ Server/Lifecycle and Nexus Repository Manager) fit in as pivotal points in the pipeline.
Main Players
While this article focuses on our CI/CD Jenkins pipeline, there are several other important programs we use to streamline our process. We’ll quickly go over those here and then jump into our pipeline.
| Tool | Description |
| --- | --- |
| Maven and the Central Repository | The Sonatype IQ Server team uses Maven and the Central Repository for their builds. Apache Maven (or Maven) is a software project management tool that serves as the centralized location for our project’s builds. Maven relies on the Central Repository, a public repository for Java components. |
| JIRA | We use JIRA project management software to define user stories, bug fixes, and any other project feature enhancements. |
| GitHub | The IQ Server team uses GitHub as their source control tool. |
| Nexus Repository Manager | We use Nexus Repository Manager to host our own components and proxy those in public repositories (like the Central Repository). |
| Nexus IQ Server | Nexus IQ Server is the policy engine behind our products like Lifecycle and Firewall. We use the IQ Server to evaluate policies against our builds and use the provided information to remediate violations as early as possible. |
| Jenkins | Jenkins is DevOps software that lets you create a continuous integration or continuous delivery environment for just about any language and source code repository using pipelines. Pipelines represent a robust set of automation tools. With a Jenkins pipeline, we can define the steps needed to build, test, and deliver our applications. |
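To make the pipeline idea concrete, here is a minimal, generic declarative `Jenkinsfile` sketch. The stage names and Maven commands are illustrative placeholders only, not Sonatype's actual pipeline (which is shown later in this article):

```groovy
// Minimal declarative pipeline sketch: build, test, and deliver stages.
// Stage names and shell commands are illustrative, not Sonatype's real pipeline.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean package'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn -B verify'
            }
        }
        stage('Deliver') {
            steps {
                sh 'mvn -B deploy'
            }
        }
    }
}
```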
Sonatype CI/CD Jenkins Pipeline
As stated in the intro, the main point of this article is to go over our Jenkins pipeline, and we’ll do that in this section.
The IQ Server team decided to go with a declarative pipeline written as a `Jenkinsfile` in the root of the project. This file lives in source control, just like the source code. We went with a declarative pipeline over a scripted one because declarative puts more structure around what we can do and helps us enforce best practices. Declarative pipelines also run in a sandbox, so they’re a little safer. For more information on declarative and scripted pipelines, see the Jenkins Pipeline documentation.
To make our `Jenkinsfile`, we use Groovy scripts to create a multi-branch pipeline job. This works well for feature branches and test branches, and it lets Jenkins automatically discover those branches and run the jobs to build them. We also have a script for single pipelines tied to a release branch, usually our `master` branch. These pipelines take parameters from the Jenkins UI to create a feature Snapshot job, a Master Snapshot job, a release job, a deployment job, a downstream job (for customization), and so on.
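As an illustration of how such a single pipeline can take parameters from the Jenkins UI, here is a hedged sketch. The parameter definitions are assumptions that simply mirror the `params.version` and `params.nextVersion` references used in the release pipeline later in this article:

```groovy
// Sketch: declaring UI parameters for a release pipeline.
// The names mirror params.version / params.nextVersion used later in this article;
// defaults and descriptions are hypothetical.
pipeline {
    agent any
    parameters {
        string(name: 'version', defaultValue: '', description: 'Version to release')
        string(name: 'nextVersion', defaultValue: '', description: 'Next development version')
    }
    stages {
        stage('Show Parameters') {
            steps {
                echo "Releasing ${params.version}, next development version: ${params.nextVersion}"
            }
        }
    }
}
```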
We have a couple of shared pipelines and a lot of custom shared scripts. Almost everything we do is a standard Maven project. This results in two pipelines: a Snapshot pipeline (where things might still change and are in active development) and a Release pipeline (where things are stable and ready for production).
Master-Snapshot Pipeline
Let’s see what those look like in the Jenkins UI. We’ll use the Master Snapshot build as our example:
When developers merge feature changes, bug fix changes, or anything else into master, the Master Snapshot build kicks off automatically and goes through the following checks in our Jenkins pipeline:
- License Check: Make sure all source code has appropriate license headers.
- Build and Test: In Maven speak, this is a `mvn clean deploy`.
- Run Downstream: Our projects have different requirements for tests, and all of those differences go in the Run Downstream job. This is because we have a standard pipeline that we can use across all projects, streamlining the setup and standardizing what we use for longevity. But there are inevitably some differences, and that’s where those go. Run Downstream is a powerful feature that lets you customize the tests you want to run for each project. For example, we test all the different versions of Bamboo for the Bamboo plugin, but this test doesn’t make any sense for the IQ Server pipeline, so it isn’t included in `Run Downstream` for IQ.
Our more complex projects have a lot of custom tests that we need to run. In this example, we’ll look at the master pipeline. We kick these off in a parallel test group. We’ve got some older legacy Geb tests, then two groups of functional tests, and another group of integration tests. These run in parallel on different nodes, so the whole thing finishes in roughly 30 minutes rather than running each of the 5000 tests on its own.
NOTE: Each of our projects has different needs for testing - for example, the Bamboo plugin has tests that run for each version of the plugin that we support. These extra tests in the Downstream job are what vary from project to project, but we try to make everything else the same.
- Collect Distribution Files: Here we collect various files for the distribution that we want to save.
- Evaluate Policies: In this step, we call our own product, the IQ Server (our policy server). Here we evaluate all of our projects and products - in this case, using IQ Server to evaluate the IQ Server itself. In the Master Snapshot pipeline, we evaluate our policies against the build stage. The stage you evaluate against may differ depending on your pipeline - for instance, in the Release pipeline, we evaluate against the release stage, which has different policies in place. We evaluate our policies so we can see violations that need to be resolved before proceeding. The IQ Server team uses Lifecycle to warn in the build phase and block at release or deploy. Violations kick off notifications in the Jenkins UI, JIRA, email, and chat, all of which link to the application composition report for further investigation and remediation.
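The pipeline code shown later calls a shared-library step named `runEvaluation` for this stage. One way such a step could be implemented is with the Nexus Platform plugin's `nexusPolicyEvaluation` step; a minimal sketch, where the application ID `sample-app` is hypothetical:

```groovy
// Sketch: evaluating IQ Server policies against the 'build' stage.
// 'sample-app' is a hypothetical IQ application ID; the real pipeline wraps
// this call in a shared-library step (runEvaluation).
pipeline {
    agent any
    stages {
        stage('Evaluate Policies') {
            steps {
                nexusPolicyEvaluation iqApplication: 'sample-app',
                                      iqStage: 'build',
                                      failBuildOnNetworkError: true
            }
        }
    }
}
```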
These are the steps we go through with every Snapshot build, whether it’s a feature branch or master. The only reason we have a master vs. feature Snapshot build is that the rules are slightly different for how things get triggered. Master Snapshot builds get kicked off automatically every night or when someone makes a merge. We do a nightly build in case no one pushes anything for that day. This lets us catch things like policy evaluations, dependency changes, and system changes on a daily basis. On the other hand, feature Snapshots are up to the developer to run because they may make several changes before they want to run the build.
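Before looking at the real pipeline code, here is a hedged sketch of what the downstream job's parallel test groups could look like. The node labels, stage names, and Maven profiles are illustrative assumptions:

```groovy
// Sketch: running test groups in parallel on separate nodes.
// Node labels and Maven profile names are illustrative only.
pipeline {
    agent none
    stages {
        stage('Parallel Tests') {
            parallel {
                stage('Legacy Geb Tests') {
                    agent { label 'test' }
                    steps { sh 'mvn -B verify -Pgeb-tests' }
                }
                stage('Functional Tests 1') {
                    agent { label 'test' }
                    steps { sh 'mvn -B verify -Pfunctional-tests-1' }
                }
                stage('Functional Tests 2') {
                    agent { label 'test' }
                    steps { sh 'mvn -B verify -Pfunctional-tests-2' }
                }
                stage('Integration Tests') {
                    agent { label 'test' }
                    steps { sh 'mvn -B verify -Pintegration-tests' }
                }
            }
        }
    }
}
```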
Next, let’s look at what this looks like as a declarative pipeline in the Jenkins code. Remember that this basic declarative pipeline is used by every project, with just a few minor differences.
```groovy
pipeline {
    agent {
        label pipelineCommon.agentLabel
    }
    tools {
        maven mavenCommon.mavenVersion
        jdk mavenCommon.javaVersion
    }
    options {
        buildDiscarder(
            logRotator(numToKeepStr: '100', daysToKeepStr: '14',
                       artifactNumToKeepStr: '20', artifactDaysToKeepStr: '10')
        )
        timestamps()
    }
    stages {
        stage('License Check') {
            steps {
                licenseCheck(mavenCommon)
            }
        }
        stage('Build and Test') {
            steps {
                buildAndTest(mavenCommon, pipelineCommon.keystoreCredentialsId,
                        isDeployBranch(env, deployBranch), pipelineCommon.useInstall4J)
            }
        }
        stage('Sonar Analysis') {
            when {
                expression { return pipelineCommon.performSonarAnalysis }
            }
            steps {
                sonarAnalyzePullRequest(env: env)
            }
        }
        stage('Run Downstream') {
            steps {
                runDownstream(pipelineCommon.downstreamJobName, pipelineCommon.artifactsForDownstream)
            }
        }
        stage('Collect Distribution Files') {
            steps {
                collectDist(pipelineCommon.distFiles)
            }
        }
        stage('Evaluate Policies') {
            when {
                expression { return isDeployBranch(env, deployBranch) }
            }
            steps {
                runEvaluation(pipelineCommon.iqPolicyEvaluation, 'build')
            }
        }
    }
}
```
In the script, we tell the pipeline which tools to use and then parameterize them. Looking at the stages, we see everything we saw in the UI like the license check, build and test, and so on. This is one of the beautiful things about the declarative pipeline. You can look at the script and then look at the UI and it’s easy to see the workflow and know exactly what’s going on.
Next, let’s take a look at the other major workflow which is our Release pipeline.
Master-Release Pipeline
For the Master Release build pipeline, the process is a little different from the Snapshot build:
It is essentially the same up to the `License Check` stage, but then the next step is `Prepare Release`. In this step, we set the code base to a specific version that we provided as one of our parameters. We then build and test, run our downstream tests, and then evaluate policies against the release stage.
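The article doesn't show the implementation of the `prepareRelease` shared-library step called in the code below; here is a minimal sketch of what such a step might do, assuming the Maven Versions plugin is used to set the release version:

```groovy
// vars/prepareRelease.groovy (hypothetical shared-library step)
// Sets the code base to the release version supplied as a build parameter.
def call(env, mavenCommon, String version) {
    // The real step may also validate branch state, update changelogs, etc.
    sh "mvn -B versions:set -DnewVersion=${version} -DgenerateBackupPoms=false"
}
```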
The next step is `Publish`. This is where we sign the binaries and deploy them. This is another place where we integrate and use our own products, pushing to our Nexus Repository Manager as part of this process. During the Publish stage, the artifacts created by our build go to Nexus Repository Manager. This is slightly different from the Master Snapshot pipeline, where the `Build and Test` step pushes a Snapshot version of the artifacts to an internal Nexus Repository. Typically, when we publish a release, we also push to a public Nexus Repository that is available for customers to use (depending on the product). For example, the Maven plugin gets published via Nexus Repository and is also available for download from Sonatype.
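As a rough illustration of the Publish stage, here is a hedged sketch of what a `publishRelease` shared-library step might do, assuming the signing keystore is a file credential and the target repository is configured in the project's POM; the actual Sonatype implementation is not shown in the article:

```groovy
// vars/publishRelease.groovy (hypothetical shared-library step)
// Signs and deploys release artifacts to Nexus Repository Manager.
def call(mavenCommon, String keystoreCredentialsId, boolean useInstall4J) {
    // Bind the signing keystore, then deploy to the repository configured in
    // the project's distributionManagement. Install4J packaging is omitted here.
    withCredentials([file(credentialsId: keystoreCredentialsId, variable: 'KEYSTORE')]) {
        sh 'mvn -B deploy -DskipTests'
    }
}
```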
After we store our artifacts, we collect any other distribution files that we want to use for other jobs in Jenkins. Finally, the Finish Release step is where we clean everything up, tag the release, and make sure it’s all good to go.
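Similarly, a hedged sketch of what a `finishRelease` step might do (tagging the release and moving the code base to the next development version); the tag format and commands are assumptions:

```groovy
// vars/finishRelease.groovy (hypothetical shared-library step)
// Tags the release and bumps the code base to the next development version.
def call(env, mavenCommon, String version, String nextVersion) {
    sh "git tag -a release-${version} -m 'Release ${version}'"
    sh "git push origin release-${version}"
    sh "mvn -B versions:set -DnewVersion=${nextVersion} -DgenerateBackupPoms=false"
}
```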
Again, if we look at that pipeline code, it’s pretty similar to the Master Snapshot pipeline:
```groovy
pipeline {
    agent {
        label pipelineCommon.agentLabel
    }
    tools {
        maven mavenCommon.mavenVersion
        jdk mavenCommon.javaVersion
    }
    options {
        buildDiscarder(
            logRotator(numToKeepStr: '100', daysToKeepStr: '14',
                       artifactNumToKeepStr: '20', artifactDaysToKeepStr: '10')
        )
        timestamps()
    }
    stages {
        stage('License Check') {
            steps {
                licenseCheck(mavenCommon)
            }
        }
        stage('Prepare Release') {
            steps {
                validateReleaseParameters(params)
                setBuildDisplayName Version: params.version
                prepareRelease(env, mavenCommon, params.version)
            }
        }
        stage('Build and Test') {
            steps {
                buildAndTest(mavenCommon, pipelineCommon.keystoreCredentialsId, false, pipelineCommon.useInstall4J)
            }
        }
        stage('Run Downstream') {
            steps {
                runDownstream(pipelineCommon.downstreamJobName, pipelineCommon.artifactsForDownstream)
            }
        }
        stage('Evaluate Policies') {
            steps {
                runEvaluation(pipelineCommon.iqPolicyEvaluation, 'release')
            }
        }
        stage('Publish') {
            steps {
                publishRelease(mavenCommon, pipelineCommon.keystoreCredentialsId, pipelineCommon.useInstall4J)
            }
        }
        stage('Collect Distribution Files') {
            steps {
                collectDist(pipelineCommon.distFiles)
            }
        }
        stage('Finish Release') {
            steps {
                finishRelease(env, mavenCommon, params.version, params.nextVersion)
            }
        }
    }
}
```
You’ll see we’re able to reuse some steps (like License Check) from the Master Snapshot pipeline. Almost all the steps are the same; the release pipeline simply adds a few release-specific stages such as Prepare Release, Publish, and Finish Release.
Putting It All Together
Although our Jenkins pipeline is the heart of our CI/CD process, there are many other integrations that come together to make things work. JIRA and our user stories are the trigger that gets things started. Developers move user stories from backlog, to in progress, to done as they complete development work. They commit to source control management, which at Sonatype is GitHub. Code that’s merged into GitHub kicks off the automated Jenkins CI/CD pipeline that we discussed in depth. The Sonatype build/test/deploy pipeline is all done in Jenkins, and it is “template-ized,” meaning we can reuse it for several projects. This speeds up the development process because we don’t have to go into Jenkins and figure out how it works for each project. It also attests to the power of the `Run Downstream` step in a Jenkins pipeline, which lets you easily accommodate changing project requirements (usually testing).
There are two places where our own products are built into the pipeline: (1) the `Policy Evaluation` step uses IQ Server to evaluate policies against the build or release stage, and (2) the `Publish` and `Deploy` steps push to Nexus Repository Manager to store our artifacts.
The `Policy Evaluation` step is an important gate in our pipeline. Not only do violations automatically create JIRA tickets for the team, we also see them in the Jenkins UI and get email and chat alerts. This keeps the team aware of any policy violations triggered by a build. When a JIRA ticket is created from a policy evaluation, it’s treated as a high-priority item, and the team works to resolve it quickly. At Sonatype, we don’t let any product go out the door with violations! This is part of the process for all of our pipeline builds: we get notifications when a build fails or gets a warning. For instance, we may get a “warn” notification in the build stage, but that same evaluation will fail the build in the release stage. A build with warnings isn’t released until the team goes in, figures out what the violation is, and gets it fixed. Then we attempt the build again, and if everything is good, it proceeds through the release stage.
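As a hedged sketch of how such notifications might be wired in a declarative pipeline, here is a `post` block that runs when the build fails. The recipient address, application ID, and message text are illustrative, and Sonatype's actual JIRA and chat integrations are not shown in this article:

```groovy
// Sketch: alerting the team when the policy gate (or any other stage) fails.
// 'sample-app' and the email address are hypothetical.
pipeline {
    agent any
    stages {
        stage('Evaluate Policies') {
            steps {
                nexusPolicyEvaluation iqApplication: 'sample-app', iqStage: 'release'
            }
        }
    }
    post {
        failure {
            mail to: 'team@example.com',
                 subject: "Policy evaluation failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
                 body: "See ${env.BUILD_URL} for the link to the application composition report."
        }
    }
}
```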