Demystifying DevOps: Getting Started with Automated Delivery Pipelines
A Guide for Those New to DevOps and Jenkins Processes
Introduction
DevOps is a software engineering culture and practice that aims to unify software development and operations. DevOps practices include the use of a technology stack and tools that help teams operate and evolve applications quickly and reliably, allowing us to accomplish tasks such as deploying code, provisioning infrastructure, scanning code for quality, running automated tests, and monitoring running applications.
The Problem
Historically, deploying software (pushing code into production and making it available to customers) has been a very manual and slow process. Software development teams would build software and then pass it over to a completely separate operations team that would deal with building, testing, and deploying the code. Development teams were essentially throwing their product over a wall to the operations teams and saying, “It’s your problem now!”
As a result, getting code shipped to customers took a long time and left a lot of room for confusion and human error.
The Solution: DevOps!
By embracing a new culture in which teams are responsible for both the development and operations aspects of software, teams now have increased communication, shorter development cycles, reduced time to recover due to more frequent code releases, and a culture focused on performance. Teams are now responsible for the entire lifecycle of their code and are held accountable for fixing any bugs that come up anytime from writing the first lines of code all the way until a release is shipped.
DevOps on an Enterprise Scale
At Capital One, our engineering teams use DevOps processes to deliver high-quality software in a hyper-efficient manner. We strive to achieve a company-wide standard so that we have better performing teams, faster delivery, and happier customers. To achieve this, we use a variety of delivery pipelines that scan and run checks on our code.
The purpose of an automated delivery pipeline is to ensure that every time a change is made to a code base, the code undergoes a series of rigorous automated tests, including unit tests, integration tests, security scanning, and quality checks. If the updated code passes all the pipeline checks and successfully builds and deploys to our infrastructure, the pipeline automatically creates a release that is immediately available to customers! Customers don’t experience any downtime from the application, and teams can release updates without the fear that something will break.
Building Automated Delivery Pipelines Using Jenkins
As an engineer on a DevOps team, I work on building robust pipelines so that any development team can easily use them to deploy their applications and be sure their code is up to par with Capital One’s high standards. To do this, my team leverages Jenkins, the industry standard for building delivery pipelines.
Jenkins is an open source automation server written in Java that helps automate the operations portion of DevOps. Jenkins can easily be integrated with GitHub webhooks so that every time a change is pushed, Jenkins is notified and can perform any number of functions. At Capital One, we use Jenkins because it allows us to take advantage of state-of-the-art open source tools without reinventing the wheel in-house. If you are new to the DevOps world, having a basic understanding of Jenkins is crucial!
The Jenkins pipeline for a repo is defined in a file called the Jenkinsfile which is typically in the top level directory of the project. Jenkinsfiles are broken down into stages, such as “application build,” “integration testing,” and “deploy,” and these stages are further broken down into additional steps. At Capital One, we use code generation tools, such as the application I work on, to generate Jenkinsfiles for different types of applications. However, Jenkinsfiles often need to be manually tweaked, and in any case it’s still important to understand how to write one from scratch. The rest of this article will walk you through creating a GitHub repo that is integrated with Jenkins.
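To make that structure concrete, a bare-bones declarative Jenkinsfile looks roughly like this (the stage names and echo steps below are just placeholders):

```groovy
pipeline {
    agent any  // run on any available Jenkins agent
    stages {
        stage('Application Build') {
            steps {
                echo 'Building the application...'  // real build commands would go here
            }
        }
        stage('Integration Testing') {
            steps {
                echo 'Running integration tests...'
            }
        }
    }
}
```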
Build Your Own Jenkins Pipeline From Scratch
Our goal is to configure a repository so that whenever a push is made to the repo, it triggers a pipeline in Jenkins to run. Pipelines can do almost anything, but ours will be a simple one that calls a shell script and prints the step it’s on.
What You Need
- A GitHub account and basic knowledge of git. You can create an account for free at github.com.
- Access to a Jenkins instance (either your own or an enterprise instance for your company). Installation and setup instructions are available at jenkins.io.
Adding a Jenkinsfile That Defines the Pipeline
1. Create a new GitHub repository (under your account). This will give you admin access to the repo which will allow you to add a webhook to talk to Jenkins.
2. Clone your repo locally.
3. Add a file in the top level directory called “hello_world.sh” and add the following code, or any other code you want the pipeline to execute:
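For example, a minimal script that just prints a message works fine (feel free to substitute your own):

```bash
#!/bin/bash
# hello_world.sh - a simple script for the pipeline to execute
echo "Hello, world! This script was run by the Jenkins pipeline."
```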
This is a script that will be called in the Jenkinsfile and will therefore be executed when the pipeline is run.
4. Add a file in the top level directory called “Jenkinsfile” and add the following code:
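Here is a minimal three-stage pipeline along those lines (the stage names and echo messages are just illustrative):

```groovy
pipeline {
    agent any
    stages {
        stage('Application Build') {
            steps {
                // make the script executable, then run it
                sh 'chmod +x hello_world.sh'
                sh './hello_world.sh'
            }
        }
        stage('Integration Testing') {
            steps {
                echo 'Running integration tests (mock step)'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying the application (mock step)'
            }
        }
    }
}
```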
This is a basic pipeline with three stages. Although these stages are mocks and don’t actually perform the tasks their names suggest, they represent realistic functions that Jenkins commonly performs. The first stage makes our shell script executable and then runs it, and the other two stages are simple print statements.
5. Push these changes to your repo.
Creating a Pipeline in Jenkins
1. Create a folder in your Jenkins instance.
2. On the left-hand side, click on “New Item.”
3. Enter a name for your new pipeline (for example, <your_name>-jenkins-demo). Select “Pipeline” from the list of options, then click “OK.”
4. Select that you are connecting your pipeline to a GitHub project and include the link to your repo.
5. Under Build Triggers, select “GitHub hook trigger for GITScm polling” and “Poll SCM.” This will allow the GitHub webhook to interact with Jenkins.
6. Under Pipeline > Definition, select “Pipeline script from SCM.” Under the SCM dropdown, select “Git” and include the same link to your repo under Repository URL.
This tells the pipeline to execute the script in your repo (the Jenkinsfile written earlier). If you ever push updates to the Jenkinsfile in your repo, the pipeline that runs will pick up those changes automatically.
7. Save your pipeline.
8. Run the pipeline once manually by selecting “Build Now” from the menu on the left.
Adding the Webhook in GitHub
1. In your GitHub repo, click Settings > Webhooks and add a webhook with the following configuration (using the URL for your own Jenkins instance):
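A typical configuration looks something like this (replace the placeholder with your own Jenkins URL; the /github-webhook/ endpoint is provided by the Jenkins GitHub plugin):

- Payload URL: https://<your-jenkins-url>/github-webhook/ (note the trailing slash)
- Content type: application/json
- Which events would you like to trigger this webhook? “Just the push event”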
2. Save the webhook.
Testing
- Push a change to your repo. It can be anything, for example adding or updating the README or adding a comment somewhere. If everything was set up correctly, your pipeline script should have been triggered and you’ll be able to see your job running in Jenkins!
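For example, from inside your local clone, a trivial change might look like this (assuming your default branch is named main):

```bash
# Make a small change, commit it, and push to trigger the webhook
echo "Trigger the Jenkins pipeline" >> README.md
git add README.md
git commit -m "Test Jenkins webhook"
git push origin main   # use your default branch name if it isn't 'main'
```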
Take-Aways
If you’re a software engineer, having a basic understanding of what DevOps is and why it’s important is crucial. Having automated deployment processes in place may seem tedious, but it’s super important for deploying robust, bug-free code.
Even having a little bit of experience with Jenkins and automated deployments, such as completing the activity above, can set you apart from other developers. No matter what type of products you’re building, having a DevOps solution in place is a key to delivering high-quality software.