Streamlining DevOps with Jenkins, Docker, and Kubernetes Integration

@Harsh
6 min read · Feb 29, 2024


INTRODUCTION

In the fast-paced world of DevOps, automation and integration are key to achieving efficiency and scalability. In this blog, we’ll explore a comprehensive project where we leverage Jenkins to orchestrate a seamless workflow integrating Docker and Kubernetes. Let’s dive into the details of how we’ve streamlined our development and deployment processes for maximum effectiveness.

PROJECT OVERVIEW

Our project centers around a Jenkins pipeline that automates the build, testing, and deployment of Docker containers in a Kubernetes environment. We’ve divided our infrastructure into three nodes within Jenkins:

1. Image Building Node:

This node is responsible for building Docker images automatically whenever changes are pushed to our GitHub repository. We’ve configured a GitHub webhook to trigger this node, ensuring seamless integration with our version control system.

2. Image Testing Node:

Once the Docker image is built, it’s automatically pulled onto this node for testing. We run automated tests to ensure the integrity and functionality of the containerized application. If the tests pass, the image is deemed ready for deployment.

3. Deployment Node:

The final stage involves deploying the Docker image onto our Kubernetes cluster. Rather than directly interacting with the cluster, we’ve abstracted this process through a separate deployment node. This node communicates with the Kubernetes cluster running in a different zone, adhering to best practices for separation of concerns and security.

You just need to install “kubectl” on this node and copy the “admin.conf” file from the master node so it can communicate with the Kubernetes cluster.
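For a kubeadm-based cluster, the setup on the deployment node might look like the following minimal sketch (the kubectl version and the master node IP are placeholders to substitute):

```shell
# Install kubectl on the deployment node (Linux amd64; pin a version of your choice)
curl -LO "https://dl.k8s.io/release/v1.29.0/bin/linux/amd64/kubectl"
sudo install -m 0755 kubectl /usr/local/bin/kubectl

# Copy the cluster config from the Kubernetes master (IP is a placeholder)
mkdir -p ~/.kube
scp root@<master-node-ip>:/etc/kubernetes/admin.conf ~/.kube/config

# Verify that this node can reach the cluster
kubectl get nodes
```

On a kubeadm master, `admin.conf` lives at `/etc/kubernetes/admin.conf` by default; kubectl reads it from `~/.kube/config` without any extra flags.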

NOTE :

  1. ARTICLE FOR SETTING UP MASTER-SLAVE ARCHITECTURE IN JENKINS
  2. ARTICLE FOR SETTING UP KUBERNETES MULTI-NODE CLUSTER

Jenkins Pipeline Workflow

Our Jenkins pipeline consists of three interconnected jobs:

1. Image Building Job:

This job is triggered by the GitHub webhook whenever changes are pushed to the repository. It automatically builds the Docker image using the Dockerfile and pushes it to Docker Hub for centralized storage and distribution.
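As a rough sketch, the “execute shell” step of this build job could look like the following; the image name is a placeholder, and `BUILD_NUMBER` is an environment variable Jenkins injects into every build:

```shell
# Hypothetical build step -- replace the image name with your own Docker Hub repo
IMAGE="your-dockerhub-user/webapp:${BUILD_NUMBER:-latest}"

# Build from the Dockerfile in the cloned workspace
docker build -t "$IMAGE" .

# Push to Docker Hub (supply credentials via Jenkins credentials, not plain text)
docker push "$IMAGE"
```

Tagging with the build number keeps every image traceable back to the Jenkins build that produced it.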

2. Image Testing Job:

Upon successful image build, this job pulls the Docker image from Docker Hub onto the testing node. Automated tests are executed to validate the functionality of the application within the container. If the tests pass, the job proceeds to trigger the deployment job.
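A testing step on this node might be sketched like this (image name, port, and the smoke test itself are assumptions; substitute your own test suite):

```shell
# Hypothetical test step: pull the freshly built image and smoke-test it
IMAGE="your-dockerhub-user/webapp:${BUILD_NUMBER:-latest}"

docker pull "$IMAGE"
docker run -d --name webapp-test -p 8081:80 "$IMAGE"
sleep 5

# curl -f exits non-zero on an HTTP error, which fails the Jenkins job
curl -f http://localhost:8081/ || { docker rm -f webapp-test; exit 1; }

# Clean up the test container on success
docker rm -f webapp-test
```

Because Jenkins treats any non-zero exit code as a failed build, the downstream deployment job is only triggered when the test passes.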

3. App Deployment Job:

The final stage involves deploying the Docker image onto our Kubernetes cluster. This job retrieves Kubernetes manifests from the GitHub repository, which specify the deployment configuration. It creates a Kubernetes deployment and exposes it to external traffic, ensuring accessibility for end-users.
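The deploy step could be sketched as follows (the repository URL, manifest file names, and deployment name are assumptions based on the repos described later in this article):

```shell
# Hypothetical deploy step, run on the deployment node
git clone https://github.com/harsh2478/kubernetes_manifests.git
cd kubernetes_manifests

# Create or update all resources described by the manifests in this repo
kubectl apply -f .

# Wait until the rollout completes (deployment name is a placeholder)
kubectl rollout status deployment/<deployment-name>
```

`kubectl apply` is idempotent, so re-running the job updates the existing deployment instead of failing.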

Adding a Webhook to GitHub

Go to the GitHub repository and, under Settings, click the Webhooks option. Add your Jenkins URL with its port number, appending “/github-webhook/” to it, and set the Content type to “application/json”.

Steps to Clone GitHub Repository for Local Development:

  1. Navigate to GitHub Repository: Go to the GitHub repository that contains the project you want to clone. You can find the repository URL in the browser’s address bar.
  2. Open Terminal or Command Prompt: Open your terminal or command prompt on your local computer where you want to clone the repository.
  3. Navigate to Desired Directory: Use the cd command to navigate to the directory where you want to clone the repository.
  4. Clone Repository: Use the git clone command followed by the repository URL to clone the repository to your local machine. If you're using HTTPS:
git clone <repo-url>

  5. Verify Clone: Once the cloning process is complete, navigate into the cloned directory using cd <repo-name> and verify that the project files are present by listing the directory contents (ls command).
  6. Start Working: You’re now ready to start working on the project locally! Make changes to the files as needed and push them back to the remote repository when you’re ready to commit your changes.

In our project we have two repositories:

  1. “Jenkins_training_2024” for our webpages and Dockerfile.

After cloning to the local repository:

The content of the index.html file is available on my GitHub: https://github.com/harsh2478/Jenkins_training_2024/

As soon as you push it, your build job will trigger and start building.

2. The next repo we need is “kubernetes_manifests”, which holds the manifests for our Kubernetes resources.

We will first create the deployment manifest:
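An illustrative deployment manifest might look like this (names, labels, replica count, and image are placeholders, not the article’s exact manifest):

```yaml
# Example deployment manifest -- substitute your own names and image
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webapp-deployment
spec:
  replicas: 2
  selector:
    matchLabels:
      app: webapp
  template:
    metadata:
      labels:
        app: webapp
    spec:
      containers:
        - name: webapp
          image: your-dockerhub-user/webapp:latest
          ports:
            - containerPort: 80
```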

Then we create the service manifest to expose our deployment:
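A NodePort service is one simple way to make the deployment reachable from outside the cluster; the sketch below assumes the same `app: webapp` label as the example deployment above, and the names and ports are placeholders:

```yaml
# Example NodePort service -- routes external traffic to pods matching the selector
apiVersion: v1
kind: Service
metadata:
  name: webapp-service
spec:
  type: NodePort
  selector:
    app: webapp
  ports:
    - port: 80        # service port inside the cluster
      targetPort: 80  # container port on the pods
      nodePort: 30080 # externally reachable port on every node (30000-32767 range)
```

With a NodePort service, the app is reachable at `http://<any-node-ip>:30080`.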

Now, when your production job runs, it will fetch the manifests from this repo and use them to create the deployment and service.

After all the jobs have run successfully, when you check the result, you will see something like this:

FINAL OVERVIEW OF YOUR PIPELINE

Conclusion

By integrating Jenkins, Docker, and Kubernetes, we’ve established a robust DevOps pipeline that automates the entire software development lifecycle. From code changes to deployment, our streamlined workflow ensures rapid iteration and consistent delivery of high-quality software. Embrace automation and integration to propel your development processes to new heights of efficiency and agility.

Written by @Harsh

A DevOps engineer from India