🚀📅 Day 27 DevOps Challenge - Jenkins Declarative Pipeline with Docker Integration 🐳


Leveling Up Your Jenkins Pipeline with Docker 🛠️

Welcome to another exciting installment of our Jenkins journey! 🌟 In our previous blog post, we dived deep into the world of Declarative pipelines and learned how to structure our CI/CD workflows using the power of declarative syntax.

The Docker Advantage ✨

Before we dive into the nitty-gritty technical details, let's take a moment to understand why Docker is a game-changer for modern software development and deployment:

Consistency Across Environments ✔️

Docker containers ensure consistent runtime environments throughout various stages of your pipeline, from development to production. No more "it works on my machine" scenarios! 🙅‍♂️

Enhanced Isolation 🔐

By encapsulating applications and their dependencies within containers, Docker minimizes conflicts between different applications and versions, making software deployment a breeze. 🚀

Reproducible Environments 🔄

With Docker, reproducing the exact same environment on different machines becomes effortless. This guarantees consistent behavior and reduces the chances of unforeseen issues cropping up. 🌍

Efficiency and Resource Utilization 🚄

Docker containers are lightweight compared to traditional virtual machines, resulting in faster deployment times, optimized resource utilization, and greater scalability. ⚙️

Seamlessly Integrating Docker into Jenkins Declarative Pipelines 🌐

Let's roll up our sleeves and explore how we can seamlessly integrate Docker into our Jenkins pipeline using the intuitive declarative syntax. Our focus will be on two pivotal Docker commands: docker build and docker run. 💻
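If you have never run these two commands by hand, it is worth trying them once from a repository that contains a Dockerfile before wiring them into Jenkins. Here is a minimal sketch; the image name my-app and the container name are placeholders, not anything from the projects used later in this post:

# Build an image from the Dockerfile in the current directory ("." is the build context)
docker build -t my-app:latest .

# Start a container from that image in the background (detached mode)
docker run -d --name my-app my-app:latest

# Confirm the container is up
docker ps

If these commands work in a shell on the Jenkins agent (and the Jenkins user is allowed to talk to the Docker daemon, typically by being in the docker group), the very same calls will work inside the pipeline's sh steps.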

Defining the Stages 🛠️

Within our Declarative Pipeline, we'll establish distinct stages for building and running our Docker container. Here's a peek at how these stages will shape up:

pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                script {
                    // Build the Docker image
                    sh 'docker build -t trainwithshubham/django-app:latest .'
                }
            }
        }

        stage('Run') {
            steps {
                script {
                    // Run the Docker container
                    sh 'docker run -d trainwithshubham/django-app:latest'
                }
            }
        }
    }
}

Build Stage 🏗️

In the "Build" stage, we leverage the docker build command to assemble a Docker image. The -t flag helps us tag the image for easy reference, while the . denotes the build context – the directory housing the Dockerfile and any requisite resources. 🛠️
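To make the tag anatomy and the build context a little more concrete, here is a hedged sketch; the v2 tag and the docker/Dockerfile path in the second command are made-up examples, not part of the original project:

# -t takes the form <repository>/<image>:<tag>; the trailing "." sends the current directory as the build context
docker build -t trainwithshubham/django-app:latest .

# If the Dockerfile lives somewhere other than the context root, -f points at it while "." still defines the context
docker build -t trainwithshubham/django-app:v2 -f docker/Dockerfile .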

Run Stage 🏃‍♂️

Transitioning to the "Run" stage, we initiate a container using the image crafted in the prior stage. The -d flag launches the container in detached mode, freeing up the terminal. We identify the image by its tag, such as trainwithshubham/django-app:latest. 🚀
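Two practical notes on detached mode, sketched below; the container ID is a placeholder, and the 8000:8000 mapping is an assumption about the port the Django app listens on inside the container:

# -d detaches the container, so use docker logs when you want to see its output
docker logs -f <container-id>

# To reach the app from the host, publish a port as well (host:container)
docker run -d -p 8000:8000 trainwithshubham/django-app:latest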

Task-01: Create a Docker-Integrated Jenkins Declarative Pipeline 🐳

Follow these steps to create a Jenkins declarative pipeline that integrates Docker using the provided syntax:

  1. Open your Jenkins instance and navigate to the Jenkins dashboard.

  2. Click on "New Item" to create a new Jenkins pipeline job.

  3. Provide a name for your pipeline job and select "Pipeline" as the project type.

  4. Scroll down to the "Pipeline" section and choose "Pipeline script" from the Definition dropdown.

  5. Now, enter the following pipeline script into the script area:

     pipeline {
         agent any
         stages {
             stage('Code Clone') {
                 steps {
                     git url: 'https://github.com/AdarshJha-1/node-todo-cicd.git', branch: 'master'
                 }
             }
             stage('Build') {
                 steps {
                     sh 'docker build -t node-todo-cicd:latest .'
                 }
             }
             stage('Testing') {
                 steps {
                     echo 'Testing'
                     // You can add actual testing steps here
                 }
             }
             stage('Deploy') {
                 steps {
                     sh 'docker run -d -p 8000:8000 node-todo-cicd:latest'
                 }
             }
         }
     }
  6. Click on the "Save" button to save your pipeline configuration.

  7. Run the pipeline job by clicking on the "Build Now" button.

  8. The pipeline will clone the code, build the image, run the (placeholder) Testing stage, and deploy a Docker container running your application; a quick way to verify this is shown below.
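Once the build goes green, a quick sanity check on the Jenkins host confirms that the Deploy stage really started something; the port comes from the -p 8000:8000 mapping in the script above:

# The new container should show 0.0.0.0:8000->8000/tcp in the PORTS column
docker ps

# The node-todo app should answer on the published port
curl http://localhost:8000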

Running the job a second time, however, can fail because the container from the first run is still up and holding the published port.
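Concretely, the second run's Deploy stage tries to bind a port that the container from the first run still owns, so the console output typically ends with an error along these lines (exact wording may vary by Docker version):

docker run -d -p 8000:8000 node-todo-cicd:latest
# docker: Error response from daemon: driver failed programming external connectivity on endpoint ...:
# Bind for 0.0.0.0:8000 failed: port is already allocated.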

You can implement a solution to address this issue. Here's how you can do it:

Task-02: Docker-Integrated Jenkins Declarative Pipeline with Docker Compose Approach 🐳

In Task-01, we successfully created a Docker-integrated Jenkins declarative pipeline to build, test, and deploy our application. However, when rebuilding the project, a common issue emerges: the container from the previous run is still occupying the same port, leading to conflicts. To address this with a Docker Compose approach, without modifying the Docker Compose file itself, follow the steps below:

Solution:

  1. Modify Jenkins Declarative Pipeline:

    • Open your Jenkins pipeline job and navigate to the pipeline script section.

    • Update the "Deploy" stage to use Docker Compose commands for managing containers:

    pipeline {
        agent any
        stages {
            stage('Code Clone') {
                steps {
                    git url: 'https://github.com/AdarshJha-1/node-todo-cicd.git', branch: 'master'
                }
            }
            stage('Build') {
                steps {
                    sh 'docker build -t node-todo-cicd:latest .'
                }
            }
            stage('Testing') {
                steps {
                    echo 'Testing'
                    // You can add actual testing steps here
                }
            }
            stage('Deploy') {
                steps {
                    sh 'docker-compose down'   // Stop and remove existing containers
                    sh 'docker-compose up -d'  // Start new containers
                }
            }
        }
    }

  2. Save and Rebuild:

    • Click on the "Save" button to save your pipeline configuration.

    • Run the pipeline job by clicking on the "Build Now" button.

How it Works:

By employing Docker Compose's docker-compose down and docker-compose up commands within the pipeline, we ensure that existing containers are stopped and removed before launching new containers. This approach guarantees that the previously running container on the same port will be stopped, and the new container will take its place on the same port, preventing conflicts.
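You can reproduce the same sequence by hand on the Jenkins host to watch it work; this sketch assumes a docker-compose.yml sits at the root of the cloned workspace, which is also what the Deploy stage relies on:

docker-compose down     # stop and remove the containers (and default network) from the previous run
docker-compose up -d    # recreate them in the background, binding port 8000 cleanly this time
docker ps               # only the freshly started container should now be listed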

With these adjustments, you can confidently rebuild the project without encountering port-related errors caused by running containers on the same port.

Follow these steps to enhance your build and deployment workflow, ensuring that your containers operate smoothly and without conflicts.

Conclusion 🎉

Incorporating Docker into Jenkins pipelines has revolutionized software deployment! 🐳🚀 By delivering consistent environments, strong isolation, and efficient resource utilization, Docker offers a game-changing advantage. The synergy between Docker's capabilities and Jenkins' declarative pipelines makes integration as simple as a docker build and a docker run step, and Docker Compose keeps deployments smooth by heading off port conflicts. Keep exploring Docker's vast potential for networking and orchestration. With this newfound expertise, you're primed to conquer the dynamic landscape of CI/CD. Happy coding! 💻🎉
