DevSecOps Made Simple: A Step-by-Step Walkthrough (Part 1)

As organizations increasingly embrace DevSecOps, understanding how to seamlessly integrate security, quality, and collaboration tools into CI pipelines has become a vital skill for developers and security practitioners. However, for those just starting their DevSecOps journey, the landscape can feel overwhelming.

This blog is written keeping beginners in mind, providing a practical, high-level overview of integrating SonarQube, Snyk, and Jira into a Jenkins CI pipeline. While this setup may not represent the most advanced or secure implementation, the goal is to give you a hands-on understanding of the foundational steps involved in setting up a basic CI pipeline with integrated code quality, security scanning, and issue tracking.

There are numerous DevSecOps blogs available on the internet, but most didn't fully meet our expectations: we wanted an approach that emphasizes hands-on learning and experimentation in a local environment before moving on to cloud-based solutions. While we found one that seemed promising (you can find the link here; it's excellent), we (Irfan and I) ultimately decided to write our own blog to deepen our understanding and share a practical, accessible guide.

Whether you're a developer, a security enthusiast, or someone stepping into DevSecOps for the first time, this guide strives to be your starting point, bridging the gap between theory and practical application.

The pipeline was built on Ubuntu 22.04.5 LTS (Jammy) and uses OWASP Juice Shop, a deliberately vulnerable web application.

CI Pipeline Flow

Downloading and Installing Tools

Installing Jenkins

  1. Download the latest version of Jenkins from this link.
  2. Follow the installation and configuration instructions provided on the website.

Installing Snyk

Since npm will be required for managing project dependencies, we’ll use npm to install Snyk.

  1. Download and install Node.js (version 20.x or 22.x), as these versions are supported by the Juice Shop project. Use this link to download and configure Node.js.
  2. After installing Node.js, use the following command to install the Snyk CLI globally:
sudo npm install -g snyk

Installing SonarQube

  1. Download the latest SonarQube Community Edition from here.
  2. Start the server using the following command:
./sonarqube-<version>/bin/linux-x86-64/sonar.sh start
  3. Once SonarQube is running, access the tool at http://localhost:9000 and configure your account.
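SonarQube can take a minute to come up. A quick way to confirm it is ready, using the same wrapper script and the server's status endpoint (the path below assumes the default extracted layout):

```shell
# Confirm the server process is running
./sonarqube-<version>/bin/linux-x86-64/sonar.sh status

# A healthy instance answers with "status":"UP"
curl -s http://localhost:9000/api/system/status
```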

Installing TruffleHog

  1. Download TruffleHog from here. We used the Go-based installation method.
  2. Install pre-commit, as TruffleHog integrates with it. Use this link for installation and configuration instructions.
  3. Ensure TruffleHog is available on your PATH so the pre-commit hook can invoke it.
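A quick smoke test confirms the binary is reachable and can scan a local repository (the scan flags match the ones we use in the pre-commit hook later):

```shell
# Confirm the binary is on your PATH
trufflehog --version

# Optional smoke test: scan the current repo, reporting only secrets
# that verify against their provider
trufflehog git file://. --only-verified
```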

Installing Ngrok

  1. Download Ngrok from this link.
  2. Follow the installation guide provided on the page to configure it.

Generating API Keys

Since Jenkins needs to access Snyk and SonarQube, we will require an API key for each. Generate a Snyk token from your Snyk account settings and a SonarQube token under My Account → Security; keep both handy for the Jenkins configuration below.
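For SonarQube, the token can also be generated over the web API instead of the UI. A hedged sketch for a fresh local install (replace admin/admin with the credentials you set at first login; "jenkins-token" is just an illustrative name):

```shell
# Equivalent to My Account -> Security -> Generate Tokens in the UI.
# The response JSON contains the token value; copy it somewhere safe,
# as SonarQube will not show it again.
curl -s -u admin:admin -X POST \
  "http://localhost:9000/api/user_tokens/generate?name=jenkins-token"
```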

Configuring the tools

GitHub

  1. To get started, fork the original Juice Shop repository. You can find our forked repository here.
  2. Navigate to your forked repository, then go to Settings → Webhooks. Click the Add webhook button and configure the mandatory fields as shown in the screenshot below:
    • Run an Ngrok instance and copy the generated URL.
    • Add the URL to the Payload URL field and append the suffix /github-webhook/.
    Note: /github-webhook/ is the endpoint the Jenkins GitHub plugin listens on for incoming webhook events from GitHub.
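Putting the two pieces together, the Payload URL is simply the Ngrok forwarding URL plus the suffix (the subdomain below is made up; use the one printed by your own ngrok session):

```shell
# Illustrative only: substitute your real forwarding URL
NGROK_URL="https://a1b2-203-0-113-7.ngrok-free.app"
PAYLOAD_URL="${NGROK_URL}/github-webhook/"
echo "$PAYLOAD_URL"   # https://a1b2-203-0-113-7.ngrok-free.app/github-webhook/
```

Note the trailing slash: Jenkins expects it, and GitHub will report delivery failures without it.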

Jenkins

  1. Install the following plugin:

    SonarQube Scanner for Jenkins

  2. To configure the credentials (API keys), navigate to Dashboard → Manage Jenkins → Credentials → System → Global credentials. Add new credentials with the type Secret text for SonarQube. The secret value should be the token you created earlier.
  3. Repeat step 2 for Snyk, using the API key you generated earlier.
  4. Paste the SonarQube token under the Secret parameter and make a note of the ID, as it will be required when configuring the environment.
  5. Navigate to Dashboard → Manage Jenkins → System.
  6. Under SonarQube servers, check the Environment variables option.
  7. Add a custom Name and select the credentials ID under Server authentication token.
  8. Navigate to Dashboard → Manage Jenkins → Tools. Select the SonarQube Scanner installations drop-down option.
  9. Add a custom Name and specify the latest version of the scanner. Make a note of the Name used, as it will be needed for configuration later.
  10. Navigate to Dashboard → New Item.
  11. Enter the project name and select Pipeline as the item type.
  12. Check "Poll SCM" and "GitHub hook trigger for GITScm polling" under Build Triggers.
  13. Select "Pipeline script from SCM" under the Pipeline options.
  14. The Jenkins script will be hosted on GitHub. Under Script Path, specify the script file name. For now, let's use Jenkinsfile.

SonarQube

  1. To create a project on SonarQube, select "Create a local project."
  2. Name your project. In this case, the project name is "Juice-Shop-Github." After this, choose the global setting option and create the project.
  3. When prompted with "How do you want to analyze your repository?", select "With Jenkins."
  4. Choose GitHub as your DevOps platform. The tool will now display the process of creating a pipeline job on Jenkins, which we have already set up.
  5. The tool will show the prerequisites required to scan the project. We will follow these steps shortly.

Jenkins Pipeline Script

The following Jenkins script was used in this pipeline:

In the code snippet below, credentialsId refers to the secrets stored in Jenkins, which were created earlier.

I guess I should've paid more attention to coding in my classes. 😅 Luckily, ChatGPT was there to save the day and make my code as clear as a sunny sky.

node {
    stage('Git SCM') { // Check out the source code from the repository configured in the Jenkins job.
        checkout scm
    }

    stage('Installing project dependencies and running a dependency scan using Snyk') {
        withCredentials([string(credentialsId: 'token', variable: 'SNYK_TOKEN')]) { // Retrieve the Snyk token using the credential ID "token" stored in Jenkins.
            sh "npm install" // Install the project dependencies.

            sh '''
                snyk auth $SNYK_TOKEN > /dev/null 2>&1 # Authenticate Snyk using the token. Output is redirected to /dev/null to keep sensitive information out of the logs.
                snyk monitor --org=a16b592a-4677-4b3c-ab8a-b034d23d37db # Monitor project dependencies for vulnerabilities. The --org flag specifies the Snyk organization, required if multiple organizations exist.
            '''
        }
    }

    stage('Source Code Analysis using SonarQube') {
        withCredentials([string(credentialsId: 'sonarqube', variable: 'envsonar')]) { // Retrieve the SonarQube token using the credential ID "sonarqube" stored in Jenkins.
            def scannerHome = tool(name: 'SonarQube', type: 'hudson.plugins.sonar.SonarRunnerInstallation') // Locate the SonarQube Scanner installation configured under Manage Jenkins -> Tools.
            withSonarQubeEnv('envsonar') { // Inject the server URL and token for the SonarQube server named "envsonar" in Manage Jenkins -> System.
                sh "${scannerHome}/bin/sonar-scanner" // Execute the SonarQube analysis.
            }
        }
    }
}

We need to create the following two files in the repository:

  1. Jenkinsfile (remember, we used this file name when creating the Jenkins project). This file will contain our Jenkins script. 🙂 Note: comments outside the sh blocks must use Groovy's // syntax; # comments are only valid inside the shell snippets.
  2. sonar-project.properties (this is required by SonarQube to identify the project)
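A minimal sonar-project.properties, assuming the project key matches the "Juice-Shop-Github" project created in SonarQube earlier (the exclusions are illustrative; trim them to suit your repository):

```properties
# Must match the project key defined in SonarQube
sonar.projectKey=Juice-Shop-Github
sonar.projectName=Juice-Shop-Github

# Scan the whole repository, skipping dependency and build output
sonar.sources=.
sonar.exclusions=node_modules/**,dist/**
```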

Configure TruffleHog and Git Locally

  1. Clone the remote repository to your local machine. If SSH is configured with Git, pulling and pushing files to the repository will be both easy and secure. If you are starting from an empty directory instead, initialize it first:
git init
  2. Create a .pre-commit-config.yaml file with the following content. The --fail flag makes TruffleHog exit with code 183 when verified credentials are detected, which aborts the commit.
repos:
  - repo: local
    hooks:
      - id: trufflehog
        name: TruffleHog
        description: Detect secrets in your data.
        entry: bash -c 'trufflehog git file://. --since-commit HEAD --only-verified --fail'
        language: system
        stages: ["pre-commit", "pre-push"]
  3. Run pre-commit install to install the Git hook script.
  4. We have created a sensitive file named "keys" that contains dummy keys.
  5. Add the file using git add keys and commit with git commit -m "<some message>". You'll notice the commit fails because the tool detected secrets in the "keys" file.
  6. For demonstration purposes, let's remove the --fail flag so the commit goes through despite the findings.
  7. On the next push, with sensitive keys explicitly added to the file, GitHub Push Protection detects them and rejects the push.
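The local part of the demo can be reproduced with a throwaway file. The values below are AWS's own documentation examples, not live credentials (note that a hook running with --only-verified will only block secrets that actually verify against their provider, so a genuinely dead dummy key may pass):

```shell
# Create a dummy "keys" file with AWS-style values
cat > keys <<'EOF'
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFhimE/K7MDENG/bPxRfiCYEXAMPLEKEY
EOF

git add keys
git commit -m "add keys file"   # the pre-commit hook runs TruffleHog here
```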
Isn’t it great? Git is catching those keys faster than we can push our regrets. 😎

Let’s access the “Allow the secret” link to bypass this secret detection.

Running the Pipeline

  1. Once the commit is successful, push the code to the remote repository using git push.
  2. GitHub webhooks will verify successful delivery. When GitHub receives a push, it triggers the webhook to the Jenkins URL, which initiates the build process.

It says, "Started by GitHub push by <git username>"

  3. Once the build completes successfully, you'll see the results pushed to the SonarQube and Snyk dashboards.
SonarQube
Snyk

Integrating JIRA with SonarQube 😇

  1. Create an account on Jira and install the "SonarQube Connector for Jira" plugin.
  2. Create a Jira project and navigate to Project Settings → Apps → SonarQube.
  3. Enter the following details:
  4. Navigate to the project and check the results under the SonarQube tab. You can further create labels and assign tickets for the vulnerabilities detected.

Conclusion

In this blog, we explored the implementation of the CI (Continuous Integration) part of our pipeline. In the next blog, we’ll dive into the CD (Continuous Deployment) process, where we’ll build and deploy the application. We’ll also step up our game by incorporating Burp Suite or OWASP ZAP as our automated application scanner—because who doesn’t love a bit of extra security sprinkled in? Stay tuned!