Certified Jenkins Engineer

Kubernetes and GitOps

Demo Publish Reports to AWS S3

In this guide, you’ll learn how to upload your Jenkins pipeline’s test, coverage, and security reports to an Amazon S3 bucket. This approach centralizes all your build artifacts in S3 for easy sharing and long-term storage.

Table of Contents

  1. Inspecting the Jenkins Workspace
  2. Creating the S3 Bucket
  3. Configuring IAM and Jenkins Credentials
  4. Installing the Pipeline: AWS Steps Plugin
  5. Generating an S3 Upload Snippet
  6. Adding the Upload Stage to the Jenkinsfile
  7. Authenticating with AWS in the Pipeline
  8. Running the Pipeline
  9. Reviewing the Console Output
  10. Verifying Artifacts in S3
  11. Links and References

Inspecting the Jenkins Workspace

First, browse your Jenkins workspace via the Classic UI to verify all generated reports are present:

The image shows a Jenkins workspace interface displaying a list of files and directories with their sizes and timestamps.

Common reports and locations:

Report Type        | Directory / File Pattern
-------------------|------------------------------------
Code coverage      | coverage/
Dependency-Check   | dependency-check-report.html, etc.
Unit test results  | test-results.xml
Container scans    | trivy*.*
OWASP ZAP scans    | zap*.*
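
If you prefer the terminal to the Classic UI, a quick listing from the job's workspace on the Jenkins agent confirms the same files (the workspace path below is an assumption; adjust it to your job name):

# On the Jenkins agent: list the reports from the job's workspace
cd /var/lib/jenkins/workspace/<your-job-name>
ls -ld coverage/
ls -l dependency-check-report.html test-results.xml trivy*.* zap*.*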

Creating the S3 Bucket

In the AWS S3 console, create a new bucket (e.g., solar-system-jenkins-reports-bucket) in the US East (Ohio) region (us-east-2), the same region the pipeline will use. This bucket will house all your Jenkins reports:

The image shows an AWS S3 console with a list of general purpose buckets, including one named "solar-system-jenkins-reports-bucket" in the US East (Ohio) region.
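
If you prefer the AWS CLI over the console, a bucket in us-east-2 can be created with a command along these lines (the bucket name matches this demo; your CLI must already be configured with suitable credentials):

# Create the reports bucket in US East (Ohio) / us-east-2
aws s3api create-bucket \
  --bucket solar-system-jenkins-reports-bucket \
  --region us-east-2 \
  --create-bucket-configuration LocationConstraint=us-east-2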


Configuring IAM and Jenkins Credentials

  1. In AWS IAM, create or select a user with the AmazonS3FullAccess policy.
  2. In Jenkins, go to Credentials and add a new AWS Credentials entry. Set the ID to aws-s3-ec2-lambda-creds.

The image shows an AWS Identity and Access Management (IAM) console screen, displaying the permissions policies for a user, including AmazonEC2FullAccess and AmazonS3FullAccess.

The image shows a Jenkins dashboard displaying a list of stored credentials, including IDs and names for various systems and services.
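
The IAM side can also be scripted. A minimal AWS CLI sketch, assuming a dedicated CI user named jenkins-ci (the user name is illustrative, and AmazonS3FullAccess is broader than production normally needs):

# Create a CI user, grant S3 access, and generate keys for the Jenkins credential
aws iam create-user --user-name jenkins-ci
aws iam attach-user-policy \
  --user-name jenkins-ci \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name jenkins-ci

Copy the returned access key ID and secret into the aws-s3-ec2-lambda-creds credential entry in Jenkins.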

Warning

Do not hard-code AWS keys in your Jenkinsfile. Always use Jenkins Credentials and the withAWS wrapper.


Installing the Pipeline: AWS Steps Plugin

Install Pipeline: AWS Steps via Manage Jenkins → Manage Plugins. This plugin provides the s3Upload and withAWS steps you’ll need.

The image shows a webpage for the Jenkins "Pipeline: AWS Steps" plugin, detailing its features, version information, and installation statistics. It includes links to documentation, GitHub, and other resources.
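
If you manage plugins from the command line rather than the UI, the same plugin (short name pipeline-aws) can be installed with the Jenkins CLI; the controller URL and credentials below are placeholders:

# Install the Pipeline: AWS Steps plugin and restart Jenkins
java -jar jenkins-cli.jar -s http://<jenkins-url>:8080/ -auth <user>:<api-token> \
  install-plugin pipeline-aws -restart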


Generating an S3 Upload Snippet

Use Jenkins’s Snippet Generator to preview the s3Upload syntax and options:

The image shows a Jenkins interface with a "Snippet Generator" for creating pipeline scripts, displaying various options for script steps.
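
The generator emits a single step call that you can paste into your pipeline. Filled in with this demo's bucket, it looks roughly like the following (the file and path values are illustrative); wrap it in withAWS, as shown in the next sections, so it has credentials:

s3Upload(file: 'reports-1', bucket: 'solar-system-jenkins-reports-bucket', path: 'jenkins-1/')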


Adding the Upload Stage to the Jenkinsfile

Add a new stage named Upload - AWS S3 that runs only on pull request (PR*) branches. It will:

  1. Create a reports-$BUILD_ID directory
  2. Copy all relevant reports into it
  3. Upload the folder to your S3 bucket

stage('Upload - AWS S3') {
  when {
    branch 'PR*'
  }
  steps {
    withAWS(credentials: 'aws-s3-ec2-lambda-creds', region: 'us-east-2') {
      sh '''
        ls -ltr
        mkdir reports-$BUILD_ID
        cp -rf coverage/ reports-$BUILD_ID/
        cp dependency* test-results.xml trivy*.* zap*.* reports-$BUILD_ID/
        ls -ltr reports-$BUILD_ID/
      '''
      s3Upload(
        file: "reports-$BUILD_ID",
        bucket: 'solar-system-jenkins-reports-bucket',
        path: "jenkins-$BUILD_ID/"
      )
    }
  }
}

Note

Use double quotes for Groovy string interpolation ("reports-$BUILD_ID").
You can adjust path: to organize reports by job, branch, or date.
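
For example, to key the upload path by branch and build in a multibranch job like this one (a sketch; only the path argument changes):

s3Upload(
  file: "reports-$BUILD_ID",
  bucket: 'solar-system-jenkins-reports-bucket',
  path: "${env.BRANCH_NAME}/reports-${env.BUILD_ID}/"
)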


Authenticating with AWS in the Pipeline

The withAWS step injects your IAM credentials and region into the build. Generate this snippet in the Snippet Generator by searching for withAWS.

The image shows the AWS Identity and Access Management (IAM) console, with a search for "S3" displaying related services and features. The right side of the screen shows user details and permissions settings.
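
In its generated form, the wrapper is simply a block that scopes the credentials and region around whatever AWS steps it contains; with this demo's values it looks like this:

withAWS(credentials: 'aws-s3-ec2-lambda-creds', region: 'us-east-2') {
    // s3Upload and any other AWS steps placed here use these credentials
}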


Running the Pipeline

Commit and push your updated Jenkinsfile to trigger a build. The Upload - AWS S3 stage should appear and complete successfully:

The image shows a Jenkins pipeline interface for a project named "solar-system" under "Gitea-Organization," displaying various stages of a build and deployment process, all marked as completed. The interface includes details about a pull request and deployment status.
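
If you are following along, a typical sequence is to push the Jenkinsfile change on a feature branch and open a pull request in Gitea so that a PR* job is created (branch name and commit message below are illustrative):

git checkout -b feature/publish-reports-to-s3
git add Jenkinsfile
git commit -m "Add Upload - AWS S3 stage"
git push origin feature/publish-reports-to-s3
# then open a pull request in Gitea; Jenkins builds the resulting PR-* job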


Reviewing the Console Output

Inspect the logs to verify the file listing, directory creation, copy commands, and S3 upload progress:

$ ls -ltr
...
# mkdir reports-6
# cp -rf coverage/ reports-6/
# cp dependency* test-results.xml trivy*.* zap*.* reports-6/
Uploading file:/var/lib/jenkins/workspace/.../reports-6/ to s3://solar-system-jenkins-reports-bucket/jenkins-6/
Finished: Uploading to solar-system-jenkins-reports-bucket/jenkins-6/test-results.xml
Finished: Uploading to solar-system-jenkins-reports-bucket/jenkins-6/trivy-image-CRITICAL-results.html
...

Verifying Artifacts in S3

Head back to the S3 console and navigate into your bucket. You should see a jenkins-<build_id>/ folder with all your copied reports.
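
You can also verify the upload from the CLI (build number 6 matches the console output above; substitute your own build ID):

aws s3 ls s3://solar-system-jenkins-reports-bucket/jenkins-6/ --recursive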

That’s it! You’ve successfully configured your Jenkins pipeline to publish test, coverage, and security reports to Amazon S3.

