How to Create a CI/CD Pipeline With Jenkins, Containers, and Amazon ECS | by Alfonso Valdes Carrales | Jun, 2022

Learn the step-by-step process of deploying a sample Node.js containerized application

Image by author

If you're still building and delivering your software applications the traditional way, you're missing out on a major innovation in the software development process, or software development lifecycle.

To show you what I'm talking about, in this article I'll share how to create a CI/CD pipeline with Jenkins, containers, and Amazon ECS that deploys your application and overcomes the limitations of the traditional software delivery model.

This innovation greatly impacts deadlines, time to market, product quality, and more. I'll take you through the entire step-by-step process of setting up a CI/CD Docker pipeline for a sample Node.js application.

A CI/CD (Continuous Integration/Continuous Delivery) pipeline is a set of instructions that automates software testing, building, and deployment. Here are a few benefits of implementing CI/CD in your organization.

  1. Smaller Code Changes
    Because CI/CD pipelines allow a small piece of code to be integrated at a time, developers can recognize potential problems before too much work has been done.
  2. Faster Delivery
    Multiple daily releases, or even continual releases, become realistic with CI/CD pipelines.
  3. Observability
    Automation that generates extensive logs at each stage of the development process makes it easier to understand when something goes wrong.
  4. Easier Rollbacks
    There is always a chance that deployed code has issues. In such cases, it is very important to get back to the previous working release as soon as possible, and one of the biggest advantages of CI/CD pipelines is that you can roll back quickly and easily.
  5. Reduced Costs
    Automating repetitive tasks frees up developer and operations time that can be spent on product development.

These are just a few benefits of having CI/CD pipelines for builds and deployments. In this video, you can continue learning about the advantages of CI/CD and why you should use it in the first place.

Now, before we proceed with the steps to set up a CI/CD pipeline with Jenkins, containers, and Amazon ECS, let's briefly look at the tools and technologies we will be using.

  1. GitHub
    A web-based, cloud-hosted service where developers collaborate on, store, and manage their application code using Git. We will create and store our sample Node.js application code here.
  2. AWS EC2 Instance
    AWS EC2 (Elastic Compute Cloud) is a service provided by Amazon Web Services for creating virtual machines, or instances, in the AWS Cloud. We will create an EC2 instance and install Jenkins and other dependencies on it.
  3. Java
    Required to run the Jenkins server.
  4. AWS CLI
    aws-cli, the AWS Command Line Interface, is a command-line tool for managing AWS services. We will use it to manage the AWS ECS task and ECS service.
  5. Node.js and npm
    Node.js is a backend JavaScript runtime environment, and npm is its package manager. We will be creating a CI/CD Docker pipeline for a Node.js application.
  6. Docker
    Docker is an open-source containerization platform for building, shipping, and running applications. We will use it to build Docker images of our sample Node.js application and push/pull them to/from AWS ECR.
  7. Jenkins
    Jenkins is an open-source, freely available automation server used to build, test, and deploy software applications. We will create our CI/CD Docker pipeline in Jenkins to build, test, and deploy our Node.js application on AWS ECS.
  8. AWS ECR
    AWS Elastic Container Registry is a fully managed Docker image registry for easily storing, sharing, and deploying container images. We will use it to store the Docker images of our sample Node.js application.
  9. AWS ECS
    AWS Elastic Container Service is a fully managed container orchestration service for easily deploying, managing, and scaling containerized applications. We will use it to host our sample Node.js application.

Also read: CI/CD Pipeline and workflow on AWS, Kubernetes, and Docker

This is what our architecture will look like after setting up the CI/CD pipeline with Docker.

After the CI/CD Docker pipeline is successfully set up, every commit we push to the GitHub repository will cause the GitHub webhook to trigger the CI/CD pipeline on the Jenkins server. Jenkins will then pull the latest code, run unit tests, build a Docker image, and push it to AWS ECR. After the image is pushed to AWS ECR, Jenkins deploys the same image to AWS ECS.
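Conceptually, the work Jenkins does at each stage looks like the sketch below. Every name and URL in it is a placeholder (the real values come from your Jenkinsfile), and the cloud-touching commands are commented out since they only work on a configured server:

```shell
# Sketch of what the pipeline effectively runs (all names/URLs are placeholders).
ECR_URL="<aws-account-id>.dkr.ecr.<region>.amazonaws.com/demo-nodejs-app"
BUILD_ID="6"                        # Jenkins supplies this per build
IMAGE="${ECR_URL}:${BUILD_ID}"      # the image tag is the build number

# 1. Pull the latest code (done by the SCM checkout step in Jenkins)
# git clone https://github.com/<your-user>/demo-nodejs-app.git

# 2. Run unit tests
# npm install && npm test

# 3. Build the Docker image and push it to ECR
# docker build -t "$IMAGE" .
# docker push "$IMAGE"

# 4. Deploy the new image to ECS
# aws ecs update-service --cluster default --service <service-name> --force-new-deployment

echo "would deploy ${IMAGE}"
```

The only real logic here is the image naming convention: ECR URL plus the Jenkins build ID as the tag, which is also how we will later verify a deployment.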

A CI/CD workflow lets us focus on development while it carries out tests, builds, and deployments automatically.

  1. Continuous Integration
    Allows developers to push code to a version control or source code management system, builds and tests the latest pushed code, and generates and stores artifacts.
  2. Continuous Delivery
    The process that lets us deploy the tested code to production whenever required.
  3. Continuous Deployment
    Goes one step further and releases every single change to the customer-facing system, without any manual intervention, whenever the production pipeline passes all its tests.

The primary goal of an automated CI/CD pipeline is to build the latest code and deploy it. The stages can vary as per the need; the most common ones are listed below:

  1. Trigger
    The pipeline can run on a specified schedule, be executed manually, or be triggered automatically by a particular action in the code repository.
  2. Code Pull
    In this phase, the pipeline pulls the latest code whenever it is triggered.
  3. Unit Tests
    In this phase, the pipeline runs the tests present in the codebase; these are also referred to as unit tests.
  4. Build or Package
    Once all the tests pass, the pipeline moves forward and builds artifacts, or Docker images in the case of dockerized applications.
  5. Push or Store
    In this phase, the built artifact is pushed to an artifact repository, or a Docker registry in the case of dockerized applications.
  6. Acceptance Tests
    This stage of the pipeline validates that the software behaves as intended. It is a way to ensure that the application does what it is meant to do.
  7. Deploy
    This is the final stage in any CI/CD pipeline. In this stage, the application is delivered or deployed.

A deployment strategy is the way in which containers of the microservices are taken down and brought up. Several options are available; however, we will only discuss the ones supported by ECS.

In rolling updates, the scheduler in the ECS service replaces the currently running tasks with new ones. The tasks in the ECS cluster are nothing but running containers created from the task definition. The deployment configuration controls the number of tasks that Amazon ECS adds to or removes from the service.

The lower and upper limits on the number of running tasks are controlled by minimumHealthyPercent and maximumPercent, respectively.

  1. minimumHealthyPercent example: If the value of minimumHealthyPercent is 50 and the desired task count is four, the scheduler can stop two existing tasks before starting two new tasks.
  2. maximumPercent example: If the value of maximumPercent is 200 and the desired task count is four, the scheduler can start four new tasks before stopping four existing tasks.
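These two limits can also be set on an existing service with the AWS CLI. The call below is a sketch with placeholder names (it needs a configured AWS CLI, so it is commented out); the arithmetic after it reproduces the two examples above:

```shell
# Set rolling-update limits on an ECS service (placeholder names):
# aws ecs update-service \
#   --cluster default \
#   --service <service-name> \
#   --deployment-configuration minimumHealthyPercent=50,maximumPercent=200

# The arithmetic behind the two examples above, with a desired count of 4:
desired=4
min_running=$((desired * 50 / 100))    # tasks that must stay healthy -> 2
max_running=$((desired * 200 / 100))   # upper bound on running tasks -> 8
echo "may stop $((desired - min_running)) tasks first, start $((max_running - desired)) new tasks first"
# prints: may stop 2 tasks first, start 4 new tasks first
```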

If you want to learn more about this, visit the official documentation here.

The blue/green deployment strategy lets the developer verify a new deployment before sending traffic to it, by installing an updated version of the application as a new replacement task set.

There are mainly three ways in which traffic can shift during a blue/green deployment.

  1. Canary — Traffic is shifted in two increments: the percentage of traffic shifted to your updated task set in the first increment, and the interval, in minutes, before the remaining traffic is shifted in the second increment.
  2. Linear — Traffic is shifted in equal increments: the percentage of traffic shifted in each increment, and the number of minutes between increments.
  3. All-at-once — All traffic is shifted from the original task set to the updated task set at once.

To learn more about this, visit the official documentation here.

Of these two strategies, we will use the rolling-update deployment strategy in our demo application.

Now, let's get started and get our hands dirty.

The Dockerfile for the sample Node.js application is as follows. There is no need to copy-paste this file; it is already available in the sample Git repository that you cloned previously.

Let's try to understand the instructions in our Dockerfile.

  1. FROM node:12.18.4-alpine
    This will be the base image for the container.
  2. WORKDIR /app
    This will be set as the working directory in the container.
  3. ENV PATH /app/node_modules/.bin:$PATH
    The PATH variable is prefixed with /app/node_modules/.bin.
  4. COPY package.json ./
    package.json will be copied into the working directory of the container.
  5. RUN npm install
    Install the dependencies.
  6. COPY . ./
    Copy files and folders from the host machine into the container.
  7. EXPOSE 3000
    Expose port 3000 of the container.
  8. CMD ["node", "./src/server.js"]
    Start the application.

This is the Dockerfile that we will use to create a Docker image.
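Assembled from the instruction list above, the Dockerfile looks like this (the copy in the sample repository is the source of truth):

```dockerfile
# Base image for the container
FROM node:12.18.4-alpine

# Working directory inside the container
WORKDIR /app

# Make locally installed binaries available on the PATH
ENV PATH /app/node_modules/.bin:$PATH

# Install dependencies first to take advantage of layer caching
COPY package.json ./
RUN npm install

# Copy the application source
COPY . ./

# The application listens on port 3000
EXPOSE 3000

# Start the application
CMD ["node", "./src/server.js"]
```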

  1. Go to GitHub; create an account if you don't already have one, or log in to your account, and create a new repository. You can name it as per your choice; however, I would recommend using the same name to avoid any confusion.

2. You will get a screen like the following. Copy the repository URL and keep it handy. We'll call this the GitHub repository URL; note it down in the text file on your system.

Note: Create a new text file on your system and note down all the details that will be required later.

The token will be required for authentication purposes. It will be used instead of a password for Git over HTTPS, or can be used to authenticate to the API over basic authentication.

  1. Click on the user icon in the top-right corner, go to "Settings," then click on the "Developer settings" option in the left panel.

2. Click on the "Personal access tokens" option and then "Generate new token" to create a new token.

3. Tick the "repo" checkbox; the token will then have "full control of private repositories."

4. You should see your token created now.
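For reference, here is how the token is used in place of a password for Git over HTTPS. The username, token, and repository name below are all placeholders:

```shell
# Compose an HTTPS clone URL that embeds the token (placeholders throughout).
GH_USER="<github-username>"
GH_TOKEN="<personal-access-token>"
CLONE_URL="https://${GH_USER}:${GH_TOKEN}@github.com/${GH_USER}/demo-nodejs-app.git"
echo "$CLONE_URL"
# git clone "$CLONE_URL"   # the embedded token authenticates the clone
```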

  1. Check your present working directory.

Note: You are in the home directory, i.e., /home/ubuntu.

  1. Clone my sample repository containing all the required code.
    git clone
  2. Create a new repository. This repository will be used for the CI/CD pipeline setup.
    git clone
  3. Copy all the code from my nodejs repository into the newly created demo-nodejs-app repository.
    cp -r nodejs/* demo-nodejs-app/
  4. Change your working directory.
    cd demo-nodejs-app/

Note: For the rest of the article, do not change your directory. Stay in the same directory (here, /home/ubuntu/demo-nodejs-app/) and execute all the commands from there.

  1. ls -l
  2. git status
  1. Check your present working directory; it should be the same. Here it is /home/ubuntu/demo-nodejs-app/
  2. Set a username for your Git commit messages.
    git config user.name "Rahul"
  3. Set an email for your Git commit messages.
    git config user.email "<>"
  4. Verify the username and email you set.
    git config --list
  5. Check the status to see files that have been changed or added to your Git repository.
    git status
  6. Add files to the Git staging area.
    git add
  7. Check the status to see files that have been added to the Git staging area.
    git status
  8. Commit your files with a commit message.
    git commit -m "My first commit"
  9. Push the commit to your remote Git repository.
    git push
  1. Create an IAM user with programmatic access in your AWS account and note down the access key and secret key in your text file for future reference. Give the user administrator permissions.
    We don't strictly need admin access; however, to avoid permission issues, and for the sake of the demo, let's proceed with administrator access.
  1. Create an ECR repository in your AWS account and note its URL in your text file for future reference.
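If you prefer the CLI over the console, the repository can be created as sketched below. The repository name and region are choices made for this demo, and the URL you need to note down follows a predictable pattern:

```shell
# Create the repository (requires a configured AWS CLI, so commented out here):
# aws ecr create-repository --repository-name demo-nodejs-app --region us-east-1

# The repository URL to note down follows this pattern:
ACCOUNT_ID="<aws-account-id>"   # placeholder
REGION="us-east-1"              # assumption for this sketch
ECR_URL="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/demo-nodejs-app"
echo "$ECR_URL"
```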
  1. Go to the ECS console and click on "Get Started" to create a cluster.

2. Click on the "Configure" button available in the "custom" option under "Container definition."

3. Give the container the name "nodejs-container," enter the ECR repository URL in the "Image" text box and port "3000" in the Port mappings section, and then click on the "Update" button. You can give the container any name of your choice.

4. You can now see the details you specified under "Container definition." Click on the "Next" button to proceed.

5. Select "Application Load Balancer" under "Define your service" and then click on the "Next" button.

6. Keep the cluster name as "default" and proceed by clicking on the "Next" button. You can change the cluster name if you want.

7. Review the configuration; it should look as follows. If the configuration matches, click on the "Create" button. This will initiate the ECS cluster creation.

8. After a few minutes, you should have your ECS cluster created, and the Launch Status should look something like the following.

Create an EC2 instance for setting up the Jenkins server

  1. Create an EC2 instance with the Ubuntu 18.04 AMI, and in its security group open port 22 to your IP and port 8080 to the public. Port 22 will be required for SSH access and 8080 for accessing the Jenkins server. Port 8080 is where the GitHub webhook will try to connect to the Jenkins server, hence we need to allow it publicly.

After the instance is available, let's install the Jenkins server on it along with all the dependencies.

  1. Verify that the OS is Ubuntu 18.04 LTS.
  2. Check the RAM; a minimum of 2 GB is what we require.
    free -m
  3. The user you log in to the server with should have sudo privileges. "ubuntu" is the user with sudo privileges on EC2 instances created from the "Ubuntu 18.04 LTS" AMI.
  4. Check your present working directory; it will be your home directory.
  1. Update your system by downloading package information from all configured sources.
    sudo apt update
  2. Search for and install Java 11.
    sudo apt search openjdk
    sudo apt install openjdk-11-jdk
  3. Install the jq command, the JSON processor.
    sudo apt install jq
  4. Install Node.js 12 and npm.
    curl -sL | sudo -E bash -
    sudo apt install nodejs
  5. Install the AWS CLI tool.
    sudo apt install awscli
  6. Check the Java version.
    java --version
  7. Check the jq version.
    jq --version
  8. Check the Node.js version.
    node --version
  9. Check the npm version.
    npm --version
  10. Check the AWS CLI version.
    aws --version

Note: Make sure all your versions match the versions seen in the above image.

  1. Jenkins can be installed from the Debian repository.
    wget -q -O - | sudo apt-key add -
    sudo sh -c 'echo deb binary/ > /etc/apt/sources.list.d/jenkins.list'
  2. Update the apt package index.
    sudo apt-get update
  3. Install Jenkins on the machine.
    sudo apt-get install jenkins
  4. Check whether the service is running or not.
    service jenkins status
  5. You should have Jenkins up and running now. You may refer to the official documentation here if you face any issues with the installation.
  1. Install packages to allow apt to use a repository over HTTPS:
    sudo apt-get install apt-transport-https ca-certificates curl gnupg lsb-release
  2. Add Docker's official GPG key:
    curl -fsSL | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
  3. Set up the stable repository.
    echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
  4. Update the apt package index.
    sudo apt-get update
  5. Install the latest version of Docker Engine and containerd.
    sudo apt-get install docker-ce docker-ce-cli containerd.io
  6. Check the Docker version.
    docker --version
  7. Create a "docker" group; this may already exist.
    sudo groupadd docker
  8. Add the "ubuntu" user to the "docker" group.
    sudo usermod -aG docker ubuntu
  9. Add the "jenkins" user to the "docker" group.
    sudo usermod -aG docker jenkins
  10. Test whether you can create Docker objects as the "ubuntu" user.
    docker run hello-world
  11. Switch to the "root" user.
    sudo -i
  12. Switch to the "jenkins" user.
    su jenkins
  13. Test whether you can create Docker objects as the "jenkins" user.
    docker run hello-world
  14. Exit from the "jenkins" user.
  15. Exit from the "root" user.
  16. You should now be back at the "ubuntu" user. You may refer to the official documentation here if you face any issues with the installation.
  1. After Jenkins has been installed, the first step is to extract its initial admin password.
    sudo cat /var/lib/jenkins/secrets/initialAdminPassword

2. Hit the URL in the browser.
Jenkins URL: http://<public-ip-of-the-ec2-instance>:8080

3. Select the "Install suggested plugins" option.

4. Specify a username and password for the new admin user to be created. You can use this user as the admin user.

5. The URL field will be auto-filled; click on the "Save and Finish" button to proceed.

6. Your Jenkins server is ready now.

7. Here is what its dashboard looks like:

  1. Let's install all the plugins that we will need. Click on "Manage Jenkins" in the left panel.

2. Here is the list of plugins that we need to install:

  1. CloudBees AWS Credentials:
    Allows storing Amazon IAM credentials and keys within the Jenkins Credentials API.
  2. Docker Pipeline:
    This plugin enables building, testing, and using Docker images from a Jenkins Pipeline.
  3. Amazon ECR:
    This plugin provides integration with AWS Elastic Container Registry (ECR).
  4. AWS Steps:
    This plugin adds Jenkins pipeline steps to interact with the AWS API.
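If you'd rather script this step, the same plugins can be installed with the Jenkins CLI. The plugin IDs below are my best mapping of the four names above; verify them against the Jenkins plugin index before running, and note the actual call is commented out because it needs jenkins-cli.jar and admin credentials:

```shell
# Assumed plugin IDs for the four plugins listed above.
PLUGINS="aws-credentials docker-workflow amazon-ecr pipeline-aws"
# java -jar jenkins-cli.jar -s http://localhost:8080/ -auth admin:<password> \
#   install-plugin $PLUGINS -restart
echo "plugins to install: $PLUGINS"
```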

3. In the "Available" tab, search for all these plugins and click on "Install without restart."

4. You will see a screen like the following after the plugins have been installed successfully.

  1. The CloudBees AWS Credentials plugin comes to the rescue here. Go to "Manage Jenkins," and then click on "Manage Credentials."

2. Click on "(global)," then "Add credentials."

3. Select Kind as "AWS Credentials" and provide the ID "demo-admin-user". The ID can be anything of your choice; keep a note of it in the text file. Specify the access key and secret key of the IAM user we created in the earlier steps.

Click on "OK" to store the IAM credentials.

4. Follow the same steps, and this time select Kind as "Username with password" to store the GitHub username and the token we created earlier.

Click on "OK" to store the GitHub credentials.

5. You should now have IAM and GitHub credentials in your Jenkins.

  1. Go to the main dashboard and click on "New Item" to create a Jenkins pipeline.

2. Select "Pipeline" and name it "demo-job," or provide a name of your choice.

3. Tick the "GitHub project" checkbox under the "General" tab and provide the URL of the GitHub repository we created earlier. Also, tick the "GitHub hook trigger for GITScm polling" checkbox under the "Build Triggers" tab.

4. Under the "Pipeline" tab, select the "Pipeline script from SCM" definition, specify our repository URL, and select the credential we created for GitHub. Check whether the branch name matches the one you will be using for your commits.

Review the configuration and click on "Save" to save your changes to the pipeline.

5. Now you can see the pipeline we just created.

The next step is to integrate GitHub with Jenkins so that whenever there is an event on the GitHub repository, it can trigger the Jenkins job.

  1. Go to the settings tab of the repository and click on "Webhooks" in the left panel. You will see the "Add webhook" button; click on it to create a webhook.

2. Provide the Jenkins URL with the context path "/github-webhook/". The URL will look as follows.

Webhook URL: http://<Jenkins-IP>:8080/github-webhook/
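If you are scripting the GitHub side of the setup, the webhook URL is simply the Jenkins base URL plus the fixed /github-webhook/ context path:

```shell
# Compose the webhook URL from the Jenkins server's public IP (placeholder).
JENKINS_IP="<jenkins-public-ip>"
WEBHOOK_URL="http://${JENKINS_IP}:8080/github-webhook/"
echo "$WEBHOOK_URL"
```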
You can select the events of your choice; however, for the sake of simplicity I have chosen "Send me everything."

Make sure the "Active" checkbox is checked.

Click on "Add webhook" to create a webhook that will trigger the Jenkins job whenever there is any kind of event in the GitHub repository.

3. You should see your webhook. Click on it to check whether it has been configured correctly or not.

4. Click on the "Recent Deliveries" tab, and you should see a green tick mark. The green tick mark shows that the webhook was able to connect to the Jenkins server.

Before we trigger the pipeline from the GitHub webhook, let's try to execute it manually.

  1. Go to the job we created and build it.

2. If you look at its logs, you will see that it failed. The reason is that we have not yet assigned values to the variables we have in our Jenkinsfile.

Reminder: For the rest of the article, do not change your directory. Stay in the same directory, i.e., /home/ubuntu/demo-nodejs-app, and execute all the commands from there.

Assign values to the variables in the Jenkinsfile

  1. To overcome the above error, you need to make some changes to the Jenkinsfile. We have variables in that file, and we need to assign values to them to deploy our application to the ECS cluster we created. Assign correct values to the variables marked "CHANGE_ME."
    cat Jenkinsfile

2. Here is the list of variables for your convenience.
We have the following variables in the Jenkinsfile:

    Assign your AWS account number here.
    Assign the region you created your ECS cluster in.
    Assign the name of the ECS cluster that you created.
    Assign the name of the service that got created in the ECS cluster.
    Assign the name of the task that got created in the ECS cluster.
    Assign the number of tasks you want to be created in the ECS cluster.
    Assign the ECR repository URL.
  8. IMAGE_TAG="$env.BUILD_ID"
    Do not change this.
    Do not change this.
  10. registryCredential = "CHANGE_ME"
    Assign the name of the credentials you created in Jenkins to store the AWS access key and secret key.
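A quick way to find every placeholder still left in the file is to grep for the marker. The fragment below is a made-up sample so the commands are self-contained; run the same grep against your real Jenkinsfile:

```shell
# Create a small sample fragment for illustration (your real Jenkinsfile has more).
cat > /tmp/Jenkinsfile.sample <<'EOF'
AWS_ACCOUNT_ID     = "CHANGE_ME"
CLUSTER_NAME       = "CHANGE_ME"
IMAGE_TAG          = "$env.BUILD_ID"
registryCredential = "CHANGE_ME"
EOF

# List every line still carrying the CHANGE_ME marker, with line numbers.
grep -n "CHANGE_ME" /tmp/Jenkinsfile.sample
```

When the grep prints nothing, every placeholder has been replaced and the file is ready to commit.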

3. Check the status to confirm that the file has been modified.
git status
cat Jenkinsfile

4. Add the file to the Git staging area, commit it, and then push it to the remote GitHub repository.
git status
git add Jenkinsfile
git commit -m "Assigned environment specific values in Jenkinsfile"
git push

After pushing the commit, the Jenkins pipeline gets triggered.

However, you will see an error, "Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock", in your Jenkins job.

The reason for this is that the "jenkins" user used by the Jenkins job is not allowed to create Docker objects. To give permission to the "jenkins" user, we added it to the "docker" group in an earlier step; however, we did not restart the Jenkins service after that.

I kept this intentionally so that I could show you the need to add the "jenkins" user to the "docker" group on your EC2 instance.

Now you know what needs to be done to overcome the above error.

  1. Restart the Jenkins service.
    sudo service jenkins restart
  2. Check whether the Jenkins service has started or not.
    sudo service jenkins status
  1. Make some changes to commit, then push and test whether the pipeline gets triggered automatically or not.
  2. Add, commit, and push the file.
    git status
    git diff
    git add
    git commit -m "Modified to trigger the Jenkins job after restarting the Jenkins service"
    git push

3. This time you can observe that the job got triggered automatically. Go to the Jenkins job and verify the same.

4. This is what the Stage View looks like. It shows us the stages that we have specified in our Jenkinsfile.

  1. Go to the cluster, click on the "Tasks" tab, and then open the running "Task."

2. Click on the "JSON" tab and verify the image; the image tag should match the Jenkins build number. In this case, it is "6", and it matches my Jenkins job build number.
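That check can also be scripted. The image URI below is a made-up example; the point is that everything after the last colon is the tag, which should equal the Jenkins build number:

```shell
# Verify that the image tag matches the Jenkins build number (example values).
BUILD_NUMBER="6"
IMAGE="123456789012.dkr.ecr.us-east-1.amazonaws.com/demo-nodejs-app:6"
TAG="${IMAGE##*:}"   # strip everything up to the last colon
if [ "$TAG" = "$BUILD_NUMBER" ]; then
  echo "image tag matches build number ($TAG)"
else
  echo "mismatch: tag=$TAG build=$BUILD_NUMBER"
fi
# prints: image tag matches build number (6)
```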

3. Hit the ELB URL to check whether the Node.js application is available. You should get the following message in the browser after hitting the ELB URL.

  1. Open the src/server.js file and make some changes to the display message to test the CI/CD pipeline again.
    vim src/server.js
  2. Check the files that have been changed. In this case, only one file should show as changed.
    git status
  3. Check the difference that your change has caused in the file.
    git diff src/server.js
  4. Add the file that you changed to the Git staging area.
    git add src/server.js
  5. Commit with a message.
    git status
  6. Add a message to the commit.
    git commit -m "Updated welcome message"
  7. Push your change to the remote repository.
    git push

8. Go to the task; this time you will see two tasks running, one with the older revision and one with the newer revision. You see two tasks because of the rolling-update deployment strategy configured by default in the cluster.

Note: Your revision numbers may differ.

9. Wait around two to three minutes, and you should have only one task running, with the latest revision.

10. Again, hit the ELB URL, and you should see your changes. In this case, we changed the display message.

Congratulations! You have a working Jenkins CI/CD pipeline that deploys your Node.js containerized application on AWS ECS whenever there is a change in your source code.

If you were setting up this CI/CD pipeline just to get familiar with it, or for proof-of-concept (PoC) purposes in your organization, and no longer need it, it is always better to delete the resources you created. As part of this CI/CD pipeline, we created quite a few resources.

We created the list below to help you delete them:

  1. Delete the GitHub repository
  2. Delete the GitHub token
  3. Delete the IAM user
  4. Delete the EC2 instance
  5. Delete the ECR repository
  6. Delete the ECS cluster
  7. Deregister the task definition
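For reference, most of the AWS-side deletions have CLI equivalents. Everything below is a sketch with the calls commented out; every name, ID, and region is a placeholder, so fill in real identifiers and double-check before running anything destructive:

```shell
# CLI equivalents for the AWS-side cleanup (all identifiers are placeholders).
# aws ecr delete-repository --repository-name demo-nodejs-app --force
# aws ecs update-service --cluster default --service <service-name> --desired-count 0
# aws ecs delete-service --cluster default --service <service-name>
# aws ecs delete-cluster --cluster default
# aws ecs deregister-task-definition --task-definition <family>:<revision>
# aws ec2 terminate-instances --instance-ids <jenkins-instance-id>
echo "review each command, fill in real identifiers, then uncomment and run"
```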

And finally, here is the summary of what you have to do to set up a CI/CD Docker pipeline to deploy a sample Node.js application on AWS ECS using Jenkins.

  1. Clone the existing sample GitHub repository
  2. Create a new GitHub repository and copy the code from the sample repository into it
  3. Create a GitHub token
  4. Create an IAM user
  5. Create an ECR repository
  6. Create an ECS cluster
  7. Create an EC2 instance for setting up the Jenkins server
  8. Install Java, the jq JSON processor, Node.js, and npm on the EC2 instance
  9. Install Jenkins on the EC2 instance
  10. Install Docker on the EC2 instance
  11. Install the Jenkins plugins
  12. Create credentials in Jenkins
  13. Create a Jenkins job
  14. Integrate GitHub and Jenkins
  15. Check the deployment
  16. Clean up the resources

A CI/CD pipeline automates your software applications' builds, tests, and deployments. It is the backbone of any organization with a DevOps culture, has numerous benefits for software development, and boosts your business greatly.

In this article, we demonstrated the steps to create a Jenkins CI/CD Docker pipeline to deploy a sample Node.js containerized application on AWS ECS. We saw how GitHub webhooks can be used to trigger the Jenkins pipeline on every push to the repository, which in turn deploys the latest Docker image to AWS ECS.

CI/CD pipelines with Docker are a great way for your organization to improve code quality and deliver software releases quickly without human error.

We hope this article helped you learn more about the integral parts of the CI/CD Docker pipeline.
