How to Create a Pipeline for NestJS in AWS CodePipeline Using Docker and CodeDeploy

Welcome, dear readers! In today’s post, we’ll dive deep into the process of creating a Continuous Integration/Continuous Deployment (CI/CD) pipeline for a NestJS application using AWS CodePipeline and CodeDeploy. By leveraging Docker, we will create a containerized version of our NestJS application, providing an additional layer of abstraction and automation. Whether you’re a seasoned developer or just getting started in the world of DevOps, this tutorial aims to provide a clear, step-by-step guide to combining these powerful technologies.

Prerequisites

Before we get started, it’s important to ensure that you’re equipped with the necessary tools and knowledge. Here’s what you’ll need:

  • NestJS: Basic knowledge of building applications using NestJS.
  • Docker: Familiarity with Docker, including building images and running containers.
  • AWS: An active AWS account and basic understanding of AWS services, specifically AWS CodePipeline and CodeDeploy.
  • Node.js and NPM: These should be installed on your system. We’ll use them to set up our NestJS application.
  • IDE: A text editor or Integrated Development Environment (IDE) of your choice, such as Visual Studio Code.
  • Command Line: Comfort with the command line, as we’ll execute a number of shell commands.

Overview of Technologies

Let’s briefly go over the technologies we’re going to use in this tutorial:

  • NestJS: A progressive Node.js framework for building efficient, reliable, and scalable server-side applications. Its modular architecture and support for TypeScript make it an excellent choice for enterprise-level applications.
  • Docker: A tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow developers to package up an application with all its parts, including libraries and other dependencies, and ship it all out as one package.
  • AWS CodePipeline: A fully managed continuous delivery service that helps you automate your release pipelines for fast and reliable application and infrastructure updates.
  • AWS CodeDeploy: A deployment service that enables developers to automate the deployment of applications to either on-premises servers or AWS cloud instances. CodeDeploy scales with your infrastructure to allow for simultaneous deployments across multiple instances.

By the end of this tutorial, you should be able to bind these technologies together to build a seamless CI/CD pipeline. Now, without further ado, let’s move on to setting up our NestJS application.

Installing Docker

If you don’t already have Docker installed, you can install it on your Amazon Linux 2023 server using the following commands:

sudo dnf update -y
sudo dnf install docker -y

Once Docker is installed, start the Docker service using the following command:

sudo systemctl start docker

You can also enable Docker to start automatically on system boot:

sudo systemctl enable docker
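
If you want to run docker commands as ec2-user without sudo, you can optionally add the user to the docker group (the deployment script later in this post runs as root, so this step is not strictly required). A small sketch:

# Optional: let ec2-user run docker without sudo (takes effect after logging out and back in)
sudo usermod -aG docker ec2-user

# Verify that Docker works
docker --version
sudo docker run --rm hello-world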

Installing nginx

Next, install the nginx web server. Run the following commands to install, enable, and start nginx on your Amazon Linux 2023 server:

sudo dnf install nginx -y
sudo systemctl enable nginx
sudo systemctl start nginx
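
This tutorial doesn’t configure nginx any further, but a common pattern is to have nginx listen on port 80 and reverse-proxy requests to the NestJS container on port 3000. Here is a minimal sketch of that setup; the config file name and server settings are placeholders, not part of the original steps:

# Hypothetical reverse-proxy config pointing nginx at the container on port 3000
sudo tee /etc/nginx/conf.d/projecta.conf > /dev/null <<'EOF'
server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
EOF

# Validate the config and reload nginx
sudo nginx -t && sudo systemctl reload nginx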

Setting Up NestJS Application

Let’s start by setting up a simple NestJS application.

  1. Install the NestJS CLI globally using NPM: npm install -g @nestjs/cli
  2. Once installed, you can create a new project: nest new your-project-name
  3. After creating the project, navigate to the project directory: cd your-project-name
  4. To ensure that the application is set up correctly, run the application using npm run start. You should see the application running on http://localhost:3000.

This sets up a basic NestJS application. Of course, your real-world application will likely be more complex, but this is enough to get us started.
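
For reference, here are those commands in one block. The project name projecta is just an example (the deployment scripts later in this post use “projecta” for directories and container names, but your project name can differ):

# Scaffold and run a NestJS app locally
npm install -g @nestjs/cli
nest new projecta
cd projecta
npm run start
# The app should now respond at http://localhost:3000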

Dockerizing the NestJS Application

The next step is to create a Docker image of our application. This will allow us to run the application in any environment that supports Docker, ensuring consistency across different stages of our pipeline.

  1. In the root of your project, create a file named Dockerfile.
  2. Open the Dockerfile in your text editor and add the following:
# Start from a Node.js 18 base image
FROM node:18-alpine

# Set the working directory to /app
WORKDIR /app

# Copy package.json and package-lock.json to the docker image
COPY package*.json ./

# Install all the dependencies
RUN npm install

# Install NestJS CLI globally in the docker image
RUN npm install -g @nestjs/cli

# Copy the rest of the application to the docker image
COPY . .

# Build the application
RUN npm run build

# Expose the port the application listens on
EXPOSE 3000

# Start the application
CMD ["npm", "run", "start:prod"]

This Dockerfile does the following:

  • Starts from a base image that includes Node.js 18 (Alpine).
  • Sets /app as the working directory in the container.
  • Copies the package.json and package-lock.json files to the container.
  • Runs npm install to install the application dependencies.
  • Installs the NestJS CLI globally inside the image.
  • Copies the rest of the application to the container.
  • Runs npm run build to build the application.
  • Exposes port 3000, which our application listens on.
  • Sets the command that will run when the container starts.
  3. Now, let’s build the Docker image with this command: docker build -t nestjs-app .
  4. Once the image is built, you can run it using Docker: docker run -p 3000:3000 nestjs-app
  5. Visit http://localhost:3000 in your browser. You should see the NestJS application running.
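
Because the Dockerfile copies the entire project into the image, it helps to keep local build output and dependencies out of the build context. A minimal .dockerignore sketch (adjust the entries to your project):

node_modules
dist
.git
npm-debug.log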

Setting up AWS CodePipeline

With our Dockerized NestJS application ready, let’s move on to setting up our AWS CodePipeline.

First, create an appspec.yml file in the root of your repository with the following content:

version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/projecta/
permissions:
  - object: /var/www/projecta/
    owner: ec2-user    
hooks:
  AfterInstall:
    - location: scripts/after_install.sh
      timeout: 300
      runas: root

After that, create a scripts folder in your repository and add this after_install.sh file:

#!/bin/bash


CONTAINER_NAME=projecta-container
IMAGE_NAME=projecta-image

# Navigate to the directory of your Dockerfile if not the current directory
cd /var/www/projecta

# Stop and remove the existing container if it's already running
if [ "$(docker ps -a -q -f name=$CONTAINER_NAME)" ]; then
    docker stop $CONTAINER_NAME
    docker rm $CONTAINER_NAME
fi

# Build the Docker image
docker build -t $IMAGE_NAME .

# Remove dangling images from previous builds
docker image prune -f

# Run the new Docker container
docker run -d --name $CONTAINER_NAME -p 3000:3000 $IMAGE_NAME
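
One caveat: the appspec above copies the repository into /var/www/projecta on every deployment, and by default CodeDeploy fails if those files already exist from a previous deployment. A common workaround, not part of the original setup and shown here only as a sketch, is a scripts/before_install.sh that clears the directory, registered under a BeforeInstall hook in appspec.yml next to AfterInstall:

#!/bin/bash
# Hypothetical scripts/before_install.sh: remove the previous revision
# so CodeDeploy can copy the new files without "file already exists" errors.
rm -rf /var/www/projecta
mkdir -p /var/www/projecta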

  1. Log in to your AWS account and navigate to the AWS CodePipeline console.
  2. Click on “Create pipeline”.
  3. Enter a name for your pipeline, then click “Next”.
  4. In the “Source” stage, you can choose where your source code resides. If it’s on GitHub, select “GitHub” and connect your account. After that, select your repository and branch.
  5. In the “Build” stage, click “Skip build stage” and confirm. We don’t need CodeBuild here because the Docker image is built on the EC2 instance by the after_install.sh script.
  6. We will set up AWS CodeDeploy in the next section, so for now, you can skip the “Deploy” stage as well.
  7. Review your settings and then click “Create pipeline”.

Great, we’ve now set up a basic pipeline that pulls from our source repository; the application itself will be built into a Docker image on the EC2 instance during deployment. Let’s proceed with the next step in the next section.

Setting up AWS CodeDeploy

  1. In the AWS console, navigate to CodeDeploy.
  2. Click on “Create Application”, enter an application name, and choose EC2/On-premises for the compute platform.
  3. Now create a Deployment Group. Assign a name and choose the service role that allows AWS CodeDeploy to access the target instances.
  4. Select the “In-place” deployment type.
  5. Under Environment Configuration, select Amazon EC2 instances. Apply a tag to filter the instances to which you wish to deploy the application.
  6. Under Load balancer, add your load balancer if you have one, otherwise leave it blank.
  7. Click “Create Deployment Group”.
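
Keep in mind that the EC2/On-premises platform requires the CodeDeploy agent to be running on each target instance, and the instances need an IAM instance profile allowed to read the pipeline’s artifacts from S3. If the agent isn’t installed yet, here is a minimal sketch for Amazon Linux; the bucket URL assumes the us-east-1 region, so substitute your own region:

# Install and start the CodeDeploy agent (Amazon Linux, us-east-1 shown)
sudo dnf install -y ruby wget
cd /home/ec2-user
wget https://aws-codedeploy-us-east-1.s3.us-east-1.amazonaws.com/latest/install
chmod +x ./install
sudo ./install auto
sudo systemctl status codedeploy-agent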

Configuring the AWS CodePipeline

Return to the AWS CodePipeline and edit the pipeline:

  1. Add a new stage by clicking “Edit” and then “Add Stage”. Label it “Deploy”.
  2. Inside the Deploy stage, click “Add Action Group”.
  3. Choose AWS CodeDeploy as the “Action Provider”.
  4. For the “Input Artifacts”, choose the source artifact from your pipeline (it should be something like “SourceArtifact”).
  5. Choose the application and deployment group you created in CodeDeploy.
  6. Save your changes.

Triggering the Pipeline

With everything set up, you’re now ready to trigger the pipeline. Here’s how you do it:

  1. Push a change to the repository branch that you set as the pipeline source. This will automatically trigger the pipeline.
  2. In AWS CodePipeline, you can view the progress of the pipeline execution. You’ll see each stage (Source and Deploy) execute sequentially. If all stages succeed, the change has been deployed to your EC2 instances.
  3. If any stage fails, you can click on “Details” to get more information about the failure and how to fix it.
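
Once the Deploy stage succeeds, you can also verify the result directly on the instance; a quick check (the container name matches the after_install.sh script above):

# On the EC2 instance: confirm the container is up and responding
docker ps --filter name=projecta-container
curl -s http://localhost:3000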

Now you have a functioning AWS CodePipeline set up for your Dockerized NestJS application. Remember, this pipeline is meant to simplify your deployment process and ensure consistent deployment of your application. You can now make changes to your application code confident that those changes can be smoothly and reliably deployed.

Monitoring and Troubleshooting

Monitoring your pipeline is essential for spotting issues quickly and maintaining the health of your deployments. AWS provides CloudWatch for this purpose.

  1. AWS CloudWatch: Navigate to the CloudWatch console in AWS. Here, you can track metrics, events, and alarms for your pipeline and deployments. Check these, along with the CodeDeploy logs on your instances, if a pipeline execution fails to understand what went wrong.
  2. Pipeline Execution History: In the AWS CodePipeline console, you can view the execution history for each pipeline run. This includes information on each action taken within the stages of your pipeline and whether those actions succeeded or failed.

In terms of troubleshooting, most issues can be resolved by checking your application logs, pipeline execution details, and CloudWatch logs. Ensure that all AWS services are correctly configured and have the necessary permissions to interact with each other.
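
When digging into a failed execution, a few commands often help. This is a sketch that assumes the AWS CLI is configured; the pipeline, application, and deployment group names are placeholders:

# Pipeline and deployment status from the CLI
aws codepipeline get-pipeline-state --name projecta-pipeline
aws deploy list-deployments --application-name projecta --deployment-group-name projecta-group

# On the EC2 instance: CodeDeploy agent log and application container logs
sudo tail -n 100 /var/log/aws/codedeploy-agent/codedeploy-agent.log
docker logs --tail 100 projecta-container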

Best Practices

Let’s go over some best practices for using AWS CodePipeline, CodeDeploy, and Docker:

  • Keep Docker Images Lightweight: Minimize the size of your Docker images to reduce build time and speed up your deployments.
  • Use .dockerignore: Add a .dockerignore file to your project to prevent unnecessary files from being included in your Docker images.
  • Manage AWS Permissions Carefully: Ensure that all AWS services have only the permissions they need and no more. This minimizes the potential damage if a service is compromised.
  • Keep Pipelines Simple: Each pipeline should have a single, clearly-defined purpose. This makes pipelines easier to manage and troubleshoot.
  • Monitor Your Pipelines: Regularly check CloudWatch and pipeline execution history to spot and fix any issues quickly.

Conclusion

And that’s it! You’ve just walked through the process of creating a CI/CD pipeline for a NestJS application using AWS CodePipeline, AWS CodeDeploy, and Docker. This pipeline will help you automate your deployment process, reduce errors, and ensure that your application is always running the latest, greatest version of your code.

Remember, the key to successful CI/CD is constant monitoring and regular updates. Always be on the lookout for ways to improve your pipeline and your deployment process. Don’t be afraid to experiment and try new things. Every pipeline is unique, just like the team that builds it.

Thank you for following along with this guide. Here’s to smooth and reliable deployments!

Atiqur Rahman

I am MD. Atiqur Rahman, a BUET graduate and an AWS-certified solutions architect. I have achieved six AWS certifications, including Cloud Practitioner, Solutions Architect, SysOps Administrator, and Developer Associate, and I have more than eight years of experience working as a DevOps engineer designing complex SaaS applications.
