In this tutorial, we'll learn how to dockerize a Node.js application that uses Express as a web framework and PostgreSQL as a database. Docker allows us to package our application along with its dependencies into a container, making it easy to deploy and run consistently across different environments. By the end of this tutorial, you'll have a Docker image containing your Node.js application ready to be deployed to any Docker-compatible environment.
>> Read more:
- Top 9 Best Node.js Frameworks For Web App Development
- How to Install Node.js on Ubuntu 22.04?
- Upgrade Your AWS Infrastructure with EC2 to ECS Migration
Prerequisites
To follow this guide, you need the following fundamentals:
- Basic knowledge of JavaScript.
- Basic understanding of Docker.
- Node.js version 18 or above.
- NPM version 10.5.0 or above installed on your machine.
- Docker version 25.0.4 or above installed on your machine.
Before delving into the details of dockerizing a Node.js application and deploying it to an EC2 instance, it's essential to understand the foundational concepts of Docker and AWS (Amazon Web Services). Docker allows us to containerize applications, providing a consistent and isolated environment for running them. AWS EC2 offers scalable and flexible compute resources in the cloud, making it an ideal platform for hosting containerized applications.
You can follow this repository to get the code: https://github.com/cesc1802/dockerize-nodejs-application
Step 1: Initialize Your Node.js Project
Create a new directory for your project and navigate into it in your terminal:
mkdir dockerize-nodejs-application
cd dockerize-nodejs-application
Initialize npm to create a package.json file:
npm init -y
Step 2: Install Dependencies
Install the required packages: Express, Sequelize, and the PostgreSQL driver (pg and pg-hstore). We will also install a few more dependencies later to round out the RESTful API; you can see the full list in the package.json file:
npm install express sequelize pg pg-hstore
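After installing, the dependencies section of your package.json should look roughly like this (the exact version numbers will differ depending on when you install):

```json
"dependencies": {
  "express": "^4.18.0",
  "pg": "^8.11.0",
  "pg-hstore": "^2.3.0",
  "sequelize": "^6.37.0"
}
```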
Step 3: Set Up Your Express Application
Create a file named app.js and set up your Express application:
// app.js
const express = require('express');
const { Sequelize } = require('sequelize');
const app = express();
// Initialize Sequelize
const sequelize = new Sequelize('postgres://username:password@localhost:5432/database_name');
// Test the database connection
sequelize.authenticate()
.then(() => {
console.log('Database connection has been established successfully.');
})
.catch(err => {
console.error('Unable to connect to the database:', err);
});
// Define your routes and middleware
// ...
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`Server is running on port ${port}`);
});
Step 4: Define Your PostgreSQL Model
Create a model for your PostgreSQL table. For example, let's create a Todo model. The configuration below also overrides some Sequelize defaults: Sequelize automatically adds two timestamp fields, createdAt and updatedAt, and the options passed to the Todo model rename them to created_at and updated_at.
const { DataTypes } = require('sequelize'); // DataTypes must be imported from sequelize
const Todo = sequelize.define('todos', {
  id: {
    type: DataTypes.INTEGER,
    autoIncrement: true, // let PostgreSQL generate the id
    allowNull: false,
    primaryKey: true
  },
description: {
type: DataTypes.STRING,
allowNull: false
},
status: {
type: DataTypes.STRING,
allowNull: false,
},
}, {
timestamps: true, // Enables createdAt and updatedAt fields
createdAt: 'created_at', // Customize createdAt field name
updatedAt: 'updated_at' // Customize updatedAt field name
});
Step 5: Create Your Routes
Define your routes and interact with the database using Sequelize models. For example, fetch all todos from the database and create a new todo. Another route is a simple health check that reports our service status:
// app.js
const express = require('express');
const app = express();
// Define your routes and middleware
app.get("/ping", async (req, res) => {
res.status(200).json({ message: "pong" })
})
app.get("/api/v1/todos", async (req, res) => {
try {
const todos = await Todo.findAll()
res.status(200).json({ data: todos })
} catch (error) {
res.status(500).json({ error: error.message }) // Error objects don't serialize to JSON directly
}
})
app.post("/api/v1/todos", async (req, res) => {
const { description, status } = req.body
try {
const todo = await Todo.create({ description, status }, {
fields: ['description', 'status']
})
res.status(200).json({ data: { todo } })
} catch (error) {
res.status(500).json({ error: error.message })
}
})
// Define other routes
// ...
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`Server is running on port ${port}`);
});
Step 6: Run Your Application
Start your Node.js application. Open your terminal, then change directory to the folder where you put code inside, then run the command below:
node app.js
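Optionally, add a start script to package.json so the same command works locally and in other environments (a small, assumed addition; the Dockerfile later in this tutorial runs node app.js directly either way):

```json
"scripts": {
  "start": "node app.js"
}
```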
Your application should now be running on http://localhost:3000. You can test your routes using cURL by sending this command to the server: curl --location 'localhost:3000/ping' --header 'Content-Type: application/json'. If you get back a JSON object with the value { "message": "pong" }, your server is running correctly. Next, let's add some more code to the app.js file to connect to the database. We will also set up the bodyParser middleware so our service can read JSON request bodies. Don't forget to run npm i body-parser to install that package.
require('dotenv').config();
const { Sequelize, DataTypes } = require('sequelize');
const bodyParser = require('body-parser');
// Use environment variables in your application
const sequelize = new Sequelize({
dialect: 'postgres',
host: process.env.DB_HOST,
port: process.env.DB_PORT,
username: process.env.DB_USERNAME,
password: process.env.DB_PASSWORD,
database: process.env.DB_DATABASE
});
// Test the database connection
sequelize.authenticate()
.then(() => {
console.log('Database connection has been established successfully.');
})
.catch(err => {
console.error('Unable to connect to the database:', err);
});
// Setup middlewares
// Parse JSON request bodies
app.use(bodyParser.json());
// Parse URL-encoded request bodies
app.use(bodyParser.urlencoded({ extended: true }));
We are using the dotenv package to read configuration from a .env file, so run npm i dotenv to install it. The app.js file will now look like this:
// app.js
require('dotenv').config();
const express = require('express');
const { Sequelize, DataTypes } = require('sequelize');
const bodyParser = require('body-parser');
const app = express();
// Use environment variables in your application
const sequelize = new Sequelize({
dialect: 'postgres',
host: process.env.DB_HOST,
port: process.env.DB_PORT,
username: process.env.DB_USERNAME,
password: process.env.DB_PASSWORD,
database: process.env.DB_DATABASE
});
// Test the database connection
sequelize.authenticate()
.then(() => {
console.log('Database connection has been established successfully.');
})
.catch(err => {
console.error('Unable to connect to the database:', err);
});
const Todo = sequelize.define('todos', {
  id: {
    type: DataTypes.INTEGER,
    autoIncrement: true, // let PostgreSQL generate the id
    allowNull: false,
    primaryKey: true
  },
description: {
type: DataTypes.STRING,
allowNull: false
},
status: {
type: DataTypes.STRING,
allowNull: false,
},
}, {
timestamps: true, // Enables createdAt and updatedAt fields
createdAt: 'created_at', // Customize createdAt field name
updatedAt: 'updated_at' // Customize updatedAt field name
});
// Setup middlewares
// Parse JSON request bodies
app.use(bodyParser.json());
// Parse URL-encoded request bodies
app.use(bodyParser.urlencoded({ extended: true }));
// Define your routes and middleware
app.get("/ping", async (req, res) => {
res.status(200).json({ message: "pong" })
})
app.get("/api/v1/todos", async (req, res) => {
try {
const todos = await Todo.findAll()
res.status(200).json({ data: todos })
} catch (error) {
res.status(500).json({ error: error.message })
}
})
app.post("/api/v1/todos", async (req, res) => {
const { description, status } = req.body
try {
const todo = await Todo.create({ description, status }, {
fields: ['description', 'status']
})
res.status(200).json({ data: { todo } })
} catch (error) {
res.status(500).json({ error: error.message })
}
})
const port = process.env.PORT || 3000;
app.listen(port, () => {
console.log(`Server is running on port ${port}`);
});
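Since the configuration is now read from environment variables, create a .env file next to app.js for local development. The values below are placeholders that mirror the credentials used in the docker-compose.yml later in this tutorial; adjust them to match your local PostgreSQL setup:

```
DB_HOST=localhost
DB_PORT=5432
DB_USERNAME=admin
DB_PASSWORD=admin12345
DB_DATABASE=demo
```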
Step 7: Dockerize Your Node.js Application
Next, create a Dockerfile in the root directory of your Node.js project. This Dockerfile will contain instructions for building your Docker image. Here's a simple example:
# Use an official Node.js runtime as a base image
FROM node:18
# Set the working directory in the container
WORKDIR /usr/src/app
# Copy package.json and package-lock.json to the working directory
COPY package*.json ./
# Install dependencies
RUN npm install
# Copy the rest of the application code to the working directory
COPY . .
# Expose the port your app runs on
EXPOSE 3000
# Command to run your application
CMD ["node", "app.js"]
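Because the Dockerfile copies the whole project directory with COPY . ., it's a good idea to also add a .dockerignore file so your local node_modules and environment files don't end up in the image (a small, assumed addition):

```
node_modules
npm-debug.log
.env
```

This keeps the image smaller and ensures dependencies are installed fresh inside the container by the RUN npm install step.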
Step 8: Define Your Docker Compose Configuration
Create a docker-compose.yml file in the root directory of your project to define your application's services. This file specifies the services, networks, and volumes needed for your application. Here's an example:
version: '3.8'
services:
app:
build: .
ports:
- 3000:3000
depends_on:
- db
environment:
DB_USERNAME: admin
DB_PASSWORD: admin12345
DB_DATABASE: demo
DB_HOST: db
DB_PORT: 5432
networks:
- app-network
db:
image: bitnami/postgresql:14
environment:
POSTGRESQL_USERNAME: admin
POSTGRESQL_PASSWORD: admin12345
POSTGRESQL_DATABASE: demo
volumes:
- db-data:/bitnami/postgresql
- ./init-scripts:/docker-entrypoint-initdb.d
networks:
- app-network
networks:
app-network:
driver: bridge
volumes:
db-data:
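Note that the compose file mounts a local ./init-scripts directory into the container's docker-entrypoint-initdb.d, where the PostgreSQL image runs any *.sql files on first startup. The linked repository contains the actual script; a minimal, hypothetical sketch that creates the todos table used by the model might look like this (SERIAL lets PostgreSQL generate ids, and the column names match the created_at/updated_at mapping configured in Sequelize):

```sql
-- init-scripts/init.sql (hypothetical example)
CREATE TABLE IF NOT EXISTS todos (
    id SERIAL PRIMARY KEY,
    description VARCHAR(255) NOT NULL,
    status VARCHAR(255) NOT NULL,
    created_at TIMESTAMP WITH TIME ZONE,
    updated_at TIMESTAMP WITH TIME ZONE
);
```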
Step 9: Define Your deploy.sh File
Create a deploy.sh file to connect to your EC2 instance and deploy your service:
#!/bin/bash
# Replace these variables with your own values
EC2_PUBLIC_IP="public-ip"
EC2_USER="user"
SSH_KEY_PATH="/path/to/ssh-key"
DOCKER_COMPOSE_FILE="docker-compose.yml"
echo "Copying necessary files to deploy ..."
ssh -i "$SSH_KEY_PATH" $EC2_USER@$EC2_PUBLIC_IP "mkdir -p dockerize-nodejs-application"
scp -i "$SSH_KEY_PATH" app.js docker-compose.yml Dockerfile .env.example package.json package-lock.json $EC2_USER@$EC2_PUBLIC_IP:dockerize-nodejs-application
scp -i "$SSH_KEY_PATH" -r ./init-scripts $EC2_USER@$EC2_PUBLIC_IP:dockerize-nodejs-application
# SSH into EC2 instance and deploy
ssh -i "$SSH_KEY_PATH" $EC2_USER@$EC2_PUBLIC_IP << EOF
# Move to the directory containing the Docker Compose file
cd dockerize-nodejs-application
# Stop any running containers (optional)
docker compose -f "$DOCKER_COMPOSE_FILE" down
# Start Docker Compose
docker compose -f "$DOCKER_COMPOSE_FILE" up -d
EOF
echo "Deployment complete."
You need to fill in EC2_PUBLIC_IP, EC2_USER, SSH_KEY_PATH, and DOCKER_COMPOSE_FILE with your own values. Then open your terminal and run chmod +x deploy.sh to make the deploy.sh file executable. Lastly, run ./deploy.sh. If everything goes well, you will see the message Deployment complete.
Now, check that the service is reachable at your public IP with this command: curl --location 'yourPublicIP:3000/ping' --header 'Content-Type: application/json'.
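You can also exercise the todo endpoints against the deployed service (replace yourPublicIP with your instance's address; the POST should return the created record as JSON):

```shell
curl --location 'yourPublicIP:3000/api/v1/todos' \
  --header 'Content-Type: application/json' \
  --data '{ "description": "first task", "status": "open" }'

curl --location 'yourPublicIP:3000/api/v1/todos'
```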
>> Read more about Docker-related topics:
- Docker Networking Fundamentals: Types, Working and Usage
- How To Create Containerizing Microservices With Docker?
- How To Use Terraform for Provisioning A Docker Container?
- A Comprehensive Guide To Dockerize A Golang Application
Conclusion
We have explored the steps to dockerize a Node.js application and deploy it to an EC2 instance, leveraging the power of containerization for easier deployment and scalability. Dockerizing a Node.js application involves creating a Dockerfile to define the application's environment, dependencies, and runtime settings. We also utilized a docker-compose.yml file to define multi-container applications and manage them efficiently.
Once the Node.js application was dockerized, deploying it to an EC2 instance involved transferring the Docker images and necessary files to the instance using SCP (Secure Copy Protocol) and SSH (Secure Shell). We then utilized Docker Compose on the EC2 instance to orchestrate the containers and run the application seamlessly.
By dockerizing our Node.js application, we encapsulated its dependencies and configurations, making it portable and consistent across different environments. Deploying it to an EC2 instance allowed us to take advantage of the scalability and flexibility of cloud infrastructure while maintaining the ease of management provided by Docker.
In summary, dockerizing a Node.js application and deploying it to an EC2 instance streamlines the deployment process, improves scalability, and enhances the overall efficiency of managing and running applications in production environments.
>>> Follow and Contact Relia Software for more information!