A Taste of Docker

Context :
First, let me set up the context for this tutorial. We are going to take a complete walkthrough of Docker. This will be a three-part series (depending on my mood, to be honest). This part alone will be enough to start up our backend API project (coming in the future). To make the backend process easier and more seamless, we are going to learn Docker. It's not a complete Docker course, but a crash course for developers that gives you enough of a walkthrough to "Dockerize" your own app. This will be a completely hands-on journey, so if you don't want to stay in tutorial hell, follow along.
Here is the GitHub repo.
Content :
Introduction
What is Docker ?
Why Docker ?
Installation of Docker
Docker CLI
Docker CLI commands
Docker Image Vs Container
Port Mapping
Handling ENV files
Containerizing a basic Node app
Publishing it to Docker Hub
Real-world handling ( Docker Compose )
port mapping
ENV
services
That's all we will discuss in this part. Now let's start.
What is Docker ?
- Source: Wikipedia
In short, Docker is an application that helps us "containerize" our software, i.e., package it together with everything it needs, which makes development and deployment much easier.
There is a piece of jargon here that may be hard to chew: "container". We will discuss it in the next sections.
Images
An image is a lightweight, standalone package of software that stores all the dependencies, packages, and code needed to run the whole program on its own.
Container
A container is a running instance of a Docker image: the isolated environment where the image actually executes.
Why Docker ? - A Developer's Perspective
Let’s imagine a real-world scenario:
You and your friend are working on a Node.js project.
You use:
Node.js (v22)
Express (v4)
Mongoose (v5)
You’re on Windows. Your friend is on Mac.
Without Docker, your friend must set up everything manually. With Docker, things are smooth. Let’s break it down:
Without Docker
What you need to do:
- Install the correct Node.js version (e.g., via nvm)
- Install MongoDB (matching your config)
- Clone the repo
- Run npm install (and make sure the lockfile is up to date)
- Manually set up .env
- Make sure OS-level issues (paths, ports, etc.) don't break things
Challenges:
"It works on my machine" syndrome 😓
OS-specific bugs (e.g., Windows vs Unix line endings)
Version mismatches (Node, MongoDB)
Manual environment setup
Time-consuming onboarding for new collaborators
With Docker
What you need to do:
Clone the repo
Run one command ( Docker command)
And boom 💥 — the project runs the same way on every machine.
Do you want to know what the command is?
Keep reading and you will see it.
Benefits:
Zero local setup (no need to install Node or MongoDB)
Cross-platform consistency (across different operating systems)
Pre-defined development environment
Clean, reproducible builds
Easy integration with CI/CD ( for deployment on server )
Installation of Docker
Click on this to install Docker.
Click on Docker Desktop. ( for your own Operating System)
Just keep clicking Next, and it's done.
Make account on Docker-Hub.
Then You can open Docker from the desktop.
Verification of Installation :
If the UI looks like this, Docker is installed:
Docker CLI
When Docker runs, a background process called the Docker Daemon starts. The Docker CLI is a text-based command-line tool for talking to that daemon: running containers, managing images, and so on, using only text commands. It is the main way to interact with Docker apart from Docker Desktop.
How to interact with it ?
- Open your terminal (PowerShell on Windows)
docker --version
Run this command.
If this command does not give you an error, the Docker CLI is installed correctly.
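If you like, the same check can be scripted; this is just a sketch that uses command -v to test whether docker is on your PATH before calling it:

```shell
# Check whether the Docker CLI is reachable from this shell
if command -v docker >/dev/null 2>&1; then
  docker_status="$(docker --version)"
else
  docker_status="docker CLI not found - install Docker Desktop first"
fi
echo "$docker_status"
```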
Docker CLI commands
First you have to find a Docker image on Docker Hub.
Here we are choosing "ubuntu".
Running Images in Docker CLI :
This command pulls the Ubuntu image (if it isn't on your machine yet) and starts a container from it. Use this command
docker run -it <image_name>
instead of using docker pull <image_name>, because docker run pulls the image automatically before running it.
Example: downloading and running Ubuntu (important, use this):
docker run -it ubuntu
That's it, downloading the image is done.
You can see this in the picture above.
Now let's open Docker Desktop to see our containers and images.
See this:
I actually ran the "ubuntu" image twice. In the screenshot you can see there are two containers but only one image, ubuntu. This can be confusing at first, but in the next section you will understand the difference between a Docker image and a container.
So now you understand how to run an image on your local system.
To stop the container you just have to press
ctrl+d
(this exits the Ubuntu shell, which stops the container). Deleting containers and images is important when we are not using them; otherwise they consume a lot of disk space.
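Here is a sketch of that cleanup flow. The container ID placeholder is hypothetical (copy a real one from docker ps -a), and the destructive commands are commented out so nothing is removed by accident:

```shell
image="ubuntu"   # the image we pulled earlier

if command -v docker >/dev/null 2>&1; then
  docker ps -a                   # list ALL containers, running and stopped
  # docker stop <container_id>   # stop a running container first
  # docker rm <container_id>     # then remove the stopped container
  # docker rmi "$image"          # finally remove the image itself
else
  echo "docker CLI not found"
fi
```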
Docker Image Vs Container
- In the previous section we saw that we ran the "ubuntu" image twice. Now let's look at the difference with that example in mind.
Difference
Think of it this way: the image is like food, and the container is like the plate it is served on. So,
in the case of Ubuntu, the image is the main thing, and we only download it once. When we want to use "ubuntu" again, we simply spin up another container from that same image.
That's why there is one image but two containers.
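You can also see the one-image, two-containers split from the CLI; a small sketch assuming you ran docker run -it ubuntu twice as above:

```shell
image="ubuntu"

if command -v docker >/dev/null 2>&1; then
  docker images    # shows ONE "ubuntu" image
  docker ps -a     # shows TWO containers created from that image
else
  echo "docker CLI not found"
fi
```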
Port Mapping
We all know a service runs on a definite port. For React, Next.js, etc., the default port is fixed by convention; for our own backend we define the port ourselves.
A container has its own isolated network, so even when a service is running inside it, we can't reach it from our machine until we map the container's port to a port on the host. That is why ports and port mapping matter.
For example, suppose I install an app like NGINX and want to use it locally with Docker. We can't, until we map the port to the right host port.
Let’s Understand it by doing.
Suppose We want to install NGINX.
- Command to run it
docker run -it nginx
- Installation of NGINX
Mapping the port
So for port mapping you don't have to install NGINX first; you can run it directly with the -p flag.
docker run -p 80:80 nginx
Run it.
Go to Docker Desktop.
Click on the ports (80:80) part.
Congratulations: if you see this, you have set up NGINX locally with the help of port mapping.
Deep Dive:
We can change the first (host) port to anything we want, but the second (container) port must stay the same, because that is the port the service listens on inside the container.
docker run -p 800:80 nginx
This works too, but now the application is available on port 800 on localhost. (The host port can be any free port.)
Still, it's a common convention to keep both parts the same when possible ( 80:80 ).
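To tie it together, here is a sketch of the whole try-it-out loop. The container name demo-nginx is made up, and the docker/curl lines are commented so you can run them deliberately:

```shell
host_port=8080   # any free host port works; port 80 inside the container is fixed by nginx

# docker run -d -p "$host_port":80 --name demo-nginx nginx   # start nginx detached
# curl -s "http://localhost:$host_port" | head -n 4          # should print the nginx welcome page
# docker rm -f demo-nginx                                    # stop and remove it afterwards
echo "mapping host port $host_port to container port 80"
```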
I hope you understand it.
Try to set up "Redis" on your own, in your own system.
When you are setting it up locally, use the http port, not the https port.
Handling ENV files
Okay, let's be real: this is hard to grasp in the abstract. We will understand it much better by seeing it in action, so let's dockerize a Node and Express app. Along the way we will cover most of it.
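As a quick preview, the standard Docker CLI flags for this are -e (set one variable inline) and --env-file (load a whole file). The image name my-node-app below is hypothetical:

```shell
env_file=".env"   # a typical env file containing lines like PORT=5000

# Pass a single variable inline:
# docker run -e PORT=5000 -p 5000:5000 my-node-app

# Or load every variable from a file:
# docker run --env-file "$env_file" -p 5000:5000 my-node-app
echo "env file to load: $env_file"
```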
Containerizing a basic Node app
Okay, this will be in two parts.
- Making a basic Node and Express app
Installing the packages:
npm init -y
Downloading Express
npm i express
npm i -D dotenv
Writing Basic server app:
My basic package.json looks like:
```json
{
  "name": "docker_course",
  "version": "1.0.0",
  "main": "index.js",
  "type": "module",
  "scripts": {
    "start": "node index.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "description": "",
  "dependencies": {
    "express": "^5.1.0"
  },
  "devDependencies": {
    "dotenv": "^16.5.0"
  }
}
```
Node and Express code for the server:
```javascript
// index.js
import dotenv from "dotenv"
import express from "express"

dotenv.config()
const app = express()
const PORT = process.env.PORT || 3000

app.get("/api/hello", (req, res) => {
  res.status(201).send("it's working")
})

app.listen(PORT, () => {
  console.log(`server is running on ${PORT}`)
})
```
* Env file
```plaintext
PORT=5000
```
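By the way, the process.env.PORT || 3000 line in index.js is exactly what this .env file feeds. A tiny stdlib-only sketch of that fallback logic (the resolvePort helper is hypothetical, not part of the app):

```javascript
// Resolve the port the same way index.js does: env value first, 3000 as fallback
function resolvePort(env) {
  return Number(env.PORT) || 3000;
}

console.log(resolvePort({ PORT: "5000" })); // 5000 (value from .env)
console.log(resolvePort({}));               // 3000 (fallback)
```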
Let’s Run index.js
npm run start
Output :
On the "/" route:
On the "/api/hello" route:
So, Basic server is done. Now We have to Dockerize it.
Dockerizing the server :
Here is the new file structure.
- Making the Dockerfile:
To dockerize the server, we first have to create a Dockerfile.
```dockerfile
FROM node:18-alpine
COPY package.json package.json
COPY package-lock.json package-lock.json
COPY index.js index.js
RUN npm install
ENTRYPOINT ["node", "index.js"]
```
Deep dive
Now let's understand the snippet and dive deeper into it.
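Here is the same Dockerfile with a comment on each instruction. One suggested tweak, not in the original: moving RUN npm install above COPY index.js lets Docker cache the installed dependencies, so editing only index.js does not re-run npm install on every build:

```dockerfile
# Base image: Node.js 18 on Alpine Linux (a very small distribution)
FROM node:18-alpine

# Copy the dependency manifests first...
COPY package.json package.json
COPY package-lock.json package-lock.json

# ...and install dependencies before copying the source,
# so this layer stays cached until package.json changes
RUN npm install

# Now copy the application code
COPY index.js index.js

# The command the container runs when it starts
ENTRYPOINT ["node", "index.js"]
```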
Making the Image from it:
docker build -t node-docker-course .
After running the command, we will see this in the terminal.
If we open Docker Desktop, we can see the image we just built under Images.
If you see it like this, it's done.
Publishing it to Docker Hub
- Make an account on Docker Hub
- Create a repository on Docker Hub
- Give it a name → copy the full name
- Go to your VS Code terminal
docker build -t ayush095/ayush-node-docker-course .
In Docker Desktop you can now find the newly tagged image.
Push it to Docker Hub:
docker push ayush095/ayush-node-docker-course
After the push is done you will see it like this.
And that's it. Here it is; click to see it.
Assignment: try to pull and run it on your own with the docker run -it command.
If I run it without port mapping, it looks like this: the server starts, but we can't reach it from the browser.
Now with port mapping. (Note: the image doesn't contain our .env file, so inside the container the app falls back to port 3000.)
docker run -p 5000:3000 -it ayush095/ayush-node-docker-course
After running it we can see the app on localhost:5000.
If you see it, congratulations, it's done. You have successfully dockerized a Node app, published it, and used it.
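You can also verify it from the terminal while the container is running; the expected body comes from the /api/hello handler we wrote (the curl line is commented since it needs the container up):

```shell
app_port=5000   # the host port we chose with -p 5000:3000

# curl "http://localhost:$app_port/api/hello"   # responds with: it's working
echo "app exposed on host port $app_port"
```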
Docker Compose
Now, let's talk about real life. An advanced project usually needs several services at once: a database, a cache, the app itself, and so on. Handling all of this with individual docker commands gets messy, so instead we write a docker-compose.yml file.
New file structure:
Making the Docker Compose file:
```yaml
version: "3.8"

services:
  postgres:
    image: postgres # pulls the postgres image from https://hub.docker.com
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: "Your name"
      POSTGRES_PASSWORD: yourpassword
      POSTGRES_DB: yourdatabase

  redis:
    image: redis
    ports:
      - "6379:6379"
```
This will pull the "postgres" and "redis" images from Docker Hub. We don't have to pull them manually; the Compose file handles that for us.
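And here, finally, is the one command this whole setup builds toward: docker compose up. A sketch, assuming docker-compose.yml sits in the current directory (older installs ship it as a separate docker-compose binary):

```shell
compose_up="docker compose up -d"   # create and start every service in the background

# $compose_up
# docker compose ps     # check that postgres and redis are running
# docker compose logs   # inspect output if something goes wrong
# docker compose down   # stop and remove the containers when you are done
echo "to start everything, run: $compose_up"
```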
Output:
Here we can see that the Redis and Postgres servers are up, along with the images they were created from.
Conclusion :
Docker simplifies software development by allowing you to package applications and their dependencies into lightweight, portable containers. With Docker, you can run the same app in any environment without worrying about setup conflicts. In this crash course, you learned how to:
Use Docker images and containers
Write a Dockerfile to containerize apps
Use docker-compose.yml to manage multi-container setups like PostgreSQL and Redis
Understand key Docker commands and workflows
With these fundamentals, you're now ready to explore more advanced Docker features and build reliable, scalable applications.