Saturday, November 1, 2025

DA3: Dockerizing My Personal Portfolio — A Beginner-Friendly Guide

Introduction

For my DA3, I wanted to take my personal portfolio website and make it fully containerized using Docker. The idea was simple: what if anyone could run my entire portfolio (with all its interactive features) with just a single command, no setup headaches and no “it works on my machine” issues?

In this project, I split my work into three parts — setting up the frontend, building a backend with a persistent database, and finally combining everything using Docker Compose. The result is a neat little environment where my portfolio runs like a professional, multi-service app!


Objectives of Part 1

  • Create a Dockerfile for the frontend so it can run on any system.

  • Learn how to build and run a simple container using node:18-alpine.

  • Serve the portfolio locally and confirm it behaves exactly as before.


Objectives of Part 2

  • Build a lightweight backend API using Express.js.

  • Connect it to a SQLite database that stores messages permanently.

  • Dockerize the backend so it can run independently as a container.


Objectives of Part 3

  • Use Docker Compose to run both the frontend and backend together.

  • Add a terminal-style message system inside the portfolio (commands like send and inbox).

  • Test persistence by checking if messages survive after restarting containers.


Containers used and where to get them

Here are the main containers and base images I used:

  • 🐳 node:18-alpine – base image for both frontend and backend (Docker Hub link)

  • 🧩 (optional) nginx:alpine – for serving static builds (Docker Hub link)

  • 🗃️ sqlite – used directly as a file-based database, no container needed (official site)


Other software and what they do

  • Docker Engine & Docker Compose – the backbone of this project; handles containers and orchestration.

  • Node.js + npm – runs both the frontend and backend JavaScript code.

  • Express – used to create the backend API for sending and viewing messages.

  • SQLite3 – a lightweight database for message storage.

  • React – powers the portfolio frontend and terminal-style interface.

  • draw.io / diagrams.net – used to draw the final architecture diagram.

  • VS Code / DB Browser for SQLite – handy for debugging and checking the database file.


Overall Architecture

To visualize how it all fits together, I first asked an AI tool to generate a line diagram (and then redrew it neatly in draw.io).

Here’s how the architecture works:

  • Frontend container (React) → receives user commands and sends requests to the backend.

  • Backend container (Express + SQLite) → processes requests and saves data in messages.db.

  • Persistent volume → keeps the SQLite database safe, even when the container stops.

Inputs & Outputs:

  • Input: user messages entered via the terminal UI (send, setidentity).

  • Output: stored data in messages.db, displayed via inbox.

Once I had verified the layout, I redrew it in draw.io with readable labels and exported it as architecture_diagram.png for the report.


Architecture Description

My DA’s architecture is pretty straightforward — the frontend and backend are both Node-based containers that talk to each other using REST APIs. The frontend provides a command-line style interface where you can send and view messages.

The backend handles the logic and uses SQLite to store everything locally inside a mounted folder. Because of Docker volumes, even if you shut everything down, your messages stay safe — proving how powerful container persistence can be for beginners learning Docker.
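
To make the request flow concrete, here is a rough sketch of the browser-side calls behind the send and inbox commands. The /api/messages paths match the curl tests later in this post, but the API_BASE constant and the function names are illustrative assumptions, not the exact code in my repo (and in a real two-port setup the backend would also need CORS enabled or a dev-server proxy).

    // Illustrative sketch: API_BASE, sendMessage and fetchInbox are assumed names,
    // not necessarily what the actual portfolio code uses.
    const API_BASE = "http://localhost:4000"; // backend port published by Docker

    // "send" command: POST one message to the backend
    async function sendMessage(name, email, message) {
      const res = await fetch(`${API_BASE}/api/messages`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ name, email, message }),
      });
      return res.json();
    }

    // "inbox" command: GET everything stored in messages.db
    async function fetchInbox() {
      const res = await fetch(`${API_BASE}/api/messages`);
      return res.json();
    }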


Procedure — Part 1: Frontend Setup

Steps:

  1. Go into your frontend folder and make a Dockerfile like this:

    FROM node:18-alpine
    WORKDIR /app
    COPY package*.json ./
    RUN npm install
    COPY . .
    EXPOSE 3000
    CMD ["npm", "start"]
    
  2. Build and run:

    docker build -t portfolio-frontend .
    docker run -p 3000:3000 portfolio-frontend
    
  3. Visit http://localhost:3000 and confirm your portfolio loads inside the container.

📸 Screenshots to include:

  • Docker build process

  • Browser showing portfolio running

  • docker ps showing active container


Procedure — Part 2: Backend + SQLite

Steps:

  1. Create a simple Express app in backend/server.js that accepts POST and GET routes for messages (a rough sketch of what this file could look like follows this list).

  2. Dockerfile for backend:

    FROM node:18-alpine
    WORKDIR /app
    RUN apk add --no-cache python3 make g++
    COPY package*.json ./
    RUN npm install --production
    COPY . .
    EXPOSE 4000
    CMD ["node", "server.js"]
    
  3. Add a .env file:

    DB_PATH=/app/data/messages.db
    
  4. Create the data/ folder on your host.

  5. Build and run the backend:

    docker build -t portfolio-backend .
    docker run -p 4000:4000 --env-file .env -v "$(pwd)/data:/app/data" portfolio-backend
    
  6. Test it:

    curl -X POST http://localhost:4000/api/messages \
    -H "Content-Type: application/json" \
    -d '{"name":"Test","email":"test@demo.com","message":"Hello world!"}'
    curl http://localhost:4000/api/messages
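
Since step 1 only describes the Express app in words, here is a minimal sketch of what backend/server.js could look like, assuming the express and sqlite3 npm packages. The table and column names below are illustrative assumptions chosen to line up with the curl test above, not necessarily my exact schema.

    // Minimal sketch of backend/server.js (assumes the express and sqlite3 packages).
    // Table/column names are illustrative; adjust to your own schema.
    const express = require("express");
    const sqlite3 = require("sqlite3");

    const DB_PATH = process.env.DB_PATH || "/app/data/messages.db";
    const db = new sqlite3.Database(DB_PATH);

    // Create the table on first start so a fresh volume works out of the box
    db.run(`CREATE TABLE IF NOT EXISTS messages (
      id INTEGER PRIMARY KEY AUTOINCREMENT,
      name TEXT,
      email TEXT,
      message TEXT,
      created_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )`);

    const app = express();
    app.use(express.json());
    // If the React app calls this API from another port in the browser,
    // CORS middleware (e.g. the cors package) or a dev-server proxy is also needed.

    // POST /api/messages: store one message
    app.post("/api/messages", (req, res) => {
      const { name, email, message } = req.body;
      db.run(
        "INSERT INTO messages (name, email, message) VALUES (?, ?, ?)",
        [name, email, message],
        function (err) {
          if (err) return res.status(500).json({ error: err.message });
          res.json({ id: this.lastID });
        }
      );
    });

    // GET /api/messages: return all stored messages, newest first
    app.get("/api/messages", (req, res) => {
      db.all("SELECT * FROM messages ORDER BY id DESC", (err, rows) => {
        if (err) return res.status(500).json({ error: err.message });
        res.json(rows);
      });
    });

    app.listen(4000, () => console.log("Backend listening on port 4000"));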
    

📸 Screenshots to include:

  • Curl test results

  • SQLite query output (SELECT * FROM messages;)

  • Docker logs showing server start


Procedure — Part 3: Combine Everything

Steps:

  1. Create this docker-compose.yml in your root folder:

    version: "3.9"
    services:
      backend:
        build: ./portfolio_backend
        ports:
          - "4000:4000"
        volumes:
          - ./portfolio_backend/data:/app/data
        environment:
          - DB_PATH=/app/data/messages.db
    
      frontend:
        build: ./frontend
        ports:
          - "3000:3000"
        depends_on:
          - backend
    
  2. Start both with:

    docker compose up --build
    
  3. Try using your terminal UI commands:

    • setidentity <name> <email>

    • send <message>

    • inbox

  4. Restart containers and confirm the database file still has your messages!
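
One quick way to check step 4 is to tear the stack down, bring it back up, and repeat the GET request from Part 2; because messages.db lives in the mounted data/ folder, the earlier messages should still come back.

    docker compose down      # containers are removed, but data/ stays on the host
    docker compose up -d     # start both services again in the background
    curl http://localhost:4000/api/messages   # previously sent messages should still appear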

📸 Screenshots to include:

  • Portfolio terminal running commands

  • Inbox output in the UI

  • Data folder showing messages.db


Modifications Made to Containers

  1. Switched to lightweight node:18-alpine base images.

  2. Added the build tools (python3, make, g++) needed to compile the sqlite3 native module on Alpine.

  3. Defined DB_PATH environment variable for flexibility.

  4. Mounted local data/ folder as a volume for persistence.

  5. Updated compose file to automatically start both services together.


My GitHub & Docker Hub Links


Outcomes

By the end of this DA, I:

  • Learned how to create Dockerfiles for real projects.

  • Understood how docker compose simplifies multi-container setups.

  • Got hands-on experience with persistence using volumes.

  • Successfully made my portfolio app portable and easy to deploy anywhere.


Conclusion

This DA was a great first step into Docker. What started as “let’s just containerize my website” turned into a full setup with a database, an API, and multi-container orchestration. It really made me appreciate how Docker takes care of setup complexity, so developers (especially beginners) can focus on building features instead of fixing environments.

If you’re just getting into Docker, I highly recommend trying to containerize something small, like your portfolio — it’s surprisingly fun to see your entire project spin up with one command.


References and Acknowledgements


Recommended Images for the Blog

  1. Cover Image: Portfolio terminal interface (hero_terminal.png)

  2. Architecture Diagram: Clean draw.io diagram (architecture_diagram.png)

  3. Frontend Build Screenshot: (part1_build.png)

  4. Frontend in Browser: (part1_browser.png)

  5. Backend API Test: (part2_post.png)

  6. SQLite DB Output: (part2_db.png)

  7. Docker Compose Output: (part3_compose.png)

  8. Persistent Data Folder: (part3_datafile.png)


“Learning Docker is like unlocking the cheat code for reproducibility — everything just works, everywhere.”

Satwik Nukala

 Tutorial Video:


 
