
πŸ”„ Real-Time Data Tracking in MongoDB with Change Streams (Node.js + Express)

4 min read Β· Oct 12, 2025

🧠 The Beginning β€” A Challenge at Rivosys

I currently work at Rivosys, where we handle multiple enterprise-grade applications that generate large volumes of dynamic data: user actions, transactions, and system events that must be monitored and logged in real time.

One day, while building a real-time audit and monitoring system, I hit a familiar problem:

β€œHow can we track database changes (insert, update, delete) instantly β€” without writing manual cron jobs or polling MongoDB every few seconds?”

We were already using MongoDB Atlas, and that’s when I discovered a game-changing feature:
πŸ‘‰ MongoDB Change Streams

πŸ’‘ What Are MongoDB Change Streams?

Change Streams let you listen to real-time changes happening in your MongoDB collection β€” insertions, updates, and deletions β€” as they occur.

Instead of writing repeated queries, your application subscribes to these change events directly.

It’s like having a live event listener on your database.

Whenever data changes, MongoDB pushes an event automatically.

βš™οΈ Setting Up the Project

Here’s how I implemented it in a Node.js + Express app to log every data change happening in a collection.

Step 1: Initialize the project

mkdir rivosys-change-stream
cd rivosys-change-stream
npm init -y
npm install express mongodb dotenv
npm install --save-dev nodemon

Step 2: Configure package.json

Since server.js uses ES module imports, add `"type": "module"` alongside the scripts (without it, Node.js rejects the `import` syntax):

"type": "module",
"scripts": {
  "start": "node server.js",
  "dev": "nodemon server.js"
}

🧩 server.js β€” Complete Working Code

import express from "express";
import { MongoClient } from "mongodb";
import dotenv from "dotenv";

dotenv.config();

const app = express();
const port = process.env.PORT || 3000;

const uri = process.env.MONGO_URI;
const dbName = process.env.DB_NAME;
const collectionName = process.env.COLLECTION_NAME;

async function main() {
  const client = new MongoClient(uri);
  await client.connect();
  console.log("βœ… Connected to MongoDB Atlas");

  const db = client.db(dbName);
  const collection = db.collection(collectionName);

  // Enable pre- and post-images (MongoDB 6.0+)
  await db.command({
    collMod: collectionName,
    changeStreamPreAndPostImages: { enabled: true },
  });

  // Change Stream pipeline: only the operations we care about
  const pipeline = [
    { $match: { operationType: { $in: ["insert", "update", "delete"] } } },
  ];

  // Create a change stream
  const changeStream = collection.watch(pipeline, {
    fullDocument: "updateLookup",
    fullDocumentBeforeChange: "required",
  });

  changeStream.on("change", (change) => {
    console.log("\nπŸ”” Change Detected:", change.operationType);

    if (change.operationType === "insert") {
      console.log("πŸ“₯ Inserted Document:", change.fullDocument);
    } else if (change.operationType === "update") {
      console.log("🟑 Before Update:", change.fullDocumentBeforeChange);
      console.log("🟒 After Update:", change.fullDocument);
    } else if (change.operationType === "delete") {
      console.log("❌ Deleted Document ID:", change.documentKey._id);
    }
  });

  app.get("/", (req, res) => {
    res.send("MongoDB Change Streams running successfully βœ…");
  });

  app.listen(port, () =>
    console.log(`πŸš€ Server running on http://localhost:${port}`)
  );
}

main().catch(console.error);
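One production concern the code above skips: if the process restarts, any events that occurred while it was down are lost unless you resume from the last seen token. Every change event carries a resume token in `change._id`, and `watch()` accepts it via the `resumeAfter` option. As a minimal sketch (the file path and function names here are my own, not part of the article's code), you can persist the token to disk between restarts:

```javascript
import fs from "node:fs";

// Hypothetical location for the persisted token; use durable storage in production.
const TOKEN_FILE = "./resume-token.json";

// Save the resume token (change._id) after each processed event.
function saveResumeToken(token) {
  fs.writeFileSync(TOKEN_FILE, JSON.stringify(token));
}

// Load the last saved token, or undefined to start a fresh stream.
function loadResumeToken() {
  try {
    return JSON.parse(fs.readFileSync(TOKEN_FILE, "utf8"));
  } catch {
    return undefined;
  }
}
```

Wire it in with `collection.watch(pipeline, { ...options, resumeAfter: loadResumeToken() })` and call `saveResumeToken(change._id)` at the end of the change handler. It's also worth attaching a `changeStream.on("error", ...)` listener so a dropped connection doesn't fail silently.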

πŸ” .env File

MONGO_URI=mongodb+srv://<username>:<password>@cluster0.mongodb.net/
DB_NAME=logsdb
COLLECTION_NAME=audit_logs
PORT=4000

βœ… Replace <username> and <password> with your MongoDB Atlas credentials.

πŸ§ͺ Testing the Setup

npm run dev

Now connect to your database with mongosh (or the Atlas shell) and make a few changes:

db.audit_logs.insertOne({ user: "Abhishek", action: "Login", status: "success" });
db.audit_logs.updateOne({ user: "Abhishek" }, { $set: { status: "failed" } });
db.audit_logs.deleteOne({ user: "Abhishek" });

You’ll see real-time console logs like:

πŸ”” Change Detected: insert
πŸ“₯ Inserted Document: { user: "Abhishek", action: "Login", status: "success" }

πŸ”” Change Detected: update
🟑 Before Update: { user: "Abhishek", action: "Login", status: "success" }
🟒 After Update: { user: "Abhishek", action: "Login", status: "failed" }

πŸ”” Change Detected: delete
❌ Deleted Document ID: ObjectId("...")

🧩 The Magic Behind β€œBefore and After”

When you use:

fullDocumentBeforeChange: "required"

and enable:

db.runCommand({
  collMod: "audit_logs",
  changeStreamPreAndPostImages: { enabled: true }
});

MongoDB starts storing pre-images, which means you can access both:

  • change.fullDocumentBeforeChange β†’ document before update
  • change.fullDocument β†’ document after update

This feature is available from MongoDB 6.0 onward.
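Since the handler above only logs to the console, here is how those two fields might feed an actual audit record. This is a sketch of my own (the `toAuditRecord` helper is not part of the article's code); it assumes event objects shaped like the driver's output with pre- and post-images enabled:

```javascript
// Build a flat audit record from a change stream event.
// Assumes fullDocument / fullDocumentBeforeChange are configured as above.
function toAuditRecord(change) {
  const base = {
    op: change.operationType,
    id: change.documentKey._id,
  };
  switch (change.operationType) {
    case "insert":
      return { ...base, after: change.fullDocument };
    case "update":
      return {
        ...base,
        before: change.fullDocumentBeforeChange,
        after: change.fullDocument,
      };
    case "delete":
      // No fullDocument on deletes; the pre-image is all that survives.
      return { ...base, before: change.fullDocumentBeforeChange };
    default:
      return base;
  }
}
```

Note the delete case: without pre-images enabled, a delete event only tells you the `_id` of what vanished, which is exactly why this feature matters for audit trails.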

🧾 Real-World Use Case at Rivosys

At Rivosys, we used this setup to build a transparent logging system.
Every update made in the app is instantly recorded with its before and after states, providing full traceability β€” no delays, no missed updates.

This approach helped us:

  • Eliminate periodic polling
  • Detect issues in real-time
  • Improve our audit reliability significantly

⚠️ Common Mistakes Developers Make

❌ Using standalone MongoDB
β†’ Change Streams only work on replica sets or MongoDB Atlas.

⚠️ Older MongoDB versions (<6.0)
β†’ Don’t support pre- and post-images.

πŸ” Network Access not configured
β†’ Always whitelist your IP in Atlas under Network Access.
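For local development without Atlas, the standalone limitation above is easy to work around: a single-node replica set is enough for Change Streams. A sketch of the setup (the data path and the `rs0` name are illustrative):

# Start mongod with a replica set name (a single node is fine for dev)
mongod --replSet rs0 --dbpath /path/to/data --port 27017

# In another terminal, initiate the replica set once
mongosh --eval "rs.initiate()"

After that, a connection string like `mongodb://localhost:27017/?replicaSet=rs0` works with the same code shown earlier.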

🧠 Final Thoughts

Working at Rivosys taught me how crucial real-time visibility is for modern applications.
MongoDB Change Streams gave us the power to track data as it happens, not after it happens.

Whether you’re building audit trails, analytics dashboards, or live notification systems β€”
Change Streams turn your database into a real-time event engine.

✍️ About the Author

I’m Abhishek Gupta, a Senior Software Engineer at Rivosys and a Data Science Trainer.
I love working on full-stack systems involving Node.js, FastAPI, and MongoDB, and I enjoy solving real-world challenges with simple, scalable solutions.
