Real-Time Data Tracking in MongoDB with Change Streams (Node.js + Express)
The Beginning: A Challenge at Rivosys
I have recently been working at Rivosys, where we handle multiple enterprise-grade applications that generate large amounts of dynamic data: user actions, transactions, and system events that need to be monitored and logged in real time.
One day, while building a real-time audit and monitoring system, I hit a familiar problem:
"How can we track database changes (insert, update, delete) instantly, without writing manual cron jobs or polling MongoDB every few seconds?"
We were already using MongoDB Atlas, and that's when I discovered a game-changing feature:
MongoDB Change Streams
What Are MongoDB Change Streams?
Change Streams let you listen to real-time changes in your MongoDB collection (insertions, updates, and deletions) as they occur.
Instead of writing repeated queries, your application subscribes to these change events directly.
It's like having a live event listener on your database.
Whenever data changes, MongoDB pushes an event automatically.
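To make that concrete, here is a small, self-contained sketch of the event shape a change stream delivers and a simple handler for it. The sample event below is illustrative only; real events also carry fields such as _id (the resume token), ns, and clusterTime.

```javascript
// Sketch: the shape of a change event and a handler that describes it.
// Field names (operationType, fullDocument, documentKey, ns) are the ones
// the MongoDB Node.js driver emits; the sample values are made up.
function describeChange(change) {
  switch (change.operationType) {
    case "insert":
      return `insert into ${change.ns.coll}: ${JSON.stringify(change.fullDocument)}`;
    case "update":
      return `update of ${change.documentKey._id}`;
    case "delete":
      return `delete of ${change.documentKey._id}`;
    default:
      return `unhandled: ${change.operationType}`;
  }
}

const sampleEvent = {
  operationType: "insert",
  ns: { db: "logsdb", coll: "audit_logs" },
  documentKey: { _id: "illustrative-id" },
  fullDocument: { user: "Abhishek", action: "Login", status: "success" },
};

console.log(describeChange(sampleEvent));
```

In the real application, this function body is what you would run inside the stream's "change" event listener.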
Setting Up the Project
Here's how I implemented it in a Node.js + Express app to log every data change happening in a collection.
Step 1: Initialize the project
mkdir rivosys-change-stream
cd rivosys-change-stream
npm init -y
npm install express mongodb dotenv
npm install --save-dev nodemon

Step 2: Add a dev script in package.json
"scripts": {
"start": "node server.js",
"dev": "nodemon server.js"
}π§© server.js β Complete Working Code
import express from "express";
import { MongoClient } from "mongodb";
import dotenv from "dotenv";
dotenv.config();
const app = express();
const port = process.env.PORT || 3000;
const uri = process.env.MONGO_URI;
const dbName = process.env.DB_NAME;
const collectionName = process.env.COLLECTION_NAME;
async function main() {
const client = new MongoClient(uri);
await client.connect();
console.log("β
Connected to MongoDB Atlas");
const db = client.db(dbName);
const collection = db.collection(collectionName);
// Enable pre and post images (MongoDB 6.0+)
await db.command({
collMod: collectionName,
changeStreamPreAndPostImages: { enabled: true },
});
// Change Stream pipeline
const pipeline = [
{ $match: { operationType: { $in: ["insert", "update", "delete"] } } },
];
// Create a change stream
const changeStream = collection.watch(pipeline, {
fullDocument: "updateLookup",
fullDocumentBeforeChange: "required",
});
changeStream.on("change", async (change) => {
console.log("\nπ Change Detected:", change.operationType);
if (change.operationType === "insert") {
console.log("π₯ Inserted Document:", change.fullDocument);
} else if (change.operationType === "update") {
console.log("π‘ Before Update:", change.fullDocumentBeforeChange);
console.log("π’ After Update:", change.fullDocument);
} else if (change.operationType === "delete") {
console.log("β Deleted Document ID:", change.documentKey._id);
}
});
app.get("/", (req, res) => {
res.send("MongoDB Change Streams running successfully β
");
});
app.listen(port, () =>
console.log(`Server running on http://localhost:${port}`)
);
}
main().catch(console.error);

.env File
MONGO_URI=mongodb+srv://<username>:<password>@cluster0.mongodb.net/
DB_NAME=logsdb
COLLECTION_NAME=audit_logs
PORT=4000

Replace <username> and <password> with your MongoDB Atlas credentials.
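One production concern the server above does not cover is resumability: if the process restarts, events that occurred while it was down are lost unless you persist the resume token each event carries in its _id field and pass it back via the driver's resumeAfter option. Here is a hedged sketch; the change_stream_tokens collection name is my own illustrative choice, not a driver feature.

```javascript
// Sketch: build the watch() options so a previously saved resume token is
// reused when one exists. Persisting the token is up to the application.
function buildWatchOptions(savedToken) {
  const options = { fullDocument: "updateLookup" };
  if (savedToken) options.resumeAfter = savedToken;
  return options;
}

// Usage against a live connection (requires a replica set / Atlas):
//
//   const tokenStore = db.collection("change_stream_tokens");
//   const saved = await tokenStore.findOne({ _id: collectionName });
//   const stream = collection.watch([], buildWatchOptions(saved?.token));
//   stream.on("change", async (change) => {
//     // ...handle the event, then checkpoint change._id as the new token
//     await tokenStore.updateOne(
//       { _id: collectionName },
//       { $set: { token: change._id } },
//       { upsert: true }
//     );
//   });

console.log(buildWatchOptions(null));
```

With this pattern, a restarted process picks up exactly where the previous one stopped, as long as the token is still within the oplog window.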
Testing the Setup
npm run dev

Now, open your MongoDB collection (for example, in mongosh) and make a few changes:
db.audit_logs.insertOne({ user: "Abhishek", action: "Login", status: "success" });
db.audit_logs.updateOne({ user: "Abhishek" }, { $set: { status: "failed" } });
db.audit_logs.deleteOne({ user: "Abhishek" });

You'll see real-time console logs like:
Change Detected: insert
Inserted Document: { user: "Abhishek", action: "Login", status: "success" }

Change Detected: update
Before Update: { user: "Abhishek", action: "Login", status: "success" }
After Update: { user: "Abhishek", action: "Login", status: "failed" }

Change Detected: delete
Deleted Document ID: ObjectId("...")

The Magic Behind "Before and After"
When you use:
fullDocumentBeforeChange: "required"and enable:
db.runCommand({
collMod: "audit_logs",
changeStreamPreAndPostImages: { enabled: true }
});

MongoDB starts storing pre-images, which means you can access both:
- change.fullDocumentBeforeChange: the document before the update
- change.fullDocument: the document after the update
This feature is available from MongoDB 6.0 onward.
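With both images available, it becomes easy to compute which fields actually changed. A small helper like the following (my own, not part of the driver) handles flat documents; nested documents would need a recursive diff.

```javascript
// Sketch: compare pre- and post-images and list the changed top-level fields.
// Values are compared via JSON serialization, which is enough for flat docs.
function changedFields(before, after) {
  const keys = new Set([...Object.keys(before), ...Object.keys(after)]);
  const changes = {};
  for (const key of keys) {
    if (JSON.stringify(before[key]) !== JSON.stringify(after[key])) {
      changes[key] = { before: before[key], after: after[key] };
    }
  }
  return changes;
}

const beforeDoc = { user: "Abhishek", action: "Login", status: "success" };
const afterDoc = { user: "Abhishek", action: "Login", status: "failed" };
console.log(changedFields(beforeDoc, afterDoc));
// logs that only the "status" field changed
```

In the change listener, you would call this as changedFields(change.fullDocumentBeforeChange, change.fullDocument) on update events.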
Real-World Use Case at Rivosys
At Rivosys, we used this setup to build a transparent logging system.
Every update made in the app is instantly recorded with its before and after states, providing full traceability: no delays, no missed updates.
This approach helped us:
- Eliminate periodic polling
- Detect issues in real time
- Improve our audit reliability significantly
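As a sketch of that audit pattern (the field names here are illustrative, not our production schema), each change event can be flattened into one audit record capturing both states:

```javascript
// Sketch: map a change event into an audit-log record with both states.
// Missing images (e.g. fullDocument on deletes) are stored as null.
function toAuditRecord(change) {
  return {
    operation: change.operationType,
    documentId: change.documentKey?._id ?? null,
    before: change.fullDocumentBeforeChange ?? null,
    after: change.fullDocument ?? null,
    recordedAt: new Date().toISOString(),
  };
}

const record = toAuditRecord({
  operationType: "update",
  documentKey: { _id: "illustrative-id" },
  fullDocumentBeforeChange: { status: "success" },
  fullDocument: { status: "failed" },
});
console.log(record);
```

Inserting the returned record into a separate audit collection inside the "change" listener is all the wiring the logging system needs.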
Common Mistakes Developers Make
- Using standalone MongoDB: Change Streams require a replica set or sharded cluster (MongoDB Atlas clusters are replica sets by default).
- Running MongoDB older than 6.0: pre- and post-images are not supported there.
- Network Access not configured: always whitelist your IP in Atlas under Network Access.
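You can also verify the deployment type programmatically: the reply to MongoDB's hello command includes a setName field only on replica set members, while mongos reports msg: "isdbgrid". A small guard (the helper name is my own) might look like this:

```javascript
// Sketch: decide from a `hello` command reply whether Change Streams
// will work. Standalone servers report neither setName nor isdbgrid.
function supportsChangeStreams(helloReply) {
  return Boolean(helloReply.setName) || helloReply.msg === "isdbgrid";
}

// Usage with the Node.js driver, after connecting:
//   const reply = await client.db("admin").command({ hello: 1 });
//   if (!supportsChangeStreams(reply)) {
//     throw new Error("Change Streams need a replica set or sharded cluster");
//   }

console.log(supportsChangeStreams({ setName: "rs0" })); // true
console.log(supportsChangeStreams({ isWritablePrimary: true })); // false
```

Failing fast like this gives a clear error instead of the driver's generic "$changeStream is not supported" message at watch time.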
Final Thoughts
Working at Rivosys taught me how crucial real-time visibility is for modern applications.
MongoDB Change Streams gave us the power to track data as it happens, not after it happens.
Whether you're building audit trails, analytics dashboards, or live notification systems, Change Streams turn your database into a real-time event engine.
About the Author
I'm Abhishek Gupta, a Senior Software Engineer at Rivosys and a Data Science Trainer.
I love working on full-stack systems involving Node.js, FastAPI, and MongoDB, and I enjoy solving real-world challenges with simple, scalable solutions.
