Node.js has undergone a remarkable transformation since its early days. If you’ve been writing Node.js for several years, you’ve likely witnessed this evolution firsthand—from the callback-heavy, CommonJS-dominated landscape to today’s clean, standards-based development experience.
The changes aren’t just cosmetic; they represent a fundamental shift in how we approach server-side JavaScript development. Modern Node.js embraces web standards, reduces external dependencies, and provides a more intuitive developer experience. Let’s explore these transformations and understand why they matter for your applications in 2025.
The module system is perhaps where you’ll notice the biggest difference. CommonJS served us well, but ES Modules (ESM) have become the clear winner, offering better tooling support and alignment with web standards.
Let’s look at how we used to structure modules. This approach required explicit exports and synchronous imports:
// math.js
function add(a, b) {
  return a + b;
}

module.exports = { add };

// app.js
const { add } = require('./math');

console.log(add(2, 3));
This worked fine, but it had limitations—no static analysis, no tree-shaking, and it didn’t align with browser standards.
Modern Node.js development embraces ES Modules with a crucial addition: the node: prefix for built-in modules. This explicit naming prevents confusion and makes dependencies crystal clear:
// math.js
export function add(a, b) {
  return a + b;
}

// app.js
import { add } from './math.js';
import { readFile } from 'node:fs/promises'; // Modern node: prefix
import { createServer } from 'node:http';

console.log(add(2, 3));
The node: prefix is more than just a convention; it's a clear signal to both developers and tools that you're importing Node.js built-ins rather than npm packages. This prevents potential conflicts and makes your code more explicit about its dependencies.
One of the most game-changing features is top-level await. No more wrapping your entire application in an async function just to use await at the module level:
// app.js - Clean initialization without wrapper functions
import { readFile } from 'node:fs/promises';
import { createServer } from 'node:http';

const config = JSON.parse(await readFile('config.json', 'utf8'));
const server = createServer(/* ... */);

console.log('App started with config:', config.appName);
This eliminates the common pattern of immediately-invoked async function expressions (IIFE) that we used to see everywhere. Your code becomes more linear and easier to reason about.
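For comparison, here's a sketch of the async IIFE wrapper that top-level await makes unnecessary (it reuses the config-loading example above):
// Old way - wrap everything in an async IIFE just to use await
import { readFile } from 'node:fs/promises';

(async () => {
  try {
    const config = JSON.parse(await readFile('config.json', 'utf8'));
    console.log('App started with config:', config.appName);
  } catch (error) {
    console.error('Startup failed:', error);
    process.exit(1);
  }
})();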
Node.js has embraced web standards in a big way, bringing APIs that web developers already know directly into the runtime. This means fewer dependencies and more consistency across environments.
Remember when every project needed axios, node-fetch, or similar libraries for HTTP requests? Those days are over. Node.js now includes the Fetch API natively:
// Old way - external dependencies required
const axios = require('axios');
const response = await axios.get('https://api.example.com/data');
// Modern way - built-in fetch with enhanced features
const response = await fetch('https://api.example.com/data');
const data = await response.json();
But the modern approach goes beyond just replacing your HTTP library. You get sophisticated timeout and cancellation support built-in:
async function fetchData(url) {
  try {
    const response = await fetch(url, {
      signal: AbortSignal.timeout(5000) // Built-in timeout support
    });

    if (!response.ok) {
      throw new Error(`HTTP ${response.status}: ${response.statusText}`);
    }

    return await response.json();
  } catch (error) {
    if (error.name === 'TimeoutError') {
      throw new Error('Request timed out');
    }
    throw error;
  }
}
This approach eliminates the need for timeout libraries and provides a consistent error handling experience. The AbortSignal.timeout() method is particularly elegant: it creates a signal that automatically aborts after the specified time.
Modern applications need to handle cancellation gracefully, whether it’s user-initiated or due to timeouts. AbortController provides a standardized way to cancel operations:
// Cancel long-running operations cleanly
const controller = new AbortController();

// Set up automatic cancellation
setTimeout(() => controller.abort(), 10000);

try {
  const response = await fetch('https://slow-api.com/data', {
    signal: controller.signal
  });
  const data = await response.json();
  console.log('Data received:', data);
} catch (error) {
  if (error.name === 'AbortError') {
    console.log('Request was cancelled - this is expected behavior');
  } else {
    console.error('Unexpected error:', error);
  }
}
This pattern works across many Node.js APIs, not just fetch. You can use the same AbortController with file operations, database queries, and any async operation that supports cancellation.
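As a minimal sketch of that broader support, here's the same cancellation idea applied to a file read; the path and timeout are placeholders, and readFile from node:fs/promises accepts the same signal option:
import { readFile } from 'node:fs/promises';

const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 5000);

try {
  // The signal option works here just like it does with fetch
  const contents = await readFile('./large-file.json', {
    encoding: 'utf8',
    signal: controller.signal
  });
  console.log('Read', contents.length, 'characters');
} catch (error) {
  if (error.name === 'AbortError') {
    console.log('File read was cancelled');
  } else {
    throw error;
  }
} finally {
  clearTimeout(timer);
}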
Testing used to require choosing between Jest, Mocha, Ava, or other frameworks. Node.js now includes a full-featured test runner that covers most testing needs without any external dependencies.
The built-in test runner provides a clean, familiar API that feels modern and complete:
// test/math.test.js
import { test, describe } from 'node:test';
import assert from 'node:assert';
import { add, multiply } from '../math.js';

describe('Math functions', () => {
  test('adds numbers correctly', () => {
    assert.strictEqual(add(2, 3), 5);
  });

  test('handles async operations', async () => {
    const result = await multiply(2, 3);
    assert.strictEqual(result, 6);
  });

  test('throws on invalid input', () => {
    assert.throws(() => add('a', 'b'), /Invalid input/);
  });
});
What makes this particularly powerful is how seamlessly it integrates with the Node.js development workflow:
# Run all tests with built-in runner
node --test
# Watch mode for development
node --test --watch
# Coverage reporting (Node.js 20+)
node --test --experimental-test-coverage
The watch mode is especially valuable during development—your tests re-run automatically as you modify code, providing immediate feedback without any additional configuration.
While async/await isn’t new, the patterns around it have matured significantly. Modern Node.js development leverages these patterns more effectively and combines them with newer APIs.
Modern error handling combines async/await with sophisticated error recovery and parallel execution patterns:
import { readFile, writeFile } from 'node:fs/promises';

async function processData() {
  try {
    // Parallel execution of independent operations
    // (fetch in Node.js requires an absolute URL)
    const [config, userData] = await Promise.all([
      readFile('config.json', 'utf8'),
      fetch('https://api.example.com/user').then(r => r.json())
    ]);

    // processUserData is an application-specific helper
    const processed = processUserData(userData, JSON.parse(config));
    await writeFile('output.json', JSON.stringify(processed, null, 2));

    return processed;
  } catch (error) {
    // Structured error logging with context
    console.error('Processing failed:', {
      error: error.message,
      stack: error.stack,
      timestamp: new Date().toISOString()
    });
    throw error;
  }
}
This pattern combines parallel execution for performance with comprehensive error handling. Promise.all() ensures that independent operations run concurrently, while the try/catch provides a single point for error handling with rich context.
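When the independent operations are allowed to fail individually, Promise.allSettled() is a useful variant of the same pattern; the sketch below (with placeholder URLs) collects partial results instead of failing fast:
const results = await Promise.allSettled([
  fetch('https://api.example.com/users').then(r => r.json()),
  fetch('https://api.example.com/orders').then(r => r.json())
]);

// Keep whatever succeeded and log whatever failed
const [users, orders] = results.map(result =>
  result.status === 'fulfilled' ? result.value : null
);

for (const result of results) {
  if (result.status === 'rejected') {
    console.warn('One data source failed:', result.reason.message);
  }
}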
Event-driven programming has evolved beyond simple event listeners. AsyncIterators provide a more powerful way to handle streams of events:
import { EventEmitter } from 'node:events';

class DataProcessor extends EventEmitter {
  async *processStream() {
    for (let i = 0; i < 10; i++) {
      this.emit('data', `chunk-${i}`);
      yield `processed-${i}`;
      // Simulate async processing time
      await new Promise(resolve => setTimeout(resolve, 100));
    }
    this.emit('end');
  }
}

// Consume events as an async iterator
const processor = new DataProcessor();
for await (const result of processor.processStream()) {
  console.log('Processed:', result);
}
This approach is particularly powerful because it combines the flexibility of events with the control flow of async iteration. You can process events in sequence, handle backpressure naturally, and break out of processing loops cleanly.
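The built-in events.on() helper applies the same idea to plain events: it turns any EventEmitter into an async iterable. A minimal sketch, using an AbortSignal to end the loop:
import { EventEmitter, on } from 'node:events';

const emitter = new EventEmitter();
const controller = new AbortController();

// Emit a few events asynchronously, then stop listening
setTimeout(() => {
  emitter.emit('message', 'first');
  emitter.emit('message', 'second');
}, 10);
setTimeout(() => controller.abort(), 50);

try {
  // Each iteration yields the event's arguments as an array
  for await (const [payload] of on(emitter, 'message', { signal: controller.signal })) {
    console.log('Received:', payload);
  }
} catch (error) {
  if (error.name !== 'AbortError') throw error; // aborting is how the loop ends
}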
Streams remain one of Node.js’s most powerful features, but they’ve evolved to embrace web standards and provide better interoperability.
Stream processing has become more intuitive with better APIs and clearer patterns:
import { Readable, Transform } from 'node:stream';
import { pipeline } from 'node:stream/promises';
import { createReadStream, createWriteStream } from 'node:fs';

// Create transform streams with clean, focused logic
const upperCaseTransform = new Transform({
  objectMode: true,
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// Process files with robust error handling
async function processFile(inputFile, outputFile) {
  try {
    await pipeline(
      createReadStream(inputFile),
      upperCaseTransform,
      createWriteStream(outputFile)
    );
    console.log('File processed successfully');
  } catch (error) {
    console.error('Pipeline failed:', error);
    throw error;
  }
}
The pipeline() function with promises provides automatic cleanup and error handling, eliminating many of the traditional pain points of stream processing.
Modern Node.js can seamlessly work with Web Streams, enabling better compatibility with browser code and edge runtime environments:
// Create a Web Stream (compatible with browsers)
const webReadable = new ReadableStream({
  start(controller) {
    controller.enqueue('Hello ');
    controller.enqueue('World!');
    controller.close();
  }
});

// Convert between Web Streams and Node.js streams
const nodeStream = Readable.fromWeb(webReadable);
const backToWeb = Readable.toWeb(nodeStream);
This interoperability is crucial for applications that need to run in multiple environments or share code between server and client.
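A common place this matters is fetch: response.body is a Web ReadableStream, so it can be dropped straight into a Node.js pipeline. A small sketch with a placeholder URL and output file:
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';
import { createWriteStream } from 'node:fs';

// Convert the Web ReadableStream from fetch into a Node.js stream and write it to disk
const response = await fetch('https://example.com/report.csv');
await pipeline(
  Readable.fromWeb(response.body),
  createWriteStream('report.csv')
);
console.log('Download complete');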
JavaScript’s single-threaded nature isn’t always ideal for CPU-intensive work. Worker threads provide a way to leverage multiple cores effectively while maintaining the simplicity of JavaScript.
Worker threads are perfect for computationally expensive tasks that would otherwise block the main event loop:
// worker.js - Isolated computation environment
import { parentPort, workerData } from 'node:worker_threads';

function fibonacci(n) {
  if (n < 2) return n;
  return fibonacci(n - 1) + fibonacci(n - 2);
}

const result = fibonacci(workerData.number);
parentPort.postMessage(result);
The main application can delegate heavy computations without blocking other operations:
// main.js - Non-blocking delegation
import { Worker } from 'node:worker_threads';
import { fileURLToPath } from 'node:url';

async function calculateFibonacci(number) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(
      fileURLToPath(new URL('./worker.js', import.meta.url)),
      { workerData: { number } }
    );

    worker.on('message', resolve);
    worker.on('error', reject);
    worker.on('exit', (code) => {
      if (code !== 0) {
        reject(new Error(`Worker stopped with exit code ${code}`));
      }
    });
  });
}

// Your main application remains responsive
console.log('Starting calculation...');
const result = await calculateFibonacci(40);
console.log('Fibonacci result:', result);
console.log('Application remained responsive throughout!');
This pattern allows your application to utilize multiple CPU cores while keeping the familiar async/await programming model.
Modern Node.js prioritizes developer experience with built-in tools that previously required external packages or complex configurations.
Development workflow has been significantly streamlined with built-in watch mode and environment file support:
{
  "name": "modern-node-app",
  "type": "module",
  "engines": {
    "node": ">=20.0.0"
  },
  "scripts": {
    "dev": "node --watch --env-file=.env app.js",
    "test": "node --test --watch",
    "start": "node app.js"
  }
}
The --watch flag eliminates the need for nodemon, while --env-file removes the dependency on dotenv. Your development environment becomes simpler and faster:
// .env file automatically loaded with --env-file
// DATABASE_URL=postgres://localhost:5432/mydb
// API_KEY=secret123
// app.js - Environment variables available immediately
console.log('Connecting to:', process.env.DATABASE_URL);
console.log('API Key loaded:', process.env.API_KEY ? 'Yes' : 'No');
These features make development more pleasant by reducing configuration overhead and eliminating restart cycles.
Security and performance have become first-class concerns with built-in tools for monitoring and controlling application behavior.
The experimental permission model allows you to restrict what your application can access, following the principle of least privilege:
# Run with restricted file system access
node --experimental-permission --allow-fs-read=./data --allow-fs-write=./logs app.js
# Network restrictions
node --experimental-permission --allow-net=api.example.com app.js
# Note: the --allow-net flag is not available yet; the PR has been merged in the Node.js repo and it will ship in a future release
This is particularly valuable for applications that process untrusted code or need to demonstrate security compliance.
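When the permission model is enabled, you can also check what was granted at runtime and degrade gracefully instead of crashing. A sketch, assuming the process was started with flags like the ones above:
// app.js - check granted permissions before touching the file system
// process.permission is part of the permission model API (Node.js 20+)
if (process.permission?.has('fs.write', './logs')) {
  console.log('Log directory is writable, enabling file logging');
} else {
  console.warn('No write permission for ./logs, falling back to console-only logging');
}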
Performance monitoring is now built into the platform, eliminating the need for external APM tools for basic monitoring:
import { PerformanceObserver, performance } from 'node:perf_hooks';

// Set up automatic performance monitoring
const obs = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.duration > 100) { // Log slow operations
      console.log(`Slow operation detected: ${entry.name} took ${entry.duration}ms`);
    }
  }
});

// Include 'measure' so the custom measurement below is reported
obs.observe({ entryTypes: ['measure', 'function', 'http', 'dns'] });

// Instrument your own operations
async function processLargeDataset(data) {
  performance.mark('processing-start');

  const result = await heavyProcessing(data); // heavyProcessing: your application logic

  performance.mark('processing-end');
  performance.measure('data-processing', 'processing-start', 'processing-end');

  return result;
}
This provides visibility into application performance without external dependencies, helping you identify bottlenecks early in development.
Modern Node.js makes application distribution simpler with features like single executable applications and improved packaging.
You can now bundle your Node.js application into a single executable file, simplifying deployment and distribution:
# Create a self-contained executable
node --experimental-sea-config sea-config.json
The configuration file defines how your application gets bundled:
{
  "main": "app.js",
  "output": "my-app-bundle.blob",
  "disableExperimentalSEAWarning": true
}
This is particularly valuable for CLI tools, desktop applications, or any scenario where you want to distribute your application without requiring users to install Node.js separately.
Error handling has evolved beyond simple try/catch blocks to include structured error handling and comprehensive diagnostics.
Modern applications benefit from structured, contextual error handling that provides better debugging information:
class AppError extends Error {
  constructor(message, code, statusCode = 500, context = {}) {
    super(message);
    this.name = 'AppError';
    this.code = code;
    this.statusCode = statusCode;
    this.context = context;
    this.timestamp = new Date().toISOString();
  }

  toJSON() {
    return {
      name: this.name,
      message: this.message,
      code: this.code,
      statusCode: this.statusCode,
      context: this.context,
      timestamp: this.timestamp,
      stack: this.stack
    };
  }
}

// Usage with rich context
throw new AppError(
  'Database connection failed',
  'DB_CONNECTION_ERROR',
  503,
  { host: 'localhost', port: 5432, retryAttempt: 3 }
);
This approach provides much richer error information for debugging and monitoring, while maintaining a consistent error interface across your application.
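As an illustration of that consistent interface, here's a sketch of a plain node:http handler that reuses the AppError class above; the route and error details are placeholders:
import { createServer } from 'node:http';

const server = createServer(async (req, res) => {
  try {
    // Placeholder: any application logic that may throw an AppError
    throw new AppError('User not found', 'USER_NOT_FOUND', 404, { path: req.url });
  } catch (error) {
    const payload = error instanceof AppError
      ? error.toJSON()
      : { name: 'Error', message: 'Internal server error', statusCode: 500 };

    console.error('Request failed:', payload); // structured log with full context
    res.writeHead(payload.statusCode ?? 500, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ code: payload.code, message: payload.message }));
  }
});

server.listen(3000);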
Node.js includes sophisticated diagnostic capabilities that help you understand what’s happening inside your application:
import diagnostics_channel from 'node:diagnostics_channel';

// Create custom diagnostic channels
const dbChannel = diagnostics_channel.channel('app:database');
const httpChannel = diagnostics_channel.channel('app:http');

// Subscribe to diagnostic events
dbChannel.subscribe((message) => {
  console.log('Database operation:', {
    operation: message.operation,
    duration: message.duration,
    query: message.query
  });
});

// Publish diagnostic information
async function queryDatabase(sql, params) {
  const start = performance.now();
  try {
    const result = await db.query(sql, params);
    dbChannel.publish({
      operation: 'query',
      sql,
      params,
      duration: performance.now() - start,
      success: true
    });
    return result;
  } catch (error) {
    dbChannel.publish({
      operation: 'query',
      sql,
      params,
      duration: performance.now() - start,
      success: false,
      error: error.message
    });
    throw error;
  }
}
This diagnostic information can be consumed by monitoring tools, logged for analysis, or used to trigger automatic remediation actions.
Package management and module resolution have become more sophisticated, with better support for monorepos, internal packages, and flexible module resolution.
Modern Node.js supports subpath imports, defined in the imports field of package.json, allowing you to create clean internal module references:
{
  "imports": {
    "#config": "./src/config/index.js",
    "#utils/*": "./src/utils/*.js",
    "#db": "./src/database/connection.js"
  }
}
This creates a clean, stable interface for internal modules:
// Clean internal imports that don't break when you reorganize
import config from '#config';
import { logger, validator } from '#utils/common';
import db from '#db';
These internal imports make refactoring easier and provide a clear distinction between internal and external dependencies.
Dynamic imports enable sophisticated loading patterns, including conditional loading and code splitting:
// Load features based on configuration or environment
async function loadDatabaseAdapter() {
  const dbType = process.env.DATABASE_TYPE || 'sqlite';

  try {
    // Assumes an additional '#db/adapters/*' subpath mapping alongside the ones shown above
    const adapter = await import(`#db/adapters/${dbType}`);
    return adapter.default;
  } catch (error) {
    console.warn(`Database adapter ${dbType} not available, falling back to sqlite`);
    const fallback = await import('#db/adapters/sqlite');
    return fallback.default;
  }
}

// Conditional feature loading
async function loadOptionalFeatures() {
  const features = [];

  if (process.env.ENABLE_ANALYTICS === 'true') {
    const analytics = await import('#features/analytics');
    features.push(analytics.default);
  }

  if (process.env.ENABLE_MONITORING === 'true') {
    const monitoring = await import('#features/monitoring');
    features.push(monitoring.default);
  }

  return features;
}
This pattern allows you to build applications that adapt to their environment and only load the code they actually need.
As we look at the current state of Node.js development, several key principles emerge:
Embrace Web Standards: Use node: prefixes, the fetch API, AbortController, and Web Streams for better compatibility and reduced dependencies
Leverage Built-in Tools: The test runner, watch mode, and environment file support reduce external dependencies and configuration complexity
Think in Modern Async Patterns: Top-level await, structured error handling, and async iterators make code more readable and maintainable
Use Worker Threads Strategically: For CPU-intensive tasks, worker threads provide true parallelism without blocking the main thread
Adopt Progressive Enhancement: Use permission models, diagnostics channels, and performance monitoring to build robust, observable applications
Optimize for Developer Experience: Watch mode, built-in testing, and import maps create a more pleasant development workflow
Plan for Distribution: Single executable applications and modern packaging make deployment simpler
The transformation of Node.js from a simple JavaScript runtime to a comprehensive development platform is remarkable. By adopting these modern patterns, you’re not just writing contemporary code—you’re building applications that are more maintainable, performant, and aligned with the broader JavaScript ecosystem.
The beauty of modern Node.js lies in its evolution while maintaining backward compatibility. You can adopt these patterns incrementally, and they work alongside existing code. Whether you’re starting a new project or modernizing an existing one, these patterns provide a clear path toward more robust, enjoyable Node.js development.
As we move through 2025, Node.js continues to evolve, but the foundational patterns we’ve explored here provide a solid base for building applications that will remain modern and maintainable for years to come.