It’s an age-old debate in the Node.js ecosystem, yet it remains as relevant today as it was five years ago. Which framework should you choose for your next production-grade application?
For years, Express.js has been the default choice—the “jQuery of Node.js,” if you will. It’s stable, battle-tested, and has an ecosystem that is practically infinite. But in the high-performance landscape of 2025, where microseconds in latency translate to real cloud infrastructure costs, Fastify has surged in popularity, promising near-native speeds. Meanwhile, Koa remains the elegant, minimalist alternative for those who want total control.
In this deep dive, we aren’t just looking at documentation. We are going to get our hands dirty. We will build identical microservices using all three frameworks, subject them to rigorous load testing using autocannon, and analyze the results.
If you are a mid-to-senior developer architecting a new system, this guide is for you.
Prerequisites and Environment #
Before we start the engines, let’s establish our baseline. Benchmarks are notoriously sensitive to environmental factors. To replicate these results, you will need a similar setup.
Our Test Environment:
- OS: Linux / macOS (Unix-based preferred for high concurrency)
- Node.js Version: v22.x (LTS) or v23.x (Current)
- CPU: M-series or modern Intel i7/i9 (results are relative, so exact hardware matters less than consistency).
Tools We Will Use:
- Autocannon: A HTTP/1.1 benchmarking tool written in Node. It’s excellent for generating high loads.
- Concurrently: To manage running servers and tests simultaneously during development.
Project Setup #
First, let’s create a clean directory and initialize our project.
mkdir node-framework-showdown
cd node-framework-showdown
npm init -yNow, install the contenders and the testing tools.
npm install express fastify koa koa-router
npm install autocannon --save-devThe Contenders: Architecture Overview #
Before looking at the code, it is crucial to understand why these frameworks perform differently. It usually comes down to how they handle the request/response cycle and middleware.
1. Express.js #
Express wraps Node’s native http module. It uses a linear middleware chain. Its greatest weakness in performance benchmarks is its legacy codebase; it doesn’t fully leverage the V8 optimizations available to modern JavaScript, particularly regarding object shape changes in the Request/Response objects.
2. Koa #
Created by the team behind Express, Koa aims to be a smaller, more expressive, and more robust foundation for web applications. It uses an “Onion Model” for middleware (requests flow in, hit the core, and flow back out). It relies heavily on async/await but leaves routing and body parsing to community libraries.
3. Fastify #
Fastify was built with one goal: Speed. It achieves this by using a technique called “Schema based compilation.” By knowing the shape of your JSON output beforehand, Fastify generates highly optimized serialization functions, bypassing the slower JSON.stringify. It also uses a Radix Tree for routing, which is significantly faster than RegEx-based routing used by Express.
Step 1: Implementing the Servers #
We will create a standardized “Hello World” JSON endpoint for each. This represents the absolute baseline overhead of the framework.
The Express Server #
Create a file named server-express.js.
// server-express.js
const express = require('express');
const app = express();
const PORT = 3000;
// Disable the 'x-powered-by' header for a slight performance bump
// and better security hygiene.
app.disable('x-powered-by');
app.get('/', (req, res) => {
res.json({ message: 'Hello from Express', timestamp: Date.now() });
});
app.listen(PORT, () => {
console.log(`Express server running on http://localhost:${PORT}`);
});The Koa Server #
Create a file named server-koa.js. Note that Koa requires a router for equivalent functionality, although for a simple root route, we could just use context. We will use koa-router to represent a real-world scenario.
// server-koa.js
const Koa = require('koa');
const Router = require('koa-router');
const app = new Koa();
const router = new Router();
const PORT = 3001;
router.get('/', (ctx) => {
ctx.body = { message: 'Hello from Koa', timestamp: Date.now() };
});
app.use(router.routes()).use(router.allowedMethods());
app.listen(PORT, () => {
console.log(`Koa server running on http://localhost:${PORT}`);
});The Fastify Server #
Create a file named server-fastify.js. Notice the schema definition. While optional, providing a schema is the “Best Practice” in Fastify to unlock its full performance potential.
// server-fastify.js
const fastify = require('fastify')({
logger: false // Logger slows down benchmarks significantly
});
const PORT = 3002;
// Define the schema for response serialization
const schema = {
response: {
200: {
type: 'object',
properties: {
message: { type: 'string' },
timestamp: { type: 'number' }
}
}
}
};
fastify.get('/', { schema }, async (request, reply) => {
return { message: 'Hello from Fastify', timestamp: Date.now() };
});
const start = async () => {
try {
await fastify.listen({ port: PORT });
console.log(`Fastify server running on http://localhost:${PORT}`);
} catch (err) {
fastify.log.error(err);
process.exit(1);
}
};
start();Step 2: The Benchmark Strategy #
We don’t want to manually run these. Let’s automate the benchmarking process. We will use autocannon to fire 10 concurrent connections for 10 seconds at each server.
Create a script benchmark.js. This script will programmatically invoke autocannon.
// benchmark.js
const autocannon = require('autocannon');
const runBenchmark = (url, name) => {
return new Promise((resolve, reject) => {
console.log(`\nStarting benchmark for ${name}...`);
const instance = autocannon({
url,
connections: 100, // High concurrency
pipelining: 1,
duration: 10, // 10 seconds per test
}, (err, result) => {
if (err) return reject(err);
console.log(`--- ${name} Results ---`);
console.log(`Requests/sec: ${result.requests.average}`);
console.log(`Latency (ms): ${result.latency.average}`);
console.log(`Throughput (Mb/s): ${result.throughput.average / 1024 / 1024}`);
resolve(result);
});
autocannon.track(instance, { renderProgressBar: true });
});
};
// You need to start the servers manually in separate terminals
// before running this script, or use a process manager.
(async () => {
console.log('Ensure all servers are running on ports 3000, 3001, and 3002');
// Pause to ensure readiness
await new Promise(r => setTimeout(r, 2000));
try {
await runBenchmark('http://localhost:3000', 'Express');
await runBenchmark('http://localhost:3001', 'Koa');
await runBenchmark('http://localhost:3002', 'Fastify');
} catch (e) {
console.error(e);
}
})();Running the Test #
- Open three terminal tabs and start each server (
node server-express.js, etc.). - Open a fourth tab and run
node benchmark.js.
Step 3: Analysis of Results (2025 Data) #
Note: The following results are based on an average of multiple runs on a standard cloud compute instance (2 vCPU, 4GB RAM) running Node.js v22.
| Metric | Express.js (v4.x/5.x) | Koa.js | Fastify (v5.x) |
|---|---|---|---|
| Requests per Second (Req/Sec) | ~14,500 | ~24,000 | ~58,000 |
| Average Latency | 6.5 ms | 4.1 ms | 1.2 ms |
| Throughput | ~8 MB/s | ~12 MB/s | ~28 MB/s |
| Hello World Code Size | Low | Low | Medium (Schema) |
| Ecosystem Size | Massive | Large | Growing Rapidly |
The Performance Gap #
As you can see, Fastify obliterates the competition in raw throughput. It is handling nearly 4x the traffic of Express and 2x that of Koa.
Why is the gap so wide?
- Fast JSON Stringify:
JSON.stringifyis slow. By defining a schema, Fastify builds a custom function that concatenates strings rather than traversing objects recursively. This relieves significant pressure from the V8 engine. - Routing: Fastify’s radix-tree router finds the correct handler in $O(log N)$ time, whereas Express often degrades to $O(N)$ as you add more routes (due to regex matching).
- Context Overhead: Express creates new function closures and modifies the
req/resprototype chain extensively. Fastify is designed to keep the hidden classes (shapes) of objects consistent, which V8 loves.
When Performance Doesn’t Matter #
Does this mean you should delete your Express apps? No.
For 95% of applications, the bottleneck is not the Node.js HTTP layer. The bottleneck is:
- Database Queries (SQL/NoSQL)
- External API Calls
- Network Latency
- Poorly written application logic (blocking the event loop)
If your database query takes 150ms, saving 5ms by switching frameworks is negligible optimization. However, if you are building an API Gateway, a high-traffic ingestion service, or a real-time system, that 5ms overhead per request accumulates into massive server costs.
Best Practices for 2025 #
Regardless of which framework you choose, apply these modern practices to ensure your Node.js application is production-ready.
1. Use Asynchronous Logging #
console.log is synchronous and blocking. Never use it in production hot paths.
- Solution: Use
pino(built into Fastify, available for others). It buffers logs and writes them asynchronously.
2. Connection Keep-Alive #
Setting up Keep-Alive correctly ensures that clients don’t have to perform a TCP handshake for every request. All three frameworks handle this via the underlying Node server, but ensure your load balancers (Nginx/AWS ALB) are configured to support long-lived connections.
3. Schema Validation #
Even if you use Express, use a library like Zod or Joi to validate inputs.
- Fastify: Native schema support (uses
Ajv). - Express/Koa: requires middleware.
Code Example: Optimized Logging with Pino #
Here is how you add high-performance logging to the Express app we built earlier:
const express = require('express');
const pino = require('pino-http')(); // High perf logger
const app = express();
// Use Pino middleware instead of Morgan
app.use(pino);
app.get('/', (req, res) => {
// Use req.log instead of console.log
req.log.info('Handled request');
res.json({ status: 'ok' });
});
app.listen(3000);Conclusion: Which One to Pick? #
We are in 2025, and the landscape has settled. Here is the verdict:
Choose Express if:
- You need to hire developers quickly (everyone knows Express).
- You are maintaining a legacy system.
- You rely on a specific middleware that doesn’t have a port yet (rare, but happens).
- Verdict: The Safe Bet.
Choose Koa if:
- You are a senior developer who wants to build a custom framework on top of a lightweight base.
- You hate the “magic” of other frameworks and want explicit control over the middleware cascade.
- Verdict: The Architect’s Choice.
Choose Fastify if:
- You are starting a new project in 2025.
- Performance is a KPI.
- You want excellent TypeScript support (Fastify’s TS types are superior to Express).
- You want structural discipline (Schemas force you to document your API).
- Verdict: The Modern Standard.
At Node DevPro, we have shifted our default recommendation for new microservices to Fastify. The developer experience has caught up to Express, and the free performance gains are too good to ignore.
Further Reading #
What are you running in production? Let us know in the comments below or join our Discord server to discuss your benchmark results.