Node.js Is a Cancer

A provocative critique of Node.js examining its scalability claims, violation of Unix philosophy, and the pitfalls of using JavaScript on the server side.

If there's one thing web developers love, it's knowing something that's better than the traditional way. But the traditional way is traditional for one reason: that shit works. Something had been bugging me about all the Node.js hype, but I didn't have time to figure out what, until I read a pain-in-the-ass post from Ryan Dahl, the creator of Node.js. I would have forgotten about it like any other whining from some jackass about Unix being too hard. But like a cop who, sensing something wrong with that family in the minivan, pulls them over and finds fifty kilos of heroin, I felt something was off about this sob story, and maybe, just maybe, he has no idea what he's doing and has been programming unchecked for years.

Since you're reading this, you've probably figured out that my hunch was confirmed.

Node.js is a cancer on the programming community, not just because it's completely insane, but because the people who use it infect others who can't think for themselves, until eventually every moron I meet starts preaching about event loops. Have you accepted epoll into your heart?

A Scalability Disaster Waiting to Happen

Let's start with the biggest lie of them all: that Node.js is scalable because it "never blocks" (Radiation is good for you! Now in your toothpaste!). The Node.js website states:

Almost no function in Node directly performs I/O, so the process is never blocked. Because nothing blocks, less-than-expert programmers are able to develop fast systems.

This statement is tempting, reassuring, and completely f***ing wrong.

Let's start with a definition, since your specialty as know-it-alls is pedantry. A function call is called blocking when the execution of the calling thread is suspended until that function completes. Typically, we think of I/O operations as "blocking" — for example, if you call socket.read(), the program will wait for this call to complete because it needs to do something with the returned data.

Here's a fun fact: calling any function that uses the CPU is also blocking. This function that computes the N-th Fibonacci number will block the current thread because it uses the CPU:

function fibonacci(n) {
  if (n < 2)
    return 1;
  else
    return fibonacci(n - 2) + fibonacci(n - 1);
}

(Yes, I know about the closed-form solution. Shouldn't you be rehearsing in front of a mirror what you'll say when you finally work up the nerve to talk to Her?)
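Since I brought it up, here is what that closed form looks like: a sketch using Binet's formula, shifted by one to match the recursive definition above (which returns 1 for n < 2). Floating point keeps it exact only for small n, roughly up to n = 70:

```javascript
// Binet's formula, shifted to match fibonacci() above (fib(0) = fib(1) = 1).
// Exact only while the result fits in a double's 53-bit mantissa.
function fibonacciClosedForm(n) {
  var sqrt5 = Math.sqrt(5);
  var phi = (1 + sqrt5) / 2; // the golden ratio
  return Math.round(Math.pow(phi, n + 1) / sqrt5);
}
```

It returns instantly where the naive recursion takes seconds: fibonacciClosedForm(40) produces the same 165580141 you'll see in the curl output below.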

Let's see what happens with a Node.js program using this little gem as a request handler:

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end(String(fibonacci(40))); // res.end() wants a string or Buffer, not a number
}).listen(1337, "127.0.0.1");

On my previous laptop, the result is:

ted@lorenz:~$ time curl http://localhost:1337/
165580141
real 0m5.676s
user 0m0.010s
sys 0m0.000s

A five-second response time. Cool. We all know JavaScript isn't a blazingly fast language, but what's so bad about that? The problem is that Node's event model, and the brainwashed fanatics pushing it, have you believing that everything is fine. Here's some simple pseudocode showing how the event loop works:

while (1) {
  ready_file_descriptor = event_library->poll();
  handle_request(ready_file_descriptor);
}

This is all well and good if you know what you're doing, but selling it as a server programming paradigm multiplies the damage. If this loop runs in the same thread as handle_request, any competent programmer can see that a request handler will block the entire loop, and it doesn't matter how asynchronous your I/O library is.
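To make that concrete, here is a minimal sketch that simulates the dispatch loop above with plain synchronous calls: one CPU-bound handler, and every request queued behind it eats the full delay (burnCpu is a stand-in for fibonacci(40)):

```javascript
// Burn CPU for roughly `ms` milliseconds -- a stand-in for fibonacci(40).
function burnCpu(ms) {
  var end = Date.now() + ms;
  while (Date.now() < end) {} // the loop's only thread is stuck here
}

// Simulate the single-threaded dispatch loop: handlers run one at a time,
// and each request's wait time is however long its predecessors took.
function runLoop(handlers) {
  var waits = [];
  var start = Date.now();
  handlers.forEach(function (handler) {
    waits.push(Date.now() - start); // how long this request sat in the queue
    handler();
  });
  return waits;
}

// Three "concurrent" requests, each needing 50 ms of CPU: the third one
// cannot even start until the first two have finished.
var waits = runLoop([
  burnCpu.bind(null, 50),
  burnCpu.bind(null, 50),
  burnCpu.bind(null, 50)
]);
```

No amount of non-blocking I/O helps here; the third request waits at least 100 ms before its handler runs at all.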

So, given the above, let's see how my little Node server behaves under the most modest load — 10 requests, 5 concurrent:

ted@lorenz:~$ ab -n 10 -c 5 http://localhost:1337/
...
Requests per second: 0.17 [#/sec] (mean)
...

0.17 requests per second. A beast. Sure, Node lets you spawn child processes, but once you're coordinating processes on top of the event loop, the threading and event models are so tightly coupled that scalability is the least of your problems.

Given Node's marketing pitch, I'm damn scared of any of the "fast systems" that "less-than-expert programmers" will gift to this world.

Denying Unix Philosophy, Node Punishes the Developer

Long ago, the bearded men who stood at Unix's origins decided that chaining together small programs, each doing one job well, was a great idea, and that the universal interface between them should be text.

If you develop on the Unix platform following this principle, the operating system will reward you with simplicity and prosperity. For example, when web applications first appeared, a web application was simply a program that output text to stdout. The web server was responsible for accepting incoming requests, executing this program, and returning the result to the client. We called this CGI, and it was a good way to get work done until the micro-optimizers stuck their dirty fingers into it.
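For flavor, here is a minimal sketch of what a CGI-era "web application" amounted to: a program that reads the request from environment variables the server sets (QUERY_STRING is from the CGI spec) and writes header lines, a blank line, and a body to stdout. The web server execs it per request and relays the output:

```javascript
// The whole CGI contract: header lines, one blank line, then the body.
// The web server execs this program per request and relays its stdout.
function cgiResponse(queryString) {
  return [
    'Content-Type: text/plain',
    '', // blank line separates headers from body
    'Hello, your query string was: ' + queryString
  ].join('\n');
}

process.stdout.write(cgiResponse(process.env.QUERY_STRING || ''));
```

That's it. No framework, no embedded HTTP server; the program's only job is to turn input text into output text.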

Conceptually, any web application architecture that isn't brain cancer works exactly this way today: you have a web server whose job is to accept a request, parse it, and decide what to do with it next. This could be serving a static file, calling a CGI script, proxying the connection somewhere else — whatever. The point is that the HTTP server shouldn't do the application's work. Developers usually call this separation of concerns, and it exists for one reason: loosely coupled architectures are very easy to maintain.

And yet, Node seems to ignore this. Node ships (and don't laugh, I'm not making this up) its own HTTP server, and you're supposed to use it to serve incoming traffic. Yes, the http.createServer() call in my example above comes straight from the documentation.

If you search for "node.js deployment" on the internet, you'll find a bunch of people sticking Nginx in front of Node, and some use something called Fugue. That's another JavaScript HTTP server that spawns a bunch of processes to handle incoming requests, because nobody thought that all this "non-blocking" nonsense might have CPU performance problems.

If you use Node, there's a 99% chance you're both the developer and the sysadmin, because any real system administrator would have talked you out of using Node in the first place. So you, the developer, get punished with this orgy of HTTP proxying if you want to put a real web server in front of Node for serving static content, request rewriting, rate limiting, load balancing, SSL, or any of the other futuristic things modern HTTP servers can do. And yes, that's another layer in your system that needs monitoring.
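If you do end up in that proxying orgy, the front half usually looks something like this. A hedged sketch: the server name and filesystem path are made up, the port matches the example above, and the directives shown are standard nginx:

```nginx
# nginx in front of Node: static files served directly from disk,
# everything else proxied to the Node process listening on 1337.
server {
    listen 80;
    server_name example.com;            # hypothetical

    location /static/ {
        root /var/www/myapp;            # hypothetical path
    }

    location / {
        proxy_pass http://127.0.0.1:1337;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Which is to say: to get a production-grade front end, you reintroduce exactly the separation of concerns Node's built-in server let you skip.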

Though, let's be honest with ourselves — if you're a Node developer, you're probably running the application directly from Node, running in your screen session under your user account.

It's F***ing JavaScript

Perhaps the worst thing you can do with a server framework is write it in JavaScript.

if (typeof my_var !== "undefined" && my_var !== null) {
  // idiots, you've disgraced Rasmus Lerdorf
}

What is this, I can't even...

TL;DR

Node.js is unpleasant software, and I will not use it.


Update: translator's note

I'm also a JavaScript developer, and I've been watching Node.js with interest for a long time. It pains me too, on behalf of the language I love. But if you can set the pain aside, there is meaning, argument, and evidence to be found in the text above. I would very much like to see a coherent discussion; that is the only reason I translated it.
