In the meantime, the old workhorse language of the Internet, PHP, gets a steady stream of criticism. Among the 14k Google hits for "PHP sucks" (exact term), people say the funniest, most terrible things about the language, while some of the critique is actually quite well balanced. Node.js introduces at least two new things (for a broader audience).
The other thing that makes Node.js different is that it's completely asynchronous and event-driven. Node is based on the realization that a lot of computer code actually just sits idle, waiting for I/O most of the time: waiting for a file to be written to disk, or for a MySQL query to return data. To take advantage of that, more or less every function in Node.js is non-blocking.
When you ask Node to open a file, you don't wait for it to return. Instead, you tell Node which function to pass the results to and get on with executing other statements. This leads to a dramatically different way of structuring your code, with deeply nested callbacks, anonymous functions and closures. You end up with something like this:
It's quite easy to end up with very deep nesting, which in my opinion sometimes hurts code readability. But compared to what gets said about PHP, that's very mild criticism. And, oh! The third thing that is quite different is that with Node.js, you don't have to use a separate http(s) server. It's quite common to put Node.js behind an Nginx proxy, but that's not strictly needed. So the heart of a typical Node.js web application is the implementation of the actual web server.
A fair way to compare
So no, it's not fair to say that we compare Node.js and PHP. What we really compare is Node.js and PHP+Apache2 (or any other http server). For this article, I've used Apache2 and mod_php since it's by far the most common configuration. Some might say that I'd get much better results if I had used Nginx or Lighttpd as the http server for PHP. That's most likely true, but at the end of the day, server side PHP depends on running in multiple separate processes, regardless of whether we create those processes with mod_php, FastCGI or some other mechanism. So I'm sticking with the standard server setup for PHP, and I think that makes good sense.
The performance testing environment
So we're pitting PHP+Apache2 against a Node.js based application. To keep things reasonable, I've created a very (really, very) simple application in both PHP5 and Node.js. The application gets 50 rows of data from a WordPress installation and outputs them as a JSON string. That's it, nothing more. The benefit of keeping it this simple was (a) that I didn't have to worry about too many implementation differences between the two languages and (b), more importantly, that we're not testing my ability to code; we're really testing the difference in architecture between the two. The server we're using for this test is a virtual server with:
- 1 x Core Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz
- 2 GB RAM.
- OS is 64-bit Ubuntu 12.10, installed fresh before running these tests.
- We installed the Load Impact Server metric agent.
For the tests, we're using:
- Apache/2.2.22 and
- PHP 5.4.6.
- Node.js version 0.8.18 (built using this script)
- MySQL is version 5.5.29.
- The data table in the tests is the options table from a random WordPress blog.
The scripts we're using:
The PHP script is obviously much shorter, but on the other hand it doesn't have to implement a full http server either.
Running the performance tests
The Load Impact performance test configurations are also very simple; these two scripts are, after all, typical one-trick ponies, so there aren't many bells and whistles to use here. To be honest, I was surprised by how many concurrent users I had to use in order to bring the difference out into the light. The test scripts had the following parameters:
- The ramp-up went from 0 to 500 users in 5 minutes
- 100% of the traffic comes from one source (Ashburn US)
- Server metrics agent enabled
In the images below, the lines have the following meanings:
- Green line: Concurrent users
- Blue line: Response time
- Red line: Server CPU usage
(Note: These are graphs from our old UI, but the content in this post is so awesome that we chose to leave everything untouched.)
The first graph here shows what happens when we load test the Node.js server. The response time (blue) is pretty much constant all through the performance test. My back-of-the-napkin analysis of the initial outliers is that they have to do with a cold MySQL cache. Now, have a look at the results from the PHP test:
Quite different results. It's not easy to see in this screenshot, but the blue line is initially stable at 320 ms response time up to about 340 active concurrent users. After that, we first see a small increase in response time, but as additional concurrent users are added, the response time eventually goes through the roof completely.
So what's wrong with PHP/Apache?
Ok, so what we're looking at is not very surprising: it's the difference in architecture between the two solutions. Let's think about what goes on in each case.
When Apache2 serves up the PHP page, it leaves the PHP execution to a specific child process. That child process can only handle one PHP request at a time, so if there are more requests than child processes, the rest have to wait. On this server, a maximum of 256 clients (MaxClients) is configured, versus the 150 that comes standard. Even if it's possible to increase MaxClients well beyond 256, that will in turn give you a problem with memory (RAM). In the end, you need to find the right balance between the maximum number of concurrent requests and the available server resources.
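For reference, the knobs in question live in Apache 2.2's prefork MPM configuration. This is an illustrative fragment with made-up values, not the exact config used on the test server:

```apache
# Illustrative mpm_prefork settings (not the test server's exact values).
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          256   # each client slot is a whole PHP-loaded process
    MaxRequestsPerChild   0
</IfModule>
```

Every slot allowed by MaxClients is a full Apache child process carrying its own copy of the PHP runtime in memory, which is why raising it trades RAM for concurrency.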
But for Node, it's easier. First of all, under light load, each request is about 30% faster than for PHP, so in pure performance in this extremely basic setup, Node is quicker. Also going for Node is the fact that everything runs in one single process on the server, with one active request-handling thread, so there's no inter-process communication between different instances and a 'mother' process. Node is also much more memory efficient per request: PHP/Apache needs a lot of PHP and process overhead per concurrent worker/client, while Node shares most of its memory between requests.
Also note that in both these performance tests, CPU load was never a problem. Even if CPU load varies with concurrent users in both tests, it stays below 5% (and yes, I did not just rely on the graph, I checked it on the server as well). (I'll write a follow-up to this article at some point when I can include server memory usage as well.) So we haven't loaded this server into oblivion in any way; we've just loaded it hard enough for the PHP/Apache architecture to start showing some of its problems.
So if Node.js is so good...
Well, of course there are challenges with Node, both technical and cultural. On the technical side, the core design idea in Node, one process with one thread, makes it a bit of a challenge to scale up on a multi-core server. You may already have noted that the test machine uses only one core, which gives Node an unfair advantage: if the machine had 2 cores, PHP/Apache would have been able to use both, but for Node to do the same, you have to resort to some tricks.
On the cultural side, PHP is still "everywhere" and Node is not. So if you decide to go with Node, you need to be prepared to do a lot more work yourself; there are simply nowhere near as many coders, hosting providers, computer book authors, world-leading CMSes and what have you. With PHP, you never walk alone.
Hopefully, this little performance testing exercise shows the inherent differences between two server technologies, one old and trusted, the other young and trending. Hopefully it's also apparent that your core technical choices will affect your server performance and, in the end, how much load you can handle. Designing for high load and high scalability begins early in the process, before the first line of code is ever written.
And sure, in real life there are numerous tricks available to reduce the effects seen here. In real life, lots of Facebook still runs on PHP.