Some Much Nicer Numbers
Wednesday, February 22, 2006, at 08:29AM
By Eric Richardson
It feels funny to go from a post about getting engaged to a post about benchmarking, but oh well.
We're preparing to launch our first Rails app at work, and as I mentioned previously, I've been looking into performance setups in preparation. With WEBrick, the simple all-Ruby server, I could do about 7 requests / sec against a reasonably intensive search function.
I've now done much better.
I mentioned previously that I had been trying to set up lighttpd, but had only succeeded in getting Rails to segfault under FastCGI. It turns out that was the result of serious holes in my installed FastCGI base. After updating my software setup I managed to get it working just fine.
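For reference, the lighttpd-side setup amounts to pointing mod_fastcgi at Rails' dispatcher. This is just a sketch of the general shape; the paths, socket name, and process counts here are illustrative, not the actual config from our server:

```
server.modules += ( "mod_fastcgi" )

fastcgi.server = ( ".fcgi" =>
  ( "rails" =>
    ( "socket"    => "/tmp/rails.fcgi.socket",     # illustrative path
      "bin-path"  => "/var/www/app/public/dispatch.fcgi",
      "min-procs" => 2,
      "max-procs" => 4
    )
  )
)
```

The key point is that lighttpd manages a pool of persistent Rails processes instead of spawning an interpreter per request, which is where the speedup over WEBrick comes from.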
Once I had that going I was still frustrated, though, since I could only seem to manage 11 requests / sec, and still couldn't peg the CPUs.
Then it dawned on me... The HTML being returned was about 50kB per request. I was testing remotely, running ApacheBench from the server in my apartment, with a concurrency of 10. At those request rates that was giving me roughly 500kB/sec down, or about 4Mbit/sec. The limiting factor now was simply my inbound bandwidth.
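The back-of-the-envelope math is simple enough to check (using the post's rough figures of 50kB responses at around 11 requests / sec):

```ruby
# Bandwidth ceiling check: response size x request rate = throughput.
response_kb  = 50.0   # ~50 kB of HTML per response
requests_sec = 11.0   # observed rate over the remote link

kb_per_sec   = response_kb * requests_sec   # kB/sec coming down the pipe
mbit_per_sec = kb_per_sec * 8 / 1000.0      # convert kB/s to Mbit/s

puts "#{kb_per_sec} kB/s is about #{mbit_per_sec} Mbit/s"
# => 550.0 kB/s is about 4.4 Mbit/s
```

At ~4.4Mbit/sec, a residential connection saturates well before a dual-CPU server does, which is exactly the symptom observed.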
Once I ran ab locally on our colo'ed server the results looked a little different:
Concurrency Level: 10
Time taken for tests: 4.675831 seconds
Complete requests: 200
Failed requests: 0
Write errors: 0
Total transferred: 10453400 bytes
HTML transferred: 10410200 bytes
Requests per second: 42.77 [#/sec] (mean)
Time per request: 233.792 [ms] (mean)
Time per request: 23.379 [ms] (mean, across all concurrent requests)
Transfer rate: 2183.14 [Kbytes/sec] received
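The summary lines in that output are internally consistent; the headline figures follow directly from the totals ab reports:

```ruby
# Sanity-check the ab summary above from its raw totals.
total_requests = 200
seconds        = 4.675831      # "Time taken for tests"
bytes          = 10_453_400    # "Total transferred"

req_per_sec = total_requests / seconds   # ab reports 42.77
kb_per_sec  = bytes / seconds / 1024     # ab reports ~2183 KB/s

puts req_per_sec.round(2)
puts kb_per_sec.round(2)
```

(The transfer-rate figure differs from ab's in the last decimal place because ab rounds the elapsed time internally.)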
42 requests / sec is nothing to sneeze at, and that's a fairly hefty search. If I instead request the data for a single building, I get 304 requests / sec.
There's certainly more tuning to be done here with caching, but given these numbers I'm pretty sure it's not too pressing.
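When that tuning does become worthwhile, Rails has caching hooks built in. A non-runnable sketch of what that might look like (the controller and action names here are hypothetical, not from our actual app):

```ruby
# Sketch only: Rails' built-in caching declarations (Rails 1.x era).
# BuildingsController, :show, and :search are hypothetical names.
class BuildingsController < ApplicationController
  # Page caching writes the rendered HTML under public/, so lighttpd
  # can serve repeat hits without touching Rails at all.
  caches_page :show

  # Action caching still runs filters but skips re-rendering.
  caches_action :search
end
```

Page caching in particular would take Rails out of the loop entirely for repeat requests, so those 304 requests / sec would become a lighttpd static-file benchmark instead.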