How can you say it's bad without knowing what each request is doing?
Github isn't a microbenchmark.
A concrete example:
I can say that one of my Rails sites handles 850 dynamic requests per second running on a single small 1-CPU server. That's because all that particular request does is look up 4k of data from memcached and return it (i.e. http://www.tanga.com/feeds/current_deal.xml).
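That cheap request path can be sketched roughly like this (all names here are hypothetical, and a plain Hash stands in for the memcached client so the snippet is self-contained; in the real app it would be something like a Dalli client or Rails.cache):

```ruby
# Stand-in for memcached: in production this would be a real
# memcached client, not an in-process Hash.
CACHE = {}

# Whatever updates the deal pre-renders the ~4k XML blob once,
# so no request ever has to build it.
def refresh_current_deal_feed
  CACHE["current_deal.xml"] = "<deal><name>Example deal</name></deal>"
end

# The "controller action" just returns the cached blob -- no database
# hit, no template rendering, which is why it stays fast under load.
def current_deal_feed
  CACHE.fetch("current_deal.xml") { "<deal/>" }
end

refresh_current_deal_feed
puts current_deal_feed
```

The point is that the per-request work is a single cache read, so throughput is bounded by memcached and the network rather than by Ruby.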
However, as a general rule, I know that each small server can handle about 30-50 pages per second, because each page takes a lot of data crunching to generate (and because I haven't bothered to make it as efficient as possible; it's fast enough as is).
If all I was doing was returning a small bit of text that didn't require many lookups or much calculation, then sure, an 8-core CPU with Rails could probably do 4-5000 requests per second easily.