Stress Testing a Website

I'm totally ignorant about the subject, but I need to run some stress and performance tests for a client's website. I've been doing some research, but I haven't found anything very useful so far. Does anyone have experience with this?
Apache used to come with a bundle of benchmarking utilities; that's all I know. For the rest, you will probably learn more from Google. I guess there are services out there that will do it for you, for free or for a fee.
I have a VPS so in theory I could stress-test your site, but I'm afraid it would be detected as DoS and get me in trouble.
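The Apache utility being referred to is `ab` (ApacheBench). The same idea, firing a batch of concurrent requests and timing them, can be sketched in a few lines of Python. The throwaway local server here is a stand-in for the real site, so the sketch is self-contained (and nobody gets flagged for DoS):

```python
import http.server
import statistics
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor


def measure(url: str, requests: int = 100, concurrency: int = 10):
    """Fire `requests` GETs at `url`, `concurrency` at a time;
    return the per-request latencies in seconds."""
    def one(_):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(one, range(requests)))


class QuietHandler(http.server.SimpleHTTPRequestHandler):
    def log_message(self, *args):  # silence per-request logging
        pass


# Stand-in target: a throwaway local server on a random free port.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), QuietHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
latencies = measure(url, requests=50, concurrency=5)
print(f"median: {statistics.median(latencies) * 1000:.1f} ms, "
      f"max: {max(latencies) * 1000:.1f} ms")
server.shutdown()
```

Against a real site you would point `url` at the page under test and raise `requests`/`concurrency` gradually while watching server-side memory and CPU.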
Check out blitz. Let us know how it works out!
samer wrote: Check out blitz. Let us know how it works out!
Thanks for sharing this samer! Apparently, my doubts were confirmed. We definitely need more RAM: there's only 34MB of free RAM left, and the rest of the 500MB was eaten up by IIS, SQL Server and a few other processes.

Any recommendations on how much RAM we should be purchasing? The site is still getting started, not much traffic yet.
I doubt anyone could help you unless you tell us what website it is :P

Small hint:
The question of resources in web dev is closely linked to caching. This is where 80% of your load should go.
Have you implemented proper caching? Maybe review your caching policies before thinking of spending more money on hardware. 500 MB should be enough for a mid-size website; it's not ideal, but it would work.
Check out JMeter.

It's a bit of a hassle to learn at first, but later it's one very nice tool :)

It's a small open-source tool that can do almost anything and emulate user testing with scenarios like browsing pages, logging in, requesting content, etc.


Also, you may want to check out HP LoadRunner.
@samer: nice site, thanks :)
rahmu wrote: I doubt anyone could help you unless you tell us what website it is :P

Small hint:
The question of resources in web dev is closely linked to caching. This is where 80% of your load should go.
Have you implemented proper caching? Maybe review your caching policies before thinking of spending more money on hardware. 500 MB should be enough for a mid-size website; it's not ideal, but it would work.
OK, you're actually right. The website is Fisharwe.com. I was asking for general guidelines, though. :)

About caching, the answer to your question is "no". I thought about it, but I did not really know what to do. It is an auction website, and timing is crucial, I believe (especially if the website takes off and more people start using it). For example, suppose two or three users are bidding on an item that is about to close. The users are very eager to win, so they are willing to out-bid each other until the last second. If my code shows a user a cached version of the bidding page on which he is supposedly winning, while another user actually out-bid him after the page was cached, then the first user might not get the chance to bid again and might feel he got screwed over.

On the other hand, I think some queries and/or pages can be cached, and I probably should be doing that. I have to admit, though, that this is one of the areas where I need to improve my skill-set. So a few good articles/references on the subject would be very much appreciated :)
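For a bidding page where stale data is unacceptable, the usual pattern is explicit invalidation rather than a timed cache: serve the cached page until a new bid lands, then evict it immediately. A rough sketch with made-up auction/bid names:

```python
class AuctionCache:
    """Cache per-auction page data, evicted the instant a bid changes it."""

    def __init__(self):
        self._pages = {}  # auction_id -> rendered page
        self._bids = {}   # auction_id -> current high bid

    def current_high_bid(self, auction_id):
        return self._bids.get(auction_id, 0)

    def render_page(self, auction_id):
        # Serve the cached page unless a bid has invalidated it.
        if auction_id not in self._pages:
            self._pages[auction_id] = (
                f"high bid: {self.current_high_bid(auction_id)}")
        return self._pages[auction_id]

    def place_bid(self, auction_id, amount):
        if amount > self.current_high_bid(auction_id):
            self._bids[auction_id] = amount
            self._pages.pop(auction_id, None)  # evict: next render is fresh


cache = AuctionCache()
cache.place_bid(42, 100)
first = cache.render_page(42)   # built once, then served from cache
cache.place_bid(42, 120)        # cached page evicted here
second = cache.render_page(42)  # re-rendered, so it is never stale
```

Between bids (which is most of the time), every page view is a cache hit, yet no user ever sees an out-of-date high bid.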
Actually, your problem is not server-side. You have a lot of issues at the front-end level.

710KB of JavaScript/CSS/images are being loaded (and later cached by the browser), which pushes your page's loading time up to 29 seconds. That is very slow by any standard. Do not waste your time stress-testing the server's ability to handle 1000 simultaneous hits, as you will not reach that level in the near future; instead, focus on improving your page's loading time. The jQuery UI library alone severely affects it.

Consider loading your libraries from external servers such as Google's CDN, which hosts most of the popular JavaScript libraries. Compress your images as well.

Apache's mod_deflate module is a pretty good solution for compression, as it gzips all the content transferred between the server and the client; the overhead of compressing is negligible compared to the content's loading time. Since you are using a Windows server, look for the equivalent there (IIS ships with its own static and dynamic compression modules).
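The benefit mod_deflate delivers is easy to demonstrate: gzip typically shrinks repetitive text content such as HTML, CSS, and JavaScript by a large factor. A quick illustration using Python's standard `gzip` module:

```python
import gzip

# Repetitive markup compresses extremely well; real HTML/JS behaves similarly,
# if a bit less dramatically.
page = b"<div class='listing'><span>item</span></div>\n" * 500

compressed = gzip.compress(page)
ratio = len(compressed) / len(page)
print(f"{len(page)} bytes -> {len(compressed)} bytes ({ratio:.1%})")
```

Already-compressed formats (JPEG, PNG) gain almost nothing from a second pass, which is why servers are usually configured to gzip only text content types.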

Google's Firefox plugin "Page Speed" will give you all the front-end benchmarking you need, and Google's developer tools for Chrome are perfect for this scenario.

Caching is not as bad as you think, if and only if you know what you are caching. I believe what rahmu suggested was caching the front-end content that does not change over time, such as your JavaScript libraries, some of your CSS files, and the images that are part of the unchanging layout. It is already done implicitly by ISPs and users' browsers; you might as well take control of it.
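Taking control of browser caching mostly means sending the right `Cache-Control` header per asset type. A hypothetical helper sketching that policy (the extension list and max-age are illustrative choices, not a standard):

```python
# Static assets that never change between deploys get far-future caching;
# everything else is revalidated on each request.
LONG_LIVED = {".js", ".css", ".png", ".jpg", ".gif"}


def cache_headers(path: str) -> dict:
    """Pick Cache-Control headers for a request path by asset type."""
    ext = path[path.rfind("."):] if "." in path else ""
    if ext in LONG_LIVED:
        return {"Cache-Control": "public, max-age=31536000"}  # one year
    return {"Cache-Control": "no-cache"}


cache_headers("/static/jquery-ui.js")  # far-future: browser reuses its copy
cache_headers("/auction/42")           # dynamic page: always revalidated
```

The usual companion trick is to version static filenames (e.g. `app.v2.js`) so a deploy busts the year-long cache.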

A lot can be done to improve the speed of your website, which, done properly, will help your service reach the 100,000 hits per hour (or whatever large number) you are seeking. Do not scale before you need to.
Link-, thanks for the tips, really appreciated. I believe you are completely right.

But nonetheless, if any of you guys has some really good content about caching, that would be great. I found a couple of video series on Pluralsight, but I really do not want to go through a whole video series just for that. An article or two would be great for now.
This is just a stab in the dark... but on such a site, if you're having memory problems, maybe take a look at database access, like if it's issuing 15 queries per page...
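That 15-queries-per-page pattern can often be collapsed into a single round trip with an `IN` clause. A self-contained illustration using an in-memory SQLite table (the `items` schema is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, f"item {i}") for i in range(1, 16)])

wanted = list(range(1, 16))

# Anti-pattern: one round trip per item -> 15 queries for a single page.
one_by_one = [
    conn.execute("SELECT title FROM items WHERE id = ?", (i,)).fetchone()[0]
    for i in wanted
]

# Better: fetch the same rows with one query and one round trip.
placeholders = ",".join("?" * len(wanted))
batched = [row[0] for row in conn.execute(
    f"SELECT title FROM items WHERE id IN ({placeholders}) ORDER BY id",
    wanted)]

assert one_by_one == batched  # same data, one query instead of fifteen
```

On SQLite the difference is tiny, but against a networked SQL Server each extra query adds a full round trip, so the per-page query count shows up directly in both latency and memory pressure.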