I would like to know how to benchmark a PHP/MySQL site.
We have a web app that is almost completed and ready to go live. We know how many people are going to be using it in a year's time, but we have absolutely no idea how much bandwidth the average user consumes, how much time they burn up on the database, etc. We need to determine the correct servers to purchase.
Is there something server-side on Linux that can monitor these statistics per user, so that we can take that data and extrapolate from it?
If I am going about this completely wrong, please let me know, but I believe this is a frequent activity for new web apps.
EDIT: I may have asked for the incorrect information. We can see how long the database queries take and how long it takes to load the page, but we have no idea what load is placed on the server. The question I am really asking is: can we handle 100 users at once on average? 1,000? What type of server requirements are needed to reach 1M users? Etc.
Thanks for your help.
The DBT2 Benchmark Tool provides scripts to automate execution of benchmarks against MySQL. Against plain MySQL it benchmarks a single MySQL Server instance; with MySQL Cluster it can drive large distributed tests with many MySQL Cluster Data nodes and MySQL Server instances.
A tool that I find fairly useful is JMeter, which (at its most basic) lets you set your browser to use JMeter as a proxy; you then wander all around your website and it records everything you do.
Once you are happy that it's a decent test of most of your website, you can save the test plan in JMeter and tell it to run with a set number of threads and a number of loops per thread to simulate load on your website.
For example, you can run 50 clients, each running the test plan 10 times.
You can then ramp the numbers up and down to see the performance impact on the site; JMeter graphs the response times for you.
This lets you tune different parameters, try different caching strategies and check the real-world impact of those changes.
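If you want to repeat the recorded plan without the GUI, you can also launch it from the command line; a minimal sketch, assuming a hypothetical test_plan.jmx saved from the recording above:

    # run the saved plan in non-GUI mode and log results for later comparison
    jmeter -n -t test_plan.jmx -l results.jtl
    # thread count and loop count come from the Thread Group settings in the plan;
    # bump them between runs and compare the response times in results.jtl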
You can use the ApacheBench tool (ab, usually part of the Apache web server package) for stress-testing (1k requests with 10 concurrent clients = ab -c 10 -n 1000 http://url) any script that you suspect might be slow. It will show you the distribution of response times (e.g. in 90% of cases the request was processed in less than 200 ms).
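Something like this, where the URL is just a placeholder for the page you want to hammer:

    # 1000 requests, 10 at a time, against a page you suspect is slow
    ab -c 10 -n 1000 http://example.com/slow-page.php
    # in the output, "Requests per second" gives a rough throughput ceiling for that page,
    # and "Percentage of the requests served within a certain time" shows the
    # response-time distribution (e.g. "90%  200" = 90% finished within 200 ms)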
Then you can also grab the SQL queries executed by that particular script and run EXPLAIN on them to get a rough idea of how they will degrade when the tables hold 10, 100, or 10 million times more records.
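For example (the database, table and column names are made up, substitute your own):

    # run EXPLAIN on one of the captured queries against your dev database
    mysql -u root -p myapp -e "EXPLAIN SELECT * FROM orders WHERE user_id = 42\G"
    # "type: ALL" with a large "rows" estimate means a full table scan, which will
    # degrade badly as the table grows; an indexed lookup will hold up much better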
Regarding how many users it can serve: use your favourite browser to emulate a typical user visit, then take the access_log file and sum the bytes sent (one of the last numbers in each log line). For example, say it comes to 5 KB of text/html + 50 KB of png/jpg/etc. = 55 KB per visit. Add headers and so on and call it 60 KB per visit; 60 KB × 1M visits = 60 GB of traffic per day. Is your bandwidth good enough for that? (60 GB / 86,400 s ≈ 700 KB/s.)
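A quick way to sum those bytes out of the log, assuming Apache's common/combined log format (the response size is the 10th field) and a typical log location:

    # total bytes sent across all requests in the access log
    awk '{ bytes += $10 } END { printf "%.1f MB over %d requests\n", bytes/1048576, NR }' /var/log/apache2/access_log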