
PHP Web Service optimisations and testing methods

Tags:

php

sql-server

I'm working on a web service in PHP that accesses an MSSQL database, and I have a few questions about handling large numbers of requests.

  1. I don't actually know what constitutes 'high traffic', and I don't know if my service will ever experience it, but would optimisations in this area mostly come down to server processing speed and database access speed?

  2. Currently when a request is sent to the server I do the following:

    • Open database connection
    • Process Request
    • Return data

    Is there any way I can 'cache' this database connection across multiple requests? As long as requests aren't processed simultaneously, the connection should remain valid.

  3. Can I store a user's session ID and limit the number of requests per hour from a particular session?

  4. How can I create 'dummy' clients to send requests to the web server? I guess I could just spam requests in a for loop or something? Are there better methods?

Thanks for any advice

asked by rocklobster


1 Answer

  1. You never know when high traffic will occur. It might result from your search engine ranking, from a blog writing a post about your web service, or from any other unforeseen random event. You had better prepare yourself to scale up. By scaling up, I don't primarily mean adding more processing power, but first of all optimizing your code. Common performance problems are:

    • unoptimized SQL queries (do you really need all the data you actually fetch?)
    • too many SQL queries (try to never execute queries in a loop)
    • unoptimized databases (check your indexing)
    • transaction safety (are your transactions fast? Keep in mind that concurrent requests have to be synchronized when they use database transactions, so with many requests this can easily lead to a slow service.)
    • unnecessary database calls (if your access is read only, try to cache the information)
    • unnecessary data in your frontend (does the user really need all the data you provide? does your service provide more data than your frontend uses?)
  2. Of course you can cache. You should indeed cache read-only data that does not change with every request. There is a useful blog post on PHP caching techniques. You might also want to consider the caching package of your framework of choice or a standalone PHP caching library. A minimal sketch follows after this list.

  3. You can limit service usage, but I would not recommend doing this by session ID, IP address, etc. These are very easy to renew, and then your protection fails. If you have authenticated users, you can limit requests on a per-account basis, like Google does (using an API key per user for all of its publicly available services). A rough rate-limiting sketch is shown below as well.

  4. For HTTP load and performance testing you might want to consider a tool like Siege, which does exactly what you are asking for. A crude PHP alternative is sketched at the end of this answer.
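
To make point 2 concrete, here is a minimal sketch of caching a read-only query result with APCu; the function name, query, cache key and TTL are all made up for illustration, and any other caching backend works the same way. (For caching the connection itself, some PDO drivers support persistent connections via PDO::ATTR_PERSISTENT, but support varies by SQL Server driver, so check its documentation.)

```php
<?php
// Minimal sketch: serve a read-only query from APCu for 5 minutes.
// The query, cache key and TTL are placeholders -- adapt them to your service.
function getProductList(PDO $db): array
{
    $cacheKey = 'product_list';               // hypothetical cache key
    $cached   = apcu_fetch($cacheKey, $hit);
    if ($hit) {
        return $cached;                       // served from memory, no DB round trip
    }

    $stmt = $db->query('SELECT id, name, price FROM products');
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    apcu_store($cacheKey, $rows, 300);        // keep the result for 300 seconds
    return $rows;
}
```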

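For point 3, a per-account limit can be as simple as one counter per API key per hour. The sketch below uses APCu again; the key scheme, the limit of 1000 requests and the 429 response are illustrative choices, not a fixed recipe.

```php
<?php
// Sketch of a fixed-window rate limit: at most $limit requests per API key per clock hour.
function allowRequest(string $apiKey, int $limit = 1000): bool
{
    $bucket = 'rate_' . $apiKey . '_' . date('YmdH'); // one counter per key and hour
    apcu_add($bucket, 0, 3600);                       // create the counter if it is missing
    return apcu_inc($bucket) <= $limit;               // atomically count this request
}

// In the request handler (before doing any real work):
// if (!allowRequest($apiKey)) {
//     http_response_code(429);   // Too Many Requests
//     exit('Rate limit exceeded');
// }
```
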
I hope to have answered all your questions.
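
One more note on point 4: if you would rather stay in PHP than install Siege, a throwaway script using curl_multi can act as a crude set of dummy clients. The URL, request count and concurrency below are placeholders, and this is only a rough sketch, not a replacement for a proper load-testing tool.

```php
<?php
// Crude load generator: send $total GET requests to $url, $concurrency at a time.
$url         = 'http://localhost/service.php';   // placeholder endpoint
$total       = 200;
$concurrency = 20;

for ($sent = 0; $sent < $total; $sent += $concurrency) {
    $mh      = curl_multi_init();
    $handles = [];

    for ($i = 0; $i < $concurrency; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Run this batch of requests until all handles have finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

echo "Sent $total requests to $url\n";
```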

answered by Fabian Keller


