
How does a server handle web service requests from multiple clients?

I just completed an Android application that uses web services to connect to a remote database. I was working on localhost.

Now, I plan to host my web services on a server. Let's say I have my Android application installed on any number of different client smartphones. Each smartphone user calls the web service at the same time.

Now, how does the server handle these requests? Does it execute one thread per request? I want to understand the server-side processing in detail. Considering that all the phones use GPRS, will there be any sort of delay in such a situation?

By the way, my web services are all SOAP-based, and the database server I plan to use later is SQL Server. I used the .NET Framework to create the web services.

asked Feb 02 '23 by Parth Doshi
1 Answer

This answer describes the general concept; it is not Android-specific.

Usually, each of the users sends an HTTP request for the page. The server receives the requests and delegates them to different workers (processes or threads).

Depending on the URL given, the server reads the corresponding file and sends it back to the user. If it is a dynamic file, such as a PHP script, it is executed first and its output is sent back to the user.

Once the requested file has been sent back, the server usually keeps the connection open for a few more seconds and then closes it.
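A minimal sketch of that flow in Java, using the JDK's built-in com.sun.net.httpserver.HttpServer. The port (8080), the ./www document root, and the pool of 10 worker threads are arbitrary choices for illustration, not anything the question's .NET setup requires:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.Executors;

public class SimpleFileServer {
    public static void main(String[] args) throws IOException {
        // Listen on port 8080 (arbitrary); backlog 0 means "system default".
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Delegate each incoming request to one of 10 worker threads,
        // so several clients can be served at the same time.
        server.setExecutor(Executors.newFixedThreadPool(10));

        server.createContext("/", exchange -> {
            // Map the request path to a file under ./www (hypothetical doc root).
            Path file = Path.of("www" + exchange.getRequestURI().getPath());
            boolean found = Files.isRegularFile(file);
            byte[] body = found ? Files.readAllBytes(file)
                                : "404 Not Found".getBytes();
            exchange.sendResponseHeaders(found ? 200 : 404, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });

        server.start();
        System.out.println("Serving files from ./www on http://localhost:8080/");
    }
}
```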

Look at How Web Servers Work

EDIT:

HTTP uses TCP, which is a connection-based protocol. That is, clients establish a TCP connection for as long as they're communicating with the server.

Multiple clients are allowed to connect to the same destination port on the same destination machine at the same time. The server just opens up multiple simultaneous connections.
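As a rough illustration in Java: one ServerSocket listens on a single port, and each call to accept() returns a separate Socket for one client, which can then be serviced on its own thread. The port number and the canned HTTP response are assumptions made for the sketch:

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerConnectionServer {
    public static void main(String[] args) throws IOException {
        // One listening socket on port 8080; any number of clients can be
        // connected to this same port at the same time.
        try (ServerSocket listener = new ServerSocket(8080)) {
            while (true) {
                // accept() returns a separate Socket for each client connection.
                Socket client = listener.accept();

                // Service the connection on its own thread, so one slow client
                // (e.g. on GPRS) does not hold up the others.
                new Thread(() -> {
                    try (Socket c = client;
                         PrintWriter out = new PrintWriter(c.getOutputStream(), true)) {
                        out.println("HTTP/1.1 200 OK");
                        out.println("Content-Type: text/plain");
                        out.println("Connection: close");
                        out.println();
                        out.println("handled by " + Thread.currentThread().getName());
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }).start();
            }
        }
    }
}
```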

Apache (like most other HTTP servers) has a multi-processing module (MPM). This is responsible for allocating Apache threads/processes to handle connections. These processes or threads can then run in parallel, each on its own connection, without blocking each other. Apache's MPM also tends to keep "spare" threads or processes open even when no connections are active, which helps speed up subsequent requests.
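The "spare workers" idea maps roughly onto a pre-started thread pool. Here is a sketch in Java; the pool sizes, queue capacity, and idle timeout are invented for illustration and are not Apache's defaults:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class WorkerPool {
    public static void main(String[] args) {
        // Roughly analogous to an MPM configuration: 5 always-on workers,
        // up to 100 queued requests, up to 50 workers under heavy load,
        // and extra workers retired after 60 seconds of idleness.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                5, 50,
                60, TimeUnit.SECONDS,
                new LinkedBlockingQueue<>(100));

        // Pre-start the "spare" core workers so the first requests are not
        // delayed by thread creation.
        pool.prestartAllCoreThreads();

        // Each accepted connection would then be handed to the pool:
        pool.execute(() -> System.out.println(
                "handling a request on " + Thread.currentThread().getName()));

        pool.shutdown();
    }
}
```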

Note:

One of the most common issues with multi-threading is the "race condition", where two requests are doing the same thing ("racing" to do the same thing). If they compete for a single resource, only one of them can win. For example, if they both insert a record into the database, they can't both get the same id; one of them will win. So when writing code, you need to remember that other requests are running at the same time and may modify your database, write files, or change globals.
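A small Java sketch of that race, using an in-memory "next id" counter rather than a real database (where an identity/auto-increment column would normally handle this). The plain int version loses updates because read-increment-write is not one atomic step; the AtomicInteger version hands out distinct ids even under concurrent requests:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class NextIdRace {
    // Unsafe: ++ is really read + add + write, so two threads can both
    // read the same value and "win" the same id.
    static int unsafeCounter = 0;
    static int unsafeNextId() {
        return ++unsafeCounter;
    }

    // Safe: the increment happens atomically.
    static final AtomicInteger safeCounter = new AtomicInteger();
    static int safeNextId() {
        return safeCounter.incrementAndGet();
    }

    public static void main(String[] args) throws InterruptedException {
        Runnable simulateRequests = () -> {
            for (int i = 0; i < 100_000; i++) {
                unsafeNextId();
                safeNextId();
            }
        };
        Thread a = new Thread(simulateRequests);
        Thread b = new Thread(simulateRequests);
        a.start(); b.start();
        a.join();  b.join();

        // Both should reach 200000; the unsafe counter usually falls short
        // because concurrent increments overwrote each other.
        System.out.println("unsafe counter: " + unsafeCounter);
        System.out.println("safe counter:   " + safeCounter.get());
    }
}
```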

answered Feb 06 '23 by user370305