
Multiple HTTP GET requests in one TCP/IP connection: processed in parallel or in sequence?

Tags:

http

nginx

tcp

I get a lot of Googlebot requests.

Googlebot requests up to 11 different files via 11 HTTP GET requests, all in one single TCP/IP connection.

Are these GET requests (all in the same TCP/IP connection) processed by the server in

  • parallel
  • or in sequence?

Or is it up to the server?

  • In this case, how does Nginx handle this?
Franz Enzenhofer asked Oct 06 '10 08:10




1 Answer

Are these GET requests (all in the same TCP/IP connection) processed by the server in

parallel or in sequence?

They are processed in sequence. This is called pipelining. Pipelining is part of HTTP/1.1, and it means the client need not wait for the current request to complete before sending the next request over a persistent connection. The client can send several requests over the same connection without waiting for the responses to previous requests. The requests are processed in FIFO order: the client sends several requests in sequence, and the server is supposed to send a response to each request in the same order the requests were received. So if the server you are using is HTTP/1.1 compliant, the requests should be handled in sequence.
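To make the idea concrete, here is a minimal sketch of what a pipelined payload looks like on the wire: several GET requests concatenated and written on one connection, with the server expected to answer them in FIFO order. The host name and paths below are placeholders chosen for illustration, not anything from the original question.

```python
def build_pipelined_payload(host, paths):
    """Concatenate one HTTP/1.1 GET request per path into a single
    byte payload, suitable for one write on one TCP connection."""
    requests = []
    for path in paths:
        requests.append(
            f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: keep-alive\r\n"
            f"\r\n"  # blank line terminates each request's headers
        )
    return "".join(requests).encode("ascii")

# Two requests travel back-to-back in one payload; a pipelining-aware
# server must respond to /index.html first, then /style.css.
payload = build_pipelined_payload("example.com", ["/index.html", "/style.css"])
print(payload.count(b"GET "))  # 2
```

Because the responses must come back in request order, one slow response delays every response queued behind it (head-of-line blocking), which is one reason browsers largely avoided pipelining in favor of parallel connections.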

Suresh Kumar answered Sep 25 '22 02:09