If I understand right, applications sometimes use HTTP to send messages, since using other ports is liable to cause firewall problems. But how does that work without conflicting with other applications such as web browsers? For that matter, how do multiple browsers running at once avoid conflicting? Do they all monitor the port and get notified... can you share a port in this way?
I have a feeling this is a dumb question, but it's not something I'd ever thought about before, and in other cases I've seen problems when two apps are configured to use the same port.
Yes, they do. If you list your machine's active connections (for example with netstat), you can see that the local ports are all different, while the remote ports are usually 80 (HTTP), 443 (HTTPS) or 8080 (HTTP Alternate).
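If you want to see this on your own machine, here is a minimal sketch in Python that lists established TCP connections much like netstat does. It assumes the third-party psutil package is installed (an assumption, not something from the answer above), and on some systems it needs elevated privileges to see other processes' sockets:

    import psutil  # third-party package, assumed installed: pip install psutil

    # List established TCP connections, similar to what netstat shows.
    # Recent psutil versions expose laddr/raddr as (ip, port) namedtuples.
    for conn in psutil.net_connections(kind="tcp"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            print(f"local {conn.laddr.ip}:{conn.laddr.port} -> "
                  f"remote {conn.raddr.ip}:{conn.raddr.port} (pid {conn.pid})")

Run it while a couple of browsers are open and you should see many different local ports, one per connection, mostly pointing at remote port 80 or 443.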
On a web server or Hypertext Transfer Protocol (HTTP) daemon, port 80 is the port the server "listens" on, i.e. the port on which it expects to receive requests from a web client, assuming the default was kept when the server was configured or set up.
If port 80 is not an option, you need to find an alternative port. There is no official alternative HTTP port, but when port 80 is already in use on an address/webserver, it is fairly common to run another site on port 8080 or 8000 on the same address/webserver.
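To make the "listening" side concrete, here is a minimal sketch of an HTTP server bound to the alternate port 8080, using Python's standard library; the handler and response body are just illustrative:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class HelloHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Answer every GET with a small plain-text body.
            body = b"served from the alternate port\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    # Bind to 8080 because 80 is taken (or needs root). Since the port is not
    # the default, clients must include it in the URL: http://localhost:8080/
    HTTPServer(("", 8080), HelloHandler).serve_forever()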
There are two ports involved: a source port (on the browser side) and a destination port (on the server side). The browser asks the OS for an available source port (let's say it receives 33123), then makes a socket connection to the destination port (usually 80 for HTTP or 443 for HTTPS).
When the web server receives the request, it sends a response that has 80 as the source port and 33123 as the destination port.
So if you have 2 browsers concurrently accessing stackoverflow.com, you'd have something like this:
Firefox (localhost:33123) <-----------> stackoverflow.com (69.59.196.211:80)
Chrome  (localhost:33124) <-----------> stackoverflow.com (69.59.196.211:80)
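You can reproduce the picture above with two plain sockets; this is a minimal sketch in Python, using stackoverflow.com:80 purely as an example destination:

    import socket

    # Two independent connections to the same server and port, like two
    # browsers hitting the same site at once.
    a = socket.create_connection(("stackoverflow.com", 80))
    b = socket.create_connection(("stackoverflow.com", 80))

    for name, s in (("conn A", a), ("conn B", b)):
        local_ip, local_port = s.getsockname()[:2]
        remote_ip, remote_port = s.getpeername()[:2]
        print(f"{name}: {local_ip}:{local_port} <-> {remote_ip}:{remote_port}")

    a.close()
    b.close()

Both connections share the same remote endpoint, but the OS hands each one its own ephemeral local port, and it uses the full (source IP, source port, destination IP, destination port) tuple to keep the two byte streams separate.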