How can I calculate the speed of an internet connection from average ping rates? What calculations are involved? Is it possible to estimate the upload/download limit from ping rates?
EDIT: If ping is not a solution, what else is?
Know the difference between download and upload speeds. While download speeds are important, upload speeds actually matter more for online gaming, because low latency (low ping) depends on a decent upload speed. Upload speed has a larger impact on response time and game performance than download speed.
To convert Mbps to MBps, divide by eight (or multiply by 0.125). To estimate roughly how long a large file will take to download, take its size in megabytes, multiply by eight, and divide by your internet speed in Mbps. Mbps usually describes the maximum possible speed of a connection.
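For instance, that estimate as a one-liner (the 700 MB file size and 50 Mbps speed are made-up example values):

# download time in seconds = size in MB * 8 / speed in Mbps
awk 'BEGIN { size_mb = 700; speed_mbps = 50; printf "%.0f seconds\n", size_mb * 8 / speed_mbps }'
# prints: 112 seconds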
Ping and jitter measure the speed at which you can request and receive data (ping) and the variation in that response time (jitter). In essence, they are measures of the quality of your connection and are used to diagnose the performance of real-time applications like video streaming or voice over IP (VoIP).
Both download and upload speed are measured in megabits per second, or Mbps. Ping, also referred to as latency, measures the reaction time of your connection—the amount of time it takes to send a request and receive a response, measured in milliseconds (ms).
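A plain ping shows those numbers directly (example.com is just a placeholder host; the figures below are illustrative):

ping -c 10 example.com
# On Linux the summary line looks like: rtt min/avg/max/mdev = 11.2/12.4/15.8/1.1 ms
# avg is your latency in ms; mdev (stddev on macOS) is a rough proxy for jitter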
I used ping to estimate bandwidth on a local network. I think it's about as accurate as other means of measuring bandwidth (e.g., downloading a big file). You can use it for your internet connection too, if you have a symmetric link to the internet (i.e., not ADSL).
Here's how I do it:
I have a gigabit Ethernet LAN and I want to measure the speed between my computer and a host in the server room. My MTU is 1500, so I use a payload size of 1472 (1500 minus 20 bytes of IP header and 8 bytes of ICMP header). Somewhat arbitrarily, I use 83,333 packets in this test (83,333 × 1500 bytes is about 1 gigabit). Then:
sudo ping -f -c 83333 -s 1472 192.168.3.103
At the end of the output I get:
round-trip min/avg/max/stddev = 0.174/0.219/2.078/0.020 ms
So on average it takes 0.219 ms to send 1500 bytes and receive 1500 bytes back; that's 3000 bytes per round trip, or 24 kilobits.

24 kb / 0.219 ms ≈ 110 Mb/s
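The same arithmetic as a one-liner, plugging in the avg round-trip time from the output above (substitute your own values):

# throughput = bits per round trip / round-trip time in seconds
awk 'BEGIN { printf "%.1f Mb/s\n", (2 * 1500 * 8) / (0.219 / 1000) / 1e6 }'
# prints: 109.6 Mb/s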
If you want to use this against a server on the internet, lower the packet size to something like 1464 (for an MTU of 1492, common on ADSL/PPPoE links), drop the -f (flood) option, and lower the count so the test doesn't take too long to finish.
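For example (example.com is a placeholder target; pick your own host and count):

ping -c 1000 -s 1464 example.com
# no -f here: flood pinging hosts across the internet is generally not acceptable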
P.S. I think this belongs on Super User, not Stack Overflow.
Latency is distinct from bandwidth. Imagine a truckload of DVDs being driven across the country. The bandwidth is high, but the latency is huge.
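To put rough numbers on that (all of them invented): a truck carrying 10,000 DVDs of 4.7 GB each, crossing the country in 48 hours, moves data at about 2 Gb/s, yet its "ping" is two days.

awk 'BEGIN { dvds = 10000; gb = 4.7; hours = 48; printf "%.1f Gb/s\n", dvds * gb * 8 / (hours * 3600) }'
# prints: 2.2 Gb/s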