I have a list of URLs and I need to use goroutines to fire off HTTP requests concurrently. Is there any way to check and limit how many of those HTTP requests are sent per second?
A very simple version of this in Go is an adaptation of the leaky bucket algorithm, using a buffered channel filled by a goroutine. Receiving a token from the rate channel before making a request enforces the rate, blocking when the bucket is empty.
// Create a buffered channel.
// The channel's capacity is the maximum burst that can be made.
rate := make(chan struct{}, 10)

// Fill the bucket with one token every 100ms, i.e. 10 tokens per second.
// The send blocks while the bucket is full, so it never overfills.
go func() {
    ticker := time.NewTicker(100 * time.Millisecond)
    for range ticker.C {
        rate <- struct{}{}
    }
}()
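On the request side, each goroutine receives from the channel before firing its request; the receive blocks until the filler goroutine supplies a token. A minimal sketch of that side (the urls slice is the one from the question; the bare http.Get is a placeholder):

for _, u := range urls {
    go func(u string) {
        <-rate // blocks until a token is available
        resp, err := http.Get(u)
        if err != nil {
            log.Println(err)
            return
        }
        resp.Body.Close()
    }(u)
}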
Since a series of requests that take longer than the average rate will end up running concurrently, you may also need to limit concurrency. You can add a second channel as a semaphore: send a token into the semaphore before making a request, and receive it back when the request completes.
// limit concurrency to 5
semaphore := make(chan struct{}, 5)

// in the request function
semaphore <- struct{}{}
defer func() {
    <-semaphore
}()
A slightly more complete example is here: https://play.golang.org/p/ZrTPLcdeDF
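For reference, here is a self-contained sketch that puts the two channels together. The URL list, the specific rates, and the bare http.Get are placeholders, not part of the original answer:

package main

import (
    "log"
    "net/http"
    "sync"
    "time"
)

func main() {
    urls := []string{
        "https://example.com/a",
        "https://example.com/b",
        "https://example.com/c",
    }

    // Rate limiter: one token every 100ms, with a burst capacity of 10.
    rate := make(chan struct{}, 10)
    go func() {
        ticker := time.NewTicker(100 * time.Millisecond)
        for range ticker.C {
            rate <- struct{}{}
        }
    }()

    // Semaphore: at most 5 requests in flight at once.
    semaphore := make(chan struct{}, 5)

    var wg sync.WaitGroup
    for _, u := range urls {
        wg.Add(1)
        go func(u string) {
            defer wg.Done()

            <-rate                  // wait for a rate token
            semaphore <- struct{}{} // acquire a concurrency slot
            defer func() { <-semaphore }()

            resp, err := http.Get(u)
            if err != nil {
                log.Println(u, err)
                return
            }
            resp.Body.Close()
            log.Println(u, resp.Status)
        }(u)
    }
    wg.Wait()
}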