I have a list of URLs to process, but I want to run a maximum number of goroutines at a time. For example, if I have 30 URLs, I only want 10 goroutines working in parallel.
My attempt at this is the following:
parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()
urls := flag.Args()

var wg sync.WaitGroup
client := rest.Client{}
results := make(chan string, *parallel)

for _, url := range urls {
    wg.Add(1)
    go worker(url, client, results, &wg)
}

for res := range results {
    fmt.Println(res)
}

wg.Wait()
close(results)
My understanding is that if I create a buffered channel of size parallel, the code will block until I read off the results channel, which will unblock my code and allow another goroutine to be spawned. However, this code doesn't seem to block after processing all the URLs. Can someone explain to me how I can use channels to limit the number of goroutines running?
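The buffered channel isn't doing what you expect: a buffer of size parallel only limits how many unread results can queue up in the results channel; it does not gate goroutine creation, so the for loop still launches one goroutine per URL immediately. The program then deadlocks rather than finishing cleanly, because the range over results never terminates: close(results) sits after it and is never reached.

If you specifically want a channel to cap concurrency, the usual pattern is a separate buffered channel used as a counting semaphore, acquired before each goroutine starts work. Here is a minimal runnable sketch of that pattern; processURLs, limit, and the fn callback are illustrative names, not from your code:

package main

import (
    "fmt"
    "sync"
)

// processURLs calls fn on every URL with at most limit goroutines in
// flight, using a buffered channel as a counting semaphore.
func processURLs(urls []string, limit int, fn func(string) string) {
    var wg sync.WaitGroup
    sem := make(chan struct{}, limit)
    results := make(chan string)

    go func() {
        for _, url := range urls {
            wg.Add(1)
            sem <- struct{}{} // blocks while limit goroutines hold slots
            go func(url string) {
                defer wg.Done()
                defer func() { <-sem }() // release the slot
                results <- fn(url)
            }(url)
        }
        wg.Wait()
        close(results) // lets the range below terminate
    }()

    for res := range results {
        fmt.Println(res)
    }
}

func main() {
    urls := []string{"https://example.com/a", "https://example.com/b"}
    processURLs(urls, 10, func(u string) string { return "processed " + u })
}

The send on sem blocks once limit slots are taken, so at most limit goroutines run at a time, and results is closed only after every worker has finished.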
That said, the simpler structure is usually the reverse: create the desired number of workers instead of one goroutine per URL:
parallel := flag.Int("parallel", 10, "max parallel requests allowed")
flag.Parse()

// Workers get URLs from this channel.
urls := make(chan string)

// Feed the workers with URLs.
go func() {
    for _, u := range flag.Args() {
        urls <- u
    }
    // Workers exit their range loop when the channel is closed.
    close(urls)
}()

var wg sync.WaitGroup
client := rest.Client{}
results := make(chan string)

// Start the specified number of workers.
for i := 0; i < *parallel; i++ {
    wg.Add(1)
    go func() {
        defer wg.Done()
        for url := range urls {
            worker(url, client, results)
        }
    }()
}

// When the workers are done, close results so that main will exit.
go func() {
    wg.Wait()
    close(results)
}()

for res := range results {
    fmt.Println(res)
}
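Neither snippet shows the worker function itself. A minimal sketch of what it might look like (it needs fmt and net/http); the standard library's *http.Client stands in for the unspecified rest.Client here, so the signature differs slightly from the calls above:

// Hypothetical worker: fetches one URL and reports a one-line status.
func worker(url string, client *http.Client, results chan<- string) {
    resp, err := client.Get(url)
    if err != nil {
        results <- fmt.Sprintf("%s: %v", url, err)
        return
    }
    resp.Body.Close()
    results <- fmt.Sprintf("%s: %s", url, resp.Status)
}

Note that in this design the worker never touches the WaitGroup; the goroutines that range over urls own that bookkeeping, which keeps all of the shutdown logic in one place.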