I have a slice of 30,000 strings. How do I split the processing of this slice across, say, 10 goroutines, so that each one takes 3,000 strings from the slice, extracts some data from them, and pushes the results into a new slice?
So, in the end, I would have 10 slices with 3,000 processed results in each. What's the right pattern for handling this problem?
I have had a look at this article, but I'm not sure which of those patterns applies to my case.
Go by Example: Goroutines
A goroutine is a lightweight thread of execution. Suppose we have a function call f(s). Here's how we'd call that in the usual way, running it synchronously. To invoke this function in a goroutine, use go f(s).
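To make that excerpt concrete, here is a minimal sketch, assuming f is just a function that prints its argument:

```go
package main

import (
	"fmt"
	"time"
)

// f is a stand-in for any function we want to run; here it just prints its argument.
func f(s string) {
	fmt.Println(s)
}

func main() {
	f("synchronous") // ordinary call: runs to completion before the next line

	go f("in a goroutine") // runs concurrently with main

	// Sleep only so the program doesn't exit before the goroutine gets to run;
	// real code would wait with a channel or sync.WaitGroup instead.
	time.Sleep(100 * time.Millisecond)
}
```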
A goroutine has a simple model: it is a function executing concurrently with other goroutines in the same address space. It is lightweight, costing little more than the allocation of stack space. And the stacks start small, so they are cheap, and grow by allocating (and freeing) heap storage as required.
First, you will create a program that uses goroutines to run multiple functions at once. Then you will add channels to that program to communicate between the running goroutines. Finally, you'll add more goroutines to the program to simulate a program running with multiple worker goroutines.
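A compressed sketch of that progression (several worker goroutines communicating over channels), where worker, jobs, and results are hypothetical names chosen for illustration:

```go
package main

import "fmt"

// worker receives strings on jobs and sends a processed result on results.
func worker(jobs <-chan string, results chan<- string) {
	for s := range jobs {
		results <- "processed: " + s
	}
}

func main() {
	jobs := make(chan string)
	results := make(chan string)

	// Several worker goroutines running at once, communicating over channels.
	for i := 0; i < 3; i++ {
		go worker(jobs, results)
	}

	// Feed the work in a separate goroutine, then close jobs so the workers stop.
	go func() {
		for _, s := range []string{"a", "b", "c", "d"} {
			jobs <- s
		}
		close(jobs)
	}()

	// Collect exactly as many results as jobs were sent.
	for i := 0; i < 4; i++ {
		fmt.Println(<-results)
	}
}
```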
Goroutines can be nested to an arbitrary depth, and in that case the interleaving of their output becomes even less predictable.
Read the elements of the slice into a channel, use a fan-out to distribute the load and pass messages, then process the strings in goroutines and collect the results back (fan-in) in a single goroutine to avoid mutexes.
You may also want to cap the maximum number of concurrently running goroutines.
Keep in mind that slices are not thread-safe when written to concurrently.
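Here is a sketch of that fan-out/fan-in approach applied to the question, assuming a hypothetical extract function stands in for the per-string processing, and collecting everything into one output slice in the main goroutine so that no mutex is needed:

```go
package main

import (
	"fmt"
	"sync"
)

// extract is a stand-in for whatever per-string processing you need.
func extract(s string) string {
	return "processed: " + s
}

func main() {
	// Stand-in input; in the question this would be the 30,000-element slice.
	input := make([]string, 30000)
	for i := range input {
		input[i] = fmt.Sprintf("item-%d", i)
	}

	const numWorkers = 10

	jobs := make(chan string)
	results := make(chan string)

	// Fan out: numWorkers goroutines all read from the same jobs channel.
	var wg sync.WaitGroup
	for w := 0; w < numWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for s := range jobs {
				results <- extract(s)
			}
		}()
	}

	// Feed the slice into the jobs channel, then close it so the workers stop.
	go func() {
		for _, s := range input {
			jobs <- s
		}
		close(jobs)
	}()

	// Close results once all workers are done, so the collecting loop below ends.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Fan in: a single goroutine (here, main) appends to the output slice,
	// so no mutex is needed around it.
	out := make([]string, 0, len(input))
	for r := range results {
		out = append(out, r)
	}
	fmt.Println("processed", len(out), "strings")
}
```

Note that results arrive in arbitrary order. If the original order matters, send an index along with each string, or give each worker its own contiguous 3,000-element sub-slice and its own result slice, which is also safe without locks because no two goroutines write to the same slice.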
Useful info:
https://blog.golang.org/pipelines
https://talks.golang.org/2012/concurrency.slide#1
https://blog.golang.org/advanced-go-concurrency-patterns
https://talks.golang.org/2013/advconc.slide#1