If not, what's the maximum while still remaining efficient?
I'm creating 14 threads, each of which opens a list of URLs (about 500) and creates a new thread for each one, which then downloads the URL and adds it to a MySQL database. The MySQL pool size is set to 50.
This is a rake task in RoR.
Would this work better using Kernel#fork or some other method?
require 'open-uri'

a = 'http://www.example.com ' * 30
arr = a.split(' ')

arr.each_slice(3) do |group|
  group.map do |site|
    Thread.new do
      open(site)
      p 'finished'
    end
  end.each(&:join)
end
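As a side note, on Ruby 2.7+ calling Kernel#open on a URL string no longer goes through open-uri; URI.open is the supported form. Here is a minimal sketch of the same slice-and-join pattern with the network call stubbed out so it runs offline (the fetch lambda and the site list are illustrative stand-ins, not from the question):

```ruby
# Stand-in for URI.open(site, &:read); a real version would hit the network.
fetch = ->(site) { "body of #{site}" }

sites = Array.new(9) { |i| "http://www.example.com/#{i}" }
finished = Queue.new # thread-safe collector for completed sites

sites.each_slice(3) do |group|
  group.map do |site|
    Thread.new do
      fetch.call(site)
      finished << site
    end
  end.each(&:join) # wait for this slice before starting the next
end

puts finished.size
```

The each_slice/join pattern caps concurrency at the slice size (3 in-flight threads here), which is the same idea as the question's code.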
With Ruby 1.8, the limit is practically how much memory you have: you can create tens of thousands of threads per process. The Ruby interpreter schedules these green threads itself and only one or two native threads are created, so it isn't true multitasking where the CPU switches between threads.
Ruby 1.9 uses native threads, so the limit seems to be whatever the OS allows. As a quick test, I can create over 2000 threads on my Mac with Ruby 1.9 before the OS disallows any more.
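That kind of probe can be sketched roughly as below. This is a hedged illustration, not the exact test: it spawns sleeping threads until either Ruby raises ThreadError (the OS limit) or a deliberately low safety cap is hit, so it stays harmless to run:

```ruby
CAP = 50 # kept low here so the example is safe; raise it to probe a real limit

count = 0
threads = []

begin
  loop do
    threads << Thread.new { sleep } # each thread just parks
    count += 1
    break if count >= CAP
  end
rescue ThreadError
  # Reached the point where the OS refuses to create another thread.
end

threads.each(&:kill)
puts count
```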
Note that having thousands of threads in a single process isn't a good idea; thread scheduling becomes a burden long before that.
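The usual alternative is a fixed-size worker pool pulling jobs from a Queue, so the thread count stays constant no matter how many URLs there are. A minimal sketch (POOL_SIZE, the numeric jobs, and the doubling step are illustrative stand-ins for the 14 threads, 500 URLs, and download-plus-insert work):

```ruby
POOL_SIZE = 4 # fixed number of worker threads, regardless of job count

jobs    = Queue.new
results = Queue.new

(1..100).each { |i| jobs << i } # stand-ins for the URLs to download
POOL_SIZE.times { jobs << :done } # one sentinel per worker to stop it

workers = POOL_SIZE.times.map do
  Thread.new do
    while (job = jobs.pop) != :done
      results << job * 2 # stand-in for "download and insert into MySQL"
    end
  end
end

workers.each(&:join)
puts results.size
```

Queue#pop blocks when the queue is empty, so the workers need no explicit locking; the sentinel values let each worker exit cleanly once the work runs out.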