 

How to optimize a work queue of well-known time-consuming processes

I have an IEnumerable of work items, ordered descending by the time they take to execute. Now I want all of them to be executed in parallel. Is there a better solution than this one?

IEnumerable<WorkItem> workItemsOrderedByTime = myFactory.WorkItems.DecendentOrderedBy(t => t.ExecutionTime);
Parallel.ForEach(workItemsOrderedByTime,
                 new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount },
                 t => t.Execute());

So my idea is to execute the most expensive tasks (in terms of the time they need) first.

EDIT: The question is whether there is a better solution that gets everything done in the minimum amount of time.

Asked Dec 17 '16 by Lori


2 Answers

To solve your XY Problem of

Because otherwise it can happen that 9 of 10 tasks are finished and the last one is executed on 1 core and all other cores are doing nothing.

What you need to do is tell Parallel.ForEach to only take one item from the source list at a time. That way when you are down to the last items you won't have a bunch of slow work items all in a single core's queue.

This can be done by using Partitioner.Create and passing in EnumerablePartitionerOptions.NoBuffering:

Parallel.ForEach(Partitioner.Create(workItems, EnumerablePartitionerOptions.NoBuffering), 
                new ParallelOptions{MaxDegreeOfParallelism = Environment.ProcessorCount},
                t => t.Execute());
Answered Nov 20 '22 by Scott Chamberlain


  1. By default there is no execution-order guarantee in Parallel.ForEach.
  2. That is why your call to DecendentOrderedBy does not do anything good. It might even do something bad: if the default partitioner decides to do a range partition, dividing, say, 12 WorkItems into 4 groups of 3 consecutive items, then the first core gets the most expensive items and has much more work to do, creating exactly the problem you are trying to avoid.
  3. An easy fix for (2) is explained in Scott's answer. If Parallel.ForEach takes just one item at a time, you naturally get some load balancing. In most cases this will work fine.
  4. The optimal solution (in most cases) for an ordered IEnumerable like yours is striped partitioning with the number of stripes equal to the number of cores. AFAIK you don't get this out of the box in .NET, but you can provide a custom OrderablePartitioner that partitions the data just this way.
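The striped idea from point 4 can be sketched without a full custom OrderablePartitioner by assigning item indices i, i + cores, i + 2*cores, ... to worker i, so each worker gets a similar mix of expensive and cheap items. This is only a sketch under assumptions: the WorkItem class below (with an ExecutionTime and an Execute() method) is a hypothetical stand-in for the one in the question.

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical stand-in for the question's WorkItem.
class WorkItem
{
    public int ExecutionTime;                 // simulated cost in milliseconds
    public void Execute() => Thread.Sleep(ExecutionTime);
}

class StripedDemo
{
    static void Main()
    {
        int cores = Environment.ProcessorCount;

        // Items ordered descending by expected execution time, as in the question.
        WorkItem[] items = Enumerable.Range(1, 12)
            .Select(i => new WorkItem { ExecutionTime = 130 - 10 * i })
            .ToArray();

        var workers = new Task[cores];
        for (int c = 0; c < cores; c++)
        {
            int stripe = c; // worker c handles items c, c + cores, c + 2*cores, ...
            workers[c] = Task.Run(() =>
            {
                for (int i = stripe; i < items.Length; i += cores)
                    items[i].Execute();
            });
        }
        Task.WaitAll(workers);
    }
}
```

Because the input is sorted descending, each stripe's total cost is roughly balanced, unlike a range partition where the first stripe would get all the expensive items.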

I am sorry to say it, but: there is no free lunch.

Answered Nov 20 '22 by Dima Ogurtsov