 

Multiprocess multiple files in a list

I am trying to read a list of N .csv files simultaneously.

Right now I do the following:

import multiprocessing

  1. Create an empty list
  2. Append the list with the .csv file names from listdir
  3. def A() -- reads the even-indexed files (file_list[::2])
  4. def B() -- reads the odd-indexed files (file_list[1::2])
  5. Process 1 runs def A()
  6. Process 2 runs def B()

    import glob
    from multiprocessing import Process

    def read_all_lead_files(folder):
        # collect every .csv path in the folder
        file_list = glob.glob(folder + "*.csv")

        def read_even():
            for path in file_list[::2]:
                pass  # read a file here

        def read_odd():
            for path in file_list[1::2]:
                pass  # read a file here

        p1 = Process(target=read_even)
        p1.start()
        p2 = Process(target=read_odd)
        p2.start()
        p1.join()
        p2.join()
    

Is there a faster way to partition the list across the Process functions?

asked May 21 '14 by Christopher W


1 Answer

I'm guessing here at your request, because the original question is quite unclear. Since os.listdir doesn't guarantee an ordering, I'm assuming your "two" functions are actually identical and you just need to perform the same process on multiple files simultaneously.

The easiest way to do this, in my experience, is to spin up a Pool, launch a process for each file, and then wait. e.g.

import glob
import multiprocessing

def process(file):
    pass # do stuff to a file

p = multiprocessing.Pool()
for f in glob.glob(folder+"*.csv"):
    # launch a process for each file (ish).
    # The result will be approximately one process per CPU core available.
    p.apply_async(process, [f]) 

p.close()
p.join() # Wait for all child processes to close.
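If you also want the parsed contents back in the parent process, the same idea can be written with Pool.map, which blocks until every file has been processed and returns one result per file in input order. This is only a sketch: read_one is a hypothetical per-file reader, and folder is assumed to hold the directory path as in the question.

import glob
import multiprocessing

def read_one(path):
    # hypothetical per-file work: just return the lines of one CSV
    with open(path) as fh:
        return fh.readlines()

if __name__ == "__main__":
    folder = "./"  # assumed: directory containing the .csv files, as in the question
    files = glob.glob(folder + "*.csv")
    with multiprocessing.Pool() as pool:
        # one result per file, in the same order as `files`
        results = pool.map(read_one, files)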
answered Oct 04 '22 by Henry Keiter