 

Ruby - Read file in batches

I am reading a file that is 10 MB in size and contains some IDs. I read them into a list in Ruby. I am concerned that this might cause memory issues in the future, when the number of IDs in the file increases. Is there an effective way of reading a large file in batches?

Thank you

Boolean asked Jun 02 '10

1 Answer

With lazy enumerators and each_slice, you can get the best of both worlds: you don't need to worry about cutting lines in the middle, and you can iterate over multiple lines per batch. batch_size can be chosen freely.

header_lines = 1
batch_size   = 2000

File.open("big_file") do |file|
  # skip the header line(s), then yield batch_size lines at a time
  file.lazy.drop(header_lines).each_slice(batch_size) do |lines|
    # do something with batch of lines
  end
end
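
Applied to the question's ID file, a minimal sketch (assuming a hypothetical ids.txt with one ID per line and no header) could look like this:

batch_size = 2000

File.open("ids.txt") do |file|
  file.lazy.each_slice(batch_size) do |lines|
    # only this batch of lines is held in memory at any moment
    ids = lines.map { |line| line.chomp.to_i }
    # do something with up to 2000 ids, e.g. look them up or bulk insert them
  end
end

Since the lazy enumerator pulls lines from the file on demand, memory use stays bounded no matter how many IDs the file eventually contains.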

It could be used to import a huge CSV file into a database:

require 'csv'
batch_size   = 2000

File.open("big_data.csv") do |file|
  headers = file.first
  file.lazy.each_slice(batch_size) do |lines|
    csv_rows = CSV.parse(lines.join, headers: headers)
    # do something with 2000 csv rows, e.g. bulk insert them into a database
  end
end
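
Note that file.first consumes the header line before the lazy iteration begins, so it never appears in the batches; passing it as the headers: option makes CSV.parse return rows addressable by column name.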
Eric Duminil answered Oct 25 '22