I want to read a huge CSV file containing around 500,000 rows. I am using the OpenCSV library for it. My code looks like this:
CsvToBean<User> csvConvertor = new CsvToBean<User>();
List<User> list = null;
try {
    list = csvConvertor.parse(strategy, new BufferedReader(new FileReader(filepath)));
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
Up to 200,000 records, the data is read into the list of User bean objects without a problem. But for anything beyond that I get
java.lang.OutOfMemoryError: Java heap space
I have these memory settings in the "eclipse.ini" file:
-Xms256m
-Xmx1024m
One solution I am considering is splitting the huge file into separate smaller files and reading those one by one, but that seems cumbersome.
Is there any other way by which I can avoid the OutOfMemoryError?
Read the file line by line instead of loading everything into a list, something like this:
import com.opencsv.CSVReader;   // au.com.bytecode.opencsv.CSVReader in older OpenCSV versions
import java.io.FileReader;

try (CSVReader reader = new CSVReader(new FileReader("yourfile.csv"))) {
    String[] nextLine;
    while ((nextLine = reader.readNext()) != null) {
        // nextLine[] is an array of values from the current line
        System.out.println(nextLine[0] + nextLine[1] + " etc...");
    }
}
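If you still need User beans rather than raw string arrays, newer OpenCSV versions (4.x and later, if I recall correctly) let you iterate over CsvToBean lazily instead of materialising the whole list with parse(). A minimal sketch, assuming OpenCSV 4+, the User bean and mapping strategy from your question, and a hypothetical per-row handler handleUser():

import com.opencsv.bean.CsvToBean;
import com.opencsv.bean.CsvToBeanBuilder;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.Reader;

try (Reader reader = new BufferedReader(new FileReader(filepath))) {
    CsvToBean<User> csvToBean = new CsvToBeanBuilder<User>(reader)
            .withMappingStrategy(strategy)   // the strategy from your question
            .build();

    // CsvToBean is Iterable: each iteration parses and converts one row,
    // so the 500,000 beans are never all held in memory at the same time.
    for (User user : csvToBean) {
        handleUser(user);                    // hypothetical per-row processing
    }
}

With either approach the memory footprint stays roughly constant regardless of file size, so you should not need to raise -Xmx just to load the data.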