I am using LINQ to parse a large list of strings read from CSV files. My code works fine with a 100 MB file, but it cannot go beyond that due to a stack overflow exception. I am testing it with 500 MB files, where the list contains around 4 million strings (approximately 4 million lines in a 500 MB CSV file).
public List<Metrics> MetricsParser(DateTime StartDate, TimeSpan StartTime, DateTime EndDate, TimeSpan EndTime, int dateIndex, int timeIndex)
{
    DateTime sd = StartDate;
    DateTime ed = EndDate;
    TimeSpan st = StartTime;
    TimeSpan et = EndTime;
    StreamReader streamReader = null;
    List<string> lines = new List<string>();
    try
    {
        streamReader = new StreamReader("file.csv");
        lines.Clear();
        while (!streamReader.EndOfStream)
            lines.Add(streamReader.ReadLine());
    }
    catch (Exception ex)
    {
        throw ex;
    }
    finally
    {
        if (streamReader != null)
            streamReader.Close();
    }

    IEnumerable<Metrics> parsedFileData =
        from line in lines
        let log = line.Split(',')
        where !(line.StartsWith("#")) & (line.Length > 0)
        let dateVal = _utility.GetDateTime(dateformatType, log[(int)dateIndex], log[(int)timeIndex])
        let timeVal = _utility.GetTime(log[(int)timeIndex], timeformatType)
        where dateVal >= new DateTime(sd.Year, sd.Month, sd.Day, st.Hours, st.Minutes, st.Seconds)
            & dateVal <= new DateTime(ed.Year, ed.Month, ed.Day, et.Hours, et.Minutes, et.Seconds)
        select new Metrics()
        {
            Date = dateVal,
            Metrics1 = log[(int)Metrics1Index],
            Metrics2 = (Metrics2Index != null) ? log[(int)Metrics2Index] : "default",
            Metrics3 = (log[(int)Metrics3Index] == null || log[(int)Metrics3Index] == "") ? "-" : log[(int)Metrics3Index],
            Metrics4 = (log[(int)Metrics4Index] == null || log[(int)Metrics4Index] == "") ? "-" : log[(int)Metrics4Index],
            Metrics5 = (log[(int)Metrics5Index] == null || log[(int)Metrics5Index] == "") ? "-" : log[(int)Metrics5Index],
            Metrics6 = (log[(int)Metrics6Index] == null || log[(int)Metrics6Index] == "") ? "-" : log[(int)Metrics6Index],
            Metrics7 = (log[(int)Metrics7Index] == null || log[(int)Metrics7Index] == "") ? "-" : log[(int)Metrics7Index],
            Metrics8 = (log[(int)Metrics8Index] == null || log[(int)Metrics8Index] == "") ? "-" : log[(int)Metrics8Index],
            Metrics9 = (log[(int)Metrics9Index] == null || log[(int)Metrics9Index] == "") ? "-" : log[(int)Metrics9Index],
        };

    return parsedFileData.ToList();
}
Any ideas how to achieve this with larger files?
I tried the following, as per some suggestions, but it could not overcome the stack overflow exception either:
try
{
    streamReader = new StreamReader("file.csv");
    while (!streamReader.EndOfStream)
    {
        var line = streamReader.ReadLine();
        if (!(line.StartsWith("#")) & (line.Length > 0))
        {
            var log = line.Split(',');
            var dateVal = _utility.GetDateTime(dateformatType, log[(int)dateIndex], log[(int)timeIndex]);
            parsedData.Add(
                new Metrics()
                {
                    Date = dateVal,
                    Metrics1 = log[(int)Metrics1Index],
                    Metrics2 = (Metrics2Index != null) ? log[(int)Metrics2Index] : "default",
                    Metrics3 = (log[(int)Metrics3Index] == null || log[(int)Metrics3Index] == "") ? "-" : log[(int)Metrics3Index],
                    Metrics4 = (log[(int)Metrics4Index] == null || log[(int)Metrics4Index] == "") ? "-" : log[(int)Metrics4Index],
                    Metrics5 = (log[(int)Metrics5Index] == null || log[(int)Metrics5Index] == "") ? "-" : log[(int)Metrics5Index],
                    Metrics6 = (log[(int)Metrics6Index] == null || log[(int)Metrics6Index] == "") ? "-" : log[(int)Metrics6Index],
                    Metrics7 = (log[(int)Metrics7Index] == null || log[(int)Metrics7Index] == "") ? "-" : log[(int)Metrics7Index],
                    Metrics8 = (log[(int)Metrics8Index] == null || log[(int)Metrics8Index] == "") ? "-" : log[(int)Metrics8Index],
                    Metrics9 = (log[(int)Metrics9Index] == null || log[(int)Metrics9Index] == "") ? "-" : log[(int)Metrics9Index],
                }
            );
        }
    }
}
Thanks for the ideas!
Try to parse the file line by line instead of loading it all into memory, like this:
var parsedFileData = new List<Metrics>();
while (!streamReader.EndOfStream)
{
    var line = streamReader.ReadLine();
    if (IsLineNeedToBeParsed(line))
        parsedFileData.Add(ParseLine(line));
}
Here ParseLine is a method that contains the body of your LINQ query but operates on a single line, and IsLineNeedToBeParsed is your where clause. As far as I can see, you don't do any joining of lines.
Avoid loading the whole file contents and then running one large query with lots of let clauses; it will consume a lot of memory during execution.
Try to create pure functions that filter, select, or aggregate data, and then, if you still don't like the performance, optimize the query by adding state, eliminating redundant computation, maybe caching, adding batches, and so on.
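For example, a batched version of the loop might look roughly like this. This is only a sketch: BatchSize and ProcessBatch are made-up names for whatever chunk size and output step you choose, while IsLineNeedToBeParsed and ParseLine are the helpers described above.

// Rough sketch only: BatchSize and ProcessBatch are hypothetical placeholders.
private const int BatchSize = 10000;

private void ParseInBatches(StreamReader streamReader)
{
    var batch = new List<Metrics>(BatchSize);
    while (!streamReader.EndOfStream)
    {
        var line = streamReader.ReadLine();
        if (!IsLineNeedToBeParsed(line))
            continue;

        batch.Add(ParseLine(line));
        if (batch.Count == BatchSize)
        {
            ProcessBatch(batch);   // e.g. write out or aggregate, then forget the batch
            batch.Clear();
        }
    }

    if (batch.Count > 0)
        ProcessBatch(batch);       // handle the final partial batch
}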
One quick point to make: you should make file loading lazy, like this:
private IEnumerable<string> GetAllLines(string path)
{
    using (StreamReader streamReader = new StreamReader(path))
    {
        while (!streamReader.EndOfStream)
        {
            yield return streamReader.ReadLine();
        }
    }
}
Then you can call it from the LINQ query, like from line in GetAllLines("file.csv"), and all lines will be loaded on demand, so your memory consumption should stay relatively constant during execution.
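For example, the query from the question can stay essentially the same; only the data source changes (the select body is abbreviated here):

var parsedFileData = from line in GetAllLines("file.csv")
                     let log = line.Split(',')
                     where !line.StartsWith("#") && line.Length > 0
                     let dateVal = _utility.GetDateTime(dateformatType, log[(int)dateIndex], log[(int)timeIndex])
                     where dateVal >= new DateTime(sd.Year, sd.Month, sd.Day, st.Hours, st.Minutes, st.Seconds)
                        && dateVal <= new DateTime(ed.Year, ed.Month, ed.Day, et.Hours, et.Minutes, et.Seconds)
                     select new Metrics
                     {
                         Date = dateVal,
                         // ... same property assignments as in the question ...
                     };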
UPDATE:
I've just found that File.ReadLines(string path) reads the file lazily by creating a ReadLinesIterator internally, so you can use that call directly inside your LINQ query.
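In other words, using the helpers from the refactoring below, the whole thing could be reduced to something like:

// File.ReadLines returns a lazy IEnumerable<string>, unlike File.ReadAllLines,
// which reads the entire file into an array up front.
var parsedFileData = (from line in File.ReadLines("file.csv")
                      where IsLineNeedToBeParsed(line)
                      select ParseLine(line)).ToList();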
I've taken the liberty of refactoring your code a little bit. Note that you still need to add some checks and this is not the final version; I just want to show the general idea. Also note that I haven't compiled it, because your code relies on parser state whose types and values I know nothing about. The code is a little longer than yours, but I keep in mind the Clean Code book by Robert Martin, which makes the point that it is not shortness that makes code readable. Please correct me if I'm wrong somewhere.
public List<Metrics> MetricsParser(DateTime StartDate, TimeSpan StartTime, DateTime EndDate, TimeSpan EndTime, int dateIndex, int timeIndex)
{
    DateTime sd = StartDate;
    DateTime ed = EndDate;
    TimeSpan st = StartTime;
    TimeSpan et = EndTime;
    List<Metrics> parsedFileData = new List<Metrics>();
    using (StreamReader streamReader = new StreamReader("file.csv"))
    {
        while (!streamReader.EndOfStream)
        {
            var line = streamReader.ReadLine();
            if (IsLineNeedToBeParsed(line))
                parsedFileData.Add(ParseLine(line));
        }
    }
    return parsedFileData;
}

// Note: sd, ed, st, et (and dateIndex/timeIndex) are locals/parameters of MetricsParser above;
// they would need to become fields or be passed in for the helpers below to compile.
private bool IsLineNeedToBeParsed(string line)
{
    return !line.StartsWith("#") && line.Length > 0 && IsInDateRange(line);
}

private bool IsInDateRange(string line)
{
    var dateVal = GetDateTime(line);
    return dateVal >= new DateTime(sd.Year, sd.Month, sd.Day, st.Hours, st.Minutes, st.Seconds)
        && dateVal <= new DateTime(ed.Year, ed.Month, ed.Day, et.Hours, et.Minutes, et.Seconds);
}

private Metrics ParseLine(string line)
{
    var log = line.Split(',');
    var time = _utility.GetTime(log[(int)timeIndex], timeformatType);
    var dateVal = GetDateTime(line);
    return new Metrics { Date = dateVal /* fill the remaining values here */ };
}

private DateTime GetDateTime(string line)
{
    var log = line.Split(',');
    return _utility.GetDateTime(dateformatType, log[(int)dateIndex], log[(int)timeIndex]);
}

public class Metrics { }