I'm calling a stored procedure that returns multiple result sets (always 2) and writing each result set to a separate file in pipe-delimited format. I can't split the result sets into separate stored procedures. I'm using IDataReader and IEnumerable to keep as little in memory as possible during the process.
Is there a cleaner way of consuming my IEnumerable<IEnumerable<string>>
than calling GetEnumerator/MoveNext/Current to reach each inner IEnumerable<string>
and pass it to File.AppendAllLines?
public void Execute()
{
    var reader = GetLines();
    using (var enumerator = reader.GetEnumerator())
    {
        enumerator.MoveNext();
        File.AppendAllLines("file1.dat", enumerator.Current);
        enumerator.MoveNext();
        File.AppendAllLines("file2.dat", enumerator.Current);
    }
}
public IEnumerable<IEnumerable<string>> GetLines()
{
    Database db = DatabaseFactory.CreateDatabase("connectionStringKey");
    using (var command = db.GetStoredProcCommand("getdata_sp"))
    using (var reader = db.ExecuteReader(command))   // dispose the reader as well as the command
    {
        yield return GetInnerEnumerable(reader);
        reader.NextResult();
        yield return GetInnerEnumerable(reader);
    }
}
private IEnumerable<string> GetInnerEnumerable(IDataReader reader)
{
    while (reader.Read())
    {
        object[] rowValues = new object[reader.FieldCount];
        reader.GetValues(rowValues);
        yield return String.Join("|", rowValues);
    }
}
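One caveat with String.Join("|", rowValues): if a column value itself contains a pipe, the resulting line is ambiguous to any downstream parser. A minimal sketch of a row formatter that escapes embedded pipes; the helper name and the "\|" escape convention are assumptions, not part of the original code:

```csharp
// Requires `using System.Linq;`. Convert.ToString maps null and
// DBNull.Value to an empty string before escaping.
private static string ToPipeDelimitedLine(object[] rowValues)
{
    var cells = rowValues.Select(v => Convert.ToString(v).Replace("|", "\\|"));
    return String.Join("|", cells);
}
```

Whether to escape (and how) depends on what reads the .dat files, so treat this as a design choice rather than a required fix.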
Why not a foreach loop? That's the most basic approach.
Personally, I would just use a foreach loop with a separate variable to track which file to write to, something like:
public void Execute()
{
    var reader = GetLines();
    int i = 0;
    foreach (var inner in reader)
    {
        if (i % 2 == 0)
            File.AppendAllLines("file1.dat", inner);
        else
            File.AppendAllLines("file2.dat", inner);
        ++i;
    }
}
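If the counter feels noisy, another option is to pair each result set with its file name by position. This is a sketch assuming .NET 4 or later (for Enumerable.Zip) and that GetLines is the method from the question:

```csharp
// Requires a `using System.Linq;` directive for Enumerable.Zip.
public void Execute()
{
    var fileNames = new[] { "file1.dat", "file2.dat" };

    // Zip stops at the shorter sequence, so an unexpected third
    // result set would be silently dropped rather than written.
    foreach (var pair in fileNames.Zip(GetLines(), (name, lines) => new { name, lines }))
    {
        File.AppendAllLines(pair.name, pair.lines);
    }
}
```

This also keeps the file names in one place, so adding a third result set later means only extending the array.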