I am trying to get the available memory in a process to ensure I do not get an OutOfMemoryException. I have searched the internet and found several examples of how to get memory used but not available.
Let me provide the use case...
I have a process that is doing a bulk insert (using SqlBulkCopy). I am passing a DataTable into the WriteToServer method. I cannot use a DataReader because I have to be able to retry the process upon failure. My first thought was to pick an arbitrary number of rows to insert at a time, say 50,000. But this is a generic process that does not know the data; it does not know the number of columns or the amount of data in each row. So I was thinking I could monitor the memory as I am adding rows to the DataTable and then post it to SqlBulkCopy when it got close to running out of memory.
Is this a valid approach or is there a better way?
If this is a valid approach, what function would I use to determine the amount of available memory?
Here is my code so far... The AvailableMemoryIsLow check is what I cannot figure out how to implement.
// m_buffer is a read-once cache (implements IDataReader) that pulls
// data from an external source as needed so it uses very little memory.
// My original implementation just used m_buffer as the parameter of
// WriteToServer but now I have to add retry logic into the process.
DataTable dataTable = new DataTable(m_tableName);
foreach (DataField d in m_buffer.GetColumns())
    dataTable.Columns.Add(new DataColumn(d.FieldName, d.FieldType));

int rowCount = 0;
while (m_buffer.Read())
{
    DataRow row = dataTable.NewRow();
    for (int i = 0; i < m_buffer.FieldCount; i++)
        row[i] = m_buffer.GetValue(i);
    dataTable.Rows.Add(row);

    // How do I determine AvailableMemoryIsLow?
    if (rowCount++ >= 50000 || AvailableMemoryIsLow)
    {
        PutDataIntoDatabase(dataTable);
        dataTable.Clear();
        rowCount = 0;
    }
}
if (dataTable.Rows.Count > 0)
    PutDataIntoDatabase(dataTable);
Clearly you are running this code on a 32-bit machine or you wouldn't have this problem. In general, pushing a program to consume nearly all of the available virtual memory space (2 gigabytes) is not a reasonable thing to do. Apart from the ever-present danger of OOM, the kind of data you are handling is "live data": it is highly likely to be mapped to RAM. A program that demands nearly all available RAM is quite detrimental to the operation of that program, the operating system, and every other process running on the machine.
You force the operating system to choose how to allocate RAM between what the processes need and what it reserves for the file system cache. That kind of choice always ends up forcing data from RAM into the paging file. That can slow down operation a great deal, both when the data is written out and again when a process needs it back in RAM. This operating system performance problem is called "thrashing".
Just don't do this; cramming so much data into RAM doesn't make your program any faster, it makes it slower. A reasonable upper limit on the amount of RAM to consume on a 32-bit operating system is somewhere near 500 megabytes. There is no need to hit that limit exactly; counting rows is quite good enough.
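That said, if you still want a defensive check before buffering another batch, .NET does provide System.Runtime.MemoryFailPoint, which asks the CLR whether an allocation of a given size is likely to succeed without actually allocating it. A minimal sketch of an AvailableMemoryIsLow-style helper built on it (the 200 MB probe size is an arbitrary assumption, not something from the question; pick a value that reflects your batch size):

```csharp
using System;
using System.Runtime;

class BatchGuard
{
    // Returns true when the CLR predicts that allocating roughly the
    // given number of megabytes would fail. MemoryFailPoint only probes;
    // it does not allocate the memory itself.
    static bool AvailableMemoryIsLow(int megabytes)
    {
        try
        {
            using (new MemoryFailPoint(megabytes))
            {
                return false; // the allocation would likely succeed
            }
        }
        catch (InsufficientMemoryException)
        {
            return true; // memory is too tight to keep buffering
        }
    }

    static void Main()
    {
        // A hypothetical 200 MB headroom check before growing the batch.
        Console.WriteLine(AvailableMemoryIsLow(200));
    }
}
```

Note that MemoryFailPoint is a probabilistic gate, not a guarantee: another thread or process can consume the memory between the probe and your allocation, so it complements, rather than replaces, a sane fixed row-count cap.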