I have a dynamic query that returns around 590,000 records. It runs successfully the first time, but if I run it again, I keep getting a System.OutOfMemoryException. What are some reasons this could be happening?
The error is happening here:
public static DataSet GetDataSet(string databaseName, string storedProcedureName, params object[] parameters)
{
    // Creates blank dataset
    DataSet ds = null;
    try
    {
        // Creates database
        Database db = DatabaseFactory.CreateDatabase(databaseName);
        // Creates command to execute
        DbCommand dbCommand = db.GetStoredProcCommand(storedProcedureName);
        dbCommand.CommandTimeout = COMMAND_TIMEOUT;
        // Returns the list of SQL parameters associated with that stored procedure
        db.DiscoverParameters(dbCommand);
        // Start at index 1: parameter 0 is the stored procedure's return value
        int i = 1;
        // Loop through the list of parameters and set the values
        foreach (object parameter in parameters)
        {
            dbCommand.Parameters[i++].Value = parameter;
        }
        // Retrieve dataset and set to ds
        ds = db.ExecuteDataSet(dbCommand);
    }
    // Check for exceptions
    catch (SqlException)
    {
        throw; // rethrow without resetting the stack trace
    }
    catch (Exception)
    {
        throw; // Error is thrown here.
    }
    // Returns dataset
    return ds;
}
Here is the code that runs on the button click:
protected void btnSearchSBIDatabase_Click(object sender, EventArgs e)
{
    LicenseSearch ls = new LicenseSearch();
    DataTable dtSearchResults = ls.Search();
    Session["dtSearchResults"] = dtSearchResults;
    Response.Redirect("~/FCCSearch/SearchResults.aspx");
}
else
    lblResults.Visible = true;
}
In my case, to recover from the "OutOfMemoryException was thrown" error I had to restart Visual Studio or terminate the IIS Express process from Windows Task Manager. This error can happen for a variety of reasons related to the application's memory consumption.
As the topic of the question suggests, the best way to avoid an out-of-memory exception is not to create the objects that fill that memory in the first place. You can then cap how much you hold in memory at once based on an estimate of a single object's footprint, or have each worker thread check available memory before allocating more.
Regardless of what the others have said, the error has nothing to do with forgetting to dispose your DbCommand or DbConnection, and disposing either of them will not fix the error.
The error has everything to do with your dataset, which contains nearly 600,000 rows of data. Apparently your dataset consumes more than 50% of the available memory on your machine, so you run out of memory when you return a second dataset of the same size before the first one has been garbage collected. Simple as that.
You can remedy this problem in a few ways:
Consider returning fewer records. I personally can't imagine a time when returning 600K records has ever been useful to a user. To minimize the records returned, try:
Limiting your query to the first 1000 records. If the query matches more than 1000 rows, inform the user to narrow their search criteria.
If your users really insist on seeing that much data at once, try paging the data. Remember: Google never shows you all 22 bajillion results of a search at once; it shows you 20 or so records at a time. Google probably doesn't hold all 22 bajillion results in memory at once; it probably finds it more memory-efficient to re-query its database to generate a new page.
If you just need to iterate through the data and you don't need random access, try returning a DataReader instead. A DataReader loads only one record into memory at a time.
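The DataReader approach can be sketched roughly like this, using a plain SqlConnection rather than the Enterprise Library helper from the question; the connection string, the stored procedure name, and ProcessRow are placeholders, not names from the original code:

```csharp
using System.Data;
using System.Data.SqlClient;

// Stream rows one at a time instead of materializing ~590K rows in a DataSet.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.SearchLicenses", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // Only the current row is held in memory at any point.
            ProcessRow(reader);
        }
    }
}
```

The trade-off is that a DataReader holds the connection open while you iterate and offers forward-only access, so it only fits scenarios where you consume the rows once, in order.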
If none of those is an option, then you need to force .NET to free the memory used by the old dataset before calling your method again, using one of these approaches:
Remove all references to your old dataset. Anything holding a reference to your dataset will prevent it from being reclaimed by the garbage collector.
If you can't null out all the references to your dataset, clear all of the rows from the dataset, and any objects bound to those rows, instead. This removes the references to the DataRows and allows them to be collected.
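For example, if the old table is the one cached in Session by the question's button handler, releasing it might look like this. This is only a sketch: whether anything else still references the table depends on the rest of the page.

```csharp
// Drop the references to the old results so the GC can reclaim them.
DataTable old = Session["dtSearchResults"] as DataTable;
if (old != null)
{
    old.Clear();                       // removes the rows and the references bound to them
    Session.Remove("dtSearchResults"); // removes the session's reference to the table
}
```

Run this before executing the search again, so the old 590K-row table is eligible for collection before the new one is built.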
I don't believe you'll need to call GC.Collect() to force a collection. Not only is it generally a bad idea to call GC.Collect(), it's also unnecessary here: sufficient memory pressure will cause .NET to invoke the garbage collector on its own.
Note: calling Dispose on your dataset does not free any memory, nor does it invoke the garbage collector, nor does it remove a reference to your dataset. Dispose is used to clean up unmanaged resources, but the DataSet does not have any unmanaged resources. It only implements IDisposable because it inherits from MarshalByValueComponent, so the Dispose method on the DataSet is pretty much useless.
Perhaps you're not disposing of the connection/result classes from the previous run, which means they're still hanging around in memory.
You're obviously not disposing of things.
Consider the using statement when temporarily working with objects that implement IDisposable.
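For instance, a using statement guarantees Dispose is called even when an exception is thrown; the connection string and command text below are illustrative, not taken from the question's code:

```csharp
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.SearchLicenses", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    connection.Open();
    // ... execute the command and consume the results ...
} // Dispose runs here for both objects, even if an exception was thrown
```

This is equivalent to wrapping the body in try/finally with Dispose in the finally block, which is why it is the idiomatic way to handle short-lived connections and commands.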