public static IList<T> ConvertTo<T>(DataTable table)
{
    if (table == null)
    {
        return null;
    }

    List<DataRow> rows = new List<DataRow>();
    foreach (DataRow row in table.Rows)
    {
        rows.Add(row);
    }
    return ConvertTo<T>(rows);
}

public static T ConvertItem<T>(DataTable table)
{
    T obj = default(T);
    if (table != null && table.Rows.Count > 0)
    {
        obj = CreateItem<T>(table.Rows[0]);
    }
    return obj;
}

public static T CreateItem<T>(DataRow row)
{
    T obj = default(T);
    if (row != null)
    {
        obj = Activator.CreateInstance<T>();
        Type entityType = typeof(T);
        PropertyInfo[] properties = entityType.GetProperties();
        for (int i = 0; i < properties.Length; i++)
        {
            object[] customAttributes = properties[i].GetCustomAttributes(typeof(ColumnAttributes), false);
            ColumnAttributes dataField = null;
            if (null != customAttributes && customAttributes.Length > 0 && null != (dataField = customAttributes[0] as ColumnAttributes))
            {
                if (row.Table.Columns.Contains(dataField.FieldName) && !row[dataField.FieldName].GetType().FullName.Equals("System.DBNull"))
                {
                    properties[i].SetValue(obj, row[dataField.FieldName], null);
                }
            }
        }
    }
    return obj;
}
That's the only thing we can think of right now: are we doing something that means we need to garbage collect ourselves?
Thoughts?
Why do we think there might be a leak?
We are getting out-of-memory errors. If a page's business logic does not use this type of conversion, the IIS 6 process does not grow, but when we hit a page that does use it, it grows.
We are currently getting ANTS Profiler to give us more details.
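For context, the ColumnAttributes type that the conversion code relies on is not shown in the question; presumably it is a simple custom attribute along these lines (a sketch only, the real class may differ):

using System;

// Hypothetical reconstruction of the attribute used by CreateItem<T>: it simply
// carries the name of the DataTable column that a property maps to.
[AttributeUsage(AttributeTargets.Property, AllowMultiple = false)]
public sealed class ColumnAttributes : Attribute
{
    private readonly string fieldName;

    public ColumnAttributes(string fieldName)
    {
        this.fieldName = fieldName;
    }

    public string FieldName
    {
        get { return fieldName; }
    }
}

// Usage on an entity property would then look something like:
// [ColumnAttributes("CustomerName")]
// public string Name { get; set; }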
There are three common ways to convert a DataTable to a List: using a loop, using LINQ, or using a generic method.
A DataTable is not an IEnumerable by itself. Internally, the .AsEnumerable() call on it takes the .Rows collection and exposes it as an IEnumerable<DataRow>.
In the ADO.NET library, the DataTable is a central object: it represents a database table as a collection of rows and columns in grid form. There are different ways to create rows and columns in a DataTable.
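As a minimal sketch of the LINQ route, assuming a hypothetical Person class whose property names match the column names ("Id", "Name"), rather than the attribute-based mapping used in the question:

using System.Collections.Generic;
using System.Data;
using System.Linq;

class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}

static List<Person> ToPeople(DataTable table)
{
    // AsEnumerable() wraps table.Rows as an IEnumerable<DataRow>;
    // Field<T>() casts the column value and handles DBNull for nullable types.
    return table.AsEnumerable()
        .Select(row => new Person
        {
            Id = row.Field<int>("Id"),
            Name = row.Field<string>("Name")
        })
        .ToList();
}

Both AsEnumerable() and Field<T>() come from System.Data.DataSetExtensions (.NET 3.5 and later).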
That won't be an actual leak, but it could be stressing things unnecessarily...
How many rows are you working over?
Note that reflection is a pain, and that every call to things like GetCustomAttributes can return a new array (so you want to do that once, not once per property per row). Personally, I'd pre-construct the work I intend to do... something like below.
Note that if I was doing this a lot, I'd either switch to HyperDescriptor or, if .NET 3.5 was an option, maybe a compiled Expression. Since DataTable isn't strongly typed, HyperDescriptor would be a logical next step (for performance) after the below...
sealed class Tuple<T1, T2>
{
    public Tuple() { }
    public Tuple(T1 value1, T2 value2) { Value1 = value1; Value2 = value2; }
    public T1 Value1 { get; set; }
    public T2 Value2 { get; set; }
}

public static List<T> Convert<T>(DataTable table)
    where T : class, new()
{
    List<Tuple<DataColumn, PropertyInfo>> map =
        new List<Tuple<DataColumn, PropertyInfo>>();
    foreach (PropertyInfo pi in typeof(T).GetProperties())
    {
        ColumnAttribute col = (ColumnAttribute)
            Attribute.GetCustomAttribute(pi, typeof(ColumnAttribute));
        if (col == null) continue;
        if (table.Columns.Contains(col.FieldName))
        {
            map.Add(new Tuple<DataColumn, PropertyInfo>(
                table.Columns[col.FieldName], pi));
        }
    }
    List<T> list = new List<T>(table.Rows.Count);
    foreach (DataRow row in table.Rows)
    {
        if (row == null)
        {
            list.Add(null);
            continue;
        }
        T item = new T();
        foreach (Tuple<DataColumn, PropertyInfo> pair in map)
        {
            object value = row[pair.Value1];
            if (value is DBNull) value = null;
            pair.Value2.SetValue(item, value, null);
        }
        list.Add(item);
    }
    return list;
}