I am new to JSON and SQLBulkCopy. I have a JSON formatted POST data that I want to Bulk Copy/Insert in Microsoft SQL using C#.
JSON Format:
{ "URLs": [{ "url_name": "Google", "url_address": "http://www.google.com/" }, { "url_name": "Yahoo", "url_address": "http://www.yahoo.com/" }, { "url_name": "FB", "url_address": "http://www.fb.com/" }, { "url_name": "MegaSearches", "url_address": "http://www.megasearches.com/" }] }
Classes:
public class UrlData
{
    public List<Url> URLs { get; set; }
}

public class Url
{
    public string url_address { get; set; }
    public string url_name { get; set; }
}
How can I do that efficiently?
TL;DR If you have your data already represented as a DataTable, you can insert it into the destination table on the server with SqlBulkCopy:
string csDestination = "put here the a connection string to the database"; using (SqlConnection connection = new SqlConnection(csDestination)) using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection)) { connection.Open() bulkCopy.DestinationTableName = "TUrls"; bulkCopy.WriteToServer(dataTableOfUrls); }
If you want to load just "from 10 to 50 urls", there's no need to use SqlBulkCopy - its general purpose is to eliminate thousands of separate inserts.
So, inserting without SqlBulkCopy [and without Entity Framework] can be done one by one:
string insertQuery = "insert into TUrls(address, name) values(@address, @name)"; foreach (URL url in listOfUrls) { SqlCommand cmd = new SqlCommand(insertQuery); cmd.Parameters.AddWithValue("@name", url.url_name); cmd.Parameters.AddWithValue("@address", url.urld_address); // Remember to take care of connection! I omit this part for clarity cmd.ExecuteNonQuery(); }
To insert data with SqlBulkCopy you need to convert your data (e.g. a list of custom class objects) to a DataTable. Below is a quote from Marc Gravell's answer as an example of a generic solution for such conversion:
Here's a nice 2013 update using FastMember from NuGet:
IEnumerable<SomeType> data = ...
DataTable table = new DataTable();
using (var reader = ObjectReader.Create(data))
{
    table.Load(reader);
}
Yes, this is pretty much the exact opposite of this one; reflection would suffice - or if you need it quicker, HyperDescriptor in 2.0, or maybe Expression in 3.5. Actually, HyperDescriptor should be more than adequate. For example:
// remove "this" if not on C# 3.0 / .NET 3.5 public static DataTable ToDataTable<T>(this IList<T> data) { PropertyDescriptorCollection props = TypeDescriptor.GetProperties(typeof(T)); DataTable table = new DataTable(); for(int i = 0 ; i < props.Count ; i++) { PropertyDescriptor prop = props[i]; table.Columns.Add(prop.Name, prop.PropertyType); } object[] values = new object[props.Count]; foreach (T item in data) { for (int i = 0; i < values.Length; i++) { values[i] = props[i].GetValue(item); } table.Rows.Add(values); } return table; }
Now, having your data represented as a DataTable, you're ready to write it to the destination table on the server:
string csDestination = "put here the a connection string to the database"; using (SqlConnection connection = new SqlConnection(csDestination)) using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection)) { connection.Open() bulkCopy.DestinationTableName = "TUrls"; bulkCopy.WriteToServer(dataTableOfUrls); }
Hope it helps.
UPD
Yes, you can - SqlBulkCopy works just like an insert command in the sense that it appends data to the destination table.
Also, consider using an intermediate table in case there's a high probability of the operation going wrong (e.g. long insert time and connection issues) and you want to keep the destination table busy/locked for as little time as possible. Another use case for an intermediate table is, of course, the need to do some data transformations before the insert.
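A minimal sketch of the intermediate-table pattern (the staging table name TUrls_Staging is hypothetical and is assumed to have the same schema as TUrls):

using (SqlConnection connection = new SqlConnection(csDestination))
{
    connection.Open();

    // 1. Bulk copy into the staging table first
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "TUrls_Staging";
        bulkCopy.WriteToServer(dataTableOfUrls);
    }

    // 2. Move (and optionally transform) the data into the real table in one statement
    string moveQuery = "insert into TUrls(address, name) " +
                       "select address, name from TUrls_Staging";
    using (SqlCommand cmd = new SqlCommand(moveQuery, connection))
    {
        cmd.ExecuteNonQuery();
    }
}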