I need to load one column of strings from a table in SQL Server into an array in memory using C#. Is there a faster way than opening a SqlDataReader and looping through it? The table is large and time is critical.
EDIT: I am trying to build a .dll and use it on the server for some operations on the database. But it is too slow for now. If this is the fastest approach, then I have to redesign the database. I thought there might be some way to speed things up.
SqlDataReader will be faster than SqlDataAdapter because it works in a connected state, which means the first result is returned from the query as soon as it is available.
DataReaders are fast compared to DataAdapters/DataSets for the following reason: the DataReader offers better performance because it avoids the performance and memory overhead associated with the creation of a DataSet.
It was generally agreed that a DataReader is faster, but we wanted to see how much faster. The results surprised us: the DataTable was consistently faster than the DataReader, sometimes approaching twice as fast.
A SqlDataAdapter is typically used to fill a DataSet or DataTable, so you will have access to the data after your connection has been closed (disconnected access). The SqlDataReader is a fast, forward-only, connected cursor and tends to be quicker than filling a DataSet/DataTable.
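To make the comparison concrete, here is a minimal sketch of the disconnected approach: fill a DataTable with a SqlDataAdapter, then copy the column into an array. The connection string and the table/column names (dbo.MyTable, Name) are placeholders, not from the original question.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

static string[] LoadWithDataAdapter(string connectionString)
{
    var table = new DataTable();

    // The adapter opens and closes the connection itself and
    // buffers the whole result set into the DataTable.
    using (var adapter = new SqlDataAdapter(
        "SELECT Name FROM dbo.MyTable", connectionString))
    {
        adapter.Fill(table);
    }

    // Copy the single column into a string array.
    return table.AsEnumerable()
                .Select(row => row.Field<string>("Name"))
                .ToArray();
}
```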
Data Reader
About the fastest access you will get to SQL is with the SqlDataReader.
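As a baseline, a minimal SqlDataReader version of the same load might look like the sketch below (again, dbo.MyTable, Name and the connection string are assumed placeholders). A List&lt;string&gt; avoids having to know the row count up front.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

static string[] LoadWithDataReader(string connectionString)
{
    var values = new List<string>();

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT Name FROM dbo.MyTable", connection))
    {
        connection.Open();

        // Forward-only, connected read: rows are consumed as they arrive.
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // Guard against NULLs in the column.
                values.Add(reader.IsDBNull(0) ? null : reader.GetString(0));
            }
        }
    }

    return values.ToArray();
}
```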
Profile it
It's worth actually profiling to find where your performance issue is. Usually, the place you think the problem is turns out to be wrong once you've profiled it.
For example it could be:

- the SQL query itself taking a long time to execute on the server,
- the network transfer of a large result set,
- the client-side code that consumes the data.

Profiling each of these in isolation will give you a better idea of where your bottleneck is. For profiling your code, there is a great article from Microsoft. A rough Stopwatch-based split is sketched below.
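As a rough illustration of profiling in isolation, this sketch times the wait for the first row (query execution) separately from draining the remaining rows (network transfer plus client-side reading), using System.Diagnostics.Stopwatch. The query and helper name are illustrative; a real profiler will give far more detail.

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;

static void ProfileLoad(string connectionString)
{
    var total = Stopwatch.StartNew();

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT Name FROM dbo.MyTable", connection))
    {
        connection.Open();

        // Time until the first row is available (query execution + first packet).
        var firstRow = Stopwatch.StartNew();
        using (var reader = command.ExecuteReader())
        {
            reader.Read();
            firstRow.Stop();

            // Time to pull the remaining rows across the network and read them.
            var drain = Stopwatch.StartNew();
            while (reader.Read()) { /* consume */ }
            drain.Stop();

            Console.WriteLine("First row: {0} ms", firstRow.ElapsedMilliseconds);
            Console.WriteLine("Remaining rows: {0} ms", drain.ElapsedMilliseconds);
        }
    }

    Console.WriteLine("Total: {0} ms", total.ElapsedMilliseconds);
}
```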
Cache it
The thing to look at to improve performance is to work out if you need to load all that data every time. Can the list (or part of it) be cached? Take a look at the new System.Runtime.Caching namespace.
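As a minimal sketch of that idea, assuming the LoadWithDataReader method from earlier and a hypothetical cache key, MemoryCache from System.Runtime.Caching can hold the array for a while so repeated calls skip the database entirely:

```csharp
using System;
using System.Runtime.Caching;

static string[] GetNames(string connectionString)
{
    var cache = MemoryCache.Default;

    var cached = cache.Get("NamesColumn") as string[];
    if (cached != null)
    {
        return cached; // served from memory, no database round trip
    }

    // Cache miss: load from the database and keep the result for 10 minutes.
    string[] names = LoadWithDataReader(connectionString);
    cache.Set("NamesColumn", names,
              new CacheItemPolicy
              {
                  AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10)
              });
    return names;
}
```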
Rewrite as T-SQL
If you're doing purely data operations (as your question suggests), you could rewrite the code that uses the data as T-SQL and run it natively on SQL Server. This has the potential to be much faster, as you will be working with the data directly instead of shifting it about.
If your code has a lot of necessary procedural logic, you can try mixing T-SQL with CLR Integration giving you the benefits of both worlds.
This very much comes down to the complexity (or more procedural nature) of your logic.
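As a hedged sketch of the CLR Integration route (the class, method, and logic are purely illustrative, not from the question), a SQL CLR function lets procedural C# run inside SQL Server, right next to the data:

```csharp
using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;

public class StringHelpers
{
    // Deployed into SQL Server (CREATE ASSEMBLY / CREATE FUNCTION) and then
    // callable from T-SQL, e.g. SELECT dbo.NormalizeName(Name) FROM dbo.MyTable;
    [SqlFunction(IsDeterministic = true, DataAccess = DataAccessKind.None)]
    public static SqlString NormalizeName(SqlString input)
    {
        if (input.IsNull)
        {
            return SqlString.Null;
        }

        // Illustrative procedural logic that would be awkward in pure T-SQL.
        return new SqlString(input.Value.Trim().ToUpperInvariant());
    }
}
```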
If all else fails
If all areas are optimal (or near enough) and your design is without fault, I wouldn't even get into micro-optimisation; I'd just throw hardware at it.
What hardware? Use the Reliability and Performance Monitor to find out where the bottleneck is. For the problem you describe, the most likely culprit is the HDD or RAM.
If SqlDataReader isn't fast enough, perhaps you should store your stuff somewhere else, such as an (in-memory) cache.