Is there anything faster than SqlDataReader in .NET?

I need to load one column of strings from a table in SQL Server into an array in memory using C#. Is there a faster way than opening a SqlDataReader and looping through it? The table is large and time is critical.

EDIT: I am trying to build a .dll and use it on the server for some operations on the database, but it is too slow for now. If this is the fastest approach, then I have to redesign the database. I thought there might be some way to speed things up.

asked Sep 16 '10 by watbywbarif

People also ask

Which is faster, SqlDataReader or SqlDataAdapter?

SqlDataReader will be faster than SqlDataAdapter because it works in a connected state, which means the first result is returned from the query as soon as it is available.

Which is faster, DataReader or DataAdapter?

DataReaders are fast compared to DataAdapters/DataSets for the following reason: a DataReader offers better performance because it avoids the performance and memory overhead associated with creating a DataSet.

Which is faster, DataReader or DataTable?

It was generally agreed that a DataReader is faster, but we wanted to see how much faster. The results surprised us: the DataTable was consistently faster than the DataReader, sometimes approaching twice as fast.

What is the difference between SqlDataReader and SqlDataAdapter?

A SqlDataAdapter is typically used to fill a DataSet or DataTable, so you will have access to the data after your connection has been closed (disconnected access). The SqlDataReader is a fast, forward-only, connected cursor, which tends to be quicker than filling a DataSet/DataTable.
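
To make that difference concrete, here is a minimal sketch of both approaches. The connection string, table and column names (dbo.Items, Name) are placeholders, not anything from the original question.

    using System.Data;
    using System.Data.SqlClient;

    class ReaderVsAdapterDemo
    {
        static void Main()
        {
            const string connectionString = "Server=.;Database=MyDb;Integrated Security=true"; // hypothetical
            const string query = "SELECT Name FROM dbo.Items";                                  // hypothetical

            // Connected, forward-only: the reader streams rows while the connection stays open.
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(query, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        string name = reader.GetString(0);
                        // ... use name ...
                    }
                }
            }

            // Disconnected: the adapter copies the whole result set into a DataTable
            // that remains usable after the connection is closed.
            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(query, connectionString))
            {
                adapter.Fill(table);
            }
        }
    }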


2 Answers

Data Reader

About the fastest access you will get to SQL is with the SqlDataReader.
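
As a concrete baseline, here is a minimal sketch of that approach: a single-column query streamed through a SqlDataReader into a pre-sized list. The connection string, query and expected row count are assumptions; call ToArray() at the end if you really need a plain array.

    using System.Collections.Generic;
    using System.Data.SqlClient;

    static class NameLoader
    {
        public static List<string> LoadNames(string connectionString)
        {
            // Pre-sizing avoids repeated re-allocations if you roughly know the row count (assumed here).
            var result = new List<string>(100000);

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT Name FROM dbo.Items", connection)) // hypothetical query
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        result.Add(reader.GetString(0));
                    }
                }
            }

            return result; // result.ToArray() if an array is required
        }
    }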

Profile it

It's worth actually profiling to find where your performance issue is. Usually, the place you think the problem is turns out to be completely wrong once you've profiled it.

For example it could be:

  1. The time... the query takes to run
  2. The time... the data takes to copy across the network/process boundary
  3. The time... .Net takes to load the data into memory
  4. The time... your code takes to do something with it

Profiling each of these in isolation will give you a better idea of where your bottleneck is. For profiling your code, there is a great article from Microsoft
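
As a rough illustration of separating points 1–3, a Stopwatch around the execute and read phases gives a crude split between query execution and pulling the rows into .NET. The query and names below are hypothetical; a real profiler will give a much more detailed picture.

    using System;
    using System.Data.SqlClient;
    using System.Diagnostics;

    static class LoadProfiler
    {
        public static void ProfileLoad(string connectionString)
        {
            var executeTimer = Stopwatch.StartNew();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT Name FROM dbo.Items", connection)) // hypothetical
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    executeTimer.Stop(); // query has been sent and is executing; rows stream from here

                    var readTimer = Stopwatch.StartNew();
                    int rows = 0;
                    while (reader.Read())
                    {
                        reader.GetString(0); // materialise the value so the read cost is realistic
                        rows++;
                    }
                    readTimer.Stop();

                    Console.WriteLine("Execute: {0}", executeTimer.Elapsed);
                    Console.WriteLine("Read {0} rows in {1}", rows, readTimer.Elapsed);
                }
            }
        }
    }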

Cache it

The first thing to look at to improve performance is whether you need to load all that data every time. Can the list (or part of it) be cached? Take a look at the new System.Runtime.Caching namespace.
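
For example, here is a minimal sketch using MemoryCache from that namespace; the cache key, the five-minute expiry and the LoadNames helper are assumptions for illustration.

    using System;
    using System.Collections.Generic;
    using System.Runtime.Caching; // .NET 4.0, reference System.Runtime.Caching.dll

    static class NameCache
    {
        public static List<string> GetNames(string connectionString)
        {
            var cache = MemoryCache.Default;
            var cached = cache.Get("Names") as List<string>;
            if (cached != null)
                return cached; // served from memory, no database round trip

            var names = LoadNames(connectionString); // your SqlDataReader routine
            cache.Set("Names", names,
                new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5) });
            return names;
        }

        static List<string> LoadNames(string connectionString)
        {
            // ... SqlDataReader loop as in the earlier sketch ...
            return new List<string>();
        }
    }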

Rewrite as T-SQL

If you're doing purely data operations (as your question suggests), you could rewrite the code that uses the data as T-SQL so it runs natively on SQL Server. This has the potential to be much faster, as you will be working with the data directly rather than shifting it about.

If your code has a lot of necessary procedural logic, you can try mixing T-SQL with CLR Integration, giving you the best of both worlds.

This very much comes down to the complexity (or more procedural nature) of your logic.
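
As an illustration of the CLR Integration route, here is a minimal sketch of a scalar SQLCLR function; the class name, function name and the normalisation logic are hypothetical. Once deployed with CREATE ASSEMBLY / CREATE FUNCTION it can be called from set-based T-SQL, so the data never has to leave the server.

    using Microsoft.SqlServer.Server;
    using System.Data.SqlTypes;

    public class StringLogic
    {
        // Hypothetical scalar function, callable from T-SQL
        // (e.g. SELECT dbo.NormalizeName(Name) FROM dbo.Items) once deployed.
        [SqlFunction(IsDeterministic = true)]
        public static SqlString NormalizeName(SqlString value)
        {
            if (value.IsNull)
                return SqlString.Null;

            return new SqlString(value.Value.Trim().ToUpperInvariant());
        }
    }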

If all else fails

If all areas are optimal (or as near as makes no difference) and your design is without fault, I wouldn't even get into micro-optimisation; I'd just throw hardware at it.

What hardware? Try the Reliability and Performance Monitor to find out where the bottleneck is. The most likely culprits for the problem you describe are the HDD or RAM.

answered Oct 24 '22 by badbod99


If SqlDataReader isn't fast enough, perhaps you should store your stuff somewhere else, such as an (in-memory) cache.
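
For example, here is a minimal sketch that loads the column once into a static Lazy<string[]> so that every later call is a plain in-memory read; the connection string and query are placeholders.

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    static class NameStore
    {
        // Loaded once on first access; Lazy<T> is thread-safe by default.
        private static readonly Lazy<string[]> Names = new Lazy<string[]>(
            () => Load("Server=.;Database=MyDb;Integrated Security=true")); // hypothetical

        public static string[] GetNames()
        {
            return Names.Value;
        }

        private static string[] Load(string connectionString)
        {
            var result = new List<string>();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT Name FROM dbo.Items", connection)) // hypothetical
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                        result.Add(reader.GetString(0));
                }
            }
            return result.ToArray();
        }
    }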

answered Oct 24 '22 by Steven