 

Best optimizations for a large SQL Server table (100-200 million records)

What are the best options, recommendations, and optimizations when working with a large SQL Server 2005 table that contains anywhere from 100 to 200 million records?

RPS asked May 13 '10 at 23:05


1 Answer

Since you didn't state the purpose of the database or its requirements, here are some general recommendations, in no particular order:

  1. Small clustered index on each table. Consider making this your primary key. A narrow clustered key saves space in the main table and in every dependent index and foreign key (see the first sketch after this list).
  2. Appropriate non-clustered indexes, using covering indexes where possible (sketch below).
  3. Referential integrity, enforced with foreign key constraints (sketch below).
  4. Normalized tables.
  5. Consistent naming on all database objects, for easier maintenance.
  6. Appropriate partitioning (table and index) if you have the Enterprise Edition of SQL Server (sketch below).
  7. Appropriate check constraints on tables if you are going to allow direct data manipulation in the database (sketch below).
  8. Decide where your business rules are going to reside and don't deviate from that. In most cases they do not belong in the database.
  9. Examine the execution plans of your heavily used queries (at least) and look for table scans; on a table this size they will kill performance (sketch below).
  10. Be prepared to deal with deadlocks. With a database of this size, especially if there will be heavy writing, deadlocks could very well be a problem (sketch below).
  11. Take ample advantage of views to hide join complexity, create opportunities for query optimization, and implement flexible security (sketch below).
  12. Consider using schemas to better organize objects and implement flexible security (sketch below).
  13. Get familiar with Profiler. With a database of this size, you will more than likely spend time tracking down query bottlenecks, and Profiler can help you there.
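
On point 1, a minimal sketch of a narrow clustered primary key. The Orders table and its columns are hypothetical, not from the question:

    -- A narrow (4-byte) identity column as the clustered primary key.
    -- The clustered key is duplicated in every non-clustered index,
    -- so keeping it small saves space across the whole table.
    CREATE TABLE dbo.Orders
    (
        OrderID     INT IDENTITY(1,1) NOT NULL,
        CustomerID  INT               NOT NULL,
        OrderDate   DATETIME          NOT NULL,
        TotalAmount DECIMAL(18,2)     NOT NULL,
        CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID)
    );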
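
On point 2, a covering index uses INCLUDE (available since SQL Server 2005) so a query can be answered entirely from the index. The query shape here is an assumption:

    -- Covers: SELECT OrderDate, TotalAmount FROM dbo.Orders WHERE CustomerID = @id
    -- The seek happens on CustomerID; the INCLUDEd columns ride along at the
    -- index leaf level, so the base table is never touched (no key lookup).
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate, TotalAmount);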
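
On point 3, referential integrity enforced at the engine level; the Customers table is assumed to exist with CustomerID as its primary key:

    -- Declarative referential integrity: the engine rejects orphan rows.
    ALTER TABLE dbo.Orders
        ADD CONSTRAINT FK_Orders_Customers
        FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);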
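
On point 6, a sketch of date-based partitioning (Enterprise Edition only). The yearly boundaries and the single-filegroup mapping are assumptions for illustration:

    -- Partition rows by year so scans and maintenance touch one slice at a time.
    CREATE PARTITION FUNCTION pfOrdersByYear (DATETIME)
        AS RANGE RIGHT FOR VALUES ('2008-01-01', '2009-01-01', '2010-01-01');

    -- Map every partition to PRIMARY for simplicity; production systems
    -- often spread partitions across multiple filegroups.
    CREATE PARTITION SCHEME psOrdersByYear
        AS PARTITION pfOrdersByYear ALL TO ([PRIMARY]);

    -- The table is then created ON psOrdersByYear(OrderDate).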
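
On point 7, a check constraint that guards directly manipulated data; the rule itself is just an example:

    -- Reject obviously bad data even when rows are inserted ad hoc.
    ALTER TABLE dbo.Orders
        ADD CONSTRAINT CK_Orders_TotalAmount CHECK (TotalAmount >= 0);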
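
On point 9, two ways to spot table scans from a query window; the query itself is hypothetical:

    -- Option A: show the estimated plan as text without executing the query.
    SET SHOWPLAN_TEXT ON;
    GO
    SELECT OrderDate, TotalAmount FROM dbo.Orders WHERE CustomerID = 42;
    GO
    SET SHOWPLAN_TEXT OFF;
    GO
    -- Option B: execute the query and report logical reads per table;
    -- a 'Table Scan' operator or huge read counts flag a missing index.
    SET STATISTICS IO ON;
    GO
    SELECT OrderDate, TotalAmount FROM dbo.Orders WHERE CustomerID = 42;
    GO
    SET STATISTICS IO OFF;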
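
On point 10, one common pattern is to retry when chosen as the deadlock victim (error 1205), using TRY/CATCH, which is new in SQL Server 2005. The UPDATE is a placeholder for the real unit of work:

    DECLARE @retries INT;
    SET @retries = 3;
    WHILE @retries > 0
    BEGIN
        BEGIN TRY
            BEGIN TRANSACTION;
            -- Placeholder for the real unit of work:
            UPDATE dbo.Orders SET TotalAmount = TotalAmount WHERE OrderID = 1;
            COMMIT TRANSACTION;
            BREAK;  -- success, stop retrying
        END TRY
        BEGIN CATCH
            IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
            IF ERROR_NUMBER() = 1205
                SET @retries = @retries - 1;  -- deadlock victim: try again
            ELSE
            BEGIN
                -- Not a deadlock: report and give up (no THROW in 2005).
                RAISERROR('Unexpected error in retry loop', 16, 1);
                BREAK;
            END
        END CATCH
    END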
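
On point 11, a view that hides a join behind a simple name; both tables and the column list are assumptions:

    -- Callers query the view; the join stays in one place, and SELECT can be
    -- granted on the view without exposing the base tables.
    CREATE VIEW dbo.vCustomerOrders
    AS
    SELECT c.CustomerName, o.OrderDate, o.TotalAmount
    FROM dbo.Customers AS c
    JOIN dbo.Orders    AS o ON o.CustomerID = c.CustomerID;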
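
On point 12, schemas group related objects and make a natural security boundary. The schema, role, and table names here are all hypothetical:

    -- A role for report consumers.
    CREATE ROLE ReportReaders;
    GO
    -- Group reporting objects under their own schema...
    CREATE SCHEMA Reporting;
    GO
    CREATE TABLE Reporting.DailySales
    (
        SalesDate DATETIME      NOT NULL,
        Total     DECIMAL(18,2) NOT NULL
    );
    GO
    -- ...and grant read access to everything in the schema in one statement.
    GRANT SELECT ON SCHEMA::Reporting TO ReportReaders;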
Randy Minder answered Nov 15 '22 at 08:11