
Stored procedure performance randomly plummets; trivial ALTER fixes it. Why?

I have a couple of stored procedures on SQL Server 2005 that I've noticed will suddenly take significantly longer to complete when invoked from my ASP.NET MVC app running in an IIS6 web farm of four servers. Normal, expected execution time is less than a second; the anomalous execution time is 25-45 seconds. The problem doesn't seem to ever correct itself.

However, if I ALTER the stored procedure (even if I don't change anything in the procedure, except to perhaps add a space to the script created by SSMS Modify command), the completion time reverts to expected completion time.

IIS and SQL Server are running on separate boxes, both running Windows Server 2003 R2 Enterprise Edition. SQL Server is Standard Edition. All machines have dual Xeon E5450 3GHz CPUs and 4GB RAM. SQL Server is accessed using its TCP/IP protocol over gigabit ethernet (not sure what physical medium).

The problem is present from all web servers in the web farm. When I invoke the procedure from a query window in SSMS on my development machine, the procedure completes in normal time. This is strange because I was under the impression that SSMS used the same SqlClient driver as in .NET. When I point my development instance of the web app to the production database, I again get the anomalous long completion time. If my SqlCommand Timeout is too short, I get

System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

Question: Why would performing ALTER on the stored procedure, without actually changing anything in it, restore the completion time to less than a second, as expected?

Edit: To clarify, when the procedure is running slow for the app, it simultaneously runs fine in SSMS with the same parameters. The only difference I can discern is login credentials (next time I notice the behavior, I'll be checking from SSMS with the same creds). The ultimate goal is to get the procs to sustainably run with expected speed without requiring occasional intervention.
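
Besides credentials, one known cause of the "fast in SSMS, slow from the app" split is that the two sessions compile under different SET options (SSMS defaults ARITHABORT to ON, while SqlClient leaves it OFF), which gives each session its own cached plan. A quick way to check, with a placeholder procedure name:

```sql
-- Run this both in an SSMS window and from the app's connection (e.g.
-- captured with Profiler) and compare the output; differing options such
-- as ARITHABORT produce separate cached plans for the same procedure.
DBCC USEROPTIONS;

-- To reproduce the app's environment inside SSMS, match its settings
-- before invoking the proc (dbo.MyProc / @Param are placeholders):
SET ARITHABORT OFF;
EXEC dbo.MyProc @Param = 123;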

Resolution: I wanted to update this question in case others are experiencing this issue. Following the leads of the answers below, I was able to consistently reproduce this behavior. To test, I use sp_recompile and pass it one of the susceptible sprocs. I then initiate a website request from my browser that invokes the sproc with atypical parameters. Lastly, I initiate a website request to a page that invokes the sproc with typical parameters, and observe that the request does not complete because of a SQL timeout on the sproc invocation.
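
The repro steps above can be sketched directly in T-SQL (the proc name and parameter values are placeholders):

```sql
-- Flush the cached plan so the next call compiles fresh:
EXEC sp_recompile N'dbo.MySusceptibleProc';

-- 1. The first request arrives with atypical parameters; the plan
--    compiled for this value is what gets cached:
EXEC dbo.MySusceptibleProc @CustomerId = 999999;  -- atypical value

-- 2. Subsequent calls with typical parameters reuse that cached plan
--    and run 25-45 seconds instead of under a second:
EXEC dbo.MySusceptibleProc @CustomerId = 42;      -- typical value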

To resolve this on SQL Server 2005, I've added OPTIMIZE FOR hints to my SELECT. The sprocs that were vulnerable all have the "all-in-one" pattern described in this article. This pattern is certainly not ideal but was a necessary trade-off given the timeframe for the project.
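
On 2005 the hint must name a literal value (OPTIMIZE FOR UNKNOWN only arrived in 2008). A sketch of the hint on a hypothetical "all-in-one" style search proc, with made-up table and parameter names:

```sql
-- The optional-parameter pattern that made the procs vulnerable; the
-- hint pins the plan to representative values instead of whatever
-- happened to be sniffed on the first execution.
CREATE PROCEDURE dbo.SearchOrders
    @CustomerId INT = NULL,
    @Status     INT = NULL
AS
SELECT o.OrderId, o.OrderDate
FROM   dbo.Orders o
WHERE  (@CustomerId IS NULL OR o.CustomerId = @CustomerId)
  AND  (@Status     IS NULL OR o.Status     = @Status)
OPTION (OPTIMIZE FOR (@CustomerId = 42, @Status = 1));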

asked Dec 04 '09 by G-Wiz


3 Answers

Parameter sniffing and plan reuse. Every now and then you get a bad plan. Doing an ALTER bumps the metadata version on the procedure, so its plans must be recompiled on the next execution. The solution depends on a myriad of factors: you may have bad SQL, you may get an unlucky 'optimization'; we can't possibly know. Identify the statement in the procedure that is slow, and when it is slow. SQL Profiler is your friend; trace the SP:StmtCompleted event with a duration > 5000, for instance.
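
As an alternative to a Profiler trace, the 2005 DMVs can surface the costliest cached statements. A sketch (durations are in microseconds on 2005):

```sql
-- Find the top cached statements by average elapsed time; the SUBSTRING
-- carves the individual statement out of the containing batch or proc.
SELECT TOP 10
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       qs.execution_count,
       SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
                 (CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM   sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY avg_elapsed_us DESC;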

answered Sep 21 '22 by Remus Rusanu

Execution plan was probably regenerated.

There are various ways you can force this to be regenerated, and also ways to avoid parameter sniffing, where an execution plan is determined based on the parameters, but which may not be suitable for all inputs. If you were on 2008, I'd recommend using the OPTIMIZE FOR UNKNOWN option.

Alternatively, mask all your input variables with local variables to avoid parameter sniffing, and consider calling WITH RECOMPILE or declaring the SPs that way.
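
A minimal sketch of the local-variable masking technique, with placeholder names; copying the parameter into a local variable stops the optimizer from sniffing the caller's value and makes it fall back to average-density estimates:

```sql
CREATE PROCEDURE dbo.GetOrders
    @CustomerId INT
AS
-- Mask the parameter so the plan isn't tailored to the first caller's value:
DECLARE @LocalCustomerId INT;
SET @LocalCustomerId = @CustomerId;

SELECT OrderId, OrderDate
FROM   dbo.Orders
WHERE  CustomerId = @LocalCustomerId;
GO

-- Or force a fresh plan instead, either when declaring the proc:
--   CREATE PROCEDURE dbo.GetOrders @CustomerId INT WITH RECOMPILE AS ...
-- or per call:
--   EXEC dbo.GetOrders @CustomerId = 42 WITH RECOMPILE;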

Also, ensure that your statistics are up to date. Out of date statistics might mean a plan is no longer good.
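
Refreshing statistics is straightforward; a sketch with a placeholder table name:

```sql
-- Rebuild statistics on one table from a full scan of its rows:
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Or refresh all out-of-date statistics in the current database:
EXEC sp_updatestats;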

The comments in Remus' answer also indicate a number of things which can affect which plan the server picks.

answered Sep 23 '22 by Cade Roux


When you say the query runs fine in SSMS, consider the following:

  1. Query plans are cached based on the exact byte-for-byte structure of the query string, including white space
  2. Plans for queries with parameters can be different than for queries with fixed values (corollary of #1)

You might try using SQL Profiler, and copy-and-paste the entire slow query from there into SSMS, to see if it's still slow.
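
The app's call typically arrives as a parameterized RPC or sp_executesql batch, which is cached separately from an ad-hoc batch typed into SSMS. A rough sketch of the difference, with placeholder names:

```sql
-- Roughly what the app sends over the wire (parameterized form):
EXEC sp_executesql
     N'EXEC dbo.MyProc @CustomerId = @p0',
     N'@p0 INT',
     @p0 = 42;

-- What one typically types ad hoc in SSMS (fixed value); per points
-- #1 and #2 above, this can hit a different cached plan:
EXEC dbo.MyProc @CustomerId = 42;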

answered Sep 21 '22 by RickNZ