
Multiple queries vs. stored procedure

I have an application that performs around 20,000 DATA-OPERATIONs per hour. Each DATA-OPERATION takes about 30 parameters overall (across all 10 queries). Some are text, some are numeric, and some text parameters are as long as 10,000 characters.

Every DATA-OPERATION does the following (a code sketch follows the list):

  • A single DATA-OPERATION inserts into / updates multiple tables (around 10) in the database.
  • For every DATA-OPERATION, I open one connection.
  • I then use a new prepared statement for each query in the DATA-OPERATION.
  • Each prepared statement is closed as soon as its query has executed.
  • The connection is reused for all 10 prepared statements.
  • The connection is closed when the DATA-OPERATION is completed.
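
In JDBC terms, the current pattern looks roughly like the minimal sketch below. The table and column names (table_one, big_text_col) are invented for illustration, not taken from the real schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class DataOperation {
        // One DATA-OPERATION as described above; names are hypothetical.
        public void execute(String url, String user, String pass,
                            String payload) throws Exception {
            try (Connection conn = DriverManager.getConnection(url, user, pass)) {
                // One of the ~10 statements; each is prepared,
                // executed, and closed in turn.
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO table_one (big_text_col) VALUES (?)")) {
                    ps.setString(1, payload); // text params can be up to 10,000 chars
                    ps.executeUpdate();       // one network round trip
                }
                // ... 9 more prepared statements against the other tables ...
            } // connection closed when the DATA-OPERATION completes
        }
    }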

Now, performing one DATA-OPERATION costs:

  • 10 queries, 10 prepared statements (create, execute, close), 10 network calls.
  • 1 connection (open, close).

I personally think that turning the above 10 queries into a stored procedure would be the better choice.

In the stored-procedure case, a DATA-OPERATION would need (as sketched below):

  • 1 connection, 1 callable statement, 1 network round trip.
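
On the client side that reduces to a single CallableStatement. Again a sketch: the procedure name data_operation and its single parameter are placeholders for the real ~30 parameters:

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class DataOperationSp {
        // The 10 statements folded into one hypothetical procedure call.
        public void execute(String url, String user, String pass,
                            String payload) throws Exception {
            try (Connection conn = DriverManager.getConnection(url, user, pass);
                 CallableStatement cs = conn.prepareCall("{call data_operation(?)}")) {
                cs.setString(1, payload); // in reality ~30 parameters
                cs.execute();             // one round trip covers all 10 statements
            }
        }
    }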

I suggested this, but I was told that:

  • It might be more time-consuming than plain SQL queries.
  • It would put additional load on the DB server.

I still think a stored procedure is the better choice. Please let me know your thoughts.

Benchmarking is an option; I will have to look for tools that can help with this. Can anyone also suggest existing benchmarks for this kind of problem?
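
For a rough first comparison, a hand-rolled harness like the sketch below may be enough; runOnce() is a placeholder for one DATA-OPERATION implemented either way. For more rigorous Java measurements, JMH (the OpenJDK microbenchmark harness) is a commonly used tool:

    public class Bench {
        // Placeholder: run one DATA-OPERATION, either as 10 raw
        // queries or as the single procedure call.
        static void runOnce() { /* ... */ }

        public static void main(String[] args) {
            long start = System.nanoTime();
            for (int i = 0; i < 1_000; i++) {
                runOnce();
            }
            long ms = (System.nanoTime() - start) / 1_000_000;
            System.out.println("1000 operations: " + ms + " ms");
        }
    }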

asked Oct 03 '22 by Learn More


1 Answer

Any recommendation depends partly on where the script executing the queries resides. If it runs on the same server as the MySQL instance, you won't see much of a difference, but there is still a small overhead in executing 200,000 individual queries (20,000 operations × 10 statements) compared to calling one stored procedure.

My advice either way would be to make it a stored procedure. You would need two procedures (sketched after this list):

  1. A procedure that combines the 10 statements you run per operation into 1 call
  2. A procedure that iterates over a table of arguments using a CURSOR, feeding each row into procedure 1
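
A sketch of what those two procedures could look like, created through JDBC so the whole example stays in Java. Every name here (data_operation, process_all_operations, operation_args, the single payload parameter) is a placeholder for your real schema and ~30 parameters:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreateProcedures {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb", "user", "pass");
                 Statement st = conn.createStatement()) {
                // Procedure 1: the 10 per-operation statements in one call.
                st.execute(
                    "CREATE PROCEDURE data_operation(IN p_payload TEXT) " +
                    "BEGIN " +
                    "  INSERT INTO table_one (big_text_col) VALUES (p_payload); " +
                    "  /* ... the other 9 inserts/updates ... */ " +
                    "END");
                // Procedure 2: iterate the argument table with a CURSOR,
                // calling procedure 1 once per row.
                st.execute(
                    "CREATE PROCEDURE process_all_operations() " +
                    "BEGIN " +
                    "  DECLARE done INT DEFAULT 0; " +
                    "  DECLARE v_payload TEXT; " +
                    "  DECLARE cur CURSOR FOR SELECT payload FROM operation_args; " +
                    "  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1; " +
                    "  OPEN cur; " +
                    "  read_loop: LOOP " +
                    "    FETCH cur INTO v_payload; " +
                    "    IF done = 1 THEN LEAVE read_loop; END IF; " +
                    "    CALL data_operation(v_payload); " +
                    "  END LOOP; " +
                    "  CLOSE cur; " +
                    "END");
            }
        }
    }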

Your process would be

  1. Populate a table with the arguments that procedure 2 will feed into procedure 1
  2. Execute procedure 2

This would yield performance benefits, as there is no need to make 20,000 × 10 round trips to the MySQL server. While the overhead per request may be small, milliseconds add up: even a saving of 0.1 ms per request comes to 20 seconds across 200,000 requests.

Another option could be to perform all 20,000 data operations at once (if viable) by adjusting your 10 queries to pull their data from the argument table mentioned above. The key to all of this is to load the arguments in a single batch insert, and then use statements inside a procedure on the MySQL server to process them without further round trips.
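
Putting the pieces together, the client-side loading step might look like this. Still a sketch: operation_args and payload are the hypothetical argument table from above, and rewriteBatchedStatements=true is a real Connector/J option that collapses the batch into far fewer packets:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;
    import java.util.List;

    public class BatchLoadAndRun {
        public static void run(String url, String user, String pass,
                               List<String> payloads) throws Exception {
            try (Connection conn = DriverManager.getConnection(url, user, pass)) {
                // Step 1: batch-insert all arguments in one go.
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO operation_args (payload) VALUES (?)")) {
                    for (String p : payloads) {
                        ps.setString(1, p);
                        ps.addBatch();
                    }
                    ps.executeBatch();
                }
                // Step 2: one round trip; everything else happens server-side.
                try (Statement st = conn.createStatement()) {
                    st.execute("CALL process_all_operations()");
                }
            }
        }
    }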

answered Oct 07 '22 by Simon at My School Portal