The following query is used in a PL/SQL procedure:

SELECT e.data
FROM   extra e
WHERE  e.external_id IN (SELECT * FROM TABLE(p_external_ids));

The type of p_external_ids is:

CREATE OR REPLACE TYPE "VARCHAR2TABLE" AS TABLE OF VARCHAR2(4000 CHAR);
Oracle executes the query inefficiently using a full table scan. Hints on the query did not help, and the necessary indexes are in place. Replacing the SELECT * subquery with hardcoded ids reduces the running time by a factor of 20 when the table contains 200,000 rows. For reference, it takes about 0.3 sec to execute with the SELECT * FROM TABLE clause, and around 0.015 sec for a single hardcoded id.
What are the suggested efficient ways (keyed index lookups) to write a stored procedure that extracts the data from the table for multiple ids? The provided collection type must be used to pass the list of ids into the stored procedure.
What hints did you try? Can you post the fast and the slow query plan?
One of the general issues with using PL/SQL collections in SQL is that the CBO often guesses incorrectly at the number of elements in the collection and chooses the wrong plan as a result. It is often helpful in those cases to use the CARDINALITY hint, i.e.
SELECT e.data
FROM   extra e
WHERE  e.external_id IN (
    SELECT /*+ cardinality(ids 10) */ *
    FROM   TABLE(p_external_ids) ids
)
tells the optimizer to expect 10 elements in P_EXTERNAL_IDS.
Tom Kyte has a more in-depth discussion of the CARDINALITY hint and PL/SQL collections on AskTom as well.
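Since the collection arrives as a procedure parameter, the hint can be embedded directly in the stored procedure. A minimal sketch of what that might look like (the procedure name, the OUT ref cursor, and the cardinality value of 10 are illustrative assumptions, not from the original post):

```sql
-- Hypothetical procedure wrapping the query with the CARDINALITY hint.
-- Tune the hinted value toward the typical collection size you pass in.
CREATE OR REPLACE PROCEDURE get_extra_data (
    p_external_ids IN  VARCHAR2TABLE,
    p_result       OUT SYS_REFCURSOR
) AS
BEGIN
    OPEN p_result FOR
        SELECT e.data
        FROM   extra e
        WHERE  e.external_id IN (
            SELECT /*+ cardinality(ids 10) */ *
            FROM   TABLE(p_external_ids) ids
        );
END get_extra_data;
/
```

With a realistic cardinality estimate, the optimizer is more likely to choose index access over the full scan when the collection is small.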
What is the data type of the EXTERNAL_ID column? Your collection is a collection of strings but EXTERNAL_ID tends to imply a NUMBER. Is there really a data type mismatch here?
Copying the collection into a temporary table would only be expected to help if the problem was that the optimizer couldn't get an accurate cardinality estimate when you referenced the collection but it could get an accurate estimate when you referenced the temporary table. If you are correctly specifying the CARDINALITY hint and that doesn't change performance, that would imply that the problem is not with the optimizer's cardinality estimates.
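For completeness, the temporary-table variant mentioned above would look roughly like this (the table name tmp_external_ids is made up for illustration). It only helps if the optimizer's estimate against the temporary table beats its guess for the collection:

```sql
-- One-time DDL: a global temporary table whose rows are private
-- to the session and disappear at commit.
CREATE GLOBAL TEMPORARY TABLE tmp_external_ids (
    external_id VARCHAR2(4000 CHAR)
) ON COMMIT DELETE ROWS;

-- Inside the procedure: copy the collection into the temporary
-- table, then query against it instead of TABLE(p_external_ids).
INSERT INTO tmp_external_ids (external_id)
    SELECT * FROM TABLE(p_external_ids);

SELECT e.data
FROM   extra e
WHERE  e.external_id IN (SELECT external_id FROM tmp_external_ids);
```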
Can you post the fast and the slow query plans? Can you post the exact SQL statement you are using that includes the CARDINALITY hint (perhaps there is a syntax error)?
I believe it is doing a full scan because it can't predict whether p_external_ids will contain more or fewer values than the break-even point.
What I mean: if a single index lookup costs 200 and a full table scan costs 100,000, then looking up 20 values via the index costs 4,000 in total (less than 100,000), but looking up 1,000 values via the index would cost 200,000, making the full scan cheaper.