I access several tables remotely via DB link. They are highly normalized, and the data in each is effective-dated. Of the millions of records in each table, only about 50k are current records.
The tables are internally managed by a commercial product that will throw a huge fit if I add indexes or make alterations to its tables in any way.
What are my options for speeding up access to these tables?
You could create materialized views over the DB link for just the subsets you need (e.g. the current records) and query those instead.
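As a minimal sketch of that idea, assuming a DB link named REMOTE_DB, a remote table REMOTE_SCHEMA.BIG_TABLE, and EFF_START_DATE/EFF_END_DATE columns for the effective dating (all of these names are placeholders for your actual schema):

```sql
-- Local materialized view holding only the current records.
-- REFRESH COMPLETE on ~50k rows should be cheap; the hourly
-- NEXT interval is just an example.
CREATE MATERIALIZED VIEW current_records_mv
  BUILD IMMEDIATE
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 1/24
AS
SELECT *
FROM   remote_schema.big_table@remote_db
WHERE  SYSDATE BETWEEN eff_start_date AND eff_end_date;

-- The MV lives in your own schema, so you can index it freely
-- without touching the vendor's tables.
CREATE INDEX current_records_mv_ix ON current_records_mv (some_key_col);
```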
I think you're stuck between a rock and a hard place here, but in the past the following has worked for me:
You can pull down a snapshot of the current data at specified intervals, every hour or nightly or whatever works, and add your indexes to your own tables as needed. If you need near-real-time access to the data, you can try pulling all the current records into a temp table and indexing as needed (see the sketch below).
The extra overhead of copying from one database into your own may dwarf the actual benefit, but it's worth a shot.
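A rough sketch of that snapshot approach, reusing the same hypothetical REMOTE_DB link and effective-dating columns as above; it uses a plain local table rather than a true temporary table, since that is simpler to index:

```sql
-- One-time (or scheduled) pull of just the current records.
CREATE TABLE current_snapshot AS
SELECT *
FROM   remote_schema.big_table@remote_db
WHERE  SYSDATE BETWEEN eff_start_date AND eff_end_date;

-- Index the local copy however your queries need.
CREATE INDEX current_snapshot_ix ON current_snapshot (some_key_col);

-- On subsequent refreshes, reload in place instead of recreating:
-- TRUNCATE TABLE current_snapshot;
-- INSERT /*+ APPEND */ INTO current_snapshot
-- SELECT * FROM remote_schema.big_table@remote_db
-- WHERE SYSDATE BETWEEN eff_start_date AND eff_end_date;
```

The refresh could be driven by DBMS_SCHEDULER or an external job; whether it pays off depends on how stale the data can tolerably be versus how expensive pulling ~50k rows over the link is.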