 

Postgresql archiving old data

Tags:

postgresql

I need some expert advice on Postgres

I have a few tables in my database that can grow huge, maybe a hundred million records, and I have to implement some sort of data archiving. Say I have a subscriber table and a subscriber_logs table. The subscriber_logs table will grow huge with time, affecting performance. I wanted to create a separate table called archive_subscriber_logs and set up a scheduled task that reads from subscriber_logs, inserts the data into archive_subscriber_logs, and then deletes the copied data from subscriber_logs. My concern is whether I should create archive_subscriber_logs in the same database or in a different database. The problem with storing it in a different db is the foreign key constraints that already exist on the main tables.
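For reference, a minimal sketch of that scheduled move step, assuming subscriber_logs has a created_at timestamp column (the column name and the 90-day cutoff are only illustrative):

```sql
-- Move rows older than 90 days into the archive table in one atomic statement.
-- Assumes archive_subscriber_logs has the same column layout as subscriber_logs.
WITH moved AS (
    DELETE FROM subscriber_logs
    WHERE created_at < now() - interval '90 days'
    RETURNING *
)
INSERT INTO archive_subscriber_logs
SELECT * FROM moved;
```

Because the DELETE and the INSERT run in the same statement, the rows are either moved completely or not at all, so a scheduled task that just executes this query cannot leave half-copied data behind.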

Can anyone suggest whether the same db or a different db is preferable? Or any other solutions?

Asked Nov 30 '14 by Kevin Joymungol


1 Answer

Consider table partitioning, which is implemented in Postgres using table inheritance. This can improve performance considerably on very large tables, since queries can skip partitions that cannot match, and old data can be dropped a partition at a time instead of being deleted row by row. Of course you would do measurements first to make sure it is worth implementing. The details are in the excellent Postgres documentation.
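As a rough sketch of inheritance-based partitioning (table and column names here are only illustrative, not taken from your schema): a parent table, child tables carrying CHECK constraints, and a trigger that routes inserts to the right child.

```sql
-- Parent table: queries against it see rows from all children.
CREATE TABLE subscriber_logs (
    id            bigserial,
    subscriber_id bigint      NOT NULL,
    logged_at     timestamptz NOT NULL,
    message       text
);

-- One child table per month; the CHECK constraint lets the planner
-- skip partitions that cannot match the query (constraint exclusion).
CREATE TABLE subscriber_logs_2014_11 (
    CHECK (logged_at >= '2014-11-01' AND logged_at < '2014-12-01')
) INHERITS (subscriber_logs);

CREATE TABLE subscriber_logs_2014_12 (
    CHECK (logged_at >= '2014-12-01' AND logged_at < '2015-01-01')
) INHERITS (subscriber_logs);

-- Route inserts on the parent to the appropriate child table.
CREATE OR REPLACE FUNCTION subscriber_logs_insert() RETURNS trigger AS $$
BEGIN
    IF NEW.logged_at >= '2014-12-01' THEN
        INSERT INTO subscriber_logs_2014_12 VALUES (NEW.*);
    ELSE
        INSERT INTO subscriber_logs_2014_11 VALUES (NEW.*);
    END IF;
    RETURN NULL;  -- prevent the row from also being stored in the parent
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER subscriber_logs_insert_trg
    BEFORE INSERT ON subscriber_logs
    FOR EACH ROW EXECUTE PROCEDURE subscriber_logs_insert();
```

With this layout, archiving old data becomes dropping (or moving) an entire child table, e.g. DROP TABLE subscriber_logs_2014_11, which is essentially instant compared with deleting millions of rows from one big table.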

Using a separate database is not recommended, because foreign key constraints in PostgreSQL cannot span databases, so you would lose the referential integrity you already rely on.

Answered Oct 20 '22 by Laryx Decidua