 

Slow indexing of 300GB PostGIS table

I am loading about 300GB of contour line data into a PostGIS table. To speed up the process, I read that it is fastest to load the data first and then create an index. Loading the data only took about 2 days, but I have now been waiting for the index for about 30 days, and it is still not ready.

The query was:

create index idx_contour_geom on contour.contour using gist(geom);

I ran it in pgAdmin 4, and the memory consumption of the program has varied from 500MB to well over 100GB since.

Is it normal for indexing a table of this size to take this long?

Any tips on how to speed up the process?

Edit: The data is loaded from 1x1 degree (lat/lon) cells (about 30,000 cells), so no line has a bounding box larger than 1x1 degree, and most of them should be much smaller. The geometries are in the EPSG:4326 projection, and the only attributes are the height and the geometry (geom).
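A rough way to see whether a long-running build like this is still making progress, assuming PostgreSQL 12 or later, is to query the pg_stat_progress_create_index view from a second session while the CREATE INDEX is running:

-- Run from a separate session while CREATE INDEX is executing
-- (pg_stat_progress_create_index is available in PostgreSQL 12+)
SELECT pid, phase, blocks_done, blocks_total, tuples_done, tuples_total
FROM pg_stat_progress_create_index;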

Einar asked Oct 29 '22 07:10

1 Answer

I changed maintenance_work_mem to 1GB and stopped all other writing to disk (a lot of insert operations had ANALYZE appended, which took a lot of resources). The index build then completed in 23 minutes.
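A minimal sketch of what that looks like in SQL, assuming the setting is applied per session before rerunning the build (the 1GB value is the one mentioned above):

-- Give the index build more memory for this session only
SET maintenance_work_mem = '1GB';

-- Rebuild the GiST index on the geometry column
CREATE INDEX idx_contour_geom ON contour.contour USING gist (geom);

Using SET like this only affects the current session, so it does not require editing postgresql.conf or restarting the server.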

Einar answered Nov 15 '22 05:11