I have 70+ raster images in TIFF format that I am trying to merge.
Originals can be found here: http://www.faa.gov/air_traffic/flight_info/aeronav/digital_products/vfr/
After pre-processing (pct2rgb, gdalwarp on the individual charts, gdal_translate to cut off the collars), I try to run them through gdalwarp to mosaic them, using a command like this:
gdalwarp --config GDAL_CACHEMAX 3000 -overwrite -wm 3000 -r bilinear -srcnodata 0 -dstnodata 0 -wo "NUM_THREADS=3" /data/aeronav/sec/c/Albuquerque_c.tif .....70 other file names ...master.tif
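For context, the pre-processing mentioned above can be sketched roughly as follows. This is an illustrative reconstruction, not the exact commands used: the target CRS and the collar-crop corner coordinates are placeholders that differ per chart.

```shell
# Rough sketch of the per-chart pre-processing (paths and parameters are illustrative).

# 1. Expand the palette-indexed FAA chart to RGB.
pct2rgb.py Albuquerque.tif Albuquerque_rgb.tif

# 2. Reproject the chart to a common CRS (EPSG:3857 here is an assumption).
gdalwarp -t_srs EPSG:3857 -r bilinear Albuquerque_rgb.tif Albuquerque_warped.tif

# 3. Crop away the map collar; the -projwin corner coordinates vary per chart.
gdal_translate -projwin <ulx> <uly> <lrx> <lry> Albuquerque_warped.tif Albuquerque_c.tif
```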
After 12 hours of processing:
Creating output file that is 321521P x 125647L. Processing input file /data/aeronav/sec/c/Albuquerque_c.tif. 0...10...20...30...40...
After 12 hours it was still on the first input file, so gdalwarp was clearly never going to finish.
In contrast, a gdal_merge command like this:
gdal_merge.py -n 0 -a_nodata 0 -o /data/aeronav/sec/master.tif /data/aeronav/sec/c/Albuquerque_c.tif ......70 plus files.....
finishes in a couple of hours.
The problem with gdal_merge is the inferior quality of its output, because of its "average" sampling. I would like to use "bilinear" resampling at a minimum, and "cubic" if possible, and for that gdalwarp is required.
Why is there such a big difference in performance between the two? Why won't gdalwarp finish? Is there any other command-line option to speed up gdalwarp, or is there a way to add a resampling option to gdal_merge?
It seems gdalwarp is not the ideal command for merging these GeoTIFFs (since I am not interested in warping them again). Instead I used
gdalbuildvrt /data/aeronav/sec/master.virt .... 70+ files in order
to build a virtual mosaic, and then used gdal_translate to convert the VRT file into a GeoTIFF:
gdal_translate -of GTiff /data/aeronav/sec/master.virt /data/aeronav/sec/master.tif
That's it: this took less than an hour (even faster than gdal_merge) and preserved the quality of the original files.
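Putting the two steps together, a minimal sketch of the workflow. The nodata flags and LZW compression are my additions (the nodata value 0 is assumed from the question's gdalwarp/gdal_merge commands), and the wildcard stands in for the explicit, ordered file list used in the original:

```shell
# Build a virtual mosaic; files later in the list are drawn on top of earlier
# ones, so order matters (the original command listed all 70+ files explicitly).
# -srcnodata/-vrtnodata 0 keep the black collar areas transparent (assumption:
# 0 is the nodata value, as in the question's commands).
gdalbuildvrt -srcnodata 0 -vrtnodata 0 /data/aeronav/sec/master.virt \
    /data/aeronav/sec/c/*.tif

# Materialize the VRT as a single GeoTIFF; -co COMPRESS=LZW is an optional
# space-saving addition, not part of the original command.
gdal_translate -of GTiff -co COMPRESS=LZW \
    /data/aeronav/sec/master.virt /data/aeronav/sec/master.tif
```

Because the VRT is just an XML index of the source rasters, gdalbuildvrt runs in seconds and gdal_translate only has to copy pixels once, which is why this route is so much faster than re-warping everything.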