I would like to extract spatial data within a 10 km buffer around 30,000 objects of class SpatialLines and calculate the proportion of each land cover type around the buffered lines. First, I used the function crop to crop my raster. Then I used the function extract (package raster) to calculate the proportion of each of the 10 land cover types. Here is my code:
library(raster)
library(rgeos)

res <- lapply(1:nrow(tab_lines), FUN = function(k) {
  buf_line <- gBuffer(seg_line[k], width = 10000)  ## seg_line = SpatialLines object
  ha <- extract(x = data_raster, y = buf_line)
  ## The proportion of each land cover type must be in columns
  ## (one column = one land cover type)
  ha_1  <- length(ha[[1]][ha[[1]] == 1])  / length(ha[[1]])
  ha_2  <- length(ha[[1]][ha[[1]] == 2])  / length(ha[[1]])
  ha_3  <- length(ha[[1]][ha[[1]] == 3])  / length(ha[[1]])
  ha_4  <- length(ha[[1]][ha[[1]] == 4])  / length(ha[[1]])
  ha_5  <- length(ha[[1]][ha[[1]] == 5])  / length(ha[[1]])
  ha_6  <- length(ha[[1]][ha[[1]] == 6])  / length(ha[[1]])
  ha_7  <- length(ha[[1]][ha[[1]] == 7])  / length(ha[[1]])
  ha_8  <- length(ha[[1]][ha[[1]] == 8])  / length(ha[[1]])
  ha_9  <- length(ha[[1]][ha[[1]] == 9])  / length(ha[[1]])
  ha_10 <- length(ha[[1]][ha[[1]] == 10]) / length(ha[[1]])
  return(cbind(ha_1, ha_2, ha_3, ha_4, ha_5, ha_6, ha_7, ha_8, ha_9, ha_10))
})
How can I speed up the processing for 30,000 spatial lines? Are there other packages in R that provide faster processing for this type of extraction?
Here is a more concise formulation
library(raster)
library(rgeos)
buf_line <- gBuffer(seg_line, width=10000, byid=TRUE)
ha <- extract(x=data_raster, y=buf_line)
h <- sapply(ha, function(x) tabulate(x, nbins = 10))
## sapply returns a 10 x n matrix of cell counts (one column per buffer);
## transpose and divide by the per-buffer totals so that each row is a line
## and each column a land cover class
h <- t(h) / colSums(h)
But I do not think this will be much faster. Instead of extract you could try sp::over.
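For illustration only, here is a rough, untested sketch of what an over-based workflow could look like. It assumes the raster is small enough to coerce to a SpatialPixelsDataFrame in memory, that the buffers and pixels share the same CRS, and that sp's polygons-over-pixels method (which treats cells as points at their centres) is acceptable for your purpose:

library(sp)
library(raster)
library(rgeos)

## Assumption: data_raster fits in memory as pixels; this coercion can be
## expensive for large rasters
pix <- as(data_raster, "SpatialPixelsDataFrame")

buf_line <- gBuffer(seg_line, width = 10000, byid = TRUE)

## returnList = TRUE: one data.frame of cell values per buffer polygon
## (cells whose centres fall inside the buffer)
vals <- over(buf_line, pix, returnList = TRUE)

h <- sapply(vals, function(d) tabulate(d[[1]], nbins = 10))
h <- t(h) / colSums(h)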
Depending on your computer, things might speed up by first running
beginCluster()
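For reference, a minimal sketch of the cluster-based version, under the assumption that the active cluster is picked up by extract() when used with polygons (the core count below is just a placeholder):

library(raster)
library(rgeos)

## Start a cluster; raster functions that can use it, such as extract()
## with polygons, will then do so automatically
beginCluster(4)   # placeholder; defaults to all available cores if omitted

buf_line <- gBuffer(seg_line, width = 10000, byid = TRUE)
ha <- extract(x = data_raster, y = buf_line)

h <- sapply(ha, function(x) tabulate(x, nbins = 10))
h <- t(h) / colSums(h)

## Release the workers when finished
endCluster()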