This is the data I am working with:
Sales <- data.frame(
  State = c("New York", "New Jersey", "Texas", "New Mexico", "California",
            "Kansas", "Florida", "Alaska", "Montana", "Maine"),
  Units = c(455, 453, 125, 135, 135, 568, 451, 125, 215, 314),
  Sales = c("20000", "12530", "51110", "54110", "65000",
            "58220", "54612", "45102", "45896", "12510"),
  ManagerName = c("Chip", "Kelly", "Steve", "Young", "Troy",
                  "Aikman", "Chad", "Pennington", "Mike", "Vick")
)
My end goal is to have 10 different Excel files, one for each "State".
I know I can do something like this:
# Separate by State
library(dplyr)
NewYork <- Sales %>% filter(State == "New York")

# Save to Excel
library(xlsx)
write.xlsx(NewYork, file = "NewYork-Sales-November2020.xlsx",
           sheetName = "NewYork")
But the problem is that I have 50+ rows, and I do not want to write 50 commands to filter by state and then another 50 commands to write the Excel files.
Is there a more efficient way to save each "State" into a different Excel file?
The following does what the question asks for; I have tested it with the posted data.
It splits the data by State and writes each subset to its own worksheet in a single workbook; the sheets are named by State.
library(openxlsx)

# Split the data into one data.frame per State
sp <- split(Sales, Sales$State)

# Truncate names that exceed Excel's 31-character sheet-name limit
names(sp) <- ifelse(nchar(names(sp)) > 31, substr(names(sp), 1, 31), names(sp))

wb <- createWorkbook(creator = "FruityPebblePug")

# Add one worksheet per State
lapply(seq_along(sp), function(i){
  addWorksheet(wb, sheetName = names(sp)[[i]])
})

# Write each State's data to its sheet
lapply(seq_along(sp), function(i){
  writeData(wb, sheet = names(sp)[[i]], sp[[i]])
})

saveWorkbook(wb, "Sales-November2020.xlsx")
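Note that saveWorkbook() will not overwrite an existing file by default; pass overwrite = TRUE if you need to re-run the script.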
As noted in the comments, if the sheet names have too many characters, exceeding Excel's 31-character limit, the following will name the sheets "State 1", "State 2", etc.
sp <- split(Sales, Sales$State)
wb <- createWorkbook(creator = "FruityPebblePug")

# Use sequential sheet names to stay within the 31-character limit
lapply(seq_along(sp), function(i){
  addWorksheet(wb, sheetName = paste("State", i))
})
lapply(seq_along(sp), function(i){
  writeData(wb, sheet = paste("State", i), sp[[i]])
})
saveWorkbook(wb, "Sales-November2020_b.xlsx")
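And if you really do want one .xlsx file per State, as the question literally asks, here is a minimal sketch along the same lines; the file-name pattern (e.g. "NewYork-Sales-November2020.xlsx") is just an illustration:

library(openxlsx)

sp <- split(Sales, Sales$State)

# Write one workbook per State; spaces are stripped from the file names
# (illustrative naming) and sheet names are truncated to Excel's
# 31-character limit
lapply(names(sp), function(state){
  fname <- paste0(gsub(" ", "", state), "-Sales-November2020.xlsx")
  write.xlsx(sp[[state]], file = fname,
             sheetName = substr(gsub(" ", "", state), 1, 31))
})

This uses openxlsx's write.xlsx() convenience wrapper, so no explicit workbook object is needed.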