I've got a simple problem that I haven't been able to solve despite the many similar posts, because I'm a bit of a knucklehead in R, and I'm not getting whatever it is I should be getting. I have two sets of files,
All.Files <- objects(pattern="constant.country[0-9]{4}")
all.files <- objects(pattern="constant[0-9]{4}")
that I wish to merge
mergefun <- function(X1, Y1) {
merge(X1, Y1, by = "id")
}
and then save each iteration of merge into a new dataframe
for (i in All.Files) {
a <- get(i)
{ for (j in all.files)
b <- get(j)
d <- data.frame(mergefun(a, b))
newname <- paste("C", substr(j, 9, 12), sep="")
names(d) <- c("Id", "Country", "logGDP", "GRI", "GRI.group", "Year")
assign(newname,d)
}
}
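As an aside, the merge step on its own behaves like this on toy data (the column values here are invented purely for illustration; the real inputs are the constant.country*/constant* data frames):

```r
# Toy stand-ins for the real data frames; values are made up.
x <- data.frame(id = 1:3, Country = c("A", "B", "C"))
y <- data.frame(id = 2:3, logGDP = c(9.1, 8.7))

mergefun <- function(X1, Y1) {
  merge(X1, Y1, by = "id")   # inner join: keeps only ids present in both
}

mergefun(x, y)   # rows for id 2 and 3; columns id, Country, logGDP
```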
While I am sure there is more elegant code out there, this code does what I want it to. The problem is that it only saves the final iteration of the merge, so instead of getting 43 data frames I get only the 43rd. I know I am failing to index properly within the for loop, but I have struggled for hours to understand my mistake and have failed.
I am sorry I have not included a reproducible example, but I was hopeful that, because my code actually works, someone would be able to see instantly what I am missing that would allow all 43 iterations to be output via assign. I am also aware that mapply would likely be a better solution, but I was unable to get any traction with it despite several hours of trying!
A most humble (and humbled by my own ignoRance) thank you.
for (i in All.Files) {
a <- get(i)
for (j in all.files) {
b <- get(j)
d <- mergefun(a, b)   # merge() already returns a data frame
newname <- paste("C", substr(j, 9, 12), sep="")
names(d) <- c("Id", "Country", "logGDP", "GRI", "GRI.group", "Year")
assign(newname,d)
}
}
I fixed your bracketing. Note also that since newname depends only on j, each pass of the outer loop assigns to the same set of names: every iteration over i overwrites the data frames created by the previous one, so only the results from the LAST value of i survive. You therefore end up with only length(all.files) variables.