[Disclaimer: similar questions have been asked many times. I don't believe this is the same as the many threads I've just read.]
I did:
library(dplyr)
colnames(LarvalSamples) %<>%
stringr::str_remove_all("_log") %>%
stringr::str_replace_all("Sea_Level", "Sea_Level_Height") %>% #sealevel, sealion, chinook, chl
stringr::str_replace_all("SeaLion", "Sea_lion") %>%
stringr::str_replace_all("Chinook_Salmon", "Salmon") %>%
stringr::str_replace_all("Chlorophyll_a", "Chlorophyll_A")
Worked fine: no messages, output as expected/desired. Then I copy/pasted just the first two lines, without the terminal pipe:
colnames(LarvalSamples) %<>%
stringr::str_remove_all("_log")
Error in colnames(LarvalSamples) %<>% stringr::str_remove_all("_log") : could not find function "%<>%"
I realise there are other posts on here about functions not being found, but dplyr is loaded AND worked on more code just two lines above. As it happens there are no "_log" patterns in the colnames, but I tried a different pattern which does exist and it failed the same way, so that potential source of error is eliminated. Any thoughts/guesses appreciated; this feels like a bug more than a question tbh, but it would be good to get sharper eyes on it before raising it up the chain, if required. Thanks.
> sessionInfo()
R version 3.5.0 (2018-04-23)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 17134)
Matrix products: default
locale:
[1] LC_COLLATE=English_United Kingdom.1252 LC_CTYPE=English_United Kingdom.1252
[3] LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
[5] LC_TIME=English_United Kingdom.1252
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] dplyr_0.8.0.1 beepr_1.3 gbm.auto_1.2.0
loaded via a namespace (and not attached):
[1] Rcpp_1.0.0 compiler_3.5.0 pillar_1.3.1 shapefiles_0.7 tools_3.5.0 tibble_2.0.1
[7] gtable_0.2.0 lattice_0.20-35 pkgconfig_2.0.2 rlang_0.3.1 Matrix_1.2-14 DBI_1.0.0
[13] rstudioapi_0.9.0 rgdal_1.4-2 gbm_2.1.5 dismo_1.1-4 gridExtra_2.3 stringr_1.4.0
[19] raster_2.8-19 mapplots_1.5.1 rgeos_0.4-2 grid_3.5.0 tidyselect_0.2.5 glue_1.3.0
[25] R6_2.4.0 survival_2.41-3 foreign_0.8-70 sp_1.3-1 purrr_0.3.1 magrittr_1.5
[31] codetools_0.2-15 splines_3.5.0 maptools_0.9-5 assertthat_0.2.0 stringi_1.3.1 crayon_1.3.4
[37] audio_0.1-5.1
Update: Reproducible example below. This definitely seems to be a bug. With a completely fresh system:
Data <- data.frame(
Name_Bad = sample(1:10),
Name_Guud = sample(1:10)
)
colnames(Data) %<>%
stringr::str_remove_all("_Bad") %>%
stringr::str_replace_all("Guud", "Good")
# Error: could not find function "%>%"
install.packages("dplyr")
library(dplyr)
install.packages("stringr")
library(stringr)
colnames(Data) %<>%
stringr::str_remove_all("_Bad") %>%
stringr::str_replace_all("Guud", "Good")
# no error, worked
colnames(Data) %<>%
stringr::str_remove_all("_Bad")
# Error: could not find function "%<>%"
This error usually occurs when a package has not been loaded into R via library(), so R does not know where to find the specified function. It's a good habit to call library() for all of the packages you will be using in the top chunk of your R Markdown file, which is usually given the chunk name setup.
%>% is called the forward-pipe operator in R. It forwards a value, or the result of an expression, into the next function call/expression, providing a mechanism for chaining commands.
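For example, with magrittr attached (or a package that re-exports it, such as dplyr):

```r
library(magrittr)

# Pipe a vector through sqrt() and then sum():
c(1, 4, 9) %>% sqrt() %>% sum()
# equivalent to sum(sqrt(c(1, 4, 9)))
```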
%<>% isn't exported by dplyr (only %>% is). You need to load magrittr instead.
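So the single-line version should work once magrittr itself is attached. A sketch against the question's own reproducible data frame:

```r
library(magrittr)  # exports both %>% and %<>%

Data <- data.frame(
  Name_Bad = sample(1:10),
  Name_Guud = sample(1:10)
)

# Single-step compound assignment now resolves %<>% correctly
colnames(Data) %<>%
  stringr::str_remove_all("_Bad")
# colnames(Data) is now c("Name", "Name_Guud")
```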
Your reproducible example is running into a subtle magrittr bug, which causes the evaluation of pipeline expressions to search for some operators in magrittr's scope rather than in the calling scope. That way, x %<>% y %>% z, which evaluates as `%>%`(x %<>% y, z), finds and evaluates magrittr's `%<>%` operator.
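If you'd rather not depend on which package exports which operator, the same renames can be written without any pipes at all, using plain base-R assignment:

```r
# No %>% or %<>% needed; stringr is called with its namespace prefix
colnames(Data) <- stringr::str_remove_all(colnames(Data), "_Bad")
colnames(Data) <- stringr::str_replace_all(colnames(Data), "Guud", "Good")
```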