I want to use the across() function in dplyr but get an error. For instance, running
iris %>%
group_by(Species) %>%
summarise(across(starts_with("Sepal"), mean))
gives me
Error in across(starts_with("Sepal"), mean) :
could not find function "across"
across() is a recent introduction in dplyr (https://towardsdatascience.com/what-you-need-to-know-about-the-new-dplyr-1-0-0-7eaaaf6d78ac). However, the dplyr package is up to date and loaded:
packageVersion('dplyr')
[1] ‘1.0.0’
Checking the objects exported by dplyr with
ls("package:dplyr")
[1] "%>%" "add_count" "add_count_" "add_row" "add_rownames" "add_tally"
[7] "add_tally_" "all_equal" "all_vars" "anti_join" "any_vars" "arrange"
[13] "arrange_" "arrange_all" "arrange_at" "arrange_if" "as_data_frame" "as_label"
I find that across does not appear in the list. However, if I look the function up with ?across, I get the usual help page explaining the functionality of across().
How do I get across() to work?
----- EDIT -----
My sessionInfo() is as follows:
> sessionInfo()
R version 3.6.1 (2019-07-05)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Mojave 10.14.6
Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] tidyselect_1.1.0 dplyr_1.0.0
loaded via a namespace (and not attached):
[1] Rcpp_1.0.3 cellranger_1.1.0 pillar_1.4.2 compiler_3.6.1 forcats_0.4.0 tools_3.6.1 jsonlite_1.6 lubridate_1.7.4 lifecycle_0.2.0
[10] tibble_2.1.3 nlme_3.1-140 gtable_0.3.0 lattice_0.20-38 pkgconfig_2.0.3 rlang_0.4.6 cli_1.1.0 rstudioapi_0.10 haven_2.1.1
[19] xml2_1.2.2 httr_1.4.1 stringr_1.4.0 generics_0.0.2 vctrs_0.3.1 hms_0.5.1 grid_3.6.1 glue_1.4.1 R6_2.4.0
[28] fansi_0.4.0 readxl_1.3.1 readr_1.3.1 modelr_0.1.5 tidyr_1.0.0 purrr_0.3.3 ggplot2_3.2.1 magrittr_1.5 backports_1.1.4
[37] scales_1.0.0 rvest_0.3.4 assertthat_0.2.1 tidyverse_1.2.1 colorspace_1.4-1 utf8_1.1.4 stringi_1.4.3 lazyeval_0.2.2 munsell_0.5.0
[46] broom_0.5.2 crayon_1.3.4
> .libPaths()
[1] "/Library/Frameworks/R.framework/Versions/3.6/Resources/library"
Nope, it doesn't. Error: 'across' is not an exported object from 'namespace:dplyr' – Andrew Jun 25 '20 at 14:44
Is dplyr 1.0.0 listed in your sessionInfo() after you load the package? I'm wondering if you have another version of dplyr somewhere else that is causing problems.
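One way to check that is to list every copy of dplyr that R can see and compare it with the namespace the session actually loaded. A minimal diagnostic sketch, using only base R functions:

# Look for dplyr in every library path and report its version there
for (lib in .libPaths()) {
  if ("dplyr" %in% rownames(installed.packages(lib.loc = lib))) {
    cat(lib, ": dplyr", as.character(packageVersion("dplyr", lib.loc = lib)), "\n")
  }
}
getNamespaceVersion("dplyr")                 # version loaded in this session
"across" %in% getNamespaceExports("dplyr")   # TRUE only if across() is exported

If two copies show up, library(dplyr) attaches the one found first along .libPaths(), so an older installation earlier in the search path would explain the error.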
From the across() documentation: because across() is used within functions like summarise() and mutate(), you can't select or compute upon grouping variables. across() returns a tibble with one column for each column in .cols and each function in .fns; if_any() and if_all() return a logical vector. R code in dplyr verbs is generally evaluated once per group.
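As an illustration of that behaviour (a sketch that assumes a correctly loaded dplyr >= 1.0.0), passing a named list of functions produces one output column per column/function pair:

library(dplyr)
# One output column for each selected column and each function in .fns
iris %>%
  group_by(Species) %>%
  summarise(across(starts_with("Sepal"), list(mean = mean, sd = sd)))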
The across() function is available only in the development version of dplyr, not on CRAN yet. To use across(), install the dplyr development version with the code below:
install.packages("devtools")
library(devtools)
devtools::install_github("tidyverse/dplyr")
library(dplyr)
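Before reinstalling, it helps to restart the R session, since the already-loaded dplyr namespace would otherwise stay in memory. Afterwards, a quick check that the upgrade took effect:

packageVersion("dplyr")                      # should report 1.0.0 or later
"across" %in% getNamespaceExports("dplyr")   # should now be TRUE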
Now your code should work:
data("iris")
iris %>%
group_by(Species) %>%
summarise(across(starts_with("Sepal"), mean))
Output:
# A tibble: 3 x 3
Species Sepal.Length Sepal.Width
<fct> <dbl> <dbl>
1 setosa 5.01 3.43
2 versicolor 5.94 2.77
3 virginica 6.59 2.97
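If upgrading is not possible right away, the older scoped verbs give the same result on any reasonably recent dplyr; a fallback sketch using summarise_at(), which across() superseded:

# Equivalent result with the pre-1.0.0 scoped verb (dplyr >= 0.7)
iris %>%
  group_by(Species) %>%
  summarise_at(vars(starts_with("Sepal")), mean)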