I'm trying to run an R script; in particular, I am using the getLineages function from the Bioconductor package Slingshot.
I'm wondering why the error "vector memory exhausted (limit reached?)" shows up when I use this function, since, with the data I am analyzing, it doesn't seem to be the most memory-intensive function in the package.
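For context, the call looks roughly like this; rd and cl are placeholders for my reduced-dimension matrix and cluster labels:

library(slingshot)

# rd: a cells x dimensions matrix of reduced-dimension coordinates (placeholder)
# cl: a vector of cluster labels, one per cell (placeholder)
lineages <- getLineages(rd, clusterLabels = cl)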
I do understand that there are other questions like this on Stack Overflow, but they all suggest switching over to the 64-bit version of R, which I am already using. Since there seem to be no other answers to this issue so far, I was wondering if anyone might know what is going on?
The data is only ~120 MB in size, which is far less than my computer's 8 GB of RAM.
For those using RStudio, I've found that setting Sys.setenv('R_MAX_VSIZE' = 32000000000), as has been suggested in multiple Stack Overflow posts, only works when R is launched from the command line; setting that environment variable from within RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)
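For reference, here is the call in question; the value is in bytes, so 32000000000 is roughly 32 GB. Sys.getenv() confirms the variable is set, but since R appears to consult R_MAX_VSIZE only at startup, changing it from inside a running session may simply have no effect:

Sys.setenv('R_MAX_VSIZE' = 32000000000)  # ~32 GB, specified in bytes
Sys.getenv('R_MAX_VSIZE')                # verify the value is set in this session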
After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:
Step 1: Open Terminal.

Step 2:

cd ~
touch .Renviron
open .Renviron

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb
Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to play with this parameter, depending on the specs of your machine.
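If you prefer to stay inside R rather than use the terminal, the same file can be opened with the usethis package (assuming it is installed); this is just an alternative to the steps above:

# install.packages("usethis")   # if not already installed
usethis::edit_r_environ()       # opens ~/.Renviron for editing
# add the line R_MAX_VSIZE=100Gb, save, then restart R
# (the variable is read at startup, so a restart is required)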
I had the same problem; increasing R_MAX_VSIZE did not help in my case. Instead, removing the variables that were no longer needed solved the problem. I hope this helps those who are struggling here.
# Replace these with the names of your own large, no-longer-needed objects
rm(large_df, large_list, large_vector, temp_variables)
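Before and after removing them, two base R calls can help: object.size() (applied over ls()) shows which objects in the session are largest, and gc() prompts R to actually release the freed memory:

# See which objects in the current session are largest (sizes in bytes)
sizes <- sapply(ls(), function(x) object.size(get(x)))
sort(sizes, decreasing = TRUE)

gc()  # after rm(), trigger garbage collection so the memory is released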