
Error: vector memory exhausted (limit reached?)

Tags: macos, r

I previously saved a 2.8G RData file and now I'm trying to load it so I can work on it again, but weirdly, I can't. It's giving the error

Error: vector memory exhausted (limit reached?)

This is strange, since I was working with the file fine before. One thing that did change is that I upgraded to the latest version of R, 3.5.0. I saw a previous post reporting the same error, but it wasn't resolved. I was hopeful about a solution that increases memory.limit(), but unfortunately that function is only available on Windows.

Can anyone help? I don't really understand what the problem is here, since I was able to work with my dataset before the update, so it shouldn't be throwing this error.

Did the update somehow decrease the RAM allocated to R? Can we manually increase memory.limit() on a Mac to solve this error?

Brent Carbonera asked Jun 28 '18

1 Answer

This change was necessary to deal with operating system memory over-commit issues on Mac OS. From the NEWS file:

  \item The environment variable \env{R_MAX_VSIZE} can now be used
  to specify the maximal vector heap size. On macOS, unless specified
  by this environment variable, the maximal vector heap size is set to
  the maximum of 16GB and the available physical memory. This is to
  avoid having the \command{R} process killed when macOS over-commits
  memory.

Set the environment variable R_MAX_VSIZE to an appropriate value for your system before starting R and you should be able to read your file.
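For example, you could set the variable in your shell before launching R, or persist it in ~/.Renviron so every session picks it up. (The 100Gb value below is just an illustration; choose a limit appropriate for your machine's RAM and swap.)

```shell
# Option 1: set the limit for the current shell session only,
# then start R from this same shell.
export R_MAX_VSIZE=100Gb

# Option 2: persist the setting by adding it to ~/.Renviron,
# which R reads at startup:
#   echo 'R_MAX_VSIZE=100Gb' >> ~/.Renviron
```

Suffixes like Mb and Gb are accepted; you can verify the setting from within R with `Sys.getenv("R_MAX_VSIZE")`.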

Luke Tierney answered Oct 17 '22