I'm working with Python 3.7 and R 3.5.1. When I call my function from the R script via Python, it works in principle, but towards the end it crashes with this output:
rpy2.rinterface.RRuntimeError: Error: cannot allocate vector of size 1006.0 Mb
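For context, the call looks roughly like this (a simplified sketch; the script path and function name are placeholders for the real ones):
import rpy2.robjects as robjects

# Source the R script, then call one of its functions from Python
robjects.r['source']('my_script.R')
my_function = robjects.globalenv['my_function']
result = my_function()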
I have 16 GB of RAM, and I monitored memory usage throughout the long processing; it peaked at less than 5 GB.
To be more certain of what's going on, I traced memory usage with
gc()
gcinfo(TRUE)
which trigger a garbage collection and make R report each time it runs one on its own.
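From the Python side, the same tracing can be driven through rpy2, roughly like this (a minimal sketch; robjects.r simply evaluates R code):
import rpy2.robjects as robjects

robjects.r('gcinfo(TRUE)')  # make R report every garbage collection it runs
print(robjects.r('gc()'))   # force a collection and print current usage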
Additionally, in R I get the following, indicating about 16,000 MB, which is all of my RAM:
> memory.limit()
[1] 16244
Finally, when I run the R script directly from R, it works fine and no memory issues occur.
Is this a bug in rpy2, or something else that can be worked around right now? Any suggestions are most welcome; I will try them.
OS: Windows 10, 64-bit
From this thread:
This seems to be caused by hard-to-reconcile differences between...R vectors and Python arrays.
As a workaround, I can suggest using one of the explicit conversions offered in that thread, such as:
from rpy2.robjects import conversion, pandas2ri
pandas2ri.activate()  # register the pandas converters (rpy2 2.x API)
df_R = conversion.converter.py2ri(df_pandas)
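On newer rpy2 releases (3.x) the conversion API was renamed, so the same idea would look roughly like this (a sketch assuming rpy2 3.x, where py2ri became py2rpy and conversions are typically scoped with localconverter):
import pandas as pd
import rpy2.robjects as ro
from rpy2.robjects import pandas2ri
from rpy2.robjects.conversion import localconverter

df_pandas = pd.DataFrame({'x': [1.0, 2.0, 3.0]})  # stand-in for the real DataFrame

# Convert only within this block instead of activating the converter globally
with localconverter(ro.default_converter + pandas2ri.converter):
    df_R = ro.conversion.py2rpy(df_pandas)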
Note: this could be a comment, but I am unable to post comments.