I have been using R for a while, but this is the first time I have been trying to use it to play with "Big Data." In addition, I am new to Ubuntu.
Currently, there is nothing in my workspace (I am using RStudio), but when I look at the system monitor, the only open R session, started by RStudio, is using over 2 GB of memory on my machine. See the screenshot below.
What am I missing? I typically use rm(), but obviously that is not freeing up the memory. Any help very much appreciated.

I assume you allocated some large objects before calling rm(). Don't forget to remove any hidden objects as well (those whose names start with "."). You then need to call gc() to actually collect and dispose of the garbage:
# Remove all objects in the workspace, including hidden ones
rm(list = ls(all.names = TRUE))
# Then trigger garbage collection
gc()
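As a minimal sketch of the whole cycle (object sizes here are illustrative, not taken from your session):

```r
# Allocate a large object and check how much memory it occupies.
x <- matrix(rnorm(1e7), ncol = 100)   # ~80 MB of doubles
print(object.size(x), units = "MB")

# Remove everything, including hidden objects whose names start with ".".
rm(list = ls(all.names = TRUE))

# Force garbage collection; the "used" columns in gc()'s output should drop.
gc()
```

One caveat: gc() frees memory for reuse within R, but the process size reported by the operating system may shrink only partially or not at all, because R's allocator does not always return freed pages to the OS. That can explain a multi-gigabyte session with an empty workspace; restarting the R session (in RStudio: Session > Restart R) is the surest way to hand the memory back.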