I'm using 64-bit R on Windows 10, and my current memory.limit() is 16287.
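For reference, this is how I check both numbers at the start of a session (both functions are Windows-only):

    memory.limit()   # maximum allocation R will attempt, in MB (16287 here)
    memory.size()    # memory currently used by the R session, in MB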
I'm working with mass spectrometry files (mzXML). I've been reading in individual files one at a time using the code below, which brings memory.size() up to 7738.28. I then filter out noise using basic R functions and plot the EICs.
    library(xcms)

    # read one mzXML file into memory; profstep = 0.01 builds a fine-grained
    # profile matrix
    mzxcms <- xcmsRaw(datafile1, profstep = 0.01, profmethod = "bin",
                      profparam = list(), includeMSn = FALSE,
                      mslevel = NULL, scanrange = NULL)

    # extract the EIC for the m/z range of interest and plot it over retention time
    EIC <- rawEIC(mzxcms, mass)
    RT <- mzxcms@scantime
    plot(RT, EIC$intensity, type = "l")
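I assume most of that memory is the profile matrix built by profstep = 0.01. A quick way to check this (assuming the matrix is stored in the object's env slot as profile, which is my reading of the xcmsRaw class) would be:

    # size of the profile matrix alone, assuming it lives in @env$profile
    print(object.size(mzxcms@env$profile), units = "Mb")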
Then I remove all the variables I created from my global environment using rm(list = ls()).
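In other words, my per-file cycle looks roughly like this (a sketch; datafiles and mass stand in for my actual file paths and m/z range):

    # hypothetical file list; mass is the m/z range passed to rawEIC
    datafiles <- c("run1.mzXML", "run2.mzXML")
    for (f in datafiles) {
        mzxcms <- xcmsRaw(f, profstep = 0.01, profmethod = "bin")
        EIC <- rawEIC(mzxcms, mass)
        plot(mzxcms@scantime, EIC$intensity, type = "l")
        rm(mzxcms, EIC)  # drop the large objects
        gc()             # ask R to release the memory
    }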
But when I try to read in a new mzXML file, R tells me it cannot allocate vector of size **Gb. When this error showed up, I checked memory.size(), which was 419.32, and I also ran gc() to confirm that the used memory (in the Vcells row) is of the same order as the number I see when I first open a new R session and run gc().
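gc() returns a matrix, so the specific number I'm comparing is:

    # the "used" entry of the Vcells row, compared against a fresh session
    gc()["Vcells", "used"]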
I couldn't find any information on why R still thinks something is taking up a large amount of memory when the environment is completely empty. But if I terminate the session and reopen the program, I can import the data file. So it seems that some memory stays in use even when the environment is empty, and it isn't released until I terminate the session. Does anyone have a similar experience or a suggestion as to why this is happening?