memory.limit
@dmitriy-scvortsov-1822
Last seen 10.3 years ago
Hi guys,

I am trying to load a dataset of 50 CEL files using ReadAffy on my machine. It runs Windows XP and I'm using R version 2.3.1. Each CEL file is about 13 MB, so the estimated memory usage is at most 800 MB, and I have 2 GB available.

I have requested a memory limit of 2 GB with the memory.limit(size=2048) command, but I keep getting the error message "Error: cannot allocate vector of size 518671 Kb", which is less than the default 1 GB.

When I try to check the amount of memory available:

> memory.size(max = T)
[1] 546152448

Can anybody explain to me what's going on here, and how to make it work?

Thank you
@james-w-macdonald-5106
Last seen 2 days ago
United States
Hi Dmitriy,

Dmitriy Scvortsov wrote:
> Hi guys
>
> I am trying to load a dataset of 50 CEL files using ReadAffy on my machine.
> It runs Windows XP and I'm using R version 2.3.1.
> Each CEL file is about 13 MB, so the estimated memory usage is at most 800 MB.
> I have 2 GB available.

Well, that estimate doesn't take into account the amount of copying that goes on, nor does it account for Windows' inability to return large blocks of memory that have already been used. It is not hard at all to burn through 2 GB of RAM with 50 U133plus2 chips. I think you could probably process this many on, say, Linux, but not on Windows. You are better off trying justRMA(), which is much more memory friendly.

> I have requested a memory limit of 2 GB with the memory.limit(size=2048)
> command, but I keep getting the error message "Error: cannot allocate
> vector of size 518671 Kb", which is less than the default 1 GB.

This error doesn't concern the total amount of memory that has been allocated, but the amount of RAM that R is requesting for the current operation.

> When I try to check the amount of memory available:
>
> > memory.size(max = T)
> [1] 546152448
>
> Can anybody explain to me what's going on here, and how to make it work?

Not sure about that one, unless you are running some other process that is sucking up a bunch of RAM. Anyway, the fix is to use justRMA(), or boot Linux. You can get Quantian and boot directly off a CD to see if Linux will work for you, if you are interested.

HTH,

Jim
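For reference, a minimal sketch of the two approaches Jim contrasts, assuming the affy package is installed and the 50 CEL files sit in the current working directory (pass filenames= otherwise):

library(affy)

## ReadAffy() builds a full AffyBatch with all probe-level intensities in
## RAM, which is where the 2 GB gets exhausted on Windows:
## abatch <- ReadAffy()
## eset   <- rma(abatch)

## justRMA() reads the CEL files and returns an ExpressionSet of RMA
## expression values directly, without keeping the whole AffyBatch around:
eset <- justRMA()

justRMA() performs the same background correction, quantile normalization, and median-polish summarization as ReadAffy() followed by rma(), so the resulting expression values should be the same.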
@henrik-bengtsson-4333
Last seen 7 months ago
United States
On 8/3/06, Dmitriy Scvortsov <scvortso at usc.edu> wrote:
> Hi guys
>
> I am trying to load a dataset of 50 CEL files using ReadAffy on my machine.
> It runs Windows XP and I'm using R version 2.3.1.
> Each CEL file is about 13 MB, so the estimated memory usage is at most 800 MB.
> I have 2 GB available.
>
> I have requested a memory limit of 2 GB with the memory.limit(size=2048)
> command, but I keep getting the error message "Error: cannot allocate
> vector of size 518671 Kb", which is less than the default 1 GB.
>
> When I try to check the amount of memory available:
>
> > memory.size(max = T)
> [1] 546152448
>
> Can anybody explain to me what's going on here, and how to make it work?

Anyone correct me if I'm wrong, but what the error message says is that R could not allocate a *contiguous* vector of size 518671 Kb in RAM; that is, your memory is fragmented. It might help to shut down all other applications (or even restart Windows) and restart R.

/Henrik
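A rough sketch of how to check the situation before retrying, under the assumption that the session is running in R for Windows (the memory.* functions exist only there):

memory.limit()            # current allocation cap in MB (2048 after the call above)
memory.size(max = TRUE)   # largest amount of memory R has obtained from the OS so far
gc()                      # force a garbage collection and report what is still in use

Removing large objects with rm() and then calling gc(), or simply starting a fresh R session as suggested above, gives R the best chance of finding a contiguous block large enough for the roughly 500 MB allocation that is failing here.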