Memory Problems
@david-neil-hayes-573
Last seen 10.3 years ago
Thanks to Dr. Huber for the response to my earlier question. Here is another matchprobes question that may be of more general interest, since it concerns memory usage (which in my experience has been a bigger problem than processing speed).

I have a folder of files, each file containing one AffyBatch object (a single array). I use the "load" command to read these files in batches of 10, then run a "combine" step, save the result to a file, and move on to the next batch of 10.

I find that my page file usage keeps growing, even though I have "removed" the original 10 AffyBatch objects and all references to them. As you might expect, I quickly exhaust my RAM. I have been unable to solve this on my own. In talking with some of the Bioconductor staff, I understand this may relate to the environments used in the affy package.

To reduce my memory usage I have tried:
    affybatch <- 0
    gc()
    rm(affybatch)
    putting the entire batching process in a separate function from which I exit before moving to the next batch
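To make the batching concrete, here is a minimal sketch of the loop described above. The directory layout, the object name "affybatch" inside each .RData file, and the use of Biobase's combine() generic are assumptions for illustration; the real matchprobes workflow may use a different merge step.

    ## Sketch of the batch-of-10 load/combine/save loop (assumed file layout
    ## and object names; swap combine() for whatever merge you actually use).
    library(affy)

    files   <- list.files("arrays", pattern = "\\.RData$", full.names = TRUE)
    batches <- split(files, ceiling(seq_along(files) / 10))

    for (i in seq_along(batches)) {
        ## load each single-array AffyBatch into a throwaway environment so
        ## no stray bindings to it survive the iteration
        objs <- lapply(batches[[i]], function(f) {
            e <- new.env()
            load(f, envir = e)
            get("affybatch", envir = e)
        })
        merged <- Reduce(combine, objs)
        save(merged, file = sprintf("combined_batch_%02d.RData", i))
        rm(objs, merged)   # drop all references before the next batch
        gc()               # then ask R to return the freed memory
    }

Even with the rm()/gc() calls at the end of each iteration, the behaviour described above (page file growth across batches) is what prompted this question.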
@james-w-macdonald-5106
Last seen 5 hours ago
United States
You don't mention what version of R you are using, nor your OS. However, since you are having memory re-allocation problems, I have to assume you are on win32 and that you are using R < 1.9.0 or 1.8.1-patched.

My understanding of memory issues in win32 with earlier versions of R is that the memory allocation process is sort of one-way, so you can run out of memory even if you are running the garbage collector to reclaim it. I am sure this is not technically correct, and if BDR were subscribed to this list he would correct me, but the effect remains: if you allocate too much memory to big objects you will eventually run out even if you try to reclaim it.

The patched version of R and R-1.9.0 have a different malloc that is supposed to be better at reclaiming memory, so you might go to Duncan Murdoch's website and get one or the other.

http://www.stats.uwo.ca/faculty/murdoch/software/r-devel/

Best,

Jim

James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623
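For reference, one way to see whether R is actually holding on to the memory is to query its own accounting before and after each batch. This is only a diagnostic sketch; memory.size() and memory.limit() are specific to Windows builds of R.

    gc(verbose = TRUE)       # cells and Mb currently in use, plus the session peak
    memory.size()            # Mb currently allocated to this R process (Windows only)
    memory.size(max = TRUE)  # high-water mark of memory obtained from the OS
    memory.limit()           # current ceiling; can be raised with memory.limit(size = ...)

If memory.size(max = TRUE) keeps climbing across batches while gc() reports little memory in use, the allocator is not returning pages to the OS, which matches the behaviour described in the question.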
@david-neil-hayes-573
Last seen 10.3 years ago
Thanks for the insight, I will try this. You are correct that I am using a Windows machine, R 1.8.1. Would this problem (and many related problems) be substantially improved if I switched to a Linux system?

Thanks,
Neil
@james-w-macdonald-5106
Last seen 5 hours ago
United States
Probably. I did a little study of the amount of memory required to read in a given number of either U95A or U133A chips on WinXP and SuSE Linux 7.3, and Linux did a better job. By 'better job', I mean that I didn't have to kill R after every run to free memory back up, and it appeared to use less memory per chip. Of course this was using R-1.7.1 and affy 1.3.1 and 1.3.3 (back in the dark ages ;-D).

Ben Bolstad did a more comprehensive study that you can see here:

http://stat-www.berkeley.edu/~bolstad/ComputeRMAFAQ/size.html

AFAIK, the new malloc is supposed to address some of the problems with memory allocation under win32, but I think the Linux memory allocation is still likely to be superior.

Best,

Jim

James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623
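As a rough local counterpart to those published numbers, you can gauge the per-chip cost on your own system by checking the size of a single loaded AffyBatch and the session totals around it. The object name below is hypothetical.

    ## Per-chip accounting on your own machine; "affybatch" stands in for a
    ## single-array AffyBatch already loaded into the workspace.
    format(object.size(affybatch), units = "Mb")  # in-memory footprint of one array
    gc()                                          # Ncells/Vcells totals for the whole session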