affy & problems with memory
@ettinger-nicholas-1549
Last seen 10.2 years ago
Hello all! I have 16 Affymetrix HG-U133 Plus 2 CEL files that I am trying to analyze, and I am having trouble generating expression values because of memory problems. I am working on a Dell desktop running Windows XP. I can successfully generate expression values with RMA, but if I try MAS5, GCRMA, or Li-Wong instead, I consistently get the error "Cannot allocate vector of size 169362 Kb." To increase the memory available to R, I have already done two things: (a) I right-clicked the R icon and changed the box labeled 'Target' to read "C:\Program Files\R\R-2.2.0\bin\Rgui.exe" --sdi --max-mem-size=2000M, and (b) I increased the virtual-memory page file size to 2000 MB. But R still runs out of memory. Help! Any suggestions? I don't really have access to any bigger computers. Thanks. ---Nick
affy gcrma • 2.4k views
@ettinger-nicholas-1549
Last seen 10.2 years ago
Thanks for the tip! That worked great for GCRMA. I tried to look but couldn't find anything similar for MAS5 or for the Li-Wong algorithm. Did I miss something, or is the only solution there to get more RAM? Thanks. ---Nick

-----Original Message-----
From: Marta Agudo [mailto:martabar@um.es]
Sent: Tuesday, January 10, 2006 12:36 PM
Subject: RE: [BioC] affy & problems with memory

Try justGCRMA

Marta Agudo, PhD
Departamento de Oftalmología, Facultad de Medicina
Campus Espinardo, 30100 Murcia, Spain
Phone: +34 968363996
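A minimal sketch of the justGCRMA() route for later readers; the CEL-file directory is hypothetical, and the argument names (which mirror justRMA's) may differ slightly between gcrma versions:

    library(gcrma)

    ## justGCRMA() computes GCRMA expression values directly from the CEL
    ## files, never building a full AffyBatch, so it needs far less memory.
    eset <- justGCRMA(celfile.path = "C:/celfiles")

    ## Save the expression matrix as tab-delimited text.
    write.exprs(eset, file = "gcrma-expression.txt")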
Ettinger, Nicholas wrote:
> That worked great for GCRMA. I tried to look but couldn't find anything similar for mas5 or for the LiWong algorithm?
> Did I miss something or is the only solution there to get more RAM?

justMAS() in the simpleaffy package. I don't know of anything for Li-Wong (except, of course, for dChip itself).

Best,

Jim
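A hedged sketch of the justMAS() route; the argument names below are assumptions about the simpleaffy interface, so check ?justMAS in your installed version:

    library(simpleaffy)

    ## justMAS() computes MAS5-style expression values directly from the
    ## CEL files, avoiding a full AffyBatch in memory.
    files <- list.celfiles()   # CEL files in the working directory
    eset <- justMAS(fnames = files, fileDir = getwd())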
@james-w-macdonald-5106
Last seen 13 minutes ago
United States
Ettinger, Nicholas wrote:
> I can successfully generate expression values if I use "RMA" but if I try MAS5 or GCRMA or LiWong instead then I consistently get an error message with "Cannot allocate vector of size 169362 Kb." [...]
> Help! Any suggestions -- I don't really have access to any bigger computers?

You don't need a bigger computer, you need more RAM, which is pretty cheap these days. I can't imagine it would take more than 512 MB to do 16 U133 Plus 2 chips, so you might look into adding a 512 MB stick.

Best,

Jim

--
James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623
Hi Nick,

I'm currently working on that chip type myself, on a system with 512 MB RAM, and had to increase the memory assigned to R too. However, "--max-mem-size=1024Mb" seems to be OK in my case. Besides the question of how much memory the hgu133plus2 chip needs, maybe somebody out there could answer these two related questions:

* I'm running R under WinXP AND under Linux on the same machine, yet there has never been any memory problem under Linux. Is the memory allocation, or the assignment of the maximum memory size R can use, different between the two systems?

* On my 512 MB RAM system the old maximum memory value under WinXP was 512 MB. Is this just a coincidence, or does the maximum rise to 1024 MB if I upgrade the machine to 1024 MB RAM?

regards,
Benjamin
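For reference, on Windows builds of R from this era the ceiling can also be inspected and raised from inside a session, instead of editing the shortcut Target; a small sketch (1024 is just an example size):

    memory.size()              # MB currently in use by R
    memory.limit()             # current ceiling, in MB
    memory.limit(size = 1024)  # raise the ceiling, if Windows will grant it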
The maximum number of hgu133plus2 arrays that can be processed using expresso-like tools in R on a 32-bit Windows machine is about 30; for hgu95a arrays it is about 90. The rough computation is simple: each hgu133plus2 array contains 1164 * 1164 = 1,354,896 probes. Allowing at least 10 bytes to store each probe intensity as a double-precision float, the probe-level data for one array requires about 13.5 MB. Thus, 30 arrays require at least 400 MB just to store the probe-level data. Because R implements pass-by-value rather than pass-by-reference (and because Bioconductor has followed the usual good programming practice of writing and reusing modular functions), you can easily get as many as 3 copies of this data structure floating around in memory at the same time, putting us at the 1.2 GB level. This doesn't include any additional overhead for other parts of the AffyBatch structure, the space needed to store code objects, or anything else in your current session. Now, a 32-bit machine is limited to addressing 4 GB of RAM, but as far as I can tell, Windows won't let a single program see more than 2 GB, so you can begin to see why expresso can't handle more than about 30 of these arrays at once. The specialized functions like justRMA get around this problem by passing control to routines written in C, which presumably use pass-by-reference to avoid making extra copies of the data.

-- Kevin
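Kevin's arithmetic, checked with R as a calculator (a sketch; the 10 bytes per value is his rough allowance, and megabytes here are decimal, matching his figures):

    probes <- 1164 * 1164               # 1,354,896 probe cells per array
    mb.per.array <- probes * 10 / 1e6   # ~13.5 MB of probe-level data
    mb.per.array * 30                   # ~406 MB for 30 arrays
    mb.per.array * 30 * 3               # ~1219 MB with three transient copies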
I forgot: for GCRMA and RMA you could use justGCRMA and justRMA, which consume less memory...

regards,
Benjamin
Roel Verhaak
@roel-verhaak-710
Last seen 10.2 years ago
(This reply went out empty, quoting Benjamin's two questions above without adding anything; the apology and answers follow in the next post.)
My apologies for the previous (empty) mail, it slipped me :)

> I'm running R under WinXp AND under Linux on the same machine. However there has never been any memory problem in linux. So is the memory allocation or the assignment of the max. memory size that can be used by R different in the two systems?

This is due to the much more flexible memory management of Linux as opposed to Windows. The theoretical limit for a single process on a 32-bit Windows machine is 2 GB; under Linux it is 4 GB. Linux also has a more flexible way of swapping (finding memory on your local hard drive) than Windows, via its dedicated swap space.

> With my 512MB RAM system the old maximum memory value under WinXP was 512MB. Is this just a coincidence or does this maximum value rise to 1024MB if I upgrade my system to 1024MB RAM?

No, it is not a coincidence: the default maximum is tied to the amount of RAM available on your system, so it rises when you add more.

Cheers,
R.
Hi,

Would it be a solution to use RMAExpress to obtain expression values for a good number of arrays and then import them into your R session? I haven't got much experience with this program, but the RMAExpress webpage says it can process up to 250 arrays simultaneously (I guess not U133 Plus), and the history section states that the 0.4alpha3 version can run 200 arrays.

Any information on U133 Plus?

David
David,

RMAExpress can process an essentially unlimited number of arrays of any chip type. You can download the latest version at http://RMAExpress.bmbolstad.com

As output it produces a tab-delimited text file. You should have little difficulty importing this into R.

Best Wishes,

Ben
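A minimal sketch of the import step; the file name is hypothetical, and the layout (probeset IDs in the first column, one column per array) is an assumption about the RMAExpress export:

    expr <- read.table("rmaexpress-output.txt", header = TRUE, sep = "\t",
                       row.names = 1, check.names = FALSE)
    expr <- as.matrix(expr)   # probesets x arrays, log2 scale for RMA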
justRMA in affy is more memory efficient.

--Naomi

Naomi S. Altman 814-865-3791 (voice)
Associate Professor
Dept. of Statistics 814-863-7114 (fax)
Penn State University 814-865-1348 (Statistics)
University Park, PA 16802-2111
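A sketch of that route (the directory is hypothetical; justRMA() lives in the affy package and by default reads every CEL file in the given directory):

    library(affy)

    ## Reads the CEL files and computes RMA in one pass, without keeping
    ## a full AffyBatch in memory.
    eset <- justRMA(celfile.path = "C:/celfiles")
    exprs(eset)[1:5, 1:3]   # peek at the expression matrix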
I meant more memory efficient than loading in your CEL files and then running rma(). I don't know anything about RMAExpress (and I am sure Ben Bolstad's advice is best on this).

--Naomi
Xiaofan Li
@xiaofan-li-1566
Last seen 10.2 years ago
Hi,

I have met with similar problems (the memory allocation limit) dealing with affy. I have seen that the maximum memory allocation for R under Windows XP is the same size as your physical memory; in my case that is 768 MB. Reading some 40 U133A CEL files will trigger the error. I managed to avoid it by loading the files separately: for example, I load 20 files at a time, save the image, close R, re-open it, and reload the image. I have successfully loaded 1.4 GB of CEL files (121 chips) on my 768 MB laptop, creating a 450 MB .RData image.

However, there are further problems with the various data pre-processing methods. With a certain pile of chips (about 20), it is very easy to hit a memory allocation fault in rma(), expresso(), and normalize(), both under Windows XP and under Linux. And the problem is no longer the typical "768 MB limit" one, but is related to the size of the target object. Since R seems to use no object compression, this might be the real memory limit (under Windows: physical memory + virtual memory size; under Linux: physical memory + swap size). Since R doesn't seem to have a data-compression approach, nor a cleaning and garbage-collection mechanism, I would suggest that adding more RAM to your computer is the only way to solve the problem (alternatively, you can increase the swap/virtual memory size up to 4 GB minus physical memory, bearing in mind the extremely poor system performance that brings).

Xiaofan Li
DAMTP, University of Cambridge, CB3 0WA, UK
Tel +44 7886 614030, Email xl252 at cam.ac.uk
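A hedged sketch of the batch-loading workaround Xiaofan describes; file locations are hypothetical, and the later step of merging the saved chunks (affy provides a merge method for AffyBatch objects) is left out:

    library(affy)

    ## Read the CEL files in chunks of 20, parking each chunk on disk so
    ## that at most 20 arrays' raw data occupy memory at once.
    files <- list.celfiles(full.names = TRUE)
    chunks <- split(files, ceiling(seq_along(files) / 20))
    for (i in seq_along(chunks)) {
        ab <- ReadAffy(filenames = chunks[[i]])
        save(ab, file = paste("batch", i, ".RData", sep = ""))
        rm(ab)
        gc()   # release the chunk before reading the next one
    }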
Simon Lin
@simon-lin-461
Last seen 10.2 years ago
(The original message was sent with an unspecified character set and was scrubbed by the mailing-list archive; only an attachment survives: https://stat.ethz.ch/pipermail/bioconductor/attachments/20060116/9919e2e9/attachment.pl)
R does have a garbage-collection mechanism. See ?gc and the references therein. Invoking it explicitly after a massive rm(), just before reading data, is always good practice.

There are also ways to improve R's I/O performance. One of the easiest tricks is to reduce the precision of the numeric values to be read, since the input file is stored as text in RAM before being parsed (depending on the I/O function being invoked, of course, but it is the case with read.table and its cousins).

On Mon, 16-01-2006 at 14:21 -0600, Simon Lin wrote:
> Since R doesn't seem to have a data compressing approach as well as cleaning and garbage collection mechanism, I would suggest that adding more RAMs into your computer is the only way to solve the problem (alternatively you can increase the swap/virtual memory size up to (4GB - physical mem), bearing that extremely poor system performance).

Sincerely,

Carlos J. Gil Bellosta
http://www.datanalytics.com
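A self-contained sketch of the gc() advice (the matrix is just a stand-in for a large object); the commented read.table line shows a related parsing trick, with a hypothetical file name and column layout:

    x <- matrix(rnorm(2e6), ncol = 16)   # ~16 MB of doubles, a stand-in
    rm(x)                                # drop the reference...
    gc()                                 # ...then reclaim the memory now

    ## Declaring column classes up front spares read.table its
    ## type-guessing pass over the text representation:
    ## expr <- read.table("expr.txt", header = TRUE, sep = "\t",
    ##                    colClasses = c("character", rep("numeric", 16)))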