justGCRMA memory problem
Jianping Jin
Dear list:

My PC has 2 GB of RAM, and I set the maximum memory size for R to 2000M. But R stopped running when its memory use reached about 1 GB. The version I am using is R 2.2.0; I did not run into this problem with R 2.1.0. Has anyone seen the same problem with the new version of R?

> data <- justGCRMA()
Computing affinities.Computing affinities.Done.
Done.
Adjusting for optical effect..Error: cannot allocate vector of size 424868 Kb
> memory.limit()
[1] 2097152000
> memory.size()
[1] 1060229264

Thanks!

Jianping Jin, Ph.D.
Bioinformatics Scientist, Center for Bioinformatics
University of North Carolina, Chapel Hill, NC 27599
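[Editor's note: for anyone trying to reproduce this, a minimal sketch of the workflow and the memory checks used throughout this thread. It assumes R 2.x on 32-bit Windows with the gcrma package installed; the file pattern is a hypothetical illustration, not taken from the post.]

library(gcrma)                        # Bioconductor package providing justGCRMA()
files <- dir(pattern = "\\.CEL$")     # CEL files in the working directory (hypothetical pattern)
memory.limit()                        # current allocation cap, reported in bytes on R 2.x
data <- justGCRMA(filenames = files)  # the step that fails with "cannot allocate vector"
memory.size(max = TRUE)               # peak memory R has obtained from Windows so far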
James W. MacDonald
Hi Jianping,

I haven't noticed any differences in memory usage between versions. However, certain aspects of Windows' memory management can cause problems. First, if a large block of memory has already been allocated (even if it was subsequently freed), you may not be able to reuse it. Can you replicate the problem with a fresh instance of R and no other programs running on your computer?

Also, memory.size() simply gives the amount of RAM that R is currently using. More informative is memory.size(max = TRUE), which reports the maximum R has obtained so far.

Additionally, there is a note in the R for Windows FAQ that setting the memory limit above about 1.7 GB may be detrimental. I have never seen this problem myself, but you might try starting R with --max-mem-size=1700M and trying again.

HTH,

Jim

--
James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive, 7410 CCGC
Ann Arbor MI 48109
734-647-5623
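[Editor's note: a sketch of how the flag is applied and verified. The installation path is a hypothetical example; --max-mem-size is the startup option for R 2.x on Windows that Jim is referring to.]

## Start R from a shortcut or command prompt with the flag, e.g. (path hypothetical):
##   "C:\Program Files\R\R-2.2.0\bin\Rgui.exe" --max-mem-size=1700M
## Then, inside R, confirm what actually took effect:
memory.limit()            # should report 1700 * 1024^2 = 1782579200 bytes
memory.size(max = TRUE)   # peak memory obtained from Windows in this session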
Jianping Jin
Hi Jim, thanks for your quick reply!

I tried --max-mem-size=1700M and ran only R 2.2.0, with no other programs running at the time. The message I got is the following:

> data <- justGCRMA(filenames=files)
Computing affinities.Computing affinities.Done.
Done.
Adjusting for optical effect..Error: cannot allocate vector of size 424868 Kb
> memory.size(max=TRUE)
[1] 1194139648
> memory.limit()
[1] 1782579200

Any ideas about it?

Thanks again!

Jianping
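[Editor's note: a quick sanity check on those numbers, using nothing beyond the output above. 1700M converts exactly to the reported limit, so the flag did take effect:]

1700 * 1024^2             # = 1782579200 bytes, matching memory.limit() above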
James W. MacDonald
Odd. How many chips and what type are they? Maybe I can try to replicate that here.

Jim
Jianping Jin
I was reading HG-U133 Plus 2 chips; I tried to load 86 chips in total.

Thanks!

Jianping
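[Editor's note: for a sense of scale, a back-of-envelope sketch of why 86 Plus 2 arrays strain a 32-bit R session. The per-chip probe count below is an approximate assumption, not a figure from this thread.]

chips  <- 86
probes <- 1.2e6                  # assumed probe-level intensities per HG-U133 Plus 2 CEL file
bytes  <- chips * probes * 8     # one double-precision intensity matrix
bytes / 1024^3                   # ~0.77 GB for a single copy of the data
## gcrma works with several such objects at once (intensities, affinities,
## background estimates), so the peak footprint can approach the ~1.7 GB cap.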
Jianping Jin
Hi Jim,

FYI, I ran justRMA on R 2.1.0 and it was fine:

> files <- dir(pattern="CEL")
> data <- justRMA(filenames=files)
Background correcting
Normalizing
Calculating Expression
> memory.size(max=TRUE)
[1] 1536868352
> memory.limit()
[1] 1572864000

I ran into the same problem with justGCRMA on R 2.1.0 as on R 2.2.0.

regards,

JP-
James W. MacDonald
Yeah, I am a little thick today, it seems. Your error message was 'cannot allocate vector of size 424868 Kb', and your memory.size(max = TRUE) was 1194139648.

Add those up (noting that the failed vector is listed in *Kb*) and you get roughly 1,629,204,480 bytes, which is probably more RAM than you have available for R (Windows XP takes quite a bit of RAM for itself).

You could always get more RAM, but you will have to make R 'large address aware' to be able to access more than 2 GB (IIRC). I believe this means you have to build R from source as well. There is something in the R for Windows FAQ about this issue.

Best,

Jim
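[Editor's note: Jim's arithmetic, spelled out with the values from the session output above. The fragmentation point in the final comment restates his earlier remark about previously allocated blocks, not a new claim.]

failed_alloc <- 424868 * 1024    # "cannot allocate vector of size 424868 Kb", in bytes
peak_used    <- 1194139648       # memory.size(max = TRUE)
failed_alloc + peak_used         # 1629204480 bytes, ~1.52 GiB
1782579200 - peak_used           # headroom under the 1700M cap: ~561 MiB
## On paper the request fits, but the ~435 MB vector must be one contiguous
## block; fragmentation of the 32-bit address space makes that unlikely at
## this point in the run, hence the allocation failure.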
Jianping Jin
Thanks for your time, Jim! I will try that.

best,

JP-
