Question: running out of memory
9.2 years ago, Javier Pérez Florido (840) wrote:
Dear list,

I'm trying to normalize around 100 CEL files using threestep(). I get the following error:

Error: cannot allocate vector of size 6.5 Mb

I've read the suggestions about this error on the mailing list, but I couldn't fix it. My system (Windows XP Professional, 32-bit) has 3 GB of RAM, and I did the following:

* I added --max-mem-size=3071M to the command line in the R shortcut.
* I changed boot.ini to allow up to 3 GB:

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /3GB /noexecute=optin /fastdetect

* I removed unused variables with rm() and ran the garbage collector gc() within the source code.

But even with those changes, I'm still running out of memory. It is strange because, looking at the Task Manager, R never uses more than 1600 MB and the whole system never goes beyond 2 GB.

Any other tips?

Thanks,
Javier
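For reference, a minimal sketch of the kind of session described above; the CEL directory path is a hypothetical placeholder, and the threestep() settings shown are its documented defaults, not necessarily the exact ones used:

    ## Sketch of the failing workflow (the path is a hypothetical placeholder)
    library(affyPLM)                     # provides threestep(); loads affy as well

    memory.limit()                       # on 32-bit Windows, shows the current cap in MB

    celfiles <- list.celfiles("C:/data/CEL", full.names = TRUE)
    abatch   <- ReadAffy(filenames = celfiles)

    eset <- threestep(abatch,
                      background.method = "RMA.2",
                      normalize.method  = "quantile",
                      summary.method    = "median.polish")

    rm(abatch)                           # drop objects that are no longer needed...
    gc()                                 # ...and ask R to return the memory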
• 519 views
modified 9.2 years ago • written 9.2 years ago by Javier Pérez Florido (840)
Answer: running out of memory
9.2 years ago, Steve Lianoglou (12k), Denali, wrote:
Hi,

I can't really comment on tweaking memory settings on Windows systems, but if all you're doing is trying to normalize a boat-load of Affy arrays together, I understand that the aroma.affymetrix package can do so while keeping memory requirements down. Perhaps you might consider looking into it until someone can give you better advice:

http://groups.google.com/group/aroma-affymetrix/web/overview

HTH,
-steve

--
Steve Lianoglou
Graduate Student: Computational Systems Biology
| Memorial Sloan-Kettering Cancer Center
| Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact
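To give a flavor of what that looks like, here is a rough sketch of an RMA-style pipeline in aroma.affymetrix. The dataset name and chip type are made-up placeholders, and the package expects its own annotationData/rawData directory layout, so treat the project site above as the authoritative guide:

    ## Rough aroma.affymetrix RMA sketch ("MyDataSet" and the chip type are hypothetical)
    library(aroma.affymetrix)

    cs  <- AffymetrixCelSet$byName("MyDataSet", chipType = "HG-U133_Plus_2")

    bc  <- RmaBackgroundCorrection(cs)
    csB <- process(bc)                  # results are written to disk, not held in RAM

    qn  <- QuantileNormalization(csB, typesToUpdate = "pm")
    csN <- process(qn)

    plm <- RmaPlm(csN)                  # RMA-style probe-level model
    fit(plm)
    ces <- getChipEffectSet(plm)        # per-array chip-effect (expression) estimates

The key design point is that each step streams over the CEL files and caches intermediate results on disk, so memory use stays roughly constant in the number of arrays.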
written 9.2 years ago by Steve Lianoglou (12k)
Dear Steve,

Thanks for your help, but I'm trying to preprocess using the RMA, GCRMA, VSN and dChip methods, and I don't see any function in aroma.affymetrix that performs the whole preprocessing...
written 9.2 years ago by Javier Pérez Florido (840)
Hello,

How big are your 100 CEL files in total? At what point does it break? Just at the normalization step?

Regards,

Carlos J. Gil Bellosta
http://www.datanalytics.com
written 9.2 years ago by Carlos J. Gil Bellosta (40)
> I understand that the aroma.affymetrix package can do
> so while keeping memory requirements down. Perhaps you might consider
> looking into it until someone can give you better advice:
>
> http://groups.google.com/group/aroma-affymetrix/web/overview

FYI, there is now a new web site (we'll remove the documentation from the above soon):

http://www.aroma-project.org/

/Henrik
written 9.2 years ago by Henrik Bengtsson (2.4k)
Hello,

You could try to perform your normalization on Amazon EC2. For a few dollars (possibly a few cents) you can "rent" a server with lots of RAM for a few hours.

I wrote a small guide on how to use R on such a service. The link is:

http://analisisydecision.es/probando-r-sobre-el-ec2-de-amazon/

It is in Spanish, though. You could load your data, set up a proper environment on the EC2 server (with Bioconductor packages and the like), run your normalization and download the results.

Best regards,

Carlos J. Gil Bellosta
http://www.datanalytics.com
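Once you have an R session on the rented machine, setting up the environment amounts to installing the preprocessing packages. A minimal sketch, using biocLite(), which was the Bioconductor installer at the time:

    ## On the remote R session: install the preprocessing packages
    ## (biocLite() was the Bioconductor installer in this era)
    source("http://bioconductor.org/biocLite.R")
    biocLite(c("affy", "affyPLM", "gcrma", "vsn"))

After the run, you only need to download the (small) normalized expression matrix, not the raw CEL files.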
written 9.2 years ago by Carlos J. Gil Bellosta (110)
Answer: running out of memory
9.2 years ago, James W. MacDonald (49k), United States, wrote:
Hi Javier,

You won't be able to normalize 100 chips with 3 GB RAM. If you really want to use threestep(), you will either have to come up with a 64-bit computer with more memory, or, as Steve Lianoglou mentioned, use aroma.affymetrix (which IIRC includes routines from affyPLM).

Best,

Jim

--
James W. MacDonald, M.S.
Biostatistician
Douglas Lab
University of Michigan
Department of Human Genetics
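As an aside (this option is not mentioned in the thread itself): if plain RMA on its own would be enough, rather than the other threestep() combinations, the affy package's justRMA() reads the CEL files directly into an ExpressionSet without building a full AffyBatch and is considerably lighter on memory. A minimal sketch, with a hypothetical path:

    ## For plain RMA only: justRMA() is much lighter on memory than ReadAffy() + rma()
    ## (the directory path is a hypothetical placeholder)
    library(affy)
    eset <- justRMA(celfile.path = "C:/data/CEL")
    exprs(eset)[1:5, 1:3]   # peek at the normalized log2 expression values

Even then, 100 arrays on 32-bit R may still be tight.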
written 9.2 years ago by James W. MacDonald (49k)
Answer: running out of memory
9.2 years ago, Javier Pérez Florido (840) wrote:
Yep, at the normalization step. The CEL files are around 800 MB in total.

Javier
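A back-of-envelope calculation suggests why this breaks even though the failed allocation (6.5 Mb) looks small. Assuming a chip roughly the size of HG-U133 Plus 2 (about 600,000 PM probes; the exact chip type is not stated in the thread), each double-precision copy of the PM matrix costs:

    ## Rough memory arithmetic (the chip size is an assumption, not from the thread)
    pm.probes <- 604258                  # approx. PM probe count on HG-U133 Plus 2
    arrays    <- 100
    pm.probes * arrays * 8 / 2^20        # ~461 MB per numeric copy of the PM matrix

Background correction, normalization and summarization each need working copies, so a 32-bit address space fragments long before the nominal 3 GB cap is reached: the error appears on a small allocation because no contiguous 6.5 MB block is left.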
written 9.2 years ago by Javier Pérez Florido (840)