Cannot allocate vector of size xxxxxxkb
@brooks-anthony-b-2325
Hi Guys,

I know this question has been asked many times before, but all the "solutions" I try seem to fail.

I have a Dell Optiplex with a 2 x 3.4 GHz dual-core processor and 4 GB of RAM, running Windows XP Professional. I am trying to read in a measly 16 HG-U133 Plus 2.0 arrays using the following script:

celnames <- choose.files(caption="Choose CEL file(s)")
celnames <- sort(celnames)
Data <- ReadAffy(filenames=celnames)

However, I get the "Cannot allocate vector of size" error. I have tried adding the --max-mem-size=2Gb flag to my shortcut and using memory.limit(size=4000). The thing is, I've used the same PC to analyse 27 arrays before, so I'm not sure what's changed. I've shut down as many applications running on my system as possible.

Is there an idiot's guide on how to get R to use more memory anywhere?

Thanks in advance,
Tony
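A minimal diagnostic sketch, not from the original post and assuming R for Windows of this era: these calls report the ceiling R was started with and how much memory the session has actually obtained, which helps narrow down whether the limit itself or leftover objects are the problem.

memory.limit()            # address-space ceiling for this session, in MB (Windows-only)
memory.size(max = TRUE)   # most memory obtained from Windows so far, in MB
gc()                      # free unreferenced objects and report current usage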
@dermot-morris-2132
Hi Tony,

You first need to make sure Windows can use the extra memory by inserting a /3GB switch. You can do this by editing the boot.ini file according to the procedure outlined in http://support.microsoft.com/kb/823440/

My boot.ini under Windows XP Pro looks like this:

[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(2)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="1. Microsoft Windows XP Professional" /fastdetect
multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="2. /3GB Microsoft Windows XP Professional" /fastdetect /3GB
multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="3. /3GB /USERVA 3072 Microsoft Windows XP Professional" /fastdetect /3GB /USERVA=3072

Note: options 2 and 3 are broadly similar; you only need one of them.

This can be very risky: if you don't do it correctly you won't be able to reboot the computer, so make sure you know what you are doing, or get someone who does.

Regards,
Dermot
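A short verification sketch, not part of Dermot's reply: after rebooting into the /3GB entry and starting R with a larger --max-mem-size value on the shortcut (the 3000 MB figure below is only an example), the new ceiling should be visible from inside R.

memory.limit()             # ceiling R was started with, in MB
memory.limit(size = 3000)  # or raise it explicitly now; returns the new limit in MB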
@park-ng-zaneta-2140
Hi Tony,

Have you tried running the ReadAffy code WITHOUT first running the memory.limit(size=4000) line? In my recent experience, some things will only load if I don't run the memory.limit line first. Whereas if I run memory.limit(size=4000) first, then try loading the data, I get the same "Cannot allocate vector of size..." error you are reporting. Weird, eh?

Hope that this helps you out :-)

Cheers,
Zaneta
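A sketch of the order of operations Zaneta describes, assuming the affy package is installed: apart from the gc() and object.size() calls it is the same code as in the original post, just run in a fresh session without touching memory.limit() first.

library(affy)
gc()                                      # start from a clean heap
celnames <- sort(choose.files(caption = "Choose CEL file(s)"))
Data <- ReadAffy(filenames = celnames)
print(object.size(Data), units = "Mb")    # how large the resulting AffyBatch actually is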