memory - global test
Simo.rossi ▴ 70
@simorossi-2293
Last seen 9.6 years ago
Hi,

I get an error saying "Cannot allocate vector of size 95Mb" when using the 'globaltest' function. I used a matrix as input rather than an ExpressionSet; should I use an ExpressionSet to limit memory, or does it not matter?

My dataset contains 280 samples and 22215 probes: is that a problem? I have already increased the memory size.

Thanks,
Simona
Seth Falcon ★ 7.4k
@seth-falcon-992
Last seen 9.6 years ago
"Simo.rossi" <simo.rossi at email.it> writes:

> Hi,
>
> I get an error saying "Cannot allocate vector of size 95Mb" when using
> the 'globaltest' function. I used a matrix as input rather than an
> ExpressionSet; should I use an ExpressionSet to limit memory, or does
> it not matter?
>
> My dataset contains 280 samples and 22215 probes: is that a problem?
>
> I have already increased the memory size.

I suspect you simply do not have enough RAM to run 280 samples on your machine. Can you give us sessionInfo() output and details about your system, such as how much RAM it has?

Note also that it will help if you give us some context and a description of the operations you are attempting.

+ seth

--
Seth Falcon | Computational Biology | Fred Hutchinson Cancer Research Center
BioC: http://bioconductor.org/
Blog: http://userprimary.net/user/
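[For readers hitting the same error: the session and memory details Seth asks for can be gathered with a few base-R calls. A minimal sketch, assuming a Windows build of R from this era (the memory.* functions are Windows-only):]

```r
## Report R version, platform, and loaded packages
sessionInfo()

## Windows only: the current allocation cap, in Mb
memory.limit()

## Windows only: peak memory actually obtained from the OS so far, in Mb
memory.size(max = TRUE)

## Force a garbage collection and print a summary of memory in use
gc()
```

If memory.limit() is well below the physical RAM, it can be raised (e.g. memory.limit(size = 1500)), but no setting can push past what the machine and OS actually provide.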
Seth Falcon ★ 7.4k
@seth-falcon-992
Last seen 9.6 years ago
"Simo.rossi" <simo.rossi at email.it> writes:

> Hi Seth, thank you in advance!
>
> I have 1.5 Gb RAM on my PC.

sessionInfo()? We'll assume Windows. 1.5 GB of RAM may not be enough to do the computations you are interested in...

> I tried to run the following operations:
>
> esetdata <- read.table(file = "H_S.txt", sep = "\t", header = TRUE, row.names = 1)
> x <- as.matrix(esetdata)
> xnorm <- normalize.quantiles(x)
> go <- as.list(hgu133aGO2ALLPROBES)
> go <- lapply(go, function(x) x[!is.na(names(x)) & (names(x) != "IEA")])
> GO.cellcycle <- go[["GO:0016520"]]
> gt <- globaltest(xnorm, y, GO.cellcycle)
> sampled.gt <- sampling(gt)
>
> I'm not able to perform the global test for 281 samples; it works for
> 171 samples.

You might be able to handle a few more arrays if, after creating xnorm, y, and GO.cellcycle, you save these variables to disk using save(), restart R, and then load them up again.

+ seth

--
Seth Falcon | Computational Biology | Fred Hutchinson Cancer Research Center
BioC: http://bioconductor.org/
Blog: http://userprimary.net/user/
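[Seth's save/restart/load suggestion can be sketched as a two-session workflow; the file name here is illustrative, and the point is that the fresh R process no longer holds esetdata, x, or other intermediates:]

```r
## Session 1: after building the inputs, write only what the test needs
## to disk, then exit so all intermediate objects are released.
save(xnorm, y, GO.cellcycle, file = "globaltest-inputs.RData")
quit(save = "no")

## Session 2 (a fresh R process): reload just those three objects and
## run the test in a much emptier workspace.
library(globaltest)
load("globaltest-inputs.RData")
gt <- globaltest(xnorm, y, GO.cellcycle)
```

The gain comes from dropping the unnormalized copies (esetdata, x) and the full GO list, which together can easily exceed the size of the three objects the test actually uses.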
Simo.rossi ▴ 70
@simorossi-2293
Last seen 9.6 years ago
Hi Seth, thank you in advance!

I have 1.5 Gb RAM on my PC. I tried to run the following operations:

esetdata <- read.table(file = "H_S.txt", sep = "\t", header = TRUE, row.names = 1)
x <- as.matrix(esetdata)
xnorm <- normalize.quantiles(x)
go <- as.list(hgu133aGO2ALLPROBES)
go <- lapply(go, function(x) x[!is.na(names(x)) & (names(x) != "IEA")])
GO.cellcycle <- go[["GO:0016520"]]
gt <- globaltest(xnorm, y, GO.cellcycle)
sampled.gt <- sampling(gt)

I'm not able to perform the global test for 281 samples; it works for 171 samples.

Thanks,
Simona

> I suspect you simply do not have enough RAM to run 280 samples on
> your machine. Do you want to give us sessionInfo() output and details
> about your system such as how much RAM is there?
>
> Note also that it will help if you give us some context and
> description of the operations you are attempting.
>
> + seth