win 64bit and general BioC computing power?
@matthew-hannah-621
Hi,

This has been discussed before, but please humour me and let me know your opinions.

Currently I have a 2 GHz processor and 1 GB of RAM, running Windows 2000.

I understand that 64-bit R/BioC is only available for Linux at present; I currently install R from the win32 .exe build and the binary packages provided. When is a win64 .exe likely to be available, or is it possible to build from source (not that I really understand what that involves or how difficult it is)? More generally, is Windows really 64-bit, and does that help with things like running out of memory in Excel? Or would a good 32-bit processor be the better buy?

Thanks to the justGCRMA team I can now process the 60 Affy chips I need to, but this may grow to 120 in the next year. I'd like to be able to run ReadAffy, gcrma, fitPLM (affyPLM) and limma, certainly with 60 chips and hopefully with more, without it being painfully slow.

I'd also like to use various clustering methods. I've tried hclust, but it runs out of memory with more than a few thousand genes (there are 23k on the chip). I've heard that hierarchical clustering is not feasible on that many genes because of the exponential increase in memory usage - is this true, and how much RAM would you need for 23k genes and 60-100 chips?

Some functions, such as a 4 x 4 display of affyPLM image plots or scatter plots comparing multiple chips, can take an age to display. As BioC graphics are mostly 2D, would a good graphics card have much effect, or is it just a processor/RAM thing, leaving the graphics cards to the gamers?

Generally I just want a faster machine, since in the long term not having to wait so much would save a lot of time. But I want to upgrade to something without finding it is not up to the job in six months or a year. I use too many general Windows programs, and I'm networked, so I don't really want to move away from Windows unless there's a huge gain to be made.

So, what would you consider before deciding?

Cheers,
Matt
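For a rough sense of the memory question: hclust() works from a full pairwise distance matrix over the genes, so memory grows quadratically (not exponentially) with the number of genes, and the number of chips mainly affects how long each distance takes to compute rather than how much is stored. Below is a minimal sketch of the arithmetic plus one common workaround (clustering only the most variable genes); here 'eset' is a hypothetical ExpressionSet as returned by gcrma() or rma(), and the cutoff of 2000 genes is only illustrative.

## The dist object for n genes holds n*(n-1)/2 doubles (8 bytes each),
## before hclust() makes any working copies of its own.
n <- 23000
n * (n - 1) / 2 * 8 / 1024^3     # roughly 2 GB for the distance matrix alone

## Workaround sketch: cluster only the most variable genes.
library(Biobase)
x <- exprs(eset)                                  # genes x chips expression matrix
v <- apply(x, 1, var)                             # per-gene variance across chips
top <- x[order(v, decreasing = TRUE)[1:2000], ]   # keep the 2000 most variable genes
hc <- hclust(dist(top), method = "average")
plot(hc, labels = FALSE)

With 2000 genes the distance matrix shrinks to roughly 16 MB, which fits comfortably in 1 GB of RAM.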
GO • Clustering • affy • limma • gcrma • PROcess
@james-w-macdonald-5106
On Windows this is a moot point at present. The 64-bit version is still in beta testing, so if you want to use a 64-bit OS, you will have to use a *nix of some sort.

Best,
Jim

James W. MacDonald
Affymetrix and cDNA Microarray Core, University of Michigan Cancer Center
@adaikalavan-ramasamy-675
If you need RMA, GCRMA and the other pre-processing steps done only once (hopefully), it might be worthwhile asking someone with a Unix/Linux server to do this for you.

It might also be useful for you to have an account on a *nix machine. Using R on Unix is very similar to using it on Windows, so the learning curve is not steep. You can even log in to your account from Windows using ssh or X Windows tools like Exceed; that way you have access to both operating systems at the same time.

In a year's time memory will be cheaper, and GCRMA may be rewritten more efficiently (as RMA was). I would rather buy a reasonable machine now and add 1 GB of memory in a year's time, or connect to a *nix server, than buy a super-expensive machine now.
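To make the "pre-process once on a server" suggestion concrete, here is a minimal sketch, assuming the CEL files sit in the working directory on the Linux/Unix machine; the output file names are only illustrative. The resulting ExpressionSet is small enough to copy back to the Windows machine for limma, clustering and plotting.

## Run once on the server, in the directory holding the *.CEL files.
library(affy)
library(gcrma)

abatch <- ReadAffy()       # reads every CEL file in the directory
eset   <- gcrma(abatch)    # GCRMA background correction + normalisation

## Save the summarised expression values; load("eset-gcrma.RData") on the
## Windows side restores 'eset' for downstream work.
save(eset, file = "eset-gcrma.RData")
write.exprs(eset, file = "eset-gcrma.txt")   # optional flat-text export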
@liaw-andy-125
Nowadays you can get a dual Opteron box with 16 GB of memory, preloaded with Linux for x86-64. I would not call that 'super-expensive'. How much does it cost to run a 60- or 120-chip experiment?

To add to what Adaikalavan and others have said: we are in an essentially all-Windoze environment, but we were able to convince our management that for more serious computing, Windoze just doesn't cut it. We have a few Linux boxes fitted with lots of memory that are used only for computation and nothing else. The boxes sit in a server room, out of everyone's sight, and we just log in remotely using VNC from our standard Windoze laptops.

One need not move away from Windows entirely, just for the large-computation part.

Cheers,
Andy
@liaw-andy-125
> From: Liaw, Andy
>
> Nowadays you can get a dual Opteron box with 16 GB of memory, preloaded with Linux for x86-64.

Apologies, I missed one part of that sentence: for $9500. I would not call that 'super-expensive'.