Installation memory demands (GO, humanLLMappings)
Jan T. Kim ▴ 70
@jan-t-kim-1050
Last seen 9.6 years ago
Dear Bioconductors,

Yesterday I tried to install Bioconductor on a notebook running Linux, with 512 MB RAM, 1 GB swap, Linux 2.6.0, gcc 3.3.2, Debian, and R 2.0.1 compiled from source (also yesterday). However, the installation got stuck upon installing GO. While working on GOTERM, the system started swapping excessively, slowing the R process down to a crawl, finally to 0.1% of CPU time. At that point I lost patience and killed the process. I found a report of the same kind of problem by searching the mailing list archive:

https://stat.ethz.ch/pipermail/bioconductor/2004-November/006881.html

I have a desktop Linux box on which Bioconductor installs without such problems; this system has 1 GB RAM, which perhaps makes the difference. I created a local installation of Bioconductor on that machine and copied the GO directory across to the notebook, which seems to have fixed the problem for now. Continuing the installation from there, humanLLMappings ran into the same kind of problem, so I killed R again and "installed" that package by again rsyncing the directory from the desktop (a rough sketch of the commands is at the end of this message). I'm somewhat puzzled because a number of weeks ago I installed Bioconductor on a notebook with similar hardware, without such problems.

I have the following questions / remarks:

* Does the installation generally require this much memory, or is this a quirk of that particular notebook (e.g. a buggy library version with an unfortunate memory leak or the like)?
* Are there workarounds, such as installing these packages manually in some clever way? The standard approach of downloading GO_1.6.8.tar.gz and R CMD INSTALLing it doesn't improve anything...
* If this is a general thing, I'd suggest mentioning it in the installation howto or the FAQ -- I googled for "memory installation site:www.bioconductor.org" with no relevant results.
* What do you think of my "cross-installation" fix; is that reasonable here? With the same version of R and a reasonably similar platform, it seemed worth a try and there are no obvious problems, but I haven't been able to test it thoroughly yet.

Best regards & thanks in advance for any comments,

Jan

--
Jan T. Kim
email: jtk@cmp.uea.ac.uk
WWW: http://www.cmp.uea.ac.uk/people/jtk
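For reference, the "cross-installation" amounted to something like the following. This is only a sketch of what I did: the library paths and the host name "notebook" are illustrative and depend on where R is installed on each machine, and both machines run the same R 2.0.1 built from source.

    # On the desktop (where the install succeeded), copy the already-installed
    # package directories into the notebook's R library tree.
    # Paths and the host name "notebook" are examples, not real locations.
    rsync -av /usr/local/lib/R/library/GO/ \
          notebook:/usr/local/lib/R/library/GO/
    rsync -av /usr/local/lib/R/library/humanLLMappings/ \
          notebook:/usr/local/lib/R/library/humanLLMappings/

This of course only stands a chance of working because the R version and platform match closely, which is exactly what I'd like a second opinion on.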
@marion-hakanson-1005
Last seen 9.6 years ago
> * does the installation generally require this much memory, or is this a
>   quirk of that particular notebook (e.g. a buggy library version with an
>   unfortunate memory leak or the like)?

It happens here, too (SPARC/Solaris-9, Sun Studio 8 compilers, 64-bit). It's most noticeable on building the metadata packages -- I've seen the R virtual memory size grow to ~1 GB when building the *CHRLOC and *LLMappings packages. Fortunately, my "build" machine has 2 GB of RAM.

> * what do you think about my "cross-installation" fix, is that reasonable
>   here? With the same version of R and a reasonably similar platform, it
>   seemed worth a try and there are no obvious problems, but I haven't been
>   able to test this thoroughly until now.

I routinely build/test on one machine and deploy on others. However, we are pretty careful to make sure identical OS, compilers, and shared libraries are installed on all our systems. The shared libraries are probably the most critical, as far as R/BioC are concerned.

Regards,

Marion
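For what it's worth, a quick sanity check that the deploy machine links against the same shared libraries as the build machine is to run ldd over the compiled package code on both and compare the output. The library path below is illustrative; use wherever your R packages actually live.

    # List the shared libraries that each package's compiled code links
    # against; /usr/local/lib/R/library is just an example library location.
    for so in /usr/local/lib/R/library/*/libs/*.so; do
        echo "== $so"
        ldd "$so"
    done

Running this on both machines and diffing the output makes library mismatches easy to spot.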
Hi,

The meta-data packages were just rebuilt (and with every rebuild they just keep getting bigger). I think that something odd (unintended odd) is going on at install time, but I have not had a chance to check on just what it is. In principle there is no reason for this to be a huge problem, but in practice it does seem to be sometimes. Hopefully we can find a reasonable solution soon.

Robert

--
Robert Gentleman
Head, Program in Computational Biology
Division of Public Health Sciences
Fred Hutchinson Cancer Research Center
email: rgentlem@fhcrc.org
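A rough way to see how much memory the install step actually uses is to run the install in one terminal and watch the R process from another. The tarball name is the one from the original post, and the exact process name to match may vary between setups.

    # Terminal 1: run the install as usual.
    R CMD INSTALL GO_1.6.8.tar.gz

    # Terminal 2: watch the virtual (VSZ) and resident (RSS) size of the
    # R process every 5 seconds while the install runs.
    watch -n 5 'ps -o pid,vsz,rss,pcpu,args -C R'

Seeing at which point the virtual size jumps would at least narrow down which part of the install is responsible.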