Dear list members,
I have a question about memory limitations. I know the issue is
frequently debated on the list, and I have checked the archive without
finding the answers I need.
I want to use affylmGUI to analyse 80 CEL files from Affymetrix HG-U133A
chips. Initially I tried to analyse the estrogen dataset, and it worked
very well. For my own data I constructed a target file with the three
column headings Name, FileName and Target. However, when I open a new
file and try to load the CEL files through the target file, the loading
process starts but then I receive the message "Not enough memory". I
have tried to increase the memory allocated to R with the command
memory.limit(size=2000). (I started with size=200 and raised the
number.) I also tried to allocate more memory using the target file
option under Properties.
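For reference, this is roughly what I have been running; memory.limit()
is Windows-only in R 1.9.1, and the sizes are in megabytes (an
alternative, I understand, is to start R with --max-mem-size=2000M):

```r
# Windows-only in this R version: query and raise R's memory ceiling (MB)
memory.limit()              # report the current limit
memory.limit(size = 2000)   # request a ceiling of ~2 GB
gc()                        # force garbage collection and report usage
```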
I use a 1.6 GHz Intel Pentium M processor with 1 GB of RAM, running
Windows XP. My R version is 1.9.1.
Now I am considering which solution would be best for my future work
with R, Bioconductor and limma:
1) Upgrade my machine to 2 GB of RAM: would that be enough to handle my
80 CEL files, and how many CEL files would be the limit with 2 GB of
RAM?
2) Alternatively, buy a desktop computer: what would you recommend as a
minimum processor and amount of memory?
In the long term I will have access to a continuous flow of CEL files,
so I want a solution that can handle more than "just" my present 80
files.
3) Are there any advantages to working on a Unix system instead of a
Windows platform?
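To give question 1 some substance, here is my back-of-envelope estimate
of the raw data size, assuming roughly 506,944 probe-level intensities
per HG-U133A array stored as 8-byte doubles (that probe count is my
assumption, not something I have verified):

```r
# Rough size of one copy of the probe-level intensity matrix
probes <- 506944            # assumed probe cells per HG-U133A array
arrays <- 80                # number of CEL files
bytes  <- probes * arrays * 8   # 8 bytes per double
mb     <- bytes / 2^20
mb                          # about 309 MB
```

So a single copy is about 309 MB, but I gather that normalisation and
expression summarisation can make several working copies, which may be
why 1 GB is not enough.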
I really hope some of you can give me some good advice before I try to
convince my supervisor that we need to make a big investment.
Thanks in advance!
Best regards,
Malene Herbsleb
Malene Herbsleb, MSc, Ph.D student
Molecular Diagnostic Laboratory
Aarhus University Hospital, Skejby Sygehus
Brendstrupgaardsvej
DK-8200 Aarhus N
Denmark
+45 89 49 51 29/ +45 89 42 31 31