Question: DEXSeq: How to reduce compute time for estimating dispersions for large datasets.
Asked 8 months ago by mjrarcher0


I am using DEXSeq to infer differential exon usage across 3 conditions, with over 100 samples and more than 500,000 exonic counting bins per sample.

After estimating size factors, the `estimateDispersions()` step is taking a very long time to complete. I'm running it on 32 cores with 100 GB of memory dedicated to the job, but after 15 hours it has barely made progress and is only at 13%.
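For reference, the call I'm running looks roughly like the sketch below (object names such as `dxd` and the worker count are placeholders for my actual setup; the `BPPARAM` argument is BiocParallel's standard mechanism for parallelizing the dispersion fit):

```r
library(DEXSeq)
library(BiocParallel)

# dxd is a DEXSeqDataSet built from ~100 samples and ~500,000 exonic bins
BPPARAM <- MulticoreParam(workers = 32)

dxd <- estimateSizeFactors(dxd)

# This is the step that stalls: after 15 h on 32 cores it is only at ~13%
dxd <- estimateDispersions(dxd, BPPARAM = BPPARAM)
```
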

I read in an older post from Alejandro that one could use the former TRT implementation, but apparently that is no longer in use.

Is there any way around this issue? I'd appreciate any suggestions.





Powered by Biostar version 16.09