I am using DEXSeq to infer differential exon usage across 3 conditions, with over 100 samples in total and more than 500,000 exonic counting bins.
After estimating size factors, the `estimateDispersions()` step is taking a very long time to complete. I'm running it on 32 cores with 100 GB of memory dedicated to the job, but after 15 hours it's barely made any progress and is only at 13%.
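For reference, this is roughly what I'm running (the object and worker count are from my setup; the variable names are placeholders). I'm parallelizing via BiocParallel's `MulticoreParam`, which I understand `estimateDispersions()` accepts through its `BPPARAM` argument:

```r
library(DEXSeq)
library(BiocParallel)

# dxd is my DEXSeqDataSet: 3 conditions, 100+ samples,
# > 500,000 exonic counting bins
BPPARAM <- MulticoreParam(workers = 32)

dxd <- estimateSizeFactors(dxd)

# This is the step that stalls at ~13% after 15 hours:
dxd <- estimateDispersions(dxd, BPPARAM = BPPARAM)
```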
I read in an older post from Alejandro that one could fall back to the former TRT implementation, but apparently that has since been removed.
Is there any way around this issue? I'd appreciate any suggestions.