Question: DESeq error thrown during estimateDispersions w/ coxReid method
7.6 years ago by Steve Lianoglou (12k) wrote:
Hi,

(Maybe this is more appropriate for bioc-devel, but ...)

Using R-2.15-patched with DESeq_1.9.4, DESeq isn't liking one row in my count data and throws an error in the `estimateAndFitDispersionsWithCoxReid` function. Specifically this error:

```
Error in, y, family = MASS::negative.binomial(initialGuess), :
  NA/NaN/Inf in 'x'
```

The count data looks like this, where w1, w2, w3 are replicates of experiment w (and likewise for x, y, and z):

```
w1 w2 w3 x1 x2 x3 y1 y2 z1 z2
 0  0 18  0 52  0  0  0  1  1
```

OK -- it's weird, I'll grant you that. Still, instead of killing the entire run (it's a little time consuming), I was curious whether something could be done about such troublesome count rows. For instance, the `apply` loop could wrap each per-row fit in a `tryCatch()` and just set the dispersion for a failing row to NA. When all is said and done, perhaps emit a warning along the lines of "Cannot estimate dispersions for XX rows" and set their dispersion to `max(disps)`. You could even set the indices of the "bad" rows as an attribute of the object that is ultimately returned, so the user could remove them afterwards.

Would that be a reasonable thing to do?

Also, would you accept a patch to `estimateAndFitDispersionsWithCoxReid` that parallelizes it in a similar way to how DEXSeq parallelizes some of its CPU-intensive bits?

Thanks,
-steve

--
Steve Lianoglou
Graduate Student: Computational Systems Biology
 | Memorial Sloan-Kettering Cancer Center
 | Weill Medical College of Cornell University
Contact Info:
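To make the suggestion concrete, here is a minimal sketch of the tryCatch idea. This is not DESeq's actual code: `safeDispersions` and `estimateDispersionForRow` are hypothetical stand-ins for the per-row Cox-Reid fitting loop inside `estimateAndFitDispersionsWithCoxReid`.

```r
# Hypothetical sketch: wrap each per-row dispersion fit in tryCatch() so one
# pathological count row yields NA instead of aborting the whole run.
# estimateDispersionForRow is a stand-in for the real per-row fitting code.
safeDispersions <- function(counts, estimateDispersionForRow) {
  disps <- apply(counts, 1, function(row) {
    tryCatch(estimateDispersionForRow(row),
             error = function(e) NA_real_)  # failed fit -> NA, keep going
  })
  bad <- which(is.na(disps))
  if (length(bad) > 0) {
    warning(sprintf("Cannot estimate dispersions for %d rows", length(bad)))
    disps[bad] <- max(disps, na.rm = TRUE)  # fall back to the largest estimate
  }
  attr(disps, "badRows") <- bad  # so the user can drop these rows afterwards
  disps
}
```

The `badRows` attribute is one way to surface the failing indices without changing the return type; a separate accessor would work just as well.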
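And for the parallelization question, something in this spirit, using the `parallel` package's `mclapply` over rows; again a hypothetical sketch (`parallelDispersions`, `nCores`, and the per-row estimator are assumptions, not DESeq's API):

```r
library(parallel)

# Hypothetical sketch: fan the per-row dispersion fits out over cores with
# mclapply, similar in spirit to how DEXSeq parallelizes its heavy loops.
parallelDispersions <- function(counts, estimateDispersionForRow, nCores = 2) {
  res <- mclapply(seq_len(nrow(counts)),
                  function(i) estimateDispersionForRow(counts[i, ]),
                  mc.cores = nCores)
  unlist(res)
}
```

Note that `mc.cores > 1` relies on fork() and so degrades to serial execution on Windows; a socket-cluster variant would be needed there.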
cancer dexseq

