User: mnaymik
Reputation: 10
Status: New User
Location: United States
Last seen: 2 weeks, 3 days ago
Joined: 2 years, 7 months ago
Email: m******@asu.edu

Posts by mnaymik

20 results • page 1 of 2
0 votes • 0 answers • 127 views
FEM Errors out after Monte Carlo Runs
... What exactly does this error mean? Are there some parameters I can change to avoid this?   Error in sgcN.lo[[selMod.idx[m]]] :    attempt to select less than one element   ...
fem written 7 months ago by mnaymik10
0 votes • 0 answers • 312 views
SCDE for differential expression
... I really like the approach taken in the SCDE package for single cell differential expression. In the examples differential expression analysis is performed between two cell types but would it be appropriate to use this package for differential expression in other scenarios? For example, all cell typ ...
differential expression single cell scde written 15 months ago by mnaymik10
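The truncated question above asks about applying SCDE beyond the canonical two-cell-type comparison. As an illustrative sketch only (not taken from the post), the standard two-group scde workflow that any alternative grouping would be plugged into, with `cd` (integer count matrix, genes x cells) and `sg` (factor of cell groups) as assumed inputs:

```r
library(scde)

# Per-cell error models for the two groups in `sg` (assumed factor).
o.ifm <- scde.error.models(counts = cd, groups = sg, n.cores = 1)

# Expression magnitude prior, then the differential expression test.
o.prior <- scde.expression.prior(models = o.ifm, counts = cd)
ediff   <- scde.expression.difference(o.ifm, cd, o.prior,
                                      groups = sg, n.cores = 1)

# Genes ranked by absolute Z-score.
head(ediff[order(abs(ediff$Z), decreasing = TRUE), ])
```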
0 votes • 2 answers • 747 views
Comment: edgeR subsetting DGEList by column/sample
... Since I was using the time column as my group, I had set samples$group = NULL. Later I set group = time, which works just fine if I do it before subsetting. I did not realize group was that sensitive. Thanks! ...
written 16 months ago by mnaymik10
0 votes • 2 answers • 747 views
Comment: edgeR subsetting DGEList by column/sample
... > library(edgeR)
Loading required package: limma
> sessionInfo()
R version 3.3.1 (2016-06-21)
Platform: x86_64-apple-darwin13.4.0 (64-bit)
Running under: OS X 10.11.5 (El Capitan)
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages: [ ...
written 16 months ago by mnaymik10
3 votes • 2 answers • 747 views
edgeR subsetting DGEList by column/sample
... I saw this post from a while ago regarding a similar issue: https://support.bioconductor.org/p/61016/
> d$samples[1:6,]
sample                        lib.size  norm.factor  type  time
preExercise_TAGGCTGACTTGAG.1  856       1.1020236    B     pre
preExercise_TCCATC ...
edger dgelist subsetting written 16 months ago by mnaymik10 • updated 16 months ago by Gordon Smyth32k
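The fix that emerged in this thread (see the comment above) was to set the group column of the DGEList before subsetting by column. A minimal, self-contained sketch of that workflow, using a toy count matrix and a hypothetical `time` factor standing in for the poster's pre/post-exercise samples:

```r
library(edgeR)

# Toy data: 10 genes x 4 samples, with a pre/post "time" grouping.
counts <- matrix(rpois(40, 10), nrow = 10,
                 dimnames = list(paste0("g", 1:10), paste0("s", 1:4)))
time <- factor(c("pre", "pre", "post", "post"))

d <- DGEList(counts = counts, group = time)  # set group up front, not NULL
d$samples$time <- time

# Column subsetting keeps counts and the samples data frame in sync.
d.pre <- d[, d$samples$time == "pre"]
dim(d.pre)
```

Subsetting a DGEList like a matrix (`d[, j]`) drops the corresponding rows of `d$samples` as well, which is why a missing or NULL group column surfaces only at that point.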
0 votes • 1 answer • 433 views
Comment: Using Scran normalization within edgeR
... Yeah, I noticed that the change is very small. I went through the differential expression workflow in edgeR on the same dataset ordered differently and the results are almost identical. Thanks again for the help! ...
written 16 months ago by mnaymik10
0 votes • 1 answer • 433 views
Comment: Using Scran normalization within edgeR
... I also noticed that if I change the order of the samples in the data frame and normalize via the quick-cluster method, the size factors change slightly. Is there a random seed or something that is causing this? ...
written 16 months ago by mnaymik10
0 votes • 1 answer • 2.2k views
Comment: how to get normalized counts from edgeR
... By this method are the new 'nc' table values in counts per million? ...
written 16 months ago by mnaymik10
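The units question in the comment above can be answered in general terms: edgeR's `cpm()` returns counts per million, and with `normalized.lib.sizes = TRUE` it uses the TMM-adjusted effective library sizes. A sketch with toy data (the `nc` name mirrors the thread, not any fixed edgeR convention):

```r
library(edgeR)

# Toy data: 10 genes x 4 samples.
counts <- matrix(rpois(40, 50), nrow = 10,
                 dimnames = list(paste0("g", 1:10), paste0("s", 1:4)))
d <- calcNormFactors(DGEList(counts))  # TMM normalization factors

nc <- cpm(d, normalized.lib.sizes = TRUE)  # values are counts per million
log.nc <- cpm(d, log = TRUE)               # log2-CPM with a small prior count
```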
0 votes • 1 answer • 433 views
Comment: Using Scran normalization within edgeR
... Thank you very much! ...
written 16 months ago by mnaymik10
3 votes • 1 answer • 433 views
Using Scran normalization within edgeR
... I am trying to use custom size factors (computed using the scran package) in edgeR in the differential expression workflow, and I get the following error: d2 = estimateDisp(d2, design.mat, mixed.df = T, offset = sce$size_factor) Error in estimateDisp.default(y = y$counts, design = design, group = ...
edger sizefactors estimatedisp scran written 16 months ago by mnaymik10 • updated 16 months ago by Aaron Lun17k
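A hedged sketch of the usual route for the question above: scran size factors are typically converted to log-offsets for edgeR's GLM machinery rather than passed directly as `offset` to `estimateDisp()`. This assumes a SingleCellExperiment `sce` with a counts assay, a design matrix `design`, and a reasonably recent edgeR that provides `scaleOffset()`:

```r
library(scran)
library(edgeR)

# Assumed inputs: `sce` (SingleCellExperiment with counts) and `design`.
sce <- computeSumFactors(sce)                 # deconvolution size factors
y   <- DGEList(counts(sce))
y   <- scaleOffset(y, log(sizeFactors(sce)))  # offsets on a scale consistent
                                              # with the log library sizes
y   <- estimateDisp(y, design)                # stored offsets are used here
```

`scaleOffset()` centres the supplied log-size-factors against the library sizes, which is the step a raw `offset = sce$size_factor` argument skips.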

Latest awards to mnaymik

Scholar 2.6 years ago, created an answer that has been accepted. For A: BiSeq trimClusters FDR.loc confusion

Powered by Biostar version 2.2.0