DiffBind question: How to extract read counts?
k.panov • 0
@kpanov-15226
Last seen 6.0 years ago

Hi Rory,

I wonder if it is possible to extract the count data (after running dba.count) from the resulting DBA object in CSV format, with consensus peak names in the first column, peak positions in the second/third/fourth columns (chromosome, start, end), and the normalized number of reads for each sample in the subsequent columns.

Regards

Konstantin

diffbind read counts extract data • 2.3k views
Rory Stark ★ 5.2k
@rory-stark-5741
Last seen 14 days ago
Cambridge, UK

I'm not sure what you mean by consensus peak names? The peak intervals are specified only by their positions; the closest they have to a name is a number (1:numpeaks).

Here's one way of getting the columns you want in a csv (with the peak number as the peak name):

> normCounts <- dba.peakset(myDBA, bRetrieve=TRUE, DataType=DBA_DATA_FRAME)
> write.csv(normCounts, file="normalized_counts.csv")

 


Thank you very much Rory, that's what I need. Regarding peak names, I just wasn't sure whether any peak names (e.g. Peak 1, Peak 2, and so forth) were given in addition to the coordinates.

Regards

Konstantin


Hi Rory,

Just to clarify: are the counts extracted by the dba.peakset() function normalized to library size?

Regards

Konstantin

 


The values returned are controlled by the score parameter passed to dba.count(). The default score is TMM normalized, which takes library size into account. (You can tell the values are normalized because they are not integers.)

You can change the score without having to re-count the reads by calling dba.count() with peaks=NULL and score=DBA_SCORE_READS (or any of the other scores detailed on the dba.count() man page).
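Putting the two steps together, a sketch of switching to raw (un-normalized) read counts and exporting them might look like the following (assuming myDBA is a DBA object on which dba.count() has already been run):

> # Re-score the existing count data as raw read counts;
> # peaks=NULL reuses the counts already computed, so nothing is re-counted
> myDBA <- dba.count(myDBA, peaks=NULL, score=DBA_SCORE_READS)
> # Retrieve chr/start/end plus one column of counts per sample, then export
> rawCounts <- dba.peakset(myDBA, bRetrieve=TRUE, DataType=DBA_DATA_FRAME)
> write.csv(rawCounts, file="raw_counts.csv")

The file name raw_counts.csv here is just an example.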
