I am using featureCounts v2.0.2 on a Linux machine to summarize reads. My effective command is:
featureCounts -a annots.saf -F SAF -o counts.out -T 2 -p -A aliases.csv aligns.sorted.bam
This command fails with the error:
Segmentation fault (core dumped)
The largest chromosome in this organism is over 2 Gbp, so I suspected the large chromosomes might be the cause. To test this, I systematically removed all annotations for one chromosome at a time, starting with the largest and working down. featureCounts continued to fail with the same error until I had removed the three largest chromosomes from the SAF file; their lengths are 2006429847, 1877449469 and 1725456226 bp. Once all rows for these three chromosomes were removed from the SAF file, the same command succeeded on the same BAM file without any errors. Is there a limit on the chromosome size that featureCounts expects? How can I avoid this error?
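For reference, the elimination test above can be sketched as a small filter over the SAF file. This is a hypothetical helper, not part of featureCounts: the chromosome names in `large_chroms` are placeholders, and the column layout assumes the standard SAF header (GeneID, Chr, Start, End, Strand, tab-separated). It also checks that the reported lengths all fit below the signed 32-bit boundary, which is worth noting when hypothesizing about an overflow-related crash.

```python
# Sketch: drop SAF annotation rows whose chromosome is in an excluded set,
# mirroring the elimination test described above. Assumes standard SAF
# columns: GeneID <tab> Chr <tab> Start <tab> End <tab> Strand.

INT32_MAX = 2**31 - 1  # 2147483647, the signed 32-bit maximum

# Hypothetical names for the three largest chromosomes in this organism
large_chroms = {"chr1", "chr2", "chr3"}

def filter_saf(lines, excluded):
    """Yield SAF lines whose Chr column (index 1) is not in `excluded`."""
    it = iter(lines)
    yield next(it)  # keep the header line
    for line in it:
        chrom = line.rstrip("\n").split("\t")[1]
        if chrom not in excluded:
            yield line

saf = [
    "GeneID\tChr\tStart\tEnd\tStrand\n",
    "geneA\tchr1\t100\t500\t+\n",   # on an excluded (large) chromosome
    "geneB\tchr4\t200\t900\t-\n",   # kept
]
kept = list(filter_saf(saf, large_chroms))

# The three reported chromosome lengths are each below INT32_MAX,
# so the lengths themselves fit in a signed 32-bit integer.
for length in (2006429847, 1877449469, 1725456226):
    assert length < INT32_MAX
```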
I constructed a minimal reproducible example by copying a few genes from the largest chromosome into a SAF file and 1000 alignments from the same range into a SAM file. I don't see a way to attach files here, but I would be happy to provide them if needed.
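In case it helps others build a similar test case, here is a hedged sketch of how such a SAM subset could be extracted in plain Python. The function is hypothetical (not what I actually ran); field positions follow the SAM specification, where RNAME is column 3 and POS is column 4 (1-based).

```python
# Sketch: keep SAM header lines plus up to `limit` alignments whose
# RNAME matches `chrom` and whose POS falls inside [start, end].

def subset_sam(lines, chrom, start, end, limit=1000):
    kept = []
    n = 0
    for line in lines:
        if line.startswith("@"):        # header lines are always kept
            kept.append(line)
            continue
        fields = line.split("\t")
        # fields[2] is RNAME, fields[3] is POS per the SAM spec
        if fields[2] == chrom and start <= int(fields[3]) <= end:
            kept.append(line)
            n += 1
            if n >= limit:
                break
    return kept

# Toy usage with two minimal alignment records:
sam = [
    "@HD\tVN:1.6\n",
    "r1\t0\tchrL\t150\t60\t5M\t*\t0\t0\tACGTA\t*****\n",
    "r2\t0\tchrX\t150\t60\t5M\t*\t0\t0\tACGTA\t*****\n",
]
subset = subset_sam(sam, "chrL", 100, 200)
```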
Can you please send a link to the SAM and SAF files to yang.liao@monash.edu? It would be very helpful for us in finding the cause of the problem.

I sent an email with the attachments. Thank you!