Trimming FCS files doesn't decrease size
@nick-england-8702

I have some FCS files which are too large to load into Kaluza, which is the software the scientists in my lab wish to use.

I'm trying to trim them down to just the first 5 million events so they can be loaded for analysis, and I have encountered an oddity.

The starting file is 801 MB. I can load it into R, which then reports 9,993,224 cells, using

library(flowCore)
x <- read.FCS(filename, transformation = FALSE)

(I tried using the which.lines parameter, but that resulted in R hanging indefinitely.)
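(For reference, the which.lines attempt looked roughly like this, with filename as a placeholder:)

x <- read.FCS(filename, transformation = FALSE, which.lines = 1:5000000)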

I then take a subset of the first 5 million events with:

trimmed <- x[1:5000000, ]

which reports 5 million rows, and I write this out with:

write.FCS(trimmed, "filepath", delimiter = "|")
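
To check the output, I re-read the written file and count the rows, along these lines ("filepath" being the same placeholder as above):

y <- read.FCS("filepath", transformation = FALSE)
nrow(y)   # reports 5,000,000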

I added the delimiter parameter after finding I couldn't otherwise re-load the written file into R. The resulting file, however, is 802 MB. Loading it back into R loads only 5,000,000 cells, so I am unsure why the file has grown slightly in size when I was expecting it to roughly halve.

I've also trimmed a file with 22 million rows (1.8 GB) down to 5 million rows, and it too comes out at 802 MB. Both trimmed files were read into Kaluza correctly, so the trimming seems to have worked; I'm just not sure why each is double the size of the other 5-million-row files written out by the FACS machine.
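In case it's relevant, I can pull the storage-related keywords from both flowFrames with something like the following ($DATATYPE and $P1B are the FCS keywords for the value type and the bits used for the first parameter); I haven't worked out what to make of them yet:

keyword(x)[c("$DATATYPE", "$P1B")]   # original file, as loaded above
keyword(y)[c("$DATATYPE", "$P1B")]   # re-read trimmed file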

I am not that familiar with FCS files, so there is probably something I am missing in the file specification. Can anyone be of assistance?

Thanks!

FCS