Question: read.FCS memory limit? ($PnR NA is larger than R's numeric limit)
lloyd.izard30 wrote, 9 months ago:

Good morning,

I am trying to read an FCS file, but I get the following error:

Error in readFCSdata(con, offsets, txt, transformation, which.lines, scale,  : 
  $PnRNAis larger than R's numeric limit:1.79769313486232e+308
In addition: Warning message:
In readFCSdata(con, offsets, txt, transformation, which.lines, scale,  :
  NAs introduced by coercion

The thing is, when I read the same data exported as a .csv file, it works fine:

# library
library(flowCore)

# data
# The CSV export reads without problems:
data <- read.table("file.csv", dec = ".", sep = ",")

# ...but the FCS file triggers the error above:
data <- read.FCS("file.fcs", alter.names = FALSE, transformation = "linearize")

My question is: is there a memory limit for read.FCS()? And if so, how do I extend it?
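A way to narrow this down (a sketch, assuming flowCore is installed and that "file.fcs" stands in for the problem file) is to read only the TEXT segment with flowCore's read.FCSheader() and look at the $PnR (parameter range) keywords, since the error message points at one of them being non-numeric or out of range:

```r
# Inspect the TEXT segment keywords without parsing the DATA segment.
library(flowCore)

hdr <- read.FCSheader("file.fcs")[[1]]   # named character vector of keywords
n_par <- as.integer(hdr[["$PAR"]])       # number of parameters in the file
ranges <- hdr[paste0("$P", seq_len(n_par), "R")]
print(ranges)  # a non-numeric or absurdly large value here is the likely culprit
```

If one of the printed $PnR values is empty or not a number, the file's metadata (not R's memory) is the problem.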

All the best, Lloyd Izard



I don't suspect a memory limit here. read.FCS was recently extended to cope with files bigger than 2 GB. The error sounds like a problem in reading and interpreting the TEXT segment of the FCS file; reading a CSV avoids this problem because a CSV carries no such metadata. I think you should report the error, including your sessionInfo() output and some information about the FCS file: instrument...

Meanwhile, you could verify that the FCS file complies with the FCS standard, using flowIO (CLIP, Prague) or the checker by Zach Bjornson.
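One workaround worth trying (a sketch, not a guaranteed fix): misparsed TEXT segments are often caused by how empty keyword values and double delimiters are handled, and recent flowCore versions expose an emptyValue argument on read.FCS for exactly this case:

```r
# Retry the read with emptyValue = FALSE, which changes how double
# delimiters in the TEXT segment are interpreted; a shifted/garbled
# keyword value (such as a bad $PnR) can sometimes be avoided this way.
library(flowCore)

ff <- read.FCS("file.fcs",
               transformation = "linearize",
               emptyValue = FALSE)
```

If that still fails, the keyword values themselves are probably malformed and the file should be fixed at the source (or reported to the instrument vendor).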


written 9 months ago (modified 9 months ago) by SamGG190


Powered by Biostar version 16.09