Question: read.FCS memory limit? - "$PnR NA is larger than R's numeric limit"
lloyd.izard wrote:

Good morning,

I am trying to read an FCS file, but I get the following error:

Error in readFCSdata(con, offsets, txt, transformation, which.lines, scale,  : 
  $PnRNAis larger than R's numeric limit:1.79769313486232e+308
In addition: Warning message:
In readFCSdata(con, offsets, txt, transformation, which.lines, scale,  :
  NAs introduced by coercion

The thing is, when I read the same file exported in .csv format, it works without any problem.

# libraries
library('flowCore')
library('Biobase')
library('data.table')

# the CSV export of the same data reads without problems
data_csv <- read.table('file.csv', dec = ".", sep = ",")

# the FCS file triggers the error above
data_fcs <- read.FCS('file.fcs', alter.names = FALSE, transformation = "linearize")

My question is: is there a memory limit for read.FCS()? And if so, how do I extend it?

All the best, Lloyd Izard

SamGG wrote:

Hi,

I don't suspect a memory limit there. read.FCS() was recently extended to cope with files bigger than 2 GB. The error sounds like a problem in reading and interpreting the TEXT segment of the FCS file; reading a CSV file avoids the problem because a CSV carries no such metadata. I think you should report the error at https://github.com/RGLab/flowCore, including your sessionInfo() output and some information about the FCS file: instrument...
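
If you want to see where the parsing goes wrong, here is a minimal, untested sketch. It assumes the TEXT segment itself is still readable with flowCore's read.FCSheader(), which parses only the header and TEXT segment (not the data), and it dumps the $PnR (parameter range) keywords that the error message points at.

# sketch: inspect the $PnR keywords without reading the data segment
library(flowCore)

txt <- read.FCSheader('file.fcs')[[1]]    # named character vector of TEXT keywords

# pull out every $PnR keyword; a value that cannot be coerced to a number
# would explain both the "$PnR ... numeric limit" error and the
# "NAs introduced by coercion" warning
pnr <- txt[grep("^\\$P[0-9]+R$", names(txt))]
print(pnr)
print(suppressWarnings(as.numeric(pnr)))  # NA marks the offending parameter

If one of these values turns out to be empty or non-numeric, that is worth mentioning in the GitHub issue.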

Meanwhile, you could verify that the file complies with the FCS standard using flowIO at http://bioinformin.cesnet.cz/flowIO/ (CLIP, Prague) or the FCS validator at https://primitybio.github.io/fcs-validator/ (Zach Bjornson).

Best.
