read.FCS memory limit? - $PnR NA is larger than R's numeric limit

Good morning,

I am trying to read an FCS file but I get the following error:

Error in readFCSdata(con, offsets, txt, transformation, which.lines, scale,  :
  $PnR NA is larger than R's numeric limit: 1.79769313486232e+308
In addition: Warning message:
In readFCSdata(con, offsets, txt, transformation, which.lines, scale,  :
  NAs introduced by coercion

The thing is, when I read the same data exported as .csv, it works without any problem:

# libraries

library(flowCore)
library(Biobase)
library(data.table)

# the .csv export reads fine
data_csv <- read.table('file.csv', dec = ".", sep = ",")

# the .fcs file triggers the $PnR error
data_fcs <- read.FCS('file.fcs', alter.names = FALSE, transformation = "linearize")

My question is: is there a memory limit for read.FCS()? And if so, how do I extend it?

All the best, Lloyd Izard

Tags: flowCore, read.FCS, memory_limit

Hi,

I don't suspect a memory limit here; read.FCS was recently extended to cope with files bigger than 2 GB. The error sounds like a problem in reading and interpreting the TEXT segment of the FCS file. Reading a CSV file avoids this problem because a CSV carries no such metadata. I think you should report the error at https://github.com/RGLab/flowCore, including the output of sessionInfo() and some information about the FCS file: instrument...
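
As a quick check on that hypothesis, you could inspect the per-parameter range keywords ($P1R, $P2R, ...) in the TEXT segment with read.FCSheader(), which parses only the header and TEXT segment and so may succeed even when the DATA segment cannot be read. This is just a sketch, assuming your file is called file.fcs as in your post:

library(flowCore)

# read only the TEXT segment keywords, without touching the DATA segment
hdr <- read.FCSheader('file.fcs')[[1]]

# pull out the per-parameter range keywords ($P1R, $P2R, ...)
rng <- hdr[grep("^\\$P[0-9]+R$", names(hdr))]
print(rng)

# entries that cannot be coerced to a number are what produce the
# "NAs introduced by coercion" warning and then the $PnR error
suppressWarnings(as.numeric(rng))

Any $PnR value printed there that is not a plain number would point at the keyword flowCore is choking on.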

Meanwhile, you could verify that the FCS file complies with the FCS standard using flowIO at http://bioinformin.cesnet.cz/flowIO/ (CLIP, Prague) or the validator at https://primitybio.github.io/fcs-validator/ (Zach Bjornson).
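
If the validators point at delimiter problems in the TEXT segment, you could also try toggling the emptyValue argument of read.FCS, which controls how double delimiters in keyword values are parsed. This is only a guess for your particular file, not a confirmed fix:

library(flowCore)

# with emptyValue = FALSE, double delimiters in the TEXT segment are treated
# as escaped delimiters inside a value rather than as an empty keyword value;
# this sometimes changes how keywords such as $PnR are read
ff <- read.FCS('file.fcs', transformation = "linearize",
               alter.names = FALSE, emptyValue = FALSE)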

Best.
