Hi Rory et al.,
I'm hitting the memory limits of my server (96 GB RAM) when running
DiffBind::dba.count(), which results in my job getting killed.
I'm trying to generate a count matrix from many samples (>30), which translates into a large number of sites/peaks. I suspect R cannot allocate such a massive matrix in memory.
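For context, here is a minimal sketch of the kind of workflow where the job dies (the object and sample-sheet file names are just placeholders, not my actual paths):

```r
library(DiffBind)

# Load the experiment from a sample sheet describing the >30 samples
# ("samples.csv" is a placeholder name)
dbObj <- dba(sampleSheet = "samples.csv")

# This is the step that gets killed: counting reads across all
# consensus peaks for every sample, building the full count matrix
dbObj <- dba.count(dbObj)
```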
I've seen the argument bLowMem mentioned in some previous discussions, but it no longer seems to be recognised by dba.count(). Is that right?
Is there any way to use dba.count() in this scenario? Would something like the bigmemory package be helpful here?