Running netbenchmark on my own datasets
Angel ▴ 40
@angel-7981
Last seen 7.1 years ago
Berlin

Sorry, friends,

I have a text file named mycounts.txt (rows are my 76 RNA-seq samples, columns are gene IDs) containing the raw read counts from my RNA-seq experiments. I was going to use netbenchmark (http://www.bioconductor.org/packages/release/bioc/html/netbenchmark.html) to infer a GRN with the embedded methods, but I got an error.

What should I replace in this code, please?
top20.aupr <- netbenchmark(methods="all",datasources.names = "Toy",
                               local.noise=20,global.noise=10,
                               noiseType=c("normal","lognormal"),
                               datasets.num = 2,experiments = 40,
                               seed=1422976420,verbose=FALSE)
I know R the least; I read the manual but really could not find any clue. I really need your help, because I do not know what should be replaced in the script in my case to avoid the error.

Thank you
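For reference, a minimal sketch of loading such a file for the package's wrappers. This is an assumption-laden example, not a definitive fix: it assumes mycounts.txt is tab-separated with sample IDs in the first column, and that the wrappers expect a numeric matrix with samples in rows and genes in columns, which matches the layout described above.

```r
# Sketch (assumptions noted above): read the counts into a numeric matrix
# shaped the way the GRN wrappers expect (samples x genes).
library(netbenchmark)

mycounts <- as.matrix(read.table("mycounts.txt", header = TRUE,
                                 sep = "\t", row.names = 1))
storage.mode(mycounts) <- "numeric"   # wrappers need numeric data
dim(mycounts)                         # expect 76 x <number of genes>
```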
 

I've notified the author so hopefully they will post back. To help them answer your question please show the exact error message and the output of sessionInfo().

Valerie

Thank you so much.

I think I found part of the solution, but I got another error. If possible, please suggest a solution:

net <- aracne.wrap(mycounts)

Error: cannot allocate vector of size 7.9 Gb
In addition: Warning messages:
1: In cor(dataset, method = estimator, use = "complete.obs") :
  Reached total allocation of 8088Mb: see help(memory.size)
2: In cor(dataset, method = estimator, use = "complete.obs") :
  Reached total allocation of 8088Mb: see help(memory.size)
3: In cor(dataset, method = estimator, use = "complete.obs") :
  Reached total allocation of 8088Mb: see help(memory.size)
4: In cor(dataset, method = estimator, use = "complete.obs") :
  Reached total allocation of 8088Mb: see help(memory.size)
> help(memory.size)
> memory.size(max = TRUE)
[1] 349.44

My computer has more than 1 terabyte, and it is a powerful Fedora machine... This matrix has 60 rows and 32550 columns.
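As a sanity check on the numbers (my own back-of-the-envelope arithmetic, not from the thread): a dense gene-by-gene matrix over 32550 genes of 8-byte doubles is almost exactly the 7.9 Gb the error reports, and cor() on a 60 x 32550 matrix builds such a matrix. Note also that memory.size() is a Windows-only function, so the 8088Mb cap reads like an 8 GB limit on that R session rather than the machine's total RAM.

```r
# The reported allocation failure is consistent with one dense
# gene-by-gene matrix: 32550^2 doubles at 8 bytes each.
n.genes <- 32550
n.genes^2 * 8 / 1024^3   # roughly 7.9 GiB, matching the error message
```

Filtering genes before inference shrinks this quadratically: halving the gene count cuts the matrix to a quarter of the size.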

thank you

pau.bellot ▴ 20
@paubellot-7828
Last seen 8.5 years ago
Spain/Barcelona/Universitat Politecnica…

Dear Fereshteh,

First of all, sorry for the delay in answering. The netbenchmark package was conceived to benchmark several GRN inference methods against a gold standard. There is a function in the development version that is meant to deal with your own data, but it also needs the gold standard; the function is named "netbenchmark.data".

Of course, you can use the wrappers of the GRN inference methods on their own; the data should be numeric. Take into account that they have different requirements, such as that genes must not have zero standard deviation. Every wrapper's documentation cites the original paper and package, so read those to see whether they can deal with RNA-seq data.
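A minimal sketch of the pre-filtering described above, assuming `mycounts` is the samples-by-genes numeric matrix from the question (the wrapper name aracne.wrap is taken from the earlier reply):

```r
# Drop constant genes (zero standard deviation) before inference,
# since some wrappers cannot handle them.
sds <- apply(mycounts, 2, sd)
mycounts.ok <- mycounts[, sds > 0, drop = FALSE]

net <- aracne.wrap(mycounts.ok)
```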

Best,
Pau

Thank you for clarifying.

Sorry, dear Pau,

I ran netbenchmark on my own data sets. For example, with the ARACNE wrapper I obtained a weighted adjacency matrix, which I was going to evaluate against true.net (the adjacency matrix derived from my own gold standard), but the resulting confusion matrix is weird. I have pasted the script I used below. If possible, help me see what I did wrong, or what I should do to calculate the F-score.

> mycounts <- read.table("aracne_net.txt", header = T, sep = "\t", row.names=1)
> head(mycounts[,1:4])
          AT1G01060 AT1G01170 AT1G01180 AT1G01260
AT1G01060 0.0000000 0.6174969  1.410928 0.0000000
AT1G01170 0.6174969 0.0000000  0.000000 0.2309380
AT1G01180 1.4109279 0.0000000  0.000000 0.0000000
AT1G01260 0.0000000 0.2309380  0.000000 0.0000000
AT1G01380 2.6555107 0.0000000  1.363213 0.5188701
AT1G01490 2.4668978 2.5065047  1.513176 1.5809029
> # watching the dimension of matrix
> dim(mycounts)
[1] 3123 3123
> # read as matrix
> mycounts <- as.matrix(mycounts)
> # viewing mycounts class
> class(mycounts)
[1] "matrix"
> # watching the head of the matrix
> head(mycounts[,1:4])
          AT1G01060 AT1G01170 AT1G01180 AT1G01260
AT1G01060 0.0000000 0.6174969  1.410928 0.0000000
AT1G01170 0.6174969 0.0000000  0.000000 0.2309380
AT1G01180 1.4109279 0.0000000  0.000000 0.0000000
AT1G01260 0.0000000 0.2309380  0.000000 0.0000000
AT1G01380 2.6555107 0.0000000  1.363213 0.5188701
AT1G01490 2.4668978 2.5065047  1.513176 1.5809029
> # reading the adjacency matrix derived by converting gold standard to adj MATRIX in matlab
> mycounts1 <- read.table("goldstandard_net.txt", header = T, sep = "\t", row.names=1)
> # watching the head of  file
> head(mycounts1[,1:4])
          AT1G01060 AT1G01170 AT1G01180 AT1G01183
AT1G01060         0         0         0         0
AT1G01170         0         0         0         0
AT1G01180         0         0         0         0
AT1G01183         0         0         0         0
AT1G01260         0         0         0         0
AT1G01380         0         0         0         0
> # watching the dimension of matrix
> dim(mycounts1)
[1] 3515 3515
> # read as matrix
> mycounts1 <- as.matrix(mycounts1)
> # viewing mycounts class
> class(mycounts1)
[1] "matrix"
> # watching the head of  file
> head(mycounts1[,1:4])
          AT1G01060 AT1G01170 AT1G01180 AT1G01183
AT1G01060         0         0         0         0
AT1G01170         0         0         0         0
AT1G01180         0         0         0         0
AT1G01183         0         0         0         0
AT1G01260         0         0         0         0
AT1G01380         0         0         0         0
> # loading the libraries
> library(netbenchmark)
> tbl <-  evaluate(mycounts,mycounts1,sym=TRUE,extend=0)
> View(tbl)
> head(tbl[1:4,1:4])
  TP FP      TN   FN
1  0  1 6171605 4249
2  0  2 6171604 4249
3  0  3 6171603 4249
4  0  4 6171602 4249
As you can see, the confusion matrix is strange:
> max(fscore(tbl))
[1] 0.002752041
> dev <- pr.plot(tbl, col="green", type="l")
> aupr(tbl)
[1] 0.0003045263
> idx <- which.max(fscore(tbl))
Thank you for any comment; I have been trying to evaluate these matrices for a long time.
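One likely culprit, offered as a guess rather than a confirmed diagnosis: the two matrices have different sizes (3123 x 3123 inferred network vs. 3515 x 3515 gold standard), and if evaluate() compares entries positionally, the gene sets must first be restricted to their intersection, in the same order. A sketch using the variable names from the transcript above:

```r
# Align the inferred network and the gold standard on their shared
# genes before evaluation, so entries are compared gene-for-gene.
common <- intersect(rownames(mycounts), rownames(mycounts1))
inf.net  <- mycounts[common, common]
true.net <- mycounts1[common, common]

tbl <- evaluate(inf.net, true.net, sym = TRUE, extend = 0)
max(fscore(tbl))
```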