Question: GenomicDataCommons request timeouts when cases() %>% ... %>% results_all()
mk20 wrote, 3 months ago:

This issue was already posted on Biostars; I am just reporting it here for completeness.

https://www.biostars.org/p/359486/#359489

Answer: GenomicDataCommons request timeouts when cases() %>% ... %>% results_all()
Sean Davis wrote, 3 months ago:

I should clean up the documentation, but results_all() is a convenience wrapper that is not very smart: it simply tries to return all results in a single trip to the server, which can fail for several reasons related to the size of the result set. The better approach (and the only workable one for large result sets) is to page through the results:

library(GenomicDataCommons)
library(dplyr)   # for bind_rows() and as_tibble()

proj <- 'TCGA-COAD'

# Build the query: all TCGA-COAD cases, with the 'diagnoses' field expanded
query = cases() %>%
    GenomicDataCommons::filter(~ project.project_id == proj) %>%
    GenomicDataCommons::expand('diagnoses')

# Total number of matching cases, used to work out the page offsets
count = query %>% GenomicDataCommons::count()
size = 50

# Page through the results; 'from' is a zero-based record offset in the GDC API
reslist = lapply(seq(0, count - 1, by = size), function(page) {
    query %>%
        results(size = size, from = page) %>%
        as_tibble()
})

case_data = bind_rows(reslist)

Unfortunately, finding the largest "working" setting for the size parameter really requires trial and error, since the volume of the results can vary quite significantly. Instead, I usually just choose a smallish number like 50 and wait a few extra seconds. In theory, these calls can be parallelized using something like BiocParallel to get really fancy (at the cost of some added complexity).
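
For completeness, here is a minimal sketch of that BiocParallel idea, reusing the query, count, and size objects from the snippet above. bplapply() with MulticoreParam() is just one possible setup (forked workers, so on Windows you would use SnowParam() instead), and the worker count of 4 is an arbitrary example, not a recommendation.

library(BiocParallel)

# Same paging loop as above, but each page is fetched by a separate worker.
# MulticoreParam() forks the current session, so 'query' and the loaded
# packages are available to the workers without extra setup.
reslist = bplapply(seq(0, count - 1, by = size), function(page) {
    query %>%
        results(size = size, from = page) %>%
        as_tibble()
}, BPPARAM = MulticoreParam(workers = 4))

case_data = bind_rows(reslist)

Whether this actually helps depends on how the GDC endpoint handles concurrent requests, so it is worth benchmarking against the plain lapply() version before committing to it.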


mk20 replied, 3 months ago:

Thanks @Sean Davis, this is most helpful.