## User: pesquivel

Reputation: 0
Status: New User
Last seen: 3 years, 10 months ago
Joined: 3 years, 11 months ago
Email: p********@truveris.com

Profile information, website and location are not shown for new users.

This helps us discourage the inappropriate use of our site.

#### Posts by pesquivel

11 results • page 1 of 2
... Michael, thanks for providing technical details. It turns out that the answer to my particular question is easy to find: instead of using `findOverlaps()`, I simply used `overlapsAny()`:

```r
overlaps <- as.data.table(overlapsAny(a, b))
names(overlaps) <- c("group", "patient.id", "has.overlaps")
overlaps
```
...
written 3.8 years ago by pesquivel
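For readers landing here, the difference between the two calls can be sketched with plain IRanges objects (a hypothetical toy example, not the poster's grouped patient data; assumes the Bioconductor IRanges package is installed):

```r
library(IRanges)  # Bioconductor package; assumed installed

# Toy ranges (hypothetical; the original a and b hold grouped patient data)
a <- IRanges(start = c(1, 10, 20), end = c(5, 14, 24))
b <- IRanges(start = c(3, 30), end = c(8, 35))

# findOverlaps() enumerates every overlapping (query, subject) pair as a
# Hits object; overlapsAny() collapses that to one logical per range in 'a'.
overlapsAny(a, b)  # TRUE FALSE FALSE
```

`overlapsAny()` is the natural choice when the question is only "does this range overlap anything?", since no Hits bookkeeping is needed.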
... Hi Michael, here is a minimal example:

```r
a <- data.frame(as.character(c(seq(1:4), 4)),
                c("A", "A", "A", "A", "A"),
                c(as.Date("2013-10-30"), as.Date("2013-11-15"),
                  as.Date("2013-09-05"), as.Date("2013-11-29"),
                  as.Date("2013-12-27")),
```
...
written 3.8 years ago by pesquivel
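A slightly tidier way to write that minimal example, with the columns named as in the background post below; note that `c(seq(1:4), 4)` is just `c(1, 2, 3, 4, 4)`, which `seq_len(4)` expresses more directly. The column names are borrowed from the later post and may not match the original script exactly:

```r
# Hypothetical reconstruction of the minimal example with named columns
a <- data.frame(
  patient.id  = as.character(c(seq_len(4), 4)),
  drug.name   = rep("A", 5),
  start.date  = as.Date(c("2013-10-30", "2013-11-15", "2013-09-05",
                          "2013-11-29", "2013-12-27")),
  days.supply = 7,
  stringsAsFactors = FALSE
)
str(a)
```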
... Hi, maybe this is useful to note in the context of the 'different number of rows' error: the number of rows in data.table A can be expected to differ from the number of rows in data.table B, but the number of `patient.id` values should be identical in each data.table, because both data.tables refer to ...
written 3.8 years ago by pesquivel
... Hi, I've added the error-causing line of code, which I originally forgot, to my 'Initial code and error message' section:

```r
x <- findOverlaps(red.A, red.B)
```

As for data that works, I have CSVs that become the data.tables A and B for drugs A and B (respectively). Would you want me to provide these ...
written 3.8 years ago by pesquivel
... Hi, Background: I have two drug-utilization datasets for two drugs, A and B. Each row in the datasets represents a prescription, described by `patient.id`, `drug.name`, `start.date`, and `days.supply`. Both datasets have been filtered so they contain only seven-day prescriptions (`days.supply = 7` for a ...
written 3.8 years ago by pesquivel
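Since every prescription here is seven days long, the covered interval for each row follows from `start.date` and `days.supply`; a minimal sketch (column names as in the post, data invented):

```r
rx <- data.frame(
  patient.id  = c("1", "1"),
  drug.name   = "A",
  start.date  = as.Date(c("2013-10-30", "2013-11-03")),
  days.supply = 7
)

# Last covered day: a 7-day supply starting 2013-10-30 runs through 2013-11-05
rx$end.date <- rx$start.date + rx$days.supply - 1

# Integer day numbers like these can feed IRanges(start = ..., end = ...)
cbind(as.integer(rx$start.date), as.integer(rx$end.date))
```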
... Hi Michael, thank you so much for your help thus far! I've spot-checked the output of your code and found that it works great: three million rows for 4,000 patients are done in 0.11 seconds. Here is the final function I used:

```r
gapruler <- function (Claims) {
    ClaimsByMember <- with(Cla
```
...
written 3.9 years ago by pesquivel
... In the code below, LTV is the variable of interest. It is conceptualized as the length of time that patients are on a given medication, and we calculate it only for each included patient. LTV is defined as the difference between the maximum of the patient's end dates and the minimum of their start dates. Th ...
written 3.9 years ago by pesquivel
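The LTV definition in that excerpt can be sketched in base R (illustrative data and column names; the post's own code is truncated above):

```r
claims <- data.frame(
  patient.id = c("p1", "p1", "p2"),
  start.date = as.Date(c("2013-01-01", "2013-03-01", "2013-02-01")),
  end.date   = as.Date(c("2013-01-07", "2013-03-07", "2013-02-07"))
)

# Per patient: max(end.date) - min(start.date), in days
ltv <- tapply(seq_len(nrow(claims)), claims$patient.id, function(i)
  as.numeric(max(claims$end.date[i]) - min(claims$start.date[i])))
ltv  # p1 = 65, p2 = 6
```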
... Hi Michael, thanks again for your advice. I oversimplified what my protocol does in real life. The threshold gap length is not a hardcoded 60 days; rather, it is found as the 90th (or 99th, or 99.9th) percentile gap length, and the threshold percentile is itself a variable. It is still true, though, that ...
written 3.9 years ago by pesquivel
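The percentile-based threshold described here maps directly onto `quantile()`, with the probability itself held in a variable (the gap lengths below are invented):

```r
gap.days <- c(3, 10, 14, 30, 45, 61, 90, 120)  # illustrative gap lengths

threshold.pctile <- 0.90  # could equally be 0.99 or 0.999
threshold <- quantile(gap.days, probs = threshold.pctile)
unname(threshold)  # 99 under R's default (type 7) interpolation
```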
... Hi Michael, thank you for being a lifesaver! I could have been a bit clearer in saying that we remove any patient who has __even one__ too-long gap. It nevertheless inspired the following code, which differs from yours merely in that it does not employ `max()`. This approach can process 3 million r ...
written 3.9 years ago by pesquivel
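The "even one too-long gap" rule means `any()` suffices per patient, with no per-patient `max()` required; a base-R sketch with made-up names and data:

```r
gaps <- data.frame(
  patient.id = c("p1", "p1", "p2", "p2"),
  gap.days   = c(5, 70, 10, 12)
)
threshold <- 60

# TRUE for patients with at least one gap over the threshold
too.long <- tapply(gaps$gap.days > threshold, gaps$patient.id, any)
keep <- names(too.long)[!too.long]
keep  # "p2"
```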
... Hi all, I am an advanced beginner in R who is extremely thankful for IRanges! It has accelerated the process I describe below roughly 10-fold. Desired advice: because I am working with millions of records, I wonder whether further speed improvements can be obtained by creatively combining IR ...
written 3.9 years ago by pesquivel
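For context, the two IRanges verbs that make this kind of prescription-gap protocol fast: `reduce()` merges overlapping ranges into coverage episodes, and `gaps()` returns the uncovered stretches between them (toy day-number ranges; assumes IRanges is installed):

```r
library(IRanges)  # Bioconductor package; assumed installed

rx <- IRanges(start = c(1, 5, 20), end = c(7, 11, 26))  # day numbers

reduce(rx)  # merges 1-7 and 5-11 into 1-11; 20-26 stays separate
gaps(rx)    # the uncovered stretch 12-19
```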

#### Latest awards to pesquivel

Popular Question (awarded 3.8 years ago): created a question with more than 1,000 views, for "IRanges: reduce(), findOverlaps(), and intersect() by group".
