---
title: "Frequently Asked Questions"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Frequently Asked Questions}
  %\VignetteEncoding{UTF-8}
  %\VignetteEngine{knitr::rmarkdown}
editor_options:
  chunk_output_type: console
---
```{r, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```
```{r setup}
library(clock)
library(magrittr)
```
## Why can't I do day arithmetic on a year-month-day?
It might seem intuitive that since you can do:
```{r}
x <- year_month_day(2019, 1, 5)
add_months(x, 1)
```
That you should also be able to do:
```{r, error=TRUE}
add_days(x, 1)
```
Generally, calendars don't support day-based arithmetic, nor arithmetic at any precision finer than day. Instead, you have to convert to a time point, do the arithmetic there, and then convert back (if you still need a year-month-day after that).
```{r}
x %>%
  as_naive_time() %>%
  add_days(1) %>%
  as_year_month_day()
```
The first reason for this is performance. A year-month-day is a _field_ type, implemented as multiple parallel vectors holding the year, month, day, and all other components separately. There are two ways that day based arithmetic could be implemented for this:
- Increment the day field, then check the year and month field to see if they need to be incremented, accounting for months having a differing number of days, and leap years.
- Convert to naive-time, add days, convert back.
Both approaches are relatively expensive. One of the goals of clock's low-level API is to make these expensive operations explicit. When you need to chain together multiple operations, do all of your _calendrical_ arithmetic steps first, then convert to a time point (i.e. the second bullet point from above) to do all of your _chronological_ arithmetic.
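The flip side of the field representation is that component access is immediate, since each component is already stored as its own vector. A quick sketch using clock's field accessors:

```{r}
# A year-month-day stores its components as separate parallel vectors,
# so individual fields can be read directly, with no conversion needed
ymd <- year_month_day(2019, 1, 5)

get_year(ymd)
get_month(ymd)
get_day(ymd)
```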
The second reason for this has to do with invalid dates, such as the three in this vector:
```{r}
odd_dates <- year_month_day(2019, 2, 28:31)
odd_dates
```
What does it mean to "add 1 day" to these? There is no obvious answer to this question. Since clock requires that you first convert to a time point to do day-based arithmetic, you'll be forced to call `invalid_resolve()` to handle these invalid dates first. After resolving them manually, day-based arithmetic makes sense again.
```{r}
odd_dates %>%
  invalid_resolve(invalid = "next")

odd_dates %>%
  invalid_resolve(invalid = "next") %>%
  as_naive_time() %>%
  add_days(2)

odd_dates %>%
  invalid_resolve(invalid = "overflow")

odd_dates %>%
  invalid_resolve(invalid = "overflow") %>%
  as_naive_time() %>%
  add_days(2)
```
## Why can't I add time to a zoned-time?
If you have a zoned-time, such as:
```{r}
x <- zoned_time_parse_complete("1970-04-26T01:30:00-05:00[America/New_York]")
x
```
You might wonder why you can't add any units of time to it:
```{r, error=TRUE}
add_days(x, 1)
add_seconds(x, 1)
```
In clock, you can't do much with zoned-times directly. The best way to understand this is to think of a zoned-time as containing 3 things: a sys-time, a naive-time, and a time zone name. You can access those things with:
```{r}
x

# The printed time, with no time zone info
as_naive_time(x)

# The equivalent time in UTC
as_sys_time(x)

zoned_time_zone(x)
```
Calling `add_days()` on a zoned-time is then an ambiguous operation. Should we add to the sys-time or the naive-time that is contained in the zoned-time? The answer changes depending on the scenario.
Because of this, you have to extract out the relevant time point that you care about, operate on that, and then convert back to zoned-time. This often produces the same result:
```{r}
x %>%
  as_naive_time() %>%
  add_seconds(1) %>%
  as_zoned_time(zoned_time_zone(x))

x %>%
  as_sys_time() %>%
  add_seconds(1) %>%
  as_zoned_time(zoned_time_zone(x))
```
But not always! When daylight saving time is involved, the choice of sys-time or naive-time matters. Let's try adding 30 minutes:
```{r, error=TRUE}
# There is a DST gap 1 second after 01:59:59,
# which jumps us straight to 03:00:00,
# skipping the 2 o'clock hour entirely

x %>%
  as_naive_time() %>%
  add_minutes(30) %>%
  as_zoned_time(zoned_time_zone(x))

x %>%
  as_sys_time() %>%
  add_minutes(30) %>%
  as_zoned_time(zoned_time_zone(x))
```
When adding to the naive-time, we got an error. With the sys-time, everything seems okay. What happened?
The sys-time scenario is easy to explain. Technically, this converts to UTC, adds the time there, and then converts back to your time zone. An easier way to think about it is that you sat in front of your computer for exactly 30 minutes (1800 seconds), then looked at the clock. Assuming that the clock automatically adjusts itself for daylight saving time, it should read 3 o'clock.
The naive-time scenario makes more sense if you break down the steps. First, we convert to naive-time, dropping all time zone information but keeping the printed time:
```{r}
x

x %>%
  as_naive_time()
```
We add 30 minutes to this. Because we don't have any time zone information, this lands us at 2 o'clock, which isn't an issue when working with naive-time:
```{r}
x %>%
  as_naive_time() %>%
  add_minutes(30)
```
Finally, we convert back to zoned-time. If possible, this keeps the printed time and just attaches the relevant time zone to it. However, in this case that isn't possible, since 2 o'clock didn't exist in this time zone! This _nonexistent time_ must be handled explicitly by setting the `nonexistent` argument of `as_zoned_time()`. We can choose from a variety of strategies to handle nonexistent times; here we just roll forward to the next valid moment in time.
```{r}
x %>%
  as_naive_time() %>%
  add_minutes(30) %>%
  as_zoned_time(zoned_time_zone(x), nonexistent = "roll-forward")
```
As a general rule, it often makes the most sense to add:
- Years, quarters, and months to a _calendar_.
- Weeks and days to a _naive time_.
- Hours, minutes, seconds, and subseconds to a _sys time_.
This is what the high-level API for POSIXct does. However, this isn't always what you want, so the low-level API requires you to be more explicit.
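To sketch that rule with clock's high-level methods for base R's `Date` (note that month arithmetic routes through a calendar, so an invalid result like `2019-02-31` must be resolved with the `invalid` argument; day arithmetic routes through a naive time, where no such issue exists):

```{r, eval=FALSE}
# "2019-01-31" plus one month would be the invalid "2019-02-31",
# so the calendar-based operation needs an invalid strategy
add_months(as.Date("2019-01-31"), 1, invalid = "previous")

# Day arithmetic on a Date goes through naive-time, no strategy needed
add_days(as.Date("2019-01-31"), 1)
```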
## Where did my POSIXct subseconds go?
```{r}
old <- options(digits.secs = 6, digits = 22)
```
Consider the following POSIXct:
```{r}
x <- as.POSIXct("2019-01-01 01:00:00.2", "America/New_York")
x
```
It looks like there is some fractional second information here, but converting it to naive-time drops it:
```{r}
as_naive_time(x)
```
This is purposeful. clock treats POSIXct as a _second precision_ data type, because POSIXct is implemented as a vector of doubles, which have a limited amount of precision. For example, try parsing a slightly smaller or larger fractional second:
```{r}
y <- as.POSIXct(
  c("2019-01-01 01:00:00.1", "2019-01-01 01:00:00.3"),
  "America/New_York"
)

# Oh dear!
y
```
It isn't printing correctly, at the very least. Let's look under the hood:
```{r}
unclass(y)
```
Double vectors have a limit to how much precision they can represent, and this is bumping up against that limit. So our `.1` seconds is instead represented as `.099999etc`.
This precision loss gets worse the farther we get from the epoch, 1970-01-01, represented as `0` under the hood. For example, here we'll use a number of seconds that represents the year 2050, and add 5 microseconds to it:
```{r}
new_utc <- function(x) {
  class(x) <- c("POSIXct", "POSIXt")
  attr(x, "tzone") <- "UTC"
  x
}

year_2050 <- 2524608000
five_microseconds <- 0.000005

new_utc(year_2050)

# Oh no!
new_utc(year_2050 + five_microseconds)

# Represented internally as:
year_2050 + five_microseconds
```
Because of these issues, clock treats POSIXct as a second precision data type, dropping all other information. Instead, you should parse directly into a subsecond clock type:
```{r}
naive_time_parse(
  c("2019-01-01T01:00:00.1", "2019-01-01T01:00:00.3"),
  precision = "millisecond"
) %>%
  as_zoned_time("America/New_York")
```
```{r}
# Reset old options
options(old)
```
## Why doesn't this work with data.table?
While the entire high-level API for R's native date (Date) and date-time (POSIXct) types will work fine with data.table, if you try to put any of the major clock types into a data.table, you will probably see this error message:
```{r, eval=FALSE}
library(data.table)
data.table(x = year_month_day(2019, 1, 1))
#> Error in dimnames(x) <- dn :
#> length of 'dimnames' [1] not equal to array extent
```
You won't see this issue when working with data.frames or tibbles.
As of now, data.table doesn't support the concept of _record types_. These are implemented as a list of equal-length vectors that together represent a single idea. The `length()` of these types is taken from the length of the underlying vectors, not the length of the list. If you unclass any of the clock types, you'll see that they are implemented this way:
```{r}
ymdh <- year_month_day(2019, 1, 1:2, 1)

unclass(ymdh)

unclass(as_naive_time(ymdh))
```
I find that record types are extremely useful data structures for building upon R's basic atomic types in ways that otherwise couldn't be done. They allow calendar types to hold information about each component, enabling instant access for retrieval, modification, and grouping. They also allow calendars to represent invalid dates, such as `2019-02-31`, without any issues. Time points use them to store up to nanosecond precision date-times, which are really C++ `int64_t` types that don't nicely fit into any R atomic type (I am aware of the bit64 package, and made a conscious decision to implement as a record type instead. This partly had to do with how missing values are handled, and how that integrates with vctrs).
The idea of a record type actually isn't new. R's own POSIXlt type is a record type:
```{r}
x <- as.POSIXct("2019-01-01", "America/New_York")
# POSIXct is implemented as a double
unclass(x)
# POSIXlt is a record type
unclass(as.POSIXlt(x))
```
data.table doesn't truly support POSIXlt either. Instead, you get a warning that the column has been converted to POSIXct. This is pretty reasonable, considering data.table's focus on performance.
```{r, eval=FALSE}
data.table(x = as.POSIXlt("2019-01-01", "America/New_York"))
#> x
#> 1: 2019-01-01
#> Warning message:
#> In as.data.table.list(x, keep.rownames = keep.rownames, check.names = check.names, :
#> POSIXlt column type detected and converted to POSIXct. We do not recommend use of POSIXlt at all because it uses 40 bytes to store one date.
```
It was previously a bit difficult to create record types in R because there were few examples and no resources to build on. In vctrs, we've added a `vctrs_rcrd` type that serves as a base to build new record types on. Many S3 methods have been written for `vctrs_rcrd`s in a way that should work for any type that builds on top of it, giving you a lot of scaffolding for free.
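As a rough sketch of what building on that foundation looks like (a hypothetical record type, not part of clock), `vctrs::new_rcrd()` takes the list of parallel fields and `vctrs::field()` extracts one of them, while `length()` correctly reports the field length rather than the list length:

```{r, eval=FALSE}
library(vctrs)

# A minimal record type: two parallel fields stored as a list,
# but behaving as a single vector of length 2
r <- new_rcrd(
  list(amount = c(1.5, 2.5), currency = c("USD", "EUR")),
  class = "my_money"
)

length(r)           # 2: the length of the fields, not of the list
field(r, "amount")  # extract one underlying field
```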
I am hopeful that as more record types make their way into the R ecosystem built on this common foundation, it might be possible for data.table to enable this as an approved type in their package.