Commit 4bffcf5

MaxGekk authored and dongjoon-hyun committed
[SPARK-29275][SQL][DOC] Describe special date/timestamp values in the SQL migration guide
### What changes were proposed in this pull request?

Updated the SQL migration guide regarding the recently supported special date and timestamp values, see #25716 and #25708.

Closes #25834

### Why are the changes needed?

To let users know about a new feature in Spark 3.0.

### Does this PR introduce any user-facing change?

No

Closes #25948 from MaxGekk/special-values-migration-guide.

Authored-by: Maxim Gekk <max.gekk@gmail.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
1 parent cc852d4 commit 4bffcf5

File tree

1 file changed: +14 -0 lines changed


docs/sql-migration-guide.md

Lines changed: 14 additions & 0 deletions
@@ -201,6 +201,20 @@ license: |
   </tr>
   </table>
 
+  - Since Spark 3.0, special values are supported in conversion from strings to dates and timestamps. Those values are simply notational shorthands that will be converted to ordinary date or timestamp values when read. The following string values are supported for dates:
+    - `epoch [zoneId]` - 1970-01-01
+    - `today [zoneId]` - the current date in the time zone specified by `spark.sql.session.timeZone`
+    - `yesterday [zoneId]` - the current date - 1
+    - `tomorrow [zoneId]` - the current date + 1
+    - `now` - the date of running the current query. It has the same notion as today
+    For example `SELECT date 'tomorrow' - date 'yesterday';` should output `2`. Here are special timestamp values:
+    - `epoch [zoneId]` - 1970-01-01 00:00:00+00 (Unix system time zero)
+    - `today [zoneId]` - midnight today
+    - `yesterday [zoneId]` - midnight yesterday
+    - `tomorrow [zoneId]` - midnight tomorrow
+    - `now` - current query start time
+    For example `SELECT timestamp 'tomorrow';`.
+
 ## Upgrading from Spark SQL 2.4 to 2.4.1
 
 - The value of `spark.executor.heartbeatInterval`, when specified without units like "30" rather than "30s", was
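The resolution rules the diff documents can be sketched in plain Python. This is an illustration of the described semantics only, not Spark's implementation: it assumes a UTC session time zone and ignores the optional `zoneId` suffix; the function names are hypothetical.

```python
from datetime import date, datetime, time, timedelta, timezone

def resolve_special_date(value: str) -> date:
    """Map a special date string to the date it is shorthand for (UTC assumed)."""
    today = datetime.now(timezone.utc).date()
    specials = {
        "epoch": date(1970, 1, 1),               # fixed origin
        "today": today,                          # current date in the session time zone
        "yesterday": today - timedelta(days=1),  # the current date - 1
        "tomorrow": today + timedelta(days=1),   # the current date + 1
        "now": today,                            # same notion as `today` for dates
    }
    return specials[value]

def resolve_special_timestamp(value: str) -> datetime:
    """Map a special timestamp string to the timestamp it is shorthand for (UTC assumed)."""
    now = datetime.now(timezone.utc)
    midnight = datetime.combine(now.date(), time.min, tzinfo=timezone.utc)
    specials = {
        "epoch": datetime(1970, 1, 1, tzinfo=timezone.utc),  # Unix system time zero
        "today": midnight,                                   # midnight today
        "yesterday": midnight - timedelta(days=1),           # midnight yesterday
        "tomorrow": midnight + timedelta(days=1),            # midnight tomorrow
        "now": now,                                          # current query start time
    }
    return specials[value]

# Mirrors `SELECT date 'tomorrow' - date 'yesterday';`, which should output 2.
diff = (resolve_special_date("tomorrow") - resolve_special_date("yesterday")).days
print(diff)  # 2
```

In actual Spark 3.0 these strings are resolved once, at query parse time, against the session time zone (`spark.sql.session.timeZone`), which is why `now` behaves as the query start time rather than a per-row clock read.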
