[SPARK-27035][SQL] Get more precise current time
## What changes were proposed in this pull request?

In the PR, I propose to replace `System.currentTimeMillis()` by `Instant.now()` in the `CurrentTimestamp` expression. `Instant.now()` uses the best available clock in the system to obtain the current time. See [JDK-8068730](https://bugs.openjdk.java.net/browse/JDK-8068730) for more details. In JDK 8, `Instant.now()` provides results with millisecond resolution, but starting from JDK 9 the resolution increases to microseconds.
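The diff below calls a helper named `instantToMicros`; its exact implementation is not shown in this commit, but a minimal sketch of the assumed conversion (epoch seconds scaled to microseconds, plus the nanosecond-of-second field truncated to whole microseconds) looks like this:

```java
import java.time.Instant;

public class InstantToMicros {
    // Assumed sketch of the instantToMicros helper referenced in the diff:
    // scale epoch seconds to microseconds, then add the nanosecond field
    // truncated to whole microseconds. The *Exact variants throw on overflow
    // rather than silently wrapping.
    static long instantToMicros(Instant instant) {
        long us = Math.multiplyExact(instant.getEpochSecond(), 1_000_000L);
        return Math.addExact(us, instant.getNano() / 1_000L);
    }

    public static void main(String[] args) {
        // 1 second and 1500 ns past the epoch -> 1_000_001 microseconds
        System.out.println(instantToMicros(Instant.ofEpochSecond(1L, 1_500L)));
    }
}
```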

## How was this patch tested?

The changes were tested by `DateTimeUtilsSuite` and by `DateFunctionsSuite`.

Closes apache#23945 from MaxGekk/current-time.

Authored-by: Maxim Gekk <[email protected]>
Signed-off-by: Sean Owen <[email protected]>
MaxGekk authored and srowen committed Mar 6, 2019
1 parent 5fa4ba0 commit 6001258
Showing 2 changed files with 5 additions and 3 deletions.
4 changes: 3 additions & 1 deletion docs/sql-migration-guide-upgrade.md
@@ -97,7 +97,9 @@ displayTitle: Spark SQL Upgrading Guide

- the JDBC options `lowerBound` and `upperBound` are converted to TimestampType/DateType values in the same way as casting strings to TimestampType/DateType values. The conversion is based on Proleptic Gregorian calendar, and time zone defined by the SQL config `spark.sql.session.timeZone`. In Spark version 2.4 and earlier, the conversion is based on the hybrid calendar (Julian + Gregorian) and on default system time zone.

- In Spark version 2.4 and earlier, invalid time zone ids are silently ignored and replaced by GMT time zone, for example, in the from_utc_timestamp function. Since Spark 3.0, such time zone ids are rejected, and Spark throws `java.time.DateTimeException`.

- In Spark version 2.4 and earlier, the `current_timestamp` function returns a timestamp with millisecond resolution only. Since Spark 3.0, the function can return the result with microsecond resolution if the underlying clock available on the system offers such resolution.
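The resolution difference in the bullet above can be illustrated with a small sketch (hypothetical helper names, not part of the commit): a millisecond clock scaled to microseconds always ends in three zero digits, while an `Instant`-derived value can carry sub-millisecond digits when the underlying clock offers them.

```java
import java.time.Instant;

public class TimestampResolution {
    // Pre-3.0 path: a millisecond clock scaled up, so the microsecond value
    // is always a whole multiple of 1000.
    static long legacyMicros(long millis) {
        return millis * 1_000L;
    }

    // Post-3.0 path (sketch): an Instant can carry sub-millisecond digits
    // when the underlying clock provides them (JDK 9+).
    static long instantMicros(Instant i) {
        return i.getEpochSecond() * 1_000_000L + i.getNano() / 1_000L;
    }

    public static void main(String[] args) {
        // Millisecond clock: last three digits of the microsecond value are 0.
        System.out.println(legacyMicros(1_551_830_400_123L) % 1_000L);
        // Instant with nanosecond field set: sub-millisecond digits survive.
        System.out.println(
            instantMicros(Instant.ofEpochSecond(1_551_830_400L, 123_456_789L)) % 1_000L);
    }
}
```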

## Upgrading From Spark SQL 2.3 to 2.4

@@ -18,7 +18,7 @@
package org.apache.spark.sql.catalyst.expressions

import java.sql.Timestamp
-import java.time.LocalDate
+import java.time.{Instant, LocalDate}
import java.time.temporal.IsoFields
import java.util.{Locale, TimeZone}

@@ -96,7 +96,7 @@ case class CurrentTimestamp() extends LeafExpression with CodegenFallback {
override def dataType: DataType = TimestampType

  override def eval(input: InternalRow): Any = {
-    System.currentTimeMillis() * 1000L
+    instantToMicros(Instant.now())
  }

override def prettyName: String = "current_timestamp"
