[SPARK-29819][SQL] Introduce an enum for interval units #26455
Conversation
@srowen @dongjoon-hyun @cloud-fan Does this make sense for you?
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/IntervalUtils.scala
object IntervalUnit extends Enumeration {
  type IntervalUnit = Value
  val Nanosecond = Value(0, "nanosecond")
What is the first value (here 0) used for?
This is just to be sure that enum values are ordered, so Microsecond < ... < Second < ... < Year.
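The ordering behavior described above can be sketched with a minimal, self-contained `scala.Enumeration` (the object name and the ids of the intermediate values here are illustrative, not necessarily the exact ones in the PR):

```scala
// Minimal sketch: scala.Enumeration orders its values by their numeric id,
// so passing explicit ids fixes the ordering Microsecond < Second < Year.
object IntervalUnitSketch extends Enumeration {
  type IntervalUnit = Value
  val Microsecond = Value(1, "microsecond")
  val Second      = Value(4, "second")
  val Year        = Value(9, "year")
}

object OrderingDemo extends App {
  import IntervalUnitSketch._
  // Enumeration#Value extends Ordered[Value], comparing by id.
  assert(Microsecond < Second)
  assert(Second < Year)
  // values iterates in id order, so the smallest unit comes first.
  println(values.mkString(", "))
}
```

Because comparison is by id, gaps between ids (1, 4, 9) are harmless; only the relative order matters.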
@@ -28,6 +28,22 @@ import org.apache.spark.unsafe.types.{CalendarInterval, UTF8String}

object IntervalUtils {

  object IntervalUnit extends Enumeration {
This is intended to be internal only?
yes
Thank you for pinging me, @MaxGekk.
val Day = Value(6, "day")
val Week = Value(7, "week")
val Month = Value(8, "month")
val Year = Value(9, "year")
Could you use all capitals like YEAR, please, @MaxGekk?
This contradicts the Scala coding style guide: https://github.com/databricks/scala-style-guide#naming-convention. Enums should be PascalCase.
I knew it already, but it's not followed in Apache Spark. That's why I asked this. #26455 (comment)
@MaxGekk . The enum convention is a little mixed in Apache Spark, but all capitals are dominant in
Test build #113503 has finished for PR 26455 at commit
Please fix the case.
Test build #113527 has finished for PR 26455 at commit
jenkins, retest this, please
Test build #113531 has finished for PR 26455 at commit
+1, LGTM. Merged to master.
Thank you, @MaxGekk .
What changes were proposed in this pull request?
In the PR, I propose an enumeration for interval units with the values YEAR, MONTH, WEEK, DAY, HOUR, MINUTE, SECOND, MILLISECOND, MICROSECOND and NANOSECOND.

Why are the changes needed?

Does this PR introduce any user-facing change?
No

How was this patch tested?
By existing test suites ExpressionParserSuite and IntervalUtilsSuite.
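Based on the value names in this description and the ids visible in the diff snippets above (Nanosecond = 0, Day = 6 through Year = 9), the proposed enum can be sketched roughly as follows; the ids of the intermediate units are assumptions, not taken from the merged code:

```scala
// Hedged sketch of the IntervalUnit enumeration proposed in this PR.
// Ids 1-5 are interpolated assumptions; 0 and 6-9 appear in the diff.
object IntervalUnit extends Enumeration {
  type IntervalUnit = Value
  val Nanosecond  = Value(0, "nanosecond")
  val Microsecond = Value(1, "microsecond")
  val Millisecond = Value(2, "millisecond")
  val Second      = Value(3, "second")
  val Minute      = Value(4, "minute")
  val Hour        = Value(5, "hour")
  val Day         = Value(6, "day")
  val Week        = Value(7, "week")
  val Month       = Value(8, "month")
  val Year        = Value(9, "year")
}
```

With the lowercase display names, a parsed unit string can be resolved back to an enum value via the standard `Enumeration.withName`, e.g. `IntervalUnit.withName("day")` returns `IntervalUnit.Day`.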