
[SPARK-29819][SQL] Introduce an enum for interval units #26455

Closed
wants to merge 3 commits

Conversation

MaxGekk
Member

@MaxGekk MaxGekk commented Nov 9, 2019

What changes were proposed in this pull request?

In the PR, I propose an enumeration for interval units with the values YEAR, MONTH, WEEK, DAY, HOUR, MINUTE, SECOND, MILLISECOND, MICROSECOND and NANOSECOND.
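
For illustration, a minimal sketch of how such an enumeration could be declared with Scala's Enumeration. Only a few of the members appear in the diff excerpts further down; the ids and lowercase display names of the remaining units are assumptions following the same pattern:

```scala
object IntervalUnit extends Enumeration {
  type IntervalUnit = Value

  // Ids ascend with the temporal size of the unit; only Nanosecond, Day, Week,
  // Month and Year appear in the diff excerpts, the ids of the rest are assumed.
  val Nanosecond  = Value(0, "nanosecond")
  val Microsecond = Value(1, "microsecond")
  val Millisecond = Value(2, "millisecond")
  val Second      = Value(3, "second")
  val Minute      = Value(4, "minute")
  val Hour        = Value(5, "hour")
  val Day         = Value(6, "day")
  val Week        = Value(7, "week")
  val Month       = Value(8, "month")
  val Year        = Value(9, "year")
}
```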

Why are the changes needed?

  • This should prevent typos in interval unit names.
  • It gives stronger type checking of unit parameters (see the sketch below).
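
As an illustration of the type-checking point, here is a hypothetical helper (not part of the PR) that accepts the enum instead of a raw string, so an unsupported or misspelled unit is rejected at compile time. It builds on the enumeration sketched above:

```scala
import IntervalUnit._

// Hypothetical conversion helper: the parameter type is IntervalUnit, so the
// compiler rejects plain strings and catches typos such as "milisecond".
def toMicros(value: Long, unit: IntervalUnit): Long = unit match {
  case Microsecond => value
  case Millisecond => value * 1000L
  case Second      => value * 1000000L
  case _           => throw new IllegalArgumentException(s"Unsupported unit: $unit")
}

toMicros(5L, Second)      // 5000000
// toMicros(5L, "second") // does not compile: type mismatch
```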

Does this PR introduce any user-facing change?

No

How was this patch tested?

By the existing test suites ExpressionParserSuite and IntervalUtilsSuite.

@MaxGekk
Member Author

MaxGekk commented Nov 9, 2019

@srowen @dongjoon-hyun @cloud-fan Does this make sense to you?

object IntervalUnit extends Enumeration {
  type IntervalUnit = Value

  val Nanosecond = Value(0, "nanosecond")
Member

What is the first value (here 0) used for?

Member Author

This is just to be sure that the enum values are ordered, so Microsecond < ... < Second < ... < Year.
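
As an illustration (assuming the ids sketched earlier), Enumeration values compare by their integer id, so the units sort in ascending temporal order:

```scala
import IntervalUnit._

// Value extends Ordered[Value] and compares by id, so these assertions hold.
assert(Microsecond < Second && Second < Year)
assert(IntervalUnit.values.min == Nanosecond)
assert(IntervalUnit.values.max == Year)
```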

@@ -28,6 +28,22 @@ import org.apache.spark.unsafe.types.{CalendarInterval, UTF8String}

object IntervalUtils {

object IntervalUnit extends Enumeration {
Member

This is intended to be internal only?

Member Author

yes

@dongjoon-hyun
Member

Thank you for pinging me, @MaxGekk.

val Day = Value(6, "day")
val Week = Value(7, "week")
val Month = Value(8, "month")
val Year = Value(9, "year")
Member

Could you use all capitals like YEAR, please, @MaxGekk?

Member Author

This contradicts the Scala coding style guide (https://github.com/databricks/scala-style-guide#naming-convention): enums should be PascalCase.

Member

@dongjoon-hyun dongjoon-hyun Nov 10, 2019

I already knew that, but it's not the convention in Apache Spark. That's why I asked this way: #26455 (comment).

@dongjoon-hyun
Copy link
Member

@MaxGekk. The enum convention is a little mixed in Apache Spark, but all capitals are dominant at least in the core and sql modules. Other than that, this PR looks reasonable and is pending Jenkins.

@SparkQA

SparkQA commented Nov 9, 2019

Test build #113503 has finished for PR 26455 at commit 15d0038.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

@dongjoon-hyun dongjoon-hyun left a comment

Please fix the case.

@SparkQA

SparkQA commented Nov 10, 2019

Test build #113527 has finished for PR 26455 at commit 1667a33.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@MaxGekk
Member Author

MaxGekk commented Nov 10, 2019

jenkins, retest this, please

@SparkQA

SparkQA commented Nov 10, 2019

Test build #113531 has finished for PR 26455 at commit 1667a33.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

@dongjoon-hyun dongjoon-hyun left a comment

+1, LGTM. Merged to master.
Thank you, @MaxGekk.
