Allow spark dependency to be configured dynamically #1326
Conversation
Signed-off-by: Ahmed Hussein <[email protected]> Fixes NVIDIA#1316 Allow user-tools to pick the SPARK dependencies based on a runtime env_var. The value format follows the same format of `buildver` in the scala pom file. Currently 333 and 350 (default) are supported. If user specifies an invalid value, there will be a warning message, then the process fails running the java cmd. **Changes** - Add dependency key to the platform config-file - A platform can define its own default dependency versions using `activeBuildVer` key - Add a default `RUNTIME_BUILDVER` in the `__init__.py` to allow upgrades of spark release during official releases - Read an env_var `RAPIDS_USER_TOOLS_RUNTIME_BUILDVER` to pick the correct dependency. - Currently, only `333` and `350` are supported. Default is `350`
Thanks @amahussein. Tested the changes. LGTME.
Thanks @amahussein. LGTME
lgtm. Thanks!
Can we also make sure to document the env variables?
Signed-off-by: Ahmed Hussein (amahussein) <[email protected]> Follow-up on NVIDIA#1326 to set the default Spark version to 3.4.2 for onPrem, avoiding the bug described in NVIDIA#1316 without requiring any action on the customer side.
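As a rough illustration of the per-platform pinning this follow-up describes, the onPrem config could carry its own default buildver. The exact key layout of onprem-configs.json may differ from this sketch, shown as a Python dict for readability:

```python
# Hypothetical shape of the dependency section in onprem-configs.json,
# expressed as a Python dict; the real key nesting may differ.
onprem_config = {
    'dependencies': {
        'activeBuildVer': '342',  # the follow-up pins onPrem to Spark 3.4.2
    }
}
```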
Signed-off-by: Ahmed Hussein [email protected]
Fixes #1316
Allow user-tools to pick the SPARK dependencies based on a runtime env_var. The value format follows the same format of
buildver
in the scala pom file.Currently 333 and 350 (default) are supported.
If user specifies an invalid value, there will be a warning message, then the process fails running the java cmd.
Changes
activeBuildVer
keyRUNTIME_BUILDVER
in the__init__.py
to allow upgrades of spark release during official releasesRAPIDS_USER_TOOLS_SPARK_DEP_VERSION
to pick the correct dependency.333
and350
are supported. Default is350
Docs changes
RAPIDS_USER_TOOLS_SPARK_DEP_VERSION
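For example, a user who needs the 3.3.3 dependency set could set the variable for a single run. A sketch follows; the CLI entry point and arguments are illustrative and may not match your setup:

```python
import os
import subprocess

# Select the 333 buildver for this run; per the PR, 333 and 350 are supported.
env = dict(os.environ, RAPIDS_USER_TOOLS_SPARK_DEP_VERSION='333')

# Illustrative invocation of the user tools CLI; adjust the command and
# arguments to your environment.
subprocess.run(['spark_rapids_user_tools', 'onprem', 'qualification'],
               env=env, check=True)
```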
**Possible followups**