chore: update rockcraft project, add sanity test, and tox.ini file #85
base: main
Conversation
serve:
  override: replace
kubeflow-volumes:
  override: merge
Changing override to merge, because there are envs set in the charm's pebble layer that are needed. I'm not sure whether those envs would still be set if the rock used replace, but merge makes sure they are.
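For context, a hedged sketch of the two pebble override modes being discussed (service name and env var are illustrative, not taken from the actual layer):

```yaml
services:
  kubeflow-volumes:
    override: merge      # merge fields into the charm's pebble layer,
                         # so env vars set there are preserved
    environment:
      APP_PREFIX: /volumes   # illustrative value
# with `override: replace`, the whole service definition from lower layers
# is discarded, so env vars set only in the charm's layer would be lost
```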
Small comments. Can we also add a link to the Dockerfile at the top?
python3 setup.py bdist_wheel
cp dist/kubeflow-1.1-py3-none-any.whl $CRAFT_STAGE
So we decided to remove this, but that means the built wheel stays in the build phase of the ROCK and won't be available in the final image. I think we need to keep this.
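To illustrate the point, a hedged sketch of how this copy step fits into a rockcraft part (the part name and plugin are illustrative; only the two commands come from the diff):

```yaml
parts:
  backend:                # illustrative part name
    plugin: nil
    override-build: |
      python3 setup.py bdist_wheel
      # copying into $CRAFT_STAGE is what carries the wheel past the build
      # phase into staging, and from there into the final image
      cp dist/kubeflow-1.1-py3-none-any.whl $CRAFT_STAGE
```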
mkdir -p $CRAFT_STAGE/frontend-src/node_modules/kubeflow
cp -r $CRAFT_STAGE/frontend-lib-src/dist/kubeflow/* $CRAFT_STAGE/frontend-src/node_modules/kubeflow

npm run build -- --output-path=./dist/default --configuration=production
Same problem as above: we never actually move the built components to the staging phase.
Are you sure? We do cd $CRAFT_STAGE/frontend-src/ on L88 before running the build, so the built components end up in the staging directory.
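A toy reproduction of the staging question above: anything written under $CRAFT_STAGE during override-build lives in the stage directory, so building after the cd does leave the artifacts staged. (CRAFT_STAGE here is a throwaway temp dir, not a real rockcraft run; the echo stands in for `npm run build` output.)

```shell
CRAFT_STAGE=$(mktemp -d)
mkdir -p "$CRAFT_STAGE/frontend-src"
cd "$CRAFT_STAGE/frontend-src"
echo "bundle.js" > dist-marker.txt   # stand-in for the npm build artifact
# the artifact is under $CRAFT_STAGE, i.e. already in the staging directory
test -f "$CRAFT_STAGE/frontend-src/dist-marker.txt" && echo "staged"
```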
Left some comments here, but I'm completely new (no experience with rocks or the team's process).
@@ -4,117 +4,133 @@ description: |
  This web app is responsible for allowing the user to manipulate PVCs in their Kubeflow
  cluster. To achieve this it provides a user friendly way to handle the lifecycle of PVC
  objects.
-version: v1.7.0_1
+version: "1.8.0"
Is there a reason for removing the v? In the description we still mention volumes-web-app:v1.8.0, is that ok?
It's the new convention for all ROCKs in the team to not have the v. The description was a typo, good observation.
rock_version = check_rock.get_version()
LOCAL_ROCK_IMAGE = f"{rock_image}:{rock_version}"

# assert the rock contains the expected files
Should we also check the stdout of those runs below? As it is, we only check that the src directory exists and that /src/entrypoint.py exists.
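A minimal sketch of the suggestion: capture and assert on stdout rather than relying only on the exit code. Here `echo` stands in for the docker-exec'd `ls /src` from the actual sanity test, so the command is illustrative.

```python
import subprocess

# capture_output + text lets us inspect what the command actually printed,
# instead of only checking that it exited 0
result = subprocess.run(
    ["echo", "entrypoint.py"],  # stand-in for listing /src inside the rock
    capture_output=True,
    text=True,
    check=True,
)
assert "entrypoint.py" in result.stdout
```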
LOCAL_ROCK_IMAGE = f"{rock_image}:{rock_version}"

# assert the rock contains the expected files
subprocess.run(
idea: Maybe using docker-py would be easier.
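A hedged sketch of the docker-py idea: drive the rock checks through the Docker SDK instead of raw subprocess calls. The function below is hypothetical; it assumes the `docker` package (docker-py) is installed and a local image with the given tag exists.

```python
def list_rock_src(image_tag: str) -> str:
    """Run `ls /src` inside the rock image and return its stdout."""
    import docker  # pip install docker

    client = docker.from_env()
    # containers.run with detach=False returns the container's output;
    # remove=True cleans up the container afterwards
    output = client.containers.run(image_tag, ["ls", "/src"], remove=True)
    return output.decode()
```

The test could then assert `"entrypoint.py" in list_rock_src(LOCAL_ROCK_IMAGE)` instead of parsing subprocess results.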
LOCAL_CHARM_DIR=charm_repo

[testenv:pack]
passenv = *
not-blocking: I think that passenv = * can be defined as "global" in [testenv].
good point
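A sketch of the suggested change: environments inherit settings from the base [testenv] section, so passenv only needs to be declared once (section contents besides passenv are illustrative):

```ini
[testenv]
# inherited by every [testenv:*] section below
passenv = *

[testenv:pack]
# passenv no longer needs to be repeated here
```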
closes #84
Summary
Testing
/src
Expected output:
Because the container image needs a kubeconfig and proper exposure of the serving port, the expected output will be followed by the following errors:
Another test you can do is run the upstream image and compare the output; it should be the same: