Change `cpuonly` to a conda mutex #488
Conversation
Yep I've gotten lost here a few times. I don't know the answers, so let's try to ping @seemethere and @malfet as the experts here. If one of you can help answer these questions, then @scopatz can return the favour by updating the README with the info he's currently missing.
It looks like everything is based off of what is in nightly?
Yes this repository is a mess unfortunately :(
```sh
docker run --rm -it \
    -e PACKAGE_TYPE=conda \
    -e DESIRED_CUDA=cu101 \
    -e DESIRED_PYTHON=3.8 \
    -e PYTORCH_BUILD_VERSION=1.5.0 \
    -e PYTORCH_BUILD_NUMBER=1 \
    -e OVERRIDE_PACKAGE_VERSION=1.5.0 \
    -e TORCH_CONDA_BUILD_FOLDER='pytorch-nightly' \
    -v ${path_to_pytorch}:/pytorch \
    -v ${path_to_build}:/builder \
    -v "$(pwd):/final_pkgs" \
    pytorch/conda-cuda \
    /builder/conda/build_pytorch.sh |& tee build.log
```
Which is admittedly pretty awful.
Hi @seemethere - thanks! This was extremely helpful. I was able to get it successfully building locally. I have refactored the recipe to build with a mutex. There is clearly more work to do here in terms of cleanup and documentation, which I will be getting to next week. I wanted to mention this now as a status update and to give the opportunity for comments and feedback.
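(For context on what "build with a mutex" means in conda terms, here is a minimal sketch of the usual conda mutex pattern, using the `pytorch-proc` name that shows up in the recipe diff further down. Treat it as illustrative rather than the exact recipe in this PR.)

```yaml
# Illustrative only - the standard conda mutex pattern (cf. the blas and
# _openmp_mutex meta-packages), not necessarily the exact recipe here.

# cpuonly stays a tiny meta-package, but instead of being special-cased
# it simply pins the mutex package to its CPU variant:
package:
  name: cpuonly
  version: "1.0"

requirements:
  run:
    - pytorch-proc * cpu   # build-string constraint on the mutex

# pytorch's CPU builds would likewise depend on "pytorch-proc * cpu" and
# its CUDA builds on "pytorch-proc * cuda", so the solver can never mix
# CPU and GPU variants in one environment.
```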
Hi folks, I believe that this is working now. I have tried it locally in conjunction with the
That sounds good! This PR LGTM as far as I can judge.
What would be the steps to deploy this? E.g., since this changes the current `cpuonly`, would it require a coordinated release of `cpuonly`, `pytorch`, `torchvision`, etc.?
conda/cpuonly/meta.yaml (Outdated)
```yaml
    - pytorch-proc * cpu

outputs:
  # A meta-package to select CPU or GPU build for faiss.
```
This comment is a little confusing - why specifically for `faiss`? I assume it's for `pytorch` and possibly other packages too?
Oops, this change shouldn't have made it in.
The first step would be to get the nightly conda packages onto the new mutex scheme, so we can iron out bugs. This is going to need coordination with a release manager, e.g., @seemethere.
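(Roughly what that would look like from the user side, assuming the usual nightly channel and a CUDA 10.1 toolchain; exact package and channel names may differ:)

```sh
# CPU-only environment: cpuonly pulls in the CPU variant of the mutex,
# which forces CPU builds of pytorch, torchvision, etc.
conda install -c pytorch-nightly pytorch torchvision cpuonly

# CUDA environment: omit cpuonly and request a CUDA toolkit instead
conda install -c pytorch-nightly pytorch torchvision cudatoolkit=10.1
```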
BTW I have fixed the conflicts here and this is good to go, as far as I am concerned. Let me know if you want to sync on it @seemethere!
PR looks mostly good, going to hold off on merging until after the 1.7 release
Hi @scopatz! Thank you for your pull request. We require contributors to sign our Contributor License Agreement, and yours needs attention. You currently have a record in our system, but we do not have a signature on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
Hey! Quick bump -- we'd really like to use this!
@kev-zheng thanks for the ping. Just curious, where are you planning to use this?
I see the CLA check turned red, @scopatz would you mind signing it (takes about 10 seconds)?
We have a large conda environment that includes
Hi all. On the CLA issue, all code here was written on behalf of Quansight. Is this for some reason not covered by Quansight's corporate CLA (https://code.facebook.com/cla/corporate)?
We filled out both personal and corporate CLAs for each team member. You're not on the corporate CLA, because it was signed in early Nov just after you left (it's not retroactive).
Alright, in order to move this along, I have fully read through and signed the CLA. Sorry it took so long.
Thanks @scopatz!
So I'm in the process of trying to merge this and I'm running into a weird issue when attempting to build:
When I added a print statement to see what string it was getting caught up on, I found it was interpreting it as an int. @rgommers or @scopatz, do you guys have any idea what might be happening?
Maybe @mattip might also have some ideas.
Digging around in the sources, it seems the metadata parser will always prefer to convert to int?
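(That would be consistent with plain YAML behaviour: unquoted scalars are type-coerced by the loader, so quoting is needed to keep them as strings. A small illustration, not the exact field from the failing build:)

```yaml
# How a YAML loader typically types unquoted scalars:
version: 1.0        # parsed as the float 1.0
build_number: 1     # parsed as the int 1
version_str: "1.0"  # quoting keeps it the string "1.0"
```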
Would be great to have this!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!
Thanks so much for your work on this @scopatz, @rgommers, @seemethere and @ezyang! :) FYI, this issue has just started blocking PyTorch GPU installation with mamba from working. @wolfv tells me that this PR is required in order to get it working again. It looks like it was just about ready to merge AFAIK, but perhaps it got forgotten?
Seems like it did. Also, I would like to remove my Facebook CLA at this time. Can this please be merged / forked / etc. ASAP?
I'll follow up on this today.
pytorch/pytorch#54900 showed that this PR no longer worked as expected 3 months ago; something had changed in the meantime. It needs more work. Trying to hash out a plan together with @seemethere and @malfet. Probably we'll close and resubmit - should have it figured out by tomorrow.
Okay closing, we'll resubmit together with a more extended and documented test plan - will comment once that is done, so everyone who is interested knows where to look for updates.
Please go ahead @scopatz, should be fine now. Thanks for the work on this!
This is in an effort to address pytorch/pytorch#40213, but I think I have ended up with more questions than solutions:

- Where does the `meta.yaml` actually live? (a rough local-build sketch follows below)
- `build_pytorch.sh` seems like it would be better suited to being in Python or some kind of cross-platform language that actually has string formatting. I am curious as to why it is in Bash...
- ...the `pytorch-vX.Y.Z/` directory.

CC @ezyang @rgommers & thanks in advance!
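(On the first question: the `cpuonly` recipe reviewed above sits at `conda/cpuonly/meta.yaml` in this repository, and a small meta-package like that can be smoke-tested locally without the full docker workflow. A rough sketch, assuming `conda-build` is installed in the active environment and with a hypothetical output folder:)

```sh
# Build just the cpuonly meta-package from the repository root
conda build conda/cpuonly --output-folder ./local-pkgs
```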