
MPI builds #132

Closed
njzjz opened this issue Nov 13, 2023 · 2 comments · Fixed by #133
Labels: question (Further information is requested)

Comments

@njzjz (Member) commented Nov 13, 2023

It seems to me that MPI is not enabled in this recipe. I'd like to know whether there is a specific reason for that, or whether I could contribute to enable MPI.

njzjz added the question label on Nov 13, 2023
@mattwthompson (Member) commented

I don't think there's a reason for it, other than that it hasn't seemed important as far as we know. Production runs are generally not going to use these builds, which value portability and reliability over performance.

I think a PR adding separate MPI builds would be welcome, but I'd like to ensure programs running in serial aren't affected. I'm not actually sure of the best way to do this; in the GROMACS feedstock we just build several variants and muck with the build number, which is not what the build number is meant for, but it works: https://github.com/conda-forge/gromacs-feedstock/tree/main/recipe

@jaimergp (Member) commented

Yeah, just lack of volunteer time to implement and maintain it. Same with the CUDA stuff. AmberTools takes a while to build, so each attempt is very time-consuming if things stop working due to a change somewhere.

njzjz mentioned this issue on Jan 4, 2024