
Default version capping included by pixi add? #639

Closed
paugier opened this issue Jan 9, 2024 · 19 comments · Fixed by #1508
Labels
✨ enhancement Feature request

Comments

@paugier commented Jan 9, 2024

Problem description

pixi automatically adds upper bounds for all dependencies. I don't think this is a reasonable choice for Python libraries (see for example https://iscinumpy.dev/post/bound-version-constraints/). I used Poetry, which also does this, and had to switch to PDM to avoid this behavior.

pixi even includes upper bounds for dependencies not using semver...

@paugier added the ✨ enhancement (Feature request) label on Jan 9, 2024
@ruben-arts (Contributor)

Hey @paugier,

We do this deliberately; check out the blog post @wolfv wrote: https://prefix.dev/blog/the_python_packaging_debate

There is another issue, #620, that discusses overriding the versions of certain packages.

On top of that, we might allow overriding the automatic version specification entirely through a machine-local configuration.

@paugier (Author) commented Jan 10, 2024

Thanks for the quick reply! I guess you know what you are doing, but I still don't understand the motivation for including strict upper bounds for every dependency.

There are cases where it is certainly incorrect; for example, capping pip or pytest does not make sense. But even for a Python library (for example, Pixi wrote mpi4py = ">=3.1.5,<3.2"), I don't understand how having so many strict upper bounds will not lead to incompatible versions of different Python libraries (i.e. what we experienced on PyPI).

But this may be due to my misunderstanding of Pixi. I'd like to invest time in Pixi mostly to be able to maintain, in the repository of some Python libraries, what is needed to create/maintain the conda-forge recipe (in particular, with a CI check that the conda package builds without issue). So I match the dependencies in pixi.toml to what will be used in the conda-forge recipe. I don't know if that is the right approach?

It seems to me that with upper bounds of dependencies, we have one syntax to say two very different things:

  • "I'm sure that it works with this range, and a version higher than X might lead to issues."
  • "I know that this code needs a version smaller than X"

If I publish a conda-forge package with mpi4py = ">=3.1.5,<3.2", I guess that environments using this package will never use mpi4py 3.2. That's good because we are sure it's going to work, but it's bad because my package will be incompatible with a package using mpi4py = ">=3.2.0,<3.3", whereas there is a very high probability that they are in fact compatible. It's also bad because users won't be able to enjoy potential improvements in mpi4py 3.2.
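
A sketch of the conflict (package names here are hypothetical):

# pixi.toml of a downstream project
[dependencies]
package-a = "*"  # published with the pin mpi4py = ">=3.1.5,<3.2"
package-b = "*"  # published with the pin mpi4py = ">=3.2.0,<3.3"
# No mpi4py version satisfies both ranges, so the solve fails,
# even though both packages very likely work with mpi4py 3.2.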

Therefore, I feel that there is something wrong with the generalization of strict upper bounds, and even after reading the blog post I don't see a good short-term solution.

Maybe tools like Pixi should be able to generate, from soft requirements (without upper bounds unless a known issue exists) plus a successful lockfile, a set of strong requirements (with upper bounds) that lead with very high probability to a working environment? Then one could ask for different things:

  • a working environment
  • an environment with the most recent versions of the dependencies (useful also to check if new versions break things)

@ruben-arts (Contributor)

We changed it to a range in PR #536 because of issue #285.

I can't seem to find it, but we have been talking about making the way we deal with this range globally configurable, because I can see why this would be different for Python libraries.

That said, it is a helper for the user, and they aren't forced to use it; the requirement is written in this format to make it easy for the user to edit it into what they really want the dependency to be.

If you use pixi add to add multiple dependencies, it will solve them at once and make sure the requirements in the pixi.toml are the highest set of requirements supported by the combination of the packages.
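
For illustration, in the style of the example later in this thread (package names and resolved ranges are hypothetical; the actual ranges depend on what the solver finds at the time):

❯ pixi add numpy scipy
✔ Added numpy
✔ Added scipy
❯ cat pixi.toml
[dependencies]
numpy = ">=1.26.4,<1.27"  # hypothetical resolved range
scipy = ">=1.12.0,<1.13"  # hypothetical resolved range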

It would help us if you would describe what your perfect world solution would look like, with some use-cases.

@baszalmstra (Contributor)

@paugier Thanks for your input.

I think the "strictness" requirement of the dependencies is highly dependent on the ecosystem. Although a minor version bump in a Python package is often still compatible, this can be very different for C++ packages, where an ABI break may have been silently introduced.

It's hard for us to know upfront whether a package is a Python package or not, so we chose to be relatively restrictive. Of course, as a user you are free to alter the requirement any way you please, but for current pixi projects, which are generally not published, I think the strict default suffices.

Like @ruben-arts said, in the future it would be great if we had a project or global configuration that allows configuring the default behavior. We could even write a less restrictive configuration to the project configuration by default when you initialize a Python project.

I would also be interested to know what you think we should default to. No upper bound at all? Or an upper bound on the major version?

Maybe tools like Pixi should be able to generate, from soft requirements (without upper bounds unless a known issue exists) plus a successful lockfile, a set of strong requirements (with upper bounds) that lead with very high probability to a working environment? Then one could ask for different things

I'd be interested to know how that could be done, because that would be awesome! I do see some issues, though: most incompatibilities are discovered at run- or build-time, not while resolving an environment.

@paugier (Author) commented Jan 10, 2024

this can be very different for C++ packages

Yes, it makes sense!

most incompatibilities are discovered at run- or build-time, not while resolving an environment.

Yes, sure! I was not explicit enough when I wrote "a successful lockfile": I meant a lockfile for which all the tests succeed.

Wouldn't it be enough to have minimum versions in pixi.toml, plus upper bounds for known issues, plus the date of the last lockfile that led to a successful build and tests? With this data, it should be possible to compute a working environment, shouldn't it? We would just need a command to tell Pixi that a given lockfile led to a successful build and tests.

I remember that I was able to recreate working environments with https://pypi.org/project/pypi-timemachine/.

@adriendelsalle

Pixi looks great, thanks for that!

I'm also quite confused by this automatic pinning, because I expect the CLI to help me fill the toml file to reflect my specification, not to add extra information from the resolution process that I didn't want to put there.

pixi add cmake writes cmake = ">=3.28.3,<3.29" to the toml file, so we lose the user's initial specification.

Would it be acceptable to pass an extra flag to the add subcommand, such as --pin=minor or --pin=*.*, to choose a pinning level (major, minor, patch, ...), plus the ability to set this as a default configuration (at the user, project, or system level)? It would be a much more flexible design.
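
A sketch of how such a flag could behave (hypothetical; neither the flag nor the exact output exists yet):

# Hypothetical: pin to the minor version (the current default behavior)
❯ pixi add cmake --pin=minor
# pixi.toml: cmake = ">=3.28.3,<3.29"

# Hypothetical: pin only to the major version
❯ pixi add cmake --pin=major
# pixi.toml: cmake = ">=3.28.3,<4"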

It would also make sense to have a pixi update that re-runs the solve from the specs (toml file) and asks the user to update the environment (and the lock file at the same time). I'm new to pixi and may have missed some commands (sorry if that's the case!).

I'm also confused that we can't list packages without a version spec in the dependency lists, i.e. cmake instead of cmake = "*" or cmake = "". It makes the toml file a bit less friendly. I suppose that if we don't need to edit the file by hand, because the CLI allows specifying the pinning we want (or none), it's maybe not that important.

@baszalmstra (Contributor) commented Feb 19, 2024

The reason we do this is that if you do not specify a spec, we basically take the "current" version, which is pinned exactly in the lock-file. A developer then starts building their software against that version. Later, once you add more packages or remove the lock-file, this could completely break if the spec is very "loose", due to the chaotic nature of package resolution.

Without a spec you could end up with a wildly different version than what you initially developed with, which can cause annoying dependency conflicts.

With the current approach we try to strike a middle ground for people who don't really care too much, while still allowing flexibility and simple upgrade paths without (semver) breakage. We think it's a sane default that is not uncommon in other package managers (cargo does the same thing, as does every node package manager).

Note that if you do specify a spec alongside the package name, that spec is used verbatim (e.g. "cmake *" will be added exactly as written).
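
For example (illustrative, using the cmake spec mentioned above):

❯ pixi add "cmake *"
# pixi.toml now contains the spec verbatim:
# cmake = "*"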

That being said, I do like your suggestion of the --pin flag! That sounds like a good way for people to modify the default behavior if so desired.

As for having to always specify a requirement, this is on purpose too. I think it's very important to think about the bounds you provide. From my personal experience, bounds that are too loose will always bite you later.

@ruben-arts added the needs-decision (Undecided if this should be done) label on Feb 29, 2024
@traversaro (Contributor)

Would it be acceptable to pass an extra flag to the add subcommand, such as --pin=minor or --pin=*.*, to choose a pinning level (major, minor, patch, ...), plus the ability to set this as a default configuration (at the user, project, or system level)? It would be a much more flexible design.

As I typically run a command like pixi add rust c-compiler cxx-compiler pkg-config ninja libarrow cmake and then manually edit the pixi.toml to use "*", it would indeed be quite convenient to also have a --pin=unconstrained or similar option.

@baszalmstra (Contributor)

After careful consideration of the feedback here, we conclude:

  1. The design will be to add a --pin flag to pixi add.
  2. In the future, the default value for this flag can be specified in the global configuration (which could also be overridden per project).
  3. The options are:
    • --pin * or --pin unconstrained or --no-pin: don't pin at all (>=2.1.3)
    • --pin major or --pin x: pin to the major version (>=2.1.3,<3)
    • --pin minor or --pin x.x: pin to the minor version (>=2.1.3,<2.2)
    • --pin last: pin to the last version segment specified (for 2.1.3 this is the same as minor; for 2.1 it equals major). This is currently the default.

Let us know what you think. Do you agree on the naming?

Feel free to pick this up! If we get around to it before then, we will assign someone to this issue.

@traversaro (Contributor)

I agree on the naming, thanks for the update.

@pavelzw (Contributor) commented Mar 4, 2024

Won't --pin * automatically expand to all files in the CWD in most shells?

Maybe --pin semver as a default, where the pinning behavior depends on whether the package is <1.0.0 or >=1.0.0?

I'm also wondering whether we might persist the choices in pixi.toml for a potential pixi upgrade...

@ruben-arts removed the needs-decision (Undecided if this should be done) label on Mar 4, 2024
@ruben-arts (Contributor)

The --pin * is a good point 😄 we could offer only --pin unconstrained and --no-pin.

Do you mean semver as a replacement for the last name, or would it actually change the logic?

I think a potential pixi upgrade should support the same --pin option, but let's not add it yet, as we want this to be user configuration for now.

@pavelzw (Contributor) commented Mar 4, 2024

Do you mean semver as a replacement for the last name, or would it actually change the logic?

I would prefer also changing the logic.

❯ pixi add fastapi
✔ Added fastapi
❯ pixi add pydantic
✔ Added pydantic
❯ pixi add custom-dep
✔ Added custom-dep
❯ cat pixi.toml
[project]
# ...

[dependencies]
pydantic = ">=2.6.3,<3"
fastapi = ">=0.110.0,<0.111"
custom-dep = ">=0.0.5,<0.0.6"

Notice that pydantic has 3 as its upper bound (because it's >=1.0.0) and fastapi has 0.111 as its upper bound (because it's <1.0.0). I personally would prefer using the next version that might contain breaking changes as the upper bound.

@ruben-arts (Contributor)

I see, that sounds like a good idea!

@adriendelsalle

Awesome! Thanks a lot!

@pavelzw (Contributor) commented Jun 13, 2024

I think one's preferred pinning style should be configurable in the global configuration file.

# config.toml
default-pin = "semver" # or major, minor, exact, unconstrained
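
With a setting like that (hypothetical, since the option doesn't exist yet), pixi add would apply the configured style automatically, e.g. with default-pin = "semver":

❯ pixi add pydantic   # pydantic is >=1.0.0, so pin below the next major
# pixi.toml: pydantic = ">=2.6.3,<3"
❯ pixi add fastapi    # fastapi is <1.0.0, so pin below the next minor
# pixi.toml: fastapi = ">=0.110.0,<0.111"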

@ruben-arts (Contributor)

Yeah, exactly! I'd like that as well! @baszalmstra is working on pixi add; after that, it should be an easy addition.

@tdejager (Contributor) commented Jun 18, 2024

Re-opening; I missed the rest of the discussion here before, and multiple features are being discussed.

@Hofer-Julian (Contributor)

Closing in favor of #1562 and #1516
