Updated dependency to pin everything w/ pip-compile #82
Conversation
Okay, it actually seems not to be too easy for different Python versions :) I just updated to use requirements.in for the actions. Here's the thread from pip-compile on the topic: jazzband/pip-tools#639
There is also a paragraph here
I like the general method to get reproducible builds, but I am not so sure how practical it is if other packages in your environment have other versions of a package pinned. But it is still way better to provide a tested build than to assume that everything is fine with the newest versions (see #80)
After thinking a bit more about it, I am all for it, and we should use pip-compile to generate known working environments for each Python version. We can still point to the requirements.in without any guarantee that it will work. The main branch can continue to be tested with current versions, and we will pin dependencies at fixed points, most likely every minor release.
Yes, I just found out that we can ditch the requirements.in completely, as pip-compile can also get the requirements straight from setup.cfg. I'm still figuring out how to deal with the extra requirements (which is just plot for us), because I think it would be nice to have three requirements files, one for dev (with testing etc.), maybe named
Edit: Ok, just found the
The last remaining question for me is how to easily generate these requirements files. We could either just write a bash script for doing it or use tox (which seems to be a really nice tool, but I'm still figuring out how to use it properly). Tox also gives us easy access to different Python versions, so we could run the tests for any version locally (... and then use tox in the action to reproduce the exact same environment 😎). Here's a nice tutorial on tox.
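A tox-based variant of this could look roughly like the following config fragment (env names, file names, and the exact Python versions are assumptions for illustration; `{envname}` is tox's built-in substitution for the environment name):

```ini
# Hypothetical tox.ini sketch: one "pin" run per supported interpreter.
[tox]
envlist = py38, py39

# Generate a pinned requirements file for whichever interpreter runs it:
[testenv:pin]
deps = pip-tools
commands = pip-compile setup.cfg --output-file requirements-{envname}.txt
```

Running this under each interpreter would leave one pinned file per Python version in the repo.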
Some more thoughts on requirements for different Python versions: the requirements file will mostly be used by people developing, or for reproducing an exact environment (i.e. in docker). Therefore, I would not feel too bad about only supporting one or maybe two of our main Python versions there.
Tox seems very nice for the job. Or we just run a manual GitHub action with the respective Python version and do a pip-compile.
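The manual-action idea could be sketched like this (workflow name, version matrix, and action versions are assumptions; `workflow_dispatch` is GitHub's trigger for manually started workflows):

```yaml
# Hypothetical manually triggered workflow that pins per Python version.
name: pin-requirements
on: workflow_dispatch
jobs:
  compile:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9"]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install pip-tools
      - run: pip-compile setup.cfg --output-file requirements-${{ matrix.python-version }}.txt
```

The matrix gives one pinned file per interpreter without needing tox at all.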
According to #72, I wanted to introduce better dependency management. Pipenv is released very inconsistently, and poetry seems to be a whole project management suite with publishing, env creation etc., which I found a little overboard. So I opted for the nice pip-compile method, which just creates a pinned requirements.txt from a requirements.in file (the requirements.in file is basically the previous requirements.txt file).
Here's the link to pip-tools.
@MarJMue Do you like this solution, too?