
Better way of downloading wheels for the final Debian packages #6

Open
kushaldas opened this issue Nov 2, 2018 · 5 comments

kushaldas commented Nov 2, 2018

For all of the Debian package building, we are not using any binary wheels from https://pypi.org. Instead, we download source tarballs from there, build binary wheels locally, and then use those in the final Debian packages.
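
As a rough sketch of the idea (not the exact commands in our tooling; the directories here are made up, and pytest==3.10.1 is just the example dependency mentioned later in this issue), pip can be told to fetch only the sdist and then build the wheel locally:

# fetch only the source tarball, never a prebuilt wheel
pip3 download --no-binary :all: --no-deps pytest==3.10.1 -d ./tarballs/
# build the wheel locally from that source tarball
pip3 wheel --no-deps ./tarballs/pytest-3.10.1.tar.gz -w ./localwheels/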

We will take securedrop-client as an example; the project is under the ~/code/securedrop-client directory.

Steps that need to be done before any packaging

1. Sync the wheels locally

In our main securedrop-debian-packaging repository, we can download all of the already-built wheels and sources with the following command (not yet merged).

make syncwheels
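
(Since that target is not merged yet, here is only a hedged sketch of what it could do, assuming it pulls from the same s3 bucket used later in this issue:)

# hypothetical sketch: pull the already-built wheels and sources from the bucket
mkdir -p localwheels
aws s3 sync s3://dev-bin.ops.securedrop.org/localwheels/ ./localwheels/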

2. Things required at the package/code level

From the securedrop-debian-packaging directory,

PKG_DIR=/home/user/code/securedrop-client make requirements

This will create the proper requirements.txt file in the project directory, along with the binary wheel hashes from our own PyPI.
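
For example, each entry in the generated requirements.txt pins an exact version together with the sha256 of our locally built wheel, roughly like this (the hash below is only a placeholder, not a real digest):

pytest==3.10.1 --hash=sha256:<sha256-of-our-locally-built-wheel>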

If we are missing any wheels from our cache/build/PyPI, it will let you know with a message like the following.

The following dependent wheel(s) are missing:
pytest==3.10.1

Please build the wheel by using the following command.
	PKG_DIR=/home/user/code/securedrop-client make build-wheels
Then sync the newly built wheels and sources to the s3 bucket.
Also update the index HTML files accordingly and sync to s3.
After these steps, please rerun the command again.

So, the next step is to build the wheels. To do this step, you will need the GPG keys of @kushaldas, @conorsch, @redshiftzero, and @emkll available to the same user, as the actual list of hashes will be signed by one of us.

PKG_DIR=/home/user/code/securedrop-client make build-wheels

The above command will let you know about any new wheels and sources. It will download sources from PyPI (verifying them against the sha256sums from the project's Pipfile.lock) and build the wheels.
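
As a hedged sketch of how that signature and hash verification could look by hand (the file names sha256sums.txt / sha256sums.txt.asc are assumptions, not necessarily what the Makefile uses):

# check the detached signature against one of the maintainers' GPG keys
gpg --verify sha256sums.txt.asc sha256sums.txt
# then check the downloaded tarballs and built wheels against the signed list
sha256sum -c sha256sums.txt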

python3 setup.py sdist

And then go to the very end of this comment for the Makefile target used to build it (example).

3. Sync the localwheels directory back to the s3 bucket (only if there is any update)

This has to be a manual step for security reasons. In the future, all of these wheel-building steps should be done on a separate system, not on the developer's laptop.

cd localwheels/
aws s3 sync . s3://dev-bin.ops.securedrop.org/localwheels/

This is an important step; we should sync any updated/new wheels into the s3 bucket.

Look out for the names of the sources and wheels in this step.
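
One easy way to see exactly which names would change is a dry run before the real sync (optional; --dryrun only prints the planned uploads without touching the bucket):

cd localwheels/
aws s3 sync . s3://dev-bin.ops.securedrop.org/localwheels/ --dryrun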

4. Update the index files for the bucket (no need before release)

If there is any new package (source/wheel), then we will have to update our index.

./scripts/createdirs.py ~/code/securedrop-proxy/requirements.txt

Then update the corresponding package's index.html.

If it is a new package, then update the main index.

./scripts/updateindex.py

Finally sync the index.

cd simple/
aws s3 sync . s3://dev-bin.ops.securedrop.org/simple/
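
To double-check the deployed index afterwards, you can fetch it over HTTPS and grep for the new package (a hypothetical check assuming the standard simple-index layout; it is not part of the Makefile):

# the package should be listed on the top-level index after the sync
curl -s https://dev-bin.ops.securedrop.org/simple/ | grep -i pytest
# and its own page should list the new wheel/source file names
curl -s https://dev-bin.ops.securedrop.org/simple/pytest/ | grep -i 'pytest-3.10.1'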

5. Build any of the Python-based Debian packages

PKG_PATH=~/code/securedrop-client/dist/securedrop-client-0.0.5.tar.gz PKG_VERSION=0.0.5 make securedrop-client
PKG_PATH=/home/user/code/securedrop-proxy/dist/securedrop-proxy-0.1.0.tar.gz PKG_VERSION=0.1.0 make securedrop-proxy

The final rules files should only use our index.

For pure development purposes, a developer can modify the rules file to download from upstream PyPI for testing on a system.
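
For example (just a sketch of the idea, not the actual debian/rules contents), the only difference is which index pip is pointed at:

# production rules: only our own index, with the pinned hashes
pip3 install --require-hashes -r requirements.txt --index-url https://dev-bin.ops.securedrop.org/simple
# local development/testing only: temporarily point at upstream PyPI instead
pip3 install -r requirements.txt --index-url https://pypi.org/simple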

Also, thanks to @msheiny, I managed to deploy https://dev-bin.ops.securedrop.org/simple ❤️

@kushaldas
Contributor Author

Here is a demo of the automation (related to freedomofpress/securedrop-builder#6), 13MB in size.

@kushaldas
Contributor Author

We can also remove all of the hash generation steps if we use our index in pipenv itself; see https://pipenv.readthedocs.io/en/latest/advanced/#specifying-package-indexes and https://pipenv.readthedocs.io/en/latest/advanced/#using-a-pypi-mirror
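
For instance (assuming the --pypi-mirror option described in those docs), every pipenv invocation could be pointed at our index:

# route pipenv through our own index instead of pypi.org
pipenv install --pypi-mirror https://dev-bin.ops.securedrop.org/simple
pipenv lock --pypi-mirror https://dev-bin.ops.securedrop.org/simple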

@sssoleileraaa

Hey, I'm new to the project, so I have some basic questions/follow-up comments after my meeting with @kushaldas today (sorry Kushal if you already answered any of these questions).

Questions:

For all of the Debian package building, we are not using any binary wheels from https://pypi.org. Instead, we download source tarballs from there, build binary wheels locally, and then use those in the final Debian packages.

Why don't we download the wheels from PyPI? Why do we instead download the source tarballs and build them locally?

make syncwheels

You mentioned this is not yet merged, and I don't think you mentioned this during our meeting. Is this a new step that we didn't need before, or is it replacing some other step(s)? If I recall correctly, it seemed like the first step was running make build-wheels, which copies the wheels and source distributions from PyPI, correct?

@kushaldas
Contributor Author

Why don't we download the wheels from PyPI? Why do we instead download the source tarballs and build them locally?

Because we can trust binary wheels that we built ourselves more than ones built by a third party (which could be any of the upstream developers).

You mentioned this is not yet merged, and I don't think you mentioned this during our meeting. Is this a new step that we didn't need before, or is it replacing some other step(s)? If I recall correctly, it seemed like the first step was running make build-wheels, which copies the wheels and source distributions from PyPI, correct?

Yes, the docs need more updates. The Makefile now has a fetch-wheels target for this.

@sssoleileraaa

Because we can trust binary wheels that we built ourselves more than ones built by a third party (which could be any of the upstream developers).

Ah, so even if we verified the wheels that we download from PyPI, we still can't be 100% sure that what we're verifying only contains the code from the source distribution. By building the code directly ourselves, on our own build system, we can see exactly how the build and deploy process is done. If that's what you mean!
