Implement "hook" support for package signature verification. #1035
Thanks for this ticket! Immediate thoughts are that I think package signing at this stage in the game is premature, as there are other avenues of very serious attacks available... However, the proposed system is not really related strictly to signing. It could also be used to implement something like https://pypi.python.org/pypi/peep. So personally I'm going to think about this ticket a little bit beforehand to figure out if I believe it's going to provide a useful feature without serious shortcomings in the near term.
This has another useful purpose too: companies or organizations could use it to disallow installing items that haven't been through a security audit or license review or what have you. For instance, OpenStack could potentially use it to help ensure that an unapproved dependency isn't added.
So would a use case be something like verifying a dependency graph of packages' checksums and metadata?
I'm pretty interested in implementing this. Should I knock up a strawman Pull Request?
So I've thought about this some more, and it's really started to grow on me. Some thoughts on what I'd like to see: I think the hook should be a Python hook, that allows us to pass data about the thing we are trying to install into the hook easily, and receive more complex return types than pass/fail. If someone just wants to call a command, the subprocess module is simple to use, so the Python portion of the hook in that case would be a small shim. I think there need to be more return types than Pass/Fail. In my mind there are four distinct return values: Pass, Warn, Retry, Fail. The definitions of them (again in my mind) would be: Pass: The installation looks fine, go ahead and install it. At least that's what I think :) I'd love a PR that implements this hook feature.
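A minimal sketch of what a hook along these lines might look like. The names (VerifyResult, verify_hook, the checker path) are illustrative only, not an existing pip API, and the meanings attached to Warn/Retry/Fail are one possible reading of the proposal:

import enum
import subprocess

class VerifyResult(enum.Enum):
    PASS = "pass"    # the installation looks fine, go ahead and install it
    WARN = "warn"    # install, but surface a warning to the user (assumed meaning)
    RETRY = "retry"  # e.g. re-fetch or try another source (assumed meaning)
    FAIL = "fail"    # refuse to install (assumed meaning)

def verify_hook(dist_path, metadata):
    """Example hook: delegate to an external checker via a small subprocess shim.

    dist_path is the downloaded archive; metadata is whatever dict of
    information pip chose to pass in (name, version, URL, hashes, ...).
    """
    proc = subprocess.run(["/usr/local/bin/my-package-checker", dist_path, metadata["name"]])
    if proc.returncode == 0:
        return VerifyResult.PASS
    if proc.returncode == 2:
        return VerifyResult.WARN
    return VerifyResult.FAIL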
What are the
Pip package lists are specified as requirement specifiers in requirements.txt files. So, in order to verify a list (a topologically sorted dependency graph) of Python packages required for an environment, it is (or will be) necessary to determine the path to the
I don't think it should be shelling out to an executable by default. It should call a Python function as a hook and use a Python return value. If people want their particular instance of the hook to shell out, that's a simple Python wrapper that they can write to shell out on their own.
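For contrast, a hook that stays entirely in Python and returns a Python value can be just as small (purely illustrative; the allow-list and hook signature are made up for the example):

import hashlib

# Hypothetical allow-list of artifact digests the user has vetted out of band.
ALLOWED_SHA256 = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def verify_hook(dist_path, metadata):
    """Pure-Python hook: accept the download only if its sha256 is pre-approved."""
    with open(dist_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in ALLOWED_SHA256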
... http://pythonhosted.org/distlib/tutorial.html#verifying-signatures
Distlib's Signature support is inherently broken. You cannot just pipe out to GPG and trust whatever keys are in the trustdb. Just because you trust me for X does not mean you trust me for Y.
So remove
http://stevedore.readthedocs.org/en/latest/ may be useful for adding hooks / plugins / extension points and/or as a reference for [setuptools entry_point configuration]
Mercurial hooks and extensions pass something like a context
How and when should I sanitize this input? What is the best way to specify the command arguments?
From a shell script, is there then a way to differentiate between failed and sig-check-failed?
I would be in favor of either and/or both:
... These were the stevedore documentation links I was looking for:
Is the package signature hook called for .zip, .egg, and .whl packages AND for editable distributions? There are new metadata attributes for package source locations.
How so? You can specify the keystore to use. If necessary to support a potentially different keystore for each file, this could be accommodated via an extra argument to the
Because throwing cryptography at a problem without providing a solution to the actual problem doesn't do anything. Your solution uses GPG; GPG has a built-in trust model which doesn't work for PyPI-style packaging where it's a free-for-all. The GPG web of trust validates identity, but it doesn't validate that a person is allowed to sign for a particular file. You say that you can just point to a different trustdb in that case, but that still doesn't solve the underlying problem of how something gets into the trustdb to begin with. Implementing package signing needs to start with a proper trust model; just slapping some crypto on top of it doesn't solve the problem.
I see what you mean, but how something gets into the trust database is not really up to
Keys you trust for what?
A key you trust to verify the signature of a specific package you downloaded. This will be the package publisher's public key (the corresponding private key having been used by the publisher to sign the package you downloaded), which you will have obtained through some trusted channel (so that you know the key belongs to the publisher, rather than someone claiming to be the publisher). This is easier said than done, but certainly doable for specific packages and publishers, with their cooperation.
Ok, so what's the mechanism of specifying that a certain key is only trusted for a certain package?
So the trust model must include a mechanism for specifying which keys are valid for which packages?
See, this is the entirety of the hard part of the problem domain, but you've neatly tucked it away in a single sentence. Actual signing and verifying has been easy for the past decade. It's so mechanically easy it's hardly worth implementing (and possibly even dangerous to do so, as you may give users a false sense of security) until you have a rigorous design for problem number 1: how do I get keys for people I trust, and how do I decide what the heck I trust them with, and when, and for what. I'd liken implementing package signing and verification without a well-thought-out identity, ownership and trust model overlying it to implementing SSL in a browser without a PKI or certificate verification.
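For what it's worth, the smallest possible version of the mechanism asked about above is a user-maintained map from project name to the key fingerprints trusted for that project, consulted by whatever verification hook is in use. Purely illustrative; nothing like this exists in pip, and the fingerprints are placeholders:

# Hypothetical per-project trust map, maintained by the user out of band.
TRUSTED_SIGNERS = {
    "requests": {"0123456789ABCDEF0123456789ABCDEF01234567"},
    "django": {"89ABCDEF0123456789ABCDEF0123456789ABCDEF"},
}

def signer_allowed(project_name, fingerprint):
    """Return True only if the fingerprint is explicitly trusted for this project."""
    allowed = TRUSTED_SIGNERS.get(project_name.lower(), set())
    return fingerprint.upper() in allowed

The hard parts called out above (how entries get into this map, and how they stay current) are exactly what such a sketch does not answer.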
Is this a signed graph with typed edges? With SSL, certs are tied to DNS (technically "Common Name") identifiers. Not all packages are on PyPI, so a PyPI URN wouldn't solve for as many cases as just mapping keys to package URIs with 'types' (or 'roles'?): {"committer", [...], "-er"}. To me, this seems like a useful metadata requirement to impose upon software project teams. Could such "Key <-> Package mappings" metadata be inlined (topologically) with checksums in
[EDIT]
Why write a screed when a short sentence will do? This has been discussed elsewhere many times.
Well, I've implemented it for my own use, and others can use that implementation or not, just as they choose :-)
You're saying you shouldn't provide a solution for some people unless you provide a solution for everyone? I don't agree with this argument - it's a bit like saying PKI shouldn't have been invented at all, or that C shouldn't have been invented until the problem of buffer overflow exploits was solved ;-) There are scenarios where one can obtain and use trusted keys, and I have used PKI and GnuPG successfully in such scenarios. And a "false sense of security" can even bite seasoned security pros - just look at all the exploits around SSL - but that doesn't mean we should have nothing in its place.
It's worth noting that the complexity of the trust problem for package distribution is the main reason http://www.python.org/dev/peps/pep-0458/ and "The Update Framework" itself exist. In relation to the idea of "implement a hook that assumes an already verified GPG trust DB", well, that's the same reason I signed off on Daniel's embedded signature support in PEP 427 - he had a constrained environment where he wanted to use that feature, and it was easy enough for everyone else to just ignore. Same goes for folks that have sorted out their GPG trust issues. As far as this issue goes, +1 from me for the notion of making the verification step pluggable - we just need to be careful how those plugins get configured, because indirect attack vectors are always fun for all involved :)
As there is no native support available, I am using a workaround based on "Verifying PyPI and Conda Packages" for my packages. Examples: yaml4rst, hlc
I'm going to close this; I don't think we're going to implement it (nor do I think we want to implement it), and TUF will provide a better mechanism for signed packages once that is implemented.
@dstufft We really need a feature like that nowadays. As you might have noticed, multiple websites get compromised - HandBrake being one example. Users need to be able to verify the source via GPG to ensure no modifications in transit or on the server were made. This is especially important as a lot of users use pip to download their Python modules, simply because they are not available on the operating system, or just because lots of Google posts suggest this. Especially because most of them suggest to install via
Please add an option for GPG verification and also suggest that the user verify the source if signatures are available (and display the fingerprint to the user).
It's almost certain there is not going to be an option to verify GPG signatures within pip. GPG signatures are practically worthless on their own unless you have a trust model (and the built-in web of trust is not good enough), and any effort that goes into implementing a trust model around GPG that works for us would be better spent implementing TUF.
@dstufft you specify the trusted key in the install command as written above. And the website that requires installing those deps will also list the fingerprints of the signed sources. Then pip compares the fingerprints provided on the PyPI server with the command line. This way a PyPI server-side hack will be noticed. This is a general problem of crypto. But you can't excuse doing nothing with the statement that it's not 100% failsafe and therefore GPG is not usable. It's the best and only real solution we have to verify sources. And if you make it not too complicated for the use cases above, it's a fairly simple process.
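To make the workflow being described concrete (this is not an existing pip feature; the helper below is hypothetical): gpg verifies the detached signature and reports the signing key's fingerprint on its status interface, and the installer would only proceed if that fingerprint matches one the user supplied on the command line.

import subprocess

def signed_by_trusted_key(artifact, signature, trusted_fingerprints):
    """Verify a detached signature with gpg and check the signer against a user-supplied allow-list."""
    proc = subprocess.run(
        ["gpg", "--status-fd", "1", "--verify", signature, artifact],
        capture_output=True, text=True,
    )
    if proc.returncode != 0:
        return False
    for line in proc.stdout.splitlines():
        # gpg emits "[GNUPG:] VALIDSIG <fingerprint> ..." for a good signature.
        if line.startswith("[GNUPG:] VALIDSIG"):
            fingerprint = line.split()[2]
            return fingerprint.upper() in {f.upper() for f in trusted_fingerprints}
    return False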
- "GPG signing - how does that really work with PyPI?"
pypa/twine#157
@NicoHood We're thoroughly skeptical of claims that this is in high demand or a major end user security concern, as we have zero commercial pip redistributors reporting sufficient customer demand for them to invest engineering time in improving the security model of the tooling. Instead, they either cache the published hashes, or cache entire artifacts, such that PyPI compromises after the initial release won't have any impact on them and their customers.
Similarly, publishers can detect any such post-publication compromises for themselves by maintaining a list of previously published hashes, and checking them against what PyPI is providing (or what redistributors are providing, for that matter - assuming they're republishing unmodified sources without applying any downstream patches).
Signatures are only useful as a way of verifying publishers, and GPG has no trust model to enable that in a useful form for an open platform like PyPI (this isn't like a Linux distro where you'd just be trusting the GPG key used in the distro's build system).
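As an aside, the publisher-side check mentioned above needs nothing more than the public PyPI JSON API and a locally kept record of the digests that were originally uploaded. A rough sketch (the structure of the local record is an assumption for the example):

import json
import urllib.request

def check_published_hashes(project, recorded_path):
    """Compare PyPI's current sha256 digests for a project against a locally recorded
    {filename: sha256} mapping captured at release time; return mismatching filenames."""
    with open(recorded_path) as f:
        recorded = json.load(f)
    with urllib.request.urlopen("https://pypi.org/pypi/%s/json" % project) as resp:
        data = json.load(resp)
    mismatches = []
    for release_files in data["releases"].values():
        for item in release_files:
            name, digest = item["filename"], item["digests"]["sha256"]
            if name in recorded and recorded[name] != digest:
                mismatches.append(name)
    return mismatches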
Is there demand for end-to-end security in a continuous deployment workflow?
...
OS Packages
What could solve for this?
"signature": {
"type": ["MerkleProof2017", "Extension"],
"merkleRoot": "68f3ede17fdb67ffd4a5164b5687a71f9fbb68da803b803935720f2aa38f7728",
"targetHash": "c9ead76a54426b4ce4899bb921e48f5b55ea7592e5cee4460c86ebf4698ac3a6",
"proof": [{
"right": "7fef060cb17614fdfddd8c558e102fbb96433f5281e96c80f805459773e51163"
}],
"anchors": [{
"sourceId": "8623beadbc7877a9e20fb7f83eda6c1a1fc350171f0714ff6c6c4054018eb54d",
"type": "BTCOpReturn"
}]
}
@westurner You've been warned multiple times on multiple projects not to post random link dumps into tracker issues (and elsewhere). Please voluntarily refrain from doing so, so it doesn't need to escalate to another block. |
Excuse yourself. I am offended.
You have not presented a solution. Nor have you assisted other conversation participants with this type of security.
A signed ACL list in a DHT would certainly solve a need for cryptographic signatures here. (Where, again, GPG does not solve for authorization.)
@westurner We've had a defined technical solution to this problem for years, and Donald referred to it above: The Update Framework. The details are covered in two PEPs:
- PEP 458 -- Surviving a Compromise of PyPI
- PEP 480 -- Surviving a Compromise of PyPI: The Maximum Security Model
This was also one of the key points of concern I raised in my overview of the state of Python packaging last year: http://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#making-pypi-security-independent-of-ssl-tls
It is not a technical problem now, and hasn't been since those PEPs were written. Throwing more technical ideas or evidence of unfunded demand at the PyPA developers does nothing to advance the situation. Instead, it's a funding and sustainability problem that requires folks either to lobby commercial redistributors to tackle this problem comprehensively on behalf of their customers, or else to make the case for why the PSF should fund this when vendors with a strong reputation for handling open source security management concerns on behalf of their customers decline to do so. Either way, the PyPA developers are not the right people to be directing any advocacy towards.
So, with TUF, IIUC:
- that's centralized PKI
- pypi is then the SPOF?
- "ACL list" ~= root.json
- I would suggest ld-signatures as a future-proof standard for JSON document signing.
- was this written by the same person who chose to write Warehouse in Pyramid? Thanks.
I believe this is the correct issue in which to discuss this (and other out-of-band ways of validating software packages) because the question is specifically requesting a way to verify (signed) hashes.
> pypi is then the SPOF
Can I use TUF with devpi instead of pypi/warehouse? (With centralized PKI)
Each repository is responsible for its own security, so if you're using PyPI, then packages installed from PyPI derive their trust from a PyPI-specific set of root keys. If you're using DevPI it will be up to DevPI to support TUF with its own instance-specific set of root keys. DevPI would/could validate the trust from PyPI before mirroring it onto DevPI and signing it itself.
This issue should be re-opened. I'm not asking for the system to be perfect. I will download the GPG public keys for the packages I want to be able to install via pip. I simply want pip to only allow installation of packages that match those signatures. If someone changes the key (or removes it), it's my problem to figure out if the key was legitimately changed or if someone compromised the package. It's really no different than what I do for deb repos, for example.
@rhuddleston If you're willing to trust the GPG key management practices of arbitrary publishers, then it's already entirely feasible to implement your own pip wrapper that adds the check you're seeking. You don't need anyone's permission for that, and you certainly don't need to wait for hook support in the official pip client. (As a previous example of something like this, checking downloads against previously recorded hashes started out as a
But we're not going to recommend GPG as a general measure, because the web of trust model doesn't scale adequately for an open publishing platform with arbitrary publishers: it relies on the assumption that the signing keys are managed securely, and we simply don't agree that that's a well-founded assumption in the context of PyPI.
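For anyone who does want to go this route, a rough sketch of such a wrapper, assuming the publisher ships a detached .asc signature for each artifact and that you have already imported the publisher's key into a dedicated keyring (the URLs, paths and function name are illustrative):

import glob
import subprocess
import sys
import tempfile
import urllib.request

def install_verified(requirement, asc_url, keyring):
    """Download a single distribution with pip, verify its detached GPG signature, then install it."""
    with tempfile.TemporaryDirectory() as tmp:
        # 1. Download (but do not install) the distribution.
        subprocess.run([sys.executable, "-m", "pip", "download",
                        "--no-deps", "--dest", tmp, requirement], check=True)
        artifact = glob.glob(tmp + "/*")[0]  # a single file is expected for one requirement

        # 2. Fetch the publisher's detached signature over a channel you trust.
        sig_path = artifact + ".asc"
        urllib.request.urlretrieve(asc_url, sig_path)

        # 3. Verify against a keyring containing only keys you explicitly imported.
        subprocess.run(["gpg", "--no-default-keyring", "--keyring", keyring,
                        "--verify", sig_path, artifact], check=True)

        # 4. Install only if verification succeeded (check=True raised otherwise).
        subprocess.run([sys.executable, "-m", "pip", "install",
                        "--no-deps", "--no-index", artifact], check=True)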
So instead of using (the not perfect) GPG you simply leave it as it is without any kind of verification? |
No, we use the only verification we can currently meaningfully offer:
- hash checking to ensure that previously downloaded artifacts don't change
- completely out-of-band signature checking that bypasses PyPI and the PyPA tooling entirely (as if you genuinely don't trust the PyPI admins, you can't trust any package signatures that PyPI publishes, nor any signature checking tools obtained from PyPI).
Unlike Linux distros, where GPG signatures provide assurance that the software you're installing was actually published by the distro, GPG signatures provide no meaningful assurance in the context of an open publication platform like PyPI - believing they do is only possible in the absence of clearly defined threat modelling that identifies the actors and actions you're aiming to defend against, and the kinds of trust you're aiming to enable. It is possible to create a trust management system that would meaningfully improve the state of PyPI security by reducing the reliance on the HTTPS CA system for delivery assurance (see the links to PEP 458 and PEP 480 above), but "just add GPG!" isn't it.
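For reference, the first of those is already exposed to end users as pip's hash-checking mode: pin the expected digest in a requirements file and pip refuses anything that does not match. A minimal illustration (the digest shown is a placeholder, not a real value):

requests==2.18.4 --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000

installed with pip install --require-hashes -r requirements.txt (hash checking is also enabled automatically as soon as any requirement in the file carries a --hash option).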
I disagree that GPG signature checking is any more useless than distro GPG signature checking.
Should signing keys be distributed over a different channel than the packages (HTTPS (TLS/SSL))? YES. Other channels for GPG key distribution:
- blockcerts (JSON-LD ld-signatures)
- GPG keyserver protocol
- SSH
There needs to be a way to specify which keys are valid for which package, for both PyPI and distros.
How would providing a pip option to fail installation if GPG keys are absent/invalid provide any more of a false sense of security than failing if hashes don't match previously-admitted hashes (like peep)?
To be clear, if end users correctly managed a trust store that mapped project names to GPG keys, then it would be fine, and that would add an additional layer of security over what currently exists.
The issue is ultimately one of impact. Due to differences in the distro vs PyPI/pip case, we do not currently have the mechanism in place to automatically map projects to GPG keys, which means that end users will be responsible for doing this themselves. It is my opinion that the vast bulk of people will simply not bother, and thus we will have added this feature for little benefit except for a minority of users.
Now, one could argue that adding a feature that a user can ignore doesn't cost them anything -- but in my opinion it does. It adds additional overhead in the things they need to understand in order to actually use pip, more things they need to weed through. On the maintenance side it also adds additional complexity, which means that it's harder to test and develop and maintain pip in the long run, particularly for something that we're pretty sure we're not going to be using.
The other problem here is an ecosystem one. By providing a way to validate GPG keys we're implicitly telling people that they should be signing their packages with GPG; however, we're already pretty sure that we're not going to be using that, so it is effectively going to be making work for people that they're going to want to throw away at some point.
So what of GPG-signed distro repacks?
I think that is a very important point. We do not need to create busywork for FOSS maintainers if it does not really improve things.
The reason GPG signing is effective in the Linux distro case is that the main purpose of it is for the publishers of the distro itself to ensure the integrity of the link from the distro's build system to end user installations of the distro, even when that link traverses untrusted systems like public mirrors and the internet: the publishing system and the consumption system are controlled by the same entity, and you have to go through some form of review process to get access to the publishing end. The meaningful assurances of trustworthiness then come from the combination of GPG content signing and pre-publication review and publisher key management, not the content signing alone. (The Linux distros also take care of ensuring that GPG key management infrastructure is available and working for both publishers and consumers, whereas 'GPG will already be available and working' is an entirely invalid assumption on non-Linux systems.)
Aside from the UX train wreck that is attempting to set up GPG signature checking on non-Linux systems, the key architectural differences in the PyPI case are that there is no pre-publication review process, and no standardised process for publisher key management. Adding only the GPG content signing part without addressing either of those aspects thus becomes purely a matter of security theatre, adding minimal value beyond the link integrity protection offered by HTTPS.
The lack of end-to-end signing support (outside the embedded signature support in the wheel file format) does mean that both the PyPI admins and the Fastly CDN admins constitute an "insider threat" for all consumers of content from PyPI. Now, it is possible for us to design and develop a system to inherently neutralise that threat (and PEP 480 describes one such system), but it's also possible to neutralise it through less mathematically sophisticated methods, like folks publishing expected artifact hashes through an independent registry, and publishers explicitly checking that the artifacts that PyPI publishes are the ones they uploaded.
However, effectively designing such a system requires people to actually define and document the threat model they're attempting to defend against, and choose the appropriate tools and techniques to provide the greatest increase in integrity assurance at the lowest cost in time and effort for publishers, infrastructure maintainers, and end users, rather than simply assuming that because a particular technique (i.e. GPG content signing) works well in the context of a Linux distribution, that same technique will be able to provide meaningful assurances in the context of an open publication platform like PyPI.
GPG checks are much better than checksums. It tells me that the person who published this package continues to be the same person that published it since I first started using the package. Even if just a minority of people downloaded the keys via different channels and only installed packages correctly signed with specific signatures, that minority (e.g. security professionals) would notice when a popular package's keys change and would investigate. Basically it's a second factor. For example, if I managed to log in to someone's PyPI account now for a popular package, I could replace this package with my slightly modified version that includes an extra backdoor or malware and no one would easily notice. The thinking that GPG is not perfect so it's better to have nothing is misguided. For example, that is what happened here: conda/conda#1395
It seems there has been discussion about including TUF since 2011, yet are there any imminent plans for when this might be released? If there is a commitment to implement TUF, are there any major roadblocks to getting this done?
Also, another thing with TUF is it's not easy for many people to understand. The only place I've seen TUF implemented is with Notary and Docker. For example, all "official" repos on Docker Hub are signed and anyone can use DOCKER_CONTENT_TRUST=1 to pull and validate these images. On the other hand, almost no one else signs images when uploading to Docker Hub. I think a big part of this is because it's a lot more complicated than GPG to set up and use. I'm hoping to create some easy step-by-step articles to make this easier so more people will use it.
I worry that even if TUF is implemented on PyPI, if it's too difficult for developers to use then no one will bother. If I build a perfect security system and then no one uses it, then overall we will not be better off. For example, if we can get a majority of maintainers to add GPG signatures, that will be a big improvement. If you required GPG or TUF before someone is allowed to publish to PyPI, this could be a big advantage. When TUF is implemented on PyPI, will this be a requirement? How can we make sure this is a simple process so developers won't find it to be a burden? Basically, it will need to be easier than Docker Hub, if that is the indication of adoption rate.
On 6 October 2017, rhuddleston wrote:

> gpg checks are much better than checksums. It tells me that the person who published this package continues to be the same person that published it […] For example if I managed to login into someone's pypi account now for a popular package I could now replace this package with my slightly modified version that includes an extra backdoor or malware no one would easily notice.

And if we were to support distributing signing keys through PyPI, then you'd just post a "Hey, I lost the old key, here's my new key" notice.

In both cases, the real world defence is the same: the *publisher* notices that someone that wasn't them made a new release, alerts the PyPI admins, the malicious package gets taken down, and a security notification gets issued by the PSRT. (The prohibition on replacing existing versions with artifacts that have a different hash already prevents silent replacement of previous releases.)

Adding 2FA support so publishers' accounts are harder to compromise in the first place will also be a useful hardening mechanism, and is one of the items high on the post-Warehouse-migration todo list.

> The thinking that GPG is not perfect so it's better to have nothing is misguided.

No, that's not the thinking. The thinking is that the folks who claim to see meaningful value in this approach are refusing to invest the time & energy to build it themselves (or pay a vendor to build & operate it for them), so they clearly don't actually care all that much - they just want to claim that someone else is preventing them from doing that work, rather than closely examining their real reasons for not wanting to build it.

Remember, the PyPA devs and PyPI administrators are potentially part of your threat model when it comes to end-to-end signing, so it's necessary to make sure that a single rogue admin or developer can't easily compromise whatever you decide to set up. That's a lot easier to do if the security mechanism is independent of the services we operate and the tools we provide. (This is in stark contrast to the Linux distro model, where GPG signatures are used to protect the distro->user link, nothing more, with each distro relying on its own internal mechanisms to ensure that what gets signed is what they intended to publish.)

> For example that is what happened here conda/conda#1395

Making misleading security claims is indeed worse than not making any security claims at all.

> It seems there has been discussion about including TUF since 2011 yet is there any imminent plans of when this might be released? If there is a commitment to implement TUF are there any major roadblocks to getting this done?

I last summarised the major roadblocks to that here: http://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#making-pypi-security-independent-of-ssl-tls

Mozilla have now provided funding for the legacy PyPI shutdown effort through a foundational MOSS grant, and the PSF staff will also be actively working on more sustainable operational funding models for PyPI as part of that effort.

> Also another thing with TUF is it's not easy for many people to understand. The only place I've seen TUF implemented is with notary and docker. […] I think a big part of this is because it's a lot more complicated than gpg to set up and use.

GPG is also incredibly complicated to set up and use, especially on Windows.

In most cases though, publishers aren't bothering with content signing on DockerHub for the same reason they don't bother with it on PyPI: because they don't personally care, and nobody else cares enough about it to pay them to care.

As a result, the only places we've seen signing work in practice is when there's a trusted intermediary acting as a signing authority (e.g. Linux distros, verified images on container image registries, Microsoft backed driver & installer signatures for Windows, the various mobile and desktop app stores). Even HTTPS relies on that trusted intermediary approach (by way of browser and operating system CA certificate bundles).

> I worry that even if TUF is implemented on pypi that if it's too difficult for developers to use then no one will bother. […] If you required gpg or TUF before someone is allowed to publish to pypi this could be a big advantage.

Have you even *read* PEPs 458 and 480? The challenge of publisher adoption is discussed explicitly, and is in fact the entire reason there are two distinct PEPs (since we expect real-world adoption of the publisher-dependent end-to-end variant described in PEP 480 to be so close to zero as to be barely worth implementing).
Synopsis
Some people want package signature verification during their pip installs. Other people think relying on authenticated package repository connections (such as over TLS) is sufficient for their needs.
Of those who want package signature verification, there is disagreement about how to tell pip which signatures to trust (and how users will manage package-signing public keys).
Rationale
The rationale for this ticket is to provide a mechanism in mainline pip for signature verification enthusiasts to experiment with different approaches. If a particular approach becomes popular, pip could consider incorporating that particular approach.
In the meantime, rather than have endless committee-style arguments about how to do package verification, we should have a system that lets users choose for themselves, but only if they opt in.
Also, it keeps package verification cleanly separate from the pip codebase.
Criteria
This ticket may be marked as wontfix, or some other status to indicate that the pip developers reject this proposal.
This ticket may be marked closed only when these conditions are met:
Implementation Details
I prefer a hook API where pip's config file specifies a path to an executable. The inputs are passed as command-line arguments to a subprocess which invokes that command. The hook's stdout & stderr are the same as the parent pip process. The exit status is 0 to indicate "accept package" and non-zero to indicate "reject package".
- but I'd be happy with any system that fulfills the Criteria above.
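As a sketch of the executable-hook idea (the hook path is made up, and the configuration key that would point at it does not exist in pip today), the hook itself could be as simple as a script that exits 0 to accept and non-zero to reject:

#!/usr/bin/env python3
# Example hook executable: accept a package only if a detached .asc signature
# exists next to it and gpg verifies it against the user's keyring.
import subprocess
import sys

def main():
    package_path = sys.argv[1]              # path to the downloaded distribution
    signature_path = package_path + ".asc"  # assumed location of the detached signature
    result = subprocess.run(["gpg", "--verify", signature_path, package_path])
    return 0 if result.returncode == 0 else 1  # 0 = accept, non-zero = reject

if __name__ == "__main__":
    sys.exit(main())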
Related Issues
Note, there is a less-well-specified ticket #425. I made this ticket because the vagueness of that ticket makes it difficult to close. (Is #425 satisfied by TLS authentication to package repositories based on a standard OS or user trust root? Does it imply or require package signature verification?)