
v7 discussion #76

Closed
Keats opened this issue Feb 4, 2019 · 86 comments

Comments

@Keats
Owner

Keats commented Feb 4, 2019

There are quite a few changes happening in the PRs: more input formats for RSA (#69, #74), ECDSA signing & verification (#73), as well as some Validation changes.

I have already removed the iat check in the next branch since it isn't something that should be checked.

Right now, Validation::algorithms is a Vec. I don't remember why, but it shouldn't be the case: it should be an algorithm: Algorithm field instead. I will accept a PR for that or do it myself later.

#69 also adds a standalone verify_rsa fn; I'm wondering if we can somehow streamline it with the rest of the crate.

Since Rust doesn't have default parameters, we currently always have to pass a Validation struct. Maybe we can put the decoding onto this struct instead, so you get something like:

// currently
let token = decode::<Claims>(&token, "secret".as_ref(), &Validation::default())?;

// possible
// `JwtDecoder` has the same fields as the current `Validation`
let token = JwtDecoder::default().decode_hs256::<Claims>(&token, "secret".as_ref())?;

This way we don't have a single function that we try to fit every argument into, and the user has to select explicitly which decode fn to use. This solves the Vec of algorithms at the same time and allows better documentation for each. The downside is duplication of those functions/docs for the various digest sizes (decode_hs256, decode_hs384, decode_hs512 for each algo).
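
For illustration, a rough sketch of what that builder could look like (JwtDecoder and decode_hs256 are the hypothetical names from the snippet above; it just wraps the existing Validation and decode):

use jsonwebtoken::{decode, errors::Error, Algorithm, TokenData, Validation};
use serde::de::DeserializeOwned;

// Hypothetical sketch only: `JwtDecoder` is not part of the crate. It carries
// the same settings as `Validation` and exposes one method per algorithm, so
// the algorithm is picked by the method name instead of a Vec<Algorithm>.
#[derive(Default)]
pub struct JwtDecoder {
    validation: Validation,
}

impl JwtDecoder {
    pub fn decode_hs256<T: DeserializeOwned>(
        &self,
        token: &str,
        key: &[u8],
    ) -> Result<TokenData<T>, Error> {
        let mut validation = self.validation.clone();
        validation.algorithms = vec![Algorithm::HS256];
        // Delegates to the existing decode fn with today's (&str, &[u8], &Validation) signature.
        decode::<T>(token, key, &validation)
    }

    // decode_hs384, decode_hs512, decode_rs256, etc. would follow the same pattern.
}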

Any other ideas/criticisms/things missing?

ccing the people involved in various issues/PRs recently
@AaronFriel @jbg @Jake-Shadle @matthew-nichols-westtel @greglearns

@Jake-Shadle
Contributor

I only became aware of this lib yesterday because of the bug I was tracking down, and while #74 works, it's not what I would call clean.

So my one thought, having only looked at this code briefly, is that it might be nice to expose some helpers for deserializing the RSA key based on the user's knowledge of its format, and to pass that key down to the API rather than the current method of just trying both. The reason I suggest helpers instead of just passing the key in directly is that the latter would require the user to add 2 new dependencies, ring and untrusted; another option would of course be to expose the types and functions needed from those crates as part of the public API of this lib.
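
For illustration only, one shape such helpers could take (the names here are made up, not an existing API): the caller says which DER encoding they have, and the crate stops trying both:

// Hypothetical sketch: an opaque key type exposed by jsonwebtoken so the caller
// states the key format up front and never depends on ring or untrusted directly.
pub enum RsaKeyFormat {
    // PKCS#8 DER
    Pkcs8,
    // "raw" DER containing just the RSA key material
    RawDer,
}

pub struct RsaKey {
    format: RsaKeyFormat,
    der: Vec<u8>,
}

impl RsaKey {
    pub fn from_der(format: RsaKeyFormat, der: &[u8]) -> Self {
        RsaKey { format, der: der.to_vec() }
    }
}

// The verify path would then match on `format` internally instead of guessing.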

@briansmith
Contributor

In my future plans for ring, I expect the user will be able to create keys in ways other than reading them from byte arrays (files). For example, some users may want to generate a private key in memory and never serialize it. Or, we may support importing keys from the operating system where we never have the raw bytes of the key. And, I think once we have that functionality, people will want to use it with a library like this. So IMO it would be better to delegate all the key object creation to ring other than PEM parsing (since ring doesn't do PEM).

@briansmith
Contributor

In other words, rather than having a PKCS#8 parsing API in jsonwebtoken and a "raw DER" parsing API here, I recommend instead just having the user use ring's parsing API to construct a key and pass it into jsonwebtoken.

@Keats
Owner Author

Keats commented Feb 5, 2019

I recommend instead just having the user use ring's parsing API to construct a key and pass it into jsonwebtoken.

That would require users to add ring and untrusted to their list of direct dependencies, which isn't great in terms of UX.
Also, if we somehow decide to have an openssl backend in addition to ring, it would make things complicated.

@briansmith
Contributor

That would require users to add ring and untrusted to their list of direct dependencies, which isn't great in terms of UX.

In ring, we're gradually phasing out the use of untrusted in the API. Maybe that will be done in ring 0.15. I don't think it's problematic for people who want to use ring to use ring's API to construct the key. That seems much better to me than having this crate (and every crate that uses ring) duplicate ring's API to hide the fact that ring is used.

Also if we somehow decide to have an openssl backend in addition of ring it would make things complicated

I personally think that would hurt the value proposition of the crate and would be a lot of work, so I wouldn't do it. But obviously I'm biased.

@Keats
Owner Author

Keats commented Feb 6, 2019

I don't think it's problematic for people who want to use ring to use ring's API to construct the key. That seems much better to me than having this crate (and every crate that uses ring) duplicate ring's API to hide the fact that ring is used.

Right now I need to bump the major version of this crate every time ring gets a major version. If the symbols versioning PR is merged, I won't need to bump it all the time. However, if ring is exposed, I would have to bump on every breaking change even after that.

@manifest

manifest commented Feb 15, 2019

In other words, rather than having a PKCS#8 parsing API in jsonwebtoken, and a "raw DER" parsing API here, i recommend instead just have the user use ring's parsing API to construct a key and pass it into jsonwebtoken.

@briansmith In our case we have web services written in Rust, Erlang, Ruby and Go. They need to use the same key to verify access tokens in incoming requests. If I generate keys using ring, I'll get a keypair in PKCS#8 DER format instead of just a private key such as the one used, for instance, by the Erlang crypto library. I would need to parse the keypair in the applications, and that doesn't seem like a good idea to me. Sadly there is currently no way to use the same PKCS#8 DER key for all the applications; that's why we currently use openssl to generate PEM keys to be consumed by the Erlang apps and PKCS#8 DER (derived from the PEM keys) for the Rust apps.

Ideally, I would like to have a command line tool (because it just makes things easier for devops engineers) that allows generating keys in a common format. Maybe some day such a command line tool will be based on the ring project, if we make it possible.

@manifest

manifest commented Mar 9, 2019

@Keats it seems that it'll take some time to implement all the described features for v6. Can we have an intermediate release with ES256 support before that? Working with the branch creates a bunch of dependency-related issues.

@Keats
Owner Author

Keats commented Mar 9, 2019

An intermediate release would have to be a major one, sadly. However, I can probably merge that PR into the v6 branch, and you shouldn't have dependency issues at that point, no?

@manifest

manifest commented Mar 9, 2019

Merging into master without publishing a new version of the crate won't make a difference. The dependency issues I'm talking about exist because some other crates are using the current version of the jsonwebtoken crate.

@manifest

manifest commented Mar 9, 2019

An intermediate release would have to be a major one sadly

Maybe we can have v6 with ES256 support, and the other features in v7? I mean, the version is just a number.

@Keats
Owner Author

Keats commented Mar 14, 2019

Maybe we can have v6 with ES256 support, and the other features in v7? I mean, the version is just a number.

We could but it is annoying for end users to have frequent breaking changes :/

@manifest

We could but it is annoying for end users to have frequent breaking changes

It's always better to have releases available as soon as possible, in my opinion. You can always stay on a previous major release if you don't care about new features. It also shouldn't cause any problems if you're using semver properly.

@Keats
Owner Author

Keats commented Mar 22, 2019

I don't have the time to implement that right now, so let's get all the PRs into v6.
That is going to be messy, but at least it won't block people.

Keats changed the title from v6 discussion to v7 discussion on Mar 22, 2019
@Keats
Owner Author

Keats commented Mar 22, 2019

The next version is at #75 if people want to try it; I'm trying to get the other outstanding PRs merged into it.

@Keats
Owner Author

Keats commented May 25, 2019

The work on the next version has started in #91, thanks to @Jake-Shadle.

The plan is to add #69 and #87 to it (so 2 more key types: Modulo and Pem) and figure out why some decoding does not work (#90, #77) by adding more tests.

@Dowwie
Contributor

Dowwie commented Sep 12, 2019

@Jake-Shadle @Keats making this work with PEM would be really useful. I've thus far failed to use the openssl::Rsa PEM-to-DER helper functions to convert a public key PEM to DER and then make it work correctly with jsonwebtoken's decode.

@clarkezone

(noob question) How do I reference v7 in a consuming app's Cargo.toml?

@Keats
Owner Author

Keats commented Oct 17, 2019

See https://stackoverflow.com/questions/54196846/how-to-specify-a-certain-commit-in-dependencies-in-cargo-toml for that; just point to the latest commit in the v7 branch and you should be good to go.
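
For example, something along these lines in Cargo.toml (assuming you want to track the v7 branch; pin a rev instead if you want an exact commit):

[dependencies]
jsonwebtoken = { git = "https://github.com/Keats/jsonwebtoken", branch = "v7" }
# or pin a specific commit:
# jsonwebtoken = { git = "https://github.com/Keats/jsonwebtoken", rev = "<commit sha>" }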

@clarkezone

Agreed. It would be even nicer if it could load a JWK directly. I can't figure out how to supply the modulus and exponent from the JSON form; base64 decoding those strings fails for me.

@Keats
Owner Author

Keats commented Dec 28, 2019

I think I'll have some time tomorrow to work on it.

@Keats
Owner Author

Keats commented Dec 29, 2019

EncodingKey/DecodingKey are in #113

It would be great if some people could review it and see if it fits everyone's needs. It does look good from my pov.
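
For anyone reviewing, a minimal round-trip sketch of how the new key types read in practice (assuming an HMAC constructor along the lines of from_secret; see #113 for the actual API):

use jsonwebtoken::{decode, encode, DecodingKey, EncodingKey, Header, Validation};
use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize)]
struct Claims {
    sub: String,
    exp: usize,
}

fn roundtrip() -> Result<Claims, jsonwebtoken::errors::Error> {
    let claims = Claims { sub: "1234".to_owned(), exp: 2_000_000_000 };
    // The key types carry the secret/key material instead of passing raw bytes around.
    let token = encode(&Header::default(), &claims, &EncodingKey::from_secret(b"secret"))?;
    let data = decode::<Claims>(&token, &DecodingKey::from_secret(b"secret"), &Validation::default())?;
    Ok(data.claims)
}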

@Dowwie
Contributor

Dowwie commented Dec 30, 2019

@Keats I've commented in the PR

@Dowwie
Contributor

Dowwie commented Jan 3, 2020

@rib I think it would be helpful if Vincent got feedback from another user of the library other than just me. Could you try out the latest EncodingKey/DecodingKey PR and see if it suits? We had a valuable exchange in the PR comments that may help you.

@rib

rib commented Jan 3, 2020

I guess I'm kinda biased since I ended up coming up with a different design and am using that now ;-p

Taking a look though, one thing I'm wondering about is whether it would be good to have more rigorous checks that a Decode key constructed with an hmac secret vs an rsa key vs an elliptic curve key can only be used to decode a token with a matching algorithm. Internally most of the keys can be coerced via as_bytes(), so it looks like you could essentially interpret an rsa key as an hmac or ec secret. The same goes for the Encode keys; they all coerce internally to a [u8], so there's no type checking that a key is used with the intended kind of algorithm.

Another key constructor that I think would also be useful is for an hmac secret that's base64 encoded, instead of requiring the user to decode it first themselves. I've seen a number of other jwt libraries provide that convenience and have also found myself needing it.
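
To make both points concrete, a hypothetical sketch (names made up, not the PR's actual code): the key records which family it was built for, and there's a constructor for base64-encoded hmac secrets:

use jsonwebtoken::Algorithm;

#[derive(Debug, PartialEq)]
enum KeyFamily {
    Hmac,
    Rsa,
    Ec,
}

// The key remembers the family it was constructed for, so an RSA public key
// can never be reinterpreted as an HMAC secret at verification time.
struct TypedDecodingKey {
    family: KeyFamily,
    bytes: Vec<u8>,
}

impl TypedDecodingKey {
    fn from_hmac_secret(secret: &[u8]) -> Self {
        Self { family: KeyFamily::Hmac, bytes: secret.to_vec() }
    }

    // Convenience constructor for a base64-encoded hmac secret, so callers
    // don't have to decode it themselves first.
    fn from_base64_hmac_secret(secret: &str) -> Result<Self, base64::DecodeError> {
        Ok(Self::from_hmac_secret(&base64::decode(secret)?))
    }

    fn from_rsa_der(der: &[u8]) -> Self {
        Self { family: KeyFamily::Rsa, bytes: der.to_vec() }
    }

    // Called before any cryptographic work: the requested algorithm must match
    // the family this key was constructed for.
    fn check(&self, alg: Algorithm) -> Result<(), String> {
        let expected = match alg {
            Algorithm::HS256 | Algorithm::HS384 | Algorithm::HS512 => KeyFamily::Hmac,
            Algorithm::RS256 | Algorithm::RS384 | Algorithm::RS512 => KeyFamily::Rsa,
            _ => KeyFamily::Ec,
        };
        if self.family == expected {
            Ok(())
        } else {
            Err(format!("this key cannot be used with {:?}", alg))
        }
    }
}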

@Keats
Owner Author

Keats commented Jan 4, 2020

Both are good points and should be easy to add.

@Dowwie
Contributor

Dowwie commented Jan 5, 2020

@Keats easy to add, you say? :) Are you going to inspect the magic bytes? If that works, it might be worth adding this to the infer crate and using it accordingly, rather than crafting something custom.

@Dowwie
Contributor

Dowwie commented Jan 5, 2020

@rib great points

@rib

rib commented Jan 5, 2020

@Dowwie the key/secret type is defined by the constructor used (e.g. from_rsa_pem() implies type=RSA), so there shouldn't really be any need to infer it from the raw byte data (and notably, since an hmac secret could be anything, it's not really something that can be done reliably by looking at the data itself).

@Keats
Owner Author

Keats commented Jan 6, 2020

Exactly. We need to add from_rsa_der and from_ec_der rather than just der, but then it is trivial.

@Dowwie
Contributor

Dowwie commented Jan 10, 2020

@Keats what do you think-- is this the final stretch before releasing v7?

@Keats
Owner Author

Keats commented Jan 10, 2020

I think so. I might have time to do the patch on Sunday, and I'll release an rc on crates.io just to let more people try it easily.

@Keats
Owner Author

Keats commented Jan 12, 2020

A problem with DecodingKey and using that method is that Validation takes an array of Algorithm, so it is possible to fail when comparing the algo families if a user puts in multiple unrelated algorithms for some reason.
I also don't remember why it is an array, to be honest.

@Dowwie
Contributor

Dowwie commented Jan 13, 2020

Looks like spring-security supports multiple algorithms.

@rib

rib commented Jan 13, 2020

Based on some of the issues discussed here, I would err towards making it hard to mix up algorithms at runtime: https://auth0.com/blog/critical-vulnerabilities-in-json-web-token-libraries/

E.g. users will want to be 100% sure they can't be tricked into treating an RSA public key as an HMAC key at runtime.

This is simpler to reason about if the library has strong key-type safety and only one expected algorithm can be specified when verifying a token.

The cases where multiple algorithms might need to be selected at runtime are probably uncommon enough that it's reasonable to require applications to manually determine the algorithm first (which should almost certainly not be based on decoding the header and trusting the given "alg").

@Dowwie
Contributor

Dowwie commented Jan 13, 2020

The None-attack mentioned in that article is addressed by resolving to one of Algorithm's supported variants. None is not one of the supported variants. A supported algorithm must be used. Agreed?

As for the "algorithm misdirection" scenario, this requires analysis and literally stepping through much of the flow of jsonwebtoken. I don't think that the Auth0 author's claim is true here but the only way to know for sure is to verify.. verification.

@Keats
Owner Author

Keats commented Jan 13, 2020

I think the multiple algs are there for when you know the alg family but aren't sure which one, i.e. you might put [RS256, RS384, RS512], so in general a single family. BUT someone could put [RS256, ES256] and then the key check would fail if we check all members. Which is probably fine.

This crate is not vulnerable to the issue in the article:

  • none is not supported
  • it requires at least one algorithm to be set when decoding; if the alg in the header doesn't match one in the Validation struct, it will fail before actually doing anything (see the sketch below)
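
As a sketch of that second check (illustrative only, not the crate's exact code):

use jsonwebtoken::{Algorithm, Validation};

// The alg taken from the (untrusted) header is accepted only if the caller
// explicitly listed it in Validation::algorithms; an empty list matches nothing.
fn header_alg_is_allowed(header_alg: Algorithm, validation: &Validation) -> bool {
    validation.algorithms.contains(&header_alg)
}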

@rib

rib commented Jan 13, 2020

Yeah, I was mainly pointing to the article just to support the idea of erring on the side of caution and avoiding unnecessary flexibility with regards to letting multiple algorithms be accepted within one verification pass.

I suppose if someone created a Validation allowing HMAC and RSA (or EC) they could open themselves up to this vulnerability. I'm not sure why you would do that, but the API does seem to allow you to shoot yourself in the foot, and maybe it would be better if it just wasn't possible to even make that mistake?

@Dowwie
Contributor

Dowwie commented Jan 13, 2020

@rib here's a plausible scenario as to why a programmer would do that: a monolithic server that uses RS512 for internal communication between servers, whereas web clients receive HS256.

As Keats confirmed, a server that supports both algorithms is still not exposed to the "algorithm misdirection" attack.

@Keats
Owner Author

Keats commented Jan 13, 2020

#113 has been updated with encode/decode key type checking wrt the algorithm + base64 secrets

Anything else missing?

@Keats
Owner Author

Keats commented Jan 17, 2020

Ok this got merged, anything else missing for downstream users? cc @rib / @JTKBowers

@LeviSchuck
Contributor

I am quite excited and happy to have contributed.

@Keats
Owner Author

Keats commented Jan 17, 2020

@LeviSchuck you did the biggest bits of this release!

@Dowwie
Contributor

Dowwie commented Jan 21, 2020

@Keats looks good to me. Can you cut another alpha release? I want to run tests against it.

@Keats
Owner Author

Keats commented Jan 21, 2020

I've just pushed beta.1 to crates.io

@Keats
Owner Author

Keats commented Jan 25, 2020

I'll release it on Monday if nothing else comes up

@Dowwie
Contributor

Dowwie commented Jan 27, 2020

looks good from here, so far..

@Keats
Owner Author

Keats commented Jan 28, 2020

I didn't forget about it; I got a new laptop yesterday and was getting some weird rustc crash, which might be resolved now.

@Keats
Owner Author

Keats commented Jan 29, 2020

It's on crates.io

Keats closed this as completed Jan 29, 2020