btcec: create new btcec/v2 module that type aliases into the dcrec module #1773
Conversation
The PR doesn't build as-is due to some Go modules oddities. I think we may actually need to make a new branch for the initial v2 switch-over?
We don't allow this condition in Decred, as we enforce proper DER signature lengths. If memory serves, this must be allowed in BTC though, and there are historical signatures on the chain that violate it. I believe you'll have to keep the
Pushed up a new version with things reworked slightly.

After going with this initial approach, I'm thinking things might be easier if we did them in the following order:

After the latest CI run, there's something off with sig parsing+verification still, AFAICT. One peculiar thing is I had to remove the example (temporarily, in an attempt to fix the module) as the sig validation was failing there. As is, we use our existing signature parsing, but dcr's sig serialization, which AFAICT produces slightly stricter signatures.
Force-pushed from 2edaf4b to bdcd742.
Pull Request Test Coverage Report for Build 1753757282
💛 - Coveralls
The build is now ✅! Will proceed with the second leg now, where I make a new PR to make a new

I'm open to an alternative route vs. the one laid out above, such as making this into a new
Glad to hear it. Please retain the copyright headers from the files that were copied over from DCR.
Also a note to reviewers: you'll want to check out the commits in order, so you can examine the diff due to the type-alias port, and then the final API changes needed in the greater repo. In a few areas, I needed to convert types, since
What about just moving
While reviewing, I saw that a lot of tests were removed, as they were already being tested in dcrd anyway. This is just an idea, but if there isn't a strong objection against cgo, it'd be better to bind to Bitcoin Core's secp256k1. Then there won't be any worries about bug incompatibility, at the least.
I'm no longer maintaining btcd, so it's not up to me, but using cgo would be an absolute 100% no from me if I were. If you care about maintaining quality software, binding crypto to cgo really is a bad idea in a project like this for several reasons. Among them:
All of that aside, I'd also note that the crypto itself is standardized, so the worry about compatibility is not in the core crypto itself; rather, the potential compatibility issues arise in serialization and parsing due to Bitcoin's allowance of non-standard signature serializations. Pulling in an entire crypto library via cgo just to parse signatures seems like overkill, to say the least.
I agree with @davecgh there. I really think avoiding cgo is the way to go in nearly all cases.
Took me a moment to understand the logic behind the code move and the commit structure, but I think it's quite nice to see the diff between the v1 and v2 code in the first commit. Nice work preserving that!
LGTM assuming we fix the TODOs in the tests with the attached patch: tests.patch.txt

I'm also very much against using cgo, as that would break reproducible builds in lnd.
Now that #1788 is merged, I'll pick this up again to get it properly building. We also should be able to remove one of the intermediate commits that just copies things over. We'll also need to make sure we push out the proper git tags for the affected sub-modules afterwards.
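For reference, Go sub-modules in a multi-module repo are tagged with the module's directory as a prefix, not with a bare version. A throwaway-repo sketch (the version number is just an example, not the tag actually pushed):

```shell
set -e
# Demonstrate the tag naming in a scratch repo: a v2 module living in the
# btcec/ directory gets a tag like "btcec/vX.Y.Z", not plain "vX.Y.Z".
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m init
git tag btcec/v2.0.0
git tag --list 'btcec/*'   # btcec/v2.0.0
```

Without the directory prefix, `go get` cannot resolve the sub-module's version from the repo tags.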
Just pushed up the new version! This is ready for another round of review now. EDIT: looks like GitHub Actions is down? Got this on the first run:
```diff
@@ -469,9 +463,6 @@ func (a *AddressPubKey) serialize() []byte {
 	case PKFCompressed:
 		return a.pubKey.SerializeCompressed()

-	case PKFHybrid:
-		return a.pubKey.SerializeHybrid()
```
Ended up removing support for hybrid keys, as they don't exist in BIP 340 and have been non-standard for some time elsewhere.
```go
	net: &chaincfg.MainNetParams,
},
{
	name: "mainnet p2pk hybrid (0x07)",
```
Removed this since we drop support for serializing in a hybrid format.
```diff
-	ilNum := new(big.Int).SetBytes(il)
-	if ilNum.Cmp(btcec.S256().N) >= 0 || ilNum.Sign() == 0 {
+	var ilNum btcec.ModNScalar
+	if overflow := ilNum.SetByteSlice(il); overflow {
```
Note that this is kinda optional; big.Ints work here, but I wanted to make things more uniform, so I started to use the new ModNScalar.
```go
	childKeyBytes := ilNum.Add(&keyNum).Bytes()
	childKey = childKeyBytes[:]

	// Strip leading zeroes from childKey, to match the expectation
```
This is the only odd thing I encountered during this translation; I ran into something similar when getting the tests to pass for the original PR. As is, SetByteSlice assumes that all leading zeroes are stripped off, which is why this is added in the modified parseSig operation. If this is removed, we fail TestLeadingZero.
Force-pushed from f54d599 to 878e182.
Latest set of comments addressed. Should be g2g now, though I think the tool we use for coverage doesn't understand v2 modules.
Pushed a new rebased version after the BIP 9 PR was merged. Once we get this in, I'll do a new tag for the v2 module, then rebase the schnorr PR, as that'll be the final PR that needs to be merged for the main taproot PR to start building.
> Should be g2g now, though I think the tool we use for coverages doesn't understand v2 modules?

It seems that the tool is using the Go module path (`btcec/v2/`) instead of the directory path (`btcec/`). I wonder if moving all the contents of `btcec/` into a subdir `btcec/v2/` fixes it?
If this is the case, then it's the tool that's messed up, IMO. Just having to update the module means we don't need to actually move everything when/if we create a new major version. I did a scan of the issue tracker for that repo and didn't see anything related to v2 modules.
Perhaps there's some extra YAML field we need to be setting somewhere to tell it exactly where to look?
Pushed a commit to try something out.
In this commit, we turn the package into a new Go module (version 2), and then port over the current set of types and functions to mainly alias to the more optimized and maintained dcrec variant. Taking a look at the benchmarks, most operations other than normalization (which IIRC is a bit slower now due to constant-time fixes) enjoy some nice speed-ups:

```
benchmark                          old ns/op   new ns/op   delta
BenchmarkAddJacobian-8             464         328         -29.20%
BenchmarkAddJacobianNotZOne-8      1138        372         -67.27%
BenchmarkScalarBaseMult-8          47336       31531       -33.39%
BenchmarkScalarBaseMultLarge-8     42465       32057       -24.51%
BenchmarkScalarMult-8              123355      117579      -4.68%
BenchmarkNAF-8                     582         168         -71.12%
BenchmarkSigVerify-8               175414      120794      -31.14%
BenchmarkFieldNormalize-8          23.8        24.4        +2.39%
BenchmarkParseCompressedPubKey-8   24282       10907       -55.08%
```
Didn't work, but I think what's happening is that the coverage file itself has the v2 module path in it.
The `goveralls` tool we use to handle code coverage upload seems to not understand that a `v2` module can exist without having a v2 file path on disk. We use a `sed` command to remove the `v2` module prefix so the tool can reach into the correct file to extract the source code.
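A minimal sketch of that kind of workaround (the profile file name and exact pattern here are assumptions for illustration, not the PR's actual CI change):

```shell
set -e
# Fake one line of a Go coverage profile that uses the module import path,
# which contains /v2/ even though no v2/ directory exists on disk.
printf 'github.com/btcsuite/btcd/btcec/v2/field.go:10.2,12.3 1 1\n' > coverage.txt

# Strip the v2 module segment so the path matches the real file layout.
sed -i.bak 's|btcec/v2/|btcec/|g' coverage.txt

cat coverage.txt   # github.com/btcsuite/btcd/btcec/field.go:10.2,12.3 1 1
```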
Great success: b3d263e

An alternative path would be updating the
Already wrote these; feel free to ignore since this is post-merge.
```go
// This is NOT constant time.
//
// The field value is returned to support chaining. This enables syntax like:
// f := new(FieldVal).SetHex("0abc").Add(1) so that f = 0x0abc + 1
```
This comment seems slightly wrong, since this function is setHex and not (*FieldVal).SetHex.
```go
	if len(hexString)%2 != 0 {
		hexString = "0" + hexString
	}
	bytes, _ := hex.DecodeString(hexString)
```
Is it assumed that only valid hex is passed in?
```go
}

// fromHex converts the passed hex string into a big integer pointer and will
// panic is there is an error. This is only provided for the hard-coded
```
s/panic is/panic if
s/bet detected/be detected
```go
	return p
}

// BenchmarkAddNonConst benchmarks the secp256k1 curve AddNonConst function with
```
This seems to be the wrong godoc comment? If this is changed in a later commit, I'll delete this.
```go
	tmp1.Mul(c1, endomorphismA1)
	tmp2.Mul(c2, endomorphismA2)
	k1.Sub(bigIntK, tmp1)
	k1.Add(k1, tmp2)
```
It looks like this is doing k1 = k - c1*a1 + c2*a2 rather than what the comment says; am I missing something?
```go
	// k2 = - c1 * b1 - c2 * b2 from step 5 (note c2's sign is reversed)
	tmp1.Mul(c1, endomorphismB1)
	tmp2.Mul(c2, endomorphismB2)
	k2.Sub(tmp2, tmp1)
```
This also doesn't seem to match the comment, which makes me think I'm missing context.
```go
)

// Error identifies an error related to public key cryptography using a
// sec256k1 curve. It has full support for errors.Is and errors.As, so the
```
s/sec256k1/secp256k1
```go
			continue
		}
	}
}

// TestSquare ensures that squaring field values via Square works as expected.
func TestSquare(t *testing.T) {
// TestFieldSquare ensures that squaring field values via Square and SqualVal
```
s/SqualVal/SquareVal
In this PR, we make the switch to remove most of the code from `btcec` in favor of creating a new module that uses type aliases to take advantage of all the constant-time fixes and optimizations in the `dcrec` repo. The changes were mostly mechanical, though there are a few things I need to fix, like ensuring we still expose decoding signatures in the BER format.

Taking a look at the benchmarks, most operations other than normalization (which IIRC is a bit slower now due to constant-time fixes) enjoy some nice speed-ups: