This repository has been archived by the owner on May 26, 2023. It is now read-only.

unforgiven - [Medium] Function PreCheckWithdrawals() doesn't uniquify withdrawal items, which may cause double spend if parameters like gas limit are set differently for the duplicate items #223

Closed
github-actions bot opened this issue Feb 20, 2023 · 5 comments
Labels
Duplicate: A valid issue that is a duplicate of an issue with `Has Duplicates` label
Escalation Resolved: This issue's escalations have been approved/rejected
Medium: A valid Medium severity issue
Reward: A payout will be made for this issue

Comments


github-actions bot commented Feb 20, 2023

unforgiven

medium

[Medium] Function PreCheckWithdrawals() doesn't uniquify withdrawal items, which may cause double spend if parameters like gas limit are set differently for the duplicate items

Summary

Function PreCheckWithdrawals() checks that the given list of withdrawals represents all withdrawals made in the legacy system and filters out any extra withdrawals not included in the legacy system, but the code doesn't check for duplicate items, so a single message can be included multiple times. This is a problem if other parts of the code treat the duplicates as different messages. It can cause the deployment script to exit with an error, or generate suspicious logs that halt the migration. A double spend can also happen if the code sets a different nonce or gas limit for the duplicates in the L2ToL1MessagePasser hash calculation. The list of withdrawals shouldn't contain duplicate items after filtering.

Vulnerability Detail

This is the PreCheckWithdrawals() code:

// PreCheckWithdrawals checks that the given list of withdrawals represents all withdrawals made
// in the legacy system and filters out any extra withdrawals not included in the legacy system.
func PreCheckWithdrawals(db *state.StateDB, withdrawals []*LegacyWithdrawal) ([]*LegacyWithdrawal, error) {
	// Convert each withdrawal into a storage slot, and build a map of those slots.
	slotsInp := make(map[common.Hash]*LegacyWithdrawal)
	for _, wd := range withdrawals {
		slot, err := wd.StorageSlot()
		if err != nil {
			return nil, fmt.Errorf("cannot check withdrawals: %w", err)
		}

		slotsInp[slot] = wd
	}

	// Build a mapping of the slots of all messages actually sent in the legacy system.
	var count int
	slotsAct := make(map[common.Hash]bool)
	err := db.ForEachStorage(predeploys.LegacyMessagePasserAddr, func(key, value common.Hash) bool {
		// When a message is inserted into the LegacyMessagePasser, it is stored with the value
		// of the ABI encoding of "true". Although there should not be any other storage slots, we
		// can safely ignore anything that is not "true".
		if value != abiTrue {
			// Should not happen!
			log.Error("found unknown slot in LegacyMessagePasser", "key", key.String(), "val", value.String())
			return true
		}

		// Slot exists, so add it to the map.
		slotsAct[key] = true
		count++
		return true
	})
	if err != nil {
		return nil, fmt.Errorf("cannot iterate over LegacyMessagePasser: %w", err)
	}

	// Log the number of messages we found.
	log.Info("Iterated legacy messages", "count", count)

	// Iterate over the list of actual slots and check that we have an input message for each one.
	for slot := range slotsAct {
		_, ok := slotsInp[slot]
		if !ok {
			return nil, fmt.Errorf("unknown storage slot in state: %s", slot)
		}
	}

	// Iterate over the list of input messages and check that we have a known slot for each one.
	// We'll filter out any extra messages that are not in the legacy system.
	filtered := make([]*LegacyWithdrawal, 0)
	for slot := range slotsInp {
		_, ok := slotsAct[slot]
		if !ok {
			log.Info("filtering out unknown input message", "slot", slot.String())
			continue
		}

		filtered = append(filtered, slotsInp[slot])
	}

	// At this point, we know that the list of filtered withdrawals MUST be exactly the same as the
	// list of withdrawals in the state. If we didn't have enough withdrawals, we would've errored
	// out, and if we had too many, we would've filtered them out.
	return filtered, nil
}

As you can see, the code checks that all the cross-domain messages are included in the withdrawals and only selects withdrawals whose storage slot has been set to true in the storage of LegacyMessagePasserAddr. But the code doesn't uniquify the withdrawal list, so if the initial list has duplicate items, the final list will have duplicate items too.
The legacy message in L2CrossDomainMessenger includes this information: (Target, Sender, Data, Nonce), while the new withdrawal message has this information: (Target, Sender, Data, Nonce, Value, GasLimit). So the parameters (Value, GasLimit) are new, and in particular GasLimit is set by the deployment script.
The new L2ToL1MessagePasser has this information: (Nonce, Sender, Target, Value, GasLimit, Data), and GasLimit, Value, and Nonce are new and set by the deployment script.
So if PreCheckWithdrawals() returns duplicate items, then the function MigrateWithdrawal(), which is responsible for migrating a withdrawal message to the new format, may create duplicate messages for a single old message by setting different values for GasLimit or Nonce in the L2ToL1MessagePasser.
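
To make the field difference concrete, here is a minimal, self-contained sketch; the struct shapes and names are hypothetical simplifications for illustration, not the actual op-chain-ops/crossdomain types:

// Hypothetical illustration: the migrated withdrawal commits to GasLimit and
// Value, which the legacy message never carried, so the same legacy message
// migrated twice with different gas limits is no longer the same withdrawal.
package main

import "fmt"

type legacyMessage struct {
	Target, Sender, Data string
	Nonce                uint64
}

type migratedWithdrawal struct {
	Nonce                uint64
	Sender, Target, Data string
	Value, GasLimit      uint64 // new fields, chosen during migration
}

func main() {
	old := legacyMessage{Target: "0xT", Sender: "0xS", Data: "0x01", Nonce: 7}

	// The same legacy message migrated twice with different gas limits, which
	// is the double-spend concern if duplicates survive PreCheckWithdrawals().
	a := migratedWithdrawal{Nonce: old.Nonce, Sender: old.Sender, Target: old.Target, Data: old.Data, GasLimit: 100_000}
	b := migratedWithdrawal{Nonce: old.Nonce, Sender: old.Sender, Target: old.Target, Data: old.Data, GasLimit: 200_000}

	fmt.Println(a == b) // false: two different withdrawals from one old message
}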

Impact

The first impact is that the issue can cause the migration to be interrupted, because the message count would be higher than the real number of messages; even if the hashes of the duplicate messages were equal and could not cause a double spend, the larger message count would look like suspicious withdrawals.
The second impact is a possible double spend, if the duplicate messages generate different hashes because of the new fields whose values are calculated by the deployment script.

Code Snippet

https://github.com/ethereum-optimism/optimism/blob/3f4b3c328153a8aa03611158b6984d624b17c1d9/op-chain-ops/crossdomain/precheck.go#L13-L77

Tool used

Manual Review

Recommendation

Uniquify (deduplicate) the withdrawal items before returning from PreCheckWithdrawals().
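
A minimal sketch of one possible fix (the helper name dedupeWithdrawals is hypothetical; it relies only on the StorageSlot() method and the same imports already used in precheck.go): keep the first occurrence of each storage slot and drop later duplicates before the rest of the migration runs.

// dedupeWithdrawals is a hypothetical helper: it keeps the first withdrawal
// seen for each storage slot and drops later duplicates, so the filtered list
// contains each legacy withdrawal at most once.
func dedupeWithdrawals(withdrawals []*LegacyWithdrawal) ([]*LegacyWithdrawal, error) {
	seen := make(map[common.Hash]bool)
	unique := make([]*LegacyWithdrawal, 0, len(withdrawals))
	for _, wd := range withdrawals {
		slot, err := wd.StorageSlot()
		if err != nil {
			return nil, fmt.Errorf("cannot dedupe withdrawals: %w", err)
		}
		if seen[slot] {
			log.Info("dropping duplicate input message", "slot", slot.String())
			continue
		}
		seen[slot] = true
		unique = append(unique, wd)
	}
	return unique, nil
}

PreCheckWithdrawals() could apply this to its input before building slotsInp, or equivalently track already-appended slots inside its final filtering loop.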

Duplicate of #105

github-actions bot added the `Duplicate` and `Medium` labels on Feb 20, 2023
@rcstanciu
Contributor

Comment from Optimism


Description: DoS during the migration process

Reason: Good catch if accurate. Just need validation. Would call this a medium severity if so.

sherlock-admin added the `Reward` label on Feb 21, 2023

0xunforgiven commented Feb 23, 2023

Escalate for 111 USDC

This issue is not a duplicate of #105.

This issue is about PreCheckWithdrawals() and the fact that it doesn't uniquify the unfiltered withdrawals.
According to the comment in the function before the return:

	// At this point, we know that the list of filtered withdrawals MUST be exactly the same as the
	// list of withdrawals in the state. If we didn't have enough withdrawals, we would've errored
	// out, and if we had too many, we would've filtered them out.
	return filtered, nil
}

So the code performs two actions to filter the withdrawals:

  1. If we had too many, we would've filtered them out.
  2. If we didn't have enough withdrawals, we would've errored out.

It then assumes that "At this point, we know that the list of filtered withdrawals MUST be exactly the same as the list of withdrawals in the state". But this assumption is wrong, because the code doesn't remove legitimate duplicate withdrawals: if there are duplicate items in the function's input, there will be duplicate items in the function's return value.
As the rest of the migration code migrates all the old withdrawals, these duplicate items can generate different hashes after migration, because some other values (like gas and value) are added into the withdrawal hash.

The logic of PreCheckWithdrawals() is not complete and the returned list is not "exactly the same as the list of the withdrawals in the state". The comment says the returned list MUST be exactly the same, and because the code doesn't enforce that, the issue is Medium.

An example:
If the state contained the hashes state = [w1, w2, w3] and the input was input = [w1, q1, r1, w2, z3, w1, w2, w3], then the code would first filter out the extra elements and check that all of [w1, w2, w3] are in the input list, and in the end it would return [w1, w2, w1, w2, w3] as the result, which is not exactly what is in the state and contains duplicate items. Those duplicate items may generate different hashes after migration, which would cause double spending, because some other parameters are added to the old withdrawals during the migration. And because the code wrongly assumes the result is "exactly the same as the list of the withdrawals in the state", the rest of the migration process is not going to check the list again and fix the duplicates. A self-contained sketch of this scenario is shown below.
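
The sketch below is illustrative only, not the real migration code: it uses plain strings in place of withdrawal hashes and assumes the filter walks the raw input list, which is the scenario described above. A membership-only filter drops the unknown elements but keeps the duplicates.

// Illustrative only: filterByMembership mimics a filter that keeps every input
// element whose "slot" exists in the state, without deduplicating, so
// duplicates in the input survive into the output.
package main

import "fmt"

func filterByMembership(input []string, state map[string]bool) []string {
	filtered := make([]string, 0)
	for _, slot := range input {
		if !state[slot] {
			continue // filtered out: not a real legacy withdrawal
		}
		filtered = append(filtered, slot)
	}
	return filtered
}

func main() {
	state := map[string]bool{"w1": true, "w2": true, "w3": true}
	input := []string{"w1", "q1", "r1", "w2", "z3", "w1", "w2", "w3"}

	// Prints [w1 w2 w1 w2 w3]: the extra elements are dropped, but the
	// duplicates of w1 and w2 remain.
	fmt.Println(filterByMembership(input, state))
}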

sherlock-admin (Contributor) commented Feb 23, 2023


You've created a valid escalation for 111 USDC!

To remove the escalation from consideration: Delete your comment.
To change the amount you've staked on this escalation: Edit your comment (do not create a new comment).

You may delete or edit your escalation comment anytime before the 48-hour escalation window closes. After that, the escalation becomes final.

sherlock-admin added the `Escalated` label on Feb 23, 2023
Evert0x (Contributor) commented Mar 2, 2023

Escalation rejected and keeping the duplicate status; the comment fails to show a detailed explanation of how the faulty state can be reached.

sherlock-admin (Contributor)


This issue's escalations have been rejected!

Watsons who escalated this issue will have their escalation amount deducted from their next payout.

sherlock-admin added the `Escalation Resolved` label and removed the `Escalated` label on Mar 2, 2023