Support json tx encoding for interchain accounts #3246
Is there an example user flow? I'm trying to understand whether this is a helper function or for decoding on the host. I'm not sure I follow when this would need to be RLP encoded on the host side within ibc-go.
Revisiting, here's how to implement this issue for JSON. Please implement JSON and RLP support in separate PRs.

User flow: json_encode(tx) -> give to controller -> ibc send, relay -> host recv -> decode_json(tx) into []sdk.Msg -> execute tx

The json_encoding code will be external to ibc-go (potentially a wasm contract).

Implementation:
- Add …
- Add …
- Modify `DeserializeCosmosTx`, passing in the full packetData:
```go
// DeserializeCosmosTx, pass in the full packetData

// *codec.ProtoCodec implements the JSONCodec interface
if _, ok := cdc.(*codec.ProtoCodec); !ok {
	return nil, errorsmod.Wrapf(ErrInvalidCodec, "provided codec is not of type %T", codec.ProtoCodec{})
}

var cosmosTx CosmosTx
switch encoding { // encoding is a parameter passed to the function
case EncodingProtobuf:
	if err := cdc.Unmarshal(data, &cosmosTx); err != nil {
		return nil, err
	}
case EncodingJSON:
	if err := cdc.UnmarshalJSON(data, &cosmosTx); err != nil {
		return nil, err
	}
default:
	return nil, errorsmod.Wrapf(ErrUnsupportedEncoding, "encoding type %s is not supported", encoding)
}
```

Similar changes can be made to …

Testing: a test should be added to the host submodule which creates a channel using the JSON encoding. It should send a JSON encoded transaction (succeeds) and it should send a proto encoded transaction (fails). An e2e is not necessary in my opinion. It would be greatly appreciated if someone could test this integration with wasm 🙏

I may have missed something, so please ask questions if you run into any issues.
Thanks for the write-up, @colin-axner! I would like to add a couple of things and hear your opinion.

The … If we don't add an extra field to the …
However, … Maybe we can also make the encoding type an enum, at least in the Go code, but still use a string in the …

We also have this helper CLI command on the host submodule to generate the packet data, which I think we should also adapt so that it takes an extra parameter for the encoding type.

UPDATE: instead of adding an extra argument to the CLI, it would be possible to retrieve the encoding from the version of the channel end.
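On the string-backed "enum" idea: the encoding identifier travels inside the channel version metadata, so it stays a string on the wire while Go code can still switch over named constants. A minimal sketch (the `"proto3"` value matches ibc-go's existing constant; `"proto3json"` and the `isSupportedEncoding` helper name are assumptions for illustration):

```go
package main

import "fmt"

// Encoding identifiers carried in the ICA channel version metadata.
// "proto3" is ibc-go's existing protobuf identifier; "proto3json" is
// an assumed value for the proposed JSON encoding.
const (
	EncodingProtobuf   = "proto3"
	EncodingProto3JSON = "proto3json"
)

// isSupportedEncoding is a hypothetical helper mirroring the kind of
// validation the channel handshake could perform on the proposed version.
func isSupportedEncoding(encoding string) bool {
	switch encoding {
	case EncodingProtobuf, EncodingProto3JSON:
		return true
	default:
		return false
	}
}

func main() {
	fmt.Println(isSupportedEncoding("proto3json")) // true
	fmt.Println(isSupportedEncoding("rlp"))        // false
}
```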
Please note that there is no specification for how Cosmos SDK transactions are serialized to JSON. It is defined by the Cosmos SDK implementation, based on the protobuf JSON specs plus a bunch of extra rules that may or may not be consistent. In other words, the JSON serialization can change at any time. I would not recommend assuming chain 1 and chain 2 use the same JSON codec as long as such a spec does not exist.
The SDK does not apply any extra rules to proto JSON serialization. The only deviation that might be present relates to how some legacy amino types marshal JSON, which (if they still exist) I would basically consider bugs at this point. We attempt to follow the official proto3 spec to the letter, but of course that spec could change. Also, in the future we will use the official protoreflect-based JSON serializer, so there should be no legacy amino glitches with that. Note, however, that the official serializer is intentionally non-deterministic because they want to reserve the right to make breaking changes... So yes, it's generally not a great choice.
I would love to see this become a reality. It would help independent implementations so much. But right now I often see JSON serializations that differ from proto. E.g. the Dec cases, which are integer strings at the protobuf level but in JSON suddenly have a decimal point. Also, embedded CosmWasm messages are bytes at the protobuf level but nested objects in JSON. Is this something you consider bugs, and should everything be the original protobuf JSON? What about the protobuf option system? Should it allow changes to the JSON representation, or should it all become the standard once we don't need Amino JSON anymore?
Thanks for considering adding JSON support to ICA! This would make using interchain accounts from smart contracts 100x as easy. For the JSON encoding to use, in CosmWasm we use the one defined here (also, you can find the JSON schema for this here). From my perspective, using that would be AMAZING, as we could start using ICA natively (without a wrapper) from CosmWasm.
Thanks everyone for the context. I have some clarifying questions just to make sure I follow the concerns being raised here. Please let me know if I have misunderstood something.

As @webmaster128 points out in the first #3246 (comment), the SDK does not contain JSON schemas for every sdk.Msg, nor does it enforce this practice. Instead it relies on the protobuf JSON mapping, which as @aaronc mentions could technically change from under our feet (the official serializer reserving the right to make breaking changes). I'm not sure I follow the followup #3246 (comment), though? It seems to be pointing out differences between proto and JSON, not proto JSON vs JSON?

With ICA, the primary concern is ensuring that we can decode a msg type using the encoding type negotiated at channel opening (in addition to supporting encoding types that connecting controllers have access to). What is the exact concern with the lack of a specification for encoding sdk.Msgs? Is it a concern that the proto JSON mapping could potentially change (i.e. future proofing)? Is it a concern that without a JSON schema attached to each sdk.Msg type, it is difficult for smart contract users to correctly encode a specific sdk.Msg? Please note that ICA will be expecting an Any which references a specific sdk.Msg implementation. While no JSON schema explicitly exists, it does implicitly exist by referencing the proto definition + using proto JSON.
Hi @aaronc and @webmaster128, there are two proposed implementations for this issue, #3725 and #3796. If you have any further comments, they would be very much appreciated!

On further consideration, I'm not so sure we need to worry about proto JSON future compatibility only with respect to ICA transaction encode/decode. IBC packet data (such as for transfer and ICA) is already encoded/decoded using proto JSON and will be required to maintain backwards compatibility in the absence of a coordinated upgrade between both channel ends. Thus, in a world where this compatibility would break, we would need to go the extra mile regardless to ensure we maintain compatibility.
Cool, so I looked into this some more, documenting my findings here. The purpose of this feature is to support: wasm contracts json encoding SDK msgs -> which are relayed to another chain -> and decoded by the ibc-go ica host module into sdk.Msg -> and then the msgs are executed via baseapp.

One issue I don't see how we can get around is the usage of …

From what I've been able to tell, proto3 JSON differs from the standard json library in a few ways: …
I did some testing, and it appears to me the pbjson impl can successfully unmarshal any bytes encoded by the standard json library. The reverse is not true: if you encode using proto3 JSON (say by using a string for an int64), the standard library will return an error.

Based on this understanding, if a new encoding type is added for …

The requirements set by the proto3 JSON spec for …

As a side note:
From my understanding, based on @srdtrk's research, the Rust json library provides good flexibility in determining exactly how to encode a custom type. As shown in some code @srdtrk sent me, meeting the proto3 requirement is quite simple with the standard Rust library:

```rust
/// This enum does not derive Deserialize, see issue [#1443](https://github.com/CosmWasm/cosmwasm/issues/1443)
#[derive(Serialize, Clone, Debug)]
#[serde(tag = "@type")]
pub enum CosmosMessages {
    #[serde(rename = "/cosmos.bank.v1beta1.MsgSend")]
    MsgSend {
        /// Sender's address.
        from_address: String,
        /// Recipient's address.
        to_address: String,
        /// Amount to send
        amount: Vec<Coin>,
    },
    #[serde(rename = "/cosmos.gov.v1beta1.MsgSubmitProposal")]
    MsgSubmitProposalLegacy {
        content: Box<CosmosMessages>,
        initial_deposit: Vec<Coin>,
        proposer: String,
    },
    #[serde(rename = "/cosmos.gov.v1beta1.TextProposal")]
    TextProposal { title: String, description: String },
}

#[derive(Serialize, Debug)]
pub struct CosmosTx {
    pub messages: Vec<CosmosMessages>,
}
```

Based on these findings, adding a …

Please let us know if you have any concerns or comments; we will likely be moving forward with #3796 (excellent work @srdtrk).
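On the host side, the `@type` discriminator that the serde attribute above produces can be handled with a two-pass decode: peek at `@type`, then decode the same bytes into the matching concrete type. This is an illustrative sketch with hypothetical struct and function names, not the ibc-go implementation (which decodes through the codec into `Any`s):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// msgSend mirrors the fields the Rust enum serializes for
// /cosmos.bank.v1beta1.MsgSend; purely illustrative.
type msgSend struct {
	FromAddress string `json:"from_address"`
	ToAddress   string `json:"to_address"`
}

// decodeMsg peeks at the "@type" key, then decodes the same bytes
// into the matching concrete type. Hypothetical helper.
func decodeMsg(bz []byte) (interface{}, error) {
	var peek struct {
		TypeURL string `json:"@type"`
	}
	if err := json.Unmarshal(bz, &peek); err != nil {
		return nil, err
	}
	switch peek.TypeURL {
	case "/cosmos.bank.v1beta1.MsgSend":
		var m msgSend
		if err := json.Unmarshal(bz, &m); err != nil {
			return nil, err
		}
		return m, nil
	default:
		return nil, fmt.Errorf("unsupported type url: %s", peek.TypeURL)
	}
}

func main() {
	bz := []byte(`{"@type":"/cosmos.bank.v1beta1.MsgSend","from_address":"a","to_address":"b"}`)
	m, err := decodeMsg(bz)
	fmt.Println(err == nil, m.(msgSend).FromAddress) // true a
}
```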
I'll note that PR #3796 is end-to-end tested using this CosmWasm implementation of the ica controller and the e2e testing package in that repo. I've verified that the JSON codec used in this PR and CosmWasm communicate as intended.
Summary
Currently, interchain accounts only support protobuf encoding; JSON encoding would give more flexibility to users of the feature, particularly CosmWasm and Solidity smart contracts.
Problem Definition
Adding JSON encoding would improve ICA composability; the feature is currently limited to protobuf encoding only.
Proposal
Add JSON encoding for ICA messages.