The current UX for JSON parsing of objects is not great. Users are required to place JSON fields in alphabetical order, and `parseJson` essentially guesses types, making it impossible to parse e.g. fixed-length arrays or `bytes` values of length 20 or 32.

To fix this, we need to make the JSON parser aware of struct field types and names. This issue proposes the following approach.
Add new cheatcodes
```solidity
function parseJsonStruct(string calldata json, string calldata schema) external pure returns (bytes memory abiEncodedData);
function parseJsonStruct(string calldata json, string calldata key, string calldata schema) external pure returns (bytes memory abiEncodedData);
```
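For illustration, a call might look like this. This is a hypothetical sketch: the `Point` struct and the exact schema string format are assumptions, not part of the proposal (the schema encoding here assumes the `Vec<StructField>` JSON representation described below).

```solidity
// Hypothetical usage: decode a JSON object into `struct Point { uint256 x; uint256 y; }`.
string memory json = '{"x": 1, "y": 2}';
string memory schema = '[{"name":"x","ty":{"Primitive":"uint256"}},{"name":"y","ty":{"Primitive":"uint256"}}]';
// The cheatcode returns ABI-encoded data, which can be decoded into the struct directly.
Point memory p = abi.decode(vm.parseJsonStruct(json, schema), (Point));
```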
These cheatcodes will be identical to `parseJson`, but will be guided by the types from `schema`. The format for `schema` would be the JSON representation of `Vec<StructField>`, where `StructField` is:
```rust
struct StructField {
    /// Name of the field which will be used when parsing JSON.
    name: String,
    /// Type of the field
    ty: StructFieldType,
}

/// Solidity type representation which can be (de)serialized from JSON and converted into DynSolType
enum StructFieldType {
    Struct(Vec<StructField>),
    Array(Box<StructFieldType>),
    FixedArray(Box<StructFieldType>, usize),
    /// Inner value must be decodable into DynSolType.
    Primitive(String),
}
```
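To make the schema format concrete, here is a std-only sketch that renders these types to JSON by hand. It assumes serde's default externally-tagged enum representation (an assumption on my part; the issue does not pin down the exact encoding):

```rust
#[allow(dead_code)]
enum StructFieldType {
    Struct(Vec<StructField>),
    Array(Box<StructFieldType>),
    FixedArray(Box<StructFieldType>, usize),
    Primitive(String),
}

struct StructField {
    name: String,
    ty: StructFieldType,
}

/// Render a field type as externally-tagged JSON, e.g. {"Primitive":"uint256"}.
fn render_ty(ty: &StructFieldType) -> String {
    match ty {
        StructFieldType::Struct(fields) => format!("{{\"Struct\":{}}}", render_fields(fields)),
        StructFieldType::Array(inner) => format!("{{\"Array\":{}}}", render_ty(inner)),
        StructFieldType::FixedArray(inner, n) => {
            format!("{{\"FixedArray\":[{},{}]}}", render_ty(inner), n)
        }
        StructFieldType::Primitive(s) => format!("{{\"Primitive\":\"{}\"}}", s),
    }
}

/// Render a whole schema (Vec<StructField>) as a JSON array.
fn render_fields(fields: &[StructField]) -> String {
    let items: Vec<String> = fields
        .iter()
        .map(|f| format!("{{\"name\":\"{}\",\"ty\":{}}}", f.name, render_ty(&f.ty)))
        .collect();
    format!("[{}]", items.join(","))
}

fn main() {
    // Schema for a hypothetical `struct S { address owner; uint256[3] values; }`.
    let schema = vec![
        StructField { name: "owner".into(), ty: StructFieldType::Primitive("address".into()) },
        StructField {
            name: "values".into(),
            ty: StructFieldType::FixedArray(
                Box::new(StructFieldType::Primitive("uint256".into())),
                3,
            ),
        },
    ];
    println!("{}", render_fields(&schema));
}
```

Note how `FixedArray` carries the length explicitly, which is exactly what `parseJson` cannot infer today.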
Another way to represent this in JSON could be to use the `alloy_dyn_abi::Resolver` used for EIP-712, plus a type name.
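For reference, EIP-712 encodes a struct graph as a flat type string, with referenced struct types appended after the primary type. A hypothetical example (the `Transaction`/`Person` structs are illustrative):

```
Transaction(Person from,Person to,uint256 amount)Person(address wallet,string name)
```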
Add helpers for generating `schema` values. For example, we could add a `forge bind-json` command which would accept a path to a `.sol` file and produce either a schema for all structs, or a complete parsing library.
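A generated parsing library might look something like the following. This is a hypothetical sketch: `Point`, `PointJson`, and the schema constant are illustrative, not actual `forge bind-json` output.

```solidity
struct Point {
    uint256 x;
    uint256 y;
}

library PointJson {
    // Schema for `Point`, using the assumed `Vec<StructField>` JSON encoding.
    string constant SCHEMA =
        '[{"name":"x","ty":{"Primitive":"uint256"}},{"name":"y","ty":{"Primitive":"uint256"}}]';

    function parsePoint(Vm vm, string memory json) internal pure returns (Point memory) {
        return abi.decode(vm.parseJsonStruct(json, SCHEMA), (Point));
    }
}
```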
The same approach can be used for serialization of structs as well:
```solidity
function serializeStruct(string calldata schema, bytes memory abiEncodedData) external returns (string memory json);
function serializeStruct(string calldata objectKey, string calldata valueKey, string calldata schema, bytes memory abiEncodedData) external returns (string memory json);
```
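Serialization would then be roughly the inverse. Another hypothetical sketch, reusing an assumed `Point { uint256 x; uint256 y; }` struct and schema:

```solidity
Point memory p = Point({x: 1, y: 2});
string memory schema = '[{"name":"x","ty":{"Primitive":"uint256"}},{"name":"y","ty":{"Primitive":"uint256"}}]';
// ABI-encode the struct and let the cheatcode turn it into JSON, guided by the schema.
// The schema supplies both field names and order, so no alphabetical-ordering constraint applies.
string memory json = vm.serializeStruct(schema, abi.encode(p));
```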
This approach reduces compilation overhead by keeping most of the logic in the cheatcode implementation, requiring contracts to contain only relatively small schema definitions.
This would be great. I've been getting quite a few questions about JSON parsing/writing, and the current API for serialization/deserialization is difficult to work with.
Because there's no way to get the ABI of a tuple at runtime, we're forced to pass this as args; we also need the field names to get around the ordering problem.
This approach is doable; we just need to make it easy to get the schema of a type. I think this can even be combined with loading the schema from a JSON file itself, and forge could generate those.
I think ideally we can do custom preprocessing here with a custom cache, e.g. store a mapping (struct name -> struct schema) which is derived from the AST on each non-cached compiler run and kept alongside artifacts. It would be cheap to generate if we run it in parallel with solc, and it would allow users to just do `parseJsonStruct(json, "StructName")` instead of manually updating the schema each time new fields are added.
Component: Forge