I realize now it's really time to go through a deep rework. The rework goals are:
- a clear categorization of each component's scope
- a design that can scale across multiple datalake types

Yes, I wrote some spaghetti code to ship fast. We should also reconsider where values are consumed versus borrowed.
This issue will also serve as groundwork for the documentation.
Design
Primitives
Block
Ethereum block fields that are needed for datalakes.
- RPC type: the type the provider deserializes the RPC response into (depends on alloy-rpc-type).
- Consensus type: RLP encode/decodable, and compatible with the actual Ethereum trie implementation (depends on alloy-consensus-type).
Note: neither Datalake nor Task will have a compile step in the primitives scope.
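To make the two block representations concrete, here is a minimal sketch. The type and field names are illustrative assumptions, not the actual primitives: the real RPC type would lean on alloy-rpc-type and the real consensus type on alloy-consensus-type.

```rust
/// RPC-shaped block header, as the provider would deserialize it from a
/// JSON-RPC response (stand-in for the alloy-rpc-type based type).
struct RpcHeader {
    number: u64,
    state_root: String,
}

/// Consensus-shaped header, meant to be RLP encode/decodable and compatible
/// with the Ethereum trie (stand-in for the alloy-consensus-type based type).
struct ConsensusHeader {
    number: u64,
    state_root: String,
}

/// The primitives scope only converts between representations;
/// no compile step lives here.
impl From<RpcHeader> for ConsensusHeader {
    fn from(rpc: RpcHeader) -> Self {
        ConsensusHeader {
            number: rpc.number,
            state_root: rpc.state_root,
        }
    }
}
```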
Datalake
Each datalake type gets its own folder containing:
- Acceptable fields: the indexes of these fields will be kept in sync across Cairo and Solidity. Each datalake collection will have its own property field enum.
- Datalake collection: bound to sampled_property, and serializable/deserializable to and from a Vec format.
- Datalake instance: supports encode / decode / commit.

At the module root:
- DatalakeEnvelope: an enum that embeds the different Datalake types.
- DatalakeType: an enum used only to identify the type.
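A minimal sketch of how these root types could fit together. All names, variants, and discriminant values here are illustrative assumptions; the real enums would carry the field indexes that are synced with Cairo and Solidity.

```rust
/// Identifies the datalake type only; discriminants are the ids that would
/// be kept in sync with Cairo and Solidity (values assumed for the sketch).
#[derive(Debug, Clone, Copy, PartialEq)]
enum DatalakeType {
    BlockSampled = 0,
}

/// Acceptable fields for this datalake; the indexes are what get synced
/// across Cairo and Solidity (values assumed for the sketch).
#[derive(Debug, Clone, Copy, PartialEq)]
enum BlockField {
    Parent = 0,
    StateRoot = 3,
}

/// One concrete datalake instance, bound to its sampled_property.
#[derive(Debug, Clone, PartialEq)]
struct BlockSampledDatalake {
    block_range_start: u64,
    block_range_end: u64,
    sampled_property: BlockField,
}

/// Envelope embedding each concrete datalake type.
enum DatalakeEnvelope {
    BlockSampled(BlockSampledDatalake),
}

impl DatalakeEnvelope {
    /// Maps the envelope back to its identifying type.
    fn datalake_type(&self) -> DatalakeType {
        match self {
            DatalakeEnvelope::BlockSampled(_) => DatalakeType::BlockSampled,
        }
    }
}
```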
Task
- ComputationalTaskWithDatalake: the type that embeds the datalake envelope; supports encode / decode / commit.
- ComputationalTask: supports encode / decode / commit.
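A rough sketch of the embedding relationship. The fields and the commit body are placeholders (the real commit would hash the encoded bytes); only the overall shape is what the design describes.

```rust
/// Stand-in for the real DatalakeEnvelope from the datalake scope.
#[derive(Debug, Clone)]
enum DatalakeEnvelope {
    BlockSampled,
}

/// The pure computation description (fields assumed for the sketch).
#[derive(Debug, Clone)]
struct ComputationalTask {
    aggregate_fn_id: String,
}

/// The task bundled with the datalake it computes over.
#[derive(Debug, Clone)]
struct ComputationalTaskWithDatalake {
    datalake: DatalakeEnvelope,
    task: ComputationalTask,
}

impl ComputationalTaskWithDatalake {
    /// Placeholder commitment: the real code would encode both parts
    /// and hash the result.
    fn commit(&self) -> String {
        format!("{:?}|{}", self.datalake, self.task.aggregate_fn_id)
    }
}
```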
Core
Note: this is where we should use Provider.
DatalakeCompiler
DatalakeCompiler is responsible for fetching the relevant proofs and values for a given datalake type and request, using Provider. It returns a CompiledDatalake, which stacks all the relevant onchain data the datalake requires.
Each datalake type folder contains:
- Compiled type: the Evaluator depends on this type to produce a result that can be serialized into JSON format, and the Compiler returns the compatible compiled type as its result.
- Compiler: takes a datalake instance, fetches and compiles it, and returns the compiled type (it should not consume the datalake or the provider).
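The "do not consume" point is the key signature change; a sketch under assumed names, where both the datalake and the provider are passed by reference so the caller keeps ownership:

```rust
/// Stand-in for the real Provider; the real fetch does RPC calls.
struct Provider;

impl Provider {
    /// Placeholder fetch returning dummy data for the sketch.
    fn fetch_value(&self, block: u64) -> u64 {
        block * 2
    }
}

/// Minimal block-sampled datalake for the sketch.
struct BlockSampledDatalake {
    start: u64,
    end: u64,
}

/// Compiled result stacking the fetched onchain values (proofs elided).
struct CompiledBlockSampledDatalake {
    values: Vec<u64>,
}

/// Compiles by borrowing: neither `datalake` nor `provider` is moved,
/// so both stay usable after the call.
fn compile(
    datalake: &BlockSampledDatalake,
    provider: &Provider,
) -> CompiledBlockSampledDatalake {
    let values = (datalake.start..=datalake.end)
        .map(|b| provider.fetch_value(b))
        .collect();
    CompiledBlockSampledDatalake { values }
}
```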
Evaluator
The Evaluator is responsible for returning an EvaluationResult that can be serialized into JSON compatible with the input.json format.
EvaluationResult embeds the compiled type, but also contains the encoded values from the evaluator. It is serialized directly into the JSON files output.json and input.json.
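A sketch of that shape. The field names and the JSON layout are assumptions, and the serialization is hand-rolled here to stay dependency-free; the real code would derive it (e.g. via serde) to match the actual input.json schema.

```rust
/// Embeds the compiled values plus the evaluator's encoded result,
/// and serializes straight to JSON text (fields assumed for the sketch).
struct EvaluationResult {
    compiled_values: Vec<u64>,
    encoded_result: String,
}

impl EvaluationResult {
    /// Hand-rolled JSON for illustration only; the real schema is
    /// whatever input.json / output.json require.
    fn to_json(&self) -> String {
        let values: Vec<String> =
            self.compiled_values.iter().map(|v| v.to_string()).collect();
        format!(
            "{{\"values\":[{}],\"result\":\"{}\"}}",
            values.join(","),
            self.encoded_result
        )
    }
}
```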
Provider
(Not in scope for the current refactoring.) Fetches proofs and values.