deep rework: hdp-core and hdp-primitives refactoring #50

Closed · rkdud007 opened this issue Mar 27, 2024 · 1 comment

Comments

rkdud007 (Member) commented Mar 27, 2024

I realize it's really time for a deep rework. The goals of the rework are:

  1. Clearly categorize each component's scope.
  2. Design in a way that scales across multiple datalake types.
  3. Yes, I wrote some spaghetti code to ship fast. We should reconsider value consumption (avoid needlessly moving/consuming values).

This issue will also serve as groundwork for documentation.

Design

Primitives

Block

Ethereum block fields needed for datalakes, in two representations (sketched below):

  1. RPC type that the provider deserializes the response into (alloy-rpc-type dependent).
  2. Consensus type that is RLP encodable/decodable. This should be compatible with the actual Ethereum trie implementation (alloy-consensus-type dependent).
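
A rough illustration of this split (a minimal sketch; the field sets and type names here are assumptions, and the real types would come from the alloy crates named above):

```rust
// Two hypothetical header representations; real types would come from
// the alloy crates.

/// RPC type: what the provider deserializes a JSON-RPC response into.
pub struct RpcBlockHeader {
    pub number: u64,
    pub hash: [u8; 32],
    pub parent_hash: [u8; 32],
    // ... remaining header fields needed by datalakes
}

/// Consensus type: RLP encodable/decodable, so its encoding (and hash)
/// matches the actual Ethereum trie implementation.
pub struct ConsensusBlockHeader {
    pub parent_hash: [u8; 32],
    pub state_root: [u8; 32],
    pub number: u64,
    // ... full consensus field set, in RLP order
}

impl ConsensusBlockHeader {
    /// RLP-encode the header (sketch only; a real implementation would
    /// delegate to an RLP library).
    pub fn rlp_encode(&self) -> Vec<u8> {
        todo!("delegate to an RLP encoder")
    }
}
```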

Note: Neither Datalake nor Task will contain compile logic within the primitives scope.

Datalake

Each datalake type folder contains:

  1. Acceptable fields: these field indexes will be kept in sync across Cairo and Solidity. Each datalake's collection will have its own property field enum.
  2. Datalake collection: binds with sampled_property, and can be serialized to and deserialized from a Vec format.
  3. Datalake instance: supports encode / decode / commit. A sketch of one such module follows.
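
A minimal sketch of what one such module might look like; all names and encodings here are illustrative assumptions, not the actual API:

```rust
// Hypothetical module for one datalake type, mirroring the three pieces above.

/// 1. Acceptable fields: the discriminants must stay in sync with the
///    Cairo and Solidity implementations.
#[derive(Clone, Copy, Debug)]
#[repr(u8)]
pub enum BlockField {
    ParentHash = 0,
    StateRoot = 1,
    Number = 2,
    // ...
}

/// 2. Datalake collection: binds with `sampled_property` and round-trips
///    through a byte vector.
pub struct BlockCollection {
    pub field: BlockField,
}

impl BlockCollection {
    pub fn serialize(&self) -> Vec<u8> {
        vec![self.field as u8]
    }

    pub fn deserialize(bytes: &[u8]) -> Option<Self> {
        let field = match *bytes.first()? {
            0 => BlockField::ParentHash,
            1 => BlockField::StateRoot,
            2 => BlockField::Number,
            _ => return None,
        };
        Some(Self { field })
    }
}

/// 3. Datalake instance: supports encode / decode / commit.
pub struct BlockSampledDatalake {
    pub block_range_start: u64,
    pub block_range_end: u64,
    pub sampled_property: BlockCollection,
}

impl BlockSampledDatalake {
    pub fn encode(&self) -> Vec<u8> {
        todo!("ABI-encode the fields")
    }

    pub fn commit(&self) -> [u8; 32] {
        todo!("e.g. keccak256 over encode()")
    }
}
```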

At the root:

  1. DatalakeEnvelope: an enum that embeds the different Datalake types.
  2. DatalakeType: an enum that only identifies the type.
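
For illustration, the two root enums might look like this (the variant names are assumptions):

```rust
// Placeholders standing in for the per-type instances sketched above.
pub struct BlockSampledDatalake;
pub struct DynamicLayoutDatalake;

/// Embeds the concrete datalake types.
pub enum DatalakeEnvelope {
    BlockSampled(BlockSampledDatalake),
    DynamicLayout(DynamicLayoutDatalake),
}

/// Identifies the type only; carries no data.
#[derive(Clone, Copy, PartialEq, Eq)]
pub enum DatalakeType {
    BlockSampled = 0,
    DynamicLayout = 1,
}

impl DatalakeEnvelope {
    /// Map an envelope to its type identifier.
    pub fn datalake_type(&self) -> DatalakeType {
        match self {
            DatalakeEnvelope::BlockSampled(_) => DatalakeType::BlockSampled,
            DatalakeEnvelope::DynamicLayout(_) => DatalakeType::DynamicLayout,
        }
    }
}
```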

Task

  1. ComputationalTaskWithDatalake: the type that embeds the datalake envelope; supports encode / decode / commit.
  2. ComputationalTask: supports encode / decode / commit. Both are sketched below.
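
A sketch of the two task types; the aggregate-function fields are assumptions, and DatalakeEnvelope stands in for the enum sketched earlier:

```rust
/// Placeholder for the envelope enum from the Datalake section.
pub enum DatalakeEnvelope {}

pub struct ComputationalTask {
    /// Identifier of the aggregate function to run (assumed field).
    pub aggregate_fn_id: String,
    /// Optional function context (assumed field).
    pub aggregate_fn_ctx: Option<Vec<u8>>,
}

impl ComputationalTask {
    pub fn encode(&self) -> Vec<u8> {
        todo!("encode the aggregate function id and context")
    }

    pub fn commit(&self) -> [u8; 32] {
        todo!("e.g. keccak256 over encode()")
    }
}

/// Embeds the datalake envelope alongside the task.
pub struct ComputationalTaskWithDatalake {
    pub datalake: DatalakeEnvelope,
    pub task: ComputationalTask,
}

impl ComputationalTaskWithDatalake {
    pub fn encode(&self) -> Vec<u8> {
        todo!("encode the envelope together with the task")
    }

    pub fn commit(&self) -> [u8; 32] {
        todo!("e.g. keccak256 over encode()")
    }
}
```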

Core

Note: this is now the layer where we should use the Provider.

DatalakeCompiler

DatalakeCompiler is responsible for fetching the relevant proofs and values for the given datalake type and request, using the Provider. It returns a CompiledDatalake, which stacks all the relevant onchain data required by the datalake.

Each datalake type folder contains:

  1. Compiled type: the Evaluator depends on this type to return a result format that can be serialized into JSON, and the Compiler returns a compatible compiled type as its result.
  2. Compiler: takes a datalake instance, fetches and compiles it, and returns the compiled type (it should borrow, not consume, the datalake and the provider). See the sketch below.
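
A sketch of the compiled type and the compiler signature, emphasizing that the inputs are borrowed rather than consumed (all names illustrative):

```rust
pub struct BlockSampledDatalake; // placeholder; see the Primitives sketch
pub struct Provider;             // placeholder; fetches proofs and values

#[derive(Debug)]
pub struct CompileError;

/// Compiled type: stacks all the relevant onchain data in a result format
/// that the Evaluator can serialize into JSON.
pub struct CompiledBlockSampledDatalake {
    pub values: Vec<Vec<u8>>,
    pub proofs: Vec<Vec<u8>>,
}

/// Compiler: fetch and compile, borrowing its inputs.
pub fn compile_block_sampled(
    datalake: &BlockSampledDatalake, // borrowed, not consumed
    provider: &Provider,             // borrowed, not consumed
) -> Result<CompiledBlockSampledDatalake, CompileError> {
    // 1. fetch the relevant proofs and values via the provider
    // 2. stack them into the compiled type
    let _ = (datalake, provider);
    todo!()
}
```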

Evaluator

The Evaluator is responsible for returning an EvaluationResult that can be serialized into JSON compatible with the input.json format.

EvaluationResult embeds the compiled type, but also contains values encoded by the evaluator. It is serialized directly into the JSON file output.json or input.json.
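
For illustration, assuming serde / serde_json handle the JSON step (the field shapes are assumptions):

```rust
use serde::Serialize;

#[derive(Serialize)]
pub struct EvaluationResult {
    /// Compiled onchain data per task (the embedded compiled type).
    pub compiled_results: Vec<String>,
    /// Values encoded by the evaluator itself.
    pub encoded_results: Vec<String>,
}

impl EvaluationResult {
    /// Serialize directly into the input.json / output.json shape.
    pub fn to_json(&self) -> serde_json::Result<String> {
        serde_json::to_string_pretty(self)
    }
}
```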

Provider

(Not in scope for the current refactoring.) Fetches proofs and values.
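
For context only, one minimal shape the Provider could have (the method names and signatures are assumptions, not the actual API):

```rust
pub trait Provider {
    /// Fetch the values a datalake references over a block range.
    fn fetch_values(&self, block_range: std::ops::Range<u64>) -> Vec<Vec<u8>>;
    /// Fetch the corresponding proofs.
    fn fetch_proofs(&self, block_range: std::ops::Range<u64>) -> Vec<Vec<u8>>;
}
```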

rkdud007 mentioned this issue Mar 28, 2024

rkdud007 (Member, Author) commented Apr 4, 2024

close with #51

rkdud007 closed this as completed Apr 4, 2024