Solid EvmProvider, alloy dependency version bumped to v0.1.1 (#96)
* provider rework wip

* chore: unified alloy type, finalized reworked provider

* hotfix: with test

* hotfix to make it work

* clean up log

* chore

* bumped `eth-trie-proofs` to 0.1.0

* feat: tx provider error handling

* chore: provider docs, rough bench for provider

* feat: update storage test case

* chore: clean up

* wip

* chore: provider cleanup

* chore: cleaner

* fix: validation, bump `eth-trie-proof` 0.1.1

* chore: docs, readme
rkdud007 committed Jun 26, 2024
1 parent 466bff2 commit 899403b
Showing 71 changed files with 3,344 additions and 2,191 deletions.
1,148 changes: 1,030 additions & 118 deletions Cargo.lock

Large diffs are not rendered by default.

16 changes: 8 additions & 8 deletions Cargo.toml
@@ -1,6 +1,6 @@
 [workspace]
 resolver = "2"
-members = ["cli", "crates/core", "crates/provider", "crates/primitives"]
+members = ["cli", "crates/core", "crates/primitives", "crates/provider"]
 
 [workspace.package]
 version = "0.2.0"
@@ -14,10 +14,10 @@ hdp-primitives = { version = "0.2.0", path = "crates/primitives" }
 hdp-provider = { version = "0.2.0", path = "crates/provider" }
 tokio = { version = "1", features = ["full"] }
 tempfile = "3.10.1"
-alloy-dyn-abi = "0.6.2"
-alloy-primitives = { version = "0.6.2", feature = ["rlp"] }
-alloy-merkle-tree = { version = "0.5.0" }
-alloy-rlp = { version = "0.3.4", features = ["derive"] }
+alloy-merkle-tree = { version = "0.6.0" }
+alloy-rpc-client = { version = "0.1.1" }
+alloy = { version = "0.1.1", features = ["full"] }
+alloy-rlp = { version = "0.3.5", features = ["derive"] }
 anyhow = "1.0.79"
 serde = { version = "1.0", features = ["derive"] }
 serde_with = "2.3.2"
@@ -31,6 +31,6 @@ starknet-crypto = "0.6.1"
 cairo-lang-starknet-classes = "2.6.3"
 futures = "0.3.30"
 lazy_static = "1.4.0"
-
-# TODO: ideally should published
-eth-trie-proofs = { git = "https://github.com/HerodotusDev/eth-trie-proofs.git", branch = "main" }
+thiserror = "1.0"
+eth-trie-proofs = "0.1.1"
+itertools = "0.10"
61 changes: 32 additions & 29 deletions README.md
@@ -132,35 +132,38 @@ The core soundness of HDP relies on generating the correct input file and runnin
 Here is the support matrix indicating which blockchain elements are tested for each aggregate function. The matrix highlights fields where these functions are applicable.
-| Field Description | SUM | AVG | MIN | MAX | COUNT | SLR |
-| ----------------------------- | --- | --- | --- | --- | ----- | --- |
-| `account.nonce` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `account.balance` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `account.storage_root` | - | - | - | - | - | - |
-| `account.code_hash` | - | - | - | - | - | - |
-| `storage.key` (numeric value) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `storage.key` (hash value) | - | - | - | - | - | - |
-| `header.difficulty` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.gas_limit` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.gas_used` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.timestamp` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.base_fee_per_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.blob_gas_used` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.excess_blob_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `header.nonce` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| Other `header` elements | - | - | - | - | - | - |
-| `tx.nonce` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.gas_price` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.gas_limit` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.value` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.v` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.r` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.s` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.chain_id` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.max_fee_per_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.max_priority_fee_per_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| `tx.max_fee_per_blob_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-| Other `tx` elements | - | - | - | - | - | - |
+| Field Description | SUM | AVG | MIN | MAX | COUNT | SLR |
+| -------------------------------- | --- | --- | --- | --- | ----- | --- |
+| `account.nonce` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `account.balance` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `account.storage_root` | - | - | - | - | - | - |
+| `account.code_hash` | - | - | - | - | - | - |
+| `storage.key` (numeric value) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `storage.key` (hash value) | - | - | - | - | - | - |
+| `header.difficulty` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.gas_limit` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.gas_used` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.timestamp` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.base_fee_per_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.blob_gas_used` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.excess_blob_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `header.nonce` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Other `header` elements | - | - | - | - | - | - |
+| `tx.nonce` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.gas_price` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.gas_limit` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.value` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.v` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.r` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.s` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.chain_id` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.max_fee_per_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.max_priority_fee_per_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx.max_fee_per_blob_gas` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Other `tx` elements | - | - | - | - | - | - |
+| `tx_receipt.success` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| `tx_receipt.cumulative_gas_used` | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Other `tx_receipt` elements | - | - | - | - | - | - |
 _Note: Fields marked with "-" are not applicable for the specified aggregate functions because they do not contain numeric data or the data type is not suitable for these calculations._
2 changes: 1 addition & 1 deletion cli/Cargo.toml
@@ -27,9 +27,9 @@ tokio = { workspace = true }
 tracing = { workspace = true }
 hdp-provider = { workspace = true }
 hdp-primitives = { workspace = true }
-alloy-primitives = { workspace = true }
 serde_json = { workspace = true }
 clap = { version = "4.4.4", features = ["derive"] }
 dotenv = "0.15.0"
 tracing-subscriber = "0.3.0"
 inquire = "0.7.4"
+alloy = { workspace = true }
40 changes: 23 additions & 17 deletions cli/src/main.rs
@@ -1,6 +1,6 @@
 #![deny(unused_crate_dependencies)]
 
-use alloy_primitives::U256;
+use alloy::{hex, primitives::U256};
 use anyhow::{bail, Result};
 use hdp_primitives::{
     aggregate_fn::{integer::Operator, FunctionContext},
@@ -18,21 +18,20 @@ use hdp_primitives::{
     },
     processed_types::cairo_format::AsCairoFormat,
 };
+use hdp_provider::evm::provider::EvmProviderConfig;
 use inquire::{error::InquireError, Select};
 use std::{fs, path::PathBuf, str::FromStr, vec};
 use tracing_subscriber::FmtSubscriber;
 
 use clap::{Parser, Subcommand};
 use hdp_core::{
     codec::datalake_compute::DatalakeComputeCodec,
-    compiler::{module::ModuleCompilerConfig, CompilerConfig},
+    compiler::module::ModuleCompilerConfig,
     config::Config,
-    pre_processor::PreProcessor,
+    pre_processor::{PreProcessor, PreProcessorConfig},
     processor::Processor,
 };
-
-use hdp_provider::evm::AbstractProviderConfig;
 
 use tracing::{error, info, Level};
 
 /// Simple Herodotus Data Processor CLI to handle tasks and datalakes
@@ -174,17 +173,17 @@ async fn handle_run(
     let url: &str = "http://localhost:3030";
     let program_path = "./build/compiled_cairo/hdp.json";
     let config = Config::init(rpc_url, datalakes, tasks, chain_id).await;
-    let provider_config = AbstractProviderConfig {
-        rpc_url: &config.rpc_url,
+    let datalake_config = EvmProviderConfig {
+        rpc_url: config.rpc_url.parse().expect("Failed to parse RPC URL"),
         chain_id: config.chain_id,
-        rpc_chunk_size: config.rpc_chunk_size,
+        max_requests: config.rpc_chunk_size,
     };
     let module_config = ModuleCompilerConfig {
         module_registry_rpc_url: url.parse().unwrap(),
         program_path: PathBuf::from(&program_path),
     };
-    let compiler_config = CompilerConfig::new(provider_config.clone(), module_config);
-    let preprocessor = PreProcessor::new_with_config(compiler_config);
+    let preprocessor_config = PreProcessorConfig::new(datalake_config, module_config);
+    let preprocessor = PreProcessor::new_with_config(preprocessor_config);
     let result = preprocessor
         .process_from_serialized(config.datalakes.clone(), config.tasks.clone())
         .await?;
@@ -208,7 +207,7 @@ async fn handle_run(
         Ok(())
     } else {
         let output_file_path = output_file.unwrap();
-        let processor = Processor::new(provider_config, PathBuf::from(program_path));
+        let processor = Processor::new(PathBuf::from(program_path));
        let processor_result = processor.process(result, pie_file.unwrap()).await?;
         let output_string = serde_json::to_string_pretty(&processor_result).unwrap();
         fs::write(&output_file_path, output_string).expect("Unable to write file");
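
Together, the two hunks above swap the old `AbstractProviderConfig`, which was threaded into both the compiler and `Processor::new`, for an `EvmProviderConfig` that feeds a `PreProcessorConfig`; the processor now takes only the program path. A condensed sketch of the new wiring, using only types and field names visible in this diff (the URL, chain id, and request-cap values are placeholders):

use std::path::PathBuf;

use hdp_core::{
    compiler::module::ModuleCompilerConfig,
    pre_processor::{PreProcessor, PreProcessorConfig},
};
use hdp_provider::evm::provider::EvmProviderConfig;

fn main() {
    let datalake_config = EvmProviderConfig {
        rpc_url: "https://example-rpc.invalid".parse().expect("Failed to parse RPC URL"),
        chain_id: 11155111,
        max_requests: 100,
    };
    let module_config = ModuleCompilerConfig {
        module_registry_rpc_url: "http://localhost:3030".parse().unwrap(),
        program_path: PathBuf::from("./build/compiled_cairo/hdp.json"),
    };
    // The provider config now feeds the preprocessor only.
    let preprocessor =
        PreProcessor::new_with_config(PreProcessorConfig::new(datalake_config, module_config));
    let _ = preprocessor;
}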
@@ -507,8 +506,8 @@ async fn main() -> Result<()> {
             .prompt()?;
 
         handle_run(
-            Some(encoded_computes),
-            Some(encoded_datalakes),
+            Some(hex::encode(encoded_computes)),
+            Some(hex::encode(encoded_datalakes)),
             rpc_url,
             chain_id,
             Some(output_file),
@@ -570,11 +569,14 @@ async fn main() -> Result<()> {
             let datalake_compute_codec = DatalakeComputeCodec::new();
             let (encoded_datalakes, encoded_computes) =
                 datalake_compute_codec.encode_batch(vec![target_datalake_compute])?;
+
+            let encoded_computes_str = hex::encode(encoded_computes);
+            let encoded_datalakes_str = hex::encode(encoded_datalakes);
             // if allow_run is true, then run the evaluator
             if allow_run {
                 handle_run(
-                    Some(encoded_computes),
-                    Some(encoded_datalakes),
+                    Some(encoded_computes_str),
+                    Some(encoded_datalakes_str),
                     rpc_url,
                     chain_id,
                     output_file,
@@ -586,11 +588,15 @@ async fn main() -> Result<()> {
         }
         Commands::Decode { tasks, datalakes } => {
             let datalake_compute_codec = DatalakeComputeCodec::new();
-            datalake_compute_codec.decode_batch(datalakes, tasks)?;
+            let tasks = hex::decode(tasks)?;
+            let datalakes = hex::decode(datalakes)?;
+            datalake_compute_codec.decode_batch(&datalakes, &tasks)?;
         }
         Commands::DecodeOne { task, datalake } => {
             let datalake_compute_codec = DatalakeComputeCodec::new();
-            datalake_compute_codec.decode_single(datalake, task)?;
+            let task = hex::decode(task)?;
+            let datalake = hex::decode(datalake)?;
+            datalake_compute_codec.decode_single(&datalake, &task)?;
         }
         Commands::Run {
             tasks,
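With the changes above, the CLI passes task and datalake payloads to `handle_run` as hex strings, and `Decode`/`DecodeOne` hex-decode their inputs before calling the codec. A small round-trip sketch of that boundary convention using `alloy::hex` (the payload bytes are made up for illustration):

use alloy::hex;
use anyhow::Result;

fn main() -> Result<()> {
    let encoded_datalakes: Vec<u8> = vec![0x01, 0x02, 0x03];
    // Outbound: byte payloads travel as hex strings (no "0x" prefix from `encode`).
    let as_hex = hex::encode(&encoded_datalakes);
    assert_eq!(as_hex, "010203");
    // Inbound: decode back to bytes before handing them to the codec.
    let round_trip = hex::decode(&as_hex)?;
    assert_eq!(round_trip, encoded_datalakes);
    Ok(())
}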
3 changes: 1 addition & 2 deletions crates/core/Cargo.toml
@@ -9,9 +9,8 @@ version.workspace = true
 [dependencies]
 hdp-provider = { workspace = true }
 hdp-primitives = { workspace = true }
-alloy-primitives = { workspace = true }
 alloy-merkle-tree = { workspace = true }
-alloy-dyn-abi = { workspace = true }
+alloy = { workspace = true }
 anyhow = { workspace = true }
 cairo-lang-starknet-classes.workspace = true
 starknet-crypto.workspace = true
5 changes: 3 additions & 2 deletions crates/core/src/cairo_runner/input/run.rs
@@ -1,5 +1,6 @@
-use hdp_primitives::processed_types::{cairo_format, module::ProcessedModule};
-use hdp_provider::evm::ProcessedBlockProofs;
+use hdp_primitives::processed_types::{
+    block_proofs::ProcessedBlockProofs, cairo_format, module::ProcessedModule,
+};
 use serde::Serialize;
 
 /*
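This hunk only relocates `ProcessedBlockProofs` from the provider crate into `hdp_primitives::processed_types::block_proofs`; consumers change nothing but the `use` path. A sketch of a consumer after the move (the struct and its field are hypothetical, echoing how this input module serializes proofs):

use hdp_primitives::processed_types::block_proofs::ProcessedBlockProofs;
use serde::Serialize;

#[derive(Serialize)]
pub struct RunInput {
    // Hypothetical field: proofs serialized into the Cairo runner's input file.
    pub proofs: ProcessedBlockProofs,
}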
21 changes: 14 additions & 7 deletions crates/core/src/cairo_runner/run.rs
@@ -1,8 +1,10 @@
+use alloy::primitives::{B256, U256};
 use anyhow::Result;
 use hdp_primitives::processed_types::uint256::Uint256;
 use std::fs;
 use std::path::{Path, PathBuf};
 use std::process::{Command, Stdio};
+use std::str::FromStr;
 use tempfile::NamedTempFile;
 use tracing::info;

@@ -13,8 +15,8 @@ use regex::Regex;
 #[derive(Debug)]
 pub struct RunResult {
     pub pie_path: PathBuf,
-    pub task_results: Vec<String>,
-    pub results_root: String,
+    pub task_results: Vec<U256>,
+    pub results_root: B256,
 }
 
 pub struct Runner {
@@ -68,20 +70,20 @@ impl Runner {
     }
 
     /// Parse the output of the run command
-    fn parse_run(&self, output: String) -> Result<(Vec<String>, String)> {
+    fn parse_run(&self, output: String) -> Result<(Vec<U256>, B256)> {
         let task_result_re = Regex::new(r"Task Result\((\d+)\): (\S+)").unwrap();
         let mut task_results = vec![];
         for caps in task_result_re.captures_iter(&output) {
             let _ = &caps[1];
             let value = &caps[2];
-            task_results.push(value.to_string());
+            task_results.push(U256::from_str(value)?);
         }
         let results_root_re = Regex::new(r"Results Root: (\S+) (\S+)").unwrap();
         if let Some(results_root_caps) = results_root_re.captures(&output) {
             let results_root_1 = &results_root_caps[1];
             let results_root_2 = &results_root_caps[2];
             let result_root = Uint256::from_strs(results_root_2, results_root_1)?;
-            let combined_results_root = result_root.to_combined_string().to_string();
+            let combined_results_root = result_root.to_combined_string();
             Ok((task_results, combined_results_root))
         } else {
             bail!("Results Root not found");
@@ -100,10 +102,15 @@ mod tests {
         let output = "Task Result(0): 0x01020304\nResults Root: 0x01020304 0x05060708";
         let (task_results, results_root) = runner.parse_run(output.to_string()).unwrap();
         assert_eq!(task_results.len(), 1);
-        assert_eq!(task_results[0], "0x01020304");
+        assert_eq!(
+            task_results[0],
+            U256::from_str_radix("01020304", 16).unwrap()
+        );
         assert_eq!(
             results_root,
-            "0x0000000000000000000000000506070800000000000000000000000001020304"
+            Uint256::from_strs("05060708", "01020304")
+                .unwrap()
+                .to_combined_string()
         );
     }
 }
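
The rewritten test compares against `Uint256::from_strs(high, low).to_combined_string()` instead of a hard-coded hex string. Judging from the old expected value, the combined root packs the high 128-bit limb into the upper 16 bytes and the low limb into the lower 16. A rough re-implementation with plain alloy types, to show the layout being asserted (illustration only, not the actual `Uint256` code):

use std::str::FromStr;

use alloy::primitives::{B256, U256};

fn combine(high: &str, low: &str) -> B256 {
    let high = U256::from_str_radix(high, 16).unwrap();
    let low = U256::from_str_radix(low, 16).unwrap();
    // High half occupies bits 128..256, low half bits 0..128.
    B256::from((high << 128) | low)
}

fn main() {
    let root = combine("05060708", "01020304");
    let expected =
        B256::from_str("0x0000000000000000000000000506070800000000000000000000000001020304")
            .unwrap();
    assert_eq!(root, expected);
}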
