release v0.2.5 (#98)
* version bump to 0.2.5

* cargo.toml cleanup, readme update

* revert unexpected change
rkdud007 authored Jun 26, 2024
1 parent 899403b commit 485a86d
Showing 8 changed files with 52 additions and 28 deletions.
8 changes: 4 additions & 4 deletions Cargo.lock

Some generated files are not rendered by default.

12 changes: 11 additions & 1 deletion Cargo.toml
```diff
@@ -3,10 +3,20 @@ resolver = "2"
 members = ["cli", "crates/core", "crates/primitives", "crates/provider"]
 
 [workspace.package]
-version = "0.2.0"
+version = "0.2.5"
 edition = "2021"
 license-file = "LICENSE"
 authors = ["Pia <pia@herodotus.dev>"]
+repository = "https://github.com/HerodotusDev/hdp"
+homepage = "https://herodotus.dev/"
+exclude = ["benches/", "tests/", "fixtures/"]
+keywords = ["blockchain", "ethereum", "rust", "data-processor", "storage-proof"]
+categories = [
+    "command-line-interface",
+    "cryptography::cryptocurrencies",
+    "compilers",
+    "asynchronous",
+]
 
 [workspace.dependencies]
 hdp-core = { version = "0.2.0", path = "crates/core" }
```
16 changes: 8 additions & 8 deletions README.md
````diff
@@ -25,7 +25,7 @@ The Data Processor CLI serves as an essential tool for developers working with C
 
 ```bash
 # Install with cargo
-❯ cargo install --git https://github.com/HerodotusDev/hdp --locked --force
+❯ cargo install --git https://github.com/HerodotusDev/hdp --tag v0.2.5 --locked --force
 ```
 
 ### Build from source
````
````diff
@@ -63,25 +63,25 @@ The following examples demonstrate how to use the HDP CLI to encode various bloc
 Header value with `AVG`:
 ```
-hdp encode "avg" -b 4952100 4952110 "header.base_fee_per_gas" 1
+hdp encode -a -c {input.file} "avg" -b 4952100 4952110 "header.base_fee_per_gas" 1
 ```
 Account value with `SUM`:
 ```
-hdp encode "sum" -b 4952100 4952110 "account.0x7f2c6f930306d3aa736b3a6c6a98f512f74036d4.nonce" 2
+hdp encode -a -c {input.file} "sum" -b 4952100 4952110 "account.0x7f2c6f930306d3aa736b3a6c6a98f512f74036d4.nonce" 2
 ```
 Storage value with `AVG`:
 ```
-hdp encode "avg" -b 5382810 5382820 "storage.0x75CeC1db9dCeb703200EAa6595f66885C962B920.0x0000000000000000000000000000000000000000000000000000000000000002" 1
+hdp encode -a -c {input.file} "avg" -b 5382810 5382820 "storage.0x75CeC1db9dCeb703200EAa6595f66885C962B920.0x0000000000000000000000000000000000000000000000000000000000000002" 1
 ```
 Account value with `COUNT`:
 ```
-hdp encode "count" "gt.1000" -b 4952100 4952110 "account.0x7f2c6f930306d3aa736b3a6c6a98f512f74036d4.nonce" 2
+hdp encode -a -c {input.file} "count" "gt.1000" -b 4952100 4952110 "account.0x7f2c6f930306d3aa736b3a6c6a98f512f74036d4.nonce" 2
 ```
 After encoding, you can directly run processing tasks using environmental configurations for RPC and Chain ID, as shown below:
````
```diff
@@ -185,17 +185,17 @@ For developers interested in extending the functionality of HDP by adding new mo
 ### Getting Started
-1. **Module Location**: Start by creating a new module within the `aggregate_fn` directory. You can find this at [aggregation_fn/mod.rs](./crates/core/src/aggregate_fn).
+1. **Module Location**: Start by creating a new module within the `aggregate_fn` directory. You can find this at [aggregation_fn/mod.rs](./crates/primitives/src/aggregate_fn).
-2. **Define Enum**: Define your new function as an enum in the [file](./crates/core/src/aggregate_fn). Make sure to add match arms for the new enum variants in the implementation.
+2. **Define Enum**: Define your new function as an enum in the [file](./crates/primitives/src/aggregate_fn). Make sure to add match arms for the new enum variants in the implementation.
 3. **Handle Data Types**: Depending on the expected input type for your function:
    - **Integer Inputs**: Use [`U256`](https://docs.rs/alloy-primitives/latest/alloy_primitives/index.html#reexport.U256) for handling large integers compatible with Ethereum's numeric constraints.
    - **String Inputs**: Use Rust's standard `String` type for text data.
```
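The enum-and-match pattern the README steps above describe can be sketched roughly as follows. This is a hedged illustration, not HDP's actual code: the names `AggregateFn` and `apply` are invented, and `u128` stands in for `alloy_primitives::U256` so the sketch stays dependency-free.

```rust
// Illustrative sketch only — not HDP's real types. Adding a new aggregate
// function means adding an enum variant plus a match arm in `apply`.
// `u128` approximates U256 here to avoid external crates.

#[derive(Debug, Clone, Copy)]
enum AggregateFn {
    Avg,
    Sum,
}

impl AggregateFn {
    /// Apply the function over the fetched integer values.
    fn apply(&self, values: &[u128]) -> u128 {
        match self {
            // A new function would add another arm here.
            AggregateFn::Sum => values.iter().sum(),
            AggregateFn::Avg => {
                if values.is_empty() {
                    0
                } else {
                    values.iter().sum::<u128>() / values.len() as u128
                }
            }
        }
    }
}

fn main() {
    let base_fees = [100u128, 200, 300];
    println!("sum = {}", AggregateFn::Sum.apply(&base_fees));
    println!("avg = {}", AggregateFn::Avg.apply(&base_fees));
}
```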
```diff
 ### Context Required Operation
-For a practical example of how to implement context-sensitive operations, refer to the implementation of the [`COUNT`](./crates/core/src/aggregate_fn/integer.rs#L118) function. This example shows how to pass and utilize additional context for operations, which can be particularly useful for conditional processing or complex calculations.
+For a practical example of how to implement context-sensitive operations, refer to the implementation of the `COUNT` function. This example shows how to pass and utilize additional context for operations, which can be particularly useful for conditional processing or complex calculations.
 During `SLR` computation, we also need a context to use as the target index for computation. Since `SLR` is not supported during the preprocessing step, we simply pass the encoded task that contains the function context, and the Cairo program will handle this computation based on the provided index.
```
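A context-sensitive operation like `COUNT` (invoked as `"count" "gt.1000"` in the encode examples above) might look roughly like this sketch. Everything here is an assumption rather than the actual `integer.rs` implementation: the function name `count_with_context`, the supported operator set, and the error handling are invented; only the `"<op>.<value>"` context shape is inferred from the CLI syntax shown in the README.

```rust
// Hypothetical sketch of a context-sensitive aggregate: the context string
// ("gt.1000") carries an operator and a threshold that parameterize COUNT.

fn count_with_context(values: &[u128], ctx: &str) -> Result<u128, String> {
    // Context is assumed to be "<op>.<threshold>", e.g. "gt.1000".
    let (op, threshold) = ctx
        .split_once('.')
        .ok_or_else(|| format!("malformed context: {ctx}"))?;
    let threshold: u128 = threshold
        .parse()
        .map_err(|e| format!("bad threshold: {e}"))?;
    let matched = values
        .iter()
        .filter(|v| match op {
            "gt" => **v > threshold,
            "gte" => **v >= threshold,
            "lt" => **v < threshold,
            "lte" => **v <= threshold,
            "eq" => **v == threshold,
            _ => false,
        })
        .count();
    Ok(matched as u128)
}

fn main() {
    let nonces = [900u128, 1000, 1500, 2000];
    println!("{:?}", count_with_context(&nonces, "gt.1000"));
}
```

Keeping the context as a plain string means the same encoded task can carry it through unchanged to later stages, in the spirit of the SLR note above where the encoded task's function context is handed off to the Cairo program.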
25 changes: 11 additions & 14 deletions cli/Cargo.toml
```diff
@@ -1,20 +1,17 @@
 [package]
 name = "hdp-cli"
-version = "0.2.0"
-edition = "2021"
-license-file = "LICENSE"
-description = "Interact Herodotus Data Processor via CLI"
-authors = ["Pia <pia@herodotus.dev>"]
-homepage = "https://herodotus.dev/"
-repository = "https://github.com/HerodotusDev/hdp"
+description = "Interactive Herodotus Data Processor via CLI"
+edition.workspace = true
+license-file.workspace = true
+version.workspace = true
+repository.workspace = true
+homepage.workspace = true
+exclude.workspace = true
+keywords.workspace = true
+categories.workspace = true
+authors.workspace = true
+readme = "../README.md"
-keywords = ["blockchain", "ethereum", "rust", "data-processor", "storage-proof"]
-categories = [
-    "command-line-interface",
-    "cryptography::cryptocurrencies",
-    "compilers",
-    "asynchronous",
-]
 
 
 [[bin]]
 name = "hdp"
```
4 changes: 3 additions & 1 deletion cli/src/main.rs
```diff
@@ -208,7 +208,9 @@ async fn handle_run(
     } else {
         let output_file_path = output_file.unwrap();
         let processor = Processor::new(PathBuf::from(program_path));
-        let processor_result = processor.process(result, pie_file.unwrap()).await?;
+        let processor_result = processor
+            .process(result, pie_file.expect("PIE path should be specified"))
+            .await?;
         let output_string = serde_json::to_string_pretty(&processor_result).unwrap();
         fs::write(&output_file_path, output_string).expect("Unable to write file");
         info!(
```
5 changes: 5 additions & 0 deletions crates/core/Cargo.toml
```diff
@@ -5,6 +5,11 @@ edition.workspace = true
 license-file.workspace = true
 repository.workspace = true
 version.workspace = true
+exclude.workspace = true
+keywords.workspace = true
+categories.workspace = true
+authors.workspace = true
+readme = "README.md"
 
 [dependencies]
 hdp-provider = { workspace = true }
```
5 changes: 5 additions & 0 deletions crates/primitives/Cargo.toml
```diff
@@ -5,6 +5,11 @@ edition.workspace = true
 license-file.workspace = true
 repository.workspace = true
 version.workspace = true
+exclude.workspace = true
+keywords.workspace = true
+categories.workspace = true
+authors.workspace = true
+readme = "README.md"
 
 [dependencies]
 serde = { workspace = true }
```
5 changes: 5 additions & 0 deletions crates/provider/Cargo.toml
```diff
@@ -5,6 +5,11 @@ edition.workspace = true
 license-file.workspace = true
 repository.workspace = true
 version.workspace = true
+exclude.workspace = true
+keywords.workspace = true
+categories.workspace = true
+authors.workspace = true
+readme = "README.md"
 
 [dependencies]
 anyhow.workspace = true
```
