
Merge pull request #92 from HerodotusDev/hdp_neo
Hdp neo
petscheit authored Oct 29, 2024
2 parents 2b63fda + cec73ed commit 072d31d
Showing 174 changed files with 20,933 additions and 14,593 deletions.
Binary file added .github/architecture.png
8 changes: 8 additions & 0 deletions .github/workflows/pull-request.yml
@@ -38,20 +38,28 @@ jobs:
source venv/bin/activate
./tools/make/python_format_check.sh
- name: Check Python tests
run: |
source venv/bin/activate
pytest tools/*
- name: Check Cairo formatting
run: |
source venv/bin/activate
./tools/make/cairo_format_check.sh
- name: Compile Cairo files
run: |
source venv/bin/activate
make build
- name: Run Unit Cairo tests
env:
RPC_URL_MAINNET: ${{ secrets.RPC_URL_MAINNET }}
run: |
source venv/bin/activate
./tools/make/cairo_tests.sh
- name: Run Full Flow tests
run: |
source venv/bin/activate
79 changes: 11 additions & 68 deletions README.md
@@ -1,12 +1,7 @@
# Cairo HDP
# HDP Cairo

Cairo HDP is a collection of Cairo0 programs designed to verify inclusion proofs and perform computations on the data. These computations can be verified on-chain, enabling trustless operations on any historical data from Ethereum or integrated EVM chains.

## Run the Docker test
```bash
docker build -t hdp-cairo . && docker run hdp-cairo
```

## Installation and Setup

To install the required dependencies and set up the Python virtual environment, run:
@@ -35,56 +30,23 @@ The program will output the results root and tasks root. These roots can be used

## How It Works

Cairo HDP operates in three main stages. First, it verifies the passed state. Upon validation, it executes the defined tasks on the data. Finally, the tasks and results are added to a Merkle tree, returning the respective roots as output.

### 1. Verification

Verification involves several sequential steps:

#### a: Header Verification

The first step is to verify the validity of the passed headers by recreating the MMR root, proving each header's inclusion in the MMR. Since the Herodotus header accumulator stores every Ethereum header, proving inclusion in the MMR establishes that each header is valid.
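
For intuition, here is a minimal sketch of such an inclusion check in Python. The hash function, the left/right proof encoding, and the peak-bagging step are illustrative placeholders; the accumulator's actual hashing and commitment scheme are not specified in this README.

```python
from hashlib import sha256  # illustrative; the real accumulator may use a different hash


def hash_pair(left: bytes, right: bytes) -> bytes:
    return sha256(left + right).digest()


def verify_mmr_inclusion(leaf: bytes, proof: list[tuple[bytes, str]],
                         peaks: list[bytes], root: bytes) -> bool:
    """Climb from `leaf` to one of the MMR peaks using the sibling hashes,
    then check that the bagged peaks reproduce the committed root."""
    node = leaf
    for sibling, side in proof:          # side: whether the sibling sits on the "left" or "right"
        node = hash_pair(sibling, node) if side == "left" else hash_pair(node, sibling)
    if node not in peaks:
        return False
    bagged = peaks[0]
    for peak in peaks[1:]:               # fold ("bag") the peaks into a single root
        bagged = hash_pair(bagged, peak)
    return bagged == root
```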

#### b: Account and Storage Slot Verification

The second step is to verify the passed account and storage slot data's validity by checking MPT proofs against the state root from the respective header.
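
As a rough illustration, the sketch below checks only the hash linkage of an MPT proof in Python (assuming the `pycryptodome` and `rlp` packages). It is a necessary but not sufficient condition: the full verifier also walks the key nibbles through branch, extension, and leaf nodes and decodes the proven account or slot value.

```python
from Crypto.Hash import keccak  # pycryptodome; Ethereum hashes trie nodes with keccak-256
import rlp                      # trie nodes are RLP-encoded lists


def keccak256(data: bytes) -> bytes:
    return keccak.new(digest_bits=256, data=data).digest()


def check_proof_linkage(state_root: bytes, proof_nodes: list[bytes]) -> bool:
    """Necessary (not sufficient) condition for a valid MPT proof: the first
    node hashes to the state root, and every later node is referenced by hash
    from the node before it."""
    if keccak256(proof_nodes[0]) != state_root:
        return False
    for parent, child in zip(proof_nodes, proof_nodes[1:]):
        decoded = rlp.decode(parent)  # 17 items for a branch node, 2 for extension/leaf
        refs = {item for item in decoded if isinstance(item, bytes) and len(item) == 32}
        if keccak256(child) not in refs:
            return False
    return True
```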

### 2. Computation

Currently, the following operators are available:

- `min`: Returns the minimum value of the passed data.
- `max`: Returns the maximum value of the passed data.
- `sum`: Returns the sum of the passed data.
- `avg`: Returns the average of the passed data.
- `count_if`: Returns the number of elements that satisfy a condition.
- `slr`: Returns the best-fit linear regression predicted point for the supplied data.
HDP Cairo contains the logic for verifying on-chain state via storage proofs and making that state available to custom Cairo1 contract modules. To enable this, a custom syscall provides dynamic access to the verified state. The syscall bindings, along with examples, are defined in `cairo1_syscall_binding`.

These operations can be performed on any verified field, including non-numerical values like addresses or hashes, such as the `parent_hash` of a header.
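
For reference, the operators above correspond to simple computations over the list of verified values. A plain-Python sketch follows; the `values` list and the `count_if` condition are made up for illustration.

```python
def slr_predict(xs: list[float], ys: list[float], x_target: float) -> float:
    """Least-squares fit y = a + b*x, evaluated at x_target."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a + b * x_target


values = [3, 7, 5, 9]                                 # e.g. verified balances over a block range
print(min(values), max(values), sum(values))          # min / max / sum
print(sum(values) / len(values))                      # avg
print(sum(1 for v in values if v > 4))                # count_if with condition "> 4"
print(slr_predict([0, 1, 2, 3], values, 4.0))         # slr: predicted value at x = 4
```
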
### Architecture
The overall program is split into two main parts:

### 3. Output Roots

In the final step, the results and tasks are added to a Merkle tree, and the roots of these trees are returned as output. These roots can be used to extract the results from the on-chain contract, enabling multiple aggregations in a single execution.
![Architecture](.github/architecture.png)
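
A minimal sketch of the pair-wise Merkle construction described above, in Python; the actual leaf encoding and hash function used on-chain are not specified here, so `sha256` and the raw byte leaves are placeholders.

```python
from hashlib import sha256  # placeholder hash; the on-chain contract may use a different one


def merkle_root(leaves: list[bytes]) -> bytes:
    """Pair-wise Merkle root; an odd node is carried up to the next level unchanged."""
    level = [sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                nxt.append(sha256(level[i] + level[i + 1]).digest())
            else:
                nxt.append(level[i])
        level = nxt
    return level[0]


tasks_root = merkle_root([b"task-1", b"task-2"])
results_root = merkle_root([b"result-1", b"result-2"])
```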

## Adding a Custom Aggregation Function

To add a new aggregation function:

1. Add the function to `src/tasks/aggregate_functions`.
2. Integrate the function into the datalake tasks handler by updating the parameter decoder and the `execute` function.
3. Define the `fetch_trait` function for this aggregation functionality.
### 1. Storage Proof Verification
In the first stage, we verify the storage proofs found in the `hdp_input.json` file. This file contains all of the storage proofs for the state required by the contract's execution and is generated in the dry-run stage, where we mock the execution and extract the state the contract accessed. This enables dynamic access to state from the contract while ensuring everything can be proven soundly. Once this stage is complete, all of the verified state is stored in the memorizers, where it can be queried via syscall.
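
To make the flow concrete, here is a hedged Python sketch: the `hdp_input.json` field names below are hypothetical (the actual schema is produced by the dry-run tooling), and proof verification is stubbed out; the point is the verify-then-memorize loop.

```python
import json

# Hypothetical shape of hdp_input.json -- field names are illustrative only,
# not the repository's actual schema.
hdp_input = json.loads("""
{
  "proofs": {
    "headers":  [{"block_number": 20000000, "rlp": "0x...", "mmr_proof": ["0x..."]}],
    "accounts": [{"address": "0x...", "block_number": 20000000, "mpt_proof": ["0x..."]}],
    "storage":  [{"address": "0x...", "slot": "0x0", "block_number": 20000000, "mpt_proof": ["0x..."]}]
  },
  "module": {"bytecode": ["0x..."], "inputs": ["0x..."]}
}
""")

memorizers = {"header": {}, "account": {}, "storage": {}}

# Verify each proof (stubbed out here) and memorize the decoded state so the
# bootloaded contract can later query it via syscall.
for account in hdp_input["proofs"]["accounts"]:
    # a real verifier would check account["mpt_proof"] against the header's state root
    key = (account["address"], account["block_number"])
    memorizers["account"][key] = account  # store the verified, decoded account
```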

## Adding a Custom Cairo1 Module

HDP can dynamically load Cairo1 programs at runtime, allowing the creation of Cairo1 modules with aggregate function logic. To add a Cairo1 module:

1. Create a new Scarb project in `src/cairo1/`.
2. Add the new aggregation function file to `src/tasks/aggregate_functions`.
3. Define the `fetch_trait` function appropriate for this aggregation functionality.

## Fetch Trait

The `fetch_trait` is an abstract template containing datalake-specific data fetching functions. Each aggregate function must implement this template individually.
### 2. Bootloading
In this stage, we bootload the Cairo1 contract. The contract's bytecode is read from the `hdp_input.json` file and executed in the HDP bootloader. The bootloader processes the bytecode and invokes the contained syscalls. These syscalls are then processed, fetching and decoding the requested state from the memorizers and loading it into the contract's memory. This enables seamless access to verified on-chain state from contracts.
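
A conceptual Python model of this dispatch loop is sketched below; the syscall selectors, request fields, and memorizer keys are hypothetical and only illustrate how verified state flows from the memorizers back into the contract.

```python
class Memorizer:
    """Holds state that was verified in stage 1, keyed by an identifier tuple."""
    def __init__(self):
        self.store = {}

    def put(self, key, value):
        self.store[key] = value

    def get(self, key):
        return self.store[key]


def handle_syscall(selector: str, request: dict, memorizers: dict):
    """The bootloader intercepts each syscall emitted by the Cairo1 contract,
    fetches the requested state from the memorizers, decodes it, and returns
    it to the contract (here: simply returns the stored value)."""
    if selector == "get_header_field":
        return memorizers["header"].get((request["block_number"],))[request["field"]]
    if selector == "get_account_field":
        return memorizers["account"].get((request["address"], request["block_number"]))[request["field"]]
    raise ValueError(f"unknown syscall selector: {selector}")


# Example: the contract asks for an account balance at a given block.
memorizers = {"header": Memorizer(), "account": Memorizer()}
memorizers["account"].put(("0xabc", 20000000), {"balance": 123, "nonce": 1})
print(handle_syscall("get_account_field",
                     {"address": "0xabc", "block_number": 20000000, "field": "balance"},
                     memorizers))
```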

## Testing

@@ -95,22 +57,3 @@ To run the tests (from the virtual environment), execute:
```bash
make test-full
```

## Roadmap

### In Progress

- **Transaction Verifier:** Verifies and decodes raw transactions.
- Status: ![](https://geps.dev/progress/65)

### Planned

- **Merkelize:** Extract data and add it to a Merkle tree.
- **Transaction Datalake:** A datalake focused on transactions.
- **Iterative Dynamic Layout Datalake:** Iterate through a dynamic layout, such as a Solidity mapping (see the slot-derivation sketch after this list).
- **Multi Task Executions:** Run multiple tasks in a single execution.
- **Bloom Filter Aggregate:** Generate a bloom filter from the data.
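
For the Solidity-mapping case mentioned above, element slots follow the standard Solidity storage layout, `keccak256(pad32(key) ++ pad32(mapping_slot))`. A small Python sketch (the holder keys and slot number are made up for illustration):

```python
from Crypto.Hash import keccak  # pycryptodome


def mapping_element_slot(mapping_slot: int, key: int) -> int:
    """Storage slot of m[key] for a Solidity mapping declared at `mapping_slot`:
    keccak256(pad32(key) ++ pad32(mapping_slot))."""
    data = key.to_bytes(32, "big") + mapping_slot.to_bytes(32, "big")
    return int.from_bytes(keccak.new(digest_bits=256, data=data).digest(), "big")


# Iterating a mapping with known keys: derive each element's slot, then prove it.
for holder in [0xABC, 0xDEF]:
    print(hex(mapping_element_slot(3, holder)))  # mapping declared at slot 3
```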

Herodotus Dev Ltd - 2024

---
13 changes: 0 additions & 13 deletions Scarb.lock
@@ -1,19 +1,6 @@
# Code generated by scarb DO NOT EDIT.
version = 1

[[package]]
name = "dry_run_test"
version = "0.1.0"
dependencies = [
"hdp_cairo",
"snforge_std",
]

[[package]]
name = "hdp_cairo"
version = "0.1.0"

[[package]]
name = "snforge_std"
version = "0.24.0"
source = "git+https://github.com/foundry-rs/starknet-foundry?tag=v0.24.0#95e9fb09cb91b3c05295915179ee1b55bf923653"
4 changes: 1 addition & 3 deletions Scarb.toml
@@ -4,9 +4,7 @@ version = "0.1.0"
edition = "2023_11"

members = [
"cairo",
"src/contracts/simple_linear_regression",
"src/contracts/dry_run_test"
"cairo1_syscall_binding"
]

[workspace.dependencies]
3 changes: 0 additions & 3 deletions cairo/src/memorizer.cairo

This file was deleted.

85 changes: 0 additions & 85 deletions cairo/src/memorizer/account_memorizer.cairo

This file was deleted.
