Merge branch 'dev' into bf-emptytx
etan-status committed Oct 9, 2024
2 parents 9d53bb9 + 28f9f07 commit 34510e3
Showing 58 changed files with 2,842 additions and 1,154 deletions.
10 changes: 10 additions & 0 deletions .github/workflows/generate_vectors.yml
@@ -52,6 +52,16 @@ jobs:
cp -r presets/ ../consensus-spec-tests/presets
cp -r configs/ ../consensus-spec-tests/configs
find . -type d -empty -delete
- name: Check for errors
run: |
if grep -q "\[ERROR\]" consensustestgen.log; then
echo "There is an error in the log"
exit 1
fi
if find . -type f -name "INCOMPLETE" | grep -q "INCOMPLETE"; then
echo "There is an INCOMPLETE file"
exit 1
fi
- name: Archive configurations
run: |
cd consensus-spec-tests
20 changes: 14 additions & 6 deletions Makefile
@@ -34,7 +34,7 @@ MARKDOWN_FILES = $(wildcard $(SPEC_DIR)/*/*.md) \
$(wildcard $(SPEC_DIR)/_features/*/*/*.md) \
$(wildcard $(SSZ_DIR)/*.md)

ALL_EXECUTABLE_SPEC_NAMES = phase0 altair bellatrix capella deneb electra whisk eip6800 eip7732
ALL_EXECUTABLE_SPEC_NAMES = phase0 altair bellatrix capella deneb electra whisk eip6800 eip7594 eip7732
# The parameters for commands. Use `foreach` to avoid listing specs again.
COVERAGE_SCOPE := $(foreach S,$(ALL_EXECUTABLE_SPEC_NAMES), --cov=eth2spec.$S.$(TEST_PRESET_TYPE))
PYLINT_SCOPE := $(foreach S,$(ALL_EXECUTABLE_SPEC_NAMES), ./eth2spec/$S)
@@ -105,7 +105,7 @@ generate_tests: $(GENERATOR_TARGETS)

# "make pyspec" to create the pyspec for all phases.
pyspec:
python3 -m venv venv; . venv/bin/activate; python3 setup.py pyspecdev
@python3 -m venv venv; . venv/bin/activate; python3 setup.py pyspecdev

# check the setup tool requirements
preinstallation:
@@ -141,13 +141,21 @@ endif
open_cov:
((open "$(COV_INDEX_FILE)" || xdg-open "$(COV_INDEX_FILE)") &> /dev/null) &

# Check all files and error if any ToC was modified.
check_toc: $(MARKDOWN_FILES:=.toc)
@[ "$$(find . -name '*.md.tmp' -print -quit)" ] && exit 1 || exit 0

# Generate ToC sections & save copy of original if modified.
%.toc:
cp $* $*.tmp && \
doctoc $* && \
diff -q $* $*.tmp && \
rm $*.tmp
@cp $* $*.tmp; \
doctoc $* > /dev/null; \
if diff -q $* $*.tmp > /dev/null; then \
echo "Good $*"; \
rm $*.tmp; \
else \
echo "\033[1;33m Bad $*\033[0m"; \
echo "\033[1;34m See $*.tmp\033[0m"; \
fi

codespell:
codespell . --skip "./.git,./venv,$(PY_SPEC_DIR)/.mypy_cache" -I .codespell-whitelist
2 changes: 1 addition & 1 deletion README.md
@@ -8,7 +8,7 @@ This repository hosts the current Ethereum proof-of-stake specifications. Discus

## Specs

[![GitHub release](https://img.shields.io/github/v/release/ethereum/eth2.0-specs)](https://github.com/ethereum/eth2.0-specs/releases/) [![PyPI version](https://badge.fury.io/py/eth2spec.svg)](https://badge.fury.io/py/eth2spec)
[![GitHub release](https://img.shields.io/github/v/release/ethereum/consensus-specs)](https://github.com/ethereum/consensus-specs/releases/) [![PyPI version](https://badge.fury.io/py/eth2spec.svg)](https://badge.fury.io/py/eth2spec) [![testgen](https://github.com/ethereum/consensus-specs/actions/workflows/generate_vectors.yml/badge.svg?branch=dev&event=schedule)](https://github.com/ethereum/consensus-specs/actions/workflows/generate_vectors.yml)

Core specifications for Ethereum proof-of-stake clients can be found in [specs](specs/). These are divided into features.
Features are researched and developed in parallel, and then consolidated into sequential upgrades when ready.
7 changes: 6 additions & 1 deletion presets/mainnet/electra.yaml
@@ -10,7 +10,7 @@ MAX_EFFECTIVE_BALANCE_ELECTRA: 2048000000000
# State list lengths
# ---------------------------------------------------------------
# `uint64(2**27)` (= 134,217,728)
PENDING_BALANCE_DEPOSITS_LIMIT: 134217728
PENDING_DEPOSITS_LIMIT: 134217728
# `uint64(2**27)` (= 134,217,728)
PENDING_PARTIAL_WITHDRAWALS_LIMIT: 134217728
# `uint64(2**18)` (= 262,144)
@@ -43,3 +43,8 @@ MAX_WITHDRAWAL_REQUESTS_PER_PAYLOAD: 16
# ---------------------------------------------------------------
# 2**3 ( = 8) pending withdrawals
MAX_PENDING_PARTIALS_PER_WITHDRAWALS_SWEEP: 8

# Pending deposits processing
# ---------------------------------------------------------------
# 2**4 ( = 16) pending deposits
MAX_PENDING_DEPOSITS_PER_EPOCH: 16
11 changes: 8 additions & 3 deletions presets/minimal/electra.yaml
@@ -10,7 +10,7 @@ MAX_EFFECTIVE_BALANCE_ELECTRA: 2048000000000
# State list lengths
# ---------------------------------------------------------------
# `uint64(2**27)` (= 134,217,728)
PENDING_BALANCE_DEPOSITS_LIMIT: 134217728
PENDING_DEPOSITS_LIMIT: 134217728
# [customized] `uint64(2**6)` (= 64)
PENDING_PARTIAL_WITHDRAWALS_LIMIT: 64
# [customized] `uint64(2**6)` (= 64)
@@ -41,5 +41,10 @@ MAX_WITHDRAWAL_REQUESTS_PER_PAYLOAD: 2

# Withdrawals processing
# ---------------------------------------------------------------
# 2**0 ( = 1) pending withdrawals
MAX_PENDING_PARTIALS_PER_WITHDRAWALS_SWEEP: 1
# 2**1 ( = 2) pending withdrawals
MAX_PENDING_PARTIALS_PER_WITHDRAWALS_SWEEP: 2

# Pending deposits processing
# ---------------------------------------------------------------
# 2**4 ( = 16) pending deposits
MAX_PENDING_DEPOSITS_PER_EPOCH: 16
3 changes: 3 additions & 0 deletions pysetup/helpers.py
@@ -119,6 +119,7 @@ def format_constant(name: str, vardef: VariableDefinition) -> str:
hardcoded_func_dep_presets = reduce(lambda obj, builder: {**obj, **builder.hardcoded_func_dep_presets(spec_object)}, builders, {})
# Concatenate all strings
imports = reduce(lambda txt, builder: (txt + "\n\n" + builder.imports(preset_name) ).strip("\n"), builders, "")
classes = reduce(lambda txt, builder: (txt + "\n\n" + builder.classes() ).strip("\n"), builders, "")
preparations = reduce(lambda txt, builder: (txt + "\n\n" + builder.preparations() ).strip("\n"), builders, "")
sundry_functions = reduce(lambda txt, builder: (txt + "\n\n" + builder.sundry_functions() ).strip("\n"), builders, "")
# Keep engine from the most recent fork
@@ -154,6 +155,8 @@ def format_constant(name: str, vardef: VariableDefinition) -> str:
constant_vars_spec,
preset_vars_spec,
config_spec,
# Custom classes which are not required to be SSZ containers.
classes,
ordered_class_objects_spec,
protocols_spec,
functions_spec,
7 changes: 7 additions & 0 deletions pysetup/spec_builders/base.py
@@ -15,6 +15,13 @@ def imports(cls, preset_name: str) -> str:
"""
return ""

@classmethod
def classes(cls) -> str:
"""
Define special classes.
"""
return ""

@classmethod
def preparations(cls) -> str:
"""
15 changes: 15 additions & 0 deletions pysetup/spec_builders/deneb.py
@@ -12,6 +12,21 @@ def imports(cls, preset_name: str):
from eth2spec.capella import {preset_name} as capella
'''

@classmethod
def classes(cls):
return f'''
class BLSFieldElement(bls.Scalar):
pass
class Polynomial(list):
def __init__(self, evals: Optional[Sequence[BLSFieldElement]] = None):
if evals is None:
evals = [BLSFieldElement(0)] * FIELD_ELEMENTS_PER_BLOB
if len(evals) != FIELD_ELEMENTS_PER_BLOB:
raise ValueError("expected FIELD_ELEMENTS_PER_BLOB evals")
super().__init__(evals)
'''

@classmethod
def preparations(cls):
31 changes: 30 additions & 1 deletion pysetup/spec_builders/eip7594.py
@@ -12,12 +12,41 @@ def imports(cls, preset_name: str):
return f'''
from eth2spec.deneb import {preset_name} as deneb
'''



@classmethod
def classes(cls):
return f'''
class PolynomialCoeff(list):
def __init__(self, coeffs: Sequence[BLSFieldElement]):
if len(coeffs) > FIELD_ELEMENTS_PER_EXT_BLOB:
raise ValueError("expected <= FIELD_ELEMENTS_PER_EXT_BLOB coeffs")
super().__init__(coeffs)
class Coset(list):
def __init__(self, coeffs: Optional[Sequence[BLSFieldElement]] = None):
if coeffs is None:
coeffs = [BLSFieldElement(0)] * FIELD_ELEMENTS_PER_CELL
if len(coeffs) != FIELD_ELEMENTS_PER_CELL:
raise ValueError("expected FIELD_ELEMENTS_PER_CELL coeffs")
super().__init__(coeffs)
class CosetEvals(list):
def __init__(self, evals: Optional[Sequence[BLSFieldElement]] = None):
if evals is None:
evals = [BLSFieldElement(0)] * FIELD_ELEMENTS_PER_CELL
if len(evals) != FIELD_ELEMENTS_PER_CELL:
raise ValueError("expected FIELD_ELEMENTS_PER_CELL coeffs")
super().__init__(evals)
'''

@classmethod
def sundry_functions(cls) -> str:
return """
def retrieve_column_sidecars(beacon_block_root: Root) -> Sequence[DataColumnSidecar]:
# pylint: disable=unused-argument
return []
"""

5 changes: 3 additions & 2 deletions pysetup/spec_builders/electra.py
@@ -10,6 +10,7 @@ class ElectraSpecBuilder(BaseSpecBuilder):
def imports(cls, preset_name: str):
return f'''
from eth2spec.deneb import {preset_name} as deneb
from eth2spec.utils.ssz.ssz_impl import serialize
'''

@classmethod
@@ -28,8 +29,8 @@ class NoopExecutionEngine(ExecutionEngine):
def notify_new_payload(self: ExecutionEngine,
execution_payload: ExecutionPayload,
execution_requests: ExecutionRequests,
parent_beacon_block_root: Root) -> bool:
parent_beacon_block_root: Root,
execution_requests_list: list[bytes]) -> bool:
return True
def notify_forkchoice_updated(self: ExecutionEngine,
27 changes: 21 additions & 6 deletions setup.py
@@ -35,6 +35,16 @@
)
from pysetup.md_doc_paths import get_md_doc_paths

# Ignore '1.5.0-alpha.*' to '1.5.0a*' messages.
import warnings
warnings.filterwarnings('ignore', message='Normalizing .* to .*')

# Ignore 'running' and 'creating' messages
import logging
class PyspecFilter(logging.Filter):
def filter(self, record):
return not record.getMessage().startswith(('running ', 'creating '))
logging.getLogger().addFilter(PyspecFilter())

# NOTE: have to programmatically include third-party dependencies in `setup.py`.
def installPackage(package: str):
@@ -173,7 +183,7 @@ def _update_constant_vars_with_kzg_setups(constant_vars, preset_name):
constant_vars['KZG_SETUP_G1_MONOMIAL'] = VariableDefinition(constant_vars['KZG_SETUP_G1_MONOMIAL'].value, str(kzg_setups[0]), comment, None)
constant_vars['KZG_SETUP_G1_LAGRANGE'] = VariableDefinition(constant_vars['KZG_SETUP_G1_LAGRANGE'].value, str(kzg_setups[1]), comment, None)
constant_vars['KZG_SETUP_G2_MONOMIAL'] = VariableDefinition(constant_vars['KZG_SETUP_G2_MONOMIAL'].value, str(kzg_setups[2]), comment, None)


def get_spec(file_name: Path, preset: Dict[str, str], config: Dict[str, str], preset_name=str) -> SpecObject:
functions: Dict[str, str] = {}
@@ -251,10 +261,17 @@ def get_spec(file_name: Path, preset: Dict[str, str], config: Dict[str, str], preset_name=str) -> SpecObject:
# marko parses `**X**` as a list containing a X
description = description[0].children

if isinstance(name, list):
# marko parses `[X]()` as a list containing a X
name = name[0].children
if isinstance(value, list):
# marko parses `**X**` as a list containing a X
value = value[0].children

# Skip types that have been defined elsewhere
if description is not None and description.startswith("<!-- predefined-type -->"):
continue

if not _is_constant_id(name):
# Check for short type declarations
if value.startswith(("uint", "Bytes", "ByteList", "Union", "Vector", "List", "ByteVector")):
@@ -394,8 +411,6 @@ def initialize_options(self):
def finalize_options(self):
"""Post-process options."""
if len(self.md_doc_paths) == 0:
print("no paths were specified, using default markdown file paths for pyspec"
" build (spec fork: %s)" % self.spec_fork)
self.md_doc_paths = get_md_doc_paths(self.spec_fork)
if len(self.md_doc_paths) == 0:
raise Exception('no markdown files specified, and spec fork "%s" is unknown', self.spec_fork)
@@ -428,6 +443,7 @@ def run(self):
if not self.dry_run:
dir_util.mkpath(self.out_dir)

print(f'Building pyspec: {self.spec_fork}')
for (name, preset_paths, config_path) in self.parsed_build_targets:
spec_str = build_spec(
spec_builders[self.spec_fork].fork,
@@ -492,7 +508,6 @@ def run_pyspec_cmd(self, spec_fork: str, **opts):
self.run_command('pyspec')

def run(self):
print("running build_py command")
for spec_fork in spec_builders:
self.run_pyspec_cmd(spec_fork=spec_fork)

@@ -561,7 +576,7 @@ def run(self):
RUAMEL_YAML_VERSION,
"lru-dict==1.2.0",
MARKO_VERSION,
"py_arkworks_bls12381==0.3.4",
"curdleproofs==0.1.1",
"py_arkworks_bls12381==0.3.8",
"curdleproofs==0.1.2",
]
)
2 changes: 1 addition & 1 deletion specs/_features/eip7594/das-core.md
@@ -244,7 +244,7 @@ In this construction, we extend the blobs using a one-dimensional erasure coding

### Parameters

For each column -- use `data_column_sidecar_{subnet_id}` subnets, where `subnet_id` can be computed with the `compute_subnet_for_data_column_sidecar(column_index: ColumnIndex)` helper. The sidecars can be computed with the `get_data_column_sidecars(signed_block: SignedBeaconBlock, blobs: Sequence[Blob])` helper.
For each column -- use `data_column_sidecar_{subnet_id}` subnets, where `subnet_id` can be computed with the `compute_subnet_for_data_column_sidecar(column_index: ColumnIndex)` helper. The sidecars can be computed with `cells_and_kzg_proofs = [compute_cells_and_kzg_proofs(blob) for blob in blobs]` and then `get_data_column_sidecars(signed_block, cells_and_kzg_proofs)`.
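For illustration, a minimal end-to-end sketch of that computation, assuming the helpers named above are in scope; `publish_to_subnet` is a hypothetical transport function, not a spec helper:

```python
def compute_and_distribute_column_sidecars(signed_block: SignedBeaconBlock, blobs: Sequence[Blob]) -> None:
    # Compute the cells and KZG proofs for every blob, then build one sidecar per column.
    cells_and_kzg_proofs = [compute_cells_and_kzg_proofs(blob) for blob in blobs]
    sidecars = get_data_column_sidecars(signed_block, cells_and_kzg_proofs)
    # Gossip each sidecar on the subnet derived from its column index.
    for sidecar in sidecars:
        subnet_id = compute_subnet_for_data_column_sidecar(sidecar.index)
        publish_to_subnet(subnet_id, sidecar)  # hypothetical transport helper
```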

Verifiable samples from their respective column are distributed on the assigned subnet. To custody a particular column, a node joins the respective gossipsub subnet. If a node fails to get a column on the column subnet, a node can also utilize the Req/Resp protocol to query the missing column from other peers.

3 changes: 2 additions & 1 deletion specs/_features/eip7594/fork-choice.md
@@ -31,7 +31,8 @@ def is_data_available(beacon_block_root: Root) -> bool:
# `MIN_EPOCHS_FOR_DATA_COLUMN_SIDECARS_REQUESTS` epochs.
column_sidecars = retrieve_column_sidecars(beacon_block_root)
return all(
verify_data_column_sidecar_kzg_proofs(column_sidecar)
verify_data_column_sidecar(column_sidecar)
and verify_data_column_sidecar_kzg_proofs(column_sidecar)
for column_sidecar in column_sidecars
)
```
30 changes: 25 additions & 5 deletions specs/_features/eip7594/p2p-interface.md
@@ -14,6 +14,7 @@
- [Containers](#containers)
- [`DataColumnIdentifier`](#datacolumnidentifier)
- [Helpers](#helpers)
- [`verify_data_column_sidecar`](#verify_data_column_sidecar)
- [`verify_data_column_sidecar_kzg_proofs`](#verify_data_column_sidecar_kzg_proofs)
- [`verify_data_column_sidecar_inclusion_proof`](#verify_data_column_sidecar_inclusion_proof)
- [`compute_subnet_for_data_column_sidecar`](#compute_subnet_for_data_column_sidecar)
@@ -64,16 +65,35 @@ class DataColumnIdentifier(Container):

### Helpers

##### `verify_data_column_sidecar`

```python
def verify_data_column_sidecar(sidecar: DataColumnSidecar) -> bool:
"""
Verify if the data column sidecar is valid.
"""
# The sidecar index must be within the valid range
if sidecar.index >= NUMBER_OF_COLUMNS:
return False

# A sidecar for zero blobs is invalid
if len(sidecar.kzg_commitments) == 0:
return False

# The column length must be equal to the number of commitments/proofs
if len(sidecar.column) != len(sidecar.kzg_commitments) or len(sidecar.column) != len(sidecar.kzg_proofs):
return False

return True
```
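As a usage sketch, this structural check is intended to run together with the KZG-proof check defined next, mirroring their combined use in `is_data_available`; the wrapper name below is hypothetical, not a spec helper:

```python
def is_valid_data_column_sidecar(sidecar: DataColumnSidecar) -> bool:
    # Cheap structural validation first, then the more expensive batch KZG proof verification.
    return verify_data_column_sidecar(sidecar) and verify_data_column_sidecar_kzg_proofs(sidecar)
```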

##### `verify_data_column_sidecar_kzg_proofs`

```python
def verify_data_column_sidecar_kzg_proofs(sidecar: DataColumnSidecar) -> bool:
"""
Verify if the proofs are correct.
Verify if the KZG proofs are correct.
"""
assert sidecar.index < NUMBER_OF_COLUMNS
assert len(sidecar.column) == len(sidecar.kzg_commitments) == len(sidecar.kzg_proofs)

# The column index also represents the cell index
cell_indices = [CellIndex(sidecar.index)] * len(sidecar.column)

@@ -148,7 +168,7 @@ The *type* of the payload of this topic is `DataColumnSidecar`.

The following validations MUST pass before forwarding the `sidecar: DataColumnSidecar` on the network, assuming the alias `block_header = sidecar.signed_block_header.message`:

- _[REJECT]_ The sidecar's index is consistent with `NUMBER_OF_COLUMNS` -- i.e. `sidecar.index < NUMBER_OF_COLUMNS`.
- _[REJECT]_ The sidecar is valid as verified by `verify_data_column_sidecar(sidecar)`.
- _[REJECT]_ The sidecar is for the correct subnet -- i.e. `compute_subnet_for_data_column_sidecar(sidecar.index) == subnet_id`.
- _[IGNORE]_ The sidecar is not from a future slot (with a `MAXIMUM_GOSSIP_CLOCK_DISPARITY` allowance) -- i.e. validate that `block_header.slot <= current_slot` (a client MAY queue future sidecars for processing at the appropriate slot).
- _[IGNORE]_ The sidecar is from a slot greater than the latest finalized slot -- i.e. validate that `block_header.slot > compute_start_slot_at_epoch(state.finalized_checkpoint.epoch)`