Delete IOPTranscript, update with master (#51) (aka Ligero++)
* Add the trait bounds

* Add `CommitmentState`

* Update benches for the new type

* Fix the name of local variable

* Merge `PCCommitmentState` with `PCRandomness`

* Update `README.md`

* Fix a bug

* Simplify `hash_column`

* Delete comments

* Add `CommitmentState`

* Make `fmt` happy

* Refactor, remove `hash_columns`

* Rename all params

* Maybe `empty` not return `Self`

* Make `empty` return `Self`

* Rename `rand` to `state`

* Add type `Randomness`

* Ligero+++ (#46)

* conversion to `into_iter` is a no-op

* remove explicit casts to vecs

* rename to use singular of `labeled_commitment`

* simplify the iterators even further by zipping two iters

* Apply suggestions from code review

* Fix tests: sponge config for univariate ligero

* Rename nonnative to emulated, as in `r1cs-std` (arkworks-rs#137)

* Rename nonnative to emulated, as in `r1cs-std`

* Run `fmt`

* Temporarily change `Cargo.toml`

* Revert `Cargo.toml`

* Refactor `FoldedPolynomialStream` partially

* Substitute `ChallengeGenerator` by the generic sponge (arkworks-rs#139)

* Rename nonnative to emulated, as in `r1cs-std`

* Run `fmt`

* Temporarily change `Cargo.toml`

* Substitute `ChallengeGenerator` with the generic sponge

* Run `fmt`

* Remove the extra file

* Update modules

* Delete the unnecessary loop

* Revert `Cargo.toml`

* Refactor `FoldedPolynomialStream` partially

* Update README

* Make the diff more readable

* Bring the whitespace back

* Make diff more readable, 2

* Fix according to breaking changes in `ark-ec` (arkworks-rs#141)

* Fix for KZG10

* Fix the breaking changes in `ark-ec`

* Remove the extra loop

* Fix the loop range

* re-use the preprocessing table

* also re-use the preprocessing table for multilinear_pc

---------

Co-authored-by: mmagician <marcin.gorny.94@protonmail.com>

* Auxiliary opening data (arkworks-rs#134)

* Add the trait bounds

* Add `CommitmentState`

* Update benches for the new type

* Fix the name of local variable

* Merge `PCCommitmentState` with `PCRandomness`

* Update `README.md`

* Fix a bug

* Put `Randomness` in `CommitmentState`

* Add a comment

* Remove the extra loop

* Update the comment for `CommitmentState`

Co-authored-by: Marcin <marcin.gorny.94@protonmail.com>

* cargo fmt

---------

Co-authored-by: Marcin <marcin.gorny.94@protonmail.com>

* `batch_mul_with_preprocessing` no longer takes `self` as argument (arkworks-rs#142)

* batch_mul_with_preprocessing no longer takes `self` as argument

* Apply suggestions from code review

Co-authored-by: Pratyush Mishra <pratyush795@gmail.com>

* fix variable name

---------

Co-authored-by: Pratyush Mishra <pratyush795@gmail.com>

* Remove `ChallengeGenerator` and `IOPTranscript` for Ligero (#57)

* Squash and merge `delete-chalgen` onto here

* Fix Ligero for `ChallengeGenerator` and `AsRef` for Merkle tree

* Fix tests: sponge config for univariate ligero

* Delete `IOPTranscript` for Ligero (#54)

* Replace the `IOPTranscript` with `CryptographicSponge`

* Delete extra comments

* Run fmt

* Fix tests: sponge config for univariate ligero

* Delete TODOs and do not absorb what you just squeezed

* Fix unused import

* Revert "Fix unused import"

This reverts commit e85af90.

* Try to fix

* Remove the extra loop

---------

Co-authored-by: mmagician <marcin.gorny.94@protonmail.com>
Co-authored-by: Pratyush Mishra <pratyush795@gmail.com>
3 people committed Jan 18, 2024
1 parent 2bcff80 commit 044d74a
Showing 28 changed files with 776 additions and 1,079 deletions.
16 changes: 7 additions & 9 deletions README.md
@@ -57,7 +57,7 @@ This trait defines the interface for a polynomial commitment scheme. It is recom
// In this example, we will commit to a single polynomial, open it first at one point, and then batched at two points, and finally verify the proofs.
// We will use the KZG10 polynomial commitment scheme, following the approach from Marlin.

use ark_poly_commit::{Polynomial, marlin_pc::MarlinKZG10, LabeledPolynomial, PolynomialCommitment, QuerySet, Evaluations, challenge::ChallengeGenerator};
use ark_poly_commit::{Polynomial, marlin_pc::MarlinKZG10, LabeledPolynomial, PolynomialCommitment, QuerySet, Evaluations};
use ark_bls12_377::Bls12_377;
use ark_crypto_primitives::sponge::poseidon::{PoseidonSponge, PoseidonConfig};
use ark_crypto_primitives::sponge::CryptographicSponge;
@@ -128,17 +128,15 @@ let (ck, vk) = PCS::trim(&pp, degree, 2, Some(&[degree])).unwrap();

// 3. PolynomialCommitment::commit
// The prover commits to the polynomial using their committer key `ck`.
let (comms, rands) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();

let challenge_generator: ChallengeGenerator<<Bls12_377 as Pairing>::ScalarField, Sponge_Bls12_377> = ChallengeGenerator::new_univariate(&mut test_sponge);
let (comms, states) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();

// 4a. PolynomialCommitment::open
// Opening proof at a single point.
let proof_single = PCS::open(&ck, [&labeled_poly], &comms, &point_1, &mut (challenge_generator.clone()), &rands, None).unwrap();
let proof_single = PCS::open(&ck, [&labeled_poly], &comms, &point_1, &mut (test_sponge.clone()), &states, None).unwrap();

// 5a. PolynomialCommitment::check
// Verifying the proof at a single point, given the commitment, the point, the claimed evaluation, and the proof.
assert!(PCS::check(&vk, &comms, &point_1, [secret_poly.evaluate(&point_1)], &proof_single, &mut (challenge_generator.clone()), Some(rng)).unwrap());
assert!(PCS::check(&vk, &comms, &point_1, [secret_poly.evaluate(&point_1)], &proof_single, &mut (test_sponge.clone()), Some(rng)).unwrap());

let mut query_set = QuerySet::new();
let mut values = Evaluations::new();
@@ -155,8 +153,8 @@ let proof_batched = PCS::batch_open(
[&labeled_poly],
&comms,
&query_set,
&mut (challenge_generator.clone()),
&rands,
&mut (test_sponge.clone()),
&states,
Some(rng),
).unwrap();

@@ -167,7 +165,7 @@ assert!(PCS::batch_check(
&query_set,
&values,
&proof_batched,
&mut (challenge_generator.clone()),
&mut (test_sponge.clone()),
rng,
).unwrap());
```
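Both the README example above and the benches below pass a `test_sponge` Poseidon sponge whose construction lies outside the hunks shown here. The following is a rough, test-only sketch of how such a helper could be built; the round numbers and the randomly drawn MDS matrix and round constants are placeholders, not vetted Poseidon parameters, and the helper name is an assumption.

```rust
use ark_bls12_377::Fr;
use ark_crypto_primitives::sponge::poseidon::{PoseidonConfig, PoseidonSponge};
use ark_crypto_primitives::sponge::CryptographicSponge;
use ark_ff::{PrimeField, UniformRand};
use ark_std::test_rng;

/// Test-only sponge with placeholder parameters (do not use in production).
fn test_sponge<F: PrimeField>() -> PoseidonSponge<F> {
    let full_rounds = 8;
    let partial_rounds = 31;
    let alpha = 17;

    // Dummy 3x3 MDS matrix and per-round constants, drawn at random.
    let mut rng = test_rng();
    let mds: Vec<Vec<F>> = (0..3)
        .map(|_| (0..3).map(|_| F::rand(&mut rng)).collect())
        .collect();
    let ark: Vec<Vec<F>> = (0..(full_rounds + partial_rounds))
        .map(|_| (0..3).map(|_| F::rand(&mut rng)).collect())
        .collect();

    let config = PoseidonConfig {
        full_rounds,
        partial_rounds,
        alpha,
        ark,
        mds,
        rate: 2,
        capacity: 1,
    };
    PoseidonSponge::new(&config)
}

fn main() {
    // Instantiate once and pass `&mut sponge` (or a clone) to `open`/`check`.
    let mut sponge = test_sponge::<Fr>();
    let _challenge = sponge.squeeze_field_elements::<Fr>(1)[0];
}
```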
25 changes: 12 additions & 13 deletions bench-templates/src/lib.rs
@@ -17,9 +17,7 @@ use rand_chacha::{
use core::time::Duration;
use std::{borrow::Borrow, marker::PhantomData, time::Instant};

use ark_poly_commit::{
challenge::ChallengeGenerator, to_bytes, LabeledPolynomial, PolynomialCommitment,
};
use ark_poly_commit::{to_bytes, LabeledPolynomial, PolynomialCommitment};

pub use criterion::*;
pub use paste::paste;
@@ -131,7 +129,7 @@ where
let labeled_poly =
LabeledPolynomial::new("test".to_string(), rand_poly(num_vars, rng), None, None);

let (coms, randomness) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();
let (coms, states) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();
let point = rand_point(num_vars, rng);

let start = Instant::now();
@@ -140,8 +138,8 @@ where
[&labeled_poly],
&coms,
&point,
&mut ChallengeGenerator::new_univariate(&mut test_sponge()),
&randomness,
&mut test_sponge(),
&states,
Some(rng),
)
.unwrap();
@@ -165,16 +163,16 @@ where
let labeled_poly =
LabeledPolynomial::new("test".to_string(), rand_poly(num_vars, rng), None, None);

let (coms, randomness) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();
let (coms, states) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();
let point = P::Point::rand(rng);

let proofs = PCS::open(
&ck,
[&labeled_poly],
&coms,
&point,
&mut ChallengeGenerator::new_univariate(&mut test_sponge()),
&randomness,
&mut test_sponge(),
&states,
Some(rng),
)
.unwrap();
@@ -202,16 +200,17 @@ where
let labeled_poly =
LabeledPolynomial::new("test".to_string(), rand_poly(num_vars, rng), None, None);

let (coms, randomness) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();
let (coms, states) = PCS::commit(&ck, [&labeled_poly], Some(rng)).unwrap();
let point = rand_point(num_vars, rng);

let claimed_eval = labeled_poly.evaluate(&point);
let proof = PCS::open(
&ck,
[&labeled_poly],
&coms,
&point,
&mut ChallengeGenerator::new_univariate(&mut test_sponge()),
&randomness,
&mut test_sponge(),
&states,
Some(rng),
)
.unwrap();
@@ -223,7 +222,7 @@ where
&point,
[claimed_eval],
&proof,
&mut ChallengeGenerator::new_univariate(&mut test_sponge()),
&mut test_sponge(),
None,
)
.unwrap();
61 changes: 0 additions & 61 deletions poly-commit/src/challenge.rs

This file was deleted.
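This was the helper that wrapped a sponge to derive opening challenges. After its removal, callers work with the `CryptographicSponge` directly, as the README and bench diffs above show. A minimal sketch of a direct challenge draw follows; the concrete field and sponge types are illustrative and not taken from this diff.

```rust
use ark_bls12_377::Fr;
use ark_crypto_primitives::sponge::{poseidon::PoseidonSponge, CryptographicSponge};

/// Squeeze one opening challenge directly from the transcript sponge.
/// Absorbing commitments and claimed evaluations happens elsewhere; here we only squeeze.
fn next_challenge(sponge: &mut PoseidonSponge<Fr>) -> Fr {
    sponge.squeeze_field_elements::<Fr>(1)[0]
}
```

A scheme that previously relied on the univariate strategy (successive challenges as powers of one squeezed element) can squeeze once and exponentiate on its own side instead.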

20 changes: 10 additions & 10 deletions poly-commit/src/constraints.rs
@@ -5,7 +5,7 @@ use crate::{
use ark_crypto_primitives::sponge::CryptographicSponge;
use ark_ff::PrimeField;
use ark_poly::Polynomial;
use ark_r1cs_std::fields::nonnative::NonNativeFieldVar;
use ark_r1cs_std::fields::emulated_fp::EmulatedFpVar;
use ark_r1cs_std::{fields::fp::FpVar, prelude::*};
use ark_relations::r1cs::{ConstraintSystemRef, Namespace, Result as R1CSResult, SynthesisError};
use ark_std::{borrow::Borrow, cmp::Eq, cmp::PartialEq, hash::Hash, marker::Sized};
@@ -24,8 +24,8 @@ pub enum LinearCombinationCoeffVar<TargetField: PrimeField, BaseField: PrimeFiel
One,
/// Coefficient -1.
MinusOne,
/// Other coefficient, represented as a nonnative field element.
Var(NonNativeFieldVar<TargetField, BaseField>),
/// Other coefficient, represented as an "emulated" field element.
Var(EmulatedFpVar<TargetField, BaseField>),
}

/// An allocated version of `LinearCombination`.
@@ -60,7 +60,7 @@ impl<TargetField: PrimeField, BaseField: PrimeField>
let (f, lc_term) = term;

let fg =
NonNativeFieldVar::new_variable(ark_relations::ns!(cs, "term"), || Ok(f), mode)
EmulatedFpVar::new_variable(ark_relations::ns!(cs, "term"), || Ok(f), mode)
.unwrap();

(LinearCombinationCoeffVar::Var(fg), lc_term.clone())
@@ -79,12 +79,12 @@ impl<TargetField: PrimeField, BaseField: PrimeField>
pub struct PCCheckRandomDataVar<TargetField: PrimeField, BaseField: PrimeField> {
/// Opening challenges.
/// The prover and the verifier MUST use the same opening challenges.
pub opening_challenges: Vec<NonNativeFieldVar<TargetField, BaseField>>,
pub opening_challenges: Vec<EmulatedFpVar<TargetField, BaseField>>,
/// Bit representations of the opening challenges.
pub opening_challenges_bits: Vec<Vec<Boolean<BaseField>>>,
/// Batching random numbers.
/// The verifier can choose these numbers freely, as long as they are random.
pub batching_rands: Vec<NonNativeFieldVar<TargetField, BaseField>>,
pub batching_rands: Vec<EmulatedFpVar<TargetField, BaseField>>,
/// Bit representations of the batching random numbers.
pub batching_rands_bits: Vec<Vec<Boolean<BaseField>>>,
}
@@ -172,7 +172,7 @@ pub struct LabeledPointVar<TargetField: PrimeField, BaseField: PrimeField> {
/// MUST be a unique identifier in a query set.
pub name: String,
/// The point value.
pub value: NonNativeFieldVar<TargetField, BaseField>,
pub value: EmulatedFpVar<TargetField, BaseField>,
}

/// An allocated version of `QuerySet`.
@@ -184,16 +184,16 @@ pub struct QuerySetVar<TargetField: PrimeField, BaseField: PrimeField>(
/// An allocated version of `Evaluations`.
#[derive(Clone)]
pub struct EvaluationsVar<TargetField: PrimeField, BaseField: PrimeField>(
pub HashMap<LabeledPointVar<TargetField, BaseField>, NonNativeFieldVar<TargetField, BaseField>>,
pub HashMap<LabeledPointVar<TargetField, BaseField>, EmulatedFpVar<TargetField, BaseField>>,
);

impl<TargetField: PrimeField, BaseField: PrimeField> EvaluationsVar<TargetField, BaseField> {
/// find the evaluation result
pub fn get_lc_eval(
&self,
lc_string: &str,
point: &NonNativeFieldVar<TargetField, BaseField>,
) -> Result<NonNativeFieldVar<TargetField, BaseField>, SynthesisError> {
point: &EmulatedFpVar<TargetField, BaseField>,
) -> Result<EmulatedFpVar<TargetField, BaseField>, SynthesisError> {
let key = LabeledPointVar::<TargetField, BaseField> {
name: String::from(lc_string),
value: point.clone(),
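The `NonNativeFieldVar` to `EmulatedFpVar` change is a rename following `r1cs-std`, so downstream gadgets only need the import and type name updated. Below is a small sketch of allocating and using an emulated scalar-field element over the constraint field; the curve choice (`Bls12-377`) is illustrative and not taken from this diff.

```rust
use ark_bls12_377::{Fq, Fr};
use ark_r1cs_std::{alloc::AllocVar, fields::emulated_fp::EmulatedFpVar};
use ark_relations::r1cs::ConstraintSystem;

fn main() {
    // Constraint field is Fq; we emulate arithmetic over the scalar field Fr.
    let cs = ConstraintSystem::<Fq>::new_ref();
    let point = Fr::from(42u64);

    // Previously: NonNativeFieldVar::<Fr, Fq>::new_witness(...)
    let point_var =
        EmulatedFpVar::<Fr, Fq>::new_witness(cs.clone(), || Ok(point)).unwrap();

    // Emulated field vars support the usual field operations in-circuit.
    let _square = &point_var * &point_var;
    assert!(cs.is_satisfied().unwrap());
}
```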
12 changes: 7 additions & 5 deletions poly-commit/src/data_structures.rs
@@ -70,9 +70,12 @@ pub trait PCPreparedCommitment<UNPREPARED: PCCommitment>: Clone {
fn prepare(comm: &UNPREPARED) -> Self;
}

/// Defines the minimal interface of commitment randomness for any polynomial
/// commitment scheme.
pub trait PCRandomness: Clone + CanonicalSerialize + CanonicalDeserialize {
/// Defines the minimal interface of commitment state for any polynomial
/// commitment scheme. It might be randomness etc.
pub trait PCCommitmentState: Clone + CanonicalSerialize + CanonicalDeserialize {
/// This is the type of `Randomness` that the `rand` method returns
type Randomness: Clone + CanonicalSerialize + CanonicalDeserialize;

/// Outputs empty randomness that does not hide the commitment.
fn empty() -> Self;

@@ -86,9 +89,8 @@ pub trait PCRandomness: Clone + CanonicalSerialize + CanonicalDeserialize {
has_degree_bound: bool,
num_vars: Option<usize>,
rng: &mut R,
) -> Self;
) -> Self::Randomness;
}

/// A proof of satisfaction of linear combinations.
#[derive(Clone, CanonicalSerialize, CanonicalDeserialize)]
pub struct BatchLCProof<F: PrimeField, T: Clone + CanonicalSerialize + CanonicalDeserialize> {
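The renamed trait bundles whatever per-commitment opening state a scheme keeps (hiding randomness, Merkle-tree leaves, etc.) behind one interface, with `Randomness` as an associated type. A hedged sketch of an implementation for a hypothetical scheme that keeps no hiding randomness is shown below; the type `NoHidingState` is made up, and the first parameter of `rand` sits outside the hunk shown above, so its name follows the old `PCRandomness::rand` and should be treated as an assumption.

```rust
use ark_poly_commit::PCCommitmentState;
use ark_serialize::{CanonicalDeserialize, CanonicalSerialize};
use ark_std::rand::RngCore;

/// Hypothetical state for a scheme whose commitments need no hiding randomness.
#[derive(Clone, CanonicalSerialize, CanonicalDeserialize)]
struct NoHidingState;

impl PCCommitmentState for NoHidingState {
    // For schemes where the state *is* the randomness, `Randomness = Self`
    // (as in the `ipa_pc` impl shown below).
    type Randomness = Self;

    fn empty() -> Self {
        NoHidingState
    }

    fn rand<R: RngCore>(
        _num_queries: usize,
        _has_degree_bound: bool,
        _num_vars: Option<usize>,
        _rng: &mut R,
    ) -> Self::Randomness {
        NoHidingState
    }
}
```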
3 changes: 2 additions & 1 deletion poly-commit/src/ipa_pc/data_structures.rs
@@ -146,7 +146,8 @@ pub struct Randomness<G: AffineRepr> {
pub shifted_rand: Option<G::ScalarField>,
}

impl<G: AffineRepr> PCRandomness for Randomness<G> {
impl<G: AffineRepr> PCCommitmentState for Randomness<G> {
type Randomness = Self;
fn empty() -> Self {
Self {
rand: G::ScalarField::zero(),