Constraints #47

Open. Wants to merge 63 commits into base: master.

Changes from 26 commits (of 63 total).

Commits
2d63406
prepared vk/commitment types, and individual opening challenges
npwardberkeley Oct 22, 2020
895e274
addressed comments
npwardberkeley Oct 23, 2020
084b4b9
removed constraints from this PR
npwardberkeley Oct 23, 2020
b3ac047
cargo fmt
npwardberkeley Oct 23, 2020
85cc789
finish some final changes
weikengchen Oct 24, 2020
8c25981
some other comments
weikengchen Oct 25, 2020
6a7bc6a
remove the dependency of nonnative for now
weikengchen Oct 25, 2020
ee10685
try to fix nostd
weikengchen Oct 25, 2020
24fd572
fix
npwardberkeley Oct 25, 2020
a467a52
constraints
npwardberkeley Oct 27, 2020
37298ab
fix Cargo toml
weikengchen Oct 27, 2020
beee357
Default, Clone, ToConstraintField impls
npwardberkeley Oct 27, 2020
fdf1668
cargo fmt
npwardberkeley Oct 27, 2020
d04a9e9
cargo fmt
npwardberkeley Nov 3, 2020
8f17d50
fixes
npwardberkeley Nov 3, 2020
5e2ac2e
used HashMap instead of BTreeMap
npwardberkeley Nov 4, 2020
1b7617f
using HashMap and HashSet
npwardberkeley Nov 4, 2020
e17ce7d
Update Cargo.toml
weikengchen Nov 4, 2020
13c2ca0
Update pc_constraints.rs
weikengchen Nov 4, 2020
f080e51
fix nostd
weikengchen Nov 4, 2020
88306f9
add density-optimized; clippy
weikengchen Nov 12, 2020
cf79c6b
add ToConstraintField
weikengchen Nov 12, 2020
87084be
tracing
weikengchen Nov 12, 2020
a641d27
Merge branch 'master' into constraints
weikengchen Nov 12, 2020
9baed3e
done
weikengchen Nov 12, 2020
132842a
Update pc_constraints.rs
weikengchen Nov 13, 2020
56e481a
fmt
weikengchen Nov 13, 2020
8d4a964
Apply suggestions from code review
npwardberkeley Nov 17, 2020
2eaed72
small fixes
npwardberkeley Nov 17, 2020
1f639eb
Merge branch 'master' into constraints
weikengchen Nov 17, 2020
7bc0da1
cleaning
weikengchen Nov 17, 2020
58896f2
fix nostd
weikengchen Nov 17, 2020
8c4641d
Merge branch 'master' into constraints
weikengchen Dec 26, 2020
77c456a
reduce the PR size
oblivious-app Dec 26, 2020
ac23b82
reduce the PR size
oblivious-app Dec 26, 2020
ee7318c
reduce the PR size
oblivious-app Dec 26, 2020
f9baf70
reduce the PR size
oblivious-app Dec 26, 2020
cc2cfc0
reduce the PR size
oblivious-app Dec 26, 2020
a005c2d
minimize the PR size
oblivious-app Dec 26, 2020
5a2ef01
Updates Marlin constraints to latest LC representation (#68)
howardwu Jan 29, 2021
ebcf462
bump digest abnd blake2
weikengchen Feb 5, 2021
417154a
Update src/marlin_pc/constraints.rs
weikengchen Feb 8, 2021
d71e8d6
Merge branch 'master' into constraints
weikengchen Feb 8, 2021
9a633fb
simplify and refactor
weikengchen Feb 8, 2021
453841b
remove density-optimized feature
weikengchen Feb 8, 2021
9f328fb
use PairingFriendlyCycle instead of CycleEngine
weikengchen Feb 8, 2021
9019c22
simplify CycleEngine
weikengchen Feb 8, 2021
b0ff531
simplify
weikengchen Feb 8, 2021
100bc57
derive Clone
weikengchen Feb 8, 2021
a757cc2
derive Clone for BatchLCProof
weikengchen Feb 8, 2021
cffd1a4
change the handling of one and minus one coeff
weikengchen Feb 9, 2021
93969dc
no std
weikengchen Feb 9, 2021
1751933
remove bench-utils
weikengchen Mar 23, 2021
38add1b
rewrote some comments
npwardberkeley Apr 14, 2021
78af43d
LCInfo type for clarity
npwardberkeley Apr 15, 2021
caa3375
update dependencies to use release versions
Will-Lin4 Apr 29, 2021
44ca5a6
fix lcitem type integration
Will-Lin4 Apr 29, 2021
afb2959
Merge branch 'master' into constraints
weikengchen Jun 16, 2021
e9d5af8
Merge branch 'master' into constraints
Pratyush Jul 26, 2021
3cfa438
Fix and update dependencies to 0.3 (#93)
vlopes11 Mar 7, 2022
7f8e2c5
Merge `master` into `constraints` (#94)
vlopes11 Mar 7, 2022
9da67e2
Merge remote-tracking branch 'origin/master' into constraints
vlopes11 Mar 8, 2022
a688fe9
Update `PC::check_combinations` to optional rng (#97)
vlopes11 Apr 25, 2022
10 changes: 8 additions & 2 deletions Cargo.toml
@@ -27,13 +27,18 @@
ark-ec = { git = "https://github.com/arkworks-rs/algebra", default-features = fa
ark-poly = { git = "https://github.com/arkworks-rs/algebra", default-features = false }

ark-std = { git = "https://github.com/arkworks-rs/utils", default-features = false }
ark-relations = { git = "https://github.com/arkworks-rs/snark", default-features = false }
ark-r1cs-std = { git = "https://github.com/arkworks-rs/r1cs-std", default-features = false }
ark-nonnative-field = { git = "https://github.com/arkworks-rs/nonnative", default-features = false }
npwardberkeley marked this conversation as resolved.

bench-utils = { git = "https://github.com/arkworks-rs/utils", default-features = false }
tracing = { version = "0.1", default-features = false, features = [ "attributes" ] }

rand_core = { version = "0.5", default-features = false }
digest = "0.8"
rayon = { version = "1", optional = true }
derivative = { version = "2", features = [ "use_core" ] }
hashbrown = "0.9"

[dev-dependencies]
rand = { version = "0.7", default-features = false }
@@ -55,7 +60,8 @@
incremental = true
debug = true

[features]
default = [ "std", "parallel" ]
std = [ "ark-ff/std", "ark-ec/std", "ark-poly/std", "ark-std/std", "ark-serialize/std" ]
default = ["std", "parallel"]
std = [ "ark-ff/std", "ark-ec/std", "ark-nonnative-field/std", "ark-poly/std", "ark-std/std", "ark-relations/std", "ark-serialize/std" ]
npwardberkeley marked this conversation as resolved.
print-trace = [ "bench-utils/print-trace" ]
parallel = [ "std", "ark-ff/parallel", "ark-ec/parallel", "ark-poly/parallel", "ark-std/parallel", "rayon" ]
density-optimized = [ "ark-nonnative-field/density-optimized" ]
npwardberkeley marked this conversation as resolved.
24 changes: 15 additions & 9 deletions src/data_structures.rs
@@ -1,5 +1,5 @@
use crate::{Polynomial, Rc, String, Vec};
use ark_ff::Field;
use ark_ff::{Field, ToConstraintField};
use ark_std::{
borrow::Borrow,
marker::PhantomData,
@@ -41,9 +41,9 @@
pub trait PCVerifierKey: Clone + core::fmt::Debug {

/// Defines the minimal interface of prepared verifier keys for any polynomial
/// commitment scheme.
pub trait PCPreparedVerifierKey<Unprepared: PCVerifierKey> {
pub trait PCPreparedVerifierKey<UNPREPARED: PCVerifierKey> {
npwardberkeley marked this conversation as resolved.
/// prepare
fn prepare(vk: &Unprepared) -> Self;
fn prepare(vk: &UNPREPARED) -> Self;
npwardberkeley marked this conversation as resolved.
}

/// Defines the minimal interface of commitments for any polynomial
@@ -199,6 +199,14 @@
impl<C: PCCommitment> LabeledCommitment<C> {
}
}

impl<F: Field, C: PCCommitment + ToConstraintField<F>> ToConstraintField<F>
for LabeledCommitment<C>
{
fn to_field_elements(&self) -> Option<Vec<F>> {
self.commitment.to_field_elements()
}
}

impl<C: PCCommitment> ark_ff::ToBytes for LabeledCommitment<C> {
#[inline]
fn write<W: ark_std::io::Write>(&self, writer: W) -> ark_std::io::Result<()> {
@@ -219,11 +227,7 @@
impl LCTerm {
/// Returns `true` if `self == LCTerm::One`
#[inline]
pub fn is_one(&self) -> bool {
if let LCTerm::One = self {
true
} else {
false
}
matches!(self, LCTerm::One)
}
}
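
The `matches!` rewrite above is purely mechanical: the macro expands to the same pattern match as the old `if let`. A minimal sketch, using a stand-in enum rather than the crate's actual `LCTerm`:

```rust
// Stand-in for poly-commit's LCTerm: a term is either the constant 1
// or a labeled polynomial. Names here are illustrative only.
#[derive(Debug, PartialEq)]
enum Term {
    One,
    Poly(String),
}

impl Term {
    // Old style: explicit if-let, flagged by clippy as verbose.
    fn is_one_verbose(&self) -> bool {
        if let Term::One = self {
            true
        } else {
            false
        }
    }

    // New style: matches! expands to the same pattern match.
    fn is_one(&self) -> bool {
        matches!(self, Term::One)
    }
}

fn main() {
    let one = Term::One;
    let poly = Term::Poly("w".to_string());
    assert!(one.is_one() && one.is_one_verbose());
    assert!(!poly.is_one() && !poly.is_one_verbose());
    println!("matches! agrees with if-let");
}
```

Both forms compile to identical code; the macro just removes the boilerplate branch.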

@@ -293,7 +297,7 @@
impl<F: Field> LinearCombination<F> {
let terms = terms.into_iter().map(|(c, t)| (c, t.into())).collect();
Self {
label: label.into(),
terms: terms,
terms,
}
}

@@ -315,13 +319,15 @@
impl<F: Field> LinearCombination<F> {
}

impl<'a, F: Field> AddAssign<(F, &'a LinearCombination<F>)> for LinearCombination<F> {
#[allow(clippy::suspicious_op_assign_impl)]
fn add_assign(&mut self, (coeff, other): (F, &'a LinearCombination<F>)) {
self.terms
.extend(other.terms.iter().map(|(c, t)| (coeff * c, t.clone())));
}
}

impl<'a, F: Field> SubAssign<(F, &'a LinearCombination<F>)> for LinearCombination<F> {
#[allow(clippy::suspicious_op_assign_impl)]
fn sub_assign(&mut self, (coeff, other): (F, &'a LinearCombination<F>)) {
self.terms
.extend(other.terms.iter().map(|(c, t)| (-coeff * c, t.clone())));
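The `#[allow(clippy::suspicious_op_assign_impl)]` annotations above silence a lint that fires whenever an `AddAssign` impl multiplies internally. Here that is the intended semantics: `self += (coeff, other)` appends `coeff * c` for each term of `other`. A self-contained sketch with `f64` coefficients standing in for the field type (the struct and names are illustrative, not the crate's):

```rust
// Minimal stand-in for LinearCombination<F>: a list of (coeff, label) terms.
#[derive(Debug, Clone, PartialEq)]
struct Lc {
    terms: Vec<(f64, String)>,
}

impl std::ops::AddAssign<(f64, &Lc)> for Lc {
    // Clippy's suspicious_op_assign_impl fires because `+=` multiplies
    // internally; that is intended: self += coeff * other, term by term.
    #[allow(clippy::suspicious_op_assign_impl)]
    fn add_assign(&mut self, (coeff, other): (f64, &Lc)) {
        self.terms
            .extend(other.terms.iter().map(|(c, t)| (coeff * c, t.clone())));
    }
}

fn main() {
    let mut a = Lc { terms: vec![(1.0, "x".to_string())] };
    let b = Lc { terms: vec![(2.0, "y".to_string())] };
    a += (3.0, &b); // appends (3.0 * 2.0, "y")
    assert_eq!(a.terms, vec![(1.0, "x".to_string()), (6.0, "y".to_string())]);
    println!("{:?}", a.terms);
}
```

The `SubAssign` impl in the diff is the same pattern with the coefficient negated.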
2 changes: 1 addition & 1 deletion src/ipa_pc/data_structures.rs
@@ -140,7 +140,7 @@
pub type PreparedCommitment<E> = Commitment<E>;
impl<G: AffineCurve> PCPreparedCommitment<Commitment<G>> for PreparedCommitment<G> {
/// prepare `PreparedCommitment` from `Commitment`
fn prepare(vk: &Commitment<G>) -> Self {
vk.clone()
*vk
}
}

51 changes: 28 additions & 23 deletions src/ipa_pc/mod.rs
@@ -54,9 +54,9 @@
impl<G: AffineCurve, D: Digest, P: UVPolynomial<G::ScalarField>> InnerProductArg

let mut comm = VariableBaseMSM::multi_scalar_mul(comm_key, &scalars_bigint);

if randomizer.is_some() {
if let Some(randomizer) = randomizer {
assert!(hiding_generator.is_some());
comm += &hiding_generator.unwrap().mul(randomizer.unwrap());
comm += &hiding_generator.unwrap().mul(randomizer);
}

comm
@@ -169,8 +169,8 @@
impl<G: AffineCurve, D: Digest, P: UVPolynomial<G::ScalarField>> InnerProductArg
let h_prime = h_prime.into_affine();

let check_commitment_elem: G::Projective = Self::cm_commit(
&[proof.final_comm_key.clone(), h_prime],
&[proof.c.clone(), v_prime],
&[proof.final_comm_key, h_prime],
&[proof.c, v_prime],
None,
None,
);
@@ -256,21 +256,21 @@
impl<G: AffineCurve, D: Digest, P: UVPolynomial<G::ScalarField>> InnerProductArg
let mut commitments = Vec::new();

let mut i = 0;
for info in lc_info.into_iter() {
for info in lc_info.iter() {
npwardberkeley marked this conversation as resolved.
let commitment;
let label = info.0.clone();
let degree_bound = info.1;

if degree_bound.is_some() {
commitment = Commitment {
comm: comms[i].clone(),
shifted_comm: Some(comms[i + 1].clone()),
comm: comms[i],
shifted_comm: Some(comms[i + 1]),
};

i += 2;
} else {
commitment = Commitment {
comm: comms[i].clone(),
comm: comms[i],
shifted_comm: None,
};

@@ -280,7 +280,7 @@
impl<G: AffineCurve, D: Digest, P: UVPolynomial<G::ScalarField>> InnerProductArg
commitments.push(LabeledCommitment::new(label, commitment, degree_bound));
}

return commitments;
commitments
}

fn sample_generators(num_generators: usize) -> Vec<G> {
@@ -362,15 +362,15 @@

let ck = CommitterKey {
comm_key: pp.comm_key[0..(supported_degree + 1)].to_vec(),
h: pp.h.clone(),
s: pp.s.clone(),
h: pp.h,
s: pp.s,
max_degree: pp.max_degree(),
};

let vk = VerifierKey {
comm_key: pp.comm_key[0..(supported_degree + 1)].to_vec(),
h: pp.h.clone(),
s: pp.s.clone(),
h: pp.h,
s: pp.s,
max_degree: pp.max_degree(),
};

@@ -380,6 +380,7 @@
}

/// Outputs a commitment to `polynomial`.
#[allow(clippy::type_complexity)]
fn commit<'a>(
ck: &Self::CommitterKey,
polynomials: impl IntoIterator<Item = &'a LabeledPolynomial<G::ScalarField, P>>,
@@ -545,18 +546,19 @@
let combined_v = combined_polynomial.evaluate(point);

// Pad the coefficients to the appropriate vector size
let d = ck.supported_degree();
let degree = ck.supported_degree();

// `log_d` is ceil(log2 (d + 1)), which is the number of steps to compute all of the challenges
let log_d = ark_std::log2(d + 1) as usize;
let log_d = ark_std::log2(degree + 1) as usize;

let mut combined_commitment;
let mut hiding_commitment = None;

if has_hiding {
let mut rng = rng.expect("hiding commitments require randomness");
let hiding_time = start_timer!(|| "Applying hiding.");
let mut hiding_polynomial = P::rand(d, &mut rng);

let mut hiding_polynomial = P::rand(degree, &mut rng);
hiding_polynomial -= &P::from_coefficients_slice(&[hiding_polynomial.evaluate(point)]);

let hiding_rand = G::ScalarField::rand(rng);
@@ -597,8 +599,10 @@
None
};

let proof_time =
start_timer!(|| format!("Generating proof for degree {} combined polynomial", d + 1));
let proof_time = start_timer!(|| format!(
"Generating proof for degree {} combined polynomial",
degree + 1
));

combined_commitment = combined_commitment_proj.into_affine();

@@ -610,18 +614,19 @@
let h_prime = ck.h.mul(round_challenge).into_affine();

// Pads the coefficients with zeroes to get the number of coeff to be d+1

let mut coeffs = combined_polynomial.coeffs().to_vec();
if coeffs.len() < d + 1 {
for _ in coeffs.len()..(d + 1) {
if coeffs.len() < degree + 1 {
for _ in coeffs.len()..(degree + 1) {
coeffs.push(G::ScalarField::zero());
}
}
let mut coeffs = coeffs.as_mut_slice();

// Powers of z
let mut z: Vec<G::ScalarField> = Vec::with_capacity(d + 1);
let mut z: Vec<G::ScalarField> = Vec::with_capacity(degree + 1);
let mut cur_z: G::ScalarField = G::ScalarField::one();
for _ in 0..(d + 1) {
for _ in 0..(degree + 1) {
z.push(cur_z);
cur_z *= point;
}
Expand All @@ -640,7 +645,7 @@ where
let mut l_vec = Vec::with_capacity(log_d);
let mut r_vec = Vec::with_capacity(log_d);

let mut n = d + 1;
let mut n = degree + 1;
while n > 1 {
let (coeffs_l, coeffs_r) = coeffs.split_at_mut(n / 2);
let (z_l, z_r) = z.split_at_mut(n / 2);
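The renamed `degree` variable in the hunks above drives two steps before the IPA rounds: zero-pad the combined polynomial's coefficients to length `degree + 1`, and build the vector of successive powers of the evaluation point `z`. A sketch of just those two steps, with `u64` standing in for `G::ScalarField` (the function names are illustrative):

```rust
// Zero-pad the coefficient vector up to degree + 1 entries, mirroring the
// `coeffs.push(G::ScalarField::zero())` loop in the diff.
fn pad_coeffs(mut coeffs: Vec<u64>, degree: usize) -> Vec<u64> {
    while coeffs.len() < degree + 1 {
        coeffs.push(0);
    }
    coeffs
}

// Build [1, z, z^2, ..., z^degree], mirroring the `cur_z *= point` loop.
fn powers_of_z(z: u64, degree: usize) -> Vec<u64> {
    let mut powers = Vec::with_capacity(degree + 1);
    let mut cur = 1u64;
    for _ in 0..=degree {
        powers.push(cur);
        cur *= z;
    }
    powers
}

fn main() {
    assert_eq!(pad_coeffs(vec![5, 7], 3), vec![5, 7, 0, 0]);
    assert_eq!(powers_of_z(2, 3), vec![1, 2, 4, 8]);
    println!("ok");
}
```

Both vectors must have exactly `degree + 1` entries so the subsequent `split_at_mut(n / 2)` folding rounds (with `n = degree + 1`) line up.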
33 changes: 30 additions & 3 deletions src/kzg10/data_structures.rs
@@ -1,6 +1,6 @@
use crate::*;
use ark_ec::{AffineCurve, PairingEngine, ProjectiveCurve};
use ark_ff::{PrimeField, ToBytes, Zero};
use ark_ff::{PrimeField, ToBytes, ToConstraintField, Zero};
use ark_std::{
borrow::Cow,
marker::PhantomData,
@@ -90,6 +90,24 @@
impl<E: PairingEngine> ToBytes for VerifierKey<E> {
}
}

impl<E: PairingEngine> ToConstraintField<<E::Fq as Field>::BasePrimeField> for VerifierKey<E>
where
E::G1Affine: ToConstraintField<<E::Fq as Field>::BasePrimeField>,
E::G2Affine: ToConstraintField<<E::Fq as Field>::BasePrimeField>,
{
fn to_field_elements(&self) -> Option<Vec<<E::Fq as Field>::BasePrimeField>> {
// TODO: gamma_g is omitted because our constraint system does not use it; this is somewhat problematic.
Review comment (Member): Isn't gamma_g used for hiding?

// The order should accommodate the one in the constraints.rs, which takes g, h, and beta_h.
let mut res = Vec::new();

res.extend_from_slice(&self.g.to_field_elements().unwrap());
res.extend_from_slice(&self.h.to_field_elements().unwrap());
res.extend_from_slice(&self.beta_h.to_field_elements().unwrap());

Some(res)
}
}

/// `PreparedVerifierKey` is the fully prepared version for checking evaluation proofs for a given commitment.
/// We omit gamma here for simplicity.
#[derive(Derivative)]
@@ -109,7 +127,7 @@
impl<E: PairingEngine> PreparedVerifierKey<E> {
let supported_bits = E::Fr::size_in_bits();

let mut prepared_g = Vec::<E::G1Affine>::new();
let mut g = E::G1Projective::from(vk.g.clone());
let mut g = E::G1Projective::from(vk.g);
for _ in 0..supported_bits {
prepared_g.push(g.clone().into());
g.double_in_place();
@@ -170,6 +188,15 @@
impl<'a, E: PairingEngine> AddAssign<(E::Fr, &'a Commitment<E>)> for Commitment<
}
}

impl<E: PairingEngine> ToConstraintField<<E::Fq as Field>::BasePrimeField> for Commitment<E>
where
E::G1Affine: ToConstraintField<<E::Fq as Field>::BasePrimeField>,
{
fn to_field_elements(&self) -> Option<Vec<<E::Fq as Field>::BasePrimeField>> {
self.0.to_field_elements()
}
}

/// `PreparedCommitment` commits to a polynomial and prepares for mul_bits.
#[derive(Derivative)]
#[derivative(
@@ -189,7 +216,7 @@
impl<E: PairingEngine> PreparedCommitment<E> {
/// prepare `PreparedCommitment` from `Commitment`
pub fn prepare(comm: &Commitment<E>) -> Self {
let mut prepared_comm = Vec::<E::G1Affine>::new();
let mut cur = E::G1Projective::from(comm.0.clone());
let mut cur = E::G1Projective::from(comm.0);

let supported_bits = E::Fr::size_in_bits();

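The `double_in_place` loops in `PreparedVerifierKey::prepare` and `PreparedCommitment::prepare` above precompute the multiples `base * 2^i`, so a later `mul_bits`-style scalar multiplication reduces to summing the entries selected by the scalar's bits. An integer sketch of that idea, with `u64` addition standing in for group addition (names are illustrative, not the crate's API):

```rust
// Precompute base * 2^i for i in 0..bits, mirroring the doubling loop.
fn prepare(base: u64, bits: usize) -> Vec<u64> {
    let mut prepared = Vec::with_capacity(bits);
    let mut g = base;
    for _ in 0..bits {
        prepared.push(g);
        g = g.wrapping_add(g); // "double_in_place"
    }
    prepared
}

// Scalar multiplication as a sum over the scalar's set bits.
fn mul_bits(prepared: &[u64], scalar: u64) -> u64 {
    let mut acc = 0u64;
    for (i, p) in prepared.iter().enumerate() {
        if (scalar >> i) & 1 == 1 {
            acc = acc.wrapping_add(*p);
        }
    }
    acc
}

fn main() {
    let prepared = prepare(5, 8); // [5, 10, 20, 40, ...]
    assert_eq!(mul_bits(&prepared, 11), 55); // 11 = 0b1011: 5 + 10 + 40
    println!("ok");
}
```

In the real code `supported_bits` is `E::Fr::size_in_bits()`, so the table covers any scalar in the field.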
10 changes: 6 additions & 4 deletions src/kzg10/mod.rs
@@ -143,6 +143,7 @@
}

/// Outputs a commitment to `polynomial`.
#[allow(clippy::type_complexity)]
pub fn commit(
powers: &Powers<E>,
polynomial: &P,
@@ -201,6 +202,7 @@
/// The witness polynomial w(x) is the quotient of the division (p(x) - p(z)) / (x - z)
/// Observe that this quotient does not change with z because
/// p(z) is the remainder term. We can therefore omit p(z) when computing the quotient.
#[allow(clippy::type_complexity)]
pub fn compute_witness_polynomial(
p: &P,
point: P::Point,
@@ -226,7 +228,7 @@
Ok((witness_polynomial, random_witness_polynomial))
}

pub(crate) fn open_with_witness_polynomial<'a>(
pub(crate) fn open_with_witness_polynomial(
powers: &Powers<E>,
point: P::Point,
randomness: &Randomness<E::Fr, P>,
@@ -270,7 +272,7 @@
}

/// On input a polynomial `p` and a point `point`, outputs a proof for the same.
pub(crate) fn open<'a>(
pub(crate) fn open(
powers: &Powers<E>,
p: &P,
point: P::Point,
@@ -435,12 +437,12 @@
if enforced_degree_bounds.binary_search(&bound).is_err() {
Err(Error::UnsupportedDegreeBound(bound))
} else if bound < p.degree() || bound > max_degree {
return Err(Error::IncorrectDegreeBound {
Err(Error::IncorrectDegreeBound {
poly_degree: p.degree(),
degree_bound: p.degree_bound().unwrap(),
supported_degree,
label: p.label().to_string(),
});
})
} else {
Ok(())
}
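The final hunk replaces `return Err(...)` inside an `else if` with a plain tail expression, satisfying clippy's `needless_return`. A self-contained sketch of the same validation shape, with plain `usize` fields in place of the crate's polynomial types (names are illustrative):

```rust
// Sketch of the degree-bound validation: every branch is now a tail
// expression, which is exactly what the diff changes.
#[derive(Debug, PartialEq)]
enum CheckError {
    UnsupportedDegreeBound(usize),
    IncorrectDegreeBound { poly_degree: usize, degree_bound: usize },
}

fn check_degree_bound(
    enforced: &[usize], // must be sorted, since binary_search assumes order
    bound: usize,
    poly_degree: usize,
    max_degree: usize,
) -> Result<(), CheckError> {
    if enforced.binary_search(&bound).is_err() {
        Err(CheckError::UnsupportedDegreeBound(bound))
    } else if bound < poly_degree || bound > max_degree {
        Err(CheckError::IncorrectDegreeBound { poly_degree, degree_bound: bound })
    } else {
        Ok(())
    }
}

fn main() {
    assert_eq!(check_degree_bound(&[4, 8], 8, 5, 16), Ok(()));
    assert_eq!(
        check_degree_bound(&[4, 8], 6, 5, 16),
        Err(CheckError::UnsupportedDegreeBound(6))
    );
    println!("ok");
}
```

Since all three branches are the last expression of the `if`/`else if`/`else` chain, the explicit `return` (and its trailing semicolon) was dead weight; behavior is unchanged.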