Organization
- @megaeth
Engagement Type
Cantina Reviews
Period
-
Repositories
Findings
Medium Risk
4 findings
1 fixed
3 acknowledged
Low Risk
10 findings
5 fixed
5 acknowledged
Informational
9 findings
4 fixed
5 acknowledged
Medium Risk (4 findings)
Non-unique internal representation of Element.
Summary
The elements of the Banderwagon subgroup can be internally represented in a non-unique way, yet this is accounted for only in some parts of the implementation.
Finding Description
This is not always true: since double internal representation is allowed, the identity can be both (0, 1) and (0, -1). We can get the first representation by calling Element::zero() and the second with Element::from_bytes(&[0u8; 32]). These lead to different encodings.
Impact Explanation
Multiple representations of points can lead to different serializations of the same point and faulty equality checks. These, in turn, can result in correctness and malleability issues for the protocols built on top.
Likelihood Explanation
Proof of Concept (if required)
#[test]
fn batch_zero_element_to_commitments() {
    let zero = Element::from_bytes(&[0u8; 32]).unwrap();
    assert!(zero.is_zero());
    let hash_bytes_1 = Element::batch_to_commitments(&[zero])[0];
    let hash_bytes_2 = zero.to_bytes_uncompressed();
    dbg!(hash_bytes_1, hash_bytes_2);
    assert_eq!(hash_bytes_1, hash_bytes_2);
    let intended_bytes = Element::batch_to_commitments(&[Element::zero()])[0];
    assert_ne!(hash_bytes_1, intended_bytes);
}

Recommendation (optional)
Enforce a unique internal representation of the elements by controlling the entry points that allow the creation of the struct.
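As a hedged illustration of the recommended direction (toy integer coordinates instead of the real curve types; CanonicalPoint is a hypothetical name, not the library's Element), a constructor can canonicalize the representative so that equivalent inputs compare and serialize identically:

```rust
// Illustrative sketch only: a toy "element" whose sole constructor
// canonicalizes the internal representation. The real fix would apply the
// same idea to Element's entry points (zero(), from_bytes(), ...).

#[derive(Debug, PartialEq, Eq)]
struct CanonicalPoint {
    // Private fields: the struct can only be built through `new`,
    // so every stored point is in canonical form.
    x: i64,
    y: i64,
}

impl CanonicalPoint {
    // Treat (x, y) and (-x, -y) as the same abstract element; pick the
    // representative with non-negative y (and non-negative x when y == 0)
    // as the canonical one.
    fn new(x: i64, y: i64) -> Self {
        if y < 0 || (y == 0 && x < 0) {
            CanonicalPoint { x: -x, y: -y }
        } else {
            CanonicalPoint { x, y }
        }
    }
}

fn main() {
    // Two representations of the same abstract element...
    let a = CanonicalPoint::new(0, 1);
    let b = CanonicalPoint::new(0, -1);
    // ...collapse to a single internal form, so equality checks and any
    // serialization derived from the fields agree.
    assert_eq!(a, b);
    println!("ok");
}
```

Because construction is the only way to obtain the struct, equality and serialization can stay as simple field-wise operations.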
Incorrect Banderwagon elements.
Summary
Element can be created holding incorrect Banderwagon elements.
Finding Description
The internal EdwardsProjective point should not be public. This allows the creation of Elements which do not satisfy:
- The curve equation.
- The subgroup check.
The upper layers (like the IPA commitment scheme) rely on the correctness of the points for their security guarantees.
Impact Explanation
Incorrect elements - elements that are not on the curve or in the appropriate subgroup - compromise the soundness of the vector commitment scheme and can also lead to malleability issues.
Likelihood Explanation
Elements are often created as a wrapper of an Edwards projective point, bypassing the crucial subgroup check.
Recommendation (optional)
- Make the inner EdwardsProjective private.
- Only allow the creation of Element through designated methods that ensure the correctness of the element.
Errors are handled with panic affecting external integrations
State
- Acknowledged
Severity
- Severity: Medium (Likelihood: Low × Impact: High)
Submitted by
0xluk3
Description
Libraries should never terminate execution through panic!, as doing so prevents callers from handling errors safely. In Rust, panics may abort the process, making it impossible for external software using the library to recover or report meaningful errors. When SALT is integrated as a reusable library within larger systems, callers must be able to receive structured error information (e.g., through a Result or a boolean return) instead.
This is particularly problematic when user-supplied input can trigger the panic, as it enables external data to cause uncontrolled termination of the calling program and may lead to denial of service.
Full list of potentially problematic functions if exposed outside of the library (in progress)
- In ipa_multipoint/src/crs.rs, from_bytes expects at least one element to construct the CRS, otherwise it panics:

#[allow(non_snake_case)]
pub fn from_bytes(bytes: &[[u8; 64]]) -> CRS {
    let (q_bytes, g_vec_bytes) = bytes
        .split_last()
        .expect("bytes vector should not be empty");

In the same file, impl Index<usize> for CRS will panic if the index is out of bounds:

impl std::ops::Index<usize> for CRS {
    type Output = Element;

    fn index(&self, index: usize) -> &Self::Output {
        &self.G[index]
    }
}

- In ipa_multipoint/src/lagrange_basis.rs, LagrangeBasis::evaluate_in_domain panics on out-of-bounds access:

pub fn evaluate_in_domain(&self, index: usize) -> Fr {
    self.values[index]
}

- In salt/src/trie/trie.rs, StateRoot::rebuild uses .map_err(|_| unreachable!()); additionally, the public methods on StateRoot that modify state (update, finalize, and update_fin) use expect() in multiple places.
- In salt/src/state/updates.rs, StateUpdates::add(&mut self, salt_key: SaltKey, old_value: Option<SaltValue>, new_value: Option<SaltValue>) panics in assert_eq!(old_value, change.get().1, "Invalid state transition");
- In salt/src/state/state.rs, the functions EphemeralSaltState::shi_rehash and EphemeralSaltState::metadata use expect().
- In salt/src/trie/node_utils.rs, get_child_node(parent: &NodeId, child_idx: usize) may lead to an index out of bounds on STARTING_NODE_ID.
- In salt/src/types.rs, several functions can panic as well, for instance: SaltValue::new in copy_from_slice, bucket_metadata_key in assert!(is_valid_data_bucket(bucket_id), ...), bucket_id_from_metadata_key(key: SaltKey) in assert!(METADATA_KEYS_RANGE.contains(&key), ...), and pub fn is_subtree_node(node_id: NodeId).
Recommendation (optional)
Consider changing the panic occurrences to Error types that can be returned to the caller.
Missing consistency check between b_vec and input_point
State
- Acknowledged
Severity
- Severity: Medium (Likelihood: Low × Impact: High)
Submitted by
dnevado
Summary
The input_point for IPA verification must be consistent with the input b_vec, which must hold the powers of the input point in Lagrange form.
Finding Description
There are no explicit checks that guarantee the consistency between input_point and b_vec. Moreover, b_vec never influences the transcript, so controlling this vector completely compromises the verification of the proof.
In addition, this vector is directly fed into the inner_product function, which has an issue that could be chained to this one (see "Unchecked length in inner product computation").
Impact Explanation
A malicious or incoherent b_vec completely compromises verification.
Likelihood Explanation
It is unlikely that the issue gets exploited as the function is only called internally.
Recommendation (optional)
The verification function that is exposed by the library should not take this input; it should receive only input_point. A possible solution is to make this function private and expose a wrapper that first computes the right b_vec and then calls the original function. It should also be clearly documented what each input of the function must be.
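A hedged sketch of that wrapper shape, using u64 arithmetic and a toy check in place of the real field and IPA types (verify_with_b_vec, compute_b_vec, and DOMAIN_SIZE are hypothetical names):

```rust
// The idea: the public entry point takes only input_point, derives b_vec
// itself, and forwards to the now-private routine, so callers can never
// supply an inconsistent b_vec.

const DOMAIN_SIZE: usize = 4;

// Private: trusts that b_vec matches input_point.
fn verify_with_b_vec(claimed: u64, a_vec: &[u64], b_vec: &[u64]) -> bool {
    let ip: u64 = a_vec.iter().zip(b_vec).map(|(a, b)| a * b).sum();
    ip == claimed
}

// Derive b_vec deterministically from input_point. Shown here as plain
// monomial powers [1, z, z^2, ...]; the real code would compute the
// Lagrange-basis evaluation vector for the domain.
fn compute_b_vec(input_point: u64) -> Vec<u64> {
    let mut b = Vec::with_capacity(DOMAIN_SIZE);
    let mut acc = 1u64;
    for _ in 0..DOMAIN_SIZE {
        b.push(acc);
        acc *= input_point;
    }
    b
}

// Public wrapper: b_vec is no longer caller-controlled.
pub fn verify(claimed: u64, a_vec: &[u64], input_point: u64) -> bool {
    verify_with_b_vec(claimed, a_vec, &compute_b_vec(input_point))
}

fn main() {
    // b_vec for input_point = 2 is [1, 2, 4, 8]; <[1,1,1,1], b_vec> = 15.
    assert!(verify(15, &[1, 1, 1, 1], 2));
    assert!(!verify(3, &[1, 1, 1, 1], 2));
    println!("ok");
}
```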
Low Risk (10 findings)
Unchecked length in inner product computation
Summary
Different-length inputs to inner_product lead to an incorrect result.
Finding Description
The correctness of this function requires a.len() == b.len(). A short input can be passed in order to ignore some values of the other input.
A malicious short input in this function could be used to ignore or short-circuit values of the other input. This could be used to ignore challenges of the vector commitment scheme protocol and compromise the soundness of the proof.
Likelihood Explanation
This would need to be chained to other bugs in order to be exploited.
Proof of Concept (if required)
Recommendation (optional)
- Assert the lengths of the inputs are equal.
- Reduce the visibility of the function.
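A minimal sketch of the first recommendation, shown over i64 instead of the real scalar field type: make the length precondition explicit rather than letting zip silently truncate.

```rust
// Length-checked inner product: mismatched inputs are rejected instead of
// silently ignoring the trailing values of the longer one.
fn inner_product(a: &[i64], b: &[i64]) -> Result<i64, &'static str> {
    if a.len() != b.len() {
        return Err("inner_product: inputs must have equal length");
    }
    Ok(a.iter().zip(b.iter()).map(|(x, y)| x * y).sum())
}

fn main() {
    // 1*4 + 2*5 + 3*6 = 32
    assert_eq!(inner_product(&[1, 2, 3], &[4, 5, 6]), Ok(32));
    // A short second input is now an error, not a truncated sum.
    assert!(inner_product(&[1, 2, 3], &[4, 5]).is_err());
    println!("ok");
}
```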
Collisions in map_to_scalar
Summary
It is possible to find different, non-equivalent preimages of the same scalar. This map is not 2-to-1.
Finding Description
Different non-equivalent points can map to the same scalar field element. These points can be computed easily.
The map_to_scalar_field function maps a curve point into an element of the scalar field, but it is not 2-to-1. It does so in two steps:
- First, it maps the point to an element in the base field: (x, y) -> x/y. This function maps equivalent points in the Banderwagon subgroup to the same base field element.
- Second, the base field element is mapped onto the scalar field by serializing it to bytes and then deserializing. Effectively, it interprets the base field element as an integer in [0, p-1] and reduces it mod r (where p and r are the orders of the base and scalar fields respectively). Since p is several times larger than r, this map is not injective.

p = 0x73eda753299d7d483339d80809a1d80553bda402fffe5bfeffffffff00000001
r = 0x1cfb69d4ca675f520cce760202687600ff8f87007419047174fd06b52876e7e1

Impact Explanation
Chained with other issues, this could be used to fake part of the path in the Verkle tree.
Likelihood Explanation
It is unlikely that the collision point can be used to build an exploit, since the underlying vector it commits to remains unknown.
Proof of Concept
find_collision_point takes an Element and returns (if possible) a different Element that maps to the same scalar as the original.

fn discriminant(k: Fq) -> Fq {
    let a = <BandersnatchConfig as TECurveConfig>::COEFF_A;
    let d = <BandersnatchConfig as TECurveConfig>::COEFF_D;
    let k_square = k.square();
    let four = Fq::from(4);
    (a * k_square + Fq::ONE).square() - four * d * k_square
}

fn find_point_for_k(k: Fq) -> Option<Element> {
    // Curve parameters and constants.
    let a = <BandersnatchConfig as TECurveConfig>::COEFF_A;
    let d = <BandersnatchConfig as TECurveConfig>::COEFF_D;
    let k_square = k.square();
    let disc_sq = if let Some(x) = discriminant(k).sqrt() {
        x
    } else {
        // No solutions for k + i*r
        return None;
    };
    let y_square =
        (a * k_square + Fq::ONE + disc_sq) * (Fq::from(2) * d * k_square).inverse().unwrap();
    let y = if let Some(y) = y_square.sqrt() {
        y
    } else {
        let y_square =
            (a * k_square + Fq::ONE - disc_sq) * (Fq::from(2) * d * k_square).inverse().unwrap();
        y_square.sqrt().expect("Should have found a valid y here")
    };
    let x = k * y;
    let p = EdwardsAffine::new_unchecked(x, y);
    Some(Element(p.into()))
}

// Searches for a different element that serializes to the same scalar.
fn find_collision_point(e: Element) -> Option<Element> {
    let r_bytes = [
        0x1c, 0xfb, 0x69, 0xd4, 0xca, 0x67, 0x5f, 0x52, 0x0c, 0xce, 0x76, 0x02, 0x02, 0x68, 0x76,
        0x00, 0xff, 0x8f, 0x87, 0x00, 0x74, 0x19, 0x04, 0x71, 0x74, 0xfd, 0x06, 0xb5, 0x28, 0x76,
        0xe7, 0xe1,
    ];
    let r = Fq::from_be_bytes_mod_order(&r_bytes);
    // Input element:
    println!("Input point:");
    println!("{}", e.0.into_affine());
    let mut k = e.map_to_field(); // x/y
    for i in 1..4 {
        k += r;
        if let Some(p) = find_point_for_k(k) {
            let p = p.0.into_affine();
            if p.is_on_curve() {
                println!("On curve!");
                if p.is_in_correct_subgroup_assuming_on_curve() {
                    println!("... and on subgroup!");
                    if p != e.0 && p != -e.0 {
                        println!("and different to input and its negation");
                        println!("Iteration {}. Found point:", i);
                        println!("{}", p);
                    }
                    return Some(Element(p.into()));
                }
            }
        }
    }
    None
}

Recommendation (optional)
- Properly document the properties of this map.
- Ensure that all commitments in the protocol are opened at some point. This makes the collision points useless, because the underlying vector that such points commit to remains unknown.
Mitigations of this issue are discussed in this blog post.
Missing Valid implementation.
Summary
The implementation of check() in the trait Valid is missing; it just returns Ok(()).
Finding Description
There is no check of validity; Ok is returned for any element.
Impact Explanation
If an invalid element were passed as valid due to this error, the security guarantees of the vector commitment scheme would be compromised.
Likelihood Explanation
Since the trait is not currently being used, it is unlikely that the issue will have an impact.
Recommendation (optional)
One of the following:
- Drop this trait if it is unnecessary.
- Properly check the validity of the element:
  - On-curve check: the element satisfies the curve equation.
  - Subgroup check: in this case, membership in the subgroup of size 2p, by checking that 1 - ax^2 is a quadratic residue.
  - If a canonical representation is enforced: check the correct sign of y.
Faulty PartialEq for LagrangeBasis
State
- Acknowledged
Severity
- Severity: Low (Likelihood: Low × Impact: Low)
Submitted by
dnevado
Summary
Equal polynomials may appear as unequal.
Finding Description
Multiple representations are possible for the same polynomial. However, different representations will not appear as equal even if they represent the same polynomial.
Proof of Concept (if required)
These polynomials are the zero polynomial, and should be equal.
fn test_poly_eq() {
    let zero = Fr::from(0u128);
    let poly1 = LagrangeBasis::new(vec![zero, zero]);
    let poly2 = LagrangeBasis::new(vec![zero]);
    assert_eq!(poly1, poly2);
}

Recommendation (optional)
PartialEq should be implemented manually to account for the multiple possible representations of a polynomial. Alternatively, the new method could compute a canonical representation.
Faulty Add implementation for LagrangeBasis
State
- Acknowledged
Severity
- Severity: Low (Likelihood: Low × Impact: Low)
Submitted by
dnevado
Summary
Addition of polynomials with different value length returns erroneous result.
Finding Description
It is necessary to check that lhs.values and rhs.values have the same length. If one of the input polynomials' value vectors is longer, some of the values of the other input are ignored, resulting in an erroneous output.
Proof of Concept (if required)
fn test_poly_add() {
    let zero = Fr::from(0u128);
    let one = Fr::from(1u128);
    let poly1 = LagrangeBasis::new(vec![zero, one]);
    let poly2 = LagrangeBasis::new(vec![zero]);
    let result1 = poly2.clone() + &poly1;
    let result2 = poly1 + &poly2;
    let expected = LagrangeBasis::new(vec![zero, one]);
    assert_eq!(result1, result2);
    assert_eq!(result1, expected);
}

Recommendation
Pad the shorter value vector with zeros up to the length of the longer one; the implemented algorithm will then return the correct result.
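The padding recommendation can be sketched as follows, over i64 values instead of the real scalar field elements (poly_add is an illustrative free function, not the library's Add impl):

```rust
// Padded addition: the shorter value vector is treated as if extended with
// zeros, so the result is total and independent of operand order.
fn poly_add(lhs: &[i64], rhs: &[i64]) -> Vec<i64> {
    let n = lhs.len().max(rhs.len());
    (0..n)
        .map(|i| lhs.get(i).copied().unwrap_or(0) + rhs.get(i).copied().unwrap_or(0))
        .collect()
}

fn main() {
    // [0, 1] + [0] and [0] + [0, 1] now agree, matching the PoC's
    // expectation of [0, 1].
    assert_eq!(poly_add(&[0, 1], &[0]), vec![0, 1]);
    assert_eq!(poly_add(&[0], &[0, 1]), vec![0, 1]);
    println!("ok");
}
```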
Generic error handling reduces interoperability
State
- Acknowledged
Severity
- Severity: Low (Likelihood: Low × Impact: Low)
Submitted by
0xluk3
Summary
Generic error types reduce library interoperability and make error handling difficult for external integrations.
Finding Description
Throughout the codebase, errors are returned with generic types (e.g., ProofError::StateReadError, ProofError::InvalidLookupTable) accompanied by debug-formatted string reasons. The error handling pattern takes the following form:

.map_err(|e| ProofError::StateReadError {
    reason: format!("{e:?}"),
})?;

Occurrences:
- salt/src/proof/prover.rs - generic StateReadError usage
- salt/src/proof/witness.rs - generic error handling in witness generation
- salt/src/proof/witness.rs - InvalidLookupTable used for multiple distinct failure cases
- salt/src/state/state.rs - key mismatch and empty slot both treated as "not found"
- salt/src/proof/subtrie.rs - storage read errors (for metadata, node entries, and commitments) are all collapsed into the same generic ProofError::StateReadError
- salt/src/proof/prover.rs:91 - SerdeMultiPointProof serialization logic converts I/O errors to generic serde errors via to_string(), hiding the original std::io::Error kind
- salt/src/state/state.rs - shi_rehash fails silently by returning Ok(()) in line 410 instead of an error when the provided new_capacity is insufficient, leaving the caller unaware that the operation was aborted
- salt/src/state/state.rs - the direct_find helper function conflates multiple distinct outcomes (key not in the lookup table, key's value is empty, key mismatch from hash collision) into a single Ok(None) result
Impact Explanation
Library consumers (external integrations, any software built on top) cannot distinguish between different failure modes like bucket metadata read errors, SHI search errors, key mismatches, or empty slots without fragile string parsing. This makes error handling difficult and reduces the reliability of integrating systems. Moreover, the reason format might differ among operating systems, requiring integrators to handle errors differently depending on the OS.
Recommendation
A better solution would be to return errors that reflect the failure reason: parse the underlying failure at the call site and return a corresponding error, for instance BucketMetadataReadError, SHISearchError, etc. Errors should be defined to be self-explanatory, without an additional message or reason field.
The Validate::Yes path ignores validation
Summary
The match handling of the validation flag routes both Validate::Yes and Validate::No to the same non-validating deserialization routine. As a result, callers that explicitly request validation receive none.
This can allow malformed, non-canonical, off-curve, or wrong-subgroup encodings to be accepted silently wherever validated decoding was expected, undermining soundness guarantees and any downstream assumptions that "validated bytes" were enforced.
Recommendation
Implement a truly validating branch for Validate::Yes that enforces canonical encoding, on-curve checks, and subgroup membership before constructing the element. Keep the non-validating branch only for trusted, internal use, and consider returning a Result from the validating path to propagate decoding errors cleanly.
Unchecked CRS point deserialization
Summary
CRS loading uses the unchecked, uncompressed Element::from_bytes_unchecked_uncompressed point constructor for each entry, even though the documentation says the byte array contains serialized elliptic curve points. Without canonical-encoding, on-curve, and subgroup checks, a tampered CRS can contain invalid points.
This can break the binding and hiding properties and, in the worst case, enable proof forgeries or other algebraic traps if an attacker controls the CRS bytes.
Recommendation
Replace unchecked deserialization with a checked routine that verifies canonical encoding, on-curve status, and subgroup membership for every CRS point before acceptance. Refuse initialization on any failure and surface a descriptive error.
SaltValue::new truncates lengths to u8 and lacks bounds checks
State
- Acknowledged
Severity
- Severity: Low (Likelihood: Low × Impact: Low)
Submitted by
JakubHeba
Summary
The new constructor accepts arbitrary-length key and value slices, stores their lengths in single-byte fields, and then copies slices of the original (usize) lengths into a fixed-size buffer. There are no checks that each length fits in a byte or that the total encoded size fits within the backing array.
This creates two problems:
- Casting the lengths to u8 can mis-encode any input longer than 255 bytes, producing headers that don't match the actual copied data.
- If 2 + key_len + value_len exceeds the buffer capacity, the slice copies will panic, yielding a reliable DoS when fed oversized inputs.
Downstream readers that trust the stored length bytes may then see inconsistent or malformed encodings.
Recommendation
Make the constructor fallible and validate inputs before writing. Return an explicit error on violation instead of panicking.
SaltValue::key() and value() trust embedded lengths without consistency checks
State
- Acknowledged
Severity
- Severity: Low (Likelihood: Low × Impact: Low)
Submitted by
JakubHeba
Summary
The methods derive slice ranges from the first two bytes (key length and value length) and index directly into the buffer without verifying that 2 + key_len + value_len fits within the actual buffer size.
A maliciously crafted value can cause out-of-bounds slicing and a panic, resulting in a denial of service when processing untrusted witnesses.
Recommendation
Introduce a validated parsing routine that checks the buffer length before slicing and returns a Result on malformed inputs. Enforce that the total length matches the header-declared lengths and is within the maximum allowed size.
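The validated accessor might be sketched like this, assuming the layout described in the finding (buf[0] = key_len, buf[1] = value_len, followed by the bytes; parse is a hypothetical helper name):

```rust
// Check the header-declared range once against the real buffer size
// before slicing, so malformed input becomes an Err instead of a panic.
fn parse(buf: &[u8]) -> Result<(&[u8], &[u8]), &'static str> {
    if buf.len() < 2 {
        return Err("buffer too short for header");
    }
    let key_len = buf[0] as usize;
    let value_len = buf[1] as usize;
    let total = 2 + key_len + value_len;
    if total > buf.len() {
        return Err("header-declared lengths exceed buffer");
    }
    Ok((&buf[2..2 + key_len], &buf[2 + key_len..total]))
}

fn main() {
    // Well-formed: key = [0xAA], value = [0xBB, 0xCC].
    let good = [1u8, 2, 0xAA, 0xBB, 0xCC];
    let (k, v) = parse(&good).unwrap();
    assert_eq!(k.to_vec(), vec![0xAA]);
    assert_eq!(v.to_vec(), vec![0xBB, 0xCC]);

    // A malicious header claiming more bytes than the buffer holds is
    // rejected instead of panicking on out-of-bounds slicing.
    let evil = [200u8, 200, 0xAA];
    assert!(parse(&evil).is_err());
    println!("ok");
}
```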
Informational (9 findings)
Non-uniformly random CRS
State
- Acknowledged
Severity
- Severity: Informational (Likelihood: Low × Impact: Low)
Submitted by
dnevado
Summary
The method for deriving random points for the CRS does not follow a uniform distribution.
Finding Description
This method of generating a random coordinate is not uniform. It derives a random number from a uniform distribution in the range [0, 2^256) and then reduces it modulo Fq::MODULUS. The result is no longer uniform; in fact, it is significantly skewed:

2^256 = 2 * Fq::MODULUS + R, where R is 2^256 mod Fq::MODULUS

Hence, the elements of Fq in the range [0, R) have a 50% higher probability of being drawn than the elements in the range [R, Fq::MODULUS).
Impact Explanation
The impact is minimal. The CRS has not been generated in a completely uniformly random way but in practical terms there is no way to exploit this.
Recommendation (optional)
The usual approach to mitigate this issue is to generate at least 128 bits of extra randomness before performing the reduction. This way, the skewing is greatly reduced. Here is a detailed explanation and a reference implementation: https://github.com/zkcrypto/ff/blob/41fb01b8990909b8b1b44d4601556694c473bc16/src/lib.rs#L422-L448
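A scaled-down toy demonstration of the "oversample then reduce" technique, using u128 arithmetic and a tiny stand-in modulus (the real implementation would draw 256 + 128 bits and reduce modulo Fq::MODULUS):

```rust
const MODULUS: u128 = 1_000_003; // stand-in for Fq::MODULUS

// With extra bits of randomness, each residue class is hit either
// floor(range / MODULUS) or ceil(range / MODULUS) times, so the relative
// bias shrinks as the sample range grows far beyond the modulus.
fn reduce_wide(sample: u128) -> u128 {
    sample % MODULUS
}

fn main() {
    // Narrow sampling: with a 20-bit sample space over a ~20-bit modulus,
    // residues below 2^20 - MODULUS can be hit twice, larger ones once,
    // i.e., some residues are 2x as likely (the 50%-style skew above).
    let narrow_range = 1u128 << 20;
    let overrepresented = narrow_range - MODULUS;
    assert_eq!(overrepresented, 48_573);

    // Wide sampling: with 64 extra bits the per-residue counts differ by
    // at most 1 out of roughly 2^64, a negligible relative bias.
    let wide_range = 1u128 << 84;
    let leftover = wide_range % MODULUS;
    assert!(leftover < MODULUS);
    assert_eq!(reduce_wide(wide_range - 1), (wide_range - 1) % MODULUS);
    println!("ok");
}
```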
Non-reflexive Eq in Element
Summary
Eq derived for Element does not satisfy the reflexive property.
Finding Description
Deriving Eq requires the reflexive property, i.e., that every Element is equal to itself. However, due to the conditional checks at the beginning of the PartialEq impl, this does not hold for the element (0, 0). This point is not on the curve, so it should be impossible to create; if Element guarantees the correctness of the point, this trait impl is fine.
Recommendation (optional)
Remove the implementation of Eq.
Summary
Error in the documentation of equivalent points in Banderwagon.
Finding Description
This comment seems to be wrong. (-x, y) and (x, y) are different points in Banderwagon, so their conversion to bytes should return different byte arrays.
Change doc to: "This is because if (-x, -y) is on the curve, then (x,y) is also on the curve. This method will return two different byte arrays for each of these."
Unconventional sign choice
State
- Acknowledged
Severity
- Severity: Informational (Likelihood: Low × Impact: Low)
Submitted by
dnevado
Summary
The designation of positive and negative elements in a prime field is arbitrary. Here, the convention chosen is the opposite of what is commonly used and described in the original spec of BLS12-381 and Jubjub.
Finding Description
We are considering the lexicographically largest element to be the positive one, but this is the opposite of what the BLS specification suggests, which is: the values of Fq, viewed as integers in [0, p), are considered positive if they lie in the first half, [0, (p-1)/2], and negative in the other half.
From page 11 of the ZCash spec: "... means the positive square root of a in Fq, i.e. in the range {0 .. (q-1)/2}."
The current implementation satisfies is_positive(-Fq::ONE) == true, which can be misleading.
Minimal. Just counter-intuitive behavior.
Likelihood Explanation
Minimal. It may lead to conflicting implementations of serialization, but it is unlikely.
Recommendation (optional)
Document clearly what is meant by positive and how this affects the chosen representative of Banderwagon elements and their serialization.
Non-uniform transcript challenges
State
- Acknowledged
Severity
- Severity: Informational (Likelihood: Low × Impact: Low)
Submitted by
dnevado
Summary
The modular reduction used to obtain scalar challenges from the transcript breaks the uniformity of the distribution.
Finding Description
It would be good to get more bits of entropy when generating the scalar in order to reduce the bias and bring the challenge distribution closer to uniform. See the related finding "Non-uniformly random CRS" for the same bias mechanism.
Recommendation (optional)
Generate a random number in the range `[0, 2^(256 + 128))` (i.e., sample 128 extra bits of randomness) and reduce it modulo `r` to obtain the challenge scalar.
Constant description is inconsistent with implementation
Severity
- Severity: Informational (Likelihood: Low × Impact: Low)
Submitted by
0xluk3
Description
In salt/src/trie/trie.rs:924, a code comment states that empty entries are hashed to 0; however, the actual implementation in constants.rs defines

pub const EMPTY_SLOT_HASH: [u8; 32] = [1u8; 32];

which means that the array is filled with 1's.
While no security impact was found, it is recommended to adjust the comment to match the actual state of the code.
Comparison can be simplified
Severity
- Severity: Informational (Likelihood: Low × Impact: Low)
Submitted by
0xluk3
Description
The PartialEq implementation for SerdeCommitment uses unchecked deserialization, which is safe but unnecessarily complex. In salt/src/proof/prover.rs:39, the eq implementation directly compares raw bytes with from_bytes_unchecked_uncompressed. This differs from the deserialization path (e.g., in deserialize), which first calls Element::from_bytes for validation. While this is safe in context, it introduces unnecessary complexity to the comparison logic.
Consider changing SerdeCommitment to wrap an Element directly.
State
- Acknowledged
Severity
- Severity: Informational
Submitted by
JakubHeba
Summary
The add_deltas method uses the provided index to index into internal tables without bounds checks. If caller input were untrusted, an out-of-range index could cause a panic or memory access error.
In this codebase, the function is currently used only in unit tests and should be moved under a test module; it does not present a production risk.
Recommendation
No production change is required if the function remains test-only. If it is ever re-introduced into non-test code, add explicit bounds checks (or safe indexing returning a Result) and treat indices as untrusted.
Unconventional use of CRS point Q
State
- Acknowledged
Severity
- Severity: Informational (Likelihood: Low × Impact: Low)
Submitted by
dnevado
Summary
The Q point in the CRS is not being used for its intended purpose.
Finding Description
The CRS provides a blinder point Q that allows adding zero-knowledge when using IPA as a PCS. However, Q is being used for a different purpose: as a generator for the challenge point of the IPA protocol. Since the dlog relation between Q and the G_i points in the CRS is unknown, this is not a problem for security.
Recommendation (optional)
Since ZK is not needed, the doc for this point can be updated to reflect its actual purpose.
This is a good reference implementation for the parameters needed in IPA: https://github.com/zcash/halo2/blob/2308caf68c48c02468b66cfc452dad54e355e32f/halo2_proofs/src/poly/commitment.rs#L26-L33