lnp-bp / client_side_validation
Standard implementation of client-side-validation APIs
Home Page: https://docs.rs/client_side_validation
License: Apache License 2.0
538417b was a temporary fix (for the WitnessVersion type); otherwise it breaks CI with the nightly compiler: https://github.com/LNP-BP/client_side_validation/runs/7593625092?check_suite_focus=true
After LNP-BP/LNPBPs#103, downstream crates should start using the modern PSBT implementation from the psbt crate (https://crates.io/crate/psbt), which also supports strict encoding and both v0 and v2 PSBT versions (requires the string_encoding flag in that crate).
To prevent divergence in the TLV implementation between structs implementing both lightning and strict encoding, the tlv attribute must have a universal name (like #[tlv] instead of #[network_encoding(tlv)]).
Propose a PR to grin_secp256k1zkp to make Error type Ord
Pending async traits becoming part of the MSRV (still years away)
While executing rgb-lib tests we found an intermittent issue that seems very similar to RGB-WG/rgb-node#208, but I'm not sure it's the same.
The error we receive is MerkleBlock conceal procedure is broken and comes from ~/.cargo/git/checkouts/client_side_validation-8dff74d720902144/46cb8a7/commit_verify/src/mpc/block.rs:418:21.
@dr-orlovsky could you please investigate this?
Returning a usize makes sense when not all of the data is written. Otherwise, the size of the encoded data can be checked with the planned amplify::io::ByteCounter.
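The planned amplify::io::ByteCounter was not released at the time of writing; a minimal std-only sketch of such a counting writer (the name and API here are assumptions matching the idea, not the actual amplify interface) could look like:

```rust
use std::io::{self, Write};

/// Hypothetical byte-counting writer: discards the data but records how
/// many bytes were written through it, so the size of an encoding can be
/// measured without allocating a buffer.
#[derive(Default)]
pub struct ByteCounter {
    pub count: usize,
}

impl Write for ByteCounter {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        // Pretend the whole buffer was written; only track its length
        self.count += buf.len();
        Ok(buf.len())
    }
    fn flush(&mut self) -> io::Result<()> {
        Ok(())
    }
}

fn main() {
    let mut counter = ByteCounter::default();
    counter.write_all(&[0u8; 16]).unwrap();
    counter.write_all(b"hello").unwrap();
    assert_eq!(counter.count, 21); // 16 + 5 bytes counted, nothing stored
}
```

Any strict-encoding implementation writing into such a sink would get its encoded size for free, without the encoder needing to return usize itself.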
It must be above 75%
This will require a LongVec helper wrapper type, defined in the strict_encoding library and used internally by strict_encode_derive, which will read a special #[strict_encoding(length_bytes = 1|2|3|4|8)] attribute and encode/decode the underlying array by converting it through LongVec.
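The proposed LongVec helper could be sketched roughly as follows (a hypothetical illustration of the idea, not the crate's actual implementation; the encode signature is an assumption):

```rust
use std::io::{self, Write};

/// Hypothetical sketch of the proposed `LongVec` helper: a wrapper that
/// encodes a vector with a configurable length-prefix size (1, 2, 3, 4 or
/// 8 bytes, little-endian) instead of a single default prefix width.
pub struct LongVec(pub Vec<u8>);

impl LongVec {
    pub fn encode(&self, w: &mut impl Write, length_bytes: usize) -> io::Result<()> {
        let len = self.0.len() as u64;
        // Reject vectors whose length does not fit the requested prefix
        let max = if length_bytes >= 8 {
            u64::MAX
        } else {
            (1u64 << (length_bytes * 8)) - 1
        };
        if len > max {
            return Err(io::Error::new(io::ErrorKind::InvalidInput, "vector too long"));
        }
        // Write the little-endian length prefix truncated to `length_bytes`
        w.write_all(&len.to_le_bytes()[..length_bytes.min(8)])?;
        w.write_all(&self.0)
    }
}

fn main() {
    let v = LongVec(vec![0xAA; 5]);
    let mut buf = Vec::new();
    v.encode(&mut buf, 4).unwrap();
    // 4-byte length prefix (5 as LE) + 5 data bytes = 9 bytes total
    assert_eq!(buf, [5, 0, 0, 0, 0xAA, 0xAA, 0xAA, 0xAA, 0xAA]);
}
```

A derive macro reading #[strict_encoding(length_bytes = N)] would then route the field through such a wrapper instead of the default encoding path.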
Downstream dependencies: secp256k1-zkp.
Prerequisites: secp256k1 library.
When fixing some clippy lints on the descriptor wallet, I ran into an issue here:
Docs on the linter rule are here:
https://rust-lang.github.io/rust-clippy/master/index.html#init_numbered_fields
Normally I wouldn't file a feature request just to satisfy a linter, but this might be indicative of some sort of oversight. Feel free to close if we're comfortable with using #![allow(clippy::init_numbered_fields)]
in all projects using strict encoding.
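For context, the lint in question fires on tuple structs initialized with numbered-field syntax (the style that code written for strict encoding newtypes can end up using); a minimal self-contained illustration:

```rust
// Tuple struct, as commonly used for newtype wrappers
struct Wrapper(u32);

fn main() {
    // clippy::init_numbered_fields warns on this form...
    let a = Wrapper { 0: 42 };
    // ...and suggests the positional constructor instead:
    let b = Wrapper(42);
    assert_eq!(a.0, b.0);
}
```

Both forms are valid Rust and compile identically; the lint is purely stylistic, which is why allowing it project-wide is a defensible option.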
I've been building some functionality on the master branch of lnp-node, and I encountered a nasty build error in the dependency strict_encoding 0.9.0.
Something like this:
error[E0432]: unresolved import `bitcoin::XpubIdentifier`
--> /home/joemphilips/.cargo/registry/src/github.com-1ecc6299db9ec823/strict_encoding-0.9.0/src/bitcoin.rs:32:21
|
32 | XOnlyPublicKey, XpubIdentifier,
| ^^^^^^^^^^^^^^
| |
| no `XpubIdentifier` in the root
| help: a similar name exists in the module: `PubIdentifier`
Not sure where this PubIdentifier came from; I could not find the name with git log -p in rust-bitcoin.
Anyway, I tried to debug by forking strict_encoding, which existed in this repository.
But I suppose it has been moved to https://github.com/strict-types/strict-encoding now?
If so, forking this repository seems not a good idea.
Probably I should wait until the strict-encoding in the new repository gets stable and the other lnp-bp-rgb libraries depend on it? How long should I wait for that?
Currently, bitcoin public keys/signatures allow encoding in compressed and uncompressed form, depending on the compression flag value, and do not encode the flag itself. Secp256k1 keys and ECDSA signatures do not allow uncompressed encoding.
It is proposed to prohibit uncompressed keys and make the bitcoin and secp256k1 key/signature implementations identical. If the uncompressed flag is required, it must be presented as a separate field within the structure.
Alternative: use bitcoin-based data types to always encode the compression flag, but serialize even uncompressed keys in compressed form.
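For reference, the two SEC1 public-key encodings can be told apart by length and leading byte (this is standard SEC1 layout, not code from this crate); a std-only sketch:

```rust
/// SEC1 public-key encodings: compressed keys are 33 bytes starting with
/// 0x02 or 0x03 (parity of the y coordinate), uncompressed keys are
/// 65 bytes starting with 0x04.
#[derive(Debug, PartialEq)]
enum KeyEncoding {
    Compressed,
    Uncompressed,
    Invalid,
}

fn classify(bytes: &[u8]) -> KeyEncoding {
    match (bytes.len(), bytes.first().copied()) {
        (33, Some(0x02)) | (33, Some(0x03)) => KeyEncoding::Compressed,
        (65, Some(0x04)) => KeyEncoding::Uncompressed,
        _ => KeyEncoding::Invalid,
    }
}

fn main() {
    let compressed = [0x02u8; 33]; // 33 bytes, prefix 0x02
    let uncompressed = [0x04u8; 65]; // 65 bytes, prefix 0x04
    assert_eq!(classify(&compressed), KeyEncoding::Compressed);
    assert_eq!(classify(&uncompressed), KeyEncoding::Uncompressed);
    assert_eq!(classify(&[0u8; 10]), KeyEncoding::Invalid);
}
```

The proposal above amounts to accepting only the 33-byte form on the encoding layer, with any compression flag moved into an explicit struct field.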
And use them throughout RGB validation
Details: RGB-WG/rgb-node#208
stens fails to compile to wasm32-unknown-unknown due to the dependency amplify_syn.
Hi @dr-orlovsky,
I noticed that the CommitmentId::commitment_id function returns different values depending on the compilation target.
I discovered this while testing rgb-schemata; I even created a test that can help with the bug investigation:
#[cfg(test)]
mod test {
    use super::*;

    #[test]
    fn show_schema_id() {
        let expected = "5YWfKW3CqANHsKqpxy3HaCpt5bgvsMXHUuXiHpoynEYG";
        let schema = nia_schema();
        let id = schema.schema_id().to_string();
        assert_eq!(expected, id); // OK!
    }
}

#[cfg(target_arch = "wasm32")]
mod test_wasm32 {
    use super::*;
    use wasm_bindgen_test::*;

    wasm_bindgen_test_configure!(run_in_browser);

    #[wasm_bindgen_test]
    fn show_schema_id() {
        let expected = "5YWfKW3CqANHsKqpxy3HaCpt5bgvsMXHUuXiHpoynEYG";
        let schema = nia_schema();
        let id = schema.schema_id().to_string();
        assert_eq!(expected, id); // Fail!
        /*
        panicked at 'assertion failed: `(left == right)`
          left: `"5YWfKW3CqANHsKqpxy3HaCpt5bgvsMXHUuXiHpoynEYG"`,
         right: `"12vPR9qthiLpPgoytGBbeerVYFKvdCXEUmbg891HVyHV"`', src/main.rs:143:9
        */
    }
}
Hope this helps.
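One plausible source of such target-dependent commitment ids (an assumption on my part, not a confirmed diagnosis) is serializing a length or index as a native-width usize rather than a fixed-width integer; a std-only illustration of the difference:

```rust
fn main() {
    let len: usize = 5;

    // Native-width encoding: 8 bytes on x86_64 but 4 bytes on wasm32.
    // Data serialized this way hashes to different values per target.
    let native = len.to_le_bytes();
    println!("native-width length encoding: {} bytes", native.len());

    // Fixed-width encoding: always 8 bytes, identical on every target,
    // so any hash over it is target-independent.
    let portable = (len as u64).to_le_bytes();
    assert_eq!(portable.len(), 8);
}
```

If the commitment serialization anywhere depends on the width of usize, the 32-bit wasm target would produce a different byte stream and therefore a different id.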
client_side_validation/src/api.rs, lines 239 to 251 in 5e87205
Currently, MPC trees (LNPBP-4) are limited to a maximum depth of 2^4 = 16, i.e. they may contain up to 2^16 = 65536 elements. The original design assumed that is enough to host all assets which may be allocated to a single UTXO.
However, tests show that even 127 different assets may not UNIQUELY fit into a tree with width = 65536 (the position of an asset is its 256-bit asset id modulo the size of the tree), and practically we can assume only around 64 assets to be assignable to the same UTXO.
Here I propose extending the MPC tree depth to 32 and the tree width to 2^32, such that the tree may host up to ~16000 separate assets (no. of assets < 2^(depth/2 - 2)).
This will require rewriting strict encoding for primitive data types instead of using bitcoin consensus encoding for them.
The most affected parts are the LNPBP-4 code and merklization.
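The collision behaviour described above is the birthday problem: with 127 random ids in 65536 slots the chance of at least one clash is roughly 1 - exp(-127*126 / (2*65536)) ≈ 11.5%. A self-contained simulation of the slot-assignment rule (pos = id mod width; the xorshift PRNG stands in for random 256-bit asset ids, this is not crate code):

```rust
// Simulation of LNPBP-4-style slot assignment: each asset occupies
// position (asset_id mod width) in the tree; we estimate how often
// 127 ids collide in a width-65536 tree.

fn xorshift(state: &mut u64) -> u64 {
    // Standard xorshift64 PRNG; deterministic given the seed
    let mut x = *state;
    x ^= x << 13;
    x ^= x >> 7;
    x ^= x << 17;
    *state = x;
    x
}

fn main() {
    const WIDTH: u64 = 65536;
    const ASSETS: usize = 127;
    const TRIALS: usize = 2000;

    let mut state = 0x1234_5678_9abc_def0u64;
    let mut collided = 0;
    for _ in 0..TRIALS {
        let mut seen = vec![false; WIDTH as usize];
        let mut has_collision = false;
        for _ in 0..ASSETS {
            let pos = (xorshift(&mut state) % WIDTH) as usize;
            if seen[pos] {
                has_collision = true;
            }
            seen[pos] = true;
        }
        if has_collision {
            collided += 1;
        }
    }
    let p = collided as f64 / TRIALS as f64;
    // Expect roughly 0.115 per the birthday bound above
    println!("collision probability ≈ {:.3}", p);
    assert!(p > 0.05 && p < 0.20);
}
```

Widening the tree to 2^32 slots drives the same bound down to a negligible level for thousands of assets, which is the motivation for the proposal.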
The main strict encoding rule was to restrict the size of any collection (string, vector, map) to 2^16 elements (i.e. to have the length encoded by 2 bytes). This happens implicitly, which leads to non-obvious bugs missed in code reviews.
To address the issue it is proposed first to perform #97 and then, additionally, to do the following:
- prohibit confined_encoding for unordered hash collections (HashSet, HashMap);
- remove confined_encoding from all other collection types;
- introduce ConfinedString, ConfinedVec, ConfinedSet, ConfinedMap (backed by the inner types String, Vec, BTreeSet, BTreeMap) and require all consensus code to explicitly use them.

Implement strict encoding for the KeyPair type once there is a way to serialize its inner data in the secp256k1 lib (see rust-bitcoin/rust-secp256k1#298)
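A minimal sketch of what such a confined collection type could look like (the names come from the proposal above, but the bound-checking API is an assumption, not the crate's actual design):

```rust
/// Hypothetical ConfinedVec: a Vec that statically carries a maximum
/// length and refuses to grow beyond it, making the 2^16 collection
/// bound explicit instead of implicit.
#[derive(Debug)]
pub struct ConfinedVec<T, const MAX: usize>(Vec<T>);

impl<T, const MAX: usize> ConfinedVec<T, MAX> {
    pub fn new() -> Self {
        ConfinedVec(Vec::new())
    }

    /// Push succeeds only while the confinement bound is respected.
    pub fn push(&mut self, item: T) -> Result<(), &'static str> {
        if self.0.len() >= MAX {
            return Err("collection exceeds confinement bound");
        }
        self.0.push(item);
        Ok(())
    }

    pub fn len(&self) -> usize {
        self.0.len()
    }
}

fn main() {
    // A vector confined to at most 3 elements (2^16 in the real proposal)
    let mut v: ConfinedVec<u8, 3> = ConfinedVec::new();
    assert!(v.push(1).is_ok());
    assert!(v.push(2).is_ok());
    assert!(v.push(3).is_ok());
    assert!(v.push(4).is_err()); // bound enforced explicitly, at the API level
    assert_eq!(v.len(), 3);
}
```

Because the bound is part of the type, a reviewer sees the limit at every use site instead of having to remember the implicit 2-byte length rule.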
After the v0.10 refactoring some of the tests have to be updated
The original intent of strict_encoding was to provide an encoding standard for client-side-validation. That is why the crate is a part of this repository and used everywhere inside RGB and AluVM.
However, it appeared that the crate has become much more widely used, including in commercial projects far outside the scope of client-side-validation. That happened because in Rust there are no good binary compact encodings guaranteeing determinism in data size or ordering.
Widespread use of strict_encoding creates pressure to add more dependencies and types to it, since Rust's foreign type policies prohibit deriving strict encoding traits on other dependencies downstream. While the new dependencies are optional, this still bloats the codebase and requires permanent releases of new versions.
On the other hand, the client-side-validation requirements for the encoding are quite specific; for instance, we use bitcoin consensus encoding in many places, and such encoding is neither consistent with, nor provides the same approach as, the encoding for other types.
So basically, it should not be advised to use strict encoding for anything other than client-side-validation, or to add more types here. Nevertheless, it is really hard to prevent that, especially taking into account the existing adoption of the crate.
The solution I propose is the following:
- split the library into two crates: strict_encoding and confined_encoding;
- use confined_encoding for client-side-validation consensus code;
- keep strict_encoding for all non-consensus code (including descriptor wallet etc);
- continue extending strict_encoding with new types;
- ossify confined_encoding;
- move encoding_derive_helpers to https://github.com/rust-amplify/

This will restrict the use of the new confined_encoding crate to client-side-validation consensus only. Any non-consensus code (including descriptor wallet etc) MUST NOT use confined_encoding. The confined_encoding crate must be ossified and new types must not be added to it.
This reduces boilerplate in method implementations and avoids Rust compiler crashes due to recursion in generic resolution