
milagro_bls's Introduction

BLS12-381 Aggregate Signatures in Rust using Apache Milagro


WARNING: This library is a work in progress and has not been audited. Do NOT consider the cryptography safe!

Uses The Apache Milagro Cryptographic Library.

This crate is heavily based upon work by @lovesh.

Presently this library only supports the features required for Ethereum 2.0 signature validation. The aggregation methods here are vulnerable to the rogue-key attack.

There has been no public audit or scrutiny of this crate. If you're a cryptographer, your input would be very welcome.

This library uses a Proof of Possession (PoP) variant as protection against rogue key attacks. A public key can be PoP verified by signing a hash of the public key. This must be done before a PublicKey may be used in any aggregate signatures.
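As a rough illustration of the idea (using only the generic signing API shown below; the crate may expose dedicated PoP functions, so treat the helper names here as hypothetical), a PoP can be produced by signing the serialized public key and checked against that same key:

use milagro_bls::{PublicKey, SecretKey, Signature};

// Sketch only: a proof of possession is a signature over (a hash of) the
// public key itself. A real implementation would use a separate domain/DST.
fn create_pop(sk: &SecretKey, pk: &PublicKey) -> Signature {
    Signature::new(&pk.as_bytes(), sk)
}

fn verify_pop(pop: &Signature, pk: &PublicKey) -> bool {
    pop.verify(&pk.as_bytes(), pk)
}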

Subgroup checks are performed for signatures during verification and public keys during deserialisation.

BLS Standard

The current implementation of the BLS standard aligns with bls-signatures-v04 and hash-to-curve-v09.

Usage

Single Signatures

Perform signing and verification of non-aggregate BLS signatures. Supports serializing and de-serializing both public and secret keys.

use milagro_bls::{PublicKey, SecretKey, Signature};

let sk_bytes = vec![
	78, 252, 122, 126, 32, 0, 75, 89, 252, 31, 42, 130, 254, 88, 6, 90, 138, 202, 135, 194,
	233, 117, 181, 75, 96, 238, 79, 100, 237, 59, 140, 111,
];

// Load some keys from a serialized secret key.
let sk = SecretKey::from_bytes(&sk_bytes).unwrap();
let pk = PublicKey::from_secret_key(&sk);

// Sign a message
let message = "cats".as_bytes();
let signature = Signature::new(&message, &sk);
assert!(signature.verify(&message, &pk));

// Serialize then de-serialize, just 'cause we can.
let pk_bytes = pk.as_bytes();
let pk = PublicKey::from_bytes(&pk_bytes).unwrap();

// Verify the message
assert!(signature.verify(&message, &pk));

Generate new "random" secret keys (see SecretKey docs for information on entropy sources).

// Generate a random key pair.
let sk = SecretKey::random(&mut rand::thread_rng());
let pk = PublicKey::from_secret_key(&sk);

// Sign and verify a message.
let message = "cats".as_bytes();
let signature = Signature::new(&message, &sk);
assert!(signature.verify(&message, &pk));

Aggregate Signatures

Aggregate signatures and public keys. Supports serializing and de-serializing both AggregateSignatures and AggregatePublicKeys.

use milagro_bls::{AggregatePublicKey, AggregateSignature, Keypair, PublicKey, SecretKey, Signature};

let signing_secret_key_bytes = vec![
		vec![
				98, 161, 50, 32, 254, 87, 16, 25, 167, 79, 192, 116, 176, 74, 164, 217, 40, 57,
				179, 15, 19, 21, 240, 100, 70, 127, 111, 170, 129, 137, 42, 53,
		],
		vec![
				53, 72, 211, 104, 184, 68, 142, 208, 115, 22, 156, 97, 28, 216, 228, 102, 4, 218,
				116, 226, 166, 131, 67, 7, 40, 55, 157, 167, 157, 127, 143, 13,
		],
];

// Load the key pairs from our serialized secret keys,
let signing_keypairs: Vec<Keypair> = signing_secret_key_bytes
		.iter()
		.map(|bytes| {
				let sk = SecretKey::from_bytes(&bytes).unwrap();
				let pk = PublicKey::from_secret_key(&sk);
				Keypair { sk, pk }
		})
		.collect();

let message = "cats".as_bytes();

// Create an aggregate signature over some message, also generating an
// aggregate public key at the same time.
let mut agg_sig = AggregateSignature::new();
let mut public_keys = vec![];
for keypair in &signing_keypairs {
		let sig = Signature::new(&message, &keypair.sk);
		agg_sig.add(&sig);
		public_keys.push(keypair.pk.clone());
}
let agg_pub_key = AggregatePublicKey::into_aggregate(&public_keys).unwrap();

// Serialize and de-serialize the aggregates, just 'cause we can.
let agg_sig_bytes = agg_sig.as_bytes();
let agg_sig = AggregateSignature::from_bytes(&agg_sig_bytes).unwrap();

// Verify the AggregateSignature against the AggregatePublicKey
assert!(agg_sig.fast_aggregate_verify_pre_aggregated(&message, &agg_pub_key));

How to Run Benchmarks

cargo bench --features "bench"

milagro_bls's People

Contributors

agemanning, g-r-a-n-t, kirk-baird, lovesh, michaelsproul, paulhauner, prestonvanloon, ratankaliani, shekohex, sorpaas


milagro_bls's Issues

Should we validate the pubkeys parameters of `fast_aggregate_verify`?

Hi Kirk,

Currently, fast_aggregate_verify returns true if the aggregate_public_key is the identity element, because we don't run key_validate checks on the input public keys.

The IETF spec defines the behavior of FastAggregateVerify when its preconditions are met, but whether to check those preconditions is left to implementors.

Here are some options:

  • In Eth2's test cases, the validation is expected. It might be safer if all the client teams implement the same behavior.
  • The nim-blscurve library implements two versions: one with the check and one without.
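One low-cost caller-side guard (a sketch that relies only on the standard compressed encoding, not on any crate internals) is to reject the identity public key before calling fast_aggregate_verify; in the compressed form used here, the point at infinity serialises to 0xc0 followed by 47 zero bytes:

use milagro_bls::PublicKey;

// Sketch: detect the identity (infinity) public key from its compressed bytes.
// Assumes the standard 48-byte compressed encoding, where infinity is encoded
// as 0xc0 followed by zeros.
fn is_infinity_pubkey(pk: &PublicKey) -> bool {
    let bytes = pk.as_bytes();
    bytes[0] == 0xc0 && bytes[1..].iter().all(|b| *b == 0)
}

Rejecting such keys (or running a full key_validate) before aggregation would make fast_aggregate_verify return false in the scenario above.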

Test case01 fails

Problem
This test fails because the (x, y, z) form of the point in the test vector does not match the (x, y, z) form produced by this library. Both translate to the same (x, y), so they represent the same point, but it would be good to know why the intermediate states differ.

Note: a likely cause is affine() being called at some point.
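For context, and assuming the usual projective representation (the exact coordinate system amcl uses internally may differ): a projective point (X : Y : Z) corresponds to the affine point (X / Z, Y / Z), so (X, Y, Z) and (λX, λY, λZ) are different internal states for the same point for any non-zero λ. Calling affine() normalises to Z = 1, which is why the intermediate forms can disagree while the final (x, y) values match.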

Location
The test is located in src/amcl_utils.

Function Tested
The function under test is hash_on_g2, which takes a message and a domain and generates a G2 point.

Decompress a PublicKey 500% performance regression in v1.4.0

What's wrong

Decompressing a PublicKey is significantly slower in v1.4.0. Cropped benchmark result:

compression/Decompress a PublicKey                                                                            
                        time:   [364.46 us 372.29 us 383.01 us]
                        change: [+497.89% +518.88% +547.26%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 11 outliers among 100 measurements (11.00%)
  7 (7.00%) high mild
  4 (4.00%) high severe

Steps to reproduce

git checkout v1.3.0 
git submodule update
rm -rf target/
cargo bench --features "bench"
git checkout v1.4.0
git submodule update 
cargo bench --features "bench"

Logs:

v1.4.0

Running target/release/deps/bls381_benches-21ec5f2d00d66323

signing/Create a Signature                                                                            
                        time:   [2.2649 ms 2.4643 ms 2.8161 ms]
                        change: [-4.1061% +2.3568% +11.914%] (p = 0.72 > 0.05)
                        No change in performance detected.
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high severe

signing/Verify a Signature                                                                           
                        time:   [7.6921 ms 8.3690 ms 9.3884 ms]
                        change: [-6.8269% +2.0599% +13.337%] (p = 0.70 > 0.05)
                        No change in performance detected.

Benchmarking multiple-signatures-verification-30/Verification of multiple aggregate signatures with optimizations: Collecting 10 samples in estimated 8.6845 s                                                                                                                                                                multiple-signatures-verification-30/Verification of multiple aggregate signatures with optimizations                        
                        time:   [81.784 ms 89.208 ms 96.381 ms]
                        change: [+3.7598% +9.7247% +18.196%] (p = 0.01 < 0.05)
                        Performance has regressed.
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high mild

aggregation/Verifying aggregate of 128 signatures                                                                            
                        time:   [7.6322 ms 7.7766 ms 7.9435 ms]
                        change: [+2.2044% +4.3055% +6.3692%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 6 outliers among 100 measurements (6.00%)
  2 (2.00%) high mild
  4 (4.00%) high severe

aggregation/Aggregate a PublicKey                                                                             
                        time:   [2.1013 us 2.1592 us 2.2340 us]
                        change: [+54.977% +78.314% +105.88%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 11 outliers among 100 measurements (11.00%)
  4 (4.00%) high mild
  7 (7.00%) high severe

aggregation/Aggregate a Signature                                                                             
                        time:   [5.1918 us 5.3296 us 5.4961 us]
                        change: [+7.5565% +11.184% +14.766%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 12 outliers among 100 measurements (12.00%)
  2 (2.00%) high mild
  10 (10.00%) high severe

compression/Decompress a Signature                                                                            
                        time:   [198.73 us 200.60 us 202.77 us]
                        change: [+6.7434% +9.5261% +13.081%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 6 outliers among 100 measurements (6.00%)
  2 (2.00%) high mild
  4 (4.00%) high severe

compression/Compress a Signature                                                                            
                        time:   [1.4246 us 1.4627 us 1.5011 us]
                        change: [-5.3093% +1.8072% +7.8935%] (p = 0.65 > 0.05)
                        No change in performance detected.

compression/Decompress a PublicKey                                                                            
                        time:   [364.46 us 372.29 us 383.01 us]
                        change: [+497.89% +518.88% +547.26%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 11 outliers among 100 measurements (11.00%)
  7 (7.00%) high mild
  4 (4.00%) high severe

compression/Compress a PublicKey                                                                            
                        time:   [1.4288 us 1.4490 us 1.4937 us]
                        change: [-10.385% -0.2387% +10.038%] (p = 0.97 > 0.05)
                        No change in performance detected.
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high severe

compression/Decompress a PublicKey from Bigs                                                                             
                        time:   [1.7494 us 1.7913 us 1.8458 us]
                        change: [+6.5483% +8.9457% +11.375%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 8 outliers among 100 measurements (8.00%)
  3 (3.00%) high mild
  5 (5.00%) high severe

compression/Compress a PublicKey to Bigs                                                                            
                        time:   [1.0970 us 1.1233 us 1.1606 us]
                        change: [+5.9204% +10.128% +16.469%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high severe

key generation/Generate random keypair                                                                            
                        time:   [389.87 us 400.86 us 414.20 us]
                        change: [+8.9707% +11.088% +13.485%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 12 outliers among 100 measurements (12.00%)
  2 (2.00%) high mild
  10 (10.00%) high severe

key generation/Generate keypair from known string                                                                            
                        time:   [419.51 us 431.32 us 445.95 us]
                        change: [+4.7468% +12.559% +19.381%] (p = 0.00 < 0.05)
                        Performance has regressed.
Found 10 outliers among 100 measurements (10.00%)
  3 (3.00%) high mild
  7 (7.00%) high severe

v1.3.0

Running target/release/deps/bls381_benches-fe64a3c500672cce
signing/Create a Signature                                                                            
                        time:   [2.2934 ms 2.3310 ms 2.3758 ms]
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high mild

signing/Verify a Signature                                                                           
                        time:   [7.5973 ms 7.8653 ms 8.3068 ms]

Benchmarking multiple-signatures-verification-30/Verification of multiple aggregate signatures with optimizations: Collecting 10 samples in estimated 8.5392 s                                                                                                                                                                multiple-signatures-verification-30/Verification of multiple aggregate signatures with optimizations                        
                        time:   [75.792 ms 77.780 ms 80.390 ms]
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high severe

aggregation/Verifying aggregate of 128 signatures                                                                            
                        time:   [7.4041 ms 7.4556 ms 7.5082 ms]
Found 1 outliers among 100 measurements (1.00%)
  1 (1.00%) high mild

aggregation/Aggregate a PublicKey                                                                             
                        time:   [1.6946 us 1.7270 us 1.7770 us]
Found 6 outliers among 100 measurements (6.00%)
  3 (3.00%) high mild
  3 (3.00%) high severe

aggregation/Aggregate a Signature                                                                             
                        time:   [4.9031 us 5.0449 us 5.2287 us]
Found 4 outliers among 100 measurements (4.00%)
  1 (1.00%) high mild
  3 (3.00%) high severe

compression/Decompress a Signature                                                                            
                        time:   [189.00 us 190.42 us 192.16 us]
Found 6 outliers among 100 measurements (6.00%)
  4 (4.00%) high mild
  2 (2.00%) high severe

compression/Compress a Signature                                                                            
                        time:   [1.3475 us 1.4022 us 1.4885 us]
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high severe

compression/Decompress a PublicKey                                                                            
                        time:   [62.944 us 63.413 us 63.958 us]
Found 2 outliers among 100 measurements (2.00%)
  2 (2.00%) high mild

compression/Compress a PublicKey                                                                            
                        time:   [1.4373 us 1.5601 us 1.7437 us]
Found 1 outliers among 10 measurements (10.00%)
  1 (10.00%) high mild

compression/Decompress a PublicKey from Bigs                                                                             
                        time:   [1.6095 us 1.6231 us 1.6474 us]
Found 10 outliers among 100 measurements (10.00%)
  3 (3.00%) high mild
  7 (7.00%) high severe

compression/Compress a PublicKey to Bigs                                                                            
                        time:   [1.0481 us 1.0510 us 1.0547 us]

key generation/Generate random keypair                                                                            
                        time:   [351.54 us 352.96 us 354.64 us]
Found 3 outliers among 100 measurements (3.00%)
  3 (3.00%) high mild

key generation/Generate keypair from known string                                                                            
                        time:   [389.64 us 410.85 us 440.89 us]
Found 7 outliers among 100 measurements (7.00%)
  2 (2.00%) high mild
  5 (5.00%) high severe

PublicKey::from_bytes runs extremely slowly

In my testing, it takes ~6ms to deserialize a single public key with this method. This is suspiciously slow. I suspect something strange is going on here.

#46 alludes to some performance issues there, but those benchmarks are an order of magnitude faster than mine. I am running an M1 MacBook, so I don't suspect it is my machine.

Interestingly, using from_bytes_unchecked is faster at around 1 ms, but that still falls well short of previously reported benchmarks.
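For a rough reproduction outside the criterion harness (a sketch using only the public API shown in the README, not the project's benchmark code), the call can be timed directly:

use std::time::Instant;
use milagro_bls::{PublicKey, SecretKey};

// Time PublicKey::from_bytes on a valid compressed key.
let sk = SecretKey::random(&mut rand::thread_rng());
let pk_bytes = PublicKey::from_secret_key(&sk).as_bytes();
let start = Instant::now();
for _ in 0..100 {
    let _ = PublicKey::from_bytes(&pk_bytes).unwrap();
}
println!("average from_bytes: {:?}", start.elapsed() / 100);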

Empty `AggregatePublicKeys`

What is the issue

According to the standard, AggregateVerify(sig, msgs, pks) should return false if pks is an empty array.

However, in our case we would convert pks to the identity element, which would verify successfully if sig is also the identity.

Recommendation

  • Add an is_empty boolean to AggregatePublicKey and return false when verifying it.
  • In verify_multiple_aggregate_signatures(), return false if any aggregate_public_key.is_empty().
  • In aggregate_verify(), return false if public_keys.len() == 0.
  • In fast_aggregate_verify(), return false if public_keys.len() == 0.
  • In fast_aggregate_verify_pre_aggregated(), return false if aggregate_public_key.is_empty().
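A minimal sketch of the emptiness guard described above (the surrounding function is illustrative; in the crate this check would sit at the top of each of the verification functions listed):

use milagro_bls::PublicKey;

// Sketch: every aggregate verification path should bail out before doing any
// curve arithmetic when it is given no public keys.
fn check_public_keys_non_empty(public_keys: &[PublicKey]) -> bool {
    if public_keys.is_empty() {
        return false;
    }
    // ... continue with the existing aggregation / pairing checks ...
    true
}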

Make travis build all targets

As discovered in #10, Travis CI still passes when the benchmarks do not compile.

We should try adding a cargo build --all-targets script to .travis.yml.

Optimise subgroup checks

Issue

Subgroup checks are somewhat time consuming. They can be optimised by methods described in this paper.

See functions:

  • subgroup_check_g1()
  • subgroup_check_g2()

Update

Subgroup checks can be improved by using the pair::g1mul() and pair::g2mul() functions in amcl; these will improve multiplication times significantly.
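A sketch of the basic (unoptimised) check using those functions follows; the amcl module paths and exact signatures are written from memory and should be treated as assumptions rather than verified API:

use amcl::bls381::big::Big;
use amcl::bls381::ecp::ECP;
use amcl::bls381::pair;
use amcl::bls381::rom;

// Sketch: a G1 point P is in the prime-order subgroup iff r * P is infinity,
// where r = CURVE_ORDER. The optimisations in the referenced paper replace this
// full scalar multiplication with cheaper endomorphism-based checks.
fn subgroup_check_g1(point: &ECP) -> bool {
    let r = Big::new_ints(&rom::CURVE_ORDER);
    pair::g1mul(point, &r).is_infinity()
}

The G2 version has the same shape, using ECP2 and pair::g2mul().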

Update Benchmarks

Issue

The benchmarks are based on old code and will no longer run.
They need to be updated to reflect recent changes to function names and function parameters.

Found in: benches/bls381_benches.rs

Public Keys subgroup check

Issue

During the proof-of-possession stage, signatures need their subgroup checked.

Once a public key has been checked, it does not need to be checked again on every signature verification.

Steps to resolve

Create a subgroup check function and add it to PopVerify.

A simple check would be pk * r == 0, but this is time consuming, so a faster operation would be preferable.

Signatures subgroup check

What is the issue

Signatures may be malleable if we don't check that the signature is in the correct subgroup.

Steps to resolve

As defined in the BLS standard, check r * Sig == 0 before checking the pairings for Signatures and AggregateSignatures.

There are a few more changes to the BLS standards that also need to be implemented.

Secret Key Size / Value

The secret key should be in the range [1, r - 1] as defined here, where r is slightly less than the maximum 32-byte value.
This check should be added to from_bytes().
Furthermore, we should make SecretKey.x private to prevent the secret key from being modified to a value larger than r.

This could also be a good time to change SecretKey::from_bytes() to accept inputs of fewer than 48 bytes. I would recommend allowing any input shorter than 48 bytes and padding it with the necessary number of zero bytes so that the length becomes 48 bytes and it can be deserialised using the Big::frombytes() function.
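A sketch of the proposed padding (assuming big-endian deserialisation, so the zero bytes go on the left to preserve the numeric value; the helper name is hypothetical):

// Sketch: left-pad inputs shorter than 48 bytes with zeros so the padded value
// can be passed to Big::frombytes() unchanged; reject anything longer.
fn pad_secret_key_bytes(input: &[u8]) -> Option<[u8; 48]> {
    if input.len() > 48 {
        return None;
    }
    let mut padded = [0u8; 48];
    padded[48 - input.len()..].copy_from_slice(input);
    Some(padded)
}

The range check 1 <= sk <= r - 1 would then be applied after deserialising the padded bytes.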

Update to BLS Standard v02

Proposed Changes

There has been an update to the hash-to-field / hash-to-base functions. These changes will very likely be introduced into Ethereum soon and should be reflected in this library.

Additional Information

See PR cfrg/draft-irtf-cfrg-hash-to-curve#212 for more details.

This is an expansion of #16

Edit

BLS standard version 2 can be found here

`AggregatePublicKey` should never allow adding infinity `PublicKey` and Subgroup Checks

What is the issue

It is currently possible to have public keys which are the point at infinity via:

  • new_from_raw()
  • from_bytes()
  • from_uncompressed_bytes()

If added to an AggregatePublicKey, these public keys will skip the check that ensures PublicKeys are not infinity. Note that public keys should only be aggregated if they have passed proof of possession, which is not possible for PublicKey = inf; however, it seems worthwhile to handle these issues for cases where there is a bug in the calling library.

Solutions

A) This can be prevented by running key_validate() during deserialisation rather than at verification, at the cost of extra deserialisation time but less verification time. This is beneficial when a public key is used more than once; there is no difference if a public key is used once, and it is slower if a public key is deserialised but never used. It guarantees that all PublicKeys and AggregatePublicKeys have either passed key_validate() or been generated from a SecretKey, which is guaranteed to be in the range [1, r-1].

B) Alternatively, change AggregatePublicKey::from_public_key(), AggregatePublicKey::add() and AggregatePublicKey::aggregate() to return an error if any public key is infinity.

I think option A) is the cleaner and more efficient solution, following the principle of failing as early as possible.
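A sketch of option A (the wrapper below is illustrative, not the crate's API; the subgroup check itself is assumed to already happen inside from_bytes, as described earlier in the README):

use milagro_bls::PublicKey;

// Sketch: refuse to construct a PublicKey that is the point at infinity.
// Assumes the standard compressed encoding (infinity = 0xc0 followed by zeros).
fn public_key_from_bytes_validated(bytes: &[u8]) -> Option<PublicKey> {
    let pk = PublicKey::from_bytes(bytes).ok()?;
    let encoded = pk.as_bytes();
    if encoded[0] == 0xc0 && encoded[1..].iter().all(|b| *b == 0) {
        return None;
    }
    Some(pk)
}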

Remove `G1Point` and `G2Point`

Proposed Changes

These wrappers no longer seem to be adding any functionality and so are unnecessary.

I vote to remove these wrappers to clean up the library, unless there is a good reason to keep them.

Additional Comments

Check that these are not used directly in Lighthouse and, if they are, remove those usages.

Implement Drop for private keys

We should implement Drop for the SecretKey struct and ensure its memory is zeroed out.

This is a security precaution to avoid leaving secrets hanging around in memory.
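A sketch of the idea on a stand-in type (the real SecretKey wraps a big-integer, so its Drop would zero the limbs instead, or simply use the zeroize crate):

use core::ptr;

// Illustration: zero the key material when the value is dropped. The volatile
// writes stop the compiler from optimising the zeroing away.
struct KeyMaterial([u8; 32]);

impl Drop for KeyMaterial {
    fn drop(&mut self) {
        for byte in self.0.iter_mut() {
            unsafe { ptr::write_volatile(byte, 0) };
        }
    }
}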

Optimise Curve multiplications

What is the issue

Optimised elliptic curve multiplications exist in pair::g2mul() and pair::g1mul(), which will significantly reduce curve multiplication times.

The optimisations use curve endomorphisms rather than the simple double-and-add algorithm.

ToDo

These can be used in signing, public key generation and hash to curve (see #26).

SecretKey Key Generation

What is the issue

SecretKey generation should be updated to remove potential bias in the keying material.

See the KeyGen function in the BLS standard.
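For reference, KeyGen in the standard is HKDF-based. Below is a sketch of the extract/expand portion using the hkdf and sha2 crates (the crate choice is an assumption, and the final OS2IP(OKM) mod r reduction is left as a comment because it needs the library's big-integer type):

use hkdf::Hkdf;
use sha2::Sha256;

// Sketch of KeyGen from the BLS standard (draft v4):
//   PRK = HKDF-Extract("BLS-SIG-KEYGEN-SALT-", IKM || I2OSP(0, 1))
//   OKM = HKDF-Expand(PRK, key_info || I2OSP(L, 2), L)   with L = 48
//   SK  = OS2IP(OKM) mod r                                (retry if SK == 0)
fn keygen_okm(ikm: &[u8]) -> [u8; 48] {
    assert!(ikm.len() >= 32, "IKM must be at least 32 bytes");
    let salt: &[u8] = b"BLS-SIG-KEYGEN-SALT-";
    let mut ikm_zero = ikm.to_vec();
    ikm_zero.push(0u8); // IKM || I2OSP(0, 1)
    let (_prk, hk) = Hkdf::<Sha256>::extract(Some(salt), &ikm_zero);
    let info = [0u8, 48]; // key_info (empty) || I2OSP(48, 2)
    let mut okm = [0u8; 48];
    hk.expand(&info, &mut okm)
        .expect("48 bytes is a valid output length for HKDF-SHA256");
    // Reduce OKM mod r with the library's big-integer type to obtain the key.
    okm
}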

no-std not compiling

I am trying to use this crate with the Substrate framework, which requires no-std support. Judging from Cargo.toml, this crate appears to support no-std environments. However, when I try to compile the crate I get the following compiler errors:

error[E0433]: failed to resolve: could not find `prelude` in `alloc`
  --> src/lib.rs:15:20
   |
15 |     pub use alloc::prelude::v1::*;
   |                    ^^^^^^^ could not find `prelude` in `alloc`

error[E0433]: failed to resolve: use of undeclared type `Vec`
  --> src/keys.rs:58:27
   |
58 |             let mut prk = Vec::<u8>::with_capacity(1 + ikm.len());
   |                           ^^^ not found in this scope
   |
help: consider importing this struct
   |
5  | use alloc::vec::Vec;
   |

I will try to fix these in a forked version, but just wanted to get your input in case I might be missing something.

Is the crate only supported for nightly builds?

error[E0554]: `#![feature]` may not be used on the stable release channel
 --> src/lib.rs:4:5
  |
4 |     feature(alloc),
  |     ^^^^^^^^^^^^^^

error[E0554]: `#![feature]` may not be used on the stable release channel
 --> src/lib.rs:5:5
  |
5 |     feature(alloc_prelude),
  |     ^^^^^^^^^^^^^^^^^^^^^^

error[E0554]: `#![feature]` may not be used on the stable release channel
 --> src/lib.rs:6:5
  |
6 |     feature(prelude_import)
  |     ^^^^^^^^^^^^^^^^^^^^^^^
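For reference, alloc::prelude was a nightly-only unstable module (hence the #![feature] attributes), and it is not available on the toolchains above. One possible direction for a fix, sketched here rather than offered as a tested patch, is to drop the feature gates and the prelude re-export and import the required alloc items explicitly, as the compiler hint already suggests:

// src/lib.rs (sketch): no #![feature(...)] attributes required.
extern crate alloc;

// Re-export only what the crate actually uses from `alloc`; the exact list is
// an assumption, but Vec follows directly from the compiler's suggestion.
pub use alloc::borrow::ToOwned;
pub use alloc::string::String;
pub use alloc::vec::Vec;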
