
libsparkcrypto's Introduction

Build status: https://travis-ci.org/Componolit/libsparkcrypto.svg?branch=componolit

libsparkcrypto - A cryptographic library implemented in SPARK 2014.

libsparkcrypto is a formally verified implementation of several widely used cryptographic algorithms, written in the SPARK 2014 programming language and verified with the SPARK toolset [1]. For the complete library, proofs of the absence of run-time errors such as type range violations, division by zero and numerical overflow are available. Some of its subprograms additionally include proofs of partial correctness.

The distribution contains test cases for all implemented algorithms and a benchmark to compare its performance with the OpenSSL library [2].

Copyright, Warranty and Licensing

Copyright (C) 2011-2017 Stefan Berghofer
Copyright (C) 2010-2011,2018 Alexander Senier
Copyright (C) 2010-2017 secunet Security Networks AG
All rights reserved.

libsparkcrypto is released under the simplified BSD license:

Redistribution  and  use  in  source  and  binary  forms,  with  or  without
modification, are permitted provided that the following conditions are met:

   * Redistributions of source code must retain the above copyright notice,
     this list of conditions and the following disclaimer.

   * Redistributions in binary form must reproduce the above copyright
     notice, this list of conditions and the following disclaimer in the
     documentation and/or other materials provided with the distribution.

   * Neither the name of the  nor the names of its contributors may be used
     to endorse or promote products derived from this software without
     specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE  COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY  EXPRESS OR IMPLIED WARRANTIES,  INCLUDING, BUT NOT LIMITED  TO, THE
IMPLIED WARRANTIES OF  MERCHANTABILITY AND FITNESS FOR  A PARTICULAR PURPOSE
ARE  DISCLAIMED. IN  NO EVENT  SHALL  THE COPYRIGHT  HOLDER OR  CONTRIBUTORS
BE  LIABLE FOR  ANY  DIRECT, INDIRECT,  INCIDENTAL,  SPECIAL, EXEMPLARY,  OR
CONSEQUENTIAL  DAMAGES  (INCLUDING,  BUT  NOT  LIMITED  TO,  PROCUREMENT  OF
SUBSTITUTE GOODS  OR SERVICES; LOSS  OF USE,  DATA, OR PROFITS;  OR BUSINESS
INTERRUPTION)  HOWEVER CAUSED  AND ON  ANY THEORY  OF LIABILITY,  WHETHER IN
CONTRACT,  STRICT LIABILITY,  OR  TORT (INCLUDING  NEGLIGENCE OR  OTHERWISE)
ARISING IN ANY WAY  OUT OF THE USE OF THIS SOFTWARE, EVEN  IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

Features

This version of libsparkcrypto implements the following cryptographic algorithms and modes:

  • AES-128, AES-192, AES-256
  • AES-CBC (for all supported AES key sizes)
  • SHA-1
  • HMAC-SHA1
  • SHA-256, SHA-384, SHA-512
  • HMAC-SHA-256, HMAC-SHA-384, HMAC-SHA-512
  • PRF-HMAC-SHA-256, PRF-HMAC-SHA-384, PRF-HMAC-SHA-512
  • RIPEMD-160
  • HMAC-RIPEMD-160
  • ECDSA, ECGDSA

Development version

The current development version of libsparkcrypto is available through its Git [3] repository: https://github.com/Componolit/libsparkcrypto.git

A browsable version of the repository is also available here: https://github.com/Componolit/libsparkcrypto

Building and installing

Required tools

To build and prove libsparkcrypto, the following tools are required:

  • GNAT (recent Pro or FSF)
  • SPARK 2014 (tested with Pro 19.0 and Community 2018)
  • GNU make
  • OpenSSL (for building the benchmark, tested with 1.1.1j)

The primary development environments of libsparkcrypto are Debian (x86_64) and Ubuntu (x86_64). While the source and project files should be system independent, the Makefiles assume a UNIX-like system (Cygwin seems to work). Tools like mkdir, uname, tail and install must be present in the system's search path.

Build process

To build libsparkcrypto, change to the source directory and type:

$ make

You can install the library to <destination> by typing:

$ make DESTDIR=<destination> install

Supported systems

libsparkcrypto was successfully built and tested on the following systems:

operating system   architecture   toolchain
----------------   ------------   -----------------------------
Debian 9           x86_64         SPARK Pro 19.0, GNAT Pro 19.0
Debian 9           x86_64         GNAT Community 2018

Please send bug reports and comments to Alexander Senier <[email protected]>.

Using libsparkcrypto

Examples for using libsparkcrypto can be found in the tests subdirectory.

A user of the library has to provide a shadow for the package Interfaces, providing type definitions for at least Unsigned_8, Unsigned_32 and Unsigned_64.
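On a runtime where the standard Interfaces package is not available, such a shadow might look roughly as follows. This is a minimal sketch, not the library's prescribed shadow; the sizes must match the target architecture.

```ada
--  Minimal shadow of the standard package Interfaces, providing only
--  the modular types libsparkcrypto requires. Representation sizes
--  must match the target; shown here for a typical 64-bit platform.
package Interfaces is
   pragma Pure;

   type Unsigned_8  is mod 2 ** 8;
   for Unsigned_8'Size use 8;

   type Unsigned_32 is mod 2 ** 32;
   for Unsigned_32'Size use 32;

   type Unsigned_64 is mod 2 ** 64;
   for Unsigned_64'Size use 64;
end Interfaces;
```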

Extending libsparkcrypto

You are welcome to extend libsparkcrypto according to the terms of the simplified BSD license referenced above. Please obey the following rules when contributing changes back to the project:

  • Make sure no undischarged VCs remain.
  • Make sure the code compiles in both modes MODE=release and MODE=debug.
  • Provide reference to documents and test cases for the parts you implemented.
  • Make sure you successfully ran the test suite (make test).
  • Try to stay consistent with the current style of the source.
  • If feasible, implement a benchmark for your code.
  • Create a pull request on GitHub.

The directory structure of libsparkcrypto is as follows:

directory    content
---------    -------------------------------------------------------
src/shared   sources analyzed by SPARK and used by the Ada compiler
src/spark    sources only analyzed by SPARK
src/ada      sources only used by the Ada compiler

The directories src/ada and src/shared have a sub-directory generic, which contains platform independent code. Furthermore, there are feature-specific directories like little_endian and architecture-specific directories like x86_64, which are included in the proof and build steps as configured.

Configuration is performed automatically by the top-level Makefile and can be altered by passing the following variables to make:

variable      description
--------      -----------------------------------------------
ARCH          CPU architecture as reported by uname -m.
MODE          Build mode (release or debug).
OPT           Optimization level (s, 0, 1, 2 or 3).
SHARED        Build a shared library (0 or 1).
RUNTIME       Runtime to build for (native or zfp).
NO_TESTS      Disable the test step.
NO_SPARK      Disable the SPARK proof step.
NO_ISABELLE   Disable the Isabelle proof step.
TARGET_CFG    Target system configuration.
SPARK_DIR     Base directory of the SPARK installation.
DESTDIR       Installation base directory.
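For example, a release build for x86_64 that skips the Isabelle step might be configured like this (the variable values are illustrative, not required settings):

```
$ make ARCH=x86_64 MODE=release OPT=2 NO_ISABELLE=1
$ make DESTDIR=/opt/libsparkcrypto install
```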

Credits

  • Thanks to Adrian-Ken Rüegsegger and Reto Buerki for hosting the project's Git repository.
  • Thanks to AdaCore and Altran Praxis for review, comments and support with many tricky problems.

[1] SPARK 2014 - https://www.adacore.com/about-spark
[2] OpenSSL: The Open Source toolkit for SSL/TLS - http://www.openssl.org
[3] Git - the fast version control system - http://git-scm.com

libsparkcrypto's People

Contributors

kensan, reet-, senier


libsparkcrypto's Issues

Missing pragma Pure

The units LSC.Internal.Convert_Hash and LSC.SHA2_Generic are pure but are not marked as such.

Enable CI

Build, prove and test library in TravisCI.

Introduce "easy" interface

The API of libsparkcrypto is very cumbersome and generally only acceptable to high-security users who are willing to take a lot of pain. Most interfaces use 32- or 64-bit words as their basic data type, which requires a conversion when handling byte-oriented data. To make things worse, endianness must be taken into account all the time.

A new "easy" API is built on top of the existing word-oriented API. All parameters, like keys, cipher text or plain text are represented as byte arrays based on a modular byte type (mod 2**8). Where feasible, API calls are consolidated to reduce the steps necessary to perform a cryptographic operation (e.g. move the operation to create an AES key context into the encrypt call).
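The byte-oriented types described above might be declared along the following lines. This is a sketch with illustrative names; the actual declarations in LSC.Types may differ.

```ada
--  Illustrative sketch of the byte-oriented types used by an "easy"
--  API: a modular byte type (mod 2**8) and an unconstrained array of
--  bytes for keys, plaintext and ciphertext.
package Example_Types is
   pragma Pure;

   type Byte is mod 2 ** 8;
   type Index is range 0 .. 2 ** 32 - 1;
   type Bytes is array (Index range <>) of Byte;
end Example_Types;
```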

To ease testing of the APIs and reuse of standard test vectors, conversion functions from hexadecimal strings to byte arrays and vice versa are added to the test utilities.

The following APIs need to be implemented, including test cases and benchmarks (in order of priority):

  • AES256 CBC encryption
  • AES256 CBC decryption
  • SHA256 hash generation
  • HMAC SHA256 PRF generation
  • SHA512 hash generation
  • SHA384 hash generation
  • HMAC SHA512 authenticator generation
  • HMAC SHA384 authenticator generation
  • HMAC SHA256 authenticator generation
  • AES192 CBC encryption
  • AES192 CBC decryption
  • AES128 CBC encryption
  • AES128 CBC decryption
  • AES256 encryption
  • AES256 decryption
  • AES128 encryption
  • AES128 decryption
  • AES192 encryption
  • AES192 decryption
  • HMAC SHA512 PRF generation
  • HMAC SHA384 PRF generation
  • SHA1 hash generation
  • HMAC SHA1 authenticator generation
  • RIPEMD160 hash generation
  • HMAC RIPEMD160 authenticator generation

Not sure whether explicit padding is needed very often now that arbitrary byte arrays are allowed in all the crypto functions. In any case, this would be a function, and thus the data would be put on the stack.

  • Message padding

Easy interfaces for bignum and EC

The easy interface was introduced for symmetric algorithms in #11. This ticket is for the remaining big number and asymmetric algorithms:

  • Big numbers
  • EC signature generation
  • EC signature verification
  • ECDH key agreement

Fix RSA tests

RSA tests fail and are disabled right now. Likely an issue with the binding to OpenSSL used for the tests.

Fix test suite

Tests currently fail with:

raised CONSTRAINT_ERROR : a-reatim.adb:55 overflow check failed
Makefile:92: recipe for target 'tests' failed
make: *** [tests] Error 1

Tested on Debian 9/x86_64 with GNAT Pro 19.0

Prove absence of run-time errors

A couple of VCs remain unproved (or are only proved with Isabelle). The goal is to prove the absence of run-time errors using gnatprove and leave functional properties to Isabelle.

  • src/shared/generic/lsc-types.ads
  • src/shared/generic/lsc.ads
  • src/shared/generic/lsc-aes.ads
  • src/shared/generic/lsc-aes-cbc.ads
  • src/shared/generic/lsc-sha2.ads
  • src/shared/generic/lsc-sha2-hmac.ads
  • src/shared/generic/lsc-sha1.ads
  • src/shared/generic/lsc-sha1-hmac.ads
  • src/shared/generic/lsc-ripemd160.ads
  • src/shared/generic/lsc-ripemd160-hmac.ads
  • src/shared/generic/lsc-internal-io.ads
  • src/shared/generic/lsc-internal.ads
  • src/shared/generic/lsc-internal-types.ads
  • src/shared/generic/lsc-internal-aes.ads
  • src/shared/generic/lsc-internal-aes-cbc.ads
  • src/shared/generic/lsc-internal-aes-print.ads
  • src/shared/generic/lsc-internal-aes-tables.ads
  • src/shared/generic/lsc-internal-byteorder32.ads
  • src/shared/generic/lsc-internal-byteorder64.ads
  • src/shared/generic/lsc-internal-byteswap32.ads
  • src/shared/generic/lsc-internal-byteswap64.ads
  • src/shared/generic/lsc-internal-convert.ads
  • src/shared/generic/lsc-internal-convert_hash.ads
  • src/shared/generic/lsc-internal-convert_hmac.ads
  • src/shared/generic/lsc-internal-hmac_ripemd160.ads
  • src/shared/generic/lsc-internal-hmac_sha1.ads
  • src/shared/generic/lsc-internal-hmac_sha256.ads
  • src/shared/generic/lsc-internal-hmac_sha384.ads
  • src/shared/generic/lsc-internal-hmac_sha512.ads
  • src/shared/generic/lsc-internal-sha256.ads
  • src/shared/generic/lsc-internal-sha256-tables.ads
  • src/shared/generic/lsc-internal-sha512.ads
  • src/shared/generic/lsc-internal-sha512-tables.ads
  • src/shared/generic/lsc-internal-ops32.ads
  • src/shared/generic/lsc-internal-ops64.ads
  • src/shared/generic/lsc-internal-pad32.ads
  • src/shared/generic/lsc-internal-pad64.ads
  • src/shared/generic/lsc-internal-ripemd160.ads
  • src/shared/generic/lsc-internal-ripemd160-print.ads
  • src/shared/generic/lsc-internal-sha1.ads
  • src/shared/generic/lsc-internal-math_int.ads
  • src/shared/generic/lsc-internal-ec.ads
  • src/shared/generic/lsc-internal-ec_signature.ads
  • src/shared/generic/lsc-internal-bignum.ads

Generic interface for byte buffer

Currently the easy API uses Types.Bytes for all byte buffers. This is inconvenient if a different type is used for buffers in the program using LSC. Operations should therefore take generic formal types for their byte buffers. To increase security, different parameters (e.g. ciphertext/plaintext) should use different generic types.

The following APIs need to be adapted:

  • AES.Dec_Key => Key_Type
  • AES.Enc_Key => Key_Type
  • AES.Decrypt => Ciphertext_Type
  • AES.Encrypt => Plaintext_Type
  • AES.CBC.Decrypt => Ciphertext_Type, Plaintext_Type, IV_Type
  • AES.CBC.Encrypt => Ciphertext_Type, Plaintext_Type, IV_Type
  • RIPEMD160 => Message_Type, Hash_Type
  • RIPEMD160.HMAC => Key_Type, Message_Type, Hash_Type
  • SHA1 => Message_Type, Hash_Type
  • SHA1.HMAC => Key_Type, Message_Type, Hash_Type
  • SHA2 => Message_Type, Hash_Type
  • SHA2.HMAC => Key_Type, Message_Type, Hash_Type
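Making the buffer types generic formal parameters might look roughly like this. The signature is hypothetical and only illustrates the idea; it is not the actual LSC API.

```ada
--  Hypothetical sketch: a hash operation whose message and hash buffer
--  types are generic formal array types, so a program can instantiate
--  it with its own byte-array types. Distinct formal types for
--  distinct parameters prevent accidentally mixing up buffers.
generic
   type Byte is mod <>;
   type Message_Index is range <>;
   type Message_Type is array (Message_Index range <>) of Byte;
   type Hash_Index is range <>;
   type Hash_Type is array (Hash_Index range <>) of Byte;
function Generic_Hash (Message : Message_Type) return Hash_Type;
```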

Reimplement AES CBC for easy interface

Bad hacks (with overlaying 'Address clauses) are needed to bind the internal AES CBC API to the easy interface. Instead, we should reimplement CBC in terms of the new API.

Port tests to AUnit

Migrate the tests from SPARKUnit to AUnit to get rid of the maintenance overhead.

Make library pure

The library should not have any internal state - add pragma Pure to all packages.

Separate key generation for AES

Change interface of AES such that key setup and encryption/hashing are separate steps. The rationale is that many encryption operations may be performed with the same key. In this case, the setup cost on each encryption can be saved.
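With separate key setup, a caller could amortize the key schedule over many encryptions. The snippet below is a hypothetical usage sketch with invented names (Enc_Context, Enc_Key, Encrypt), not the library's current API.

```ada
--  Hypothetical usage: the key schedule is computed once and then
--  reused for every encryption performed with the same key.
declare
   Context : AES.Enc_Context := AES.Enc_Key (Key);  --  one-time setup
begin
   Cipher_1 := AES.Encrypt (Context, Plain_1);
   Cipher_2 := AES.Encrypt (Context, Plain_2);  --  no key setup cost here
end;
```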
