
XPIR: Private Information Retrieval for Everyone

XPIR v0.2.0-beta has been released; please get it here.

This version introduces three major changes:

  • Compilation with cmake instead of the classic autotools
  • Dependencies are no longer included (users can install them themselves or use a script to download and install them)
  • An API has been released that allows XPIR to be used as a library (see below)

The original client/server applications are still available, along with the associated optimization tools. These can still be used, for example, to run tests on PIR without developing an application, or to run the optimization process to obtain interesting cryptographic and PIR parameters for a given setting.

If you have compilation/execution issues please contact us. The old version is still available in the branch old-master and of course through previous releases.

Introduction:

XPIR allows a user to privately download an element from a database: the server knows that it has sent a database element to the user but does not know which one. The scientific term for the underlying protocol is Private Information Retrieval (PIR). This library is described and studied in the paper:

Carlos Aguilar-Melchor, Joris Barrier, Laurent Fousse, Marc-Olivier Killijian, "XPIR: Private Information Retrieval for Everyone", Proceedings on Privacy Enhancing Technologies. Volume 2016, Issue 2, Pages 155–174, ISSN (Online) 2299-0984, DOI: 10.1515/popets-2016-0010, December 2015.

If you use our library, or a sub-part of it such as NFLlib, please cite this paper in your work.

This project is closely related to another project available on GitHub: NFLlib, the NTT-based Fast Lattice library, which allows fast cryptographic computations over ideal lattices. Adapting this project to a more recent version of NFLlib would provide a performance boost, but it would be a lot of work, so we are not planning to do it immediately.

Important Note 1: A PIR implementation for a specific application can be much simpler and more compact than this library. Indeed, much of the code of XPIR is aimed at ensuring that the library delivers good performance over a large span of applications without user interaction, so that the user does not need to be an expert in PIR or cryptography to obtain good performance. If you want to use a PIR protocol in a very specific setting, feel free to contact us about building up a collaboration!

Important Note 2: For publication reasons, a small part of the code is missing. From a technical point of view this corresponds to the Gaussian noise generator for LWE, which is replaced by a uniform noise generator until some scientific results are published. Replacing our uniform noise generator with our Gaussian noise generator does not impact performance in an observable way.

Important Note 3: This software cannot provide reliable privacy without more scrutiny of many details. We have tried to provide some resilience to timing attacks, but haven't tested it thoroughly. The random seed generation and pseudorandom generation use strong functions, but we haven't done a thorough analysis of whether a trivial fault is present in those critical sections of the code. No input verification is done by the server or client, so many buffer overflows are potentially possible, etc. As is, the software shows that privacy is possible but cannot guarantee it against strong active adversaries (using timing attacks, introducing malformed entries, etc.) until it gets enough scrutiny. On OSX, setting DYLD_LIBRARY_PATH to point to the correct directory (e.g. $PROJECT_HOME/local/lib) may be necessary.

Installation:

Requirements:

Get a copy of the project with:

On OSX only, execute the following commands (due to AVX optimization issues, using clang-3.6 is mandatory):

sudo port install gcc48
sudo port select gcc mp-gcc48
sudo port install clang-3.6
sudo port select clang mp-clang-3.6

On all systems, execute the following commands:

You need cmake, GMP (version 6), MPFR (version 3.1.2), and some boost modules (atomic, chrono, date_time, exception, program_options, regex, system, thread, all in version 1.55.0) to build XPIR. You can install them yourself on your system (if you know how to do it, this is the fastest option). You can also use the (slower but safer) script helper_script.sh in the root directory of the repository to retrieve the exact versions and compile them in a local directory (namely ./local/). If two versions of a required library exist (one local and one system-wide), the local one takes precedence.

To build, and test XPIR, run the following:

$> mkdir _build
$> cd _build
$> cmake .. -DCMAKE_BUILD_TYPE=Release 
$> make
$> make check

The first test should be pretty long (it builds the initial caches); afterwards each test in the set should display CORRECT or "Skipping test...". If you get INCORRECT tests or core dump notifications, then something went wrong.

On OSX only, if you have an old OSX version and very long paths, you may get this error:

error: install_name_tool: changing install names or rpaths can't be redone for: _build/apps/pir_server 
(for architecture x86_64) because larger updated load commands do not fit

This is solved by moving XPIR to a directory with a shorter path (so that hard paths to libraries can fit in the executable header).

The following CMake options are relevant:

Option                      Description
-DSEND_CATALOG=OFF          Do not send the catalog to the client (default: send the catalog)
-DMULTI_THREAD=OFF          Do not use multi-threading
-DPERF_TIMERS=OFF           Do not show performance measurements during execution
-DCMAKE_BUILD_TYPE=Debug    Add debugging options and remove optimization

Usage of XPIR as a library:

Besides a client and a server that can be used as standalone applications to run PIR protocols, we have (as of version 0.2) created a simple library that gives access to the major functions needed to build a PIR protocol (generate query, generate reply, extract reply, and helper functions). The API is in ./pir/libpir.h (from the build directory), and applications using this API must link dynamically or statically against the libraries libpir.so/libpir_static.a (or libpir.dylib/libpir_static.a on OSX), which can be found in the same directory as libpir.h.

A simple demonstration of how to use this API to build PIR protocols is available on the source tree at apps/simplepir/simple_pir.cpp. It can be run from apps/simplepir in the build tree.

In order to compile a PIR protocol using the API, such as simplepir, one just needs the library (either static or dynamic) and the includes. Compilation can be done with something like: g++ -std=c++11 simplePIR.cpp -I$include_dir -L$lib_dir -lpir_static -lgmp -lmpfr -fopenmp -lboost_thread -lboost_system

Usage of the client/server apps:

XPIR comes with a client and a server that allow anybody to test the library on simple examples. Both must be started on their respective directories. From the build directory, to start the server execute:

$ cd apps/server
$ ./pir_server

And to start the client execute (on a different terminal):

$ cd apps/client
$ ./pir_client

By default the client tries to reach a local server, but an IP address and port can be specified; use --help to get help on the different options for distant connections.

If run without options, the PIR server looks for files in a directory db inside the server directory and treats each file as a database element. The client presents a catalog of the files and asks the user to choose one. When this is done, the client runs an optimizer to decide the best cryptographic and PIR parameters for retrieving the file. It then sends an encrypted PIR query (i.e. a query that the server mixes with the database without learning which element it retrieves) to the server. The server computes an encrypted PIR reply and sends it to the client. Finally, the client decrypts this reply and stores the resulting file in the reception directory inside the client directory.

Available options for the server (pir_server command):

-h,--help
Print a help message with the different options.

-z, --driven
Server-driven mode. This mode is to be used when multiple clients will connect to the server with the same cryptographic and PIR parameters. It allows the server to import the database into RAM and to perform precomputations over the database for the first client, which significantly increases performance for the following clients if LWE-based cryptography is used. The first client asks for a given configuration (depending on its optimizer and on the command-line constraints given to the client). After this configuration client, the server tells the following clients that it is in server-driven mode and that the configuration is imposed. The configuration given by the first client is stored in file arg, or in exp/PIRParams.cfg if arg is not specified, for further usage (see -L option).

-L, --load_file arg
Load cryptographic and PIR parameters from arg file. Currently unavailable (see issues).

-s, --split_file arg (=1)
Only use the first file in the db directory and split it into arg database elements. This makes it possible to store a large database of many fixed-size elements (e.g. bits, bytes, 24-bit depth points) in a single file, which is much more efficient from a file-system point of view than having many small files. Building databases from a single file with more complex approaches (e.g. csv or sqlite files) would be a great feature to add to XPIR.

-p, --port arg (=1234)
Port used by the server to listen to incoming connections, by default 1234.

--db-generator
Generate a fake database with random elements instead of reading one from a directory. This is useful for performance tests: it allows dealing with arbitrary databases without having to build them on the file-system, and evaluating performance costs without disk-access limitations.

-n, --db-generator-files arg (=10)
Number of files for the virtual database provided by the DB generator.

-l [ --db-generator-filesize ] arg (=12800000)
Filesize in bytes for the files in the virtual database provided by the DB generator.

--no-pipeline
No pipeline mode. In this mode the server executes each task separately (getting the PIR Query, computing the reply, sending it). Only useful to measure the performance of each step separately.

Available options for the client (pir_client command):

-h, --help
Display a help message.

-i, --serverip arg (=127.0.0.1)
Define the IP address at which the client will try to contact the pir_server.

-p [ --port ] arg (=1234)
Define the port at which the client will try to contact the pir_server.

-c, --autochoice
Don't display the catalog of database elements and automatically choose the first element without waiting for user input.

--dry-run
Enable dry-run mode. In this mode the client does not send a PIR query. It runs the optimizer, taking into account the command-line options, and outputs the best parameters for each cryptosystem (currently NoCryptography, Paillier and LWE) with details on the costs evaluated for each phase (query generation, query sending, reply generation, reply sending, reply decryption). If a server is available, the client interacts with it to set two parameters: client-to-server throughput and server-to-client throughput. It also requests the server's performance cache to evaluate how fast the server can process the database for each possible set of cryptographic parameters. If no server is available, it uses default performance measures. The other parameters are set for the default example: a thousand mp3 files over ADSL, aggregation disabled and security k=80. Each of these parameters can be overridden on the command line.

--verbose-optim
Ask the optimizer to be more verbose on the intermediate choices and evaluations (as much output as in the dry-run mode).

--dont-write
Don't write the result to a file. For testing purposes; the reply is still processed (the whole answer is decrypted).

-f, --file arg
Use a config file to test different optimizations in dry-run mode (see exp/sample.conf). Must be used with the --dry-run option or it is ignored.

Available options for the optimizer (through pir_client command):

-n, --file-nbr arg
Used in dry-run mode only: Override the default number of database elements.

-l, --file-size arg
Used in dry-run mode only: Override the default database element size (in bits).

-u, --upload arg
Force client upload speed in bits/s (the bandwidth test will be skipped). This is valid in dry-run or normal mode (e.g. if a user does not want to use more than a given amount of their bandwidth).

-d, --download arg
Force client download speed in bits/s (bandwidth test will be skipped). This is valid in dry-run or normal mode.

-r, --crypto-params arg
Limit the cryptographic parameters considered to the subset matching the regular expression arg. Parameters depend on the cryptographic system:

  • NoCryptography if a trivial full database download is to be done after which pir_client stores only the element the user is interested in.
  • Paillier:A:B:C if Paillier's cryptosystem is to be used with A security bits, a plaintext modulus of B bits and a ciphertext modulus of C bits.
  • LWE:A:B:C if LWE is to be used with A security bits, polynomials of degree B and polynomial coefficients of C bits. For example, it is possible to force just the cryptosystem with NoCryptography.* or LWE.*, or to ask for a specific parameter set such as Paillier:80:1024:2048. Specifying the security with this option is tricky, as it must match exactly, so it is better to use -k for that purpose.

-k, --security arg (=80)
Minimum security bits required for a set of cryptographic parameters to be considered by the optimizer.

--dmin arg (=1)
Min dimension value considered by the optimizer. Dimension is also called recursion in the literature. It is done trivially (see the scientific paper), and thus for dimension d the query size is proportional to d n^{1/d} and the reply size is exponential in d. For databases with many small elements, d>1 can give the best results, but d>4 is interesting only in exceptional situations.

--dmax arg (=4)
Max dimension value considered by the optimizer.

-a, --alphaMax arg (=0)
Max aggregation value to test (1 = no aggregation, 0 = no limit). It is sometimes interesting to aggregate a database with many small elements into a database with fewer but larger aggregated elements (e.g. if database elements are one bit long). This value forces the optimizer to respect a maximum value for aggregation, 1 meaning that elements cannot be aggregated.

-x, --fitness arg (=1)
Set the fitness method: 0=SUM (sum of the times on each task), 1=MAX (max of server times + max of client times), 2=CLOUD (dollars in a cloud model, see source code). This sets the objective function of the optimizer. When studying the different parameters, the optimizer chooses the set that minimizes this function. 0 corresponds to minimizing the resources spent, 1 to minimizing the round-trip time (given that server operations are pipelined and client operations are also, independently, pipelined), and 2 to minimizing the cost when CPU cycles and transmitted bits are converted to money using a cloud computing model.

Contributors:

This project was imported to GitHub, once mature enough, by Carlos Aguilar Melchor, which erased all the "blame" information attributing each line of code to its contributor. In order to give a fair idea of the line contributions per author, the following command:

git ls-tree --name-only -z -r HEAD|egrep -z -Z -E '\.(cc|h|cpp|hpp|c|txt)$'  |egrep -z -Z -E -v dependencies|egrep -z -Z -E -v boost|xargs -0 -n1 git blame --line-porcelain|grep "^author "|sort|uniq -c|sort -nr

which counts the lines by contributor for the initial commit put on GitHub (excluding the dependencies directory, which does not correspond to code we developed), gave the following output (aggregating aliases) just before transferring the project to GitHub:

7655 author Carlos Aguilar-Melchor
5693 author Joris Barrier
1153 author Marc-Olivier Killijian

Besides this line counting, the roles were distributed as follows:
Carlos Aguilar-Melchor (Associate Professor): co-supervisor, developer
Joris Barrier (PhD student): developer
Marc-Olivier Killijian (Researcher): co-supervisor, developer

Affiliations:

Carlos Aguilar Melchor was affiliated with the XLIM laboratory in Limoges during almost all of this project; he is currently at the IRIT laboratory in Toulouse. Joris Barrier and Marc-Olivier Killijian are affiliated with the LAAS-CNRS laboratory in Toulouse.

The contributors thank Laurent Fousse for his help on evaluating the performance of NTRU.

xpir's People

Contributors

carlosaguilarmelchor, kirija, mandragorian, nicolasamat


xpir's Issues

PIROptimizer: Error when writing optimization data, aborting.

Hi,

I receive error PIROptimizer: Error when writing optimization data, aborting. from the server when trying to do a query. Any ideas what could cause this?

I'm compiling on Ubuntu 14.04 running in docker. At first I had trouble finishing the check-correctness.sh script but it went through after I granted more virtual memory -- wondering if this could play a part in the error from the optimiser?

Thanks in advance,
Morten

unused functions

Several functions are actually unused.

Functions :
'DEBUG_MESSAGE' in /XPIR-master/crypto/NFLlib.cpp
'allocandcomputeShouppoly' in /XPIR-master/crypto/NFLlib.hpp
'allocpoly' in /XPIR-master/crypto/NFLlib.cpp
'bitsplitter_backtoback_internal_test' in /XPIR-master/crypto/NFLlib.cpp
'blo' in XPIR-master/apps/server/PIRSession.cpp
'file_exists' in /XPIR-master/apps/client/DESC.cpp
'generateReplyExternal' in /XPIR-master/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp
'getAllOptimData' in /XPIR-master/pir/optim/OptimService.cpp
'getChosenElement' in /XPIR-master/pir/queryGen/PIRQueryGenerator_internal.cpp
'getChosenFileSize' in /XPIR-master/pir/replyExtraction/PIRReplyExtraction_internal.cpp
'getCiphertextBytesize' in /XPIR-master/crypto/HomomorphicCrypto.hpp
'getCiphertextSize' in /XPIR-master/crypto/NFLLWEPublicParameters.cpp
'getOneCryptoSystem' in /XPIR-master/crypto/HomomorphicCryptoFactory_internal.cpp
'getParameters' in /XPIR-master/apps/optim/PIROptimizer.cpp
'getbits' in /XPIR-master/crypto/PaillierKeys.cpp
'getsecretKey' in /XPIR-master/crypto/NFLLWE.cpp
'gotoLine' in /XPIR-master/pir/optim/OptimService.cpp
'isFinished' in /XPIR-master/apps/server/PIRSession.cpp
'mulandaddPolyNTTShoup' in /XPIR-master/crypto/NFLlib.hpp
'mulmodPolyNTT' in /XPIR-master/crypto/NFLlib.hpp
'mulmodPolyNTTShoup' in /XPIR-master/crypto/NFLlib.hpp
'setPIRParameters' in /XPIR-master/pir/queryGen/PIRQueryGenerator_internal.cpp
'setType' in /XPIR-master/pir/optim/OptimVars.cpp
'setWrittenSize' in /XPIR-master/pir/events/WriteEvent.cpp
'sighandler' in /XPIR-master/apps/client/main.cpp
'submodPoly' in /XPIR-master/crypto/NFLlib.hpp
'verifyOptimData' in /XPIR-master/pir/optim/OptimService.cpp
'writeLWEFile' in /XPIR-master/pir/optim/OptimService.cpp
'writeOptimData' in /XPIR-master/pir/optim/OptimService.cpp

make error

Scanning dependencies of target db
[ 36%] Generating .db, WORKING_DIR, .
apps/server/CMakeFiles/db.dir/build.make:60: recipe for target 'apps/server/.db' failed
make[2]: *** [apps/server/.db] Error 126
CMakeFiles/Makefile2:383: recipe for target 'apps/server/CMakeFiles/db.dir/all' failed
make[1]: *** [apps/server/CMakeFiles/db.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2

What dependencies do I need to install, the database? Are there any version restrictions?

Ubuntu installing error

Hello, I encountered the following problem when compiling XPIR on Ubuntu 20.04
After running 'sh helper_script.sh' and 'mkdir _build', running 'cmake .. -DCMAKE_BUILD_TYPE=Release' reports an error:

-- The C compiler identification is GNU 9.4.0
-- The CXX compiler identification is GNU 9.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test COMPILER_SUPPORTS_CXX11
-- Performing Test COMPILER_SUPPORTS_CXX11 - Success
-- Building RELEASE version
-- Performing Test COMPILER_SUPPORTS-MARCH=NATIVE
-- Performing Test COMPILER_SUPPORTS-MARCH=NATIVE - Success
-- Performing Test COMPILER_SUPPORTS-MTUNE=NATIVE
-- Performing Test COMPILER_SUPPORTS-MTUNE=NATIVE - Success
-- Performing Test COMPILER_SUPPORTS-FUNROLL-LOOPS
-- Performing Test COMPILER_SUPPORTS-FUNROLL-LOOPS - Success
-- The ASM compiler identification is GNU
-- Found assembler: /usr/bin/cc
CMake Error at /usr/lib/x86_64-linux-gnu/cmake/Boost-1.71.0/BoostConfig.cmake:117 (find_package):
Could not find a package configuration file provided by "boost_exception"
(requested version 1.71.0) with any of the following names:

boost_exceptionConfig.cmake
boost_exception-config.cmake

Add the installation prefix of "boost_exception" to CMAKE_PREFIX_PATH or
set "boost_exception_DIR" to a directory containing one of the above files.
If "boost_exception" provides a separate development package or SDK, be
sure it has been installed.
Call Stack (most recent call first):
/usr/lib/x86_64-linux-gnu/cmake/Boost-1.71.0/BoostConfig.cmake:182 (boost_find_component)
/usr/share/cmake-3.16/Modules/FindBoost.cmake:443 (find_package)
CMakeLists.txt:18 (find_package)

CMake Warning (dev) in /usr/share/cmake-3.16/Modules/FindBoost.cmake:
Policy CMP0011 is not set: Included scripts do automatic cmake_policy PUSH
and POP. Run "cmake --help-policy CMP0011" for policy details. Use the
cmake_policy command to set the policy and suppress this warning.

The included script

/usr/share/cmake-3.16/Modules/FindBoost.cmake

affects policy settings. CMake is implying the NO_POLICY_SCOPE option for
compatibility, so the effects are applied to the including context.
Call Stack (most recent call first):
CMakeLists.txt:18 (find_package)
This warning is for project developers. Use -Wno-dev to suppress it.

-- Configuring incomplete, errors occurred!
See also "/home/ubuntu-1/XPIR-master/_build/CMakeFiles/CMakeOutput.log".

Catalog object

Hi,

I was wondering if you could help me with something. I was adapting apps/server and apps/client to work with my solution (pipeline version), but I am kind of stuck in the client's startProcessResult() function.

In your case you use data from this 'catalog' object both for the reply extractor and the reply writer. How can I do the same thing without this 'catalog'? (The client knows which element to query, so there is no need for the catalog step.)

Thanks,
Joao Sa

problem with LWE

I am using this library, but when the chosen algorithm for PIR is LWE, the protocol runs and displays success; yet when I go inside the reception folder I find a different file, not the one I had chosen.

Error installing XPIR on Mac OSX (El Capitan)

Although XPIR is running on Ubuntu I tried to install this same library (release 0.2.0) on MAC OSX (my work computer).

I tried following all the compiling steps (MacPorts, clang 3.6/gcc 4.8), using brew to install the remaining dependencies (GMP/mpfr/boost), but I always get the same error during make (which I presume is related to boost linkage). I already tried to install boost from source and even tried to statically assign the BOOST_ROOT variable (BOOST_ROOT=/usr/local/Cellar/boost). Any ideas on what else I should do to make this work?

$ cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_C_COMPILER=/opt/local/bin/gcc-mp-4.8 -DCMAKE_CXX_COMPILER=/opt/local/bin/clang-mp-3.6

-- The C compiler identification is GNU 4.8.5
-- The CXX compiler identification is Clang 3.6.2
-- Checking whether C compiler has -isysroot
-- Checking whether C compiler has -isysroot - yes
-- Checking whether C compiler supports OSX deployment target flag
-- Checking whether C compiler supports OSX deployment target flag - yes
-- Check for working C compiler: /opt/local/bin/gcc-mp-4.8
-- Check for working C compiler: /opt/local/bin/gcc-mp-4.8 -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /opt/local/bin/clang-mp-3.6
-- Check for working CXX compiler: /opt/local/bin/clang-mp-3.6 -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Performing Test COMPILER_SUPPORTS_CXX11
-- Performing Test COMPILER_SUPPORTS_CXX11 - Success
-- Building RELEASE version
-- Performing Test COMPILER_SUPPORTS-MARCH=NATIVE
-- Performing Test COMPILER_SUPPORTS-MARCH=NATIVE - Failed
-- Performing Test COMPILER_SUPPORTS-MTUNE=NATIVE
-- Performing Test COMPILER_SUPPORTS-MTUNE=NATIVE - Failed
-- Performing Test COMPILER_SUPPORTS-FUNROLL-LOOPS
-- Performing Test COMPILER_SUPPORTS-FUNROLL-LOOPS - Failed
-- The ASM compiler identification is GNU
-- Found assembler: /opt/local/bin/gcc-mp-4.8
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - found
-- Found Threads: TRUE
-- Boost version: 1.60.0
-- Found the following Boost libraries:
-- atomic
-- chrono
-- date_time
-- exception
-- program_options
-- regex
-- system
-- thread
-- program_options
-- GMP libs: /opt/local/lib/libgmp.dylib /opt/local/lib/libgmpxx.dylib
-- Found GMP: /opt/local/include (Required is at least version "6")
-- mpfr libs: /opt/local/lib/libmpfr.dylib
-- Found MPFR: /opt/local/include (Required is at least version "3.1.2")
-- Send the catalog to the client
-- Use multi-threading
-- Show performance measurements during execution
-- GMP libs: /opt/local/lib/libgmp.dylib /opt/local/lib/libgmpxx.dylib
-- Configuring done
-- Generating done

$ make

Undefined symbols for architecture x86_64:
"boost::program_options::to_internal(std::basic_string<char, std::char_traits, std::allocator > const&)", referenced from:
boost::program_options::basic_parsed_options boost::program_options::parse_command_line(int, char const const, boost::program_options::options_description const&, int, boost::function1<std::pair<std::basic_string<char, std::char_traits, std::allocator >, std::basic_string<char, std::char_traits, std::allocator > >, std::basic_string<char, std::char_traits, std::allocator > const&>) in main.cpp.o
"boost::program_options::options_description::options_description(std::basic_string<char, std::char_traits, std::allocator > const&, unsigned int, unsigned int)", referenced from:
_main in main.cpp.o
(...)
vtable for boost::program_options::typed_value<unsigned int, char> in main.cpp.o
ld: symbol(s) not found for architecture x86_64
collect2: error: ld returned 1 exit status

make[2]: *** [apps/client/pir_client] Error 1
make[1]: *** [apps/client/CMakeFiles/pir_client.dir/all] Error 2

Best regards,
Joao Sa

Have multiple replyarrays for multiple iterations

Modify PIRReplyGeneratorNFL (make replyArray a 2D array, pass the iteration index to replyGeneration)
Modify PIRSession (uploadWorker must send all the arrays)
Restore memory deallocation (uploadWorker)

Recursion returning extra elements

I'm trying to play around with recursion using the simple_pir.cpp example with libPIR. Specifically, I've been modifying Test 3/7. I've changed the database size to be 1<<13, still with 100 files. This means each file is now 82 bytes, which should fit inside the absorption size for LWE. I then set params.d = 2, and set params.n[0] = 10 and params.n[1] = 10.

I expect that the reply generator should return only a single reply of ciphertext size (65536 bytes), but instead, I receive back 6 replies. Why is this? Given that I'm requesting only one file that fits within absorption size, why does using recursion give me these 5 additional blocks?

I verified with the same database size but without recursion (using params.d = 1 and params.n[0] = 100) that this only returns a single reply, but I can't figure out why with recursion it returns multiple blocks.

Am I missing something on how recursion works, or is this a bug?

Changes to the 'one file' db approach

I am currently developing a small demo using the 'multiple files' (in folder) db, but I was wondering if it would be too hard to change the 'one file' approach. In other words, what could I possibly do to enable dividing the file into asymmetric byte chunks (e.g. interpreting different types of files with different entry sizes)?

I know that this may involve big changes in the framework/code, but for now I just wanted to know whether it is feasible or not.

Joao Sa

Problem with the print inside the library

For our project we make use of the standard output to communicate with a Python script. I think it is better to avoid prints inside the library when DEBUG is not defined. In the file PIRReplyExtraction_internal.cpp, the function extractReply() contains printouts.

Troubleshooting - error running simple pir_server, pir_client

Hi,

Sorry to bother you but I was trying to run you XPIR framework but whenever I try to execute the simple client/server application I get this error...

PIROptimizer: Error when writing optimization data, aborting.

Do you know why does this happen?

Best regards,
Joao Sa

Gaussian noise was zeroed in b

The code in NFLLWE::recomputeNoiseAmplifiers() was messed up, resulting in no noise at all in the b part of the ciphertexts.

it did :

tmpz1=0
tmpz2=0
foreach modulus
   tmpz2=2**Abits
   tmpz2=modulus
   Abitmod = tmpz1 % tmpz2
endforeach

I guess the point was to do Abitmod = 2**Abits % modulus, and that "tmpz2=2**Abits" was meant to be "tmpz1=2**Abits".

PR coming with this correction.

Initialize Database from memory

Since having a file for each database element can be intractable for many applications, I believe having a way to initialize the database from memory, for example from an std::vector, would be great.

From a quick look at the code I think that this should not be that difficult, except that the DBHandler interface uses ifstreams, which are file-specific.

Maybe a new interface like the following could be more generic:

virtual bool openStream(uint64_t streamNb, uint64_t requested_offset)=0;
virtual uint64_t readStream(uint64_t streamNb, char * buf, uint64_t size)=0;
virtual void closeStream(uint64_t streamNb)=0;

Database handlers that deal with files would open an ifstream when openStream is called and store them in a private mapping with keys the streamNbs. Calls to readStream would retrieve the opened ifstream from the mapping and otherwise work as they already do. closeStream would be updated similarly.

The new database handler that deals with vectors would keep an offset for each vector, representing how many bytes have already been read from that "stream". Using that information it would simulate reading and writing from a file.
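A minimal sketch of such a vector-backed handler implementing the proposed openStream/readStream/closeStream interface might look like this (class and member names are hypothetical, not the actual DBHandler API):

```cpp
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <map>
#include <utility>
#include <vector>

class VectorDBHandler {
public:
    explicit VectorDBHandler(std::vector<std::vector<char>> elements)
        : elements_(std::move(elements)) {}

    bool openStream(uint64_t streamNb, uint64_t requested_offset) {
        if (streamNb >= elements_.size()) return false;
        offsets_[streamNb] = requested_offset;  // remember the read position
        return true;
    }

    uint64_t readStream(uint64_t streamNb, char* buf, uint64_t size) {
        auto it = offsets_.find(streamNb);
        if (it == offsets_.end()) return 0;  // stream not open
        const std::vector<char>& elem = elements_[streamNb];
        uint64_t available =
            elem.size() > it->second ? elem.size() - it->second : 0;
        uint64_t n = std::min(size, available);
        std::memcpy(buf, elem.data() + it->second, n);
        it->second += n;  // advance the simulated file offset
        return n;
    }

    void closeStream(uint64_t streamNb) { offsets_.erase(streamNb); }

private:
    std::vector<std::vector<char>> elements_;  // one "file" per element
    std::map<uint64_t, uint64_t> offsets_;     // bytes already read per stream
};
```

A file-backed handler would keep the same signatures but store open ifstreams in the map instead of offsets.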

Unread and Unused Variables

Several variables are unused or unread in the code.

Variables with their locations:

  • 'read_size' in /XPIR/apps/optimPIROptimizer.cpp line 1015
  • 'nbFiles' in /XPIR/apps/server/PIRSession.cpp line 48
  • 'start' in /XPIR/apps/server/PIRSession.cpp line 387
  • 'i' in /XPIR/apps/server/PIRSession.cpp line 516
  • 'wasVerbose' in /XPIR/apps/server/PIRSession.cpp line 438
  • 'maxFileBytesize' in /XPIR/apps/simplepir/simplePIR.cpp line 303
  • 'A_bits' in /XPIR/crypto/NFLLWE.cpp line 289
  • 'k_str' in /XPIR/crypto/NFLLWE.cpp line 568
  • 'isnt_unit32PerChunk' in /XPIR/crypto/NFLlib.cpp line 455
  • 'subchunkMasks' in /XPIR/crypto/NFLlib.cpp line 470
  • 'isint_unit64PerChunk' in /XPIR/crypto/NFLlib.cpp line 682
  • 'i' in /XPIR/crypto/PaillierAdapter.cpp line 299
  • 'i' in /XPIR/pir/libpir.cpp line 29
  • 'current_bytesskipped' in /XPIR/PIR/replyExtraction/PIRReplyWritter.cpp line 125
  • 'abs_size' in /XPIR/pir/replyGenerator/PIRReplyGeneratorGMP.cpp line 126
  • 'raw_data' in /XPIR/pir/replyGenerator/PIRReplyGeneratorGMP.cpp line 127
  • 'affichages' in /XPIR/pir/replyGenerator/PIRReplyGeneratorGMP.cpp line 304
  • 'now' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 98
  • 'delta' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 98
  • 'nflptr' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 476
  • 'jumpcipher' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 170
  • 'usable_memory' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 216
  • 'max_memory_per_file' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 206
  • 'database_size' in /XPIR/pir/replyGenerator/PIRReplyGeneratorNFL_internal.cpp line 310

Maybe I forgot some variables...

Other trivial improvements can likely be found as well.

make error with boost/bind/bind.hpp

/usr/include/boost/bind/bind.hpp:319:35: error: no match for call to ‘(boost::_mfi::mf1<void, PIRView, MessageEvent&>) (PIRViewCLI*&, std::_Placeholder<1>&)’
  319 |         unwrapper<F>::unwrap(f, 0)(a[base_type::a1_], a[base_type::a2_]);
      |         ~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
