Code for the paper: "Head Start: Digit Extraction in TFHE from MSB to LSB"
The code is implemented on top of the tfhe-rs library by Zama and can be applied as a patch to the open-source library.
- Clone the tfhe-rs library:
git clone https://github.com/zama-ai/tfhe-rs.git ; cd tfhe-rs
- Switch to the correct version (V1.1.0):
git checkout 2cd16ac70af19308e7a4578083b4e2e3730964ca
- Copy the patch file and apply the patch to the repo:
git apply ../HeadStart.patch
- Clone the lattice estimator by Albrecht et al.:
git clone https://github.com/malb/lattice-estimator.git
- Switch to the correct version:
cd lattice-estimator ; git checkout 14a362513c9197dd959bc72428425abe0309779a ; cd ..
- Run the parameter estimation script using sage:
sage -python parameterTest.py
- Change the fixed parameters N and k (polynomial size and GLWE dimension) within the find_parameters function
- Change the optimization requirements (security level, error probability, use of mean compensation) in the main function
- security_cache.pkl caches all previously generated security estimates to speed up later runs. Remove this file and rerun parameterTest.py to recompute all lattice security estimates. A minimal sketch of this configuration and caching pattern is given after this list.
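The following sketch illustrates, under stated assumptions, how the pieces above typically fit together: fixed N and k, a target security requirement, and a pickle-based security_cache.pkl wrapped around a call to the lattice estimator. Only the names parameterTest.py, find_parameters, and security_cache.pkl come from this repository; the helper names, parameter values, and overall structure are hypothetical, and the real script may differ. Run it with sage -python so that the estimator package is importable.

```python
# Illustrative sketch only -- not the actual parameterTest.py from the patch.
# Assumed: the lattice-estimator repository is on the Python path and the
# script is executed with `sage -python`.
import math
import os
import pickle

from estimator import LWE, ND  # malb/lattice-estimator

CACHE_FILE = "security_cache.pkl"  # cache file name used by the repository's script


def load_cache():
    """Load previously computed security estimates, if any."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "rb") as f:
            return pickle.load(f)
    return {}


def save_cache(cache):
    """Persist the security estimates so later runs can skip the estimator."""
    with open(CACHE_FILE, "wb") as f:
        pickle.dump(cache, f)


def estimate_security(n, q, noise_stddev, cache):
    """Estimated security in bits of an LWE instance, served from the cache when possible."""
    key = (n, q, noise_stddev)
    if key not in cache:
        params = LWE.Parameters(
            n=n,
            q=q,
            Xs=ND.UniformMod(2),                   # binary secret key
            Xe=ND.DiscreteGaussian(noise_stddev),  # Gaussian error
        )
        # Rough estimate over the main attacks; LWE.estimate() is slower but tighter.
        costs = LWE.estimate.rough(params)
        best_rop = min(float(cost["rop"]) for cost in costs.values())
        cache[key] = math.log(best_rop, 2)
        save_cache(cache)
    return cache[key]


if __name__ == "__main__":
    # Hypothetical fixed parameters and requirements; in the real script N and k
    # are fixed inside find_parameters, and the requirements are set in main.
    N, k = 2048, 1                 # polynomial size and GLWE dimension
    target_security = 128          # bits
    cache = load_cache()
    bits = estimate_security(n=N * k, q=2**64, noise_stddev=2.0**17, cache=cache)
    print(f"GLWE(N={N}, k={k}), LWE dimension {N * k}: ~{bits:.1f}-bit security "
          f"(target: {target_security})")
```

If security_cache.pkl already contains an entry for the queried parameters, the estimator is not invoked at all, which is what makes repeated optimization runs fast.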
- Run make test_vector_sum to test the summation application. It also prints the number of bootstraps required to perform the operation.
- Run make bench_integer to get the DirtyMSB technique latency results for the summation application using 8 threads.
The patch file also contains specific code from:
- SMOOTHIE: (Multi-)Scalar Multiplication Optimizations On TFHE
- Don't be mean: Reducing Approximation Noise in TFHE through Mean Compensation