Support for fully homomorphic encryption on training, finetuning, and inference #584

Open
@sirus20x6

Description

Training: Enabling homomorphic encryption would allow data that is licensed for training to be distributed to clients without leaking the training data itself. Beyond licensing, what data you choose to train on may be a trade secret.

Inference: Homomorphic encryption for inference would mean people could use something like Petals without the risk of leaking sensitive information about their work to others.

Finetuning: Being able to fine-tune on your data without leaking it back to participating clients.
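To make the core idea concrete: homomorphic schemes let a server compute on ciphertexts without ever seeing the plaintexts. Below is a minimal toy sketch of the Paillier cryptosystem (additively homomorphic only, not full FHE like the CKKS/TFHE schemes Concrete ML and TenSEAL use), with deliberately tiny, insecure parameters chosen purely for illustration:

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic encryption.
# ILLUSTRATION ONLY -- these primes are far too small for security,
# and real FHE schemes (CKKS, TFHE) also support multiplication,
# which is what deep-model inference on ciphertexts requires.

def keygen(p=1789, q=1997):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # Enc(m) = g^m * r^n mod n^2; r randomizes the ciphertext
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return (x - 1) // n * mu % n

pub, priv = keygen()
a, b = encrypt(pub, 12), encrypt(pub, 30)
# Homomorphic addition: multiplying ciphertexts adds the plaintexts.
total = a * b % (pub[0] ** 2)
# Homomorphic scalar multiply: exponentiation by a public scalar.
scaled = pow(a, 3, pub[0] ** 2)
print(decrypt(pub, priv, total))   # 42
print(decrypt(pub, priv, scaled))  # 36
```

Addition plus plaintext-scalar multiplication is already enough for encrypted linear layers; the cited resources cover the leveled/bootstrapped schemes needed for nonlinearities and deep models.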

Resources:

Hugging Face - Towards Encrypted Large Language Models with FHE

arXiv - Enabling Homomorphically Encrypted Inference for Large DNN Models

GitHub - Concrete ML: an open-source set of privacy-preserving machine learning (PPML) tools built on top of Concrete by Zama. It aims to simplify the use of fully homomorphic encryption (FHE) for data scientists, helping them automatically turn machine learning models into their homomorphic equivalents.

NVIDIA - Federated Learning with Homomorphic Encryption

GitHub - TenSEAL: a library for doing homomorphic encryption operations on tensors

Metadata


Labels

    enhancement (New feature or request), help wanted (Extra attention is needed)
