
Add SynthID (watermarking by Google DeepMind) #34350

Merged
merged 52 commits into main from synthid on Oct 23, 2024
Conversation

@gante (Member) commented on Oct 23, 2024

What does this PR do?

Adds SynthID, a watermarking technique by Google DeepMind.

https://deepmind.google/technologies/synthid/

Applying the watermark and running a detector are added to transformers. Training a detector is added as a research project.
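For reviewers who want to try it, a minimal usage sketch of watermarked generation with the configuration added here; the checkpoint name, key values, and generation arguments are placeholders and may differ from the final merged API:

from transformers import AutoModelForCausalLM, AutoTokenizer, SynthIDTextWatermarkingConfig

# Placeholder checkpoint; any causal LM served by transformers should work the same way.
tokenizer = AutoTokenizer.from_pretrained("google/gemma-2-2b", padding_side="left")
model = AutoModelForCausalLM.from_pretrained("google/gemma-2-2b")

# The keys below are arbitrary example values, not real watermarking keys.
watermarking_config = SynthIDTextWatermarkingConfig(
    keys=[654, 400, 836, 123, 340, 443, 597, 160, 57, 29],
    ngram_len=5,
)

inputs = tokenizer(["Write a short poem about the sea."], return_tensors="pt")
outputs = model.generate(
    **inputs,
    watermarking_config=watermarking_config,
    do_sample=True,
    max_new_tokens=50,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])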

@gante changed the title from Synthid to Add SynthID (watermarking by Google DeepMind) on Oct 23, 2024
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@gante marked this pull request as ready for review on October 23, 2024 16:18
@ArthurZucker (Collaborator) left a comment

First pass! Looks really good!

return all_masks, all_g_values


def tpr_at_fpr(detector, detector_inputs, w_true, minibatch_size, target_fpr=0.01) -> torch.Tensor:
Collaborator

I have no idea what TPR and FPR mean; let's either be explicit, or have a small docstring.

Member Author

Expanded the docstring 👍
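For context, TPR is the true positive rate and FPR the false positive rate; a rough, hypothetical sketch of the idea only (not the PR's implementation, names are illustrative):

import torch

def tpr_at_fpr_sketch(scores: torch.Tensor, w_true: torch.Tensor, target_fpr: float = 0.01) -> torch.Tensor:
    # scores: detector scores, higher means "more likely watermarked".
    # w_true: ground-truth labels, 1 for watermarked text, 0 for unwatermarked.
    # Choose the detection threshold so that at most `target_fpr` of the
    # unwatermarked examples score above it, then report the fraction of
    # watermarked examples detected at that threshold.
    negative_scores = scores[w_true == 0]
    threshold = torch.quantile(negative_scores, 1.0 - target_fpr)
    return (scores[w_true == 1] > threshold).float().mean()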

examples/research_projects/synthid_text/utils.py (outdated, resolved)
src/transformers/generation/logits_process.py (outdated, resolved)
Comment on lines +300 to +301
self.beta = torch.nn.Parameter(-2.5 + 0.001 * torch.randn(1, 1, watermarking_depth))
self.delta = torch.nn.Parameter(0.001 * torch.randn(1, 1, self.watermarking_depth, watermarking_depth))
Collaborator

These are usually initialized in the _init_weights function rather than here!

Collaborator

Here we init with zeros or empty.

Member Author

It's not common, but we do have this pattern in other places (e.g.).

(I also have no idea how to set this specific initialization in _init_weights 😅)
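For reference, a rough sketch of what moving this into _init_weights could look like; the attribute check is illustrative (real transformers code would typically use an isinstance check on the relevant submodule class), and the PR keeps the initialization in __init__:

def _init_weights(self, module):
    # Hypothetical sketch: reproduce the init from __init__ (beta: mean -2.5, std 0.001;
    # delta: mean 0.0, std 0.001) for the watermarked-likelihood submodule.
    if hasattr(module, "beta") and hasattr(module, "delta"):
        module.beta.data.normal_(mean=-2.5, std=0.001)
        module.delta.data.normal_(mean=0.0, std=0.001)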


# [batch_size, seq_len, watermarking_depth]
# Long tensor doesn't work with einsum, so we need to switch to the same dtype as self.delta (FP32)
logits = torch.einsum("ijkl,ijkl->ijk", self.delta, x.type(self.delta.dtype)) + self.beta
Collaborator

It would be a lot better if we can avoid einsums! 🤗

(i, j, k, l) x (i, j, k, l) -> (i, j, k)

would be:

(i, j, k, 1, l) x (i, j, k, l, 1) -> (i, j, k, 1)

so:

self.delta[..., None, :] @ x.transpose(-2, -1)[..., None]

Member Author

Good idea!

(the correct form is then (self.delta[..., None, :] @ x[..., None]).squeeze())
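A quick numerical check of that rewrite; the shapes are placeholders, and delta is given the full batch shape here only to keep the einsum comparison simple:

import torch

batch_size, seq_len, depth = 2, 7, 4  # arbitrary example shapes
delta = torch.randn(batch_size, seq_len, depth, depth)
x = torch.randn(batch_size, seq_len, depth, depth)

# Original formulation: elementwise product summed over the last axis.
einsum_out = torch.einsum("ijkl,ijkl->ijk", delta, x)

# Matmul formulation: (..., k, 1, l) @ (..., k, l, 1) -> (..., k, 1, 1), then drop the trailing 1s.
matmul_out = (delta[..., None, :] @ x[..., None]).squeeze(-1).squeeze(-1)

assert torch.allclose(einsum_out, matmul_out, atol=1e-5)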

@ArthurZucker (Collaborator) left a comment

Thanks for updating! 🤗

@gante merged commit b0f0c61 into main on Oct 23, 2024
26 checks passed
@gante deleted the synthid branch on October 23, 2024 20:18