fix: check for duplicates in keyword validate #14585

Draft - wants to merge 1 commit into base: main

33 changes: 32 additions & 1 deletion lib/elixir/lib/keyword.ex
@@ -281,7 +281,15 @@ defmodule Keyword do
end

defp validate([], values1, values2, acc, []) do
{:ok, move_pairs!(values1, move_pairs!(values2, acc))}
list = move_pairs!(values1, move_pairs!(values2, acc))

{has_duplicate, duplicate_key} = find_duplicate_keys(list)

if has_duplicate do
{:error, [duplicate_key]}
else
{:ok, list}
end
end
Comment on lines 283 to 293 (Contributor Author):
@josevalim I've put a benchmark in the PR description which I think is encouraging.

If it makes sense to move forward with this solution, I'll add doctests to validate/2 and validate!/2.
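
Purely as an illustration (these doctests are not part of this PR), the new check might show up along these lines, assuming the duplicate key comes from the allowed-values list leaking into the result:

    iex> Keyword.validate([], one: 1, one: 2)
    {:error, [:one]}

    iex> Keyword.validate([one: 1], [:one, :two])
    {:ok, [one: 1]}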

Contributor Author:
I'm re-running the benchmarks because I realized I was testing two very similar cases due to alphabetical ordering.

Contributor Author:
With the updated results it's clear that checking for duplicates has a performance impact.
I did get bitten by this in production, so in the worst case we could perhaps introduce validate/3 with an opt-in flag for this check.
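
A rough sketch of what such an opt-in variant could look like, purely as an illustration: the :check_duplicates option name and the wrapper module below are made up and are not part of this PR.

defmodule KeywordValidateSketch do
  # Hypothetical wrapper: behaves like Keyword.validate/2 unless the caller
  # opts into the duplicate check via a third argument.
  def validate(keyword, values, opts \\ []) do
    with {:ok, list} <- Keyword.validate(keyword, values) do
      if opts[:check_duplicates], do: check_duplicates(list), else: {:ok, list}
    end
  end

  # Returns {:error, keys} when any key appears more than once in the
  # validated result, and {:ok, list} otherwise.
  defp check_duplicates(list) do
    duplicates =
      list
      |> Enum.frequencies_by(fn {key, _value} -> key end)
      |> Enum.filter(fn {_key, count} -> count > 1 end)
      |> Enum.map(fn {key, _count} -> key end)

    case duplicates do
      [] -> {:ok, list}
      keys -> {:error, keys}
    end
  end
end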

Review comment:
I wonder if a more optimal version is possible by merging the functionality of move_pairs! and find_duplicate_keys into one pass.
It would probably be more complex, but maybe a bit more efficient?

Contributor Author:
I thought about passing a map of key => frequency through move_pairs!, initialized from the keys already in acc.

Then, at the end of move_pairs!, traverse that map and return the keys with a count greater than 1.
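
Roughly, that frequency-map idea might look like the sketch below. This is only an illustration, not the PR's code: the extra counts argument and the {:ok, _} / {:error, _} return are assumptions, the clause that raises on malformed entries is omitted, and counts would be seeded with something like Enum.frequencies_by(acc, fn {key, _} -> key end).

# Bare allowed keys carry no default, so they add nothing to the result.
defp move_pairs!([key | rest], acc, counts) when is_atom(key),
  do: move_pairs!(rest, acc, counts)

# A default pair is moved into the result and its key is counted.
defp move_pairs!([{key, _} = pair | rest], acc, counts) when is_atom(key),
  do: move_pairs!(rest, [pair | acc], Map.update(counts, key, 1, &(&1 + 1)))

# At the end, any key seen more than once is reported as a duplicate.
defp move_pairs!([], acc, counts) do
  duplicates = for {key, count} <- counts, count > 1, do: key

  case duplicates do
    [] -> {:ok, acc}
    keys -> {:error, keys}
  end
end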

Contributor Author:
@josevalim @tcoopman I updated the benchmark with a compiled module (also attached in the PR body).

I didn't have much luck getting results as good as the ones without the validation. In absolute time the difference is small, but the ratio is quite significant.
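
The actual benchmark script is attached to the PR body and not reproduced here. Purely as an illustration of the setup, such a comparison is typically written with Benchee along these lines (the option names and the KeywordWithCheck module standing in for this branch are made up):

opts = [timeout: 5_000, retries: 3, log_level: :info]
allowed = [:timeout, :retries, :log_level, {:pool_size, 10}]

Benchee.run(%{
  "validate on main (no duplicate check)" => fn -> Keyword.validate(opts, allowed) end,
  "validate on this branch (duplicate check)" => fn -> KeywordWithCheck.validate(opts, allowed) end
})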


defp validate([], _values1, _values2, _acc, bad_keys) do
@@ -312,6 +320,29 @@
"expected the second argument to be a list of atoms or tuples, got: #{inspect(other)}"
end

defp find_duplicate_keys([]) do
{false, nil}
end

defp find_duplicate_keys(l) do
[{k, _} | t] = List.keysort(l, 0)
# on a sorted list, we'll only have to check consecutive elements
# for duplicates
find_duplicate_keys(t, k)
end

defp find_duplicate_keys([], _k) do
{false, nil}
end

defp find_duplicate_keys([{k, _} | _t], k) do
{true, k}
end

defp find_duplicate_keys([{h, _} | t], _k) do
find_duplicate_keys(t, h)
end

@doc """
Similar to `validate/2` but returns the keyword or raises an error.
