
Conversation

@Jefffrey (Contributor)

Which issue does this PR close?

Part of #12725

Rationale for this change

Prefer to avoid user_defined signatures, for consistency across function definitions.

What changes are included in this PR?

Refactor the signatures of crc32 and sha1 to use the coercion API instead of being user-defined.

Also add support for FixedSizeBinary inputs to these functions.

Are these changes tested?

Existing tests

Are there any user-facing changes?

No.

@github-actions github-actions bot added sqllogictest SQL Logic Tests (.slt) spark labels Nov 13, 2025
@Jefffrey Jefffrey marked this pull request as ready for review November 13, 2025 02:58
Ok(spark_crc32_impl(input.iter()))
}
dt => {
internal_err!("Unsupported data type for crc32: {dt}")
Contributor

I would suggest still keeping exec_err

Contributor Author

I switched them to internal_err as during normal execution these errors should never occur; the signature should guard against them for us, so only if something went wrong with the signature/coercion logic elsewhere in the engine would we run into these errors (hence considered an internal error)
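The distinction argued here can be sketched with a small self-contained Rust example. The DataType and DfError enums and the crc32_kernel function below are illustrative stand-ins, not DataFusion's actual types: the point is only that once the signature/coercion machinery narrows inputs to the supported types, the fallback match arm is unreachable in a correct engine, so it reports an internal error (engine bug) rather than an execution error (user mistake).

```rust
// Illustrative stand-in for Arrow's DataType; not the real arrow-rs enum.
#[derive(Debug, PartialEq)]
enum DataType {
    Binary,
    BinaryView,
    Int32,
}

// Illustrative stand-in for DataFusion's error kinds.
#[allow(dead_code)]
#[derive(Debug, PartialEq)]
enum DfError {
    // User-visible failure during execution (what exec_err! produces).
    Execution(String),
    // Invariant violation: planning/coercion should have prevented this
    // (what internal_err! produces).
    Internal(String),
}

fn crc32_kernel(input_type: &DataType) -> Result<&'static str, DfError> {
    match input_type {
        // The coercible signature casts every accepted input to these types,
        // so these are the only arms reachable during normal execution.
        DataType::Binary | DataType::BinaryView => Ok("compute crc32 over binary input"),
        // Reaching here means coercion elsewhere in the engine went wrong.
        dt => Err(DfError::Internal(format!(
            "Unsupported data type for crc32: {dt:?}"
        ))),
    }
}

fn main() {
    assert!(crc32_kernel(&DataType::Binary).is_ok());
    assert!(matches!(
        crc32_kernel(&DataType::Int32),
        Err(DfError::Internal(_))
    ));
    println!("ok");
}
```

The design choice mirrors the comment above: an error the user can cause belongs in Execution; an error only a buggy engine can cause belongs in Internal.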

Contributor

Right, so the idea of internal errors is to highlight something really unexpected happening, like a guard flagging an internal engine bug.

Will it trigger if the user calls crc32 with an invalid param, like:

select crc32(arrow_cast(null, 'Dictionary(Int32, Utf8)'))

Contributor Author

This is an interesting case; it actually surfaced a bug in arrow-rs.

So on main this would fail as such:

1. query failed: DataFusion error: Error during planning: Execution error: Function 'crc32' user-defined coercion failed with "Execution error: `crc32` function does not support type Dictionary(Int32, Utf8)" No function matches the given name and argument types 'crc32(Dictionary(Int32, Utf8))'. You might need to add explicit type casts.
        Candidate functions:
        crc32(UserDefined)
[SQL] select crc32(arrow_cast(null, 'Dictionary(Int32, Utf8)'))
at /Users/jeffrey/Code/datafusion/datafusion/sqllogictest/test_files/spark/hash/crc32.slt:78

On this PR it instead fails as such:

1. query failed: DataFusion error: Optimizer rule 'simplify_expressions' failed
caused by
Arrow error: Compute error: Internal Error: Cannot cast BinaryView to BinaryArray of expected type
[SQL] select crc32(arrow_cast(null, 'Dictionary(Int32, Utf8)'))
at /Users/jeffrey/Code/datafusion/datafusion/sqllogictest/test_files/spark/hash/crc32.slt:84

The error originates from here: https://github.com/apache/arrow-rs/blob/2bc269c3eec23f6794fcd793b641ea4c08325d54/arrow-cast/src/cast/dictionary.rs#L107-L125

So our type coercion logic tries to cast the dictionary to a binary view (which I believe is valid), but arrow-rs has a bug which prevents the cast happening. I'll raise an issue on arrow-rs and will add this test case in this PR so we can track when the fix comes in to DataFusion.

Contributor Author

Added test case: 0353951

Raised arrow-rs issue: apache/arrow-rs#8841
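For context on what the kernel computes once a supported type does arrive: Spark's crc32 returns the standard CRC-32 checksum (reflected, polynomial 0xEDB88320, as in zlib) of the binary input, widened to a signed 64-bit integer. A minimal bitwise sketch, assuming that behavior (spark_crc32 here is an illustrative name, not the PR's spark_crc32_impl, which operates on Arrow arrays):

```rust
// Bitwise CRC-32 (ISO-HDLC variant: init 0xFFFFFFFF, reflected polynomial
// 0xEDB88320, final XOR 0xFFFFFFFF), returned as i64 the way Spark's
// crc32 reports it.
fn spark_crc32(data: &[u8]) -> i64 {
    let mut crc: u32 = 0xFFFF_FFFF;
    for &byte in data {
        crc ^= byte as u32;
        for _ in 0..8 {
            if crc & 1 != 0 {
                crc = (crc >> 1) ^ 0xEDB8_8320;
            } else {
                crc >>= 1;
            }
        }
    }
    // u32 -> i64 widening is a zero extension, so the result is non-negative.
    (!crc) as i64
}

fn main() {
    // 0xCBF43926 is the well-known CRC-32 check value for "123456789".
    println!("{:#010X}", spark_crc32(b"123456789") as u32);
}
```

Production kernels use a table-driven (or hardware-accelerated) CRC rather than this bit-at-a-time loop; the bitwise form is just the easiest to verify against the standard check value.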

Ok(spark_sha1_impl(input.iter()))
}
dt => {
internal_err!("Unsupported data type for sha1: {dt}")
Contributor

Suggested change
internal_err!("Unsupported data type for sha1: {dt}")
exec_err!("Unsupported data type for sha1: {dt}")

Ok(spark_crc32_impl(input.iter()))
}
dt => {
internal_err!("Unsupported data type for crc32: {dt}")
Contributor

Suggested change
internal_err!("Unsupported data type for crc32: {dt}")
exec_err!("Unsupported data type for crc32: {dt}")

@comphead (Contributor) left a comment

Thanks @Jefffrey it is good to go

@Jefffrey Jefffrey added this pull request to the merge queue Nov 15, 2025
Merged via the queue into apache:main with commit c39334f Nov 15, 2025
74 of 76 checks passed
@Jefffrey Jefffrey deleted the refactor-crc32-sha1-sig branch November 15, 2025 02:23
jizezhang pushed a commit to jizezhang/datafusion that referenced this pull request Nov 15, 2025
logan-keede pushed a commit to logan-keede/datafusion that referenced this pull request Nov 23, 2025