Address IntelliJ inspection findings #10583
Conversation
Force-pushed from 5aef101 to 2ab97a5
@@ -410,7 +410,7 @@ private void checkAndAddPartitionName(String name, Integer sourceColumnId) {
      }
    }
    Preconditions.checkArgument(
-       name != null && !name.isEmpty(), "Cannot use empty or null partition name: %s", name);
+       !name.isEmpty(), "Cannot use empty or null partition name: %s", name);
We can remove the "or null" text from the message as well.
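A minimal sketch of what the combined fix could look like (the shortened message wording is an assumption, and Iceberg's relocated guava Preconditions is assumed here, not taken from the PR):

import org.apache.iceberg.relocated.com.google.common.base.Preconditions;

// With the null check gone, the message should stop mentioning null as well:
Preconditions.checkArgument(
    !name.isEmpty(), "Cannot use empty partition name: %s", name);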
+1 on updating the error msg here
this still needs to be addressed
@@ -220,16 +220,15 @@ private void writeDataWithFailOnPartition(
    final int numPartitions = 10;
    final int partitionToFail = new Random().nextInt(numPartitions);
    MapPartitionsFunction<Row, Row> failOnFirstPartitionFunc =
-       (MapPartitionsFunction<Row, Row>)
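For context, a hedged sketch of why IntelliJ flags that cast as redundant: the assignment target already fixes the lambda's functional interface type. The identity body below is illustrative only; the real test body is elided in the diff.

import java.util.Iterator;
import org.apache.spark.api.java.function.MapPartitionsFunction;
import org.apache.spark.sql.Row;

// The declared type on the left already tells the compiler which functional
// interface the lambda implements, so a leading (MapPartitionsFunction<Row, Row>)
// cast adds nothing and can be dropped.
MapPartitionsFunction<Row, Row> failOnFirstPartitionFunc =
    (Iterator<Row> input) -> input;  // identity body, for illustration only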
We can fix this for the other Spark version folders as well.
Similar comment applies to all Spark and Flink module related changes. Only the default modules are handled currently. A follow-up PR would also be fine.
@@ -76,19 +76,15 @@ public static ByteBuffer longToOrderedBytes(long val, ByteBuffer reuse) {
    * ByteBuffer)}
    */
   public static ByteBuffer shortToOrderedBytes(short val, ByteBuffer reuse) {
-    ByteBuffer bytes = ByteBuffers.reuse(reuse, PRIMITIVE_BUFFER_SIZE);
a) Should we replace the contents of intToOrderedBytes to call longToOrderedBytes?
b) Replace all other intToOrderedBytes calls with longToOrderedBytes to avoid double casting (type -> int -> long).
Let me rephrase this comment.
a) Why didn't we handle the same for intToOrderedBytes? It is a duplicate of longToOrderedBytes.
b) Why are we not calling longToOrderedBytes in the new places? Calling intToOrderedBytes will lead to double casting.
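A hedged sketch of the delegation being asked for, assuming the narrower variants produce the same ordered encoding as longToOrderedBytes under Java's implicit widening (method shapes are taken from the diff; this is not the actual PR code):

import java.nio.ByteBuffer;

// Sketch: route the narrower variants through the long variant so the
// ordered-bytes logic lives in one place and new call sites can avoid
// the type -> int -> long double cast.
public static ByteBuffer shortToOrderedBytes(short val, ByteBuffer reuse) {
  return longToOrderedBytes(val, reuse);  // short widens to long implicitly
}

public static ByteBuffer intToOrderedBytes(int val, ByteBuffer reuse) {
  return longToOrderedBytes(val, reuse);  // int widens to long implicitly
}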
Thanks for this awesome cleanup work. It was easy to review commit by commit.
A few minor comments; otherwise it is good to go.
The non-default modules (of Flink and Spark) can be handled in a follow-up PR.
I'd really prefer to keep non-IJ autofixes out of this PR.
The PR is already quite large, and given that we need to fix those things across all Flink/Spark versions, I'd suggest extracting the Flink/Spark related things into a separate PR.
I think the current changes are easy to review if we review commit by commit. The backport to Flink/Spark can be handled in the follow-up.
Force-pushed from 2ab97a5 to 4012df7
Should be good to go once https://github.com/apache/iceberg/pull/10583/files#r1663651011 has been addressed.
Sorry, it's really out of scope of this PR ("Address IntelliJ inspection findings").
We're improving the code, and the null check is being removed since the value wasn't null in the first place, so what prevents us from updating the error message as well?
"Improving code" is very broad, and the null check was never hit. Really, this PR is meant to address the IntelliJ findings and nothing more.
LGTM. @snazy, can you please follow up and backport those changes to the older Flink/Spark versions?
I let IntelliJ run all inspections on the Iceberg source tree, and it returned a fairly large number of findings. A bunch are reported as issues that require attention.
This PR addresses a lot of these inspection issues. None of the individual changes seems controversial, so I opted to push one PR with a couple of commits.