HADOOP-19793. S3A: use long for file size in S3A content providers, data blocks #8225
ajfabbri wants to merge 3 commits into apache:trunk
Conversation
Force-pushed bf2c793 to e179121.
Majority of CI failures due to HADOOP-19790
Force-pushed d0403b1 to 1e1c09c.
💔 -1 overall
This message was automatically generated.
steveloughran
left a comment
+1
all looks good, and I reviewed that SingleFilePendingCommit file size too.
Regarding test failures
#2 may mean you aren't set up to create a session for the target user/account. I can help there
#3 things have been playing up with the MR cluster tests since the move to JUnit 5...getting anything working was a challenge enough.
Fix the spotbugs warnings by addressing the overridable methods it flags in verify, or make SinglePendingCommit final.
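For context, a minimal sketch of the pattern this spotbugs warning targets, using a hypothetical stand-in class (`PendingCommitSketch` is invented for illustration; the real classes are PendingSet and SinglePendingCommit): `readObject()` calls a validation method after deserialization, and declaring that method (or the whole class) final removes the overridable-call hazard.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for SinglePendingCommit. readObject() calls
// validate() after deserialization; if validate() were overridable, a
// subclass could observe the half-constructed object, which is what
// MC_OVERRIDABLE_METHOD_CALL_IN_READ_OBJECT warns about.
class PendingCommitSketch implements Serializable {
  private static final long serialVersionUID = 1L;

  private long length; // file size as a long, per this PR's theme

  PendingCommitSketch(long length) {
    this.length = length;
    validate();
  }

  // final: no subclass can intercept the call made from readObject()
  public final void validate() {
    if (length < 0) {
      throw new IllegalStateException("negative length: " + length);
    }
  }

  private void readObject(ObjectInputStream in)
      throws IOException, ClassNotFoundException {
    in.defaultReadObject();
    validate(); // safe only because validate() is final
  }

  public long getLength() {
    return length;
  }

  /** Serialize and deserialize, exercising the readObject() path. */
  static PendingCommitSketch roundTrip(PendingCommitSketch c)
      throws IOException, ClassNotFoundException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
      oos.writeObject(c);
    }
    try (ObjectInputStream ois =
        new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
      return (PendingCommitSketch) ois.readObject();
    }
  }
}
```

A length above 2 GiB (beyond `int` range) survives the round trip, which is exactly the case the `long` change in this PR is protecting.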
Force-pushed 587e5de to eed4c14.
I already fixed it but CI has been stuck for over 24 hours! https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8225/8/
Force-pushed eed4c14 to f94a921.
Force-push: rebase on latest trunk.
Force-pushed f94a921 to 01e1456.
steveloughran
left a comment
aah, I don't see what spotbugs is complaining about here...I suspect it's the duplicate Class declaration in the same match
<!-- Despite adding `final` as suggested, spotbugs kept complaining. -->
<Match>
  <Class name="org.apache.hadoop.fs.s3a.commit.files.PendingSet"/>
  <Class name="org.apache.hadoop.fs.s3a.commit.files.SinglePendingCommit"/>
maybe you have to only do one Class per match?
Thanks for the idea. I tried it. 🤷♂️
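For what it's worth, the split being suggested might look like this in the exclude file (a sketch, not the actual patch; if I read the spotbugs filter-file docs right, clauses inside a single Match are AND'd together, with an explicit `<Or>` element for disjunction, which would explain why two Class names in one Match never matched anything):

```xml
<!-- Sketch: one Class per Match, each paired with the Bug pattern. -->
<Match>
  <Class name="org.apache.hadoop.fs.s3a.commit.files.PendingSet"/>
  <Bug pattern="MC_OVERRIDABLE_METHOD_CALL_IN_READ_OBJECT"/>
</Match>
<Match>
  <Class name="org.apache.hadoop.fs.s3a.commit.files.SinglePendingCommit"/>
  <Bug pattern="MC_OVERRIDABLE_METHOD_CALL_IN_READ_OBJECT"/>
</Match>
```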
Force-pushed 7945e63 to c7640dc.
💔 -1 overall
This message was automatically generated.
I am at a loss why CI (spotbugs) is failing here. It passes for me locally. That is:

Adding `final` didn't work so I had to add an exclude.
Experimental Warnings: MC_OVERRIDABLE_METHOD_CALL_IN_READ_OBJECT in o.a.h.fs.s3a.commit.files.PendingSet.readObject(ObjectInputStream), and: In method o.a.h.fs.s3a.commit.files.SinglePendingCommit.readObject(ObjectInputStream) Called method o.a.h.fs.s3a.commit.files.SinglePendingCommit.validate()

s3a: spotbugs: make entire class final
Despite changing validate() to final, still get this warning:
Unknown bug pattern MC_OVERRIDABLE_METHOD_CALL_IN_READ_OBJECT in org.apache.hadoop.fs.s3a.commit.files.SinglePendingCommit.readObject(ObjectInputStream) At SinglePendingCommit.java:org.apache.hadoop.fs.s3a.commit.files.SinglePendingCommit.readObject(ObjectInputStream) At SinglePendingCommit.java:[line 201]
Try making the whole class final then.

hadoop-aws: add excludes for spotbugs being buggy
Suppresses an existing warning that my edit re-triggered. Disabled this check for all of hadoop-aws; it is of questionable value.
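The module-wide suppression described here might look like the following in the hadoop-aws spotbugs exclude file (a sketch, assuming the standard FindBugsFilter format; the actual file contents are not shown in this thread):

```xml
<!-- Sketch: suppress the experimental detector for the whole module
     rather than per class. -->
<FindBugsFilter>
  <Match>
    <Bug pattern="MC_OVERRIDABLE_METHOD_CALL_IN_READ_OBJECT"/>
  </Match>
</FindBugsFilter>
```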
Force-pushed c7640dc to f32ae58.
Force-push: rebase on latest trunk.
💔 -1 overall
This message was automatically generated.
steveloughran
left a comment
+1
hadoop-tools/hadoop-aws generated 0 new + 0 unchanged - 2 fixed = 0 total (was 2)
you've fixed the spotbugs...those two were "extant" in the existing code. They probably crept in from a spotbugs update.
merge at your leisure
Description of PR
From HADOOP-19793:
How was this patch tested?
Ran the large scale test with localstack S3.
Ran all integration tests locally against S3 in us-west-2. All passed except:

1. ITestS3ACannedACLs>AbstractS3ATestBase.setup:111->AbstractFSContractTestBase.setup:197->AbstractFSContractTestBase.mkdirs:355 » ... S3Exception: The bucket does not allow ACLs
2. ITestS3ATemporaryCredentials.testSessionTokenPropagation:202 getFileStatus on s3a://fabbri-s3a/job-00-fork-0004/test/testSTS/c20306d0-3eca-4264-b210-3d7beb8c80c7: software.amazon.awssdk.services.s3.model.S3Exception: Forbidden (Service: S3, Status Code: 403, Request ID: 6GV4SMMX9EXVDB0J, Extended Request ID: nB4bQ+CHn+Y2BiR4lLSgLtHyWVd05eerIXrsNDhV9Z0o9I/KZ+N8VD3khlSsiJvh0HkuCjDMtB4=): null

The first failure seems OK given I haven't enabled bucket ACLs.
I'm not sure what the problem is with 2 & 3.
For code changes:
LICENSE, LICENSE-binary, NOTICE-binary files?

AI Tooling
If an AI tool was used:
Nope.
https://www.apache.org/legal/generative-tooling.html