HADOOP-18480. Upgrade aws sdk to 1.12.316 #4972


Merged

Conversation

steveloughran
Contributor

@steveloughran steveloughran commented Oct 5, 2022

Description of PR

Move up to the latest AWS SDK before the next release.

How was this patch tested?

Tests against S3 London in progress.

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

Change-Id: I078b5f6806857bfc30e15c6b19d05ef60e0ef243
@steveloughran
Contributor Author

no accidental export of dependencies from the new JAR

@steveloughran
Contributor Author

Tests are getting slower on trunk; the new prefetcher adds more, and many of the existing tests seem to be struggling today.

[INFO] Tests run: 45, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 130.748 s - in org.apache.hadoop.fs.s3a.select.ITestS3Select
[INFO] Running org.apache.hadoop.fs.s3a.ITestS3AEncryptionAlgorithmValidation
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.008 s - in org.apache.hadoop.fs.s3a.ITestS3AEncryptionAlgorithmValidation
[INFO] Running org.apache.hadoop.fs.s3a.ITestS3AEncryptionWithDefaultS3Settings
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.417 s - in org.apache.hadoop.fs.s3a.ITestS3AEndpointRegion
[INFO] Running org.apache.hadoop.fs.s3a.ITestS3AIOStatisticsContext
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.004 s - in org.apache.hadoop.fs.s3a.performance.ITestS3ADeleteCost
[WARNING] Tests run: 4, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 9.263 s - in org.apache.hadoop.fs.s3a.ITestS3AEncryptionWithDefaultS3Settings
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.609 s - in org.apache.hadoop.fs.s3a.ITestS3AIOStatisticsContext
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 265.62 s - in org.apache.hadoop.fs.s3a.ITestS3APrefetchingInputStream
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 83.283 s - in org.apache.hadoop.fs.s3a.ITestS3AContractGetFileStatusV1List
[INFO] Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 165.717 s - in org.apache.hadoop.fs.s3a.performance.ITestDirectoryMarkerListing
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 755.827 s - in org.apache.hadoop.fs.s3a.scale.ITestS3AInputStreamPerformance

I am doing a Hadoop 3.3 release in Docker, but that isn't consuming all the CPU/network.

One failure:

ITestS3AContractVectoredRead.testStopVectoredIoOperationsUnbuffer:143 Expected an exception of type class java.io.InterruptedIOException

@steveloughran
Contributor Author

Test run against S3 London with -Dscale; also ran ILoadTestS3ABulkDeleteThrottling.

[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hadoop.fs.s3a.scale.ILoadTestS3ABulkDeleteThrottling
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 311.367 s - in org.apache.hadoop.fs.s3a.scale.ILoadTestS3ABulkDeleteThrottling

Reviewing the output showed a new warning message about the default env vars credential provider. Filed https://issues.apache.org/jira/browse/HADOOP-18481; we should keep quiet about it and, on the v2 SDK, just map to the new provider.
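For context, what triggers the warning is a direct reference to the SDK V1 class in the S3A credential chain. A minimal illustration (the property name is the standard S3A key; the value is the V1 class named in the warning logged below):

```xml
<!-- Example only: naming the AWS SDK V1 class directly in the
     credential chain is what produces the SDKV2Upgrade warning. -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>com.amazonaws.auth.EnvironmentVariableCredentialsProvider</value>
</property>
```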

The big upload took a long time:

time bin/hadoop fs -copyFromLocal -t 10  share/hadoop/tools/lib/*aws*jar $BUCKET/

2022-10-05 15:46:38,451 [main] WARN  s3a.SDKV2Upgrade (LogExactlyOnce.java:warn(39)) - Directly referencing AWS SDK V1 credential provider com.amazonaws.auth.EnvironmentVariableCredentialsProvider. AWS SDK V1 credential providers will be removed once S3A is upgraded to SDK V2
2022-10-05 15:46:38,872 [main] INFO  impl.DirectoryPolicyImpl (DirectoryPolicyImpl.java:getDirectoryPolicy(189)) - Directory markers will be kept
2022-10-05 15:52:01,205 [shutdown-hook-0] INFO  statistics.IOStatisticsLogging (IOStatisticsLogging.java:logIOStatisticsAtLevel(269)) - IOStatistics: counters=((action_executor_acquired=10)
(action_http_head_request=12)
(audit_request_execution=49)
(audit_span_creation=13)
(files_copied=2)
(files_copied_bytes=290454036)
(files_created=2)
(files_deleted=2)
(multipart_upload_completed=1)
(object_copy_requests=2)
(object_delete_objects=2)
(object_delete_request=2)
(object_list_request=8)
(object_metadata_request=12)
(object_multipart_initiated=2)
(object_put_bytes=290454036)
(object_put_request=1)
(object_put_request_completed=10)
(op_create=2)
(op_get_file_status=7)
(op_get_file_status.failures=4)
(op_glob_status=1)
(op_rename=2)
(store_io_request=49)
(stream_write_block_uploads=19)
(stream_write_bytes=290454036)
(stream_write_queue_duration=121585)
(stream_write_total_data=580908072)
(stream_write_total_time=2213558));

gauges=((stream_write_block_uploads_pending=1));

minimums=((action_executor_acquired.min=0)
(action_http_head_request.min=34)
(object_delete_request.min=57)
(object_list_request.min=39)
(object_multipart_initiated.min=143)
(object_put_request.min=5560)
(op_create.min=52)
(op_get_file_status.failures.min=81)
(op_get_file_status.min=1)
(op_glob_status.min=4)
(op_rename.min=548));

maximums=((action_executor_acquired.max=121575)
(action_http_head_request.max=815)
(object_delete_request.max=110)
(object_list_request.max=219)
(object_multipart_initiated.max=143)
(object_put_request.max=5560)
(op_create.max=85)
(op_get_file_status.failures.max=888)
(op_get_file_status.max=58)
(op_glob_status.max=4)
(op_rename.max=1745));

means=((action_executor_acquired.mean=(samples=19, sum=243159, mean=12797.8421))
(action_http_head_request.mean=(samples=12, sum=1591, mean=132.5833))
(object_delete_request.mean=(samples=2, sum=167, mean=83.5000))
(object_list_request.mean=(samples=8, sum=616, mean=77.0000))
(object_multipart_initiated.mean=(samples=2, sum=284, mean=142.0000))
(object_put_request.mean=(samples=1, sum=5560, mean=5560.0000))
(op_create.mean=(samples=2, sum=137, mean=68.5000))
(op_get_file_status.failures.mean=(samples=4, sum=1610, mean=402.5000))
(op_get_file_status.mean=(samples=3, sum=111, mean=37.0000))
(op_glob_status.mean=(samples=1, sum=4, mean=4.0000))
(op_rename.mean=(samples=2, sum=2293, mean=1146.5000)));


________________________________________________________
Executed in  323.54 secs    fish           external
   usr time   12.77 secs   49.00 micros   12.77 secs
   sys time    3.53 secs  836.00 micros    3.53 secs


Notable that the time spent waiting for an executor was tangible.
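To put a rough number on that, a back-of-the-envelope calculation using the action_executor_acquired figures from the IOStatistics dump above (durations are in milliseconds):

```python
# Figures copied from the IOStatistics log above.
samples = 19       # action_executor_acquired.mean: samples
total_ms = 243159  # action_executor_acquired.mean: sum
max_ms = 121575    # action_executor_acquired.max

mean_wait_s = total_ms / samples / 1000
print(f"mean wait for an executor: {mean_wait_s:.1f} s")    # ~12.8 s
print(f"worst-case wait:           {max_ms / 1000:.1f} s")  # ~121.6 s
```

So a block upload spent nearly 13 seconds on average just queuing for an executor, with a worst case of about two minutes.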

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 20m 21s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 shelldocs 0m 1s Shelldocs was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
-1 ❌ test4tests 0m 0s The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.
_ trunk Compile Tests _
+0 🆗 mvndep 15m 27s Maven dependency ordering for branch
+1 💚 mvninstall 30m 26s trunk passed
+1 💚 compile 27m 18s trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04
+1 💚 compile 23m 40s trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
+1 💚 mvnsite 22m 2s trunk passed
+1 💚 javadoc 9m 18s trunk passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 8m 1s trunk passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
+1 💚 shadedclient 40m 35s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 38s Maven dependency ordering for patch
+1 💚 mvninstall 26m 16s the patch passed
+1 💚 compile 27m 42s the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04
+1 💚 javac 27m 42s the patch passed
+1 💚 compile 24m 21s the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
+1 💚 javac 24m 21s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 mvnsite 22m 56s the patch passed
+1 💚 shellcheck 0m 1s No new issues.
+1 💚 javadoc 9m 47s the patch passed with JDK Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04
+1 💚 javadoc 9m 0s the patch passed with JDK Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
+1 💚 shadedclient 45m 30s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 1071m 6s /patch-unit-root.txt root in the patch passed.
+1 💚 asflicense 2m 21s The patch does not generate ASF License warnings.
1403m 41s
Reason Tests
Failed junit tests hadoop.hdfs.server.namenode.ha.TestObserverNode
Subsystem Report/Notes
Docker ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/artifact/out/Dockerfile
GITHUB PR #4972
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient codespell detsecrets xmllint shellcheck shelldocs
uname Linux 762b195f39e5 4.15.0-191-generic #202-Ubuntu SMP Thu Aug 4 01:49:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / e351ff3
Default Java Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.16+8-post-Ubuntu-0ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_342-8u342-b07-0ubuntu1~20.04-b07
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/testReport/
Max. process+thread count 3438 (vs. ulimit of 5500)
modules C: hadoop-project . U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4972/1/console
versions git=2.25.1 maven=3.6.3 shellcheck=0.7.0
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@steveloughran
Contributor Author

Test failure is covered by "HDFS-16142. TestObservernode#testMkdirsRaceWithObserverRead is flaky".

Contributor

@mukund-thakur mukund-thakur left a comment


LGTM +1.
Ran the tests. Everything seems OK apart from the known failures.

@steveloughran steveloughran merged commit 540a660 into apache:trunk Oct 10, 2022
asfgit pushed a commit that referenced this pull request Oct 10, 2022
HarshitGupta11 pushed a commit to HarshitGupta11/hadoop that referenced this pull request Nov 28, 2022