HADOOP-18631 (ADDENDUM) Use LogCapturer to match audit log pattern and remove hdfs async audit log configs #5451

Merged: 9 commits into apache:trunk, Mar 17, 2023

Conversation

virajjasani (Contributor)

@virajjasani virajjasani commented Mar 3, 2023

We don't need to delete the audit log file. Even with other tests writing log entries, TestFsck should still be able to find its specific audit log patterns, so the file should not be deleted as post-test cleanup: other tests running in parallel may be writing audit logs to the same location at the same time.

Remove the following configs (previously set in hdfs-site.xml; a sketch follows after this list):

  • dfs.namenode.audit.log.async
  • dfs.namenode.audit.log.async.blocking
  • dfs.namenode.audit.log.async.buffer.size
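
For reference, these keys were previously set in hdfs-site.xml roughly like this (a sketch; the values shown are only illustrative):

  <!-- Illustrative sketch of the removed hdfs-site.xml keys; async audit
       logging is now configured purely through log4j properties. -->
  <property>
    <name>dfs.namenode.audit.log.async</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.namenode.audit.log.async.blocking</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.namenode.audit.log.async.buffer.size</name>
    <value>128</value>
  </property>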

@virajjasani (Contributor Author)

Even though the test never failed in any of the build runs on PR #5418, after the PR was merged it failed in the daily builds (e.g. here).

@virajjasani virajjasani changed the title HADOOP-18631 (ADDENDUM) Avoid deleting audit log file under heavy writes HADOOP-18631 (ADDENDUM) Avoid deleting audit log file when concurrently accessed by other tests Mar 3, 2023
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 47s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 39m 38s trunk passed
+1 💚 compile 1m 27s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 24s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 7s trunk passed
+1 💚 mvnsite 1m 31s trunk passed
+1 💚 javadoc 1m 9s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 31s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 33s trunk passed
+1 💚 shadedclient 23m 8s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 19s the patch passed
+1 💚 compile 1m 23s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 23s the patch passed
+1 💚 compile 1m 15s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 15s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 51s the patch passed
+1 💚 mvnsite 1m 20s the patch passed
+1 💚 javadoc 0m 52s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 28s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 21s the patch passed
+1 💚 shadedclient 23m 10s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 205m 5s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 51s The patch does not generate ASF License warnings.
314m 8s
Reason Tests
Failed junit tests hadoop.hdfs.server.sps.TestExternalStoragePolicySatisfier
hadoop.hdfs.server.datanode.TestDirectoryScanner
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/1/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 3b522cdbd452 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 8a834fe
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/1/testReport/
Max. process+thread count 3256 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/1/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

Member

@ayushtkn ayushtkn left a comment

The file should not be deleted as post-test cleanup: other tests running in parallel may be writing audit logs to the same location at the same time.

Today they might not collide, but tomorrow, with new tests or changes in the code, they might.

Have a different, unique location per test.

@virajjasani (Contributor Author)

Have a different, unique location per test.

This is an option, and I thought of moving the entire test to the hdfs-client module. Do you think that would be good? Moving the test to a module other than hadoop-hdfs is the only way we can cleanly use a different file location for the audit logs. Does the client module look good to you? Or I could move just those 3 tests out of TestFsck into a new test, say TestFsckAuditLogs, in the hadoop-hdfs-client module?

@virajjasani (Contributor Author)

Basically, the clean way is to have a separate log4j properties file for this test, hence the need to move the test to a different module.

@ayushtkn (Member)

ayushtkn commented Mar 3, 2023

This is an option, and I thought of moving the entire test to the hdfs-client module. Do you think that would be good?

Nope, I don't think we can move the test; we have to find some other way out.

@virajjasani (Contributor Author)

Btw, no other test actually deletes the file; this was the only one in the previous PR. Not deleting the file will not cause any issues for other tests either, because they read from the same file as this test.

@ayushtkn (Member)

ayushtkn commented Mar 3, 2023

The path was unique earlier

  static final String AUDITLOG_FILE =
      GenericTestUtils.getTempPath("TestFsck-audit.log");

@virajjasani (Contributor Author)

The path was unique earlier

  static final String AUDITLOG_FILE =
      GenericTestUtils.getTempPath("TestFsck-audit.log");

That's because the previous version was dynamically adding appenders and disregarding the actual appenders set in log4j properties. That was a hack, and it would not work with log4j2, hence we changed it in the previous PR. Now the tests share the same file location for the audit log, because they all share the same log4j properties.

We could create a custom log4j properties file and reload it dynamically. That's an option, but it would be another hack: while log4j2 allows this, the APIs are not public and can break compatibility.
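
For context, the dynamically-added-appender hack mentioned above looked roughly like this (a sketch, not the exact pre-#5418 TestFsck code; the class and method names here are made up):

  import java.io.IOException;
  import org.apache.log4j.Level;
  import org.apache.log4j.Logger;
  import org.apache.log4j.PatternLayout;
  import org.apache.log4j.RollingFileAppender;

  public class LegacyAuditAppenderHack {
    // The test swapped in its own file appender, disregarding whatever
    // appenders log4j.properties had configured for the audit logger.
    static void redirectAuditLog(String auditLogFile) throws IOException {
      Logger auditLogger = Logger.getLogger(
          "org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit");
      auditLogger.removeAllAppenders();   // drop the configured appenders
      auditLogger.setLevel(Level.INFO);
      auditLogger.addAppender(
          new RollingFileAppender(new PatternLayout("%m%n"), auditLogFile));
      // log4j 1.x-only API; there is no supported log4j2 equivalent,
      // which is why this pattern had to go.
    }
  }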

@virajjasani (Contributor Author)

Here is the most important distinction between the tests before and after PR #5418:

Before the patch: each test decided which log file to use and disabled the async appender soon after the expected logs were written, so it could match the expected regex. This is a hack, because overriding log4j properties is not compatible across log4j versions.

After the patch: the tests share the same audit log file and no longer apply the hack of changing the log file dynamically and stopping appenders on demand. Instead, the tests search only for the regex of their own specific file operations, because the audit log will contain operations from other tests as well. We no longer have any hacks here that log4j2 does not support (or recommend).

@virajjasani (Contributor Author)

virajjasani commented Mar 3, 2023

The options we have now:

  1. Do not delete the file in TestFsck because it is already used by other tests (this should be followed anyway, even if we go with another solution)
  2. Have a new log4j properties file for TestFsck, either by:
  • Moving the test to another module so it can use a different log4j properties file (not preferred)
  • Creating a new custom log4j properties file and hack-loading it dynamically for TestFsck (some YARN test does this hack as of today, but the APIs involved are not meant for downstream use and can break anytime, so it's not a safe option).

@ayushtkn (Member)

ayushtkn commented Mar 3, 2023

Do not delete the file in TestFsck because it is already used by other tests (this should be followed anyway, even if we go with another solution)

&

Moving the test to another module so it can use a different log4j properties file (not preferred)

I am against both. The first one is overconfident that the tests won't collide: today we know they don't, but if tomorrow they suddenly do, other folks who don't know about this would struggle to figure it out. We don't want to leave the code in a worse state than we have today.

For the second one, I have already given the reason above.

Creating a new custom log4j properties file and hack-loading it dynamically for TestFsck (some YARN test does this hack as of today)

If they are already doing it, then we have a ready solution; let's do it, or pull that into a util and use it everywhere. I don't have hard feelings about the other options (as of now), but this one sounds best to me.

@virajjasani (Contributor Author)

virajjasani commented Mar 3, 2023

If they are already doing it, then we have a ready solution; let's do it, or pull that into a util and use it everywhere.

It's just another hack, which is why I was a bit reluctant to use it; also, log4j2 has no public API for it, so this can break anytime :(
Let me check whether it would solve our problem for sure.

@virajjasani (Contributor Author)

Sorry Ayush, this would not work, as it would mess up the state of MiniDFSCluster and the NameNode audit logger for other tests too.

@virajjasani (Contributor Author)

virajjasani commented Mar 3, 2023

Please ignore this comment. We have a better solution (see the next comment).

Also, FWIW, in this particular revision I removed the file delete operation from TestAuditLogs as well, because deleting the file is no longer needed now that all tests share the same file.
bace039

What we can do to avoid collisions is keep all audit-log-specific tests in a single test class, for instance TestAuditLogs. That would ensure the file is cleaned up before the test starts, and that class would do its own duty of checking that all the necessary regex patterns match.
TestFsck has only 3 tests that need to validate audit log regexes, so we can move them to TestAuditLogs and check the fsck-specific audit log regexes there too.

@virajjasani (Contributor Author)

Ayush, we have a better way: we can avoid accessing the file directly and use the LogCapturer utility instead. I have tested the change.

@virajjasani virajjasani changed the title HADOOP-18631 (ADDENDUM) Avoid deleting audit log file when concurrently accessed by other tests HADOOP-18631 (ADDENDUM) Use LogCapturer by TestFsck to match audit log pattern Mar 3, 2023
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 1m 37s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 40m 14s trunk passed
+1 💚 compile 1m 26s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 22s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 7s trunk passed
+1 💚 mvnsite 1m 32s trunk passed
+1 💚 javadoc 1m 10s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 30s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 37s trunk passed
+1 💚 shadedclient 24m 3s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 32s the patch passed
+1 💚 compile 1m 22s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 22s the patch passed
+1 💚 compile 1m 19s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 19s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 54s the patch passed
+1 💚 mvnsite 1m 21s the patch passed
+1 💚 javadoc 0m 50s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 24s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 16s the patch passed
+1 💚 shadedclient 23m 3s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 205m 21s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 51s The patch does not generate ASF License warnings.
316m 26s
Reason Tests
Failed junit tests hadoop.hdfs.server.datanode.TestDirectoryScanner
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/2/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux b41b0c2a83d3 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 257f8f6
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/2/testReport/
Max. process+thread count 3260 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/2/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 53s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 41m 17s trunk passed
+1 💚 compile 1m 32s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 19s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 7s trunk passed
+1 💚 mvnsite 1m 28s trunk passed
+1 💚 javadoc 1m 8s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 30s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 37s trunk passed
+1 💚 shadedclient 25m 45s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 24s the patch passed
+1 💚 compile 1m 23s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 23s the patch passed
+1 💚 compile 1m 14s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 14s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 55s the patch passed
+1 💚 mvnsite 1m 22s the patch passed
+1 💚 javadoc 0m 52s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 24s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 33s the patch passed
+1 💚 shadedclient 25m 26s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 226m 40s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 43s The patch does not generate ASF License warnings.
341m 34s
Reason Tests
Failed junit tests hadoop.hdfs.server.datanode.TestDirectoryScanner
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/3/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux d58ea4e17fc9 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 167274c
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/3/testReport/
Max. process+thread count 2324 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/3/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@virajjasani virajjasani requested a review from ayushtkn March 4, 2023 19:11
Member

@ayushtkn ayushtkn left a comment

TestAuditLogs is also doing this file logic from the last PR; we should change it to use the same approach there as well.

@ayushtkn (Member)

ayushtkn commented Mar 5, 2023

The rest of it in general looks good to me now.
@jojochuang was the original reviewer; looping him in as well. We have to wait for him, and if he isn't convinced by this then we have to roll back the original PR and find another solution.

To me the present stuff looks ok though...

@virajjasani (Contributor Author)

TestAuditLogs is also doing this file logic from the last PR; we should change it to use the same approach there as well.

Yeah, that's fine too.
On the other hand, we could also keep things as they are, i.e. one test reads the file and the other reads through its own LogCapturer appender (as of the current state of this PR). Both use the captured contents to match the regex for the particular filesystem operations they perform. There is no attempt to modify the file (like deleting it) anymore, so things should be good AFAICT.

Thank you @ayushtkn for the review. And sure let's wait for @jojochuang too.

@ayushtkn (Member)

ayushtkn commented Mar 5, 2023

Ok, I think I got misled then. Do you mean to say we are reading via different mechanisms but from the same place?
Will the appender logic catch output from other tests as well when run in parallel? I have to debug that, I think...

And one more thing to try: if I delete the file while the appender is in use, will the test using the appender fail?

@virajjasani (Contributor Author)

virajjasani commented Mar 5, 2023

Do you mean to say we are reading via different mechanisms but from the same place?

Basically, they read from their own WriterAppender, so there is no sync issue while reading.
The beauty of using LogCapturer is that every test that uses it gets its own appender. Hence, every test with a new instance of LogCapturer captures the logs of the given logger (the audit log in our case) in its own WriterAppender.

Will the appender logic catch output from other tests as well when run in parallel?

Multiple tests can read logs from the given single logger instance (slf4j/log4j).
Every test has its own new appender instance, so no two tests using different instances of LogCapturer have to worry about reading from a common place (like the console or a file); each has its own private appender to read the logs from. This part does that nice logic:

    private LogCapturer(Logger logger) {
      this.logger = logger;
      // Reuse the layout of the root logger's stdout/console appender when present.
      Appender defaultAppender = Logger.getRootLogger().getAppender("stdout");
      if (defaultAppender == null) {
        defaultAppender = Logger.getRootLogger().getAppender("console");
      }
      final Layout layout = (defaultAppender == null) ? new PatternLayout() :
          defaultAppender.getLayout();
      // 'sw' is LogCapturer's own StringWriter field; each instance attaches its
      // own WriterAppender (writing into that StringWriter) to the target logger.
      this.appender = new WriterAppender(layout, sw);
      logger.addAppender(this.appender);
    }

So, let's say tomorrow we write a new test that also needs to read namenode audit logs: the test just creates a new LogCapturer object and extracts the logs from it, that's it. It doesn't interfere with any other tests, even when they run simultaneously. TestFsck has its own writer appender. TestAuditLogs, on the other hand, reads directly from the file that the log4j properties use for the primary RFA appender, which I think is good so that at least one test also validates the output of the primary appender directly and verifies the regex.

Now, even if TestAuditLogs deletes the file such that every log produced by TestFsck is gone, it is still not a concern for TestFsck: TestFsck has its own writer appender instance as a secondary appender and, when reading logs, it uses that secondary appender and no longer relies on the primary appender (the file).

FWIW, I believe LogCapturer is a really nice utility.
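
For illustration, a new test that needs its own capture of the namenode audit log could do something like this (a minimal sketch; the class name, test method and assertion are made up, only LogCapturer and the audit logger name come from the existing code):

  import static org.junit.Assert.assertTrue;

  import org.apache.hadoop.test.GenericTestUtils.LogCapturer;
  import org.slf4j.Logger;
  import org.slf4j.LoggerFactory;

  public class AuditLogCaptureSketch {
    // NameNode audit logger, referenced by name (FSNamesystem.class.getName() + ".audit").
    private static final Logger AUDIT_LOG = LoggerFactory.getLogger(
        "org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit");

    public void verifyFsckAuditEntry() throws Exception {
      // Each LogCapturer attaches its own WriterAppender to the logger, so
      // parallel tests never read each other's captured output.
      LogCapturer auditLogCapture = LogCapturer.captureLogs(AUDIT_LOG);
      try {
        // ... run fsck / the filesystem operations under test here ...
        String audit = auditLogCapture.getOutput();
        // Match only the operations this test itself performed, e.g. cmd=fsck.
        assertTrue(audit.contains("cmd=fsck"));
      } finally {
        auditLogCapture.stopCapturing();
      }
    }
  }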

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 1m 17s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 39m 11s trunk passed
+1 💚 compile 1m 27s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 23s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 8s trunk passed
+1 💚 mvnsite 1m 32s trunk passed
+1 💚 javadoc 1m 11s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 37s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 42s trunk passed
+1 💚 shadedclient 22m 31s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 16s the patch passed
+1 💚 compile 1m 25s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 25s the patch passed
+1 💚 compile 1m 16s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 16s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 51s the patch passed
+1 💚 mvnsite 1m 24s the patch passed
+1 💚 javadoc 0m 54s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 25s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 33s the patch passed
+1 💚 shadedclient 22m 25s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 211m 17s hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 49s The patch does not generate ASF License warnings.
319m 1s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/6/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux da5ca8c350f6 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 1483940
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/6/testReport/
Max. process+thread count 3767 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/6/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 38s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 38m 56s trunk passed
+1 💚 compile 1m 32s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 25s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 6s trunk passed
+1 💚 mvnsite 1m 29s trunk passed
+1 💚 javadoc 1m 11s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 38s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 36s trunk passed
+1 💚 shadedclient 22m 42s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 21s the patch passed
+1 💚 compile 1m 23s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 23s the patch passed
+1 💚 compile 1m 15s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 15s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 54s the patch passed
+1 💚 mvnsite 1m 25s the patch passed
+1 💚 javadoc 0m 54s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 30s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 20s the patch passed
+1 💚 shadedclient 22m 18s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 213m 32s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 47s The patch does not generate ASF License warnings.
320m 43s
Reason Tests
Failed junit tests hadoop.hdfs.server.namenode.ha.TestObserverNode
hadoop.hdfs.TestReconstructStripedFileWithRandomECPolicy
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/7/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 8ce823950cca 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 1483940
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/7/testReport/
Max. process+thread count 3388 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/7/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 50s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 39m 1s trunk passed
+1 💚 compile 1m 28s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 21s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 7s trunk passed
+1 💚 mvnsite 1m 31s trunk passed
+1 💚 javadoc 1m 10s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 34s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 30s trunk passed
+1 💚 shadedclient 22m 32s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 18s the patch passed
+1 💚 compile 1m 17s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 17s the patch passed
+1 💚 compile 1m 16s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 16s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 51s the patch passed
+1 💚 mvnsite 1m 19s the patch passed
+1 💚 javadoc 0m 50s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 28s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 13s the patch passed
+1 💚 shadedclient 22m 11s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 204m 25s hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 49s The patch does not generate ASF License warnings.
311m 4s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/9/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 950e387bc2aa 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 86ae052
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/9/testReport/
Max. process+thread count 3455 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/9/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

Member

@ayushtkn ayushtkn left a comment

Ok, in general this looks ok to me from a test-correctness standpoint, if you have verified that the tests no longer affect each other when run in parallel.

I am not chasing this log4j upgrade myself, so I will leave it to @jojochuang to confirm whether everything is fine here. If I don't hear from anyone chasing this upgrade, I will revert the original PR by early next week so that the original reviewers can come back; that is the only way I can get their attention...

@virajjasani (Contributor Author)

@ayushtkn I request that you please not revert the original PR. This PR anyway restricts the log output to each individual test by giving every test its own writer appender.
I hope you already know how difficult it is to get reviews for individual sub-tasks when not everyone has the bandwidth to do so :(

@virajjasani (Contributor Author)

If you run the tests in parallel, you can easily verify that logs produced by one test are not written to another test's output, as each has its own appender responsible only for its own log output. Hence, no two tests using the LogCapturer utility will collide, i.e. they will never see each other's log output.

@jojochuang @ayushtkn

Member

@ayushtkn ayushtkn left a comment

LGTM.
Still, I will wait one week before merging this, to give others enough time.

@virajjasani (Contributor Author)

Thank you @ayushtkn!

@ayushtkn (Member)

@virajjasani I was going to commit this, considering @jojochuang isn't interested in it, but the original PR now looks incompatible to me: it took away some abilities we could use today, and the "deprecated" label is also misused. AFAIK deprecated means "we aren't going to maintain this and it won't work in a future release", rather than "it is not working, use something else".

@virajjasani (Contributor Author)

it took away some abilities we could use today, and the "deprecated" label is also misused

Ayush, I have added a temporary appender (which will be removed later, when we finally move to log4j2) that provides the same abilities.

I posted this on the parent Jira, HADOOP-16206, as well:

  • Migrated async appenders from code to log4j properties for the namenode audit logger as well as the datanode/namenode metric loggers
  • Provided sample log4j properties showing how an AsyncAppender can wrap the RFA for these loggers (a minimal properties sketch follows below)
  • Incompatible change as three hdfs-site configs are no longer in use; they are to be replaced with log4j properties.
  • Deprecated them, and added javadoc and a log message pointing to the log4j replacement rather than relying on the configs. The configs:
    • dfs.namenode.audit.log.async
    • dfs.namenode.audit.log.async.blocking
    • dfs.namenode.audit.log.async.buffer.size
  • The namenode audit logger as well as the datanode/namenode metric loggers now use an SLF4J logger rather than a log4j logger directly
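
For reference, the log4j side roughly takes the shape of the stock hdfs audit section below (a sketch with illustrative values; the actual async-wrapping appender sample is attached to the Jira and is not reproduced here):

  # Sketch of the synchronous RFA audit appender from Hadoop's stock
  # log4j.properties; the async wrapping referenced above is layered on top
  # of this per the sample on HADOOP-16206 / HADOOP-18631.
  hdfs.audit.logger=INFO,RFAAUDIT
  log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
  log4j.additivity.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=false
  log4j.appender.RFAAUDIT=org.apache.log4j.RollingFileAppender
  log4j.appender.RFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
  log4j.appender.RFAAUDIT.layout=org.apache.log4j.PatternLayout
  log4j.appender.RFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
  log4j.appender.RFAAUDIT.MaxFileSize=256MB
  log4j.appender.RFAAUDIT.MaxBackupIndex=20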

@virajjasani (Contributor Author)

"Incompatible change as three hdfs-site configs are no longer in use, they are to be replaced with log4j properties."
The overall log4j2 migration is incompatible anyway. There was a lot of discussion around that in the parent Jira early on.

cc @Apache9 @jojochuang

@virajjasani (Contributor Author)

virajjasani commented Mar 15, 2023

I have added release notes for HADOOP-18631 to explain this. Also, both the parent Jira (HADOOP-16206) and the current Jira (HADOOP-18631) are marked as incompatible changes. I marked the current Jira now; the parent Jira was marked incompatible long back.

This is the log added with the current Jira:

  private static void checkForAsyncLogEnabledByOldConfigs(Configuration conf) {
    if (conf.getBoolean(DFS_NAMENODE_AUDIT_LOG_ASYNC_KEY, DFS_NAMENODE_AUDIT_LOG_ASYNC_DEFAULT)) {
      LOG.warn("Use log4j properties to enable async log for audit logs. {} is deprecated",
          DFS_NAMENODE_AUDIT_LOG_ASYNC_KEY);
    }
  }

Member

@ayushtkn ayushtkn left a comment

I take my vote back.
Ok, this is incompatible; you must have enough agreement for this change, so I am not getting into that...

It being incompatible is ok, but the configs "deprecated"? They aren't deprecated, they are removed outright.

@jojochuang (Contributor)

+1, merging this one. In any case, it is unfortunately impossible to upgrade to log4j2 without any friction. We do what we can at code review, but ultimately it's up to the downstream application developers to adopt and change. Most likely we'll find issues during integration tests, but I don't see any other way.

@ayushtkn (Member)

Still ok, I don't want to get into this activity; I will trust you on this. But fundamentally, why are you marking the configs deprecated? Just remove them. They aren't working, so why keep them marked as deprecated? You aren't removing them in a "future" release, you have "removed" them.

but ultimately it's up to the downstream application developers to adopt and change

Hope they do that....

@virajjasani (Contributor Author)

Ok sir, updated the PR. Let me update the title to reflect this and also update the release notes accordingly.
Thank you @ayushtkn @jojochuang

@ayushtkn (Member)

Wei-Chiu, go ahead and commit; no objections from my side. The test gets fixed, and I am happy with just that.

@virajjasani virajjasani changed the title HADOOP-18631 (ADDENDUM) Use LogCapturer to match audit log pattern HADOOP-18631 (ADDENDUM) Use LogCapturer to match audit log pattern and remove hdfs async audit log configs Mar 15, 2023
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 33s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 43m 32s trunk passed
+1 💚 compile 1m 30s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 20s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 9s trunk passed
+1 💚 mvnsite 1m 31s trunk passed
+1 💚 javadoc 1m 9s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 31s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 31s trunk passed
+1 💚 shadedclient 22m 45s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 16s the patch passed
+1 💚 compile 1m 18s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 18s the patch passed
+1 💚 compile 1m 10s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 10s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 53s the patch passed
+1 💚 mvnsite 1m 18s the patch passed
+1 💚 javadoc 0m 51s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 27s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 17s the patch passed
+1 💚 shadedclient 22m 35s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 205m 17s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 50s The patch does not generate ASF License warnings.
316m 28s
Reason Tests
Failed junit tests hadoop.tools.TestHdfsConfigFields
hadoop.hdfs.server.datanode.TestDirectoryScanner
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/10/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 8a50d0c733dc 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / af017d1
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/10/testReport/
Max. process+thread count 3202 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/10/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@virajjasani virajjasani requested a review from ayushtkn March 16, 2023 05:16
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 25m 43s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 40m 26s trunk passed
+1 💚 compile 1m 37s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 27s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 17s trunk passed
+1 💚 mvnsite 1m 34s trunk passed
+1 💚 javadoc 1m 8s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 39s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 53s trunk passed
+1 💚 shadedclient 24m 15s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 18s the patch passed
+1 💚 compile 1m 17s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 17s the patch passed
+1 💚 compile 1m 13s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 13s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 54s the patch passed
+1 💚 mvnsite 1m 20s the patch passed
+1 💚 javadoc 0m 50s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 27s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 15s the patch passed
+1 💚 shadedclient 22m 20s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 100m 18s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 44s The patch does not generate ASF License warnings.
235m 28s
Reason Tests
Failed junit tests hadoop.hdfs.server.datanode.fsdataset.impl.TestFsVolumeList
hadoop.hdfs.server.datanode.TestDnRespectsBlockReportSplitThreshold
hadoop.hdfs.TestDecommissionWithBackoffMonitor
hadoop.hdfs.server.datanode.TestDataNodeExit
hadoop.hdfs.TestEncryptedTransfer
hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistPolicy
hadoop.hdfs.server.datanode.TestTriggerBlockReport
hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistFiles
hadoop.hdfs.server.datanode.TestReadOnlySharedStorage
hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl
hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistLockedMemory
hadoop.hdfs.server.datanode.fsdataset.impl.TestDatanodeRestart
hadoop.hdfs.server.datanode.TestDataNodeLifeline
hadoop.hdfs.server.datanode.TestDiskError
hadoop.hdfs.server.datanode.TestHSync
hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetCache
hadoop.hdfs.server.datanode.fsdataset.impl.TestScrLazyPersistFiles
hadoop.hdfs.server.datanode.TestBlockRecovery2
hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureToleration
hadoop.hdfs.server.datanode.TestDataNodeTcpNoDelay
hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureReporting
hadoop.hdfs.server.datanode.TestBatchIbr
hadoop.hdfs.server.datanode.TestDataNodeRollingUpgrade
hadoop.hdfs.server.datanode.TestIncrementalBlockReports
hadoop.hdfs.server.datanode.TestCachingStrategy
hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyWriter
hadoop.hdfs.server.datanode.TestDataNodeInitStorage
hadoop.hdfs.server.datanode.TestDataNodeErasureCodingMetrics
hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistReplicaRecovery
hadoop.hdfs.server.datanode.TestLargeBlockReport
hadoop.hdfs.server.datanode.TestBlockHasMultipleReplicasOnSameDN
hadoop.hdfs.server.datanode.TestDataNodeMXBean
hadoop.hdfs.server.datanode.TestDataNodeECN
hadoop.hdfs.server.datanode.TestDataNodeHotSwapVolumes
hadoop.hdfs.server.datanode.TestRefreshNamenodes
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/11/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint
uname Linux 89234d0aeb9b 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 6ecd8d9
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/11/testReport/
Max. process+thread count 2748 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/11/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 12m 8s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 40m 10s trunk passed
+1 💚 compile 1m 25s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 20s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 8s trunk passed
+1 💚 mvnsite 1m 31s trunk passed
+1 💚 javadoc 1m 8s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 37s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 30s trunk passed
+1 💚 shadedclient 22m 38s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 17s the patch passed
+1 💚 compile 1m 18s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 18s the patch passed
+1 💚 compile 1m 13s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 13s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 54s the patch passed
+1 💚 mvnsite 1m 19s the patch passed
+1 💚 javadoc 0m 50s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 28s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 11s the patch passed
+1 💚 shadedclient 22m 21s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 203m 40s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 47s The patch does not generate ASF License warnings.
322m 41s
Reason Tests
Failed junit tests hadoop.hdfs.server.datanode.TestDirectoryScanner
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/13/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint
uname Linux 2c846b421ffd 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 6ecd8d9
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/13/testReport/
Max. process+thread count 3743 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/13/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 50s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 44m 25s trunk passed
+1 💚 compile 1m 30s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 1m 21s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 8s trunk passed
+1 💚 mvnsite 1m 30s trunk passed
+1 💚 javadoc 1m 9s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 29s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 37s trunk passed
+1 💚 shadedclient 25m 51s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 20s the patch passed
+1 💚 compile 1m 25s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 1m 25s the patch passed
+1 💚 compile 1m 16s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 1m 16s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 56s the patch passed
+1 💚 mvnsite 1m 23s the patch passed
+1 💚 javadoc 0m 54s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 1m 25s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 3m 27s the patch passed
+1 💚 shadedclient 25m 52s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 225m 12s /patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt hadoop-hdfs in the patch passed.
+1 💚 asflicense 1m 30s The patch does not generate ASF License warnings.
344m 45s
Reason Tests
Failed junit tests hadoop.hdfs.server.datanode.TestDirectoryScanner
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/12/artifact/out/Dockerfile
GITHUB PR #5451
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint
uname Linux 02a9dbcb10b1 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 6ecd8d9
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/12/testReport/
Max. process+thread count 2430 (vs. ulimit of 5500)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5451/12/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@virajjasani (Contributor Author)

@ayushtkn @jojochuang does the current state of the PR look good?

@jojochuang (Contributor)

Thanks Ayush, I'll merge it.

@jojochuang jojochuang merged commit b6a9d7b into apache:trunk Mar 17, 2023
ferdelyi pushed a commit to ferdelyi/hadoop that referenced this pull request May 26, 2023