
HADOOP-18666. A whitelist of endpoints to skip Kerberos authentication doesn't work for ResourceManager and Job History Server #5480


Merged: 1 commit merged into apache:trunk on Mar 22, 2023

Conversation

@eubnara (Contributor) commented Mar 15, 2023

Description of PR

Thanks to HADOOP-16527, we can add a whitelist of endpoints that skip Kerberos authentication, such as /isActive, /jmx, /prom.
However, I found that ResourceManager and Job History Server don't respect hadoop.http.authentication.kerberos.endpoint.whitelist.

To work around this issue for ResourceManager, set yarn.resourcemanager.webapp.delegation-token-auth-filter.enabled=true in yarn-site.xml.
However, there is no workaround for Job History Server.

This bug is caused by HttpServer2#initSpnego being called without the proper configuration properties, i.e. those starting with "hadoop.http.authentication.".
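To make the last point concrete, below is a minimal, illustrative Java sketch (it is not the patch in this PR and not the exact code path inside AuthenticationFilterInitializer). It only shows that the endpoint whitelist lives under the "hadoop.http.authentication." prefix, which is exactly the set of properties a filter registered by a bare initSpnego call never receives; the class name and the use of Configuration#getPropsWithPrefix are mine for illustration.

import java.util.Map;

import org.apache.hadoop.conf.Configuration;

// Illustrative only: lists the properties that share the
// "hadoop.http.authentication." prefix, including the endpoint whitelist.
public class AuthFilterPrefixProbe {

  private static final String PREFIX = "hadoop.http.authentication.";

  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Example values; on a real cluster these come from core-site.xml.
    conf.set(PREFIX + "type", "kerberos");
    conf.set(PREFIX + "kerberos.endpoint.whitelist", "/isActive,/jmx,/prom");

    // Everything under the prefix is what the authentication filter needs to
    // receive as init parameters. A filter registered without them has no way
    // to know which endpoints should skip Kerberos authentication.
    Map<String, String> filterParams = conf.getPropsWithPrefix(PREFIX);
    filterParams.forEach((key, value) -> System.out.println(key + " = " + value));
  }
}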

How was this patch tested?

Manually tested in an internal cluster. It works for both ResourceManager (without yarn.resourcemanager.webapp.delegation-token-auth-filter.enabled=true set) and Job History Server.

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?

@eubnara (Contributor, Author) commented Mar 15, 2023

Without this patch, the whitelist is ignored. (I tested with Hadoop 3.3.4.)
[screenshot]

Even though hadoop.http.filter.initializers is set to org.apache.hadoop.security.AuthenticationFilterInitializer,org.apache.hadoop.security.HttpCrossOriginFilterInitializer, the HttpServer2#initSpnego call overwrites the AuthenticationFilter (because both registrations use the same filter name, "authentication"), so hadoop.http.authentication.kerberos.endpoint.whitelist is ignored.

On ResourceManager with configurations:

# core-site.xml
hadoop.http.filter.initializers=org.apache.hadoop.security.AuthenticationFilterInitializer,org.apache.hadoop.security.HttpCrossOriginFilterInitializer
hadoop.http.authentication.kerberos.endpoint.whitelist=/isActive,/jmx,/prom

...

# yarn-site.xml
yarn.resourcemanager.webapp.delegation-token-auth-filter.enabled=false

One curious thing is that if AuthenticationFilterInitializer is set in hadoop.http.filter.initializers in core-site.xml, the AuthenticationFilter is added twice unnecessarily, because initSpnego() adds this filter as well.
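To illustrate the name collision with a toy model (this is plain Java, not Hadoop or Jetty internals): when two filter registrations use the same name, the later one wins, and if the later one was built without the whitelist parameter, the whitelist is lost.

import java.util.HashMap;
import java.util.Map;

// Toy model of the "same filter name" collision; not Hadoop/Jetty internals.
public class FilterNameCollisionDemo {
  public static void main(String[] args) {
    // Pretend this is the server's registry of filters, keyed by filter name.
    Map<String, Map<String, String>> filtersByName = new HashMap<>();

    // First registration (via AuthenticationFilterInitializer): has the whitelist.
    Map<String, String> fromInitializer = new HashMap<>();
    fromInitializer.put("type", "kerberos");
    fromInitializer.put("kerberos.endpoint.whitelist", "/isActive,/jmx,/prom");
    filtersByName.put("authentication", fromInitializer);

    // Second registration (via initSpnego()): same name, no whitelist parameter.
    Map<String, String> fromInitSpnego = new HashMap<>();
    fromInitSpnego.put("type", "kerberos");
    filtersByName.put("authentication", fromInitSpnego);

    // The surviving configuration has lost the whitelist, so this prints "false".
    System.out.println(
        filtersByName.get("authentication").containsKey("kerberos.endpoint.whitelist"));
  }
}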

@eubnara (Contributor, Author) commented Mar 15, 2023

HDFS-16129 is not in hadoop 3.3.4. If you want to fix this issue in hadoop 3.3.4, try patch file on https://issues.apache.org/jira/browse/HADOOP-18666.

@tasanuma (Member) commented:

In my environment, hadoop.http.authentication.kerberos.endpoint.whitelist works for ResourceManager and Job History Server without this change. Did you set hadoop.http.authentication.type=kerberos?

@eubnara (Contributor, Author) commented Mar 15, 2023

As I mentioned, if you set yarn.resourcemanager.webapp.delegation-token-auth-filter.enabled=false, it won't work.

@eubnara (Contributor, Author) commented Mar 15, 2023

Hmm... I use Hadoop 3.3.4, and I set the configuration you mentioned, but it doesn't work.

@eubnara (Contributor, Author) commented Mar 15, 2023

Which version do you use?

@tasanuma (Member) commented:

I see. We are using 3.3.0 + many patches, which do not include HDFS-16129.

@eubnara (Contributor, Author) commented Mar 15, 2023

Yeah, the reason I mentioned https://issues.apache.org/jira/browse/HDFS-16129 is that the behavior without that patch will be different.
I'm not saying HDFS-16129 introduces this bug.

@eubnara (Contributor, Author) commented Mar 15, 2023

What is the value of hadoop.http.filter.initializers in your env?

@tasanuma (Member) commented:

hadoop.http.filter.initializers=org.apache.hadoop.security.AuthenticationFilterInitializer,org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.security.HttpCrossOriginFilterInitializer

@eubnara (Contributor, Author) commented Mar 15, 2023

Sorry, with this limited information it is hard to figure out why your version of Hadoop just works.
Let me try to reproduce this bug on vanilla Hadoop 3.3.0.

@eubnara (Contributor, Author) commented Mar 15, 2023

You can access the endpoints in the whitelist without Kerberos authentication with your modified version of Hadoop 3.3.0, right?

@tasanuma (Member) commented:

Yes, I can access only the whitelisted endpoints without Kerberos authentication.

my-jobhistoryserver configs:

<property name="hadoop.http.authentication.kerberos.endpoint.whitelist" value="/isActive,/jmx,/prom"/>
<property name="hadoop.http.authentication.kerberos.keytab" value="/path/to/spnego.keytab"/>
<property name="hadoop.http.authentication.kerberos.principal" value="HTTP/_HOST@MY_REALM"/>
<property name="hadoop.http.authentication.type" value="kerberos"/>
<property name="hadoop.http.filter.initializers" value="org.apache.hadoop.security.AuthenticationFilterInitializer,org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.security.HttpCrossOriginFilterInitializer"/>
$ curl -s http://my-jobhistoryserver:19888/jmx | head -n 3
{
  "beans" : [ {
    "name" : "Hadoop:service=JobHistoryServer,name=RpcActivityForPort10033",

$ curl -s http://my-jobhistoryserver:19888/conf | head
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Authentication required</title>
</head>
<body><h2>HTTP ERROR 401 Authentication required</h2>
<table>
<tr><th>URI:</th><td>/conf</td></tr>
<tr><th>STATUS:</th><td>401</td></tr>
<tr><th>MESSAGE:</th><td>Authentication required</td></tr>

@eubnara (Contributor, Author) commented Mar 15, 2023

Thanks for the reply. I'll try to reproduce it with vanilla Hadoop 3.3.0. 😢

@eubnara (Contributor, Author) commented Mar 15, 2023

HADOOP-17371 upgrades the Jetty version. I found that updateBeans(_filters, holders); may affect this behavior. (jetty/jetty.project@b1e08ba#diff-139accaf0a9751b1ec9461855e9e0f1d8fb9966c3c4685b8d6f8714add747a37R1551)

Maybe it makes the new AuthenticationFilter overwrite the previously added AuthenticationFilter?
(I'm not sure yet...)

@hadoop-yetus commented:

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 57s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
-1 ❌ test4tests 0m 0s The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.
_ trunk Compile Tests _
+1 💚 mvninstall 38m 42s trunk passed
+1 💚 compile 23m 12s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 20m 25s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 12s trunk passed
+1 💚 mvnsite 1m 42s trunk passed
+1 💚 javadoc 1m 15s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 50s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 44s trunk passed
+1 💚 shadedclient 22m 24s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 0s the patch passed
+1 💚 compile 22m 28s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 22m 28s the patch passed
+1 💚 compile 20m 32s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 20m 32s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 1m 4s /results-checkstyle-hadoop-common-project_hadoop-common.txt hadoop-common-project/hadoop-common: The patch generated 2 new + 51 unchanged - 0 fixed = 53 total (was 51)
+1 💚 mvnsite 1m 40s the patch passed
+1 💚 javadoc 1m 6s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 50s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 43s the patch passed
+1 💚 shadedclient 22m 16s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 27s hadoop-common in the patch passed.
+1 💚 asflicense 1m 1s The patch does not generate ASF License warnings.
207m 15s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/1/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux bc12818dfe93 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 5f39e69
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/1/testReport/
Max. process+thread count 1278 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/1/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@tasanuma (Member) commented:

Thanks for your investigation. Actually, my modified version already includes HADOOP-17371. But surely different Jetty versions could introduce the problem.

@eubnara (Contributor, Author) commented Mar 15, 2023

I tested Hadoop 3.3.4 with Jetty 9.4.20.v20190813, but this issue was still reproduced.

@eubnara (Contributor, Author) commented Mar 15, 2023

@tasanuma I tested on Hadoop 3.3.0 with some patches to fix build failures, but this issue was still reproduced.

core-site.xml
  <configuration  xmlns:xi="http://www.w3.org/2001/XInclude">
    
    <property>
      <name>fs.azure.user.agent.prefix</name>
      <value>User-Agent: APN/1.0 Hortonworks/1.0 HDP/None</value>
    </property>
    
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://ambari-agent-1.example.com:8020</value>
      <final>true</final>
    </property>
    
    <property>
      <name>fs.gs.application.name.suffix</name>
      <value> (GPN:Hortonworks; version 1.0) HDP/None</value>
    </property>
    
    <property>
      <name>fs.gs.path.encoding</name>
      <value>uri-path</value>
    </property>
    
    <property>
      <name>fs.gs.working.dir</name>
      <value>/</value>
    </property>
    
    <property>
      <name>fs.s3a.user.agent.prefix</name>
      <value>User-Agent: APN/1.0 Hortonworks/1.0 HDP/None</value>
    </property>
    
    <property>
      <name>fs.trash.interval</name>
      <value>360</value>
    </property>
    
    <property>
      <name>ha.failover-controller.active-standby-elector.zk.op.retries</name>
      <value>120</value>
    </property>
    
    <property>
      <name>hadoop.http.authentication.kerberos.endpoint.whitelist</name>
      <value>/isActive,/jmx,/prom</value>
    </property>
    
    <property>
      <name>hadoop.http.authentication.kerberos.keytab</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>
    
    <property>
      <name>hadoop.http.authentication.kerberos.principal</name>
      <value>HTTP/_HOST@EXAMPLE.COM</value>
    </property>
    
    <property>
      <name>hadoop.http.authentication.type</name>
      <value>kerberos</value>
    </property>
    
    <property>
      <name>hadoop.http.filter.initializers</name>
      <value>org.apache.hadoop.security.AuthenticationFilterInitializer,org.apache.hadoop.security.HttpCrossOriginFilterInitializer</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.*</name>
      <value>*</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.hdfs.groups</name>
      <value>*</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.hdfs.hosts</name>
      <value>*</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.HTTP.groups</name>
      <value>hadoop</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.root.groups</name>
      <value>*</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.root.hosts</name>
      <value>ambari-server.example.com</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.yarn.groups</name>
      <value>*</value>
    </property>
    
    <property>
      <name>hadoop.proxyuser.yarn.hosts</name>
      <value>ambari-agent-2.example.com</value>
    </property>
    
    <property>
      <name>hadoop.security.auth_to_local</name>
      <value>RULE:[1:$1@$0](ambari-qa-eub@EXAMPLE.COM)s/.*/ambari-qa/
RULE:[1:$1@$0](hdfs-eub@EXAMPLE.COM)s/.*/hdfs/
RULE:[1:$1@$0](.*@EXAMPLE.COM)s/@.*//
RULE:[2:$1@$0](dn@EXAMPLE.COM)s/.*/hdfs/
RULE:[2:$1@$0](jhs@EXAMPLE.COM)s/.*/mapred/
RULE:[2:$1@$0](nm@EXAMPLE.COM)s/.*/yarn/
RULE:[2:$1@$0](nn@EXAMPLE.COM)s/.*/hdfs/
RULE:[2:$1@$0](rm@EXAMPLE.COM)s/.*/yarn/
DEFAULT</value>
    </property>
    
    <property>
      <name>hadoop.security.authentication</name>
      <value>kerberos</value>
    </property>
    
    <property>
      <name>hadoop.security.authorization</name>
      <value>true</value>
    </property>
    
    <property>
      <name>io.compression.codecs</name>
      <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
    </property>
    
    <property>
      <name>io.file.buffer.size</name>
      <value>131072</value>
    </property>
    
    <property>
      <name>io.serializations</name>
      <value>org.apache.hadoop.io.serializer.WritableSerialization</value>
    </property>
    
    <property>
      <name>ipc.client.connect.max.retries</name>
      <value>50</value>
    </property>
    
    <property>
      <name>ipc.client.connection.maxidletime</name>
      <value>30000</value>
    </property>
    
    <property>
      <name>ipc.client.idlethreshold</name>
      <value>8000</value>
    </property>
    
    <property>
      <name>ipc.server.tcpnodelay</name>
      <value>true</value>
    </property>
    
    <property>
      <name>mapreduce.jobtracker.webinterface.trusted</name>
      <value>false</value>
    </property>
    
    <property>
      <name>net.topology.script.file.name</name>
      <value>/etc/hadoop/conf/topology_script.py</value>
    </property>
    
  </configuration>
yarn-site.xml
  <configuration  xmlns:xi="http://www.w3.org/2001/XInclude">
    
    <property>
      <name>hadoop.registry.secure</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hadoop.registry.system.accounts</name>
      <value>sasl:yarn,sasl:mapred,sasl:hadoop,sasl:hdfs,sasl:rm</value>
    </property>
    
    <property>
      <name>manage.include.files</name>
      <value>false</value>
    </property>
    
    <property>
      <name>yarn.acl.enable</name>
      <value>true</value>
    </property>
    
    <property>
      <name>yarn.admin.acl</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.application.classpath</name>
      <value>/etc/hadoop/conf,/usr/lib/hadoop/*,/usr/lib/hadoop/lib/*,/usr/lib/hadoop-hdfs/*,/usr/lib/hadoop-hdfs/lib/*,/usr/lib/hadoop-yarn/*,/usr/lib/hadoop-yarn/lib/*,/usr/lib/hadoop-mapreduce/*,/usr/lib/hadoop-mapreduce/lib/*</value>
    </property>
    
    <property>
      <name>yarn.http.policy</name>
      <value>HTTP_ONLY</value>
    </property>
    
    <property>
      <name>yarn.log-aggregation-enable</name>
      <value>true</value>
    </property>
    
    <property>
      <name>yarn.log-aggregation.retain-seconds</name>
      <value>2592000</value>
    </property>
    
    <property>
      <name>yarn.log.server.url</name>
      <value>http://ambari-agent-2.example.com:19888/jobhistory/logs</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.address</name>
      <value>0.0.0.0:45454</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.admin-env</name>
      <value>MALLOC_ARENA_MAX=$MALLOC_ARENA_MAX</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.aux-services</name>
      <value>mapreduce_shuffle</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
      <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.container-executor.class</name>
      <value>org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.container-monitor.interval-ms</name>
      <value>3000</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.delete.debug-delay-sec</name>
      <value>0</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.disk-health-checker.min-healthy-disks</name>
      <value>0.25</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.health-checker.interval-ms</name>
      <value>135000</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.health-checker.script.timeout-ms</name>
      <value>60000</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.keytab</name>
      <value>/etc/security/keytabs/nm.service.keytab</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.linux-container-executor.group</name>
      <value>hadoop</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.local-dirs</name>
      <value>/hadoop/yarn/local</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.log-aggregation.compression-type</name>
      <value>gz</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.log-dirs</name>
      <value>/hadoop/yarn/log</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.log.retain-seconds</name>
      <value>604800</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.principal</name>
      <value>nm/_HOST@EXAMPLE.COM</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.remote-app-log-dir</name>
      <value>/app-logs</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.remote-app-log-dir-suffix</name>
      <value>logs</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>12288</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.vmem-check-enabled</name>
      <value>false</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.vmem-pmem-ratio</name>
      <value>2.1</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.webapp.spnego-keytab-file</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>
    
    <property>
      <name>yarn.nodemanager.webapp.spnego-principal</name>
      <value>HTTP/_HOST@EXAMPLE.COM</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.address</name>
      <value>ambari-agent-2.example.com:8050</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.admin.address</name>
      <value>ambari-agent-2.example.com:8141</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.am.max-attempts</name>
      <value>2</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.hostname</name>
      <value>ambari-agent-2.example.com</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.keytab</name>
      <value>/etc/security/keytabs/rm.service.keytab</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.nodes.exclude-path</name>
      <value>/etc/hadoop/conf/yarn.exclude</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.principal</name>
      <value>rm/_HOST@EXAMPLE.COM</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.proxy-user-privileges.enabled</name>
      <value>true</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.proxyuser.*.groups</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.proxyuser.*.hosts</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.proxyuser.*.users</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.resource-tracker.address</name>
      <value>ambari-agent-2.example.com:8025</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.scheduler.address</name>
      <value>ambari-agent-2.example.com:8030</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.scheduler.class</name>
      <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.webapp.address</name>
      <value>ambari-agent-2.example.com:8088</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.webapp.delegation-token-auth-filter.enabled</name>
      <value>false</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.webapp.https.address</name>
      <value>ambari-agent-2.example.com:8090</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.webapp.spnego-keytab-file</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>
    
    <property>
      <name>yarn.resourcemanager.webapp.spnego-principal</name>
      <value>HTTP/_HOST@EXAMPLE.COM</value>
    </property>
    
    <property>
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>12288</value>
    </property>
    
    <property>
      <name>yarn.scheduler.minimum-allocation-mb</name>
      <value>4096</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.address</name>
      <value>localhost:10200</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.enabled</name>
      <value>false</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.active-dir</name>
      <value>/ats/active/</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.cleaner-interval-seconds</name>
      <value>3600</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.done-dir</name>
      <value>/ats/done/</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.group-id-plugin-classes</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.retain-seconds</name>
      <value>604800</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.scan-interval-seconds</name>
      <value>60</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.entity-group-fs-store.summary-store</name>
      <value>org.apache.hadoop.yarn.server.timeline.RollingLevelDBTimelineStore</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.generic-application-history.store-class</name>
      <value>org.apache.hadoop.yarn.server.applicationhistoryservice.NullApplicationHistoryStore</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.cookie.domain</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.cookie.path</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.kerberos.name.rules</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.proxyuser.*.groups</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.proxyuser.*.hosts</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.proxyuser.*.users</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.signature.secret</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.signature.secret.file</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.signer.secret.provider</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.signer.secret.provider.object</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.token.validity</name>
      <value></value>
    </property>
    
    <property>
      <name>yarn.timeline-service.http-authentication.type</name>
      <value>kerberos</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.leveldb-timeline-store.path</name>
      <value>/var/log/hadoop-yarn/timeline</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.leveldb-timeline-store.ttl-interval-ms</name>
      <value>300000</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.recovery.enabled</name>
      <value>true</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.store-class</name>
      <value>org.apache.hadoop.yarn.server.timeline.LeveldbTimelineStore</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.ttl-enable</name>
      <value>true</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.ttl-ms</name>
      <value>2678400000</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.webapp.address</name>
      <value>localhost:8188</value>
    </property>
    
    <property>
      <name>yarn.timeline-service.webapp.https.address</name>
      <value>localhost:8190</value>
    </property>
    
  </configuration>

The source code I used for testing is here: https://github.com/eubnara/hadoop/tree/eub-3.3.0.
I built it with the apache/bigtop project.

@eubnara (Contributor, Author) commented Mar 15, 2023

[root@ambari-agent-2 conf]# yum list installed | grep hadoop
hadoop.x86_64                       3.3.0-1.el7                 @BGTP-1.0-repo-1
hadoop-client.x86_64                3.3.0-1.el7                 @BGTP-1.0-repo-1
hadoop-hdfs.x86_64                  3.3.0-1.el7                 @BGTP-1.0-repo-1
hadoop-libhdfs.x86_64               3.3.0-1.el7                 @BGTP-1.0-repo-1
hadoop-mapreduce.x86_64             3.3.0-1.el7                 @BGTP-1.0-repo-1
hadoop-yarn.x86_64                  3.3.0-1.el7                 @BGTP-1.0-repo-1

RM

[root@ambari-agent-2 conf]# curl http://ambari-agent-2.example.com:8088/jmx -v
* About to connect() to ambari-agent-2.example.com port 8088 (#0)
*   Trying 172.20.0.2...
* Connected to ambari-agent-2.example.com (172.20.0.2) port 8088 (#0)
> GET /jmx HTTP/1.1
> User-Agent: curl/7.29.0
> Host: ambari-agent-2.example.com:8088
> Accept: */*
> 
< HTTP/1.1 401 Authentication required
< Date: Wed, 15 Mar 2023 22:07:43 GMT
< Date: Wed, 15 Mar 2023 22:07:43 GMT
< Pragma: no-cache
< X-Content-Type-Options: nosniff
< X-XSS-Protection: 1; mode=block
< WWW-Authenticate: Negotiate
< Set-Cookie: hadoop.auth=; HttpOnly
< Cache-Control: must-revalidate,no-cache,no-store
< Content-Type: text/html;charset=iso-8859-1
< Content-Length: 263
< 
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Authentication required</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /jmx. Reason:
<pre>    Authentication required</pre></p>
</body>
</html>
* Connection #0 to host ambari-agent-2.example.com left intact

JHS

[root@ambari-agent-2 conf]# curl http://ambari-agent-2.example.com:19888/jmx -v
* About to connect() to ambari-agent-2.example.com port 19888 (#0)
*   Trying 172.20.0.2...
* Connected to ambari-agent-2.example.com (172.20.0.2) port 19888 (#0)
> GET /jmx HTTP/1.1
> User-Agent: curl/7.29.0
> Host: ambari-agent-2.example.com:19888
> Accept: */*
> 
< HTTP/1.1 401 Authentication required
< Date: Wed, 15 Mar 2023 22:08:03 GMT
< Date: Wed, 15 Mar 2023 22:08:03 GMT
< Pragma: no-cache
< X-Content-Type-Options: nosniff
< X-XSS-Protection: 1; mode=block
< WWW-Authenticate: Negotiate
< Set-Cookie: hadoop.auth=; HttpOnly
< Cache-Control: must-revalidate,no-cache,no-store
< Content-Type: text/html;charset=iso-8859-1
< Content-Length: 263
< 
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Authentication required</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /jmx. Reason:
<pre>    Authentication required</pre></p>
</body>
</html>
* Connection #0 to host ambari-agent-2.example.com left intact

@eubnara (Contributor, Author) commented Mar 15, 2023

Sorry, without your full source code and configurations I cannot figure out why your version of Hadoop just works.

@tasanuma (Member) commented:

@eubnara Thanks for doing the test with 3.3.0! Hmm... your configurations seem good. I'm not sure of the cause.

In my environment (without this PR), the whitelist also works for ResourceManager even when yarn.resourcemanager.webapp.delegation-token-auth-filter.enabled=false is set.

Anyway, if the problem exists, we need to fix it. Could you create a unit test for this change?

@eubnara (Contributor, Author) commented Mar 17, 2023

@tasanuma I added a unit test. Thanks.

@eubnara (Contributor, Author) commented Mar 17, 2023

If I comment out the following lines, the unit test fails.

[screenshot]

[screenshot]
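For readers who cannot see the screenshots above, here is a rough sketch of the kind of check such a test performs. It is not the unit test added in this PR: the server URL, the use of JUnit 4, and the plain HttpURLConnection are assumptions for illustration, and it presumes a Kerberos-secured test HTTP server with hadoop.http.authentication.kerberos.endpoint.whitelist=/isActive,/jmx,/prom is already running.

import java.net.HttpURLConnection;
import java.net.URL;

import org.junit.Assert;
import org.junit.Test;

// Rough sketch only; serverBaseUrl is a placeholder for a test server.
public class EndpointWhitelistSketchTest {

  private final String serverBaseUrl = "http://localhost:8088";

  private int statusOf(String path) throws Exception {
    HttpURLConnection conn =
        (HttpURLConnection) new URL(serverBaseUrl + path).openConnection();
    conn.setRequestMethod("GET");
    return conn.getResponseCode();
  }

  @Test
  public void whitelistedEndpointSkipsKerberos() throws Exception {
    // A whitelisted endpoint should answer without any SPNEGO negotiation.
    Assert.assertEquals(HttpURLConnection.HTTP_OK, statusOf("/jmx"));
    // Any other endpoint should still demand authentication.
    Assert.assertEquals(HttpURLConnection.HTTP_UNAUTHORIZED, statusOf("/conf"));
  }
}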

@hadoop-yetus commented:

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 56s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
-1 ❌ mvninstall 38m 6s /branch-mvninstall-root.txt root in trunk failed.
+1 💚 compile 23m 18s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 20m 34s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 11s trunk passed
+1 💚 mvnsite 1m 41s trunk passed
+1 💚 javadoc 1m 16s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 50s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 41s trunk passed
+1 💚 shadedclient 23m 15s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 0s the patch passed
+1 💚 compile 22m 33s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 22m 33s the patch passed
+1 💚 compile 20m 38s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 20m 38s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 1m 5s /results-checkstyle-hadoop-common-project_hadoop-common.txt hadoop-common-project/hadoop-common: The patch generated 8 new + 51 unchanged - 0 fixed = 59 total (was 51)
+1 💚 mvnsite 1m 40s the patch passed
+1 💚 javadoc 1m 6s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 50s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 41s the patch passed
+1 💚 shadedclient 22m 30s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 22s hadoop-common in the patch passed.
+1 💚 asflicense 1m 0s The patch does not generate ASF License warnings.
207m 39s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/2/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 201843dbf2bf 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 564738b
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/2/testReport/
Max. process+thread count 3159 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/2/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@eubnara (Contributor, Author) commented Mar 17, 2023

I think this build failure is caused by a failure to download Node.js. It may not be related to this PR.
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/2/artifact/out/branch-mvninstall-root.txt

[INFO] Total time:  37:54 min
[INFO] Finished at: 2023-03-17T09:40:13Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal com.github.eirslett:frontend-maven-plugin:1.11.2:install-node-and-yarn (install node and yarn) on project hadoop-yarn-applications-catalog-webapp: Could not download Node.js: Got error code 404 from the server. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hadoop-yarn-applications-catalog-webapp

@tasanuma (Member) commented:

Thanks for creating the unit test, @eubnara. The changes mostly seem good to me.

  • In the unit test, could you use allowList and denyList for the variable names? Although we cannot change the name of hadoop.http.authentication.kerberos.endpoint.whitelist in order to keep backward compatibility, we are trying to remove non-inclusive terminology as much as possible. (HADOOP-17168)
  • Could you fix the checkstyle issues?

@eubnara (Contributor, Author) commented Mar 20, 2023

Thanks for the feedback! Thanks to you, I learned how to use checkstyle.xml in my IDE.
Actually, I hadn't noticed the checkstyle issues or the existence of checkstyle.xml.

@tasanuma (Member) commented:

Thanks for updating it. +1 if the CI result is ok.

If you are using IntelliJ, I also recommend using dev-support/code-formatter/hadoop_idea_formatter.xml. You can configure it under Preferences > Editor > Code Style.

@hadoop-yetus commented:

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 52s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 40m 2s trunk passed
+1 💚 compile 26m 49s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 23m 20s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 10s trunk passed
+1 💚 mvnsite 1m 37s trunk passed
+1 💚 javadoc 1m 11s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 51s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 47s trunk passed
+1 💚 shadedclient 22m 1s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 2s the patch passed
+1 💚 compile 25m 28s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 25m 28s the patch passed
+1 💚 compile 23m 11s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 23m 11s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 58s /results-checkstyle-hadoop-common-project_hadoop-common.txt hadoop-common-project/hadoop-common: The patch generated 2 new + 51 unchanged - 0 fixed = 53 total (was 51)
+1 💚 mvnsite 1m 40s the patch passed
+1 💚 javadoc 1m 6s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 48s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 47s the patch passed
+1 💚 shadedclient 22m 25s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 55s hadoop-common in the patch passed.
+1 💚 asflicense 1m 3s The patch does not generate ASF License warnings.
220m 19s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/4/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux cc00aaf8c2cc 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 05a035b
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/4/testReport/
Max. process+thread count 1278 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/4/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus commented:

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 54s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 39m 45s trunk passed
+1 💚 compile 26m 54s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 23m 27s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 7s trunk passed
+1 💚 mvnsite 1m 41s trunk passed
+1 💚 javadoc 1m 12s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 41s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 53s trunk passed
+1 💚 shadedclient 22m 15s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 2s the patch passed
+1 💚 compile 25m 50s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 25m 50s the patch passed
+1 💚 compile 23m 5s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 23m 5s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 1m 4s /results-checkstyle-hadoop-common-project_hadoop-common.txt hadoop-common-project/hadoop-common: The patch generated 2 new + 51 unchanged - 0 fixed = 53 total (was 51)
+1 💚 mvnsite 1m 45s the patch passed
+1 💚 javadoc 1m 7s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 48s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 51s the patch passed
+1 💚 shadedclient 22m 35s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 58s hadoop-common in the patch passed.
+1 💚 asflicense 1m 2s The patch does not generate ASF License warnings.
221m 4s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/3/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 7675c30478a5 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 05a035b
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/3/testReport/
Max. process+thread count 1278 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/3/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus commented:

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 12m 38s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 43m 3s trunk passed
+1 💚 compile 23m 4s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 20m 30s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 12s trunk passed
+1 💚 mvnsite 1m 43s trunk passed
+1 💚 javadoc 1m 15s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 50s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 42s trunk passed
+1 💚 shadedclient 22m 35s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 59s the patch passed
+1 💚 compile 22m 21s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 22m 21s the patch passed
+1 💚 compile 20m 30s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 20m 30s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 5s the patch passed
+1 💚 mvnsite 1m 40s the patch passed
+1 💚 javadoc 1m 3s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 51s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 40s the patch passed
+1 💚 shadedclient 22m 15s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 10s hadoop-common in the patch passed.
+1 💚 asflicense 1m 2s The patch does not generate ASF License warnings.
222m 28s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/5/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 43f2ecf51bf4 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 2f79e36
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/5/testReport/
Max. process+thread count 1291 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/5/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@eubnara (Contributor, Author) commented Mar 20, 2023

Wow, thanks for your kind advice!

Comment on lines 202 to 204
spnegoConf.set("hadoop.prometheus.endpoint.enabled", "true");
spnegoConf.set("hadoop.http.filter.initializers",
"org.apache.hadoop.security.AuthenticationFilterInitializer");
@tasanuma (Member) commented:

@eubnara Sorry, I have one more request. Could you use constants here?

Suggested change
spnegoConf.set("hadoop.prometheus.endpoint.enabled", "true");
spnegoConf.set("hadoop.http.filter.initializers",
"org.apache.hadoop.security.AuthenticationFilterInitializer");
spnegoConf.set(CommonConfigurationKeysPublic.HADOOP_PROMETHEUS_ENABLED, "true");
spnegoConf.set(FILTER_INITIALIZER_PROPERTY, AuthenticationFilterInitializer.class.getName());

@eubnara (Contributor, Author) replied:

You're right. Thanks.

Commit: HADOOP-18666. A whitelist of endpoints to skip Kerberos authentication doesn't work for ResourceManager and Job History Server
@hadoop-yetus commented:

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 46s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 49m 49s trunk passed
+1 💚 compile 24m 35s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 21m 56s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 6s trunk passed
+1 💚 mvnsite 1m 40s trunk passed
+1 💚 javadoc 1m 9s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 43s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 45s trunk passed
+1 💚 shadedclient 23m 51s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 0s the patch passed
+1 💚 compile 24m 2s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 24m 2s the patch passed
+1 💚 compile 22m 11s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 22m 12s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 59s the patch passed
+1 💚 mvnsite 1m 39s the patch passed
+1 💚 javadoc 1m 2s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 44s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 38s the patch passed
+1 💚 shadedclient 23m 45s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 39s hadoop-common in the patch passed.
+1 💚 asflicense 0m 57s The patch does not generate ASF License warnings.
225m 53s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/6/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux aedd8b734199 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / b051971
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/6/testReport/
Max. process+thread count 3159 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/6/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus commented:

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 36s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 48m 8s trunk passed
+1 💚 compile 23m 2s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 20m 29s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 13s trunk passed
+1 💚 mvnsite 1m 46s trunk passed
+1 💚 javadoc 1m 17s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 50s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 44s trunk passed
+1 💚 shadedclient 22m 30s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 1s the patch passed
+1 💚 compile 22m 20s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 22m 20s the patch passed
+1 💚 compile 20m 32s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 20m 32s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 6s the patch passed
+1 💚 mvnsite 1m 42s the patch passed
+1 💚 javadoc 1m 6s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 52s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 40s the patch passed
+1 💚 shadedclient 22m 4s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 13s hadoop-common in the patch passed.
+1 💚 asflicense 1m 1s The patch does not generate ASF License warnings.
215m 41s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/8/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux a11b1536447e 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 61b8153
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/8/testReport/
Max. process+thread count 1705 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/8/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus commented:

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 40s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 1s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 1 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 51m 45s trunk passed
+1 💚 compile 25m 0s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 compile 21m 27s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 checkstyle 1m 25s trunk passed
+1 💚 mvnsite 1m 42s trunk passed
+1 💚 javadoc 1m 9s trunk passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 43s trunk passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 50s trunk passed
+1 💚 shadedclient 23m 59s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 3s the patch passed
+1 💚 compile 24m 32s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javac 24m 32s the patch passed
+1 💚 compile 21m 20s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 javac 21m 20s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 5s the patch passed
+1 💚 mvnsite 1m 41s the patch passed
+1 💚 javadoc 1m 5s the patch passed with JDK Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1
+1 💚 javadoc 0m 48s the patch passed with JDK Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
+1 💚 spotbugs 2m 52s the patch passed
+1 💚 shadedclient 22m 35s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 18m 11s hadoop-common in the patch passed.
+1 💚 asflicense 1m 1s The patch does not generate ASF License warnings.
226m 49s
Subsystem Report/Notes
Docker ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/7/artifact/out/Dockerfile
GITHUB PR #5480
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux be00764bb402 4.15.0-206-generic #217-Ubuntu SMP Fri Feb 3 19:10:13 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 61b8153
Default Java Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.18+10-post-Ubuntu-0ubuntu120.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_362-8u362-ga-0ubuntu1~20.04.1-b09
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/7/testReport/
Max. process+thread count 2299 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5480/7/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@tasanuma (Member) left a comment:

LGTM.

tasanuma merged commit 67e02a9 into apache:trunk on Mar 22, 2023

@tasanuma (Member) commented:

Merged it. Thanks for your contribution, @eubnara!

@eubnara (Contributor, Author) commented Mar 22, 2023

@tasanuma Thanks for your feedback!

ferdelyi pushed a commit to ferdelyi/hadoop that referenced this pull request May 26, 2023
HADOOP-18666. A whitelist of endpoints to skip Kerberos authentication doesn't work for ResourceManager and Job History Server (apache#5480)