[SPARK-16856] [WEBUI] [CORE] Link the application's executor page to the master's UI #14461
Conversation
Test build #63129 has finished for PR 14461 at commit
I'll take some time to check out and run this later today, but on a quick pass the code looks OK. I have some issues with the actual PR though. The link on the application summary page is redundant; clicking the Spark logo already goes back to the master summary. As for the Executors page, I'll have to test it, but I think there are quite a few situations where an admin might not want a user to access the master UI, even in a standalone cluster.
I forgot to comment, but I did check this out and it runs well. I still think the link on the app summary is redundant though.
Test build #63302 has finished for PR 14461 at commit
# Conflicts:
#   project/MimaExcludes.scala
@ajbozarth Thanks for pointing that out. I have removed the link on the application summary page and reverted the related changes.
retest it please
Test build #63303 has finished for PR 14461 at commit
Test build #63306 has finished for PR 14461 at commit
I think I have fixed the conflict correctly. Why's that?
The excludes are a list; you forgot to add commas between them.
Oh yes. Thanks :)
Test build #63308 has finished for PR 14461 at commit
@nblintao I'd like to take another look at this, mind rebasing?
@ajbozarth I finally have a chance to rebase it during the winter break. Could you please have a look? Thanks!
Test build #70698 has finished for PR 14461 at commit
Thanks for rebasing, I'll check this out sometime this week
@nblintao did you try this code on different cluster types, like local, standalone, YARN, and Mesos?
I did some testing and this properly shows up on local and standalone, and doesn't show the link on YARN. So this LGTM.
@@ -126,7 +126,8 @@ case class SparkListenerApplicationStart(
     time: Long,
     sparkUser: String,
     appAttemptId: Option[String],
-    driverLogs: Option[Map[String, String]] = None) extends SparkListenerEvent
+    driverLogs: Option[Map[String, String]] = None,
+    masterWebUiUrl: Option[String] = None) extends SparkListenerEvent
Do you need this? It seems the standalone registration messages already take care of this.
If you want this, are you intentionally not saving this in event logs? The way it is, this is only ever useful when the application is starting but hasn't yet registered with the master. That's a very short period of time and probably not worth it.
If you add code to save this to event logs, I'd rename it to be more cluster-manager-agnostic, e.g. "clusterWebUiUrl".
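For illustration, a minimal sketch of what that rename could look like, assuming only the field name changes (clusterWebUiUrl is just the reviewer's example, not code from this PR; SparkListenerEvent is the existing marker trait in org.apache.spark.scheduler):

```scala
// Sketch only: the application-start event with a cluster-manager-agnostic
// field name in place of masterWebUiUrl, as suggested in the review.
case class SparkListenerApplicationStart(
    appName: String,
    appId: Option[String],
    time: Long,
    sparkUser: String,
    appAttemptId: Option[String],
    driverLogs: Option[Map[String, String]] = None,
    clusterWebUiUrl: Option[String] = None) // hypothetical rename
  extends SparkListenerEvent
```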
@@ -50,6 +50,10 @@ private[ui] class ExecutorsPage(
   private val listener = parent.listener

   def render(request: HttpServletRequest): Seq[Node] = {
+    val masterWebUiUrl = listener.masterWebUiUrl
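For context, a rough sketch of how the page might turn that value into a "Back to Master" link (the helper name and markup below are illustrative, not the PR's actual code; it assumes masterWebUiUrl is None on cluster managers other than standalone):

```scala
import scala.xml.Node

// Sketch only: produce the optional "Back to Master" link; an empty Seq means
// nothing is rendered (e.g. on YARN, where no master web UI URL is available).
def backToMasterLink(masterWebUiUrl: Option[String]): Seq[Node] =
  masterWebUiUrl.map { url =>
    <p><a href={url}>Back to Master</a></p>
  }.toSeq
```

The resulting Seq[Node] can then simply be concatenated with the rest of the content that render returns.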
I think this information makes way more sense in the "Environment" tab.
(gentle ping @nblintao)
@HyukjinKwon Thanks for the reminder. I will have a look ASAP; too many things to do recently.
(gentle ping @nblintao)
@HyukjinKwon Thanks for the reminder. I have finals this week; I will fix this after the 15th.
What changes were proposed in this pull request?
Added a "Back to Master" link on Executor Page of Application Detail UI, if the application is run in standalone mode.
The links link to the main page of Master UI.
How was this patch tested?
Run Spark as a "local cluster" with
./bin/spark-shell --master=local-cluster[2,1,1024]
Then view the Executors page of the application detail UI (http://192.168.1.104:4040/executors/). Users can go to the Master page via the "Back to Master" link on that page.