Replies: 2 comments
-
You didn't reply on Slack.
-
Hello @tooptoop4, I was able to solve this problem. It was indeed the issue with the Unix principal when Spark runs in Docker containers. I had already tried adding a username in Docker, along with the other suggestions on Stack Overflow, but nothing seemed to work. I got a hint from reading this: it looked like the Argo container was not able to provide a username to the Spark Docker container. Therefore I added the following to the Argo YAML under the template's container:
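The exact snippet wasn't preserved in this thread. As a sketch, assuming the fix was to supply a username through environment variables (`HADOOP_USER_NAME` is what Hadoop's `UserGroupInformation` consults, and `SPARK_USER` is its Spark counterpart), the addition could look like this; the template and image names are hypothetical:

```yaml
# Hypothetical reconstruction -- the original snippet was not preserved.
# Hadoop falls back to these variables when the container's UID has no
# /etc/passwd entry, which avoids the failing local-user lookup.
templates:
  - name: run-pyspark                # hypothetical template name
    container:
      image: my-pyspark-app:latest   # hypothetical image
      env:
        - name: HADOOP_USER_NAME     # read by Hadoop's UserGroupInformation
          value: spark
        - name: SPARK_USER           # read by Spark when resolving the current user
          value: spark
```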
Adding this solved the issue.
-
I searched the web a lot and tried to debug this issue, unfortunately in vain.

I have created a simple `pyspark` application (dockerized) which I am trying to run in Argo Workflows. The application just creates a dataframe and prints it (that is it), and it runs properly when I deploy it manually in the Kubernetes cluster. However, when I run the same Docker image in the same namespace and cluster using Argo Workflows, I get this `KerberosAuthException`. Can anyone point me in the right direction? There is no use of `kerberos` in my application at all. As I said, this only happens when I run it via Argo; otherwise the application runs perfectly standalone in Kubernetes. Any help is appreciated!
Pod description when running manually through k8s (successful):
Pod description when running through Argo Workflows (exception):
Python code:
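The code block itself was not preserved here. A minimal sketch of the application as described above (create a dataframe and print it), with an assumed app name and sample rows:

```python
from pyspark.sql import SparkSession

# Build (or reuse) a SparkSession; the app name is an assumption.
spark = SparkSession.builder.appName("argo-pyspark-demo").getOrCreate()

# Create a small dataframe and print it -- the entire job, as described above.
df = spark.createDataFrame([(1, "foo"), (2, "bar")], ["id", "value"])
df.show()

spark.stop()
```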