Description
Apache Airflow version
2.7.2
What happened
Currently I am using:
Python version: 3.8.18
Airflow version: 2.7.1 (on K8s)
When we run the SparkKubernetesOperator with a YAML file path, it works fine. But when we run the same operator after reading the YAML file into memory and passing its content, it fails.
Example -
import yaml

def load_template_v1():
    ....
    with open(template_path) as pyspark_template_file:
        template = yaml.safe_load(pyspark_template_file)
    return template

What you think should happen instead
It should launch the Spark application successfully, as it does when a YAML file path is passed.
How to reproduce
Instead of passing the YAML file path to SparkKubernetesOperator, pass the parsed content of the file (see the sketch below).
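A minimal DAG sketch of the two variants, assuming a SparkApplication spec file pyspark_pi.yaml in the DAG folder; the file name, dag_id, and task names are hypothetical, and the operator is the cncf.kubernetes provider's SparkKubernetesOperator with its application_file argument:

import pendulum
import yaml
from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator

# Hypothetical path to the SparkApplication YAML spec
template_path = "pyspark_pi.yaml"

def load_template_v1():
    # Read the SparkApplication spec and return it as a Python dict
    with open(template_path) as pyspark_template_file:
        return yaml.safe_load(pyspark_template_file)

with DAG(
    dag_id="spark_k8s_repro",
    start_date=pendulum.datetime(2023, 1, 1),
    schedule=None,
) as dag:
    # Variant 1 - works: pass the YAML file path
    submit_from_path = SparkKubernetesOperator(
        task_id="submit_from_path",
        namespace="spark",
        application_file=template_path,
    )

    # Variant 2 - fails: pass the parsed YAML content instead of the file path
    submit_from_content = SparkKubernetesOperator(
        task_id="submit_from_content",
        namespace="spark",
        application_file=load_template_v1(),
    )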
Operating System
K8S
Versions of Apache Airflow Providers
Python version: 3.8.18
Airflow version: 2.7.1
Deployment
Official Apache Airflow Helm Chart
Deployment details
Helm
Python version: 3.8.18
Airflow version: 2.7.1
Anything else
No
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct