
Bug fix in AWS glue operator when specifying the WorkerType & NumberOfWorkers #19787

Merged (9 commits) on Dec 6, 2021

Conversation

@Ritika-Singhal (Contributor) commented Nov 23, 2021


Existing issue:

closes: #19207

This commit fixes a bug in the AWS Glue job operator. When WorkerType and NumberOfWorkers are passed as parameters in create_job_kwargs, Glue job creation fails, because the AWS Glue API does not allow the "AllocatedCapacity" or "MaxCapacity" parameters to be used when 'WorkerType' and 'NumberOfWorkers' are assigned. That is the issue addressed and fixed in this commit.

I have removed the "AllocatedCapacity" and "MaxCapacity" parameters from the glue_client.create_job function call when WorkerType and NumberOfWorkers are specified.
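
For context, here is a minimal standalone sketch of the idea behind the fix, assuming a boto3 Glue client; the helper name and parameters are illustrative, not the provider's actual code:

import boto3

def create_glue_job(job_name, role, script_location, num_of_dpus=10, create_job_kwargs=None):
    # Hypothetical helper, for illustration only.
    create_job_kwargs = create_job_kwargs or {}
    config = {
        "Name": job_name,
        "Role": role,
        "Command": {"Name": "glueetl", "ScriptLocation": script_location},
        **create_job_kwargs,
    }
    # The Glue API rejects AllocatedCapacity/MaxCapacity when WorkerType and
    # NumberOfWorkers are present, so only send a capacity value when they are absent.
    if "WorkerType" not in create_job_kwargs and "NumberOfWorkers" not in create_job_kwargs:
        config["MaxCapacity"] = num_of_dpus
    return boto3.client("glue").create_job(**config)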

@boring-cyborg bot added the area:providers and provider:amazon-aws (AWS/Amazon - related issues) labels Nov 23, 2021
@boring-cyborg bot commented Nov 23, 2021

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
Here are some useful points:

  • Pay attention to the quality of your code (flake8, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide. Consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: dev@airflow.apache.org
    Slack: https://s.apache.org/airflow-slack

@uranusjr (Member) commented:

Instead of silently dropping num_of_dpus, I feel we should explicitly fail with ValueError in __init__.

@Ritika-Singhal (Contributor, author) commented:

> Instead of silently dropping num_of_dpus, I feel we should explicitly fail with ValueError in __init__.

In the AwsGlueJobHook __init__, num_of_dpus is initialized to 10 by default when the user does not specify it. To throw a ValueError, we would need to remove that default initialization (otherwise the error would always be thrown, because num_of_dpus would always have a valid value), and we would need to check that the user specified exactly one of the two options, i.e. either num_of_dpus, or WorkerType together with NumberOfWorkers, but not both. The default initialization of num_of_dpus keeps the job from failing when the user specifies neither.

Currently, num_of_dpus works as the default option in the else block when the user does not specify WorkerType and NumberOfWorkers, so it is not completely dropped.

@uranusjr (Member) commented:

We can do something like

def __init__(
    self,
    ...,
    num_of_dpus: Optional[int] = None,
    ...,
) -> None:
    create_job_kwargs = create_job_kwargs or {}
    if "WorkerType" in create_job_kwargs and "NumberOfWorkers" in create_job_kwargs:
        if num_of_dpus is not None:
            raise ValueError("Cannot specify num_of_dpus with custom WorkerType")
    elif num_of_dpus is None:
        self.num_of_dpus = 10
    else:
        self.num_of_dpus = num_of_dpus
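
For illustration, here is a self-contained toy version of that guard (simplified names, not the hook itself), showing how the three cases behave:

from typing import Optional

def resolve_num_of_dpus(create_job_kwargs: Optional[dict], num_of_dpus: Optional[int]) -> Optional[int]:
    # Toy version of the proposed __init__ guard, for illustration only.
    create_job_kwargs = create_job_kwargs or {}
    if "WorkerType" in create_job_kwargs and "NumberOfWorkers" in create_job_kwargs:
        if num_of_dpus is not None:
            raise ValueError("Cannot specify num_of_dpus with custom WorkerType")
        return None  # capacity is fully described by WorkerType/NumberOfWorkers
    return 10 if num_of_dpus is None else num_of_dpus

assert resolve_num_of_dpus(None, None) == 10  # neither given: default of 10
assert resolve_num_of_dpus({"WorkerType": "G.1X", "NumberOfWorkers": 2}, None) is None
# resolve_num_of_dpus({"WorkerType": "G.1X", "NumberOfWorkers": 2}, 5) raises ValueError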

@Ritika-Singhal (Contributor, author) commented:

I have committed the suggested change, i.e. raising ValueError in __init__, and added some additional checks for WorkerType and NumberOfWorkers.
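
The exact code is in the PR diff; as a hedged sketch, the additional checks plausibly enforce that WorkerType and NumberOfWorkers are provided together, along these lines (not necessarily the committed code):

def validate_worker_config(create_job_kwargs: dict, num_of_dpus) -> None:
    # Hedged sketch of the extra validation; the merged diff may differ.
    worker_type_set = "WorkerType" in create_job_kwargs
    num_workers_set = "NumberOfWorkers" in create_job_kwargs
    if worker_type_set and num_workers_set:
        if num_of_dpus is not None:
            raise ValueError("Cannot specify num_of_dpus with custom WorkerType")
    elif worker_type_set or num_workers_set:
        # One without the other is likely a mistake, so fail fast.
        raise ValueError("WorkerType and NumberOfWorkers must be specified together")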

@uranusjr (Member) left a review comment:

If the CI passes... (just kicked it to start)

@github-actions bot added the 'okay to merge' label (It's ok to merge this PR as it does not require more tests) Nov 24, 2021
@github-actions bot commented:

The PR is likely OK to be merged with just a subset of tests for the default Python and database versions, without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full test matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest main or amend the last commit of the PR and push it with --force-with-lease.

@potiuk merged commit 6e15e3a into apache:main Dec 6, 2021
@boring-cyborg bot commented Dec 6, 2021

Awesome work, congrats on your first merged pull request!
