Conversation

@GlenboLake (Contributor) commented Aug 27, 2024

Apache Airflow version

2.8.4 in my environment, but the issue is still present in main

What happened

A client submitted an S3 file to my workflow with an octothorpe in the filename, essentially s3://my-bucket/path/to/key/email campaign - PO# 123456_REPORT.csv. When my Airflow DAG tried to parse this URL, part of the filename was lost:

>>> from airflow.providers.amazon.aws.hooks.s3 import S3Hook
>>> s3_key = 's3://my-bucket/path/to/key/email campaign - PO# 123456_REPORT.csv'
>>> S3Hook.parse_s3_url(s3_key)
('my-bucket', 'path/to/key/email campaign - PO')

What you think should happen instead

The key should not be truncated. The result of the above example should be ('my-bucket', 'path/to/key/email campaign - PO# 123456_REPORT.csv')

How to reproduce

Call S3Hook.parse_s3_url() with a # character in the S3 URL. Everything after the # is lost because urllib.parse.urlsplit() is currently called with the default option allow_fragments=True.
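
For reference, the fragment handling can be seen with urlsplit alone. Continuing the session above (the output shown is what I would expect from the standard library, included here for illustration):

>>> from urllib.parse import urlsplit
>>> urlsplit(s3_key)
SplitResult(scheme='s3', netloc='my-bucket', path='/path/to/key/email campaign - PO', query='', fragment=' 123456_REPORT.csv')

Everything after the # ends up in the fragment field, which parse_s3_url does not use when building the key, so that part of the filename is silently dropped.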

This PR passes allow_fragments=False to urlsplit to prevent this error. As far as I can tell, there are no valid cases where a # in an S3 key should be treated as a URL fragment, and there is no existing GitHub issue tracking this.
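
As a rough, hypothetical sketch of the effect of that change (the helper name and simplification below are mine, not the provider's actual implementation):

>>> from urllib.parse import urlsplit
>>> def parse_s3_url_sketch(s3url):
...     # allow_fragments=False keeps '#' as a literal character in the key
...     parsed = urlsplit(s3url, allow_fragments=False)
...     return parsed.netloc, parsed.path.lstrip('/')
...
>>> parse_s3_url_sketch(s3_key)
('my-bucket', 'path/to/key/email campaign - PO# 123456_REPORT.csv')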

Operating System

Ubuntu 22.04

Versions of Apache Airflow Providers

Provider Version
apache-airflow-providers-amazon 8.20.0
apache-airflow-providers-celery 3.6.2
apache-airflow-providers-common-io 1.3.1
apache-airflow-providers-common-sql 1.12.0
apache-airflow-providers-ftp 3.8.0
apache-airflow-providers-http 4.10.1
apache-airflow-providers-imap 3.5.0
apache-airflow-providers-postgres 5.10.2
apache-airflow-providers-sendgrid 3.4.0
apache-airflow-providers-sftp 4.9.1
apache-airflow-providers-smtp 1.6.1
apache-airflow-providers-sqlite 3.7.1
apache-airflow-providers-ssh 3.10.1

Deployment

This may be reproduced without deploying

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

boring-cyborg bot added the area:providers and provider:amazon (AWS/Amazon - related issues) labels on Aug 27, 2024
boring-cyborg bot commented Aug 27, 2024

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contributors' Guide (https://github.com/apache/airflow/blob/main/contributing-docs/README.rst).
Here are some useful points:

  • Pay attention to the quality of your code (ruff, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide. Consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker image, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
  • Always keep your Pull Requests rebased, otherwise your build might fail due to changes not related to your commits.
Apache Airflow is a community-driven project and together we are making it better 🚀.
In case of doubt, contact the developers at:
Mailing List: dev@airflow.apache.org
Slack: https://s.apache.org/airflow-slack

@vincbeck (Contributor) left a comment

Love it!

@vincbeck (Contributor) commented:

Static checks are failing; running pre-commit should auto-resolve them.

GlenboLake force-pushed the main branch 2 times, most recently from 8fa89da to ca4aa16 on August 29, 2024
@eladkal (Contributor) commented Aug 29, 2024

static tests are failing

The current implementation of parse_s3_url will truncate a key if it contains
an octothorpe character. By passing the allow_fragments=False argument to
urlsplit, keys will be correctly parsed.
eladkal merged commit 062fb3a into apache:main on Aug 29, 2024
boring-cyborg bot commented Aug 29, 2024

Awesome work, congrats on your first merged pull request! You are invited to check our Issue Tracker for additional contributions.
