
Conversation

@gopidesupavan
Member


^ Add meaningful description above
Read the Pull Request Guidelines for more information.
In case of fundamental code changes, an Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in a newsfragment file, named {pr_number}.significant.rst or {issue_number}.significant.rst, in airflow-core/newsfragments.

@gopidesupavan
Member Author

Will extract this into multiple PRs; that will make it easier to iterate: core, task-sdk, and providers (maybe multiple PRs).

@gopidesupavan changed the title from "Bump mypy to 1.16.1 and fix mypy core errors" to "Bump mypy to 1.16.1" on Jul 8, 2025
@gopidesupavan
Member Author

Have extracted the airflow-core changes here: #53004

@gopidesupavan requested a review from kaxil as a code owner on July 8, 2025 21:38
@gopidesupavan
Member Author

Extracted the task-sdk mypy fixes here: #53047

@gopidesupavan
Member Author

Amazon provider: #53088

@gopidesupavan force-pushed the bump-mypy-core branch 2 times, most recently from 3586f2d to 620b4be on July 9, 2025 19:45
@gopidesupavan
Member Author

Provider mypy fixes required:

providers/elasticsearch/src/airflow/providers/elasticsearch/version_compat.py:48: error:
Incompatible types in assignment (expression has type
"type[list[tuple[str, str]]]", variable has type
"UnionType[list[StructuredLogMessage], str]")  [assignment]
        EsLogMsgType = list[tuple[str, str]]  # type: ignore[misc]
                       ^~~~~~~~~~~~~~~~~~~~~
providers/elasticsearch/src/airflow/providers/elasticsearch/version_compat.py:48: note: Error code "assignment" not covered by "type: ignore" comment
providers/edge3/src/airflow/providers/edge3/models/edge_worker.py:113: error:
Incompatible return value type (got "Any | None", expected "dict[Any, Any]") 
[return-value]
            return json.loads(self.sysinfo) if self.sysinfo else None
                   ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/apache/cassandra/src/airflow/providers/apache/cassandra/hooks/cassandra.py:105: error:
Argument 1 to "get_lb_policy" of "CassandraHook" has incompatible type
"Any | None"; expected "str"  [arg-type]
            lb_policy = self.get_lb_policy(policy_name, policy_args)
                                           ^~~~~~~~~~~
providers/keycloak/src/airflow/providers/keycloak/auth_manager/keycloak_auth_manager.py:211: error:
Redundant cast to "Literal['GET', 'POST', 'PUT', 'DELETE', 'MENU']" 
[redundant-cast]
                    (cast("ExtendedResourceMethod", "MENU"), menu_item.val...
                     ^
providers/http/src/airflow/providers/http/hooks/http.py:238: error:
Incompatible types in assignment (expression has type "Any | None", variable has
type "int")  [assignment]
            session.max_redirects = self.merged_extra.get("max_redirects",...
                                    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~...
providers/samba/src/airflow/providers/samba/hooks/samba.py:260: error: Missing
positional argument "path" in call to "__call__" of "_Wrapped"  [call-arg]
            with open(local_filepath, "rb") as f, self.open_file(destinati...
                                                  ^~~~~~~~~~~~~~~~~~~~~~~~...
providers/samba/src/airflow/providers/samba/hooks/samba.py:260: error: Argument
1 to "__call__" of "_Wrapped" has incompatible type "str"; expected "SambaHook" 
[arg-type]
    ... open(local_filepath, "rb") as f, self.open_file(destination_filepath,...
                                                        ^~~~~~~~~~~~~~~~~~~~
providers/cncf/kubernetes/src/airflow/providers/cncf/kubernetes/operators/resource.py:111: error:
Incompatible types in assignment (expression has type "Any | None", variable has
type "dict[Any, Any]")  [assignment]
                metadata: dict = body.get("metadata", None)
                                 ^~~~~~~~~~~~~~~~~~~~~~~~~~
providers/opensearch/src/airflow/providers/opensearch/log/os_task_handler.py:55: error:
Incompatible types in assignment (expression has type
"type[list[tuple[str, str]]]", variable has type
"UnionType[list[StructuredLogMessage], str]")  [assignment]
        OsLogMsgType = list[tuple[str, str]]  # type: ignore[misc]
                       ^~~~~~~~~~~~~~~~~~~~~
providers/opensearch/src/airflow/providers/opensearch/log/os_task_handler.py:55: note: Error code "assignment" not covered by "type: ignore" comment
providers/google/src/airflow/providers/google/cloud/links/kubernetes_engine.py:62: error:
Item "None" of "Any | None" has no attribute "name"  [union-attr]
                cluster_name=cluster.name,
                             ^~~~~~~~~~~~
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:265: error:
Incompatible types in assignment (expression has type "int", variable has type
"VerificationStatus")  [assignment]
                    monitoring_v3.NotificationChannel.VerificationStatus.V...
                    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~...
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:277: error:
Incompatible types in assignment (expression has type "None", variable has type
"str")  [assignment]
                    channel.name = None
                                   ^~~~
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:287: error:
Incompatible types in assignment (expression has type "None", variable has type
"MutationRecord")  [assignment]
                policy.creation_record = None
                                         ^~~~
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:288: error:
Incompatible types in assignment (expression has type "None", variable has type
"MutationRecord")  [assignment]
                policy.mutation_record = None
                                         ^~~~
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:304: error:
Incompatible types in assignment (expression has type "None", variable has type
"str")  [assignment]
                    policy.name = None
                                  ^~~~
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:306: error:
Incompatible types in assignment (expression has type "None", variable has type
"str")  [assignment]
                        condition.name = None
                                         ^~~~
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:535: error:
Incompatible types in assignment (expression has type "int", variable has type
"VerificationStatus")  [assignment]
                    monitoring_v3.NotificationChannel.VerificationStatus.V...
                    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~...
providers/google/src/airflow/providers/google/cloud/hooks/stackdriver.py:547: error:
Incompatible types in assignment (expression has type "None", variable has type
"str")  [assignment]
                    channel.name = None
                                   ^~~~
providers/samba/src/airflow/providers/samba/transfers/gcs_to_samba.py:200: error:
Missing positional argument "path" in call to "__call__" of "_Wrapped" 
[call-arg]
            samba_hook.makedirs(dir_path, exist_ok=True)
            ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/samba/src/airflow/providers/samba/transfers/gcs_to_samba.py:200: error:
Argument 1 to "__call__" of "_Wrapped" has incompatible type "str"; expected
"SambaHook"  [arg-type]
            samba_hook.makedirs(dir_path, exist_ok=True)
                                ^~~~~~~~
providers/google/src/airflow/providers/google/cloud/sensors/datafusion.py:135: error:
Unsupported operand types for in ("Any | None" and "Iterable[str]")  [operator]
            return pipeline_status in self.expected_statuses
                   ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/openlineage/src/airflow/providers/openlineage/plugins/listener.py:108: error:
Incompatible redefinition (redefinition with type
"Callable[[OpenLineageListener, TaskInstanceState, TaskInstance, Any], None]",
original type
"Callable[[OpenLineageListener, TaskInstanceState, RuntimeTaskInstance], Any]") 
[misc]
            def on_task_instance_running(
            ^
providers/openlineage/src/airflow/providers/openlineage/plugins/listener.py:260: error:
Incompatible redefinition (redefinition with type
"Callable[[OpenLineageListener, TaskInstanceState, TaskInstance, Any], None]",
original type
"Callable[[OpenLineageListener, TaskInstanceState, RuntimeTaskInstance | TaskInstance], None]")
 [misc]
            def on_task_instance_success(
            ^
providers/openlineage/src/airflow/providers/openlineage/plugins/listener.py:389: error:
Incompatible redefinition (redefinition with type
"Callable[[OpenLineageListener, TaskInstanceState, TaskInstance, str | BaseException | None, Any], None]",
original type
"Callable[[OpenLineageListener, TaskInstanceState, RuntimeTaskInstance | TaskInstance, str | BaseException | None], None]")
 [misc]
            def on_task_instance_failed(
            ^
providers/google/src/airflow/providers/google/cloud/sensors/dataflow.py:363: error:
Incompatible return value type (got "list[dict[Any, Any]] | Any", expected
"bool")  [return-value]
            return result if self.callback is None else self.callback(resu...
                   ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/google/src/airflow/providers/google/cloud/sensors/dataflow.py:485: error:
Incompatible return value type (got "list[dict[Any, Any]] | Any", expected
"bool")  [return-value]
            return result if self.callback is None else self.callback(resu...
                   ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/apache/kafka/src/airflow/providers/apache/kafka/operators/consume.py:137: error:
"str" not callable  [operator]
                apply_callable = partial(
                                 ^
providers/apache/kafka/src/airflow/providers/apache/kafka/operators/consume.py:144: error:
"str" not callable  [operator]
                apply_callable = partial(
                                 ^
providers/apache/kafka/src/airflow/providers/apache/kafka/operators/produce.py:119: error:
"str" not callable  [operator]
            producer_callable = partial(
                                ^
providers/google/src/airflow/providers/google/cloud/triggers/bigquery.py:594: error:
Incompatible types in assignment (expression has type "Any | None", variable has
type "list[Any]")  [assignment]
                        records = records.pop(0) if records else None
                                  ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
providers/google/src/airflow/providers/google/cloud/hooks/cloud_sql.py:1106: error:
Unsupported left operand type for + ("None")  [operator]
            return self.project_id + ":" + self.location + ":" + self.inst...
                   ^~~~~~~~~~~~~~~~~~~~~
providers/google/src/airflow/providers/google/cloud/hooks/cloud_sql.py:1106: note: Left operand is of type "Any | None"
providers/google/src/airflow/providers/google/cloud/hooks/cloud_sql.py:1141: error:
Argument "project_id" to "CloudSqlProxyRunner" has incompatible type
"Any | None"; expected "str"  [arg-type]
                project_id=self.project_id,
                           ^~~~~~~~~~~~~~~
providers/teradata/src/airflow/providers/teradata/operators/teradata_compute_cluster.py:271: error:
Call to abstract method "execute" of "_TeradataComputeClusterOperator" with
trivial body via super() is unsafe  [safe-super]
            super().execute(context)
            ^~~~~~~~~~~~~~~
providers/teradata/src/airflow/providers/teradata/operators/teradata_compute_cluster.py:359: error:
Call to abstract method "execute" of "_TeradataComputeClusterOperator" with
trivial body via super() is unsafe  [safe-super]
            super().execute(context)
            ^~~~~~~~~~~~~~~
providers/teradata/src/airflow/providers/teradata/operators/teradata_compute_cluster.py:420: error:
Call to abstract method "execute" of "_TeradataComputeClusterOperator" with
trivial body via super() is unsafe  [safe-super]
            super().execute(context)
            ^~~~~~~~~~~~~~~
providers/teradata/src/airflow/providers/teradata/operators/teradata_compute_cluster.py:491: error:
Call to abstract method "execute" of "_TeradataComputeClusterOperator" with
trivial body via super() is unsafe  [safe-super]
            super().execute(context)
            ^~~~~~~~~~~~~~~
providers/google/src/airflow/providers/google/cloud/operators/bigquery.py:1163: error:
List comprehension has incompatible type List[Any | list[Any]]; expected
List[dict[str, Any]]  [misc]
                    table_data = [row.values() if isinstance(row, Row) els...
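
For illustration, a minimal sketch of the pattern that resolves the recurring "Any | None" errors above (assignment, arg-type, return-value, operator), where an Optional value flows into a position typed as dict, str, or list. The helper names and signatures here are hypothetical and simplified, not the actual changes made for these providers:

    from __future__ import annotations

    import json
    from typing import Any


    def parse_sysinfo(sysinfo: str | None) -> dict[Any, Any]:
        # Before: `return json.loads(sysinfo) if sysinfo else None` -- mypy reports
        # [return-value] because None is not a dict[Any, Any]. Narrow the Optional
        # instead of letting None leak through.
        if not sysinfo:
            return {}
        return json.loads(sysinfo)


    def instance_specification(project_id: str | None, location: str | None, instance: str | None) -> str:
        # Before: `return project_id + ":" + location + ":" + instance` -- mypy
        # reports [operator] because each operand may be None.
        if project_id is None or location is None or instance is None:
            raise ValueError("project_id, location and instance must all be set")
        return f"{project_id}:{location}:{instance}"

The remaining categories (safe-super, redundant-cast, the uncovered "type: ignore" error codes) are narrower: typically the redundant cast or super() call is dropped, or the ignored error codes are extended to match what mypy now reports.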

@gopidesupavan
Member Author

cc: @jscheffl

@gopidesupavan
Member Author

Cool, we have mypy green :)

@potiuk
Member

potiuk commented Jul 11, 2025

Wooohooooo

@gopidesupavan
Member Author

Merging; the failure is not related.

@gopidesupavan merged commit 67d6e4b into apache:main on Jul 11, 2025
85 of 86 checks passed
@jscheffl
Contributor

Wohoo!

stephen-bracken pushed a commit to stephen-bracken/airflow that referenced this pull request Jul 15, 2025
* Bump mypy to 1.16.1

* Bump mypy to 1.16.1 and fix core mypy errors

* Bump mypy to 1.16.1 and fix core mypy errors

* Remove ParsedLog casting

Labels

area:API (Airflow's REST/HTTP API), area:serialization
