Commit 977e445

[SPARK-41864][INFRA][PYTHON] Fix mypy linter errors
Currently, the GitHub Action Python linter job is broken. This PR fixes the Python linter failures. There are two kinds of failures.

1. https://github.com/apache/spark/actions/runs/3829330032/jobs/6524170799

```
python/pyspark/pandas/sql_processor.py:221: error: unused "type: ignore" comment
Found 1 error in 1 file (checked 380 source files)
```

2. After fixing (1), we hit the following.

```
ModuleNotFoundError: No module named 'py._path'; 'py' is not a package
```

This introduces no user-facing change.

This was tested by passing the GitHub CI on this PR, or by manually running the following.

```
$ dev/lint-python
starting python compilation test...
python compilation succeeded.
starting black test...
black checks passed.
starting flake8 test...
flake8 checks passed.
starting mypy annotations test...
annotations passed mypy checks.
starting mypy examples test...
examples passed mypy checks.
starting mypy data test...
annotations passed data checks.
all lint-python tests passed!
```

Closes #39373 from dongjoon-hyun/SPARK-41864.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit 13b2856)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
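For background on failure (1): mypy only reports an unused "type: ignore" comment when `warn_unused_ignores` is in effect, which the error above implies for Spark's mypy configuration. The comment counts as unused once the line it annotates no longer produces any error, so the fix is simply to delete it. A minimal, self-contained sketch of the rule (hypothetical file, not part of this commit):

```python
# demo_unused_ignore.py -- hypothetical example, assuming mypy runs with
# warn_unused_ignores = True (or --strict).

def add(x: int, y: int) -> int:
    # The assignment below is already type-correct, so the ignore comment
    # suppresses nothing and mypy reports:
    #   error: unused "type: ignore" comment
    total: int = x + y  # type: ignore[assignment]
    return total


if __name__ == "__main__":
    print(add(1, 2))  # prints 3; removing the ignore comment fixes the lint
```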
1 parent 2da30ad commit 977e445

File tree

2 files changed: +2 −1 lines


dev/requirements.txt

Lines changed: 1 addition & 0 deletions
```diff
@@ -43,3 +43,4 @@ PyGithub
 
 # pandas API on Spark Code formatter.
 black
+py
```
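Failure (2) is addressed by this one-line addition: once the standalone `py` distribution is installed, imports of `py.path` (historically pulled in by pytest-era tooling) resolve again instead of failing with `No module named 'py._path'`. The exact import chain behind the CI error is not shown in the log, so the following is only an illustrative sketch assuming the `py` package from dev/requirements.txt is installed:

```python
# Illustrative only: the kind of usage that needs the standalone "py" package.
# Without it (or with it shadowed), Python raises:
#   ModuleNotFoundError: No module named 'py._path'; 'py' is not a package
import py

# py.path.local is the filesystem-path helper exposed by the "py" package.
here = py.path.local(".")
print([p.basename for p in here.listdir()])
```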

python/pyspark/pandas/sql_processor.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -218,7 +218,7 @@ def _get_ipython_scope() -> Dict[str, Any]:
     in an IPython notebook environment.
     """
     try:
-        from IPython import get_ipython  # type: ignore[import]
+        from IPython import get_ipython
 
         shell = get_ipython()
         return shell.user_ns
```
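For context, the ignore comment dropped above had become unnecessary once mypy could resolve the `IPython` import, and with unused-ignore warnings enabled it turned into a lint error. A sketch of how the surrounding function reads after the change; the docstring's opening line and the `except` branch fall outside the hunk and are paraphrased as assumptions:

```python
from typing import Any, Dict


def _get_ipython_scope() -> Dict[str, Any]:
    """
    Tries to extract the dictionary of variables if the program is running
    in an IPython notebook environment.
    """
    try:
        # The "# type: ignore[import]" comment was dropped here by the commit.
        from IPython import get_ipython

        shell = get_ipython()
        return shell.user_ns
    except Exception:
        # Assumed fallback (not shown in the hunk): outside IPython there is
        # no notebook scope to return.
        return {}
```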
