Bug report
The documented exception type for the tokenize.* functions is TokenError -- however it appears they now raise SyntaxError: https://docs.python.org/3.12/library/tokenize.html#tokenize.TokenError
Raised when either a docstring or expression that may be split over several lines is not completed anywhere in the file, for example:
import io
import tokenize
list(tokenize.generate_tokens(io.StringIO('"""').readline))
$ python3.11 t.py
Traceback (most recent call last):
File "/home/asottile/workspace/switch-microcontroller/t.py", line 4, in <module>
list(tokenize.generate_tokens(io.StringIO('"""').readline))
File "/usr/lib/python3.11/tokenize.py", line 465, in _tokenize
raise TokenError("EOF in multi-line string", strstart)
tokenize.TokenError: ('EOF in multi-line string', (1, 0))
and
# python3.13 t.py
Traceback (most recent call last):
File "/y/pyupgrade/t.py", line 4, in <module>
list(tokenize.generate_tokens(io.StringIO('"""').readline))
File "/usr/lib/python3.13/tokenize.py", line 526, in _generate_tokens_from_c_tokenizer
for info in it:
File "<string>", line 1
"""
^
SyntaxError: unterminated triple-quoted string literal (detected at line 1)
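For downstream tools that tokenize arbitrary source, a minimal compatibility sketch is to catch both exception types until the behavior settles (the helper name and its return convention here are hypothetical, not from the stdlib):
import io
import tokenize

def try_tokenize(src):
    # tokenize historically raises tokenize.TokenError on incomplete input;
    # newer CPython surfaces SyntaxError from the C tokenizer instead, so
    # catch both (assumption: the caller only needs to know tokenizing failed).
    try:
        return list(tokenize.generate_tokens(io.StringIO(src).readline))
    except (tokenize.TokenError, SyntaxError):
        return None

print(try_tokenize('"""'))  # None on both 3.11 and 3.13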
Your environment
- CPython versions tested on: c7bf74b
- Operating system and architecture: Ubuntu 22.04 x86_64