
How do you test that annotations are consistent with expected behavior? #7150

Closed
InvisibleHandOfDoom opened this issue Jul 4, 2019 · 2 comments

@InvisibleHandOfDoom

Hi. I am not sure whether this is the right place to ask, so if I've made a mistake, please forgive me.

I'd like to add type annotations to a package I develop. However, I cannot wrap my head around the testing part. I know I can use mypy directly to check whether the annotations are consistent, as the typeshed project does.

However, I'd be more comfortable with more comprehensive coverage, so I can be confident that my annotations really express what I intend.

Let's say I have:

def foo(x: int, y: int) -> int: ...

I'd like to be able to express both positive

foo(1, 2)  # typecheck: should pass

and negative test cases:

foo(1, "bar")  # typecheck: should fail

Are there any established patterns for managing such test cases?

For the positive cases I can imagine some workarounds (like testing against examples or doctests), but those won't cut it for the negative cases.
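The marker convention sketched above could be driven by a small harness that extracts the expected outcome per line, then compares it against a type checker's reported errors. Here is a minimal sketch of just the extraction step; the `# typecheck: should pass/fail` comment format and the `expected_outcomes` helper are illustrative assumptions, not an established tool:

```python
import re
from typing import Dict

# Assumed marker convention from the examples above:
#   foo(1, 2)      # typecheck: should pass
#   foo(1, "bar")  # typecheck: should fail
MARKER = re.compile(r"#\s*typecheck:\s*should\s+(pass|fail)")

def expected_outcomes(source: str) -> Dict[int, str]:
    """Map 1-based line numbers to their 'pass'/'fail' expectation."""
    outcomes = {}
    for lineno, line in enumerate(source.splitlines(), start=1):
        match = MARKER.search(line)
        if match:
            outcomes[lineno] = match.group(1)
    return outcomes

snippet = (
    'foo(1, 2)      # typecheck: should pass\n'
    'foo(1, "bar")  # typecheck: should fail\n'
)
print(expected_outcomes(snippet))  # {1: 'pass', 2: 'fail'}
```

A full harness would then run mypy over each snippet (e.g. via `mypy.api.run`, which returns a `(stdout, stderr, exit_status)` tuple) and check that errors are reported exactly on the lines marked `should fail`.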

Thanks in advance.

@ilevkivskyi
Member

First, please note that it is not yet possible to test stubs against sources (#5028), although IIUC that is not exactly what you need.

For some ideas on unit-testing types, see #6115 (also, some projects like sqlalchemy-stubs just (ab)use mypy's own test framework).
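For reference, mypy's own test framework uses `.test` data files in which each case is a code block and expected errors are annotated inline with `# E:` comments; the checker's output must match those annotations exactly, which covers both the positive and the negative direction. A sketch of what a case could look like (the case name, `mylib` module, and error message wording are hypothetical):

```
[case testFooRejectsStr]
from mylib import foo
foo(1, 2)      # no comment: no error expected on this line
foo(1, "bar")  # E: Argument 2 to "foo" has incompatible type "str"; expected "int"
```

This is an internal framework rather than a supported public API, which is why reusing it from another project counts as (ab)use.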

@sobolevn
Member
