Add E2E test for tools #3325
Conversation
Signed-off-by: SajidAlamQB <90610031+SajidAlamQB@users.noreply.github.com>
…rg/kedro into dev/e2e-test-for-add-ons
The structure of the tests looks great, but I would make the check in `check_created_project_structure_from_addons` more specific to the selected tools. So if `pyspark` is selected, also check that `spark.yml` exists, and for `data` that there's a data directory, etc. For viz you might just have to check the run logs for the `reporting` pipeline.
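A minimal sketch of what such a tool-specific check could look like; the tool-to-artifact mapping, paths, and function signature here are illustrative assumptions, not the actual step implementation from kedro's e2e suite:

```python
from pathlib import Path

# Expected artifact(s) per tool; paths are illustrative guesses, not the
# canonical list from kedro's project template.
TOOL_ARTIFACTS = {
    "pyspark": ["conf/base/spark.yml"],
    "data": ["data"],
    "test": ["tests"],
}

def check_created_project_structure_from_tools(project_dir: Path, tools: list[str]) -> None:
    """Assert each selected tool produced the file or directory it should."""
    for tool in tools:
        for relative_path in TOOL_ARTIFACTS.get(tool, []):
            assert (project_dir / relative_path).exists(), (
                f"expected {relative_path} to exist for tool {tool!r}"
            )
```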
…rg/kedro into dev/e2e-test-for-add-ons
QQ: How is the …
Also: could we add tests that use the shorthand flag as well?
The e2e tests are running via the …
Thanks for adding these!
Porting over our conversation offline - it would be nice to see these tests ensure that the tools we provide are ready to go for the projects we add them to, e.g. `pytest` works for the test tool, `black .` and `ruff check .` work for linting, and `kedro viz` should work for viz. Docs, logs, data structure, and pyspark are fine though - I'm not sure there's much more to test for those.
What do you think?
Signed-off-by: SajidAlamQB <90610031+SajidAlamQB@users.noreply.github.com>
These tests are great, thanks for adding them @SajidAlamQB! ⭐
@AhdraMeraliQB raises an interesting point, although I would personally say these are a "nice to have", because it's not technically our responsibility to ensure the behaviour of `pytest`, `ruff`, and `black`. In the unit tests we check that the right requirements are added, right? So then I think it's an okay assumption that these work okay once installed. But it would be good to hear other opinions 🙂
^^ We could just run …
LGTM! 💯
Minor suggestions.
I agree with this as well; I feel it is a bit out of the scope of this PR and more of a nice-to-have.
@SajidAlamQB, excellent work! LGTM
Signed-off-by: SajidAlamQB <90610031+SajidAlamQB@users.noreply.github.com>
Description
For most of the add-ons flow I've been testing the work manually: create a new project with a combination of add-ons, install the requirements, and try a `kedro run`. This process has already uncovered several bugs.
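A sketch of how that manual flow could be scripted; the flag spellings follow the current `kedro new` CLI (`--name`, `--tools`, `--example`), and the project name and requirements location are assumptions:

```python
import subprocess

# Create a project non-interactively, install its requirements, and run it.
# At the time of this PR the tools flag may still have been spelled --addons.
subprocess.run(
    ["kedro", "new", "--name", "demo", "--tools", "lint,test", "--example", "y"],
    check=True,
)
subprocess.run(["pip", "install", "-r", "requirements.txt"], cwd="demo", check=True)
subprocess.run(["kedro", "run"], cwd="demo", check=True)
```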
Context
We should add automated tests covering at least some of the add-ons flow.
Implementation
At a minimum have tests for:
All of the above should have an example so kedro run can be tested.
Development Notes
Checklist
RELEASE.md file