
Run pytest inside the docker container #444

Closed
wants to merge 7 commits

Conversation

yakutovicha
Member

fixes #443

@yakutovicha yakutovicha marked this pull request as draft March 8, 2023 10:42
@yakutovicha
Member Author

Ok, so up to now it is possible to run the tests by entering the awb folder and executing the following:

    docker build -t aiidalab/full-stack-with-browsers:latest tests/
    JUPYTER_TOKEN=$(openssl rand -hex 32)
    DOCKER_ID=$(docker run -it -d --rm -v $(pwd):/home/jovyan/apps/aiidalab-widgets-base -e JUPYTER_TOKEN=${JUPYTER_TOKEN} aiidalab/full-stack-with-browsers:latest)
    docker exec -it ${DOCKER_ID} bash -c "cd apps/aiidalab-widgets-base && pip install -e .[dev,smiles]"
    docker exec -it ${DOCKER_ID} bash -c "cd apps/aiidalab-widgets-base && pytest --driver Chrome -s"
    docker stop ${DOCKER_ID}

I haven't yet figured out how to convert this script into a GitHub Action, but I am going to try that now.
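One known pitfall when moving a script like this into CI (an assumption on my part, not something established in this thread): GitHub Actions steps have no TTY attached, so `docker run`/`docker exec` with `-it` fails with "the input device is not a TTY". A non-interactive sketch of the same script, with the substitutions quoted, might look like this:

```shell
# Hypothetical non-interactive variant of the script above (untested here).
# Assumptions: same image tag and paths as in the comment; -it dropped
# because CI runners provide no TTY for docker run/exec.
docker build -t aiidalab/full-stack-with-browsers:latest tests/

JUPYTER_TOKEN=$(openssl rand -hex 32)   # 64-char random hex token for the Jupyter server
DOCKER_ID=$(docker run -d --rm \
    -v "$(pwd)":/home/jovyan/apps/aiidalab-widgets-base \
    -e JUPYTER_TOKEN="${JUPYTER_TOKEN}" \
    aiidalab/full-stack-with-browsers:latest)

docker exec "${DOCKER_ID}" bash -c "cd apps/aiidalab-widgets-base && pip install -e .[dev,smiles]"
docker exec "${DOCKER_ID}" bash -c "cd apps/aiidalab-widgets-base && pytest --driver Chrome -s"

docker stop "${DOCKER_ID}"
```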

@danielhollas
Contributor

danielhollas commented Mar 9, 2023

This is a great effort, but I am just wondering: are we sure we want to introduce this much additional complexity to the current test suite?

btw: When I was thinking about this problem for my own app, I could not wrap my head around how to get code coverage data from a jupyter python kernel. Have you been able to solve that issue?

@yakutovicha
Member Author

> This is a great effort, but I am just wondering, are we sure we want to introduce this much additional complexity to the current test suite?

I am reducing complexity actually. Everything happens inside the container.

> btw: When I was thinking about this problem for my own app, I could not wrap my head around how to get code coverage data from a jupyter python kernel. Have you been able to solve that issue?

It looks like it is not possible when using Selenium. But nbmake does help here, see my PR #441: it executes the notebook cells as a normal Python interpreter would.

@yakutovicha
Member Author

yakutovicha commented Mar 9, 2023

But, I have to admit, I struggle with the GitHub Actions file. It is simply not possible to easily convert the script above into a GitHub Actions workflow and have it run; one gets errors one doesn't even expect.

For example, currently, I don't understand why this is happening:

      running egg_info
      creating aiidalab_widgets_base.egg-info
      error: could not create 'aiidalab_widgets_base.egg-info': Permission denied
      [end of output]

Why can I not modify a host folder mounted into the container...?
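One likely explanation (my assumption, not something confirmed in this thread): a bind mount preserves the host's file ownership, so if the checkout on the GitHub runner is owned by a different UID than the user inside the container (`jovyan`, which defaults to UID 1000 in Jupyter docker-stacks images), any write from inside the container, such as creating `aiidalab_widgets_base.egg-info`, is denied. A quick check for the mismatch on the host:

```shell
# Sketch: detect a UID mismatch between the mounted directory's owner and
# the container user. CONTAINER_UID=1000 is the default 'jovyan' UID in
# Jupyter docker-stacks images (an assumption about this particular image).
CONTAINER_UID=1000
DIR_UID=$(stat -c '%u' .)   # owner UID of the would-be mounted checkout
if [ "${DIR_UID}" -ne "${CONTAINER_UID}" ]; then
    echo "UID mismatch: directory owned by ${DIR_UID}, container user is ${CONTAINER_UID}"
    echo "writes from inside the container will fail with 'Permission denied'"
fi
```

Common workarounds are to `chown` the mount from inside the container or to start the container with a matching `--user`.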

@yakutovicha yakutovicha force-pushed the tests/run-pytest-inside-container branch 2 times, most recently from 118bcbb to 82625ef Compare March 9, 2023 14:42
@danielhollas
Contributor

> I am reducing complexity actually. Everything happens inside the container.

I was referring to the fact of having to build an extra Docker image and an extra GitHub Action. :-) Also, even if this works, without the Selenium interaction the code coverage will not be complete anyway, right? Just executing the notebook is not enough, since some code paths only get executed through the Selenium interactions.

An alternative would be to focus on more unit-style testing without the notebooks and measure that. Of course, that has issues as well. 🤷

@yakutovicha
Member Author

Let me explain my intention, because I sort of fail to see why it is more complex.

Here is what we have now:

  1. We start a docker container via pytest-selenium. That makes it impossible to extract the coverage, since pytest runs outside the container while the Jupyter server runs inside it.
  2. To access the Jupyter notebooks, we install Chrome/Firefox drivers. This is unavoidable.

So here are the options:

  1. We do everything on the GitHub runner machine. But this requires configuring AiiDA, PostgreSQL, RabbitMQ and all the rest. I don't think that is easy. The AiiDAlab image has it all.
  2. We do everything inside the container, but that requires installing the Chrome/Firefox drivers in it.

Here is what I am trying to do:

  1. I moved the driver installation from the GitHub Actions to the Dockerfile.
  2. I run the container and execute pytest in it. Pytest executes all the test_ files and all the notebooks. The only tests we do not get coverage from are the Selenium ones. But that is fine, I believe; we will mimic those with unit tests (as you mentioned).

So, my conclusion: it is fine to keep the Selenium tests, but we use them more as smoke tests. The same notebooks that run as smoke tests can also be run with nbmake, which will report coverage of launching the widgets. The test coverage of interacting with widgets will have to be approximated by unit-test-style tests.
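Concretely, the proposed split might look like the following; the directory layout, package name, and combination of flags are my assumptions for illustration, not commands taken from this PR.

```shell
# Hypothetical invocations for the proposed test split (illustrative only).

# 1) Selenium smoke tests, run against the container as before:
pytest notebooks/ --driver Chrome -s

# 2) The same notebooks executed cell-by-cell via nbmake:
pytest --nbmake notebooks/

# 3) Unit-test-style tests approximating widget-interaction coverage:
pytest tests/ --cov=aiidalab_widgets_base
```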

It is not clear to me what the alternative would be. But if you have an idea, please share.

@yakutovicha yakutovicha force-pushed the tests/run-pytest-inside-container branch 3 times, most recently from 2dea6fd to 3f8d84e Compare March 9, 2023 16:02
@yakutovicha yakutovicha force-pushed the tests/run-pytest-inside-container branch from 3f8d84e to 62e309a Compare March 9, 2023 16:23
@danielhollas
Contributor

> We do everything inside the container, but that requires installing Chrome/Firefox drivers to it.

This seems like a false dichotomy. I would just separate the different styles of tests into separate GitHub jobs.
If we're not able to get the coverage from the Selenium tests anyway, I'd keep them exactly as they are right now. No need for an extra docker image. Unit tests or nbmake-style tests can run in a separate GitHub workflow. Or am I missing something?

@alex-treebeard

Hey, sorry for intruding; nbmake maintainer here.

RE the container permissions issue: I've had a lot more success using the Devcontainer CLI in GitHub Actions.

RE coverage: I am not aware of anybody producing coverage data via nbmake. My initial belief is you won't see any coverage because under the hood we launch a kernel and message it via the local network. Keen to know if I'm wrong!

RE testing widgets: What you have put in place for this project is very impressive, and I am aware that others are trying to do the same. Do you recommend this pattern of using Selenium? Could we improve it together? (Issue)

@unkcpz
Member

unkcpz commented Mar 13, 2023

@alex-treebeard Thanks a lot for chiming in and for the very useful advice!

> RE the #444 (comment), I've had a lot more success using the Devcontainer CLI in github actions.

I think it deserves a try, and it would allow us to drop pytest-docker. I didn't investigate how the built image can be further used in the GitHub CI. I just want to mention that aiida-core started using this feature in aiidateam/aiida-core#5913, and we can give it a try here as well.

> If we're not able to get the coverage from selenium tests anyway, I'd keep them exactly as they are right now. No need for extra docker image. Unit tests or nbmake-style test can run in a separate Github workflow. Or am I missing something?

For the PR itself, I tend to agree with @danielhollas here that, for the test coverage, it makes more sense to focus on unit tests. I made a first attempt in #435, where I separated the tests_notebook out and implemented a unit test for the ProcessInputWidget. It seems to work fine. For this repository specifically, which provides the basic widgets as building blocks for other AiiDAlab apps, I think it is more appropriate to have unit-type tests rather than smoke tests. The ultimate goal is that we reach a high test coverage without using the smoke tests. The smoke tests are left to apps such as QeApp as a black-box test of integrity, and that is where the devcontainers can still be used, as mentioned above.

@danielhollas
Contributor

Hi @alex-treebeard 👋

Thanks for the insightful comments!

> I am not aware of anybody producing coverage data via nbmake. My initial belief is you won't see any coverage because under the hood we launch a kernel and message it via the local network. Keen to know if I'm wrong!

Yeah, that's what I was afraid of. I did a bit of googling and found another similar project, nbval, which claims to be able to generate coverage for code imported into the notebook via its integration with pytest-cov. Do you think that is something nbmake could support in the future?

> RE testing widgets: What you have put in place for this project is very impressive, and I am aware that others are trying to do the same. Do you recommend this pattern of using Selenium? Could we improve it together? (treebeardtech/nbmake#99)

I don't want to speak for others, but it is a new technology for me. It is great when it works, and it has already caught some bugs for me, but it is a bit hard to get right. It is very easy to create a flaky test if one is not careful, and the Python API documentation is not the best. It would be great to see other examples where people are doing this.

We also might be unlucky in that one of the widgets here is doing 3D rendering, which brings its own challenges.

You can find some more tests in my repo:
https://github.com/ispg-group/aiidalab-ispg/tree/main/tests

@yakutovicha
Member Author

Thanks, everybody, for the insightful comments. Indeed, I was wrong regarding the test coverage produced by nbmake: it seems it does not produce any.

At the moment, I can't continue with this idea due to time constraints. Even if we decide to move back to it at some point, this would require careful thinking.

So I am closing the PR.

@yakutovicha yakutovicha deleted the tests/run-pytest-inside-container branch April 26, 2023 09:28