Overview
Develop specific performance (accuracy and/or efficiency) testing of hsp2 model runs. Extends the work in #135.
- The base case is test10: https://github.com/respec/HSPsquared/tree/master/tests/test10/HSP2results
- Existing tests verify that:
  - `hsp2 import` completes correctly
  - `hsp2 run` completes without error
- Tests should verify:
  - Assert that run data for a given output in `test10.h5` is simulated correctly, e.g. mean, median, min Q?? (see the sketch after this list)
  - ?
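As a starting point, a statistics-style accuracy test might look like the sketch below. The HDF5 key, column name, and expected values are all placeholders, not the actual layout of `test10.h5`; substitute the real path and known-correct statistics from a trusted HSPF run.

```python
import pandas as pd
import pytest

# Placeholder known-correct statistics; replace with values from a trusted HSPF run.
EXPECTED_MEAN = 123.4
EXPECTED_MEDIAN = 98.7
EXPECTED_MIN = 0.0

def test_test10_flow_statistics():
    # The key and column below are hypothetical; adjust them to the actual
    # layout of the test10.h5 file produced by `hsp2 run`.
    df = pd.read_hdf('test10.h5', key='/RESULTS/RCHRES_R001/HYDR')
    flow = df['RO']
    assert flow.mean() == pytest.approx(EXPECTED_MEAN, rel=1e-3)
    assert flow.median() == pytest.approx(EXPECTED_MEDIAN, rel=1e-3)
    assert flow.min() == pytest.approx(EXPECTED_MIN, abs=1e-6)
```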
Tasks:
- Develop accuracy tests for test10. Basic testing now compares all simulation variables from the HSPF run to the hsp2 run at each timestep of the simulation: https://github.com/respec/HSPsquared/blob/develop/tests/test_regression.py (a minimal sketch of the per-timestep idea appears after this list)
- (will not do for the time being) Develop a working install for `environment_dev.yml` (see https://github.com/respec/HSPsquared/blob/dependency_options/environment_dev.yml)
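The real per-timestep comparison lives in `tests/test_regression.py` (linked above). As an illustration only, the core idea reduces to a helper like the following, where the tolerances and series handling are assumptions rather than the repository's actual code:

```python
import numpy as np
import pandas as pd

def assert_timeseries_match(hspf: pd.Series, hsp2: pd.Series,
                            rtol: float = 1e-3, atol: float = 1e-6) -> None:
    """Assert that two simulation series agree at every timestep."""
    # Fail loudly if the two runs produced different timesteps.
    assert hspf.index.equals(hsp2.index), 'timestep indices differ'
    # Elementwise comparison with relative and absolute tolerances.
    np.testing.assert_allclose(hsp2.to_numpy(), hspf.to_numpy(),
                               rtol=rtol, atol=atol)
```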
Code
Temporary bridge to pytest
The test harness demonstrated below was intended by @austinorr as a temporary bridge between the previous ad-hoc testing (DIY from the command line) and pytest (which offers its own CLI).
- Look at the `tests/ipwater` directory for a usage example (it stores known-correct results in a .plt export from the Fortran version).
- Alternatively, look at the test10 and test10specl directories and set up your example similar to those.
- To add your test and run it from the command line, you can then call it with the `-k` flag:

```bash
(env) ~/source/hsp2$ pytest -k ipwater
```

- Or:

```bash
(env) ~/source/hsp2$ pytest -k test10
```
Currently these testing setups check parity with the Fortran code. If your tests have other criteria (e.g., you know what the correct result should be), use the ipwater test as an example for writing tests with any assertions you like, as sketched below.
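For instance, a known-value test might compare an hsp2 output series against values you trust from any source. In this sketch the file names, HDF5 key, and column are all hypothetical; in practice the ipwater test reads its known-correct values from a .plt export of the Fortran run rather than the CSV assumed here:

```python
import numpy as np
import pandas as pd

def test_parity_with_known_results():
    # Hypothetical files/key/column, for illustration only.
    expected = pd.read_csv('ipwater_expected.csv', index_col=0)['SURO']
    actual = pd.read_hdf('ipwater.h5', key='/RESULTS/IMPLND_I001/IWATER')['SURO']
    # Compare every value against the known-correct series with a tolerance.
    np.testing.assert_allclose(actual.to_numpy(), expected.to_numpy(), rtol=1e-3)
```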
Install pytest etc.
- Note: if running these in a virtual environment, you will need to prefix `python3 -m ` to each of the following commands:

```bash
prefix=''
# uncomment to force the venv python
# prefix='python3 -m '
${prefix}pip install -e .[dev]    # make sure test dependencies are installed
${prefix}pip install pytest-xdist # allows running tests in parallel
${prefix}pip install pytest-cov   # enables coverage reporting
```
Running pytest using Virtual Environment
- Using a virtual environment allows you to test multiple Python versions on your local machine:

```bash
# install the venv using a specific python version
python3.9 -m virtualenv p39test
# activate your venv
source ./p39test/bin/activate
# run regression tests
python3 -m pytest -k test_regression
# Note, this also works:
python3 -m pytest tests/test_regression.py
```
Create a venv with a specific version of a lib (like pandas)
```bash
# prefix='python3 -m '
cd /usr/local/share/venv/
rm -Rf pandas3
python3.12 -m venv pandas3
source /usr/local/share/venv/pandas3/bin/activate
${prefix}pip install pandas==3.0.0
cd ~/source/hsp2                  # back to the repository root for the editable install
${prefix}pip install -e .[dev]    # make sure test dependencies are installed
${prefix}pip install pytest-xdist # allows running tests in parallel
${prefix}pip install pytest-cov   # enables coverage reporting
```
Compare Outputs from test10.uci and test10specl.uci
Basic testing in HSPsquared/tests/ (from @PaulDudaRESPEC @austinorr)
- Run code without numba: `NUMBA_DISABLE_JIT=1 hsp2 run test10specl.h5`
- Run quick test: `pytest -k test_case`
- Run comparison between hsp2 and hspf: `pytest -k test_case -n 2`
- Show every line of code not hit (requires pytest-cov): `NUMBA_DISABLE_JIT=1 pytest -k test_case --cov --cov-branch --cov-report term-missing`
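Beyond the commands above, a small script could read the same output series from both result files and report how the special-actions run diverges from the base run. The HDF5 key and column here are assumptions; since test10specl intentionally alters the simulation, this sketch reports differences rather than asserting equality:

```python
import pandas as pd

def report_max_difference(key: str, column: str) -> None:
    """Print the largest absolute difference for one output series."""
    base = pd.read_hdf('test10.h5', key=key)[column]
    specl = pd.read_hdf('test10specl.h5', key=key)[column]
    print(f'{key}/{column}: max abs difference = {(specl - base).abs().max()}')

# Hypothetical key/column, for illustration only:
report_max_difference('/RESULTS/RCHRES_R001/HYDR', 'RO')
```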
Very simple pytest
- When a test file is collected, functions in that file prefixed with `test_` will be automatically executed.
```python
import pytest
import os

def test_h5_file_exists():
    assert os.path.exists('test10.h5')

# This shows how to break the test
def test_that_should_fail():
    assert os.path.exists('nonexistent_h5_file.h5')
```