
Add multimodel tests using real data #856

Merged · 67 commits · Dec 7, 2020

Conversation

@stefsmeets (Contributor) commented Nov 4, 2020

This PR adds some system tests using real data available in https://github.com/ESMValGroup/ESMValTool_sample_data.
Our first focus is on tests for the multi-model statistics, because the unit tests do not quite cover all cases, and running recipes manually in ESMValTool is far too labour-intensive.

Mini test specification

  • span overlap / full
    • daily (grouped by calendar) / monthly -> pass
    • + regression test
  • cubes with time axes -> pass
  • cubes without vertical -> iris.exceptions.CoordinateNotFoundError: 'Expected to find exactly 1 depth coordinate, but found none.'
  • cubes without horizontal -> pass
  • cubes with no time dimension -> ValueError: Cannot guess bounds for a coordinate of length 1.
  • for failing tests, link an existing issue or open a new one
    • reference it in an xfail comment
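The xfail convention above can be sketched with `pytest.mark.xfail`; the test name and body below are hypothetical stand-ins, not the actual tests from this PR:

```python
import pytest

# Hypothetical test name; the real tests would link the tracking issue
# in the `reason` string.
@pytest.mark.xfail(
    raises=ValueError,
    reason="Cannot guess bounds for a coordinate of length 1 (see linked issue)",
)
def test_multimodel_no_time_dimension():
    # Stand-in body that triggers the known failure mode.
    raise ValueError("Cannot guess bounds for a coordinate of length 1.")
```

With `raises=ValueError`, pytest reports the test as xfail when that exact exception occurs, but flags an unexpected pass or a different exception as a real failure.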

To do

Tasks

  • Create an issue to discuss what you are going to do, if you haven't done so already (and add the link at the bottom)
  • This pull request has a descriptive title that can be used in a changelog
  • CircleCI tests pass. Status can be seen below your pull request. If the tests are failing, click the link to find out why.
  • Codacy code quality checks pass. Status can be seen below your pull request. If there is an error, click the link to find out why. If you suspect Codacy may be wrong, please ask by commenting.
  • Add developer documentation

If you need help with any of the tasks above, please do not hesitate to ask by commenting in the issue or pull request.


Helps with #685 #673 #781
Depends on ESMValGroup/ESMValTool_sample_data#3

@stefsmeets (Contributor, Author) commented:
Ah, shoot, just as I wanted to see the green check for CircleCI, they are having an outage 😅

@stefsmeets (Contributor, Author) commented Nov 6, 2020

Seems like everything is working, except I don't understand why the conda build fails:
TESTS FAILED: esmvalcore-2.1.0-py_0.tar.bz2

  • The issue seems to be that it cannot find the sample data repo (ImportError), even though it is specified in the setup.py. Probably it needs to be specified elsewhere...
  • Removing the test step works
  • `pytest -m 'not functional'` does not work, as it still imports the file
  • Ignoring the entire directory --ignore=tests/functional also works.
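The `--ignore` workaround above can also be expressed in a `conftest.py`, so pytest never collects (and never imports) the functional tests when the sample-data package is missing. A hedged sketch; the module name `esmvaltool_sample_data` and the directory layout are assumptions:

```python
# conftest.py (sketch): skip collecting the functional tests entirely
# when the sample-data package is not installed, instead of failing
# with an ImportError at collection time.
import importlib.util

collect_ignore_glob = []
if importlib.util.find_spec("esmvaltool_sample_data") is None:
    # pytest reads this list and never imports the matching files.
    collect_ignore_glob.append("functional/*")
```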

@bouweandela Do you know why the build is also running tests, and how I can add dependencies there? I tried updating the meta.yaml, but it does not work as expected. I'm also confused at the moment about why we need to run the full tests twice...

@stefsmeets (Contributor, Author) commented:
Hi @bouweandela , I think this is ready for a first pass.
Once we finish working on the sample data repo I will finalize this PR.

@stefsmeets stefsmeets marked this pull request as ready for review November 6, 2020 13:44
@bouweandela (Member) commented:
Do you know why the build is also running tests, and how I can add dependencies there?

After building the package, conda build tests whether it actually works as expected by installing it in a new environment and running the tests. The instructions on how to configure that are in the conda-build documentation.
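For reference, conda-build's test phase is configured in a `test:` section of the recipe's meta.yaml. A hedged sketch of what such a section might look like here; the marker expression mirrors the discussion above, but the exact recipe layout and dependency list are assumptions:

```yaml
test:
  # conda-build installs the built package into a fresh environment
  # and then runs these checks; test-only dependencies go in `requires`.
  requires:
    - pytest
  imports:
    - esmvalcore
  commands:
    - pytest -m "not functional" --pyargs esmvalcore
```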

@stefsmeets (Contributor, Author) commented Dec 2, 2020

Hi @bouweandela , this PR is ready for review.

  • Could you please check if the tests run locally for you?
  • Also, do we want to run these tests on every commit? Otherwise, we should pass `-m "not functional"` to the pytest call.
  • Do you (or anyone else) have any idea how to solve this issue:

Part of the regression tests (4 of 8) fail on CircleCI, and I cannot figure out why. Oddly enough, the other 4 regression tests with different calendars pass on CircleCI:

FAILED tests/functional/test_multimodel.py::test_multimodel_regression_month[overlap]
FAILED tests/functional/test_multimodel.py::test_multimodel_regression_month[full]
FAILED tests/functional/test_multimodel.py::test_multimodel_regression_day[full-365_day]
FAILED tests/functional/test_multimodel.py::test_multimodel_regression_day[overlap-365_day]

I checked the masks, but they are False for all data. I also compared all the versions between CircleCI and my local installation, but to no avail. I'm not sure what to do next. -> solved by clearing my pytest cache 🤦‍♂️

Commit note: Loading sample data takes ~30 seconds. To save some time, store the pytest cache for next time.
@bouweandela (Member) commented:

@stefsmeets The tests are failing now because you removed the reference netcdf files in c065fb5

@stefsmeets (Contributor, Author) commented:
@stefsmeets The tests are failing now because you removed the reference netcdf files in c065fb5

Ah, shoot. The .nc files are in the .gitignore so they do not show up in git status.
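For reference, ignored files can still be listed and force-added. A short illustration; the repository and file names below are made up, not taken from this PR:

```shell
# Set up a throwaway repo where *.nc files are gitignored.
repo=$(mktemp -d) && cd "$repo" && git init -q
echo '*.nc' > .gitignore
touch reference.nc

git status --short            # reference.nc does not appear
git status --ignored --short  # ignored entries show up prefixed with '!!'

# Reference files matching an ignore pattern must be force-added:
git add -f reference.nc
```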

@stefsmeets (Contributor, Author) commented:
Hi @bouweandela I think this is ready to merge!

@bouweandela (Member) left a review:

Indeed this is starting to look really good, just a few things left to do.

Review comments (resolved) on: doc/contributing.rst, setup.cfg, setup.py, tests/sample_data/test_multimodel.py (×2)
Labels: preprocessor (related to the preprocessor), testing