
Use asdf files for testing compare_spec_type and star_type_probability #662

Merged (6 commits) Dec 9, 2020

Conversation

@lea-hagen (Member)

The generate_files_for_tests code in BEAST-Fitting/beast-examples#16 creates asdf files for testing the compare_spec_type and star_type_probability tools. This updates test_regresscheck to use those files. @karllark, could you use generate_files_for_tests to update the cached metal_small files (assuming, of course, that it doesn't need any changes first)? Then I'll be able to check if test_regresscheck is working as expected.

This PR also adds hdu=1 to the stats file read-in for compare_spec_type.
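For context, a regression check of this kind loads a saved tree of expected values and compares it key-by-key against freshly computed results (the asdf file itself would be opened with `asdf.open`, which exposes a nested, dict-like tree). Below is a minimal stdlib-only sketch of that comparison pattern; the function and key names are hypothetical illustrations, not the actual BEAST test code:

```python
import math

def compare_trees(expected, computed, rtol=1e-7, path=""):
    """Recursively compare two nested dicts of floats, the way a
    regression test might compare an expected asdf tree against
    newly computed results. Returns a list of mismatch messages."""
    mismatches = []
    for key, exp_val in expected.items():
        loc = f"{path}/{key}"
        if key not in computed:
            mismatches.append(f"missing key: {loc}")
        elif isinstance(exp_val, dict):
            # Recurse into nested sub-trees.
            mismatches.extend(compare_trees(exp_val, computed[key], rtol, loc))
        elif not math.isclose(exp_val, computed[key], rel_tol=rtol):
            mismatches.append(f"value differs at {loc}: {exp_val} vs {computed[key]}")
    return mismatches

# Hypothetical example tree, loosely shaped like star_type_probability output.
expected = {"star_type_probability": {"dusty_agb": 0.12, "other": 0.88}}
computed = {"star_type_probability": {"dusty_agb": 0.12, "other": 0.88}}
assert compare_trees(expected, computed) == []
```

An empty mismatch list means the cached and freshly generated values agree within tolerance, which is essentially what a passing regression test asserts.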

@karllark (Member)

@lea-hagen: it would be quicker if you generated the needed new files and just told me where they are on central storage. That saves me from having to clone your branch, etc.

@karllark karllark added the tests label Nov 12, 2020
@lea-hagen (Member Author)

No problem, I'll send you a message with the path!

@karllark (Member) commented Dec 4, 2020

Codestyle failures need to be addressed in addition to the files.

@lea-hagen (Member Author)

Even after uploading new files during the hack day last week, there were still some failing tests. I think I tracked down the issue, so along with new code in BEAST-Fitting/beast-examples#19, this will hopefully work.

@karllark (Member) commented Dec 8, 2020

Still failing. Hopefully you can find a solution fairly soon. If not, maybe tag some of these tests as skip?

Need this PR to be merged before merging PR #678.

@lea-hagen (Member Author)

Yesterday I narrowed it down to an issue with folder teardown in the tests for code in tools.run. I'm fine with solving that in a different PR, so I'll skip the relevant tests and push the branch shortly.

@lea-hagen (Member Author)

@karllark, do you know why the noisemodel test is failing? I double-checked that all of the input files and settings match, so maybe I just need to rebase and regenerate the files again?

@karllark (Member) commented Dec 8, 2020

No idea. Try a rebase. We have been working on the noise model recently, so I could see one of the recent PRs changing the noisemodel. Depends on how out-of-date your branch is.

@karllark (Member) commented Dec 8, 2020

Not that I can see which PR would have caused this from looking at the network graph.

@lea-hagen (Member Author)

Yeah, I couldn't figure anything out about noisemodel changes either. I'll rebase and see if the files change.

@lea-hagen (Member Author) commented Dec 8, 2020

I rebased and ran
(a) generate_files_for_tests from start to finish as normal
(b) the contents of test_toothpick_noisemodel, using the cached files (settings, SED grid, ASTs)

The resulting noise models are identical. So the best I can figure is it's the Mac vs Linux issue (e.g., #169).

Edit - I created the cached files on a Linux virtual server (only accessed from my Mac). So I have no idea.

@karllark (Member) commented Dec 9, 2020

So, not a Mac versus Linux issue. Sigh.

@lea-hagen (Member Author)

If you run generate_files_for_tests on your system, does it match the current cached files?

@karllark (Member) commented Dec 9, 2020

Testing things on my laptop now. Hopefully that will provide some clue.

@karllark (Member) commented Dec 9, 2020

I have run a number of tests. They were failing on my machine until I regenerated the files on my computer and pushed them to the cache directory; now they pass locally. Re-running the GitHub Actions tests now to see if the online tests pass as well.

@karllark (Member) commented Dec 9, 2020

Tests pass online! Yeah!

The only thing I can think of is that there is some package/setup on your computer that is causing the problem. FYI, I am using python 3.8. One thought, did you recently update your system/conda/...?

@karllark (Member) commented Dec 9, 2020

As the tests pass, is the PR ready for review/merging? If so, please mark it as ready for review.

@lea-hagen (Member Author)

I did rebuild my environment recently (in the past ~month?). I didn't have python 3.8 so tox was grumpy. Super weird that we're getting different results, but I'm glad it's sort of figured out! Thanks for taking the time to poke at it.

I'll mark as ready for review.

@lea-hagen lea-hagen marked this pull request as ready for review December 9, 2020 21:26
@karllark (Member) left a review comment

Looks great.

@karllark (Member) commented Dec 9, 2020

Will merge. It would be good to update #598 with what this PR needed and what it fixed.

@karllark karllark merged commit a7d4787 into BEAST-Fitting:master Dec 9, 2020