
Use stdatamodels for ASDF-in-FITS support #2052

Merged 2 commits into spacetelescope:main from asdf-giving-me-the-fits on Mar 16, 2023

Conversation

@pllim (Contributor) commented Mar 1, 2023

Description

This pull request switches jdaviz to use the ASDF-in-FITS code from stdatamodels instead of asdf.

Unfortunately, this is blocked by:

Fixes #1980
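
For context, a minimal sketch of the kind of change involved. This is an illustration only: 'embedded_asdf.fits' is a hypothetical file name, and the actual jdaviz call sites may differ.

    # asdf 3.0 removes asdf.fits_embed / AsdfInFits, so the embedded ASDF
    # tree is now read through stdatamodels instead of asdf.
    from astropy.io import fits
    from stdatamodels import asdf_in_fits

    with fits.open('embedded_asdf.fits') as hdulist:  # hypothetical file
        with asdf_in_fits.open(hdulist) as af:
            tree = af.tree  # the embedded ASDF tree, e.g. tree.get('meta')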

TODO

Change log entry

  • Is a change log needed? If yes, is it added to CHANGES.rst? If you want to avoid merge conflicts,
    list the proposed change log here for review and add to CHANGES.rst before merge. If no, maintainer
    should add a no-changelog-entry-needed label.

Checklist for package maintainer(s)

This checklist is meant to remind the package maintainer(s) who will review this pull request of some common things to look for. This list is not exhaustive.

  • Are two approvals required? Branch protection rule does not check for the second approval. If a second approval is not necessary, please apply the trivial label.
  • Do the proposed changes actually accomplish desired goals? Also manually run the affected example notebooks, if necessary.
  • Do the proposed changes follow the STScI Style Guides?
  • Are tests added/updated as required? If so, do they follow the STScI Style Guides?
  • Are docs added/updated as required? If so, do they follow the STScI Style Guides?
  • Did the CI pass? If not, are the failures related?
  • Is a milestone set? Set this to bugfix milestone if this is a bug fix and needs to be released ASAP; otherwise, set this to the next major release milestone.
  • After merge, do any internal documents need updating (e.g., JIRA, Innerspace)?

@codecov codecov bot commented Mar 13, 2023

Codecov Report

Patch coverage: 100.00% and project coverage change: -0.04% ⚠️

Comparison is base (60bb7e9) 92.12% compared to head (a0e55e3) 92.09%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2052      +/-   ##
==========================================
- Coverage   92.12%   92.09%   -0.04%     
==========================================
  Files         140      140              
  Lines       15444    15440       -4     
==========================================
- Hits        14228    14219       -9     
- Misses       1216     1221       +5     
Impacted Files                                           Coverage Δ
...default/plugins/metadata_viewer/metadata_viewer.py     94.73% <ø> (ø)
jdaviz/configs/imviz/helper.py                            96.20% <ø> (ø)
jdaviz/configs/imviz/plugins/parsers.py                   97.56% <100.00%> (-2.44%) ⬇️
jdaviz/configs/imviz/tests/test_parser.py                100.00% <100.00%> (ø)
jdaviz/configs/specviz2d/tests/test_parsers.py           100.00% <100.00%> (ø)

☔ View full report in Codecov by Sentry.

@pllim pllim marked this pull request as ready for review March 13, 2023 23:00
@pllim (Contributor, Author) commented Mar 13, 2023

The remaining dev failures are caused by the following, and we will deal with them separately:

TST: Update test_parse_jwst_niriss_grism
because asdf_in_fits do the invalid WCS handling natively now.

TST: specviz2d no longer warns.

@pllim (Contributor, Author) commented Mar 15, 2023

The remaining dev failure has nothing to do with ASDF. It looks like some uncertainty issue in spectrum model fitting, so maybe @bmorris3 or @rosteen know what's up. It probably also happens on main, but I cannot check that because this dev job currently won't even start on main due to the ASDF kerfuffle.

________________ test_register_model_with_uncertainty_weighting ________________
...
    def test_register_model_with_uncertainty_weighting(specviz_helper, spectrum1d):
        spectrum1d.uncertainty = StdDevUncertainty(spectrum1d.flux * 0.1)
        with warnings.catch_warnings():
            warnings.simplefilter('ignore')
            specviz_helper.load_spectrum(spectrum1d)
        modelfit_plugin = specviz_helper.plugins['Model Fitting']
    
        # Test registering a simple linear fit
        modelfit_plugin._obj.model_comp_selected = 'Linear1D'
        modelfit_plugin._obj.vue_add_model({})
        with pytest.warns(AstropyUserWarning, match='Model is linear in parameters'):
            modelfit_plugin.calculate_fit()
        assert len(specviz_helper.app.data_collection) == 2
    
        # Test fitting again overwrites original fit
        with pytest.warns(AstropyUserWarning, match='Model is linear in parameters'):
            modelfit_plugin.calculate_fit()
        assert len(specviz_helper.app.data_collection) == 2
    
        # Test custom model label
        test_label = 'my_Linear1D'
        modelfit_plugin._obj.results_label = test_label
        with pytest.warns(AstropyUserWarning, match='Model is linear in parameters'):
            modelfit_plugin.calculate_fit()
        assert test_label in specviz_helper.app.data_collection
    
        # Test that the parameter uncertainties were updated
        expected_uncertainties = {'slope': 0.0003657, 'intercept': 2.529}
        result_model = modelfit_plugin._obj.component_models[0]
        for param in result_model["parameters"]:
>           assert np.allclose(param["std"], expected_uncertainties[param["name"]], rtol=0.01)
E           assert False
E            +  where False = <function allclose at ...>(0.0007063584243707317, 0.0003657, rtol=0.01)
E            +    where <function allclose at ...> = np.allclose

jdaviz/configs/default/plugins/model_fitting/tests/test_plugin.py:96: AssertionError

@bmorris3 (Contributor) commented:

This dev failure is coming from astropy/astropy#14519. I see the test is checking for equivalence with:

expected_uncertainties = {'slope': 0.0003657, 'intercept': 2.529}

Any idea how these expected uncertainties were computed? It might be appropriate to update them, since the behavior of astropy modeling has changed upstream.
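
For reference, a minimal sketch of how such 1-sigma parameter uncertainties are typically obtained from astropy fitting. The data here is a synthetic stand-in, not the test's spectrum, so the numbers will not match the test's expected values:

    import numpy as np
    from astropy.modeling import models, fitting

    # Synthetic stand-in for the test spectrum (made-up values).
    rng = np.random.default_rng(0)
    x = np.linspace(6000.0, 8000.0, 200)
    y = 2e-4 * x + 3.0 + rng.normal(0.0, 1.0, x.size)

    # As in the test, weight by 1/stddev (StdDevUncertainty-style weighting).
    weights = 1.0 / np.full_like(x, 1.0)

    # Fitting Linear1D with a nonlinear fitter emits the 'Model is linear
    # in parameters' warning seen in the test.
    fitter = fitting.LevMarLSQFitter(calc_uncertainties=True)
    fit = fitter(models.Linear1D(), x, y, weights=weights)

    # 1-sigma uncertainties are the square roots of the diagonal of the
    # parameter covariance matrix.
    cov = fitter.fit_info['param_cov']
    slope_std, intercept_std = np.sqrt(np.diag(cov))

If the expected values in the test were generated this way, an upstream change to how the parameter covariance is computed (astropy/astropy#14519) would explain the shifted std values even though the fit itself is unchanged.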

[Review comment on CHANGES.rst — outdated, resolved]
@pllim (Contributor, Author) commented Mar 15, 2023

> Any idea how these expected uncertainties were computed?

I have no idea, but maybe git blame can give you a clue. Anyway, fixing the model fitting would be out of scope here; we should do it in a separate PR.

@bmorris3 (Contributor) left a review comment:

Looks good! Let's ping @havok2063 for his two cents before merge?

@bmorris3 bmorris3 added the trivial (Only needs one approval instead of two) label Mar 16, 2023
@bmorris3 (Contributor) commented:

> Looks good! Let's ping @havok2063 for his two cents before merge?

@havok2063 approves via Slack. All good!

@pllim pllim merged commit 358d516 into spacetelescope:main Mar 16, 2023
@pllim pllim deleted the asdf-giving-me-the-fits branch March 16, 2023 15:09
@pllim (Contributor, Author) commented Mar 16, 2023

Thanks, all!


Labels
imviz, testing, trivial (Only needs one approval instead of two)
Projects
None yet
Development
Successfully merging this pull request may close the following issue:
  • Imviz: AsdfInFits removal from ASDF 3.0 (#1980)