
Martini #164

Closed
20 of 32 tasks
kyleaoman opened this issue Mar 6, 2024 · 27 comments
Assignees
Labels
6/pyOS-approved 9/joss-approved astropy An astropy community affiliated package review

Comments

@kyleaoman

kyleaoman commented Mar 6, 2024

Submitting Author: Kyle Oman (@kyleaoman)
All current maintainers: (@kyleaoman)
Package Name: martini
One-Line Description of Package: MARTINI is a modular package for the creation of synthetic resolved HI line observations (data cubes) of smoothed-particle hydrodynamics simulations of galaxies.
Repository Link: https://github.com/kyleaoman/martini
Version submitted: 2.0.11 (note JOSS paper is in branch joss-paper, and that branch is somewhat behind main & 2.0.11)
Editor: @hamogu
Reviewer 1: @taldcroft
Reviewer 2: @MicheleDelliVeneri
Archive: https://zenodo.org/doi/10.5281/zenodo.11193206
JOSS DOI: https://doi.org/10.21105/joss.06860
Version accepted: 2.0.15
Date accepted (month/day/year): 06/03/2024


Code of Conduct & Commitment to Maintain Package

Description

MARTINI is a modular package for the creation of synthetic resolved HI line observations (data cubes) of smoothed-particle hydrodynamics simulations of galaxies. The various aspects of the mock-observing process are divided logically into sub-modules handling the data cube, source, beam, noise, spectral model and SPH kernel. MARTINI is object-oriented: each sub-module provides a class (or classes) which can be configured as desired. For most sub-modules, base classes are provided to allow for straightforward customization. Instances of each sub-module class are given as parameters to the Martini class; a mock observation is then constructed by calling a handful of functions to execute the desired steps in the mock-observing process.
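The configuration-object pattern described above can be sketched with hypothetical stand-in classes. Note these are NOT MARTINI's real classes or method names (only the Martini orchestrator class is named in this thread; see the package documentation for the actual API) — the sketch only illustrates the design: configured sub-module instances passed as parameters, then a handful of method calls executing the mock-observing steps.

```python
class Source:
    """Stand-in for a source sub-module class holding simulation particle data."""
    def __init__(self, n_particles):
        self.n_particles = n_particles


class Beam:
    """Stand-in for a beam model; like any sub-module class, it is
    configured at construction time."""
    def __init__(self, fwhm_arcsec=15.0):
        self.fwhm_arcsec = fwhm_arcsec


class MockObserver:
    """Stand-in orchestrator (playing the role of the Martini class):
    sub-module instances come in as parameters, and a few methods
    execute the steps of the mock-observing process in order."""
    def __init__(self, source, beam):
        self.source = source
        self.beam = beam
        self.steps_done = []

    def insert_source(self):
        # In the real package this would paint the source into the data cube.
        self.steps_done.append("insert_source")

    def convolve_beam(self):
        # In the real package this would convolve the cube with the beam.
        self.steps_done.append("convolve_beam")


m = MockObserver(source=Source(n_particles=1000), beam=Beam(fwhm_arcsec=30.0))
m.insert_source()
m.convolve_beam()
print(m.steps_done)  # → ['insert_source', 'convolve_beam']
```

The benefit of this structure, as the description notes, is that each sub-module ships base classes, so swapping in a customized beam or spectral model only requires passing a different instance to the orchestrator.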

Scope

  • Please indicate which category or categories.
    Check out our package scope page to learn more about our
    scope. (If you are unsure of which category you fit, we suggest you make a pre-submission inquiry):

    • Data retrieval
    • Data extraction
    • Data processing/munging
    • Data deposition
    • Data validation and testing
    • Data visualization [1]
    • Workflow automation
    • Citation management and bibliometrics
    • Scientific software wrappers
    • Database interoperability

Domain Specific

  • Geospatial
  • Education

Community Partnerships

If your package is associated with an
existing community please check below:

  • For all submissions, explain how and why the package falls under the categories you indicated above. In your explanation, please address the following points (briefly, 1-2 sentences for each):

    • Who is the target audience and what are scientific applications of this package?

The target audience is research astronomers interested in galaxies (broadly, "extragalactic astronomers") from both the theoretical and observational communities. The package provides a way to transform data products from the theory community (smoothed-particle hydrodynamics based simulations of galaxy formation and evolution) into data products closely resembling the atomic hydrogen signal observed with a radio telescope at 21-cm wavelengths. This enables much more faithful comparisons between theoretical predictions and measurements.

  • Are there other Python packages that accomplish the same thing? If so, how does yours differ?

I am not aware of any other actively maintained packages with similar purpose.

  • If you made a pre-submission enquiry, please paste the link to the corresponding issue, forum post, or other discussion, or @tag the editor you contacted:

N/A (I had previously submitted to astropy under their affiliated package scheme and have been redirected here)

Technical checks

For details about the pyOpenSci packaging requirements, see our packaging guide. Confirm each of the following by checking the box. This package:

  • does not violate the Terms of Service of any service it interacts with.
  • uses an OSI approved license.
  • contains a README with instructions for installing the development version.
  • includes documentation with examples for all functions.
  • contains a tutorial with examples of its essential functions and uses.
  • has a test suite.
  • has continuous integration setup, such as GitHub Actions, CircleCI, and/or others.

Publication Options

JOSS Checks
  • The package has an obvious research application according to JOSS's definition in their submission requirements. Be aware that completing the pyOpenSci review process does not guarantee acceptance to JOSS. Be sure to read their submission requirements (linked above) if you are interested in submitting to JOSS.
  • The package is not a "minor utility" as defined by JOSS's submission requirements: "Minor ‘utility’ packages, including ‘thin’ API clients, are not acceptable." pyOpenSci welcomes these packages under "Data Retrieval", but JOSS has slightly different criteria.
  • The package contains a paper.md matching JOSS's requirements with a high-level description in the package root or in inst/.
  • The package is deposited in a long-term repository with the DOI: https://zenodo.org/doi/10.5281/zenodo.11193206

Since I'm coming in through the new astropy route, I had not prepared for JOSS requirements, but was thinking of submitting to JOSS soon anyway. I've now pushed a paper.md in the joss-paper branch. My package is indexed in the ASCL which in turn is indexed in ADS, but neither of these seems to provide an actual DOI so I will need to look into other repositories (probably Zenodo). As far as I understand submission to JOSS happens after the pyOpenSci review so there is a little bit of time to get this done - I expect to have these two items ticked off by the time the pyOpenSci review process is completed.

Note: JOSS accepts our review as theirs. You will NOT need to go through another full review. JOSS will only review your paper.md file. Be sure to link to this pyOpenSci issue when a JOSS issue is opened for your package. Also be sure to tell the JOSS editor that this is a pyOpenSci reviewed package once you reach this step.

Are you OK with Reviewers Submitting Issues and/or pull requests to your Repo Directly?

This option will allow reviewers to open smaller issues that can then be linked to PR's rather than submitting a more dense text based review. It will also allow you to demonstrate addressing the issue via PR links.

  • Yes I am OK with reviewers submitting requested changes as issues to my repo. Reviewers will then link to the issues in their submitted review.

Confirm each of the following by checking the box.

  • I have read the author guide.
  • I expect to maintain this package for at least 2 years and can help find a replacement for the maintainer (team) if needed.

Please fill out our survey

P.S. Have feedback/comments about our review process? Leave a comment here

Editor and Review Templates

The editor template can be found here.

The review template can be found here.

Footnotes

  1. Please fill out a pre-submission inquiry before submitting a data visualization package.

@kyleaoman changed the title from "Martini - Mock array telescope interferometry of the neutral ISM" to "Martini" on Mar 6, 2024
@hamogu added the astropy (An astropy community affiliated package review) label on Mar 6, 2024
@isabelizimm
Contributor

Hello there! Just letting you know we have seen this issue and will be following up with pre-review checks shortly!

@isabelizimm
Contributor

isabelizimm commented Mar 23, 2024

Thank you so much for this submission! It's so exciting to see these astropy submissions roll in ☄️

Editor in Chief checks

Hi there! Thank you for submitting your package for pyOpenSci
review. Below are the basic checks that your package needs to pass
to begin our review. If some of these are missing, we will ask you
to work on them before the review process begins.

Please check our Python packaging guide for more information on the elements
below.

  • Installation The package can be installed from a community repository such as PyPI (preferred), and/or a community channel on conda (e.g. conda-forge, bioconda).
    • The package imports properly into a standard Python environment (import package).
  • Fit The package meets criteria for fit and overlap.
  • Documentation The package has sufficient online documentation to allow us to evaluate package function and scope without installing the package. This includes:
    • User-facing documentation that overviews how to install and start using the package.
    • Short tutorials that help a user understand how to use the package and what it can do for them.
    • API documentation (documentation for your code's functions, classes, methods and attributes): this includes clearly written docstrings with variables defined using a standard docstring format.
  • Core GitHub repository Files
    • README The package has a README.md file with clear explanation of what the package does, instructions on how to install it, and a link to development instructions.
    • Contributing File The package has a CONTRIBUTING.md file that details how to install and contribute to the package.
    • Code of Conduct The package has a CODE_OF_CONDUCT.md file.
    • License The package has an OSI approved license.
      NOTE: We prefer that you have development instructions in your documentation too.
  • Issue Submission Documentation All of the information is filled out in the YAML header of the issue (located at the top of the issue template).
  • Automated tests Package has a testing suite and is tested via a Continuous Integration service.
  • Repository The repository link resolves correctly.
  • Package overlap The package doesn't entirely overlap with the functionality of other packages that have already been submitted to pyOpenSci.
  • Archive (JOSS only, may be post-review): The repository DOI resolves correctly.
  • Version (JOSS only, may be post-review): Does the release version given match the GitHub release (v1.0.0)?

  • Initial onboarding survey was filled out
    We appreciate each maintainer of the package filling out this survey individually. 🙌
    Thank you authors in advance for setting aside five to ten minutes to do this. It truly helps our organization. 🙌


Editor comments

In your README can you add a quick installation piece? It doesn't have to be too fancy, something like the # Installation section on the rdata package would be perfect. In the meantime, I'll sort out finding an editor for this review 🙌

Also, are you able to add a "version submitted"? I'm assuming it is the latest, but we'll want to know what version the review started at. It's okay if there are releases during the review or the accepted version is different; that is expected as you make edits!

@isabelizimm
Contributor

Also, we have an editor lined up: @hamogu will be the editor for this review 🙌

@kyleaoman
Author

@isabelizimm I added the version submitted. Not sure what you're requesting for the installation information in README - there's a paragraph there already so if something else/different is needed I'm not sure what it is. Or do you not mean the package README?

@lwasser moved this to under-review in peer-review-status on Mar 27, 2024
@isabelizimm
Contributor

isabelizimm commented Mar 29, 2024

Ah, I was specifically looking for the way to pip install martini, a section similar to "Installation" on this package. I do see this command in your installation paragraph, but it's quite easy to miss, so it would be ideal to pull it out into its own line, rather than be part of a larger paragraph.

This change is non-blocking to your review, just a note to make it easier to skim through details 😄

@hamogu

hamogu commented Apr 7, 2024

Editor Note

April 7th: I've found one reviewer, but I'm still looking for a second reviewer. Since that's more difficult than expected, I figured the first reviewer can get started, so that we can make good use of the extra time and any suggestions/PRs that the first reviewer might have can already be discussed and implemented.

Editor response to review:


Editor comments

👋 Hi @taldcroft and @MicheleDelliVeneri! Thank you for volunteering to review
for pyOpenSci!

Please fill out our pre-review survey

Before beginning your review, please fill out our pre-review survey. This helps us improve all aspects of our review and better understand our community. No personal data will be shared from this survey - it will only be used in an aggregated format by our Executive Director to improve our processes and programs.

  • reviewer 1 survey completed.
  • reviewer 2 survey completed.

The following resources will help you complete your review:

  1. Here is the reviewers guide. This guide contains all of the steps and information needed to complete your review.
  2. Here is the review template that you will need to fill out and submit
    here as a comment, once your review is complete.

Please get in touch with any questions or concerns! Your review is due:

Reviewers: @taldcroft and @MicheleDelliVeneri
Due date: mid-June

@hamogu

hamogu commented Apr 14, 2024

Just a short note that I'm still looking for a second reviewer for martini. There seems to be a bunching of proposal and review deadlines around this time of year that led several of the people I asked to decline. I'm doing my best to find someone; sorry for the delay.

@taldcroft

First look review

@kyleaoman - I started on the review of martini. Some initial comments below.

Installation

In all the places where you specify optional pip dependencies with [optional], the whole thing should be enclosed in quotes for some shells like zsh. Without that you get problems like this:

(martini) ➜  martini git:(main) python3 -m pip install astromartini[hdf5_output]
zsh: no matches found: astromartini[hdf5_output]

!{sys.executable} -m pip install astromartini[eaglesource]==2.0.10
zsh:1: no matches found: astromartini[eaglesource]==2.0.10

This link doesn't exist: https://github.com/kyleaoman/martini/tree/2.0.X.
I think you should point to the releases page and direct users to download a zip file from there.
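For reference, a sketch of the quoting fix for the zsh failures shown above (the package and extra names are the ones from this thread; quoting is also harmless in bash):

```shell
# Unquoted, zsh fails with "no matches found" because it treats [...] as a glob:
#   python3 -m pip install astromartini[hdf5_output]
#
# Quoting (or escaping) the requirement string passes it through literally:
#   python3 -m pip install 'astromartini[hdf5_output]'
#   python3 -m pip install astromartini\[eaglesource\]==2.0.10

# The point in miniature: quotes keep the brackets literal.
requirement='astromartini[hdf5_output]'
echo "$requirement"   # prints astromartini[hdf5_output]
```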

Tests

Tests are currently failing for me in main and 2.0.10 using Python 3.7:

>           from multiprocess.pool import Pool
E           ModuleNotFoundError: No module named 'multiprocess'

../../miniconda3/envs/martini/lib/python3.7/site-packages/martini/spectral_models.py:90: ModuleNotFoundError
============================================================ short test summary info =============================================================
FAILED tests/test_martini.py::TestParallel::test_parallel_consistent_with_serial[True] - ModuleNotFoundError: No module named 'multiprocess'
FAILED tests/test_martini.py::TestParallel::test_parallel_consistent_with_serial[False] - ModuleNotFoundError: No module named 'multiprocess'
FAILED tests/test_spectral_models.py::TestParallelSpectra::test_parallel_spectra[GaussianSpectrum] - ModuleNotFoundError: No module named 'multiprocess'
FAILED tests/test_spectral_models.py::TestParallelSpectra::test_parallel_spectra[DiracDeltaSpectrum] - ModuleNotFoundError: No module named 'multiprocess'
======================================================= 4 failed, 900 deselected in 1.03s ========================================================

This might be something as simple as a typo where you want multiprocessing not multiprocess.

CI

The above test failure suggests a gap in CI, since the README badge is showing green for tests.

The review criteria for badges in the README are shown here, so these will need to be included.

  • Badges for:
    • Continuous integration and test coverage,
    • Docs building (if you have a documentation website),
    • A repostatus.org badge,
    • Python versions supported,
    • Current package version (on PyPI / Conda).

The current GitHub workflow action which is providing the current Run Tests pass badge is apparently only checking code linting. You should be running unit tests on at least the minimum and most recent supported versions of Python.

@kyleaoman
Author

Hi @taldcroft! Thanks for the initial thoughts. A couple of quick things:

  • I do mean multiprocess and not multiprocessing, it's a similar package but uses dill for serialisation. It's in the optional_requirements.txt, but this probably means I should have a look at what's required, optional-required and how this interacts with tests.
  • There is CI including a full test suite, and it does run on GitHub, both on a schedule and on pushes, PRs, etc. The test suite does install the optional requirements. Possibly something is glitchy with the badge; it's been ages since I looked at that. I will look at the other badges that aren't present yet.

@taldcroft

@kyleaoman - about the CI, got it. I was a bit hasty, but the workflow file name code_quality.yml and the lint job at the top got me thinking it was only about linting. Since that is not the case, please ignore that comment.

The installation docs say that versions of Python 3.7 and later are supported, so the CI testing should probably include all available versions up through 3.12. It is common policy now for scientific Python packages to follow SPEC 0 — Minimum Supported Dependencies and only support Python versions for 3 years from their release date. You can reduce the version proliferation and CI testing by following that, but that is your choice.

@hamogu

hamogu commented Apr 29, 2024

@kyleaoman just to keep you in the loop. I've just contacted the fifth person as a potential second reviewer. I promise we'll find someone, but I'm sorry it's taking way longer than I expected.

@kyleaoman
Author

@hamogu Thanks for the update, and no worries. I originally submitted this as an astropy affiliated package in March 2023 as that process was being re-thought, so I've got the hang of being patient with the process.

@hamogu

hamogu commented May 10, 2024

@MicheleDelliVeneri: Thank you for agreeing to be the second reviewer for this package. I've updated all the posts above which have instructions on how the review is supposed to work, but I don't know if GitHub sends out notifications when I add a handle by editing, so I'm just writing separately here. Please reach out to me if you have any questions about the review or if I can help in any way.

@hamogu

hamogu commented May 10, 2024

@kyleaoman About getting a DOI for the code: While there are several ways of getting snapshots of your code into a permanent archive that issues a DOI, the easiest (and what I use for my own projects) is probably the GitHub-Zenodo integration. That way, every time you make a "release" on GitHub, a webhook makes Zenodo pull that specific release and issue a corresponding DOI. It also pulls most of the metadata (authors, licence, etc.) automatically. (It sometimes takes up to an hour, so don't worry if you don't see it immediately, and sometimes the metadata is not right; e.g. if you have bot commits to your code, you might want only the humans listed as authors on Zenodo, but you can edit the metadata on Zenodo later if needed.)
The nice thing is that it's a fire-and-forget setup: once it's set up, it will work for all future releases automatically.

Of course, you should do whatever you think works best for your package, I'm just trying to make suggestions to help you make that process easy.
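As an aside, the metadata Zenodo pulls can also be controlled up front with an optional .zenodo.json file in the repository root, which the GitHub-Zenodo integration reads when it archives a release. A minimal hypothetical example (all field values below are illustrative placeholders, not taken from martini):

```json
{
  "title": "My package: a one-line description",
  "upload_type": "software",
  "creators": [
    {
      "name": "Surname, Given",
      "affiliation": "Example University",
      "orcid": "0000-0000-0000-0000"
    }
  ],
  "license": "GPL-3.0",
  "keywords": ["astronomy", "simulations"]
}
```

This avoids having to edit the metadata on Zenodo after each release, e.g. to keep bot accounts out of the author list.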

@taldcroft

taldcroft commented May 14, 2024

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide

  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README.
  • Installation instructions: for the development version of the package and any non-standard dependencies in README.
  • Vignette(s) demonstrating major functionality that runs successfully locally. [1]
  • Function Documentation: for all user-facing functions.
  • Examples for all user-facing functions. [2]
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING.
  • Metadata including author(s), author e-mail(s), a url, and any other relevant metadata e.g., in a pyproject.toml file or elsewhere.

Readme file requirements
The package meets the readme requirements below:

  • Package has a README.md file in the root directory.

The README should include, from top to bottom:

  • The package name
  • Badges for:
    • Continuous integration and test coverage,
    • Docs building (if you have a documentation website),
    • A repostatus.org badge,
    • Python versions supported,
    • Current package version (on PyPI / Conda).

NOTE: If the README has many more badges, you might want to consider using a table for badges: see this example. Such a table should be wider than high. (Note that a badge for pyOpenSci peer-review will be provided upon acceptance.)

  • Short description of package goals.
  • Package installation instructions
  • Any additional setup required to use the package (authentication tokens, etc.)
  • Descriptive links to all vignettes. If the package is small, there may only be a need for one vignette which could be placed in the README.md file.
    • Brief demonstration of package usage (as it makes sense - links to vignettes could also suffice here if package description is clear)
  • Link to your documentation website.
  • If applicable, how the package compares to other similar packages and/or how it relates to other packages in the scientific ecosystem.
  • Citation information

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole.
Package structure should follow general community best-practices. In general please consider whether:

  • Package documentation is clear and easy to find and use.
  • The need for the package is clear
  • All functions have documentation and associated examples for use
  • The package is easy to install

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed. [3]
  • Performance: Any performance claims of the software have been confirmed.
  • Automated tests:
    • All tests pass on the reviewer's local machine for the package version submitted by the author. Ideally this should be a tagged version making it easy for reviewers to install. [4]
    • Tests cover essential functions of the package and a reasonable range of inputs and conditions. [5]
  • Continuous Integration: Has continuous integration setup (We suggest using Github actions but any CI platform is acceptable for review)
  • Packaging guidelines: The package conforms to the pyOpenSci packaging guidelines.
    A few notable highlights to look at:
    • Package supports modern versions of Python and not End of life versions.
    • Code format is standard throughout package and follows PEP 8 guidelines (CI tests for linting pass)

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

  • A short summary describing the high-level functionality of the software
  • Authors: A list of authors with their affiliations
  • A statement of need clearly stating problems the software is designed to solve and its target audience.
  • References: With DOIs for all those that have one (e.g. papers, datasets, software).

Final approval (post-review)

  • The author has responded to my review and made changes to my satisfaction. I recommend approving this package.

Estimated hours spent reviewing:


Review Comments

Footnotes

  1. Example notebooks provide vignettes, but these require authenticated access to
    specialized databases. I did not verify that these run, so I can only guess that they
    run for a researcher in the field with access.

  2. Strictly speaking the API docs do not include an example for every user-facing function.
    This seems like an unreasonable requirement in this package due to the setup required in
    most cases to make use of a function or method. Overall the API docs seem thorough and
    sufficient to me.

  3. I don't have the domain expertise to confirm the functional claims of the package.

  4. There is an open issue related to missing dependencies for tests.

  5. Test coverage is not included in CI but I ran it locally and found better
    than 90% coverage in core modules. Backend-specific modules had low coverage
    perhaps because of missing dependencies.

@kyleaoman
Author

Got Zenodo automated and put in earlier releases too. There's a known issue where the latest release is flagged incorrectly in Zenodo but that's supposed to resolve when the next release is created, which I'll do at the end of review at the latest.

@kyleaoman
Author

Thanks @taldcroft for the review. I think that I've now addressed everything raised:

  • All issues that you opened are closed with a PR, commit or explanation.
  • I added a codecov badge to the readme; it shows 86%. Most files are ~100% covered, with a few (deliberately) close to 0%. These uncovered ones are specific to particular simulations that won't get tested on GitHub. I'm looking at setting up some local tests that I can run from time to time on a cluster where the large test data for these can be easily hosted.
  • I noticed you didn't tick the item for "references with DOIs" in the JOSS paper. The reference list does include DOIs for all items in the generated PDF. This can be obtained as an artifact from workflow runs; the most recent is currently https://github.com/kyleaoman/martini/actions/runs/8402372561/artifacts/1352483918

@MicheleDelliVeneri

MicheleDelliVeneri commented Jun 3, 2024

Hi @kyleaoman, here is my review; sorry for taking some time.
In short, my compliments on the great work.

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide

  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README.
  • Installation instructions: for the development version of the package and any non-standard dependencies in README.
  • Vignette(s) demonstrating major functionality that runs successfully locally.
  • Function Documentation: for all user-facing functions.
  • Examples for all user-facing functions.
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING.
  • Metadata including author(s), author e-mail(s), a url, and any other relevant metadata e.g., in a pyproject.toml file or elsewhere.

Readme file requirements
The package meets the readme requirements below:

  • Package has a README.md file in the root directory.

The README should include, from top to bottom:

  • The package name
  • Badges for:
    • Continuous integration and test coverage,
    • Docs building (if you have a documentation website),
    • A repostatus.org badge,
    • Python versions supported,
    • Current package version (on PyPI / Conda).

NOTE: If the README has many more badges, you might want to consider using a table for badges: see this example. Such a table should be wider than high. (Note that a badge for pyOpenSci peer-review will be provided upon acceptance.)

  • Short description of package goals.
  • Package installation instructions
  • Any additional setup required to use the package (authentication tokens, etc.)
  • Descriptive links to all vignettes. If the package is small, there may only be a need for one vignette which could be placed in the README.md file.
    • Brief demonstration of package usage (as it makes sense - links to vignettes could also suffice here if package description is clear)
  • Link to your documentation website.
  • If applicable, how the package compares to other similar packages and/or how it relates to other packages in the scientific ecosystem.
  • Citation information

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole.
Package structure should follow general community best-practices. In general please consider whether:

  • Package documentation is clear and easy to find and use.
  • The need for the package is clear
  • All functions have documentation and associated examples for use
  • The package is easy to install

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed.
  • Performance: Any performance claims of the software have been confirmed.
  • Automated tests:
    • All tests pass on the reviewer's local machine for the package version submitted by the author. Ideally this should be a tagged version making it easy for reviewers to install.
    • Tests cover essential functions of the package and a reasonable range of inputs and conditions.
  • Continuous Integration: Has continuous integration setup (We suggest using Github actions but any CI platform is acceptable for review)
  • Packaging guidelines: The package conforms to the pyOpenSci packaging guidelines.
    A few notable highlights to look at:
    • Package supports modern versions of Python and not End of life versions.
    • Code format is standard throughout package and follows PEP 8 guidelines (CI tests for linting pass)

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

  • A short summary describing the high-level functionality of the software
  • Authors: A list of authors with their affiliations
  • A statement of need clearly stating problems the software is designed to solve and its target audience.
  • References: With DOIs for all those that have one (e.g. papers, datasets, software).

Final approval (post-review)

  • The author has responded to my review and made changes to my satisfaction. I recommend approving this package.

Estimated hours spent reviewing:


Review Comments

All green from me: the workflows passed the tests on my local server, and I have run all tests locally.
All functions work as expected.

@hamogu

hamogu commented Jun 3, 2024

@taldcroft Could you have a look and see if all your concerns are addressed and, if so, update your comment above by ticking the "final-approval" box?

@taldcroft

@hamogu - Done!

@hamogu

hamogu commented Jun 3, 2024

@MicheleDelliVeneri I noticed that you did not tick the box for "Package supports modern versions of Python", but I just checked: the tests run with Python versions up to 3.12, so I think that's safe! I take the editor's privilege to edit your review and tick that box before proceeding.

@hamogu

hamogu commented Jun 3, 2024


🎉 martini has been approved by pyOpenSci! Thank you for submitting, and many thanks to @taldcroft and @MicheleDelliVeneri for reviewing this package! 😸

Author Wrap Up Tasks

There are a few things left to do to wrap up this submission:

  • Activate Zenodo watching the repo if you haven't already done so.
  • Tag and create a release to create a Zenodo version and DOI.
  • Add the badge for pyOpenSci peer-review to the README.md of martini. The badge should be [![pyOpenSci](https://tinyurl.com/y22nb8up)](https://github.com/pyOpenSci/software-review/issues/issue-number).
  • Please fill out the post-review survey. All maintainers and reviewers should fill this out.
It looks like you would like to submit this package to JOSS. Here are the next steps:
  • Once the JOSS issue is opened for the package, we strongly suggest that you subscribe to issue updates. This will allow you to continue to update the issue labels on this review as it goes through the JOSS process.
  • Login to the JOSS website and fill out the JOSS submission form using your Zenodo DOI. When you fill out the form, be sure to mention and link to the approved pyOpenSci review. JOSS will tag your package for expedited review if it is already pyOpenSci approved.
  • Wait for a JOSS editor to approve the presubmission (which includes a scope check).
  • Once the package is approved by JOSS, you will be given instructions by JOSS about updating the citation information in your README file.
  • When the JOSS review is complete, add a comment to your review in the pyOpenSci software-review repo here that it has been approved by JOSS. An editor will then add the JOSS-approved label to this issue.

🎉 Congratulations! You are now published with both JOSS and pyOpenSci! 🎉

Editor Final Checks

Please complete the final steps to wrap up this review. Editor, please do the following:

  • Make sure that the maintainers filled out the post-review survey
  • Invite the maintainers to submit a blog post highlighting their package. Feel free to use / adapt language found in this comment to help guide the author.
  • Change the status tag of the issue to 6/pyOS-approved 🚀🚀🚀.
  • Invite the package maintainer(s) and both reviewers to slack if they wish to join.
  • If the author submits to JOSS, please continue to update the labels for JOSS on this issue until the author is accepted (do not remove the 6/pyOS-approved label). Once accepted add the label 9/joss-approved to the issue. Skip this check if the package is not submitted to JOSS.
  • If the package is JOSS-accepted please add the JOSS doi to the YAML at the top of the issue.

If you have any feedback for us about the review process please feel free to share it here. We are always looking to improve our process and documentation in the peer-review-guide.

@hamogu

hamogu commented Jun 3, 2024

@kyleaoman Just to confirm: I see you did a very recent release on May 15th. Is that the version that includes the responses to @taldcroft's comments? Or do you plan to make a new release now that it's accepted? I just want to make sure I'm linking to the correct release number.

@kyleaoman
Author

kyleaoman commented Jun 3, 2024

@hamogu release 2.0.12 contains the latest commit so yes, that's up to date. There's also a Zenodo DOI for it. I'll do the JOSS submission and other little to-do items asap, probably tomorrow. Thanks!

@hamogu

hamogu commented Jun 4, 2024

Congratulations! I switched to accepted - just remember to fill out the post-review survey and add a badge pointing to this issue, if you want to.

Also, we want to invite you to write a blog post (totally optional) on your package for us to promote your work! If you are interested, here are a few examples of other blog posts:

  • pandera
  • movingpandas

and here is a markdown example that you could use as a guide when creating your post.

If you are too busy for this no worries. But if you have time - we'd love to spread the word about your package!

Last, a huge thanks to @taldcroft and @MicheleDelliVeneri for their review - without reviewers like you PyOpenSci would not be possible.

Also, we want to invite all of you (authors and reviewers) to join the PyOpenSci slack - it's a very active community (more active than astropy!) with lots of technical discussion and people happy to help with packaging and related Python questions. Let me know if you are interested, and we'll send you an invite to any email address you choose!

@kyleaoman
Author

I've completed the post-review survey, added the badge, and just now submitted to JOSS - will come back once that process has run its course :)
