
Integrate into PyOpenSci for Our Affiliated package review #402

@Cadair

Description

The goal of this issue is to formally identify and discuss the changes to our Affiliated package review system that would stem from integrating with PyOpenSci.


A refresher: SunPy Affiliated Packages

SEP 4

The basic concepts and requirements of SunPy's affiliated packages are documented formally by SEP-4, although that SEP does not specify how the packages should be reviewed.

SEP 4 spells out the purpose of the affiliated package system and why we have a review:

The primary purpose of the affiliated package system is to support software developers that provide additional tools and functionality that extends and builds upon the core library.

A review process ensures that affiliated packages provide useful functionality to the community at a standard of quality similar to the core SunPy package. This process also ensures that cross-compatibility is maintained throughout the SunPy ecosystem. The SunPy project will ensure that affiliated packages are maintained and publicized in order to encourage community development.

It also sets the following requirements for affiliated packages:

  • The package shall provide functionality that is relevant and useful to the community and must be relatively mature.
  • The package must make use of all appropriate features in the core library, to reduce code duplication and complexity.
  • The package should strive to not duplicate any functionality in either the core library or any other affiliated package.
  • The package must provide documentation that is of comparable quality to the core library.
  • The package must provide a test suite that can be used to verify its functionality.
  • The developers of the package should engage with the SunPy community to encourage knowledge and code sharing.

and, finally, some requirements for the review process:

  • [...] be clear, fair and well-documented and must include criteria based on the requirements provided in this SEP.
  • It is expected that all affiliated packages shall be re-reviewed at a regular (annual) cadence in order to make sure that affiliated packages maintain compliance.

Our Existing Review Criteria and PyOpenSci

We currently evaluate our packages on the following criteria:

  • Functionality
  • Integration
  • Documentation
  • Testing
  • Duplication
  • Community
  • Development Status

Based on the PyOpenSci Peer Review Template [1], the following of our criteria could probably be counted as a subset of the PyOpenSci review:

  • Documentation
  • Testing
  • Development Status
  • Duplication

That leaves the following extra criteria that we would want to add to the PyOpenSci reviews:

  • Functionality - Is this a solar package?
  • Integration - Does this package integrate with the sunpy packages?
  • Community - Is this an open development package?

Process Changes

Adopting the PyOpenSci reviews would mean significant process changes, but that is also the main benefit: we would no longer have to maintain the whole process ourselves.

The main changes I can think of are listed below; however, this is the part where we would need to talk to the PyOpenSci people.

  • Packages would be submitted for review on the PyOpenSci repo
  • PyOpenSci would handle periodic re-reviews and finding non-sunpy reviewers
  • Packages would be listed under a solar physics section on the pyopensci website
  • What happens to our "provisional" packages, i.e. packages which don't currently meet the review criteria but are being actively developed?

Changes Apparent to Users

  • The criteria covered by PyOpenSci would no longer use our "traffic light" style, and our additional criteria may or may not, depending on how we implement them. This means that the "status" of packages would not be as clear to users, and we may lose the "general package / specialized package" distinction.

Changes Apparent to Maintainers

  • All the process changes.
  • I think the PyOpenSci criteria are more stringent than our current criteria: we currently accept packages with "imperfect" scores and highlight this to users in our listing. It looks like we would be giving this up in favour of a binary yes/no model with PyOpenSci.

[1] Is this really the authoritative source for the review criteria?
