CAL-DISP PGE Initial Implementation & Interface SAS 0.1 PGE Integration #766

Description

@RKuttruff

PGE/SAS Integration Procedure Steps

The first Interface delivery of the R7 CAL-DISP SAS (part of the OPERA VLM product suite) was made on January 20, 2026. This ticket will track the development of the initial version, 7.0.0-er.1.0, of the CAL-DISP PGE.

Milestones

  • Initial stub implementation, docker builds and basic unit tests
  • Pre- and post-processing validations
  • PCM/archival metadata generation
  • Integration testing framework

Initial stub implementation, docker builds and basic unit tests

  • Create the src/opera/pge/cal_disp/cal_disp_pge.py module
    • The initial module should only define barebones versions of the pre/post-processor mixins and executor class that simply inherit from the implementations in base_pge.py. See the disp-ni version for an example; a minimal sketch of this stub is also included after this list.
  • Create the src/opera/pge/cal_disp/schema/cal_disp_sas_schema.yaml file
    • The cal-disp repo models its runconfig schema as Pydantic code, which can be found here. We will need to convert this to the corresponding Yamale-format schema (a conversion/validation sketch is included after this list). An example runconfig that conforms to this schema can be found in the acceptance test data.
  • Create the src/opera/pge/cal_disp/schema/algorithm_parameters_cal_disp_schema.yaml file
    • The cal-disp repo models its algorithm parameters schema as Pydantic code, which can be found here. We will need to convert this to the corresponding Yamale-format schema. An example algorithm parameters file that conforms to this schema can be found in the acceptance test data.
  • Update the PGE_NAME_MAP in src/opera/scripts/pge_main.py to include an entry mapping CAL_DISP_PGE to the CalDispExecutor class
  • Create the src/opera/test/pge/cal_disp/test_cal_disp_pge.py suite
    • This should also be a barebones test suite that only contains a single test_cal_disp_pge_execution function. See the disp-ni example
  • Create the src/opera/test/data/test_cal_disp_config.yaml and src/opera/test/data/test_cal_disp_algorithm_parameters.yaml files for use with test_cal_disp_pge.py
    • The sample runconfig files from the acceptance test data can be used as sensible sample settings for the SAS section of our test runconfig
  • Create the .ci/docker/Dockerfile_cal_disp file
    • This will essentially be a copy of our other existing Dockerfiles, but with the following settings:
    • ARG PGE_DEST_DIR=/home/conda
    • conda:conda should be used wherever the user/group is referenced, for example: COPY --chown=conda:conda ${PGE_SOURCE_DIR} ${PGE_DEST_DIR}
  • Create the .ci/scripts/cal_disp/build_cal_disp.sh file
    • This can also essentially be a cut/paste of a previous build script, with the following modifications:
    • Default value for SAS_IMAGE argument should be SAS_IMAGE="artifactory-fn.jpl.nasa.gov:16001/gov/nasa/jpl/opera/adt/opera/cal-disp:interface_0.1"
    • Ensure Dockerfile_cal_disp is referenced within the docker build command
  • Create the .ci/scripts/cal_disp/test_cal_disp.sh file
    • See the example for disp-ni
    • Ensure CONTAINER_HOME="/home/conda"
  • Update the .ci/scripts/util/build_all_images.sh and .ci/scripts/util/test_all_images.sh scripts to include the new build/test scripts for CAL-DISP
  • Create docs/opera.pge.cal_disp.rst
    • Boilerplate content for docs generation. Refer to disp-ni
  • Add opera.pge.cal_disp to the toctree in docs/opera.pge.rst
  • Set __version__ in src/opera/_package.py to 7.0.0-er.1.0
    • Add PGE_VERSION = "6.0.0-rc.4.0" to DistS1Executor in src/opera/pge/dist_s1/dist_s1_pge.py to ensure its version is preserved (also add the docstring: """Version of the PGE (overrides default from base_pge)""")
  • Run the unit test suite to ensure all tests pass
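
For orientation, here is a minimal sketch of what the barebones cal_disp_pge.py stub could look like. This is a sketch only: the import path and the base-class/attribute names (PreProcessorMixin, PostProcessorMixin, PgeExecutor, NAME, LEVEL, SAS_VERSION) are assumed to follow the pattern of the existing PGEs and should be verified against base_pge.py and the disp-ni module.

```python
"""Barebones CAL-DISP PGE module (initial stub)."""

# Assumed import path and base classes; verify against base_pge.py in this repo.
from opera.pge.base.base_pge import PgeExecutor, PostProcessorMixin, PreProcessorMixin


class CalDispPreProcessorMixin(PreProcessorMixin):
    """Pre-processor for the CAL-DISP PGE; defers entirely to the base implementation for now."""

    _pre_mixin_name = "CalDispPreProcessorMixin"

    def run_preprocessor(self, **kwargs):
        super().run_preprocessor(**kwargs)


class CalDispPostProcessorMixin(PostProcessorMixin):
    """Post-processor for the CAL-DISP PGE; defers entirely to the base implementation for now."""

    _post_mixin_name = "CalDispPostProcessorMixin"

    def run_postprocessor(self, **kwargs):
        super().run_postprocessor(**kwargs)


class CalDispExecutor(CalDispPreProcessorMixin, CalDispPostProcessorMixin, PgeExecutor):
    """Executor class for the CAL-DISP PGE."""

    NAME = "CAL-DISP"
    """Short name for the CAL-DISP PGE"""

    LEVEL = "L3"
    """Processing level for CAL-DISP products (placeholder; confirm with the product spec)"""

    SAS_VERSION = "0.1"
    """Version of the SAS wrapped by this PGE (interface delivery)"""
```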

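For the Pydantic-to-Yamale schema conversion, the sketch below shows how a hypothetical Pydantic field definition maps to Yamale syntax, and how the converted schema can be exercised with the yamale package. The field names (worker_settings, n_workers, block_shape) are placeholders for illustration, not actual cal-disp runconfig keys.

```python
import tempfile
from pathlib import Path

import yamale  # third-party: pip install yamale

# Yamale-format schema fragment corresponding to a hypothetical Pydantic model such as:
#
#   class WorkerSettings(BaseModel):
#       n_workers: int = 4
#       block_shape: tuple[int, int] = (512, 512)
#
# (placeholder field names, not the actual cal-disp runconfig keys)
SCHEMA_TEXT = """\
worker_settings:
  n_workers: int(min=1, required=False)
  block_shape: list(int(), min=2, max=2, required=False)
"""

SAMPLE_RUNCONFIG = """\
worker_settings:
  n_workers: 8
  block_shape: [512, 512]
"""

with tempfile.TemporaryDirectory() as tmp_dir:
    schema_path = Path(tmp_dir) / "schema.yaml"
    data_path = Path(tmp_dir) / "runconfig.yaml"
    schema_path.write_text(SCHEMA_TEXT)
    data_path.write_text(SAMPLE_RUNCONFIG)

    schema = yamale.make_schema(str(schema_path))
    data = yamale.make_data(str(data_path))
    yamale.validate(schema, data)  # raises yamale.YamaleError if the runconfig does not conform
    print("Sample runconfig conforms to the converted schema")
```
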
Pre- and post-processing validations

For now, basic input validations should be used: file existence/count, nonzero size, valid extensions, etc.
The same checks apply to the output files, with the addition of validating the output file naming convention. Minimal sketches of both kinds of checks are included after the list below.

  • Implement or reuse validations for DISP inputs
  • Implement or reuse validations for ancillary inputs
  • Implement or reuse validations for algorithm parameters
  • Implement output validation
  • Add unit test cases for input and output validations
  • Rerun the unit test suite to ensure all tests pass
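
A minimal sketch of the kind of basic input checks intended here, assuming validation against plain file paths. The function name and accepted extension set are placeholders; in practice these checks would likely reuse helpers already available to the other PGE pre-processor mixins.

```python
from pathlib import Path

# Placeholder extension set; the real list should come from the CAL-DISP interface/product spec.
VALID_INPUT_EXTENSIONS = {".nc", ".h5", ".tif", ".json"}


def check_input_file(file_path):
    """Basic existence / nonzero-size / extension checks for a single input file."""
    path = Path(file_path)

    if not path.exists():
        raise RuntimeError(f"Input file does not exist: {path}")

    if path.stat().st_size == 0:
        raise RuntimeError(f"Input file is empty: {path}")

    if path.suffix.lower() not in VALID_INPUT_EXTENSIONS:
        raise RuntimeError(
            f"Input file {path} has unexpected extension {path.suffix!r}; "
            f"expected one of {sorted(VALID_INPUT_EXTENSIONS)}"
        )
```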

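A corresponding sketch for output file name validation. The regular expression is a placeholder shape only; the real convention must come from the CAL-DISP product specification and the file naming functions implemented in the next milestone.

```python
import re

# Placeholder pattern only; the real convention must come from the CAL-DISP product
# specification and the file naming functions implemented in the next milestone.
# Example shape: OPERA_L3_CAL-DISP_F12345_20260120T000000Z_v0.1.h5
OUTPUT_NAME_PATTERN = re.compile(
    r"OPERA_L3_CAL-DISP_"
    r"F\d{5}_"           # frame ID
    r"\d{8}T\d{6}Z_"     # production datetime
    r"v\d+\.\d+"         # product version
    r"\.h5$"
)


def check_output_file_name(file_name):
    """Validate an output product file name against the expected naming convention."""
    if not OUTPUT_NAME_PATTERN.match(file_name):
        raise RuntimeError(f"Output file name does not match the expected convention: {file_name}")
```
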
PCM/archival metadata generation

  • Implement core and ancillary file naming functions
  • Implement test dataset creation function
  • Create the ISO XML jinja2 template (a minimal rendering sketch is included after this list)
  • Complete the measured parameters config for CAL-DISP. The sample products provided with the interface delivery appear to have a fair amount of metadata already; correlate this with anything present in the product spec.
    • The opera.util.h5_utils.get_hd5_group_as_dict() function can be used to extract all available product metadata from an output product in HDF5 format (a sketch of the general idea is included after this list)
    • The latest descriptions for each product metadata field should be sourced from the latest Product Specification document for the CAL-DISP SAS
  • Implement the CAL-DISP specific version of _create_custom_metadata to ensure the catalog metadata is populated correctly for CAL-DISP
  • Ensure the proper file name conventions are applied for both the ISO XML and Catalog files.
  • Update unit test suite to ensure both the ISO XML and Catalog Metadata are written to disk, populated correctly (no missing ISO fields) and have the correct file names applied.
  • Rerun the unit test suite to ensure all tests pass
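
To illustrate the metadata dump mentioned above, the sketch below recursively collects attributes and dataset values from an HDF5 group into a dict. It is not the actual opera.util.h5_utils implementation; the /metadata group name and sample file path are assumptions.

```python
import h5py  # third-party: pip install h5py


def hdf5_group_to_dict(group):
    """Recursively collect attributes and dataset values from an HDF5 group into a dict."""
    result = dict(group.attrs)

    for key, item in group.items():
        if isinstance(item, h5py.Group):
            result[key] = hdf5_group_to_dict(item)
        else:  # h5py.Dataset
            value = item[()]
            result[key] = value.decode() if isinstance(value, bytes) else value

    return result


if __name__ == "__main__":
    # Placeholder path and group name; point these at one of the sample CAL-DISP
    # products from the interface delivery and its metadata group.
    with h5py.File("sample_cal_disp_product.h5", "r") as product:
        metadata = hdf5_group_to_dict(product["/metadata"])

    for field_name in sorted(metadata):
        print(field_name)
```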

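The ISO XML rendering step would follow the usual jinja2 pattern, as sketched below. The template fragment and field values are placeholders for illustration only; the real template should mirror the ISO templates used by the other OPERA PGEs.

```python
from jinja2 import Template  # third-party: pip install jinja2

# Placeholder ISO XML fragment; the real template should mirror the ISO templates
# used by the other OPERA PGEs and cover all required metadata fields.
ISO_TEMPLATE = Template(
    """\
<gmd:MD_Metadata>
  <gmd:fileIdentifier>{{ product_id }}</gmd:fileIdentifier>
  <gmd:dateStamp>{{ production_datetime }}</gmd:dateStamp>
</gmd:MD_Metadata>
"""
)

# In the PGE these values would be drawn from the catalog/measured-parameters metadata.
iso_xml = ISO_TEMPLATE.render(
    product_id="OPERA_L3_CAL-DISP_example",       # placeholder
    production_datetime="2026-01-20T00:00:00Z",   # placeholder
)
print(iso_xml)
```
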
Integration testing framework

  • Create the expected input and output archives used with the Integration test from the samples provided by ADT
    • This involves splitting out the input data (including ancillaries) and the expected outputs from the "golden" data on Artifactory into separate .zip files for each, then uploading the new archives to S3 (a sketch of this packaging/upload step is included after this list)
  • Create .ci/scripts/cal_disp/opera_pge_cal_disp_r1.0_interface_runconfig.yaml and .ci/scripts/cal_disp/opera_pge_cal_disp_r1.0_interface_algorithm_parameters.yaml based on the config files delivered with the SAS
  • Create the test driver and product comparison scripts (a sketch of the core comparison logic is included after this list)
  • Add cal_disp to the int and deploy Jenkins pipelines
  • Rerun the unit test suite to ensure all tests pass
  • Rerun the CAL-DISP Integration Test to ensure it passes using the new expected assets for the latest delivery
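
A sketch of the archive-splitting and upload step, assuming plain directories of inputs and expected outputs extracted from the golden delivery. The directory names, archive names, and S3 bucket are placeholders.

```python
import zipfile
from pathlib import Path

import boto3  # third-party: pip install boto3

# Placeholder locations; in practice the source is the "golden" data pulled from
# Artifactory and the destination is the PGE integration-test S3 bucket.
GOLDEN_DIR = Path("cal_disp_golden_delivery")
ARCHIVES = {
    "cal_disp_interface_input_data.zip": GOLDEN_DIR / "inputs",      # includes ancillaries
    "cal_disp_interface_expected_output.zip": GOLDEN_DIR / "outputs",
}
S3_BUCKET = "example-pge-int-test-bucket"  # placeholder bucket name


def make_archive(zip_name, source_dir):
    """Zip the contents of source_dir, preserving paths relative to it."""
    with zipfile.ZipFile(zip_name, "w", compression=zipfile.ZIP_DEFLATED) as archive:
        for path in sorted(source_dir.rglob("*")):
            if path.is_file():
                archive.write(path, arcname=path.relative_to(source_dir))


if __name__ == "__main__":
    s3_client = boto3.client("s3")

    for zip_name, source_dir in ARCHIVES.items():
        make_archive(zip_name, source_dir)
        s3_client.upload_file(zip_name, S3_BUCKET, zip_name)
```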

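And a sketch of the core of a product comparison script, assuming HDF5 output products. The real script should follow the conventions of the existing per-PGE comparison scripts in the repo.

```python
import sys

import h5py         # third-party: pip install h5py
import numpy as np  # third-party: pip install numpy


def compare_products(expected_file, test_file, rtol=1e-5, atol=1e-8):
    """Compare every dataset in the expected product against the test product."""
    all_match = True

    with h5py.File(expected_file, "r") as expected, h5py.File(test_file, "r") as test:
        def visit(name, item):
            nonlocal all_match

            if not isinstance(item, h5py.Dataset):
                return

            if name not in test:
                print(f"MISSING:  {name}")
                all_match = False
                return

            expected_data = item[()]
            test_data = test[name][()]

            if np.issubdtype(item.dtype, np.number):
                equal = np.allclose(expected_data, test_data, rtol=rtol, atol=atol, equal_nan=True)
            else:
                equal = np.array_equal(expected_data, test_data)

            if not equal:
                print(f"MISMATCH: {name}")
                all_match = False

        expected.visititems(visit)

    return all_match


if __name__ == "__main__":
    sys.exit(0 if compare_products(sys.argv[1], sys.argv[2]) else 1)
```
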
Labels

must have (Feature is a must have), pge.r.07 (PGE Release 07)