# PGE/SAS Integration Procedure Steps
The first interface delivery of the R7 CAL-DISP SAS (part of the OPERA VLM product suite) was made on January 20, 2026. This ticket tracks development of the initial version, 7.0.0-er.1.0, of the CAL-DISP PGE.
## Milestones

- Initial stub implementation, Docker builds, and basic unit tests
- Pre- and post-processing validations
- PCM/archival metadata generation
- Integration testing framework
## Initial stub implementation, Docker builds, and basic unit tests

- Create the `src/opera/pge/cal_disp/cal_disp_pge.py` module
  - The initial module should only define barebones versions of the pre-/post-processor mixins and the executor class, each simply inheriting from the implementations in `base_pge.py`. See the disp-ni version for an example.
- Create the `src/opera/pge/cal_disp/schema/cal_disp_sas_schema.yaml` file
  - The `cal-disp` repo models its runconfig schema as Pydantic code, which can be found here. We will need to convert this to the corresponding Yamale-format schema. An example runconfig that conforms to this schema can be found in the acceptance test data.
- Create the `src/opera/pge/cal_disp/schema/algorithm_parameters_cal_disp_schema.yaml` file
  - The `cal-disp` repo models its algorithm parameters schema as Pydantic code, which can be found here. We will need to convert this to the corresponding Yamale-format schema. An example algorithm parameters file that conforms to this schema can be found in the acceptance test data.
- Update the `PGE_NAME_MAP` in `src/opera/scripts/pge_main.py` to include an entry mapping `CAL_DISP_PGE` to the `CalDispExecutor` class
- Create the `src/opera/test/pge/cal_disp/test_cal_disp_pge.py` suite
  - This should also be a barebones test suite that only contains a single `test_cal_disp_pge_execution` function. See the disp-ni example.
- Create the `src/opera/test/data/test_cal_disp_config.yaml` and `src/opera/test/data/test_cal_disp_algorithm_parameters.yaml` files for use with `test_cal_disp_pge.py`
  - The sample runconfig files from the acceptance test data can be used as sensible sample settings for the SAS section of our test runconfig.
- Create the `.ci/docker/Dockerfile_cal_disp` file
  - This will essentially be a copy of our other existing Dockerfiles, but with the following settings:
    - `ARG PGE_DEST_DIR=/home/conda`
    - `conda:conda` should be used wherever the user/group is referenced, for example: `COPY --chown=conda:conda ${PGE_SOURCE_DIR} ${PGE_DEST_DIR}`
- Create the `.ci/scripts/cal_disp/build_cal_disp.sh` file
  - This can also essentially be a cut/paste of a previous build script, with the following modifications:
    - The default value for the `SAS_IMAGE` argument should be `SAS_IMAGE="artifactory-fn.jpl.nasa.gov:16001/gov/nasa/jpl/opera/adt/opera/cal-disp:interface_0.1"`
    - Ensure `Dockerfile_cal_disp` is referenced within the `docker build` command
- Create the `.ci/scripts/cal_disp/test_cal_disp.sh` file
  - See the example for disp-ni
  - Ensure `CONTAINER_HOME="/home/conda"`
- Update the `.ci/scripts/util/build_all_images.sh` and `.ci/scripts/util/test_all_images.sh` scripts to include the new build/test scripts for CAL-DISP
- Create `docs/opera.pge.cal_disp.rst`
  - Boilerplate content for docs generation. Refer to disp-ni.
- Add `opera.pge.cal_disp` to the toctree in `docs/opera.pge.rst`
- Set `__version__` in `src/opera/_package.py` to `7.0.0-er.1.0`
  - Add `PGE_VERSION = "6.0.0-rc.4.0"` to `DistS1Executor` in `src/opera/pge/dist_s1/dist_s1_pge.py` to ensure its version is preserved (also add the docstring: `"""Version of the PGE (overrides default from base_pge)"""`)
- Run the unit test suite to ensure all tests pass
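The stub module described above can be sketched roughly as follows. The base-class names (`PreProcessorMixin`, `PostProcessorMixin`, `PgeExecutor`) are assumptions based on the pattern the other PGEs follow against `base_pge.py`; minimal stand-ins are defined inline here so the sketch runs on its own.

```python
# Sketch of src/opera/pge/cal_disp/cal_disp_pge.py.
# The three stand-in base classes below only mimic what base_pge.py provides;
# in the real module they would be imported from the base PGE module instead.

class PreProcessorMixin:
    """Stand-in for the base pre-processor mixin."""
    _pre_mixin_name = "PreProcessorMixin"

class PostProcessorMixin:
    """Stand-in for the base post-processor mixin."""
    _post_mixin_name = "PostProcessorMixin"

class PgeExecutor(PreProcessorMixin, PostProcessorMixin):
    """Stand-in for the base PGE executor."""
    NAME = "DEFAULT"

class CalDispPreProcessorMixin(PreProcessorMixin):
    """Pre-processor mixin for CAL-DISP; inherits base behavior unchanged."""
    _pre_mixin_name = "CalDispPreProcessorMixin"

class CalDispPostProcessorMixin(PostProcessorMixin):
    """Post-processor mixin for CAL-DISP; inherits base behavior unchanged."""
    _post_mixin_name = "CalDispPostProcessorMixin"

class CalDispExecutor(CalDispPreProcessorMixin, CalDispPostProcessorMixin, PgeExecutor):
    """Barebones executor for the CAL-DISP PGE."""
    NAME = "CAL-DISP"
```

The barebones `test_cal_disp_pge_execution` test would then instantiate `CalDispExecutor` and drive a no-op run, mirroring the disp-ni test suite.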
## Pre- and post-processing validations

For now, basic input validations should be used: file existence/count, nonzero size, valid extensions, etc. The same checks apply to the output files, with the addition of validating the output naming convention.
- Implement or reuse validations for DISP inputs
- Implement or reuse validations for ancillary inputs
- Implement or reuse validations for algorithm parameters
- Implement output validation
- Add unit test cases for input and output validations
- Rerun the unit test suite to ensure all tests pass
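A minimal sketch of the basic input checks described above (existence, nonzero size, allowed extension). The function name and the accepted-extension tuple are illustrative assumptions, not the actual PGE validation API.

```python
from pathlib import Path

def validate_input_file(path_str, valid_extensions=(".h5", ".tif", ".json")):
    """Basic input validation: existence, nonzero size, allowed extension.

    Returns a list of error strings; an empty list means the file passed.
    The accepted-extension tuple here is illustrative only.
    """
    errors = []
    path = Path(path_str)
    if not path.is_file():
        errors.append(f"Input file does not exist: {path}")
        return errors  # remaining checks require the file to be on disk
    if path.stat().st_size == 0:
        errors.append(f"Input file is empty (zero size): {path}")
    if path.suffix.lower() not in valid_extensions:
        errors.append(f"Unexpected extension '{path.suffix}' for input: {path}")
    return errors

# Usage with a throwaway file on disk:
import os, tempfile
with tempfile.NamedTemporaryFile(suffix=".h5", delete=False) as tmp:
    tmp.write(b"dummy payload")
result = validate_input_file(tmp.name)
os.unlink(tmp.name)
```

Output validation would reuse the same checks and additionally match the file name against the expected naming convention.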
## PCM/archival metadata generation

- Implement core and ancillary file naming functions
- Implement the test dataset creation function
- Create the ISO XML jinja2 template
- Complete the measured parameters config for CAL-DISP. The sample products provided with the interface delivery appear to have a fair amount of metadata already; correlate this with anything present in the product spec.
  - `opera.util.h5_utils.get_hd5_group_as_dict()` can be used to extract all available product metadata from an output product in HDF5 format
  - The descriptions for each product metadata field should be sourced from the latest Product Specification document for the CAL-DISP SAS
- Implement the CAL-DISP-specific version of `_create_custom_metadata` to ensure the catalog metadata is populated correctly for CAL-DISP
- Ensure the proper file name conventions are applied for both the ISO XML and Catalog Metadata files
- Update the unit test suite to ensure both the ISO XML and Catalog Metadata are written to disk, populated correctly (no missing ISO fields), and have the correct file names applied
- Rerun the unit test suite to ensure all tests pass
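One of the items above is the core file naming function; a hypothetical sketch follows. The function name, field set, and ordering are all placeholders loosely modeled on other OPERA product names — the authoritative convention must come from the CAL-DISP Product Specification.

```python
from datetime import datetime, timezone

def _hypothetical_cal_disp_core_filename(frame_id, sensing_start,
                                         production_time, version):
    """Hypothetical CAL-DISP core product filename builder.

    All fields and their ordering are placeholders; the real convention
    must be taken from the CAL-DISP Product Specification document.
    """
    return (
        f"OPERA_L3_CAL-DISP_{frame_id}_"
        f"{sensing_start:%Y%m%dT%H%M%SZ}_"
        f"{production_time:%Y%m%dT%H%M%SZ}_"
        f"v{version}.h5"
    )

# Example invocation with made-up inputs:
name = _hypothetical_cal_disp_core_filename(
    frame_id="F12345",
    sensing_start=datetime(2026, 1, 20, tzinfo=timezone.utc),
    production_time=datetime(2026, 2, 1, 12, 30, tzinfo=timezone.utc),
    version="7.0.0-er.1.0",
)
```

The ancillary (ISO XML, catalog metadata) names would typically be derived from the same core name with different extensions, which is why both naming functions belong together.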
## Integration testing framework

- Create the expected input and output archives used with the integration test from the samples provided by ADT
  - This involves splitting out the input data (including ancillaries) and the expected outputs from the "golden" data on Artifactory into separate .zip files for each, then uploading the new archives to S3
- Create `.ci/scripts/cal_disp/opera_pge_cal_disp_r1.0_interface_runconfig.yaml` and `.ci/scripts/cal_disp/opera_pge_cal_disp_r1.0_interface_algorithm_parameters.yaml` based on the config files delivered with the SAS
- Create the test driver and product comparison scripts
- Add `cal_disp` to the int and deploy Jenkins pipelines
- Rerun the unit test suite to ensure all tests pass
- Rerun the CAL-DISP integration test to ensure it passes using the new expected assets for the latest delivery
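The product comparison script could start as a simple checksum walk over the expected vs. actual output trees. This is a byte-level sketch only; a real comparison of HDF5 products would need dataset-level, tolerance-aware checks, and the function names here are illustrative.

```python
import hashlib
from pathlib import Path

def checksum_tree(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def compare_trees(expected_dir, actual_dir):
    """Return sorted lists of (missing, extra, differing) files between trees.

    "missing" = expected but not produced, "extra" = produced but not
    expected, "differing" = present in both but with different contents.
    """
    expected = checksum_tree(expected_dir)
    actual = checksum_tree(actual_dir)
    missing = sorted(set(expected) - set(actual))
    extra = sorted(set(actual) - set(expected))
    differing = sorted(
        name for name in set(expected) & set(actual)
        if expected[name] != actual[name]
    )
    return missing, extra, differing
```

An integration test driver would unzip the expected-output archive from S3, run the PGE container on the input archive, and fail the Jenkins stage if any of the three lists is non-empty.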