
Conversation

@VeckoTheGecko
Contributor

@erikvansebille flagged to me that integration tests weren't actually being run in CI.
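
For reference, a minimal sketch of the commands such a CI job could run, assuming the pixi environments and tasks used in this repo (only `test-notebooks-latest` / `test-notebooks` appear in this thread; the names for the plain integration-test task are assumptions):

```sh
# Sketch only: invoke the pixi test tasks from the CI workflow.
# "test-latest" / "test-integration" are assumed names; adjust to the
# actual tasks defined in the project's pixi configuration.
pixi run -e test-latest test-integration
pixi run -e test-notebooks-latest test-notebooks
```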

Also, I realised that the notebook integration tests aren't working: `pixi run -e test-notebooks-latest test-notebooks` gives the following error. @reint-fischer, would you mind looking into this if you have the time?

=================================================================================================== FAILURES ====================================================================================================
________________________________________________________________________________ docs/examples/tutorial_Argofloats.ipynb::Cell 1 ________________________________________________________________________________
Notebook cell execution failed
Cell 1: Timeout of 5 seconds exceeded waiting for output.

Input:
from datetime import timedelta

import xarray as xr

import parcels

# Load the CopernicusMarine data in the Agulhas region from the example_datasets
example_dataset_folder = parcels.download_example_dataset(
    "CopernicusMarine_data_for_Argo_tutorial"
)

ds = xr.open_mfdataset(f"{example_dataset_folder}/*.nc", combine="by_coords")

# TODO check how we can get good performance without loading full dataset in memory
ds.load()  # load the dataset into memory

fieldset = parcels.FieldSet.from_copernicusmarine(ds)
fieldset.add_constant("mindepth", 1.0)

# Define a new Particle type including extra Variables
ArgoParticle = parcels.Particle.add_variable(
    [
        parcels.Variable("cycle_phase", dtype=np.int32, initial=0.0),
        parcels.Variable(
            "cycle_age", dtype=np.float32, initial=0.0
        ),  # TODO update to "timedelta64[s]"
        parcels.Variable("drift_age", dtype=np.float32, initial=0.0),
        parcels.Variable("temp", dtype=np.float32, initial=np.nan),
    ]
)

# Initiate one Argo float in the Agulhas Current
pset = parcels.ParticleSet(
    fieldset=fieldset,
    pclass=ArgoParticle,
    lon=[32],
    lat=[-31],
    z=[fieldset.mindepth],
)

# combine Argo vertical movement kernel with built-in Advection kernel
kernels = [
    ArgoPhase1,
    ArgoPhase2,
    ArgoPhase3,
    ArgoPhase4,
    ArgoPhase5,
    ArgoPhase6,
    parcels.kernels.AdvectionRK4,
]

# Create a ParticleFile object to store the output
output_file = parcels.ParticleFile(
    store="argo_float.zarr",
    outputdt=timedelta(minutes=15),
    chunks=(1, 500),  # setting to write in chunks of 500 observations
)

# Now execute the kernels for 30 days, saving data every 30 minutes
pset.execute(
    kernels,
    runtime=timedelta(days=30),
    dt=timedelta(minutes=15),
    output_file=output_file,
)

============================================================================================ short test summary info ============================================================================================
FAILED docs/examples/tutorial_Argofloats.ipynb::Cell 1
=================================================================================== 1 failed, 18 passed, 3 xfailed in 56.27s ====================================================================================
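
For anyone reproducing this: the failing cell downloads the example dataset and runs a 30-day `pset.execute`, so it may simply be slow rather than broken. One thing to try, assuming the pixi task wraps pytest with the nbval plugin, is raising nbval's per-cell timeout; note that the hardcoded 5-second wait for output discussed in the nbval issue linked below may not respect this flag:

```sh
# Assumption: the notebook suite is pytest + nbval under the hood.
# --nbval-cell-timeout sets the per-cell execution limit in seconds;
# whether it also covers the 5 s "waiting for output" case is unclear.
pytest --nbval docs/examples/tutorial_Argofloats.ipynb --nbval-cell-timeout 600
```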

@VeckoTheGecko
Contributor Author

> `pixi run -e test-notebooks-latest test-notebooks` gives the following error. @reint-fischer, would you mind looking into this if you have the time?

Note this comment (and this PR) might not be necessary depending on the outcome of #2335

@VeckoTheGecko added the documentation, v4, and CI/CD labels on Nov 4, 2025
@github-project-automation bot moved this to Backlog in Parcels v4 on Nov 4, 2025
@reint-fischer
Contributor

I’m not sure I fully understand what’s going on yet, but I am pinging computationalmodelling/nbval#151 here for our reference.

