Hey, thanks for reporting this error. That is indeed suboptimal. It sounds like you have a workaround for now. I added this to our internal TODO list but can't promise we will push a fix in the immediate future.
Issue Description
If the forecast dataset exceeds available RAM, this function consistently raises out-of-memory errors. I've narrowed the failure down to the line `ds = xr.merge(das)`.
I am running with the forecast split into dask array chunks. With the same dataset and the same chunks, the evaluation runs fine if I skip this function and rename the zarr variables manually.
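The manual workaround described above can be sketched with `xarray.Dataset.rename_vars`, which relabels variables without building per-variable `DataArray`s and re-combining them through `xr.merge` (the step reported to exhaust memory). This is a minimal illustration, not the reporter's actual code: the variable names and the small in-memory arrays are made up, and on a real forecast the dataset would be opened lazily from zarr with dask chunks.

```python
import numpy as np
import xarray as xr

# Toy stand-in for a forecast dataset; variable names are illustrative.
ds = xr.Dataset(
    {
        "t2m": (("time", "lat"), np.zeros((4, 3))),
        "msl": (("time", "lat"), np.ones((4, 3))),
    }
)

# Pattern reported to OOM on large datasets: split into DataArrays,
# rename each one, then re-combine with xr.merge:
#   das = [ds[old].rename(new) for old, new in mapping.items()]
#   ds = xr.merge(das)
#
# Lazy alternative: rename variables in place. rename_vars only relabels
# the dataset's variables; it does not copy or re-align the underlying
# (possibly dask-backed) arrays.
mapping = {"t2m": "2m_temperature", "msl": "mean_sea_level_pressure"}
renamed = ds.rename_vars(mapping)
print(sorted(renamed.data_vars))
```

Because `rename_vars` leaves the underlying arrays untouched, it stays lazy for dask-backed zarr stores, which matches the observation that the manual-rename path evaluates fine with the same chunks.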
Execution
Conda environment