WRF-CHEM w/kpp && WRF-CHEMDA v4.6.0 - Intel LLVM Compilers #2065
@islas is there a potential fix in the develop branch I can test for this issue?
Any updates?
You can try integrating #2070, #2069, and #2068 to reduce the memory usage of the peak files. There is still the limitation that the total memory used is the same: if a single file is broken into 12 files but all of them are compiled in parallel, the overall memory demand stays roughly the same.
How would I merge different PRs? I'm not very familiar with GitHub commands.
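In case it helps, one common way to test open PRs locally is to fetch each one from GitHub into its own branch and then merge them into a working branch. A minimal sketch, assuming a local clone of the WRF repository with the default remote named origin and the PR numbers mentioned above:

    cd WRF                                    # local clone of the WRF repository
    git fetch origin pull/2070/head:pr-2070   # fetch each PR into a local branch
    git fetch origin pull/2069/head:pr-2069
    git fetch origin pull/2068/head:pr-2068
    git checkout -b test-memory-prs           # working branch to combine them on
    git merge pr-2070
    git merge pr-2069
    git merge pr-2068                         # resolve any conflicts if they arise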
Is there any reason why GNU doesn't have this issue but Intel LLVM does?
I ran some tests today. It looks like, for the module that causes all the problems, the maximum number of parallel jobs that can be run is -j 2 with less than 32 GB of RAM, or -j 4 with more than 32 GB. Would it be possible to hard-code a limit of two parallel jobs for that module?
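Until a build-system change is in place, one workaround is simply to cap the number of parallel jobs passed to ./compile (the same -j option used in the tests described below). A minimal sketch, assuming the em_real case:

    cd WRF
    ./compile -j 2 em_real >& compile_j2.log   # limit the build to 2 parallel make jobs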
With the release of WRF v4.6.0, I have conducted a series of evaluations on the Weather Research and Forecasting Model Chemistry (WRF-Chem) version 4.6.0, using the Intel LLVM Compilers and associated libraries. The purpose of these tests was to assess the performance and compatibility of WRF-CHEM w/kpp && WRF-CHEMDA v4.6.0 under the specified computational environment. Given the diverse nature of the issues encountered, I will document each one separately in subsequent communications so that every topic can be discussed in detail. Attached to each of these communications is a zip file containing all relevant logs.
System Specifications:
Operating System: Ubuntu 22.04.4
Memory: 64 GB DDR5 RAM
Processor: Intel Core i9-13900K
Storage: 1TB SSD
Compiler and Library Environment:
Intel LLVM Compilers, Version 2024.1.0:
ifx (Fortran compiler)
icx (C compiler)
icpx (DPC++/C++ compiler)
mpiifx, mpiicx, mpiicpx (Intel MPI wrappers for respective compilers)
Libraries:
HDF5 v1.14.4.2
PHDF5 v1.14.4.2
ZLIB v1.3.1
Libpng v1.6.39
Jasper v1.900.1
Pnetcdf v1.13.0
Netcdf-c v4.9.2
Netcdf-fortran v4.6.1
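For reference, the compilers and libraries above are typically exposed to the WRF and library builds through environment variables before running ./configure. A rough sketch, assuming bash and placeholder install prefixes under /opt (adjust to the actual install locations):

    export CC=icx CXX=icpx FC=ifx      # Intel LLVM compilers
    export NETCDF=/opt/netcdf          # prefix containing both netcdf-c and netcdf-fortran
    export HDF5=/opt/hdf5
    export PHDF5=/opt/hdf5
    export PNETCDF=/opt/pnetcdf
    export JASPERLIB=/opt/jasper/lib   # GRIB2 support
    export JASPERINC=/opt/jasper/include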
Preliminary Observations:
The testing required additional compiler flags to address compatibility issues with the Jasper and Libpng libraries, which appear to stem from the compilers being newer than the versions recommended by NCAR. These flags are (see the sketch after this list for one way to apply them):
-Wno-implicit-function-declaration
-Wno-incompatible-function-pointer-types
-Wno-unused-command-line-argument
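As a sketch of how such flags can be passed when building the affected libraries (assuming a standard configure-based build of Jasper v1.900.1; libpng is handled the same way, and the install prefix is a placeholder):

    export CFLAGS="-Wno-implicit-function-declaration -Wno-incompatible-function-pointer-types -Wno-unused-command-line-argument"
    cd jasper-1.900.1
    ./configure --prefix=/opt/jasper   # placeholder install prefix
    make
    make install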
Methodology:
Tests were conducted in both single-threaded and multi-threaded modes to evaluate performance scalability. The procedure involved:
Fresh installation of the specified packages and configuration in a new terminal.
Compilation of the WRF model using the command ./compile with -j 1 for the single-threaded case and -j 16 (half of the available CPU threads) for the multi-threaded case, as sketched after this list.
Collection, archiving, and zipping of all log files and configurations post-testing.
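For concreteness, each test followed roughly these steps (a sketch, assuming bash, the em_real case, and that the Intel LLVM ifx/icx option is selected from the ./configure menu):

    export WRF_CHEM=1                            # enable WRF-Chem
    export WRF_KPP=1                             # enable the KPP chemistry preprocessor
    cd WRF
    ./clean -a                                   # start each test from a clean tree
    ./configure                                  # choose the ifx/icx (dmpar) option when prompted
    ./compile -j 1 em_real >& compile_j1.log     # single-threaded build
    ./compile -j 16 em_real >& compile_j16.log   # multi-threaded build (half the CPU threads)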
Conclusion:
WRF-CHEM
Single Threaded
The tests conducted in the single-threaded configuration for WRF-CHEM were successful.
Multi Threaded
The test conducted in the multi-threaded configuration for WRF-CHEM was not successful; it caused extreme memory usage that froze my computer. My suspicion is that the memory allocated for each module is being requested multiple times (once per parallel job), which causes the extreme memory usage, but I am uncertain whether this is how the WRF-CHEM build works.
WRF-CHEMDA
Single Threaded
The test conducted in the single-threaded configuration for WRF-CHEMDA was successful. Note that WRF-CHEMDA has to be installed before WRF-CHEM.
Multi Threaded
The test conducted in the multi-threaded configuration for WRF-CHEMDA was successful. Note that WRF-CHEMDA has to be installed before WRF-CHEM.
wrfchem_single_thread.zip
wrfchem_multi_thread_fail.zip
WRF_chem_da_single_thread.zip
wrfchem_da_multi_thread.zip