Release Notes: glwu v1.1.1 - released to NCO on December 13, 2021

Transition from WCOSS Cray to WCOSS2 (update for run-time optimization, and name change from 'wave_glwu' to 'glwu').

Where is the release tag on subversion/git/vlab?

https://github.com/NOAA-EMC/glwu.git
Tag: IT-glwu.v1.1.1-20211213

List of external software used (anything outside of your vertical structure), including compilers and version numbers for everything. Software used must be a minimal list of modules/versions specified per job:

jwave_glwu_prep:
  envvar/1.0
  prod_envir/2.0.5
  prod_util/2.0.8
  PrgEnv-intel/8.1.0
  craype/2.7.8
  intel/19.1.3.304
  cray-mpich/8.1.7
  cray-pals/1.0.12
  cfp/2.0.4
  wgrib2/2.0.8
  hdf5/1.10.6

jwave_glwu_forecast:
  envvar/1.0
  prod_envir/2.0.5
  prod_util/2.0.8
  PrgEnv-intel/8.1.0
  craype/2.7.8
  intel/19.1.3.304
  cray-mpich/8.1.7
  cray-pals/1.0.12
  hdf5/1.10.6
  netcdf/4.7.4
  jasper/2.0.25
  libpng/1.6.37
  zlib/1.2.11
  g2/3.4.5
  w3nco/2.4.1
  bacio/2.4.1

jwave_glwu_post:
  envvar/1.0
  prod_envir/2.0.5
  prod_util/2.0.8
  PrgEnv-intel/8.1.0
  craype/2.7.8
  intel/19.1.3.304
  cfp/2.0.4
  wgrib2/2.0.8
  grib_util/1.2.2
  libjpeg/9c
  hdf5/1.10.6
  netcdf/4.7.4

jwave_glwu_pgen:
  envvar/1.0
  prod_envir/2.0.5
  prod_util/2.0.8
  PrgEnv-intel/8.1.0
  craype/2.7.8
  intel/19.1.3.304
  libjpeg/9c
  libpng/1.6.37
  grib_util/1.2.2
  wgrib2/2.0.8
  util_shared/1.4.0

List of all code/scripts modified with this release (relative to v1.1.0). NOTE: Many files have been renamed, as indicated with an ->:

versions/build.ver
versions/run.ver
modulefiles/build_glwu.module
modulefiles/build_wavewatch3.modules
sorc/multiwavefcst.fd/makefile
sorc/multiwavefcst.fd/w3triamd.F90
def/prod00.def
def/prod06.def
def/prod12.def
def/prod18.def
def/wcoss_defs/prod00.out
def/wcoss_defs/prod06.out
def/wcoss_defs/prod12.out
def/wcoss_defs/prod18.out
ecf/jwave_glwu_prep.ecf -> ecf/jglwu_prep.ecf
ecf/jwave_glwu_forecast.ecf -> ecf/jglwu_forecast.ecf
ecf/jwave_glwu_post.ecf -> ecf/jglwu_post.ecf
ecf/jwave_glwu_pgen.ecf -> ecf/jglwu_pgen.ecf
jobs/JWAVE_GLWU_PREP -> jobs/JGLWU_PREP
jobs/JWAVE_GLWU_FORECAST -> jobs/JGLWU_FORECAST
jobs/JWAVE_GLWU_POST -> jobs/JGLWU_POST
jobs/JWAVE_GLWU_PGEN -> jobs/JGLWU_PGEN
scripts/exwave_glwu_prep.sh -> scripts/exglwu_prep.sh
scripts/exwave_glwu_prep.sh.lookback_24h -> scripts/exglwu_prep.sh.lookback_24h
scripts/exwave_glwu_forecast.sh -> scripts/exglwu_forecast.sh
scripts/exwave_glwu_post.sh -> scripts/exglwu_post.sh
scripts/exwave_glwu_pgen.sh -> scripts/exglwu_pgen.sh
ush/multiwavegrib2.sh
ush/multiwavegrid_interp.sh
ush/multiwavespec2.sh
ush/multiwavespec_bull.sh
ush/multiwavespec_ts.sh
ush/wave_fldn.sh
ush/waveice_glw.sh
ush/wavemod_def.sh
ush/wavendfd_glwu_inc.sh
ush/wavendfd_glwu.sh

For development only (please ignore):

dev/lsf_scripts/glwu.cron
dev/lsf_scripts/glwu.boss.lsf
dev/lsf_scripts/JWAVE_GLWU_PREP.lsf -> dev/lsf_scripts/JGLWU_PREP.lsf
dev/lsf_scripts/JWAVE_GLWU_FORECAST.lsf -> dev/lsf_scripts/JGLWU_FORECAST.lsf
dev/lsf_scripts/JWAVE_GLWU_POST.lsf -> dev/lsf_scripts/JGLWU_POST.lsf
dev/lsf_scripts/JWAVE_GLWU_PGEN.lsf -> dev/lsf_scripts/JGLWU_PGEN.lsf

What changes were made to the above code/scripts to support the new architecture?

Relative to v1.1.0, the forecast job was sped up by allocating more nodes, cores, and RAM. This was done after code profiling by GDIT showed a memory bottleneck:

Old:
  #PBS -l select=6:ncpus=128:mem=150GB
  export mpicmd='mpiexec -n 768 -ppn 128'

New:
  #PBS -l select=12:ncpus=128:mem=500GB
  export mpicmd='mpiexec -n 768 -ppn 64 --depth 2 --cpu-bind depth'
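To make the new layout concrete, the annotated sketch below restates the new directives with our reading of what they do; the comments and the launch line (executable name inferred from sorc/multiwavefcst.fd) are illustrative assumptions, not text from the release tag:

  # 12 nodes x 128 cpus, but only 64 MPI ranks per node (768 ranks total):
  # '--depth 2 --cpu-bind depth' binds each rank to its own 2-cpu slot, and
  # halving the ranks per node while raising the per-node memory request
  # from 150GB to 500GB gives each rank far more memory headroom,
  # addressing the profiled bottleneck.
  #PBS -l select=12:ncpus=128:mem=500GB
  export mpicmd='mpiexec -n 768 -ppn 64 --depth 2 --cpu-bind depth'
  $mpicmd ./multiwavefcst   # hypothetical launch line for the forecast executable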
In addition, many scripts and script names were updated to reflect the change in application name from 'wave_glwu' to 'glwu'. This includes changes to environment variables such as:

NET=wave -> NET=glwu
wave_glwu -> glwu
wave_glwu_ver -> glwu_ver
HOMEwave -> HOMEglwu
HOMEwave_glwu -> HOMEglwu
FIXwave -> FIXglwu
PARMwave -> PARMglwu
USHwave -> USHglwu
EXECwave -> EXECglwu
EXECcode -> EXECglwu

Were any other changes made that aren't directly related to the transition?

No.

Are there any changes to incoming data needs or outgoing products?

No.

If output filenames are changing, list the current and new filenames:

N/A

Compute resource information, for every job:

*** Providing PBS and/or ecFlow submission scripts as part of the release tag is preferred; if they are provided, then resource information is not needed in the release notes. See glwu/ecf/.

Runtime changes compared to current production (see /gpfs/dell1/nco/ops/com/logs/runtime/daily/* for current stats):

Long cycles (01z, 07z, 13z, 19z):
  jwave_glwu_prep:     00:02:50 (prod: 00:00:21)
  jwave_glwu_forecast: 00:22:23 (prod: 00:24:34)
  jwave_glwu_post:     00:19:44 (prod: 00:15:09)
  jwave_glwu_pgen:     00:00:10 (prod: 00:00:27)

Short cycles (remaining hours: 00z, 02z, 03z, 04z, 05z, 06z, 08z, 09z, 10z, 11z, 12z, etc.):
  jwave_glwu_prep:     00:02:30 (prod: 00:00:20)
  jwave_glwu_forecast: 00:08:42 (prod: 00:08:31)
  jwave_glwu_post:     00:11:12 (prod: 00:09:31)
  jwave_glwu_pgen:     00:00:11 (prod: 00:00:11)

Disk space required per day or per cycle; data retention on disk will remain the same unless otherwise requested:

Unchanged at 118G/day.

Dissemination will remain the same unless otherwise communicated to NCO and a PNS/SCN issued:

Unchanged.

HPSS archive retention will remain the same unless approval granted by HPCRAC:

Unchanged.

What are your upstream and downstream dependencies?

Unchanged. Upstream: NDFD winds and NIC sea ice analysis from DCOM. Downstream: None.

Provide a pointer to your COMOUT directory that was used during testing:

/lfs/h2/emc/couple/noscrub/Andre.VanderWesthuysen/GLWU/com/glwu/v1.1.1/glwu.20211003

Canned data from WCOSS1 Cray for comparison:

/lfs/h2/emc/couple/noscrub/Andre.VanderWesthuysen/GLWU/COM_wcoss1_prod/glwu.20211003
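As a hedged illustration of how the test COMOUT might be checked against the canned WCOSS1 data, the snippet below runs a recursive directory comparison; the two paths are the directories listed above, but the choice of a plain 'diff -qr' (rather than NCO's own comparison tooling or per-field checks with wgrib2 or nccmp) is our assumption:

  # flag files that differ, or that exist in only one of the two trees
  TESTCOM=/lfs/h2/emc/couple/noscrub/Andre.VanderWesthuysen/GLWU/com/glwu/v1.1.1/glwu.20211003
  PRODCOM=/lfs/h2/emc/couple/noscrub/Andre.VanderWesthuysen/GLWU/COM_wcoss1_prod/glwu.20211003
  diff -qr "$TESTCOM" "$PRODCOM"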