HI-SCALE August 30 case version 2

File locations

  • WRF output (HPSS storage): /nersc/projects/m3314/ksa/wrf/runs/HI-SCALE/0830/JF_sensitivity5/

  • Post-processed time series files (CFS space):
    /global/cfs/cdirs/m3314/ksa/simulations/0830/JF_sensitivity5/tsfiles_nc/

WRF outputs have been moved to the HPSS tape archive. Please use the htar command to retrieve them into your own scratch space, e.g.:

%htar -xf /nersc/projects/m3314/ksa/wrf/runs/HI-SCALE/0830/JF_sensitivity5/JF_sensitivity5_d02_2016-08-30_14_history.tar

Each tar file contains WRF output files from a one-hour simulation period. For d02 after 14 UTC (one output file per minute), the total size is 52 GB/file x 60 files ≈ 3 TB per simulation hour.
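Before extracting, you can preview a tar file's contents: htar's standard -t (list) option reads the archive index without retrieving the data. The file name below is the example from above:

%cd $SCRATCH
%htar -tf /nersc/projects/m3314/ksa/wrf/runs/HI-SCALE/0830/JF_sensitivity5/JF_sensitivity5_d02_2016-08-30_14_history.tar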


Model configuration

  • TBA

Time series output

Profile output

Seven profile variables (U, V, W, TH, QV, PH, and PR) are collected into a single netCDF file whose name starts with "tsprofile_". Each file contains one hour of time samples for one site; for the nested d02 domain, the 0.5-second output frequency yields 7200 time samples per file.

An example file header looks like:

tsprofile_JF_sensitivity5_d02_C1_2016-08-30_1600-1700Z.nc

  • dimensions:
    • time = UNLIMITED ; // (7200 currently)
    • znw = 305 ;
  • variables:
    • double UU(time, znw) ;
      • UU:_FillValue = NaN ;
      • UU:long_name = "x-wind component" ;
      • UU:units = "m s-1" ;
    • double VV(time, znw) ;
      • VV:_FillValue = NaN ;
      • VV:long_name = "y-wind component" ;
      • VV:units = "m s-1" ;
    • double WW(time, znw) ;
      • WW:_FillValue = NaN ;
      • WW:long_name = "z-wind component" ;
      • WW:units = "m s-1" ;
    • double TH(time, znw) ;
      • TH:_FillValue = NaN ;
      • TH:long_name = "potential temperature" ;
      • TH:units = "K" ;
    • double QV(time, znw) ;
      • QV:_FillValue = NaN ;
      • QV:long_name = "Water vapor mixing ratio" ;
      • QV:units = "kg kg-1" ;
    • double PH(time, znw) ;
      • PH:_FillValue = NaN ;
      • PH:long_name = "geopotential height" ;
      • PH:units = "m" ;
    • double PR(time, znw) ;
      • PR:_FillValue = NaN ;
      • PR:long_name = "pressure" ;
      • PR:units = "Pa" ;
    • int time(time) ;
      • time:units = "milliseconds since 2016-08-30 16:00:00" ;
      • time:calendar = "proleptic_gregorian" ;
    • float znw(znw) ;
      • znw:_FillValue = NaNf ;
      • znw:long_name = "model eta level" ;
      • znw:units = "unitless" ;
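The header above is what ncdump prints. Once nco and the netCDF tools are available (see the Tips section below), standard options can reproduce it or pull out a single variable: -h (header only) for ncdump and -v (extract named variable) for ncks are stock options, and the output file name here is just an arbitrary choice:

%ncdump -h tsprofile_JF_sensitivity5_d02_C1_2016-08-30_1600-1700Z.nc
%ncks -v WW tsprofile_JF_sensitivity5_d02_C1_2016-08-30_1600-1700Z.nc WW_C1_1600-1700Z.nc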

[Figure: sample time/height plot of vertical velocity (W) at the C1 and C1a virtual sites, 1400-2200 UTC]
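For a quick interactive look at such a field, ncview (loading instructions in the Tips section below) plots a 2-D variable such as WW(time, znw) directly:

%ncview tsprofile_JF_sensitivity5_d02_C1_2016-08-30_1600-1700Z.nc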

Surface/single-level/integrated output

All other single-level variables (49 of them with the solar-diag option) are collected into a single file whose name starts with "tspts_", e.g., tspts_JF_sensitivity5_d02_C1_2016-08-30_1400-1500Z.nc. Please see the table below for details on the output variables (TBA).
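Until the table is filled in, the variable list can be read from a file itself; ncks's standard -m (print metadata) option lists every variable with its attributes:

%ncks -m tspts_JF_sensitivity5_d02_C1_2016-08-30_1400-1500Z.nc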


TS variable table


Site locations

Name                 Prefix  Lat (°N)  Lon (°E)
Central_Facility     C1      36.605    -97.485
Ashton               E9      37.133    -97.266
Central_Facility_EC  E14     36.607    -97.488
Medford              E32     36.819    -97.820
Newkirk              E33     36.926    -97.082
Marshall             E36     36.117    -97.511
Waukomis             E37     36.311    -97.928
Morrison             E39     36.374    -97.069
Peckham              E41     36.880    -97.086
AWAKEN_A1            A1      36.357    -97.403
AWAKEN_A2            A2      36.318    -97.410
CF_array_a           C1a     36.577    -97.485
CF_array_b           C1b     36.591    -97.485
CF_array_c           C1c     36.618    -97.484
CF_array_d           C1d     36.632    -97.484
CF_array_e           C1e     36.605    -97.519
CF_array_f           C1f     36.605    -97.502
CF_array_g           C1g     36.604    -97.467
CF_array_h           C1h     36.604    -97.450

WRF output

  • Frequency: d01 - every 15 minutes; d02 - every 15 minutes for 1200-1400 UTC (0600-0800 CST) and every minute afterward

Tips

Perlmutter environment

Unlike on Cori, nco and ncview are not available as modules on Perlmutter. There are two ways to put these programs on your path:

  • Use the nco and ncview installed as part of the E4S software stack
module load cpu
module load spack
module load e4s
spack env activate gcc
module load cudatoolkit/11.5 # avoids an NCO link error after Perlmutter's December maintenance
spack load nco
spack load ncview

This whole sequence takes a couple of minutes.
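A quick check that both programs actually landed on your path (plain which, nothing stack-specific):

%which ncks ncview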

  • Use E3SM's unified environment
source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_pm-cpu.sh