Testing Code
For instructions on how to build Omega and run CTest unit tests, see Building and testing Omega.
Running the Polaris omega_pr Test Suite
The CTest unit tests on their own are not sufficient to ensure that changes to Omega behave as expected. Developers also need to run the omega_pr regression test suite from Polaris. The tests in the suite are defined in omega_pr.txt in the Polaris repository.
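To see which tests the suite contains on your machine, you can list the available Polaris tasks and filter for Omega (a quick sketch; the grep filter is just one convenient way to narrow the output):

# with the Polaris environment active
polaris list | grep -i omega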
You will run the suite twice: once using the Omega version vendored in Polaris
under e3sm_submodules/Omega (the official baseline) and once using your
development branch. This means you will need to build Omega from both sources
before running the suite. You may wish to use the Polaris CTest Utility to
automate building and running the CTests for each.
Note
While the Omega develop branch often matches the Polaris submodule
bit-for-bit (b4b), the submodule at e3sm_submodules/Omega is the
authoritative baseline for regression comparison.
Prerequisites: build Omega for baseline and PR
Before setting up and running the omega_pr suite, you must build Omega twice:
1. Build the baseline from the Omega sources vendored by Polaris at
   <path-to-polaris-repo>/e3sm_submodules/Omega. Record the CMake build
   directory path as BASELINE_BUILD (used below with polaris suite -p).
2. Build from your development (PR) branch. Record the CMake build directory
   path as PR_BUILD (also used with -p).
The -p option passed to polaris suite should point to the CMake build
directory for the corresponding branch (the directory where you ran CMake and
invoked ./omega_build.sh).
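As a rough sketch, the two builds might look like the following (the CMake
options are machine- and compiler-specific and are omitted here, and the
components/omega layout is an assumption; see the Omega build documentation
for the real configure step):

# baseline build from the sources vendored in Polaris
cd <path-to-polaris-repo>/e3sm_submodules/Omega/components/omega
mkdir -p build && cd build
cmake ..                      # plus your usual machine/compiler options
./omega_build.sh
BASELINE_BUILD=$(pwd)

# PR build from your development branch
cd <path-to-omega-repo>/components/omega
mkdir -p build && cd build
cmake ..                      # use the same options as the baseline build
./omega_build.sh
PR_BUILD=$(pwd)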
Note
Run both the baseline and PR suites on the same machine, CPU/GPU partition, compiler, and Omega build type (Debug/Release) to avoid false-positive or false-negative diffs.
Setting up the suite
If you have not yet set up Polaris, you will need to follow its quick start, including cloning the repo, configuring the software environment, and activating the environment.
Then, with the Polaris environment active, you can set up the baseline
omega_pr suite as follows:
BASELINE_BUILD=/path/to/build/from/polaris/e3sm_submodules/Omega
POLARIS_SCRATCH=/path/to/scratch/polaris_testing/
polaris suite -c ocean -t omega_pr --model omega \
-w $POLARIS_SCRATCH/baseline_omega_pr \
-p $BASELINE_BUILD
where $POLARIS_SCRATCH is a directory within your scratch space that is
useful for setting up and running the Polaris test suites and
$BASELINE_BUILD is the directory where you built Omega from the Omega
sources located at <polaris-repo>/e3sm_submodules/Omega, or where the
Polaris CTest utility built it for you (e.g. build_omega/build_chrysalis_intel).
Next, you can set up the omega_pr suite for your development branch, using the
baseline set up above:
PR_BUILD=/path/to/omega/pr/branch/build
POLARIS_SCRATCH=/path/to/scratch/polaris_testing/
polaris suite -c ocean -t omega_pr --model omega \
-b $POLARIS_SCRATCH/baseline_omega_pr \
-w $POLARIS_SCRATCH/my_branch_omega_pr \
-p $PR_BUILD
Here, $POLARIS_SCRATCH is the same scratch directory as above,
$PR_BUILD is the directory where you built Omega from your
development branch (or where the CTest utility built it for you), and
my_branch_omega_pr can be any name that helps you keep track of what you
were testing.
Running the suite
First, run the baseline, e.g. for Slurm systems:
cd $POLARIS_SCRATCH/baseline_omega_pr
sbatch job_script.omega_pr.sh
Note the job ID for the next step. Then submit the suite for the test branch with a dependency so that it starts only once the baseline has finished:
cd $POLARIS_SCRATCH/my_branch_omega_pr
sbatch --kill-on-invalid-dep=yes --dependency=afterok:$JOBID job_script.omega_pr.sh
where $JOBID is the job ID of the baseline Slurm job.
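If you prefer not to copy the job ID by hand, sbatch's --parsable flag prints
just the ID, so both submissions can be chained in one short script (a minimal
sketch using standard Slurm options):

cd $POLARIS_SCRATCH/baseline_omega_pr
JOBID=$(sbatch --parsable job_script.omega_pr.sh)   # capture the baseline job ID
cd $POLARIS_SCRATCH/my_branch_omega_pr
sbatch --kill-on-invalid-dep=yes --dependency=afterok:$JOBID job_script.omega_pr.sh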
Monitor the jobs (e.g. squeue -j $JOBID) and take a look at the combined
stdout/stderr log in polaris_omega_pr.o$JOBID in each work directory to see
whether the tests passed. Report the results as part of any (non-trivial)
Omega PR.
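For example, the following standard Slurm and shell commands can be used to
watch the jobs and skim the logs (the grep pattern is just a heuristic for
spotting failures):

squeue -u $USER                 # or: squeue -j $JOBID
tail -f $POLARIS_SCRATCH/baseline_omega_pr/polaris_omega_pr.o$JOBID
grep -iE "fail|error" $POLARIS_SCRATCH/my_branch_omega_pr/polaris_omega_pr.o*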
Note
For other schedulers (e.g. PBS on Aurora), take a look at the machine’s documentation for equivalent commands.
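For instance, on PBS the dependency chaining above looks roughly like the
following (standard PBS syntax; whether the generated job script has the same
name on such machines is an assumption):

cd $POLARIS_SCRATCH/baseline_omega_pr
JOBID=$(qsub job_script.omega_pr.sh)    # qsub prints the job ID
cd $POLARIS_SCRATCH/my_branch_omega_pr
qsub -W depend=afterok:$JOBID job_script.omega_pr.sh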
Handling test failures
What should you do if you see test failures?
Test execution errors for baseline (Polaris e3sm_submodules/Omega)
If you see test execution failures in the baseline run built from the Polaris
e3sm_submodules/Omega submodule, report them as issues on the Omega repo,
preferably mentioning a relevant Omega developer who may be able to
investigate the cause of the failure. Please provide the contents of the
relevant log file for the failing test(s) and, optionally, the location of
the tests (and log files) if other Omega developers will have access.
Test execution errors for your branch
If you see execution failures in the tests for your development branch, please take a look at the log files and attempt to debug the issues yourself first, then reach out to the Omega team if you need help. In many cases, the test failures may be related to changes that are required in Polaris; see Updating Polaris below.
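To locate the per-test logs, something like the sketch below can help (it
assumes Polaris' usual layout, in which each test writes its log under
case_outputs in the suite work directory; the exact layout may differ):

cd $POLARIS_SCRATCH/my_branch_omega_pr
ls case_outputs/                        # one log file per test
grep -il "error" case_outputs/*.log     # list logs that mention errors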
Diffs between your branch and baseline
If diffs are expected:
- Document the diffs (in an Omega PR comment)
- Update e3sm_submodules/Omega in Polaris (after the Omega PR is merged)

If not expected:
- Debug and eliminate
If you see errors related to comparisons with the baseline, you will need to
determine whether these are expected based on your code changes. If they
are expected, you will need to document (in a PR comment) how large the
differences are in the tests that show them. In addition, you will need to
make a pull request to the Polaris repository to update the Omega submodule
in e3sm_submodules/Omega once your branch has been merged, so that the changes
become the new baseline in Polaris.
Note
To reduce maintenance burden, we do not update e3sm_submodules/Omega every
time a branch is merged into develop on Omega.
If you see unexpected differences between the results for your development branch and those from the baseline, you should attempt to debug the cause yourself and then reach out to the Omega team if you need help.
You may want to look at Polaris’ bisect utility to help debug which commit introduced the unexpected differences.
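If you would rather bisect by hand, plain git bisect in your Omega clone works
as well. The sketch below assumes a hypothetical test.sh that rebuilds Omega
and reruns the failing comparison, exiting nonzero on failure:

git bisect start
git bisect bad HEAD                       # first commit known to show the diff
git bisect good <last-known-good-commit>
git bisect run ./test.sh                  # hypothetical rebuild-and-test script
git bisect reset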
What to include in your PR Testing comment
Please add a PR comment titled “Testing” that summarizes what you ran and the outcomes. Include the following so reviewers can validate apples-to-apples runs.
Note
At a minimum, you need to run both CTests and the omega_pr suite on one
machine and compiler.
CTest unit tests
- Machine(s) and partition/queue (if applicable)
- Compiler(s) and build type(s) (Debug/Release)
- Result: either “All tests passed” or a short list of failing tests with a one-line note each
Polaris omega_pr regression suite
- Baseline run (Polaris e3sm_submodules/Omega)
  - Build path used for -p (from Polaris e3sm_submodules/Omega)
  - Work directory used for -w
- PR branch run
  - Build directory used for -p
  - Work directory used for -w
- Machine/partition, compiler, and build type (should match between baseline and PR)
- Result: “All tests passed” or a concise list of diffs/failures with a brief note
- If there are issues, add the path to the relevant log(s) (e.g., polaris_omega_pr.o$JOBID) and a short excerpt if helpful

Tests added/modified/impacted
- Briefly list any new or updated CTest or Polaris tests and what they cover
Performance PRs
- Link to the relevant PACE experiment and summarize before/after metrics
The Omega CTest Utility and the omega_pr suite will produce output in a
format you can copy and paste into the PR description. These outputs follow
roughly the template below. If you want to add the information manually, you
can copy and paste the following template into your PR comment:
## Testing
CTest unit tests
- Machine: <machine>[, <partition>]
- Compiler: <compiler>
- Build type: <Debug|Release>
- Result: All tests passed | Failures (X of Y): <name> (<one-line note>), ...
Polaris omega_pr regression suite
- Baseline build (-p): <path/to/build of Polaris e3sm_submodules/Omega>
- Baseline workdir (-w): <path/to/baseline_omega_pr>
- PR build (-p): <path/to/pr/build>
- PR workdir (-w): <path/to/my_branch_omega_pr>
- Machine/partition: <machine>[, <partition>]
- Compiler/build type: <compiler>, <Debug|Release>
- Result: All tests passed | Diffs/Failures: <brief summary>
- Logs (if applicable): <path/to/polaris_omega_pr.o$JOBID>
Tests added/modified/impacted
- CTest: <name> — <one-line purpose>
- Polaris: <path or name> — <one-line purpose>
Performance (if applicable)
- PACE: <link>
- Before: <metric(s)>
- After: <metric(s)>
- Delta: <summary>
Updating Polaris
Many Omega changes are likely to also require Polaris changes. In such cases,
it makes sense to co-develop your Omega branch and a Polaris branch. The
Omega branch will need to be merged first, then the e3sm_submodules/Omega
submodule will need to be updated to include the Omega changes as part of
the Polaris pull request.
Here are some examples of Omega changes that will require corresponding Polaris changes.
Adding new Omega dimensions or variables
If you add new Omega dimensions or variables that have a corresponding MPAS-Ocean name, you need to add the mapping between the names to mpaso_to_omega.yaml. For more details, see the Polaris Ocean Framework documentation.
For example:
dimensions:
  Time: time
  nVertLevels: NVertLayers
variables:
  temperature: Temperature
  salinity: Salinity
  ssh: SshCell
In each case, the keys are the MPAS-Ocean names and the values are the corresponding Omega names.
Updates to Default.yml
If you add or remove sections, config options, streams, etc. in Default.yml, you may need to make corresponding changes to the YAML files for Polaris test cases for them to continue to work as expected.
Whenever possible, we try to use a mapping between MPAS-Ocean namelist options and Omega config options within mpaso_to_omega.yaml. For example:
config:
  - section:
      time_management: TimeIntegration
    options:
      config_start_time: StartTime
      config_stop_time: StopTime
      config_run_duration: RunDuration
      config_calendar_type: CalendarType
If you have added new Omega config sections or options that have corresponding MPAS-Ocean namelist sections or options, please add them to the map. In each case, the keys are the MPAS-Ocean names and the values are the corresponding Omega names.
Frequently, there is no MPAS-Ocean equivalent to an Omega config option, and we currently do not try to map streams from MPAS-Ocean to Omega (their syntax is too different). In such cases, we have to define Omega YAML config options and streams in each test case, and you may need to update each one with your changes. Here is an example excerpt from the manufactured_solution test:
Omega:
  Tendencies:
    VelDiffTendencyEnable: false
    VelHyperDiffTendencyEnable: false
    UseCustomTendency: true
    ManufacturedSolutionTendency: true
  IOStreams:
    InitialVertCoord:
      Filename: init.nc
    InitialState:
      UsePointerFile: false
      Filename: init.nc
      Contents:
        - NormalVelocity
        - LayerThickness
    RestartRead: {}
    History:
      Filename: output.nc
      Freq: {{ output_freq }}
      FreqUnits: Seconds
      IfExists: append
      # effectively never
      FileFreq: 9999
      FileFreqUnits: years
      Contents:
        - NormalVelocity
        - LayerThickness
        - SshCell
For more details on updating either the map or the YAML files for individual tests, see the Polaris Ocean Framework documentation.