compass.validate.compare_variables¶
- compass.validate.compare_variables(test_case, variables, filename1, filename2=None, l1_norm=0.0, l2_norm=0.0, linf_norm=0.0, quiet=True, check_outputs=True, skip_if_step_not_run=True)¶
Compare variables between files in the current test case and/or with the baseline results. The results of the comparison are added to the test case’s “validation” dictionary, which the framework can use later to log the test case results and/or to raise an exception to indicate that the test case has failed.
- Parameters
test_case (compass.TestCase) – An object describing a test case to validate
variables (list) – A list of variable names to compare
filename1 (str) – The relative path to a file within the work_dir. If filename2 is also given, the comparison will be performed with the variables in that file. If a baseline directory was provided when setting up the test case, the variables will be compared between this test case and the same relative filename in the baseline version of the test case.
filename2 (str, optional) – The relative path to another file within the work_dir if comparing between files within the current test case. If a baseline directory was provided, the variables from this file will also be compared with those in the corresponding baseline file.
l1_norm (float, optional) – The maximum allowed L1 norm difference between the variables in filename1 and filename2. To skip the L1 norm check, pass None.
l2_norm (float, optional) – The maximum allowed L2 norm difference between the variables in filename1 and filename2. To skip the L2 norm check, pass None.
linf_norm (float, optional) – The maximum allowed L-infinity norm difference between the variables in filename1 and filename2. To skip the L-infinity norm check, pass None.
quiet (bool, optional) – Whether to suppress detailed output. If quiet is False, the norm tolerance values being compared against will be printed when the comparison is made, which is generally desirable when using nonzero norm tolerances.
check_outputs (bool, optional) – Whether to check that the files are valid outputs of steps in the test case. Set this to False when comparing with an output of a step in another test case.
skip_if_step_not_run (bool, optional) – Whether to skip the variable comparison if the user did not run one (or both) of the steps involved in the comparison. This can happen when users run steps individually or have edited steps_to_run in the config file to exclude one of the steps.
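
The sketch below shows one way a test case might call this function from its validate() method. The step directories, output file names, variable names, and norm tolerances are illustrative assumptions rather than values from any real test case.

```python
from compass import TestCase
from compass.validate import compare_variables


class Default(TestCase):
    """A hypothetical test case with 'full_run' and 'restart_run' steps."""

    def validate(self):
        variables = ['temperature', 'salinity', 'layerThickness',
                     'normalVelocity']

        # Bit-for-bit comparison (the default, zero-tolerance norms) between
        # the outputs of two steps of this test case and, if a baseline
        # directory was supplied at setup, against the baseline as well.
        compare_variables(test_case=self, variables=variables,
                          filename1='full_run/output.nc',
                          filename2='restart_run/output.nc')

        # The same comparison for a single variable, but allowing small
        # differences up to the given norm tolerances and printing the
        # tolerances being compared against (quiet=False).
        compare_variables(test_case=self, variables=['ssh'],
                          filename1='full_run/output.nc',
                          filename2='restart_run/output.nc',
                          l1_norm=1e-10, l2_norm=1e-12, linf_norm=1e-13,
                          quiet=False)
```

Each call records its result in the test case's "validation" dictionary, so the framework can report all comparisons together and fail the test case if any of them exceed their tolerances.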