Tests Using the TruEra Web App¶
Under Artifacts in the left-side navigator, click Test Harness. If no model tests are currently defined, you'll see the following message.
![Model test splash screen](../../img/model-tests/model-test-splash.png)
From here, you can:

- Create model tests
- View your model tests
- Edit your model tests

Each option is covered next, in turn.
Creating Model Tests¶
To create a new model test, click on the Add new test button.
![Model test creation dialog](../../img/model-tests/model-test-dialog.png)
Select the type of test you wish to define, then click DEFINE TEST.
Performance Tests¶
Upon selecting the Performance test option, the performance test creation wizard is displayed.
![Performance test definition](../../img/model-tests/model-test-performance-test-definition.png)
Enter a name for the test, an optional description, then select the performance metric on which to measure the results.
Absolute conditions are specified by selecting a condition, then adjusting the threshold accordingly.
![Performance test absolute condition definition](../../img/model-tests/model-test-performance-absolute.png)
Relative thresholds are specified by selecting the Set relative threshold option, then specifying the relative baseline split or model setting for the test.
![Performance test relative condition definition](../../img/model-tests/model-test-performance-relative.png)
Defining Test Conditions
When a condition is defined relative only to a split or a model, the data collections available for testing are restricted to the data collection currently associated with the model. Selection is not restricted when an absolute condition is defined or when the condition is relative to the model's performance on one of its splits.
Click CONTINUE to select the data collection(s) on which the test will run.
![Model test data collection definition](../../img/model-tests/model-test-data-collection.png)
Select All data collections to run the test on all available data collections — those currently existing and those defined later — with respect to the splits/segments configured next. Choose Manually selected to select specific data collections for testing.
![Model test manual data collections](../../img/model-tests/model-test-manual-dc.png)
Click CONTINUE to select the test data split(s) and segment(s) with respect to previously defined data collections. Three options are available:
- All data splits
- Manual data splits (not available if you selected All data collections above)
- Data splits matching a regular expression.
![Model test data split definition](../../img/model-tests/model-test-split-definition.png)
Select All data splits to run the test on all data splits, both existing and subsequently defined. Select Manually selected to limit testing to the data splits manually chosen.
![Model test manual data split definition](../../img/model-tests/model-test-split-definition-manual.png)
Finally, select Regular expression to run the test on all data splits, both existing and those defined later, with names matching the regular expression entered. Preview matching data splits by clicking UPDATE.
![Model test regex data split definition](../../img/model-tests/model-test-split-definition-regex.png)
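To get a feel for which split names a pattern will catch before clicking UPDATE, you can check it locally. The sketch below uses hypothetical split names, and the web app's exact matching semantics (full match vs. prefix) may differ, so always confirm with the UPDATE preview.

```python
import re

# Hypothetical split names -- yours will differ.
split_names = ["train_2023", "test_2023_q1", "test_2023_q2", "holdout"]

# A pattern like "test_2023_.*" selects only the quarterly test splits;
# splits created later with matching names are picked up automatically.
pattern = re.compile(r"test_2023_.*")
print([name for name in split_names if pattern.match(name)])
# ['test_2023_q1', 'test_2023_q2']
```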
In the same data split dialog, you can also select the segments on which to run the test: check Filter data by segments, then select the segments.
![Model test segments definition](../../img/model-tests/segments.png)
Click SUBMIT to confirm these test parameters or click BACK to make changes.
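The same kind of test can also be defined from code. The sketch below assumes the TruEra Python SDK's tester interface; the deployment URL, token, project, and split names are placeholders, and parameter names such as `warn_if_less_than` and `fail_if_less_than` are assumptions to verify against the SDK reference for your version.

```python
# Minimal sketch: defining a performance test via the TruEra Python SDK.
# URL, token, project, and split names are placeholders; parameter names
# (warn_if_less_than, fail_if_less_than) are assumed -- verify against
# your SDK version's reference.
from truera.client.truera_authentication import TokenAuthentication
from truera.client.truera_workspace import TrueraWorkspace

auth = TokenAuthentication(token="<YOUR_API_TOKEN>")
tru = TrueraWorkspace("https://<your-deployment>.truera.net", auth)
tru.set_project("<your-project>")

# Fail if AUC drops below 0.75; warn below 0.80.
tru.tester.add_performance_test(
    test_name="auc-floor",
    metric="AUC",
    data_split_names=["test_2023_q1"],  # hypothetical split name
    warn_if_less_than=0.80,
    fail_if_less_than=0.75,
)
```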
Drift Tests¶
Upon selecting the Drift test option, the stability test creation wizard is displayed.
![Stability test definition](../../img/model-tests/model-test-stability-test-definition.png)
Enter the name of the test, an optional description, and select the Drift metric on which to measure the results.
Define failure and/or warning conditions by selecting a condition, then adjusting the threshold as desired.
![Model test absolute condition definition](../../img/model-tests/model-test-absolute.png)
Click CONTINUE to select the data collection(s) on which the test will run and the reference splits for the selected data collection(s).
![Model test data collection definition](../../img/model-tests/stability-alldc.png)
Select All data collections to run the test on all available data collections — those currently existing and those defined later — with respect to the splits/segments configured next. With this option, the reference split is automatically set to the respective training split for each data collection.
![Model test manual data collections](../../img/model-tests/stability-manualdc.png)
Choose Manually selected to pick specific data collections for testing. You can then manually define the corresponding reference split for each selected data collection.
Click CONTINUE to select the test data split(s) and segment(s) with respect to previously defined data collections. Three options are available:
- All data splits
- Manual data splits (not available if you selected All data collections above)
- Data splits matching a regular expression.
![Model test data split definition](../../img/model-tests/model-test-split-definition.png)
Select All data splits to run the test on all data splits, both existing and subsequently defined. Select Manually selected to limit testing to the data splits manually chosen.
![Model test manual data split definition](../../img/model-tests/model-test-split-definition-manual.png)
Finally, select Regular expression to run the test on all data splits, both existing and those defined later, with names matching the regular expression entered. Preview matching data splits by clicking UPDATE.
![Model test regex data split definition](../../img/model-tests/model-test-split-definition-regex.png)
You can also select the segments to run the test on. Simply check Filter data by segments, then select the segments.
![Model test segments definition](../../img/model-tests/segments.png)
Click SUBMIT to confirm these test parameters or click BACK to make changes.
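A drift test can be sketched programmatically as well. The example below reuses the workspace object `tru` from the performance test sketch; the method name, metric identifier, and parameter names are assumptions to verify against your SDK reference.

```python
# Minimal sketch: defining a drift (stability) test programmatically,
# reusing the workspace object `tru` from the performance-test sketch.
# Method and parameter names are assumed -- verify against your SDK reference.
tru.tester.add_stability_test(
    test_name="score-drift",
    metric="DIFFERENCE_OF_MEAN",                   # drift metric identifier may differ
    base_data_split_name="train_2023",             # reference split (hypothetical)
    comparison_data_split_names=["test_2023_q1"],  # hypothetical split name
    warn_if_outside=(-0.05, 0.05),
    fail_if_outside=(-0.10, 0.10),
)
```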
Fairness Tests¶
Upon selecting the Fairness test option, the fairness test creation wizard appears.
![Fairness test definition](../../img/model-tests/model-test-fairness-test-definition.png)
Enter the name of the test, an optional description, and then select the fairness metric on which to measure the results.
Define failure and/or warning conditions by selecting a condition, then adjusting the threshold as desired.
![Model test absolute condition definition](../../img/model-tests/model-test-absolute.png)
Click CONTINUE to select the protected segments for the test.
![Protected segments definition](../../img/model-tests/protected-segments.png)
Select All protected segments to run the test on all available protected segments, both existing and those defined later, with respect to the data collections/splits configured next. Choose Manually selected to limit testing to your desired protected segments.
Click CONTINUE to select the data collection(s) on which the test will run.
![Model test data collection definition](../../img/model-tests/model-test-data-collection.png)
Select All data collections to run the test on all available data collections — those currently existing and those defined later — with respect to the splits/segments configured next. Choose Manually selected to select specific data collections for testing.
![Model test manual data collections](../../img/model-tests/model-test-manual-dc.png)
Click CONTINUE to select the test data split(s) and segment(s) with respect to previously defined data collections. Three options are available:
- All data splits
- Manual data splits (not available if you selected All data collections above)
- Data splits matching a regular expression.
![Model test data split definition](../../img/model-tests/model-test-split-definition.png)
Select All data splits to run the test on all data splits, both existing and subsequently defined. Select Manually selected to limit testing to the data splits manually chosen.
![Model test manual data split definition](../../img/model-tests/model-test-split-definition-manual.png)
Finally, select Regular expression to run the test on all data splits, both existing and those defined later, with names matching the regular expression entered. Preview matching data splits by clicking UPDATE.
![Model test regex data split definition](../../img/model-tests/model-test-split-definition-regex.png)
Click SUBMIT to confirm these test parameters or click BACK to make changes.
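As with the other test types, a fairness test can be sketched in code. This reuses `tru` from the performance test sketch; the segment names, metric identifier, and parameter names below are assumptions to verify against your SDK reference.

```python
# Minimal sketch: defining a fairness test programmatically, reusing the
# workspace object `tru` from the performance-test sketch. The segment
# names, metric identifier, and parameter names are assumptions.
tru.tester.add_fairness_test(
    test_name="gender-impact-ratio",
    metric="DISPARATE_IMPACT_RATIO",
    data_split_names=["test_2023_q1"],          # hypothetical split name
    protected_segments=[("gender", "female")],  # (segment group, segment) -- hypothetical
    warn_if_outside=(0.9, 1.1),
    fail_if_outside=(0.8, 1.25),
)
```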
Feature Importance Tests¶
Upon selecting the Feature importance test option, the feature importance test creation wizard is displayed.
![Feature importance test definition](../../img/model-tests/model-test-fi-test-definition.png)
Enter the name of the test and an optional description, set the minimum importance value for features (between 0 and 1, exclusive), and set the score type (i.e., the quantity of interest) for the test.
For feature importance tests, define failure and/or warning conditions by specifying the threshold number of unimportant features allowed in the model. This threshold must be an integer.
Click CONTINUE to select the data collection(s) on which the test will run.
![Model test data collection definition](../../img/model-tests/model-test-data-collection.png)
Select All data collections to run the test on all available data collections — those currently existing and those defined later — with respect to the splits/segments configured next. Choose Manually selected to select specific data collections for testing.
![Model test manual data collections](../../img/model-tests/model-test-manual-dc.png)
Click CONTINUE to select the test data split(s) and segment(s) with respect to previously defined data collections. Three options are available:
- All data splits
- Manual data splits (not available if you selected All data collections above)
- Data splits matching a regular expression.
![Model test data split definition](../../img/model-tests/model-test-split-definition.png)
Select All data splits to run the test on all data splits, both existing and subsequently defined. Select Manually selected to limit testing to the data splits manually chosen.
![Model test manual data split definition](../../img/model-tests/model-test-split-definition-manual.png)
Finally, select Regular expression to run the test on all data splits, both existing and those defined later, with names matching the regular expression entered. Preview matching data splits by clicking UPDATE.
![Model test regex data split definition](../../img/model-tests/model-test-split-definition-regex.png)
You can also select the segments to run the test on. Simply check Filter data by segments, and select the segments.
![Model test segments definition](../../img/model-tests/segments.png)
Click SUBMIT to confirm these test parameters or click BACK to make changes.
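A feature importance test can also be sketched in code, again reusing `tru` from the performance test sketch; the method and parameter names are assumptions to verify against your SDK reference.

```python
# Minimal sketch: defining a feature importance test programmatically,
# reusing `tru` from the performance-test sketch. Parameter names are
# assumptions -- verify against your SDK reference.
tru.tester.add_feature_importance_test(
    test_name="unimportant-feature-count",
    data_split_names=["test_2023_q1"],  # hypothetical split name
    min_importance_value=0.02,          # features below this count as unimportant
    warn_if_greater_than=5,             # integer thresholds on the count
    fail_if_greater_than=10,            # of unimportant features
)
```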
Viewing Your Model Tests¶
When one or more tests have been defined, the Test Harness page displays them in a table.
![Table of existing model tests](../../img/model-tests/tests-table.png)
You can filter the list by test name and type, as well as by data collection, split, and model. Expand any row to view the test's metadata.
![Metadata row](../../img/model-tests/metadata.png)
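Tests and their latest results can also be inspected programmatically. The sketch below reuses `tru` from the performance test sketch; the method names are assumptions to verify against your SDK reference.

```python
# Minimal sketch: listing tests and their results programmatically,
# reusing `tru` from earlier. Method names are assumptions -- verify
# against your SDK reference.
tru.set_model("<your-model>")  # results are reported per model

tests = tru.tester.get_model_tests()
print(tests)

results = tru.tester.get_model_test_results()  # results for the current model
print(results)
```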
Opening the menu at the end of a row lets you edit or delete a test.
Editing Your Model Tests¶
Selecting Edit test group from the menu at the end of a row brings up the wizard corresponding to the test's type. Complete the flow and click Submit to update the test.
![Editing a test](../../img/model-tests/edit.png)
Click Next below to continue.