Manage Requirements-Based Testing Artifacts for Analysis in the Model Testing Dashboard

When you develop and test software components using Model-Based Design, use the Model Testing Dashboard to assess the status and quality of your model testing activities. Requirements-based testing is a central element of model verification. By establishing traceability links between your requirements, model design elements, and test cases, you can measure the extent to which the requirements are implemented and verified. The Model Testing Dashboard analyzes this traceability information and provides detailed metric measurements on the traceability, status, and results of these testing artifacts.

Model Testing Dashboard

Each metric in the dashboard measures a different aspect of the quality of your model testing and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. To monitor the requirements-based testing quality of your models in the Model Testing Dashboard, maintain your artifacts in a project and follow the considerations in this topic. For more information on using the Model Testing Dashboard, see Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.

Manage Artifact Files in a Project

To analyze your requirements-based testing activities in the Model Testing Dashboard, store your design and testing artifacts in a project. The artifacts that the testing metrics analyze include:

  • Models

  • Requirements that you create in Simulink® Requirements™

  • Libraries that the models use

  • Test cases that you create in Simulink Test™

  • Test results from the executed test cases

To analyze the latest artifacts in the Model Testing Dashboard, ensure that you complete these steps (a sketch of this workflow follows the list):

  • Save the changes to your artifact files.

  • Export test results and save them in a results file.

  • Store the files that you want to analyze in the project.
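For example, this sketch runs a test file, exports the results, and stores the results file in the project so that the dashboard can analyze the latest run. The file names and folder layout (tests/myTests.mldatx, results/latestResults.mldatx) are placeholder assumptions; substitute the paths that your project uses.

    proj = currentProject;    % project that contains your design and testing artifacts

    % Run the test cases in a test file that is stored in the project.
    testFile = sltest.testmanager.load(fullfile(proj.RootFolder, "tests", "myTests.mldatx"));
    results  = run(testFile);

    % Export the results to a results file and store that file in the project so
    % that the dashboard can trace and analyze them.
    resultsFile = fullfile(proj.RootFolder, "results", "latestResults.mldatx");
    sltest.testmanager.exportResults(results, resultsFile);
    addFile(proj, resultsFile);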

Model Software Components for Requirements-Based Testing

The Model Testing Dashboard provides traceability and testing analysis for each component in your project. A component is a functional entity within your software architecture that you can execute and test independently or as part of larger system tests. The Model Testing Dashboard considers each model in your project to represent a component because you use models to design and test the algorithms for your software components in Model-Based Design. For each component, you develop functional requirements based on the high-level system requirements and the role of the component. You then model the component algorithm to fulfill the functional requirements. Then, to test the component, you derive the test cases from the requirements and run the tests on the model. Throughout this process, you create and maintain explicit or implicit traceability links between:

  • Each functional requirement and the model elements that implement it

  • Each functional requirement and the test cases that verify it

  • Each test case and the model that it tests

  • Each test case and the latest results that it produced

These traceability links allow you to track the completeness of your requirements, design, and testing activities and help you find gaps in design and testing. If a test fails, you can follow the traceability links from the failing test case to the requirement that it verifies and to the model element that implements that requirement, so you can quickly find possible design errors that caused the failure. Industry standards for software development, such as ISO 26262 and DO-178C, require traceability between these artifacts to show testing completeness.
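As a sketch of how you can create these links programmatically with the Simulink Requirements and Simulink Test APIs, assuming a model named myModel with a block named Controller, a requirement set componentReqs.slreqx containing a requirement with ID FR_12, and a test file myTests.mldatx (all placeholder names):

    load_system("myModel")                               % assumed model name

    rs  = slreq.load("componentReqs.slreqx");            % assumed requirement set
    req = find(rs, "Id", "FR_12");                       % assumed requirement ID

    % Link the model element that implements the requirement (source) to the
    % requirement (destination), and mark the link as an Implements link.
    implLink = slreq.createLink(get_param("myModel/Controller", "Handle"), req);
    implLink.Type = "Implement";

    % Link the test case that verifies the requirement to the requirement, and
    % mark the link as a Verifies link.
    testFile  = sltest.testmanager.load("myTests.mldatx");
    suites    = getTestSuites(testFile);
    testCases = getTestCases(suites(1));
    verLink   = slreq.createLink(testCases(1), req);
    verLink.Type = "Verify";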

The Model Testing Dashboard returns traceability and metric results for the artifacts within the scope of a component. The artifacts in the scope of a component are the model and the requirements, test cases, and test results that trace to the model. In the dashboard, the Artifacts panel organizes the artifacts by the components that they trace to and uses the model name to represent each component.

Artifacts panel showing components and traced artifacts

The component list does not include systems that are not models, such as subsystems. When you model your system, if you want to test and analyze a portion of your design in the Model Testing Dashboard, save it as a model.
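For example, if the algorithm that you want to track currently lives in a subsystem, one option is to convert the subsystem to a referenced model so that it appears as a component in the dashboard. A minimal sketch, assuming a model myModel with a subsystem named ControlAlgorithm (both names are placeholders):

    load_system("myModel")

    % Convert the subsystem into a separate model named ControlAlgorithm and
    % replace the subsystem with a Model block that references it.
    Simulink.SubSystem.convertToModelReference( ...
        "myModel/ControlAlgorithm", "ControlAlgorithm");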

Trace Artifacts to Components for Model Testing Analysis

To determine which artifacts are in the scope of a component, the Model Testing Dashboard analyzes the traceability links between the artifacts and the models in the project, which correspond to the components. The Artifacts panel lists each component, represented by the model name, and these artifacts that trace to the component:

  • Functional Requirements

  • Design Artifacts

  • Test Cases

  • Test Results

After the list of components, the Untraced folder shows artifacts that the dashboard has not traced to any of the models. If an artifact returns an error during traceability analysis, the panel includes the artifact in the Errors folder. Use the traceability information in these folders and in the components to check whether the testing artifacts trace to the models that you expect.

As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the traceability data in the Artifacts panel might be stale by enabling the Trace Artifacts button. To update the traceability data, click Trace Artifacts. If the button is not enabled, the dashboard has not detected changes that affect the traceability information.

Functional Requirements

The folder Functional Requirements shows requirements where the Type is set to Functional and that meet one of these linking criteria:

  • The requirement is linked to the model or to a library subsystem used by the model with a link where the Type is set to Implements.

  • The requirement is under a container requirement that is linked to the model or to a library subsystem used by the model with a link where the Type is set to Implements.

  • The requirement traces to the model through a combination of the previous two criteria. For example, a requirement that is under a container requirement that links to another requirement, which links to the model.

Create or import these requirements in a requirements file (.slreqx) by using Simulink Requirements. If a requirement that you expect to appear under a component instead appears in the Untraced folder, check whether it uses one of the link types that the dashboard does not trace, as described in Untraced Artifacts. For more information about linking requirements, see Requirement Links (Simulink Requirements).
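As a rough diagnostic sketch, assuming a requirement set componentReqs.slreqx and a requirement with ID FR_12 (both placeholder names), you can inspect the requirement type and the types of its incoming links at the command line. The slreq.inLinks call is used here on the assumption that it is available in your release; you can also inspect the links in the Requirements Editor.

    rs  = slreq.load("componentReqs.slreqx");
    req = find(rs, "Id", "FR_12");

    % The dashboard metrics count only functional requirements, so check the type.
    disp(req.Type)                 % expected value: Functional

    % List the types of the links that land on this requirement to see whether
    % any link uses a type that the dashboard does not trace.
    incoming = slreq.inLinks(req);
    for k = 1:numel(incoming)
        fprintf("Incoming link %d has type %s\n", k, incoming(k).Type);
    end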

When you collect metric results for a component, the dashboard analyzes a subset of the requirements that appear in the Functional Requirements folder. The metrics analyze only requirements where the Type is set to Functional and that are directly linked to the model with a link where the Type is set to Implements. A requirement that traces to the component but does not have these settings appears in the Functional Requirements folder but does not contribute to the metric results for requirements.

Design Artifacts

The folder Design Artifacts shows the model file that contains the block diagram for the component and libraries that are partially or fully used by the model.

Test Cases

The folder Test Cases shows test cases that trace to the model. This includes test cases that run on the model and test cases that run on subsystems in the model by using test harnesses. Create these test cases in a test file by using Simulink Test.

When you collect metric results for a component, the dashboard analyzes a subset of the test cases that appear in the Test Cases folder. The dashboard analyzes only test cases that run on the model. Test cases that test a subsystem in the model appear in the folder but do not contribute to the metrics because they do not test the whole model.
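As a sketch of creating a test case that runs on the whole model, assuming a test file named componentTests.mldatx, a test case name, and a model named myModel (all placeholders):

    % Create a test file with a baseline test case for the component model.
    testFile = sltest.testmanager.TestFile("componentTests.mldatx");
    suite    = getTestSuites(testFile);
    testCase = createTestCase(suite, "baseline", "FR_12 nominal response");

    % Point the test case at the model so that it runs on the whole model and
    % contributes to the component metrics, then save the test file.
    setProperty(testCase, "Model", "myModel");
    saveToFile(testFile);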

Test Results

The folder Test Results shows test results from test cases that test the model. To view and analyze test results in the Model Testing Dashboard, you must export the test results and save them in a results file. Results that you have collected in the Test Manager but have not exported do not appear in the dashboard and do not contribute to metric results. If a result that you expect to appear under a component instead appears in the Untraced folder, check whether it meets one of the conditions that cause results to be untraced, as described in Untraced Artifacts.

When you collect metric results for a component, the dashboard analyzes a subset of the test results that appear in the Test Results folder. The dashboard analyzes only the most recent results from the test cases that run on the model. The metrics do not include results from test cases that do not run on the model, such as test cases that test subsystems.

Untraced Artifacts

The folder Untraced shows artifacts that the dashboard has not traced to any models. When reviewing the traceability of artifacts in your project, consider these limitations:

  • The dashboard does not trace or analyze artifacts that are not saved in the current project.

  • A model must be on the MATLAB search path in order to appear in the Artifacts panel. Artifacts that link to a model that is not on the MATLAB search path appear in the Untraced folder and are not analyzed by the metrics.

  • When you change the MATLAB search path, the traceability information in the Artifacts panel is not updated. Do not change the search path while the dashboard is open.

  • The dashboard does not trace symbolic file links in a project, such as shortcuts.

  • The dashboard does not trace or analyze artifacts in referenced projects.

If a requirement is linked to a model or test case by one of these links, the dashboard does not trace the link:

  • Embedded links, which are requirements files that are saved directly in the model file.

  • Links to and from data dictionaries.

  • Links to MATLAB code files.

  • Links to embedded MATLAB Function blocks.

  • Links in deprecated requirement files, which have the extension .req. To analyze requirement links in the dashboard, save the links in an .slmx file or in a requirements file (.slreqx).

  • Links with custom types.

  • Links to requirements that use custom types.

  • Links to System Composer™ architecture models.

If one of these conditions is met when you run your test cases, the generated results are untraced because the dashboard cannot establish unambiguous traceability to the component:

  • No project is loaded.

  • The dashboard was not opened at least once for the project.

  • You do not have a Simulink Check™ license.

  • The test file is stored outside the project.

  • The test file has unsaved changes.

  • The tested model has unsaved changes.

  • The test file returns an error during traceability analysis.

  • The tested model returns an error during traceability analysis.

If one of these conditions is met when you export your test results, the generated results are untraced because the dashboard cannot establish unambiguous traceability to the component:

  • No project is loaded.

  • The dashboard was not opened at least once for the project.

  • You do not have a Simulink Check license.

  • The test result file returns an error during traceability analysis.

Use the Untraced folder to check if any artifacts are missing traceability to the components. If you add traceability to an artifact, update the information in the panel by clicking Trace Artifacts.

Artifact Errors

The folder Errors shows artifacts that returned errors when the dashboard performed artifact analysis. These are some errors that artifacts might return during traceability analysis:

  • An artifact returns an error if it has unsaved changes when traceability analysis starts.

  • A test results file returns an error if it was saved in a previous version of Simulink.

  • A model returns an error if it is not on the search path.

Open these artifacts and fix the errors. Then, to analyze the traceability in the dashboard, click Trace Artifacts.

Collect Metric Results

The Model Testing Dashboard collects metric results for each component listed in the Artifacts panel. Each metric in the dashboard measures a different aspect of the quality of your model testing and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. For more information about the available metrics and the results that they return, see Model Testing Metrics.

As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the metric results in the dashboard might be stale. If your changes affect the traceability information in the Artifacts panel, click Trace Artifacts. After you update the traceability information, if any of the metric results might be affected by your artifact changes, the Stale Metrics icon appears at the top of the dashboard. Affected widgets appear highlighted in gray. To update the results, click Collect Results > Collect All Results.
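You can also collect the metric results programmatically by using the metric API that ships with Simulink Check. A minimal sketch (the report folder name is an arbitrary choice, and updateArtifacts is assumed to refresh the same traceability data that the Trace Artifacts button updates):

    % Create a metric engine for the currently open project.
    metric_engine = metric.Engine();

    % Refresh the artifact traceability data, then collect results for the
    % available metrics.
    updateArtifacts(metric_engine);
    execute(metric_engine);

    % Summarize the collected results in an HTML report.
    generateReport(metric_engine, "Type", "html-file", ...
        "Location", fullfile(pwd, "ModelTestingReport"));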

The dashboard does not indicate stale metric data for these changes:

  • After you run a test case and analyze the results in the dashboard, if you make changes to the test case, the dashboard indicates that test case metrics are stale but does not indicate that the results metrics are stale.

  • When you change a coverage filter file that your test results use, the coverage metrics in the dashboard do not indicate stale data or include the changes. After you save the changes to the filter file, re-run the tests and use the filter file for the new results.

When you collect metric results for a component, the dashboard returns results for a subset of the artifacts that trace to the component. However, metric results that count traceability links between requirements and test cases include links to artifacts that might trace to other components or to no component. For example, if a test case TestCaseA tests ModelA, then running the metric Test case linked to requirements on ModelA returns a result for that test case. When the metric checks for requirements that are linked to TestCaseA, it does not consider the implementation or traceability status of those requirements. If TestCaseA has a Verifies link to a requirement RequirementB that is linked to a different model, the metric still returns true, indicating that the test case is linked. However, if you run the metric Requirement linked to test cases on ModelA, it does not return a result for RequirementB because the requirement is not linked to ModelA.

For a test case that is linked to requirements, check that the linked requirements are implemented by the model that the test case runs on. Additionally, for a requirement that is linked to test cases, check that the test cases run on the model that implements the requirement.
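To inspect both sides of this relationship programmatically, you can query the two metrics for a component. The metric identifiers used below, TestCaseWithRequirement and RequirementWithTestCase, are assumptions based on the metric names in the dashboard; confirm the exact IDs in the Model Testing Metrics reference.

    metric_engine = metric.Engine();
    updateArtifacts(metric_engine);

    % Collect the two linking metrics (metric IDs assumed; see above).
    execute(metric_engine, {'TestCaseWithRequirement', 'RequirementWithTestCase'});

    % One result per test case that runs on the model: nonzero if the test case
    % links to at least one requirement, even one implemented by another model.
    tcResults  = getMetrics(metric_engine, 'TestCaseWithRequirement');

    % One result per requirement implemented by the model: nonzero if the
    % requirement links to at least one test case.
    reqResults = getMetrics(metric_engine, 'RequirementWithTestCase');

    disp([tcResults.Value])
    disp([reqResults.Value])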

See Also

Related Topics