Assess Requirements-Based Testing Quality by Using the Model Testing Dashboard

You can assess the status of your model testing activities by using the metrics in the Model Testing Dashboard. When you test your models against requirements, you maintain traceability between the requirements, models, test cases, and results. The dashboard helps you to track the status of these artifacts and the traceability relationships between them. Each metric in the dashboard measures a different aspect of the quality of the testing artifacts and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. From the dashboard, you can identify and fix testing issues. Update the dashboard metrics to track your progress toward testing compliance.

Open the Project and Model Testing Dashboard

The Model Testing Dashboard shows data on the traceability and testing status of each component in your project. When you model software, a component is a functional part of the architecture that you can execute and test independently. The dashboard considers each model in your project to represent a component because you use models to design and test the algorithms.

  1. Open the project that contains the models and testing artifacts. For this example, at the MATLAB command line, enter:

    dashboardCCProjectStart

  2. On the Project tab, click Model Testing Dashboard.

  3. The first time that you open the dashboard for the project, the dashboard must identify the artifacts in the project and trace them to the models.

    Project Requires Analysis dialog box

    To run the traceability analysis and collect metric results, click Trace and Collect All. Collecting metric results requires a license for Simulink® Check™, Simulink Requirements™, and Simulink Test™. Once metrics have been collected, viewing the results requires only a Simulink Check license.

Model Testing Dashboard showing results for component db_DriverSwRequest

The dashboard analyzes the traceability links from the artifacts to the models in the project and populates the widgets with metric results for the component that is selected in the Artifacts panel.
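
If you prefer to work from a script, you can open the same dashboard programmatically. The following is a minimal sketch; it assumes that the modelTestingDashboard function from Simulink Check is available in your release, and it reuses the project starter command from step 1.

    % Open the example project that ships with these testing artifacts
    dashboardCCProjectStart

    % Open the Model Testing Dashboard for the current project
    % (assumes the modelTestingDashboard function is available in your release)
    modelTestingDashboard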

Assess Traceability of Artifacts

When the dashboard collects and reports metric data, it scopes the results to the artifacts in one component in the project. Use the Artifacts panel to see each component in the project, represented by the name of its model, and the artifacts that trace to it.

  1. In the Artifacts panel, click the component db_DriverSwRequest. The dashboard widgets populate with metric data from the artifacts in this component.

  2. In the Artifacts panel, expand the section for the component by clicking the arrow to the left of db_DriverSwRequest. Each section below the component lists the artifacts of one type that trace to the component.

  3. Expand the Functional Requirements section. This component uses requirements in the files db_req.slreqx and db_req_func_spec.slreqx. Click the arrow to the left of a file name to see the individual requirements that trace to the model.

You can explore the components and sections in the Artifacts panel to see which requirements, test cases, and test results trace to each component in the project. For more information on how the dashboard analyzes this traceability, see Trace Artifacts to Components for Model Testing Analysis.
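
Clicking Trace and Collect All runs this traceability analysis interactively. To trigger the same analysis from a script, you can use the metric.Engine API in Simulink Check. The sketch below is an assumption-based outline; in particular, it assumes that getAvailableMetricIds is supported in your release.

    % Create a metric engine for the project that is currently open
    metric_engine = metric.Engine();

    % Trace the artifacts and collect results for every available metric
    ids = getAvailableMetricIds(metric_engine);
    execute(metric_engine, ids);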

Explore Metric Results for a Component

  1. In the Artifacts panel, click the component db_DriverSwRequest. The dashboard widgets populate with metric results for the component.

  2. To update the metric results for the component, click Collect Results.

  3. In the Test Case Analysis section of the dashboard, locate the Tests with Requirements widget. To view tooltips with details about the results, point to the sections of the dial or to the percentage result.

    Tests with requirements widget with tooltip indicating 7 tests with requirements

  4. To explore the metric data in more detail, click an individual metric widget. For example, click the green section of the Tests with Requirements widget.

    Table of test cases and status of whether each test case is linked to requirements

    The table shows each test case for the component, the test file containing each test case, and whether the test case is linked to requirements.

  5. The test case named Set button is missing linked requirements. To open the test case in the Test Manager, click Set button in the Artifact column.

  6. Return to the results for the component. Above the table, click db_DriverSwRequest.

You can view a table of detailed results by clicking each widget in the dashboard. Use the hyperlinks in the tables to open the artifacts and address testing gaps. For more information on using the data in the dashboard, see Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
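
After the metric engine has collected results, you can also query them programmatically instead of clicking through the widgets. This sketch assumes the metric identifier 'TestCaseWithRequirement' and the Value and Artifacts fields of the returned results; check getAvailableMetricIds and the metric.Engine documentation for the identifiers and result fields in your release.

    % Query per-test-case results for the tests-linked-to-requirements metric
    % ('TestCaseWithRequirement' is an assumed identifier -- verify it in your release)
    results = getMetrics(metric_engine, 'TestCaseWithRequirement');

    % List the test cases that have no linked requirements (a value of 0)
    for n = 1:numel(results)
        if results(n).Value == 0
            fprintf('Test case without linked requirements: %s\n', ...
                results(n).Artifacts(1).Name);
        end
    end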

Track Testing Status of a Project Using the Model Testing Dashboard

To use the Model Testing Dashboard to track your testing activities, set up and maintain your project using the best practices described in Manage Requirements-Based Testing Artifacts for Analysis in the Model Testing Dashboard. As you develop and test your models, use the dashboard to identify testing gaps, fix the underlying artifacts, and track your progress towards testing completion. For more information on finding and addressing gaps in your model testing, see Fix Requirements-Based Testing Issues.
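
To archive your progress at each milestone, you can export the collected metric results to a report. A minimal sketch follows, assuming the generateReport method of metric.Engine and its Type and Location arguments behave as shown:

    % Generate an HTML report of the collected metric results
    % (argument names are assumptions based on the metric.Engine API)
    reportFile = fullfile(pwd, 'ModelTestingResults.html');
    generateReport(metric_engine, 'Type', 'html-file', 'Location', reportFile);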
