Component Verification

You can test a component of your model in isolation or as part of a larger model. Testing in isolation is useful for debugging the component algorithm and for ensuring that the component is ready for reuse. Testing as part of a larger model verifies component behavior in response to the inputs and outputs of a particular application.

This topic is a broad overview of verification activities, including tools in related products that you can use in your verification workflow.

Workflow for Component Verification

The following steps describe component verification testing in closed-loop and open-loop configurations.

  1. Choose your approach for component verification:

    • For closed-loop simulations, verify a component within the context of its container model by logging the signals to that component and storing them in a data file. If those signals do not constitute a complete test suite, generate a harness model and add or modify the test cases in the Signal Builder. The sketches after this list show these steps programmatically.

    • For open-loop simulations, verify a component independently of the container model by extracting the component from its container model and creating a harness model for the extracted component. Add or modify test cases in the Signal Builder and log the signals to the component in the harness model.

  2. Prepare the component for verification.

  3. Create and log test cases. You can also merge the test case data into a single data file.

    The data file contains the test case data for simulating the component. If a set of test cases does not achieve the expected results, add new test cases or modify existing ones in the data file, and merge them into a single data file, as shown in the sketches after this list.

    Continue adding or modifying test cases until you achieve a test suite that satisfies the goals of your analysis.

  4. Execute the test cases in software-in-the-loop or processor-in-the-loop mode.

  5. After you have a complete test suite, you can:

    • Simulate the model and execute the test cases to:

      • Record coverage using Simulink® Coverage™.

      • Record output values to make sure that you get the expected results.

    • Invoke the Code Generation Verification (CGV) API to execute the generated code for the model that contains the component in simulation, software-in-the-loop (SIL), or processor-in-the-loop (PIL) mode.

      Note

      When you execute a model in different modes, use the CGV API to verify the numerical equivalence of the results (see the CGV sketch after this list). For more information about the CGV API, see Programmatic Code Generation Verification (Embedded Coder).
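
Programmatically, steps 1 and 2 reduce to a few function calls. The following is a minimal sketch of the closed-loop approach, assuming the slvnvlogsignals and slvnvmakeharness component-verification functions are installed (their product home and exact syntax can vary by release) and using placeholder model and block names.

    % Simulate the container model and log the input signals that reach the
    % component under test (here, a Model block named Controller). The model
    % and block names are placeholders.
    loggedData = slvnvlogsignals('myContainerModel/Controller');
    save('controller_testcases.mat', 'loggedData');

    % Generate a harness model around the component. The harness drives the
    % test unit with the logged signals through a Signal Builder block.
    harnessFile = slvnvmakeharness('myControllerModel', 'controller_testcases.mat');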
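
Continuing from the previous sketch, this one covers steps 3 and 5: it appends a test case to the harness Signal Builder block, merges the new signals with the originally logged data, and simulates the component with coverage enabled. The slvnvmergedata, slvnvruntest, and slvnvruntestopts calls, the Inputs block name, and all signal values are assumptions for illustration; recording coverage requires Simulink Coverage.

    % Append an illustrative test case (signal group) to the Signal Builder
    % block that slvnvmakeharness creates (named 'Inputs' by default).
    t = (0:0.1:10)';
    u = double(t > 5);                                  % a simple step input
    signalbuilder('myControllerModel_harness/Inputs', 'append', t, u, {'In1'}, 'Step test');

    % Log the signals that the harness drives into the test unit, then merge
    % them with the data logged from the container model (loggedData comes
    % from the previous sketch).
    newData    = slvnvlogsignals('myControllerModel_harness');
    mergedData = slvnvmergedata(loggedData, newData);
    save('merged_testcases.mat', 'mergedData');

    % Simulate the component with the merged test cases and record coverage.
    runOpts = slvnvruntestopts;
    runOpts.coverageEnabled = true;
    [outData, covData] = slvnvruntest('myControllerModel', mergedData, runOpts);
    cvhtml('controller_coverage', covData);             % HTML coverage report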
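
For the CGV step in the Note above, the cgv.CGV class (Embedded Coder) can execute the same model in normal, SIL, or PIL mode and compare the recorded outputs for numerical equivalence. A minimal sketch, assuming the component model is already configured for SIL code generation; the model name is a placeholder.

    % Run the model in normal simulation mode and in SIL mode.
    simObj = cgv.CGV('myControllerModel', 'connectivity', 'sim');
    simObj.run();
    silObj = cgv.CGV('myControllerModel', 'connectivity', 'sil');
    silObj.run();

    % Compare the logged outputs of the two runs.
    simOut = simObj.getOutputData(1);       % outputs for the first input set
    silOut = silObj.getOutputData(1);
    [matchNames, ~, mismatchNames] = cgv.CGV.compare(simOut, silOut);
    fprintf('%d signals match, %d differ\n', numel(matchNames), numel(mismatchNames));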

Test a Component in Isolation

This workflow illustrates common steps to test reusable components such as:

  • Model blocks

  • Atomic subsystems

  • Stateflow® atomic subcharts

  1. Depending on the type of component, take one of the following actions:

    • Model blocks — Open the referenced model.

    • Atomic subsystems — Extract the contents of the subsystem into its own Simulink model (see the sketches after this procedure).

    • Atomic subcharts — Extract the contents of the Stateflow atomic subchart into its own Simulink model.

  2. Create a harness model for:

    • The referenced model

    • The extracted model that contains the contents of the atomic subsystem or atomic subchart

  3. Add or modify test cases in the Signal Builder in the harness model.

  4. Log the input signals from the Signal Builder to the test unit.

  5. Repeat steps 3 and 4 until you are satisfied with the test suite.

  6. Merge the test case data into a single file.

  7. Depending on your goals, take one of the following actions:

    • Execute the test cases to:

      • Record coverage.

      • Record output values and make sure that they equal the expected values.

    • Invoke the Code Generation Verification (CGV) API to execute the test cases in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode on the generated code for the model that contains the component, as the CGV sketch after this procedure shows.

If the test cases do not achieve the expected results, repeat steps 3 through 5.
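
The following sketch covers steps 1 through 4 for an atomic subsystem, assuming the slvnvextract, slvnvmakeharness, slvnvlogsignals, and signalbuilder functions; the model, subsystem, and signal names are placeholders.

    % Extract the contents of the atomic subsystem into its own model.
    extractedModel = slvnvextract('myModel/FaultLogic', true);   % true opens the new model

    % Build a harness around the extracted model; the harness contains a
    % Signal Builder block (named 'Inputs' by default) that drives the test unit.
    harnessFile = slvnvmakeharness(extractedModel);
    [~, harnessModel] = fileparts(harnessFile);

    % Append an illustrative test case, then log the signals that the
    % harness feeds to the test unit.
    t = (0:0.01:1)';
    signalbuilder([harnessModel '/Inputs'], 'append', t, sin(2*pi*t), {'In1'}, 'Sine test');
    testData = slvnvlogsignals(harnessModel);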
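
For step 7, once the test cases are merged you can execute them on the generated code through the CGV API with slvnvruncgvtest. This sketch continues from the previous one, assuming the extracted model is configured for code generation and that slvnvruntestopts exposes a cgvConn option for choosing the execution mode; check your release for the exact option names.

    % Execute the test cases on the generated code in SIL mode
    % (extractedModel and testData come from the previous sketch).
    runOpts = slvnvruntestopts('cgv');
    runOpts.cgvConn = 'sil';                 % 'sim', 'sil', or 'pil'
    cgvObj = slvnvruncgvtest(extractedModel, testData, runOpts);

    % Retrieve the outputs of the first test case to compare against the
    % expected values.
    silOutputs = cgvObj.getOutputData(1);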

Test a Model Block Included in a Larger Model

Use system analysis to:

  • Verify a Model block in the context of the block’s container model.

  • Analyze a closed-loop controller.

  1. Log the input signals to the component by simulating the container model, or generate them by analyzing the model with the Simulink Design Verifier™ software (see the sketch after these steps).

  2. If you want to add test cases to your test suite or modify existing test cases, create a harness model with the logged signals.

  3. Add or modify test cases in the Signal Builder in the harness model.

  4. Log the input signals from the Signal Builder to the test unit.

  5. Repeat steps 3 and 4 until you are satisfied with the test suite.

  6. Merge the test case data into a single file.

  7. Depending on your goals, do one of the following:

    • Execute the test cases to:

      • Record coverage.

      • Record output values and make sure that they equal the expected values.

    • Invoke the Code Generation Verification (CGV) API to execute the test cases in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode on the generated code for the model.

If the test cases do not achieve the expected results, repeat steps 3 through 5.
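
A sketch of steps 1 and 2, assuming a container model myClosedLoop with a Model block Controller that references myControllerModel; the sldvrun call shows the Simulink Design Verifier alternative for generating test cases. All names are placeholders.

    % Option 1: log the input signals to the Model block by simulating the
    % closed-loop container model.
    loggedData = slvnvlogsignals('myClosedLoop/Controller');

    % Option 2: let Simulink Design Verifier generate test cases for the
    % referenced model instead of, or in addition to, logging them.
    opts = sldvoptions;
    opts.Mode = 'TestGeneration';
    [status, sldvFiles] = sldvrun('myControllerModel', opts);

    % Create a harness for the referenced model from the logged signals so
    % that you can add or modify test cases in its Signal Builder (steps 2
    % and 3), then continue with merging and execution as in the earlier
    % sketches.
    save('controller_logged.mat', 'loggedData');
    harnessFile = slvnvmakeharness('myControllerModel', 'controller_logged.mat');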