Use Sensors

To make the simulation of a model respond to user actions or to events occurring in the virtual world, you can use virtual reality sensors. To move graphics objects around in a virtual world during simulation, or to change their appearance based on user actions or events, you can:

  1. Define a sensor node, which generates events and output values depending on time, navigation, user actions, and distance changes in the scene. For example, a TouchSensor node tracks the location and state of the pointing device and detects when you point at the geometry contained by its parent group. See Add Sensors to Virtual Worlds.

  2. Add a VR Source block and select the sensor properties to read. See Read Sensor Values Using VR Source Blocks.

    Note

    Instead of using a VR Source block to read sensor values, you can write an S-function or use a MATLAB Function block.

    If you are working in MATLAB®, you can read sensor values by using vrnode object properties, as in the sketch after this list.

  3. Use the sensor values that the VR Source block outputs to drive simulation behavior.
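
For example, this minimal MATLAB sketch reads a TouchSensor output field through a vrnode object. The world file touchworld.wrl and the node name Touch_Sensor are hypothetical placeholders; substitute the names from your own scene.

    % Open a virtual world that contains a named TouchSensor node.
    w = vrworld('touchworld.wrl');    % hypothetical world file
    open(w);

    % Get a handle to the sensor node and read one of its output fields.
    ts = vrnode(w, 'Touch_Sensor');   % hypothetical node name
    isOver = ts.isOver                % true while pointing at the geometry

    % Close and delete the world handle when you are done.
    close(w);
    delete(w);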

Add Sensors to Virtual Worlds

You can set up an interface from a Simulink® block diagram to sensors in a virtual reality scene. You can also read signals from the virtual world into a simulation model programmatically.

Virtual reality scenes can contain sensors, which are nodes that generate events and output values depending on time, navigation, user actions, and distance changes in the scene. These nodes add interactivity to the virtual world. Virtual world sensors resemble real-world sensors, such as ultrasonic, lidar, and touch sensors. You can use Simulink 3D Animation™ functions to read sensor field values into simulation models and to control a simulation based on user interaction with the virtual scene.

Ways you can use sensors include:

  • Use sensor data from a virtual world to control a simulation.

  • Provide interactivity between user navigation and interaction in a virtual world and the simulation of the model.

  • Have a simulation react to virtual world events, such as time ticks or outputs from scripts.

  • Use static information from the virtual world, such as the size of a box, to control a simulation.

You can use collision detection to accurately model physical constraints of objects in the real world, where generally two objects cannot be in the same place at the same time. You can use collision detection node outputs to:

  • Change the state of other virtual world nodes.

  • Apply MATLAB algorithms to collision data.

  • Drive Simulink models.

For example, you can use geometric sensors for robotics modeling. For more information, see Detect Object Collisions.
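
On the MATLAB side, you can read pick sensor outputs the same way as any other node fields. In this sketch, the world file robot_scene.wrl and the node name Line_Pick are hypothetical, and the isActive and pickedPoint field names follow the X3D picking component; check the fields that your sensor node actually exposes.

    % Read collision data from a LinePickSensor node in a scene.
    w = vrworld('robot_scene.wrl');   % hypothetical world file
    open(w);

    lp = vrnode(w, 'Line_Pick');      % hypothetical LinePickSensor node
    if lp.isActive                    % true while a collision is detected
        points = lp.pickedPoint;      % coordinates of the picked points
    end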

You can define these sensors in a scene:

  • CylinderSensor: Maps pointing device motion (for example, mouse movement) into a rotation on an invisible cylinder that is aligned with the y-axis of the local coordinate system.

  • PlaneSensor: Maps pointing device motion into a two-dimensional translation in a plane parallel to the z=0 plane of the local coordinate system.

  • ProximitySensor: Generates events when the viewer enters, exits, and moves within a region of space (defined by a box).

  • SphereSensor: Maps pointing device motion into a spherical rotation about the origin of the local coordinate system.

  • TimeSensor: Generates events as time passes.

  • TouchSensor: Tracks the location and state of the pointing device and detects when you point at geometry contained by the TouchSensor node parent group.

  • VisibilitySensor: Detects visibility changes of a rectangular box as you navigate the world.

  • PointPickSensor: Uses point clouds to detect which points lie inside colliding geometries.

  • LinePickSensor: Uses ray fans or other sets of lines to detect the distance to colliding geometries.

  • PrimitivePickSensor: Uses primitive geometries (such as a cone, sphere, or box) to detect colliding geometries.

Read Sensor Values

You can read values from sensor nodes in a virtual world by using:

  • A VR Source block in a Simulink model

  • An S-function or a MATLAB Function block

  • vrnode object properties in MATLAB

Read Sensor Values Using VR Source Blocks

You can use the VR Source block to provide interactivity between a user navigating the virtual world and the simulation of a Simulink model. The VR Source block registers user interactions with the virtual world and passes that data to the model to affect the simulation. The block reads the values of the virtual world fields specified in its Block Parameters dialog box and inputs them to the model.

For example, you can specify setpoints (the desired positions) in the virtual world, so that a user can specify the location of a virtual world object interactively. The simulation then responds to the changed location of the object. The VR Source block can read into the model events from the virtual world, such as time ticks or outputs from scripts. The VR Source block can also read into the model static information about the virtual world (for example, the size of a box defined in the virtual world 3D file).

For examples that use a VR Source block, see Virtual Control Panel and Magnetic Levitation Model.
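
You can also insert the block programmatically from the Simulink 3D Animation block library, vrlib. This is a minimal sketch; the model name vrSensorDemo is hypothetical, and you still select the world file and the fields to read in the Block Parameters dialog box.

    % Create a model and add a VR Source block from the vrlib library.
    new_system('vrSensorDemo');
    open_system('vrSensorDemo');
    add_block('vrlib/VR Source', 'vrSensorDemo/VR Source');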

Read Sensor Values Using S-Functions

To use the setpoint value in a Simulink model, you can write an S-function or a MATLAB Function block that reads the sensor output periodically. This example uses an S-function.

  1. Right-click the VR Sensor Reader block in the Magnetic Levitation Model (vrmaglev) and select Mask > Look Under Mask.

    The vrmaglev/VR Sensor Reader subsystem opens. It contains the vrextin block, which is an S-function block. The vrextin S-function synchronizes the sensor field in its setup method and periodically reads its value in the mdlUpdate method.

  2. Examine the S-function parameters. Right-click vrextin and select S-Function Parameters.

    The parameters defined in the mask supply the sample time, virtual world, and the node and field to read.
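
To illustrate the pattern that vrextin follows, here is a minimal Level-2 MATLAB S-function sketch that opens a world in its start method and reads a node field at every sample hit. It is a simplified, hypothetical stand-in, not the actual vrextin implementation: unlike vrextin (see the notes below), it drives its own output port, and the maglev.wrl file name, the Grab_Sensor node name, and the sample time are assumptions based on this example.

    function vrsensor_msfun(block)
    %VRSENSOR_MSFUN Sketch of a Level-2 MATLAB S-function reading a sensor.
    setup(block);
    end

    function setup(block)
    block.NumInputPorts  = 0;
    block.NumOutputPorts = 1;
    block.OutputPort(1).Dimensions   = 3;    % SFVec3f translation value
    block.OutputPort(1).SamplingMode = 'Sample';
    block.SampleTimes = [0.1 0];             % discrete rate (assumed)
    block.RegBlockMethod('Start',     @Start);
    block.RegBlockMethod('Outputs',   @Outputs);
    block.RegBlockMethod('Terminate', @Terminate);
    end

    function Start(block)
    % Open the world and cache the sensor node handle in UserData.
    w = vrworld('maglev.wrl');
    open(w);
    ud.world = w;
    ud.node  = vrnode(w, 'Grab_Sensor');
    set_param(block.BlockHandle, 'UserData', ud);
    end

    function Outputs(block)
    % Read the current sensor value and pass it to the model.
    ud = get_param(block.BlockHandle, 'UserData');
    block.OutputPort(1).Data = ud.node.translation(:);
    end

    function Terminate(block)
    % Close the world when the simulation stops.
    ud = get_param(block.BlockHandle, 'UserData');
    close(ud.world);
    end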

Notes About the vrextin S-Function

  • Instead of setting its own block outputs, the vrextin S-function sets the value of the adjacent Constant block value_holder. This setting makes the VR Sensor Reader block compatible with Simulink Coder™ code generation so that the model can run on Simulink Coder targets.

  • The signal loop between a user action (moving the ball to a desired position with the mouse) and the ball motion closes through the associated Simulink model, vrmaglev. Grabbing the ball and moving it to a new position works only when the model is running and when the model sets the blue selection method switch to the virtual reality sensor signal path. To experience the behavior of the PlaneSensor using the virtual scene only, save the maglev.wrl file under a new name and remove the comment symbol (#) to enable the last line of the file. These actions route the sensor output directly to the ball translation. You can then experiment with the newly created scene instead of the original maglev.wrl world.

    ROUTE Grab_Sensor.translation_changed TO Ball.translation
  • You can use this approach to read information from any node field of type exposedField or eventOut, not only from a sensor eventOut field. See VRML Data Class Types for more information about virtual world data class types.

  • For fields of class exposedField, you can use an alternate name formed by appending the suffix _changed to the field name. For example, translation and translation_changed are alternate names for requesting the translation field value of the Grab_Sensor node.
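
    For example, given an open vrworld handle w, both of these reads return the same translation field value of the Grab_Sensor node:

    n  = vrnode(w, 'Grab_Sensor');
    t1 = n.translation;           % exposedField name
    t2 = n.translation_changed;   % alternate _changed name, same value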
