Lidar sensor model in 3D simulation environment
Automated Driving Toolbox / Simulation 3D
The Simulation 3D Lidar block provides an interface to the lidar sensor in a 3D simulation environment. This environment is rendered using the Unreal Engine® from Epic Games®. The block returns a point cloud with the specified field of view and angular resolution. You can also output the distances from the sensor to object points. In addition, you can output the location and orientation of the sensor in the world coordinate system of the scene.
If you set Sample time to -1, the block uses the sample time specified in the Simulation 3D Scene Configuration block. To use this sensor, ensure that the Simulation 3D Scene Configuration block is in your model.
The Simulation 3D Scene Configuration block must execute before the Simulation 3D Lidar block. That way, the Unreal Engine 3D visualization environment prepares the data before the Simulation 3D Lidar block receives it. To check the block execution order, right-click the blocks and select Properties. On the General tab, confirm these Priority settings:
Simulation 3D Scene Configuration — 0
Simulation 3D Lidar — 1
For more information about execution order, see How 3D Simulation for Automated Driving Works.
Point cloud
— Point cloud data

Point cloud data, returned as an m-by-n-by-3 array of positive, real-valued [x, y, z] points. m and n define the number of points in the point cloud, as shown in these equations:

m = VFOV / VRES

n = HFOV / HRES
where:
VFOV is the vertical field of view of the lidar, in degrees, as specified by the Vertical field of view (deg) parameter.
VRES is the vertical angular resolution of the lidar, in degrees, as specified by the Vertical resolution (deg) parameter.
HFOV is the horizontal field of view of the lidar, in degrees, as specified by the Horizontal field of view (deg) parameter.
HRES is the horizontal angular resolution of the lidar, in degrees, as specified by the Horizontal resolution (deg) parameter.
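With the default parameter values, these relations give a 32-by-2250 point cloud. A quick numeric check, sketched here in Python for illustration, assuming m = VFOV/VRES and n = HFOV/HRES:

```python
# Point cloud dimensions from the default lidar parameters,
# assuming m = VFOV / VRES and n = HFOV / HRES.
vfov, vres = 40.0, 1.25    # Vertical field of view / resolution (deg)
hfov, hres = 360.0, 0.16   # Horizontal field of view / resolution (deg)

m = round(vfov / vres)     # number of rows (elevation channels)
n = round(hfov / hres)     # number of columns (azimuth steps)
print(m, n)                # 32 2250
```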
Each m-by-n entry in the array specifies the x, y, and z coordinates of a detected point in the sensor coordinate system. If the lidar does not detect a point at a given coordinate, then x, y, and z are returned as NaN.
You can create a point cloud from these returned points by using point cloud functions in a MATLAB Function block. For a list of point cloud processing functions, see Lidar Processing. For an example that uses these functions, see Simulate Lidar Sensor Perception Algorithm.
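Before further processing, you typically discard the NaN no-return entries. A minimal sketch of that filtering step, shown in Python with NumPy for illustration (in Simulink, the equivalent logic would live in a MATLAB Function block):

```python
import numpy as np

# Hypothetical 2-by-2-by-3 point cloud in the sensor frame; a row of
# NaN values marks a direction where the lidar returned no point.
pc = np.array([[[1.0, 0.5, 0.2], [np.nan, np.nan, np.nan]],
               [[4.0, 0.0, 1.0], [2.0, 2.0, 0.0]]])

pts = pc.reshape(-1, 3)               # flatten to a list of [x, y, z] points
valid = ~np.isnan(pts).any(axis=1)    # keep only actual detections
print(pts[valid].shape)               # (3, 3)
```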
Data Types: single
Distance
— Distance to object points

Distance to object points measured by the lidar sensor, returned as an m-by-n positive real-valued matrix. Each value in the matrix corresponds to an [x, y, z] coordinate point returned by the Point cloud output port.
To enable this port, on the Parameters tab, select Distance outport.
Data Types: single
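To illustrate the correspondence between the two ports, each distance value can be interpreted as the Euclidean range of its [x, y, z] point (an assumption for illustration; the exact formula the block uses is not stated here). Sketched in Python:

```python
import numpy as np

# Hypothetical 1-by-2-by-3 point cloud; the second entry is a no-return.
# Assumption: Distance is the Euclidean range to each [x, y, z] point.
pc = np.array([[[3.0, 4.0, 0.0], [np.nan, np.nan, np.nan]]])
dist = np.linalg.norm(pc, axis=2)   # 1-by-2 matrix, same layout as the ports
print(dist[0, 0])                   # 5.0
```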
Location
— Sensor location

Sensor location along the X-axis, Y-axis, and Z-axis of the scene. The Location values are in the world coordinates of the scene. In this coordinate system, the Z-axis points up from the ground. Units are in meters.
To enable this port, on the Ground Truth tab, select Output location (m) and orientation (rad).
Data Types: double
Orientation
— Sensor orientation

Roll, pitch, and yaw sensor orientation about the X-axis, Y-axis, and Z-axis of the scene. The Orientation values are in the world coordinates of the scene. These values are positive in the clockwise direction when looking in the positive directions of these axes. Units are in radians.
To enable this port, on the Ground Truth tab, select Output location (m) and orientation (rad).
Data Types: double
Sensor identifier
— Unique sensor identifier

1 (default) | positive integer

Unique sensor identifier, specified as a positive integer. In a multisensor system, the sensor identifier distinguishes between sensors. When you add a new sensor block to your model, the Sensor identifier of that block is N + 1, where N is the highest Sensor identifier value among the existing sensor blocks in the model.
Example: 2
Parent name
— Name of parent to which sensor is mounted

Scene Origin (default) | vehicle name

Name of the parent to which the sensor is mounted, specified as Scene Origin or as the name of a vehicle in your model. The vehicle names that you can select correspond to the Name parameters of the Simulation 3D Vehicle with Ground Following blocks in your model. If you select Scene Origin, the block places the sensor at the scene origin.
Example: SimulinkVehicle1
Mounting location
— Sensor mounting location

Origin (default) | Front bumper | Rear bumper | Right mirror | Left mirror | Rearview mirror | Hood center | Roof center

Sensor mounting location.

When Parent name is Scene Origin, the block mounts the sensor to the origin of the scene, and you can set Mounting location only to Origin. During simulation, the sensor remains stationary.

When Parent name is the name of a vehicle (for example, SimulinkVehicle1), the block mounts the sensor to one of the predefined mounting locations described in the table. During simulation, the sensor travels with the vehicle.
Vehicle Mounting Location | Description | Orientation Relative to Vehicle Origin [Roll, Pitch, Yaw] (deg) |
---|---|---|
Origin | Forward-facing sensor mounted to the vehicle origin, which is on the ground, at the geometric center of the vehicle (see Coordinate Systems for 3D Simulation in Automated Driving Toolbox) | [0, 0, 0] |
Front bumper | Forward-facing sensor mounted to the front bumper | [0, 0, 0] |
Rear bumper | Backward-facing sensor mounted to the rear bumper | [0, 0, 180] |
Right mirror | Downward-facing sensor mounted to the right side-view mirror | [0, –90, 0] |
Left mirror | Downward-facing sensor mounted to the left side-view mirror | [0, –90, 0] |
Rearview mirror | Forward-facing sensor mounted to the rearview mirror, inside the vehicle | [0, 0, 0] |
Hood center | Forward-facing sensor mounted to the center of the hood | [0, 0, 0] |
Roof center | Forward-facing sensor mounted to the center of the roof | [0, 0, 0] |
The (X, Y, Z) location of the sensor relative to the vehicle depends on the vehicle type. To specify the vehicle type, use the Type parameter of the Simulation 3D Vehicle with Ground Following block to which you are mounting. The tables show the X, Y, and Z locations of sensors in the vehicle coordinate system. In this coordinate system:
The X-axis points forward from the vehicle.
The Y-axis points to the left of the vehicle, as viewed when facing forward.
The Z-axis points up from the ground.
Roll, pitch, and yaw are clockwise-positive when looking in the positive direction of the X-axis, Y-axis, and Z-axis, respectively. When you look at a vehicle from the top down, the yaw angle (that is, the orientation angle) is counterclockwise-positive, because you are looking in the negative direction of the Z-axis.
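This sign convention is consistent with a standard right-handed rotation about each axis. A quick check for yaw, sketched in Python and assuming the usual rotation matrix about the Z-axis:

```python
import numpy as np

# Sanity check of the yaw sign convention, assuming a standard
# right-handed rotation about the Z-axis (Z points up from the ground).
yaw = np.deg2rad(90)                          # +90 deg yaw
Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
               [np.sin(yaw),  np.cos(yaw), 0.0],
               [0.0,          0.0,         1.0]])
forward = np.array([1.0, 0.0, 0.0])           # vehicle +X (forward)
left = Rz @ forward                           # rotated direction
print(np.round(left, 6))                      # [0. 1. 0.] -> +Y (vehicle left)
```

Viewed from the top down (looking in the negative Z direction), forward rotating toward left is counterclockwise, which matches the convention described above.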
Box Truck — Sensor Locations Relative to Vehicle Origin
Mounting Location | X (m) | Y (m) | Z (m) |
---|---|---|---|
Front bumper | 5.10 | 0 | 0.60 |
Rear bumper | –5 | 0 | 0.60 |
Left mirror | 2.90 | 1.60 | 2.10 |
Right mirror | 2.90 | –1.60 | 2.10 |
Rearview mirror | 2.60 | 0.20 | 2.60 |
Hood center | 3.80 | 0 | 2.10 |
Roof center | 1.30 | 0 | 4.20 |
Hatchback — Sensor Locations Relative to Vehicle Origin
Mounting Location | X (m) | Y (m) | Z (m) |
---|---|---|---|
Front bumper | 1.93 | 0 | 0.51 |
Rear bumper | –1.93 | 0 | 0.51 |
Right mirror | 0.43 | –0.84 | 1.01 |
Left mirror | 0.43 | 0.84 | 1.01 |
Rearview mirror | 0.32 | 0 | 1.27 |
Hood center | 1.44 | 0 | 1.01 |
Roof center | 0 | 0 | 1.57 |
Muscle Car — Sensor Locations Relative to Vehicle Origin
Mounting Location | X (m) | Y (m) | Z (m) |
---|---|---|---|
Front bumper | 2.47 | 0 | 0.45 |
Rear bumper | –2.47 | 0 | 0.45 |
Right mirror | 0.43 | –1.08 | 1.01 |
Left mirror | 0.43 | 1.08 | 1.01 |
Rearview mirror | 0.32 | 0 | 1.20 |
Hood center | 1.28 | 0 | 1.14 |
Roof center | –0.25 | 0 | 1.58 |
Sedan — Sensor Locations Relative to Vehicle Origin
Mounting Location | X (m) | Y (m) | Z (m) |
---|---|---|---|
Front bumper | 2.42 | 0 | 0.51 |
Rear bumper | –2.42 | 0 | 0.51 |
Right mirror | 0.59 | –0.94 | 1.09 |
Left mirror | 0.59 | 0.94 | 1.09 |
Rearview mirror | 0.43 | 0 | 1.31 |
Hood center | 1.46 | 0 | 1.11 |
Roof center | –0.45 | 0 | 1.69 |
Small Pickup Truck — Sensor Locations Relative to Vehicle Origin
Mounting Location | X (m) | Y (m) | Z (m) |
---|---|---|---|
Front bumper | 3.07 | 0 | 0.51 |
Rear bumper | –3.07 | 0 | 0.51 |
Right mirror | 1.10 | –1.13 | 1.52 |
Left mirror | 1.10 | 1.13 | 1.52 |
Rearview mirror | 0.85 | 0 | 1.77 |
Hood center | 2.22 | 0 | 1.59 |
Roof center | 0 | 0 | 2.27 |
Sport Utility Vehicle — Sensor Locations Relative to Vehicle Origin
Mounting Location | X (m) | Y (m) | Z (m) |
---|---|---|---|
Front bumper | 2.42 | 0 | 0.51 |
Rear bumper | –2.42 | 0 | 0.51 |
Right mirror | 0.60 | –1 | 1.35 |
Left mirror | 0.60 | 1 | 1.35 |
Rearview mirror | 0.39 | 0 | 1.55 |
Hood center | 1.58 | 0 | 1.39 |
Roof center | –0.56 | 0 | 2 |
To determine the location of the sensor in world coordinates, open the sensor block. Then, on the Ground Truth tab, select Output location (m) and orientation (rad) and inspect the data from the Location output port.
Specify offset
— Specify offset from mounting location

off (default) | on
Select this parameter to specify an offset from the mounting location by using the Relative translation [X, Y, Z] (m) and Relative rotation [Roll, Pitch, Yaw] (deg) parameters.
Relative translation [X, Y, Z] (m)
— Translation offset relative to mounting location

[0, 0, 0] (default) | real-valued 1-by-3 vector

Translation offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [X, Y, Z]. Units are in meters.
If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then X, Y, and Z are in the vehicle coordinate system, where:
The X-axis points forward from the vehicle.
The Y-axis points to the left of the vehicle, as viewed when facing forward.
The Z-axis points up.
The origin is the mounting location specified in the Mounting location parameter. This origin is different from the vehicle origin, which is the geometric center of the vehicle.
If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then X, Y, and Z are in the world coordinates of the scene.
For more details about the vehicle and world coordinate systems, see Coordinate Systems for 3D Simulation in Automated Driving Toolbox.
Example: [0,0,0.01]
To enable this parameter, select Specify offset.
Relative rotation [Roll, Pitch, Yaw] (deg)
— Rotational offset relative to mounting location

[0, 0, 0] (default) | real-valued 1-by-3 vector

Rotational offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [Roll, Pitch, Yaw]. Roll, pitch, and yaw are the angles of rotation about the X-axis, Y-axis, and Z-axis, respectively. Units are in degrees.

If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then the rotations are about the axes of the vehicle coordinate system, where:
The X-axis points forward from the vehicle.
The Y-axis points to the left of the vehicle, as viewed when facing forward.
The Z-axis points up.
Roll, pitch, and yaw are clockwise-positive when looking in the positive direction of the X-axis, Y-axis, and Z-axis, respectively. If you view a scene from a 2D top-down perspective, then the yaw angle (also called the orientation angle) is counterclockwise-positive, because you are viewing the scene in the negative direction of the Z-axis.
The origin is the mounting location specified in the Mounting location parameter. This origin is different from the vehicle origin, which is the geometric center of the vehicle.
If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then the rotations are about the axes of the world coordinate system of the scene.
For more details about the vehicle and world coordinate systems, see Coordinate Systems for 3D Simulation in Automated Driving Toolbox.
Example: [0,0,10]
To enable this parameter, select Specify offset.
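Together, the translation and rotation offsets define a rigid transform between the sensor frame and the mounting-location frame. The sketch below illustrates that composition in Python, assuming the common Rz(yaw)·Ry(pitch)·Rx(roll) convention and hypothetical offset values matching the parameter examples:

```python
import numpy as np

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix from roll, pitch, yaw (radians), assuming
    the common Rz(yaw) @ Ry(pitch) @ Rx(roll) composition."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Hypothetical offsets: 0.01 m up and 10 deg of yaw, as in the examples.
t = np.array([0.0, 0.0, 0.01])               # Relative translation [X, Y, Z]
R = rpy_to_matrix(0.0, 0.0, np.deg2rad(10))  # Relative rotation [Roll, Pitch, Yaw]

p_sensor = np.array([1.0, 0.0, 0.0])         # a point in the sensor frame
p_mount = R @ p_sensor + t                   # same point in the mounting frame
print(np.round(p_mount, 3))
```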
Sample time
— Sample time

-1 (default) | positive scalar

Sample time of the block in seconds, specified as a positive scalar. The 3D simulation environment frame rate is the inverse of the sample time.

If you set the sample time to -1, the block inherits its sample time from the Simulation 3D Scene Configuration block.
Detection range (m)
— Maximum distance measured by lidar sensor

120 (default) | positive scalar

Maximum distance measured by the lidar sensor, specified as a positive scalar. Points outside this range are ignored. Units are in meters.
Range resolution (m)
— Resolution of lidar sensor range

0.002 (default) | positive real scalar

Resolution of the lidar sensor range, in meters, specified as a positive real scalar. The range resolution is also known as the quantization factor. The minimum value of this factor is Drange / 2^24, where Drange is the maximum distance measured by the lidar sensor, as specified in the Detection range (m) parameter.
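For the default detection range, this quantization limit is far below the default resolution. A quick check in Python, assuming a 24-bit quantization limit of Drange / 2^24:

```python
# Minimum range resolution at the default detection range,
# assuming a 24-bit quantization limit of Drange / 2**24.
d_range = 120.0              # Detection range (m), default
min_res = d_range / 2**24    # ~7.15e-06 m, far below the 0.002 m default
print(min_res < 0.002)       # True
```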
Vertical field of view (deg)
— Vertical field of view

40 (default) | positive scalar

Vertical field of view of the lidar sensor, specified as a positive scalar. Units are in degrees.
Vertical resolution (deg)
— Vertical angular resolution

1.25 (default) | positive scalar

Vertical angular resolution of the lidar sensor, specified as a positive scalar. Units are in degrees.
Horizontal field of view (deg)
— Horizontal field of view

360 (default) | positive scalar

Horizontal field of view of the lidar sensor, specified as a positive scalar. Units are in degrees.
Horizontal resolution (deg)
— Horizontal angular (azimuth) resolution

0.16 (default) | positive scalar

Horizontal angular (azimuth) resolution of the lidar sensor, specified as a positive scalar. Units are in degrees.
Distance outport
— Output distance to measured object points

off (default) | on
Select this parameter to output the distance to measured object points at the Distance port.
Output location (m) and orientation (rad)
— Output location and orientation of sensor

off (default) | on
Select this parameter to output the location and orientation of the sensor at the Location and Orientation ports, respectively.
To visualize point clouds that are output by the Point cloud port, you can either:
Use a pcplayer object in a MATLAB Function block. For an example of this visualization setup, see Simulate Lidar Sensor Perception Algorithm.
Use the Bird's-Eye Scope. For more details, see Visualize 3D Simulation Sensor Coverages and Detections.
Because the Unreal Engine can take a long time to start up between simulations, consider logging the signals that the sensors output. You can then use this data to develop perception algorithms in MATLAB®. See Configure a Signal for Logging (Simulink).