This example shows how to record synthetic lidar sensor data using a 3D simulation environment, develop a perception algorithm using the recorded data, and use that algorithm within the simulation environment.
Automated Driving Toolbox™ integrates a 3D simulation environment in Simulink®. The 3D simulation environment uses the Unreal Engine® by Epic Games®. Simulink blocks related to the 3D simulation environment can be found in the drivingsim3d library. These blocks provide the ability to:
Select different scenes in the 3D simulation environment
Place and move vehicles in the scene
Attach and configure sensors on the vehicles
Simulate sensor data based on the environment around the vehicle
This powerful simulation tool can be used to supplement real data when developing, testing, and verifying the performance of automated driving algorithms, making it possible to test scenarios that are difficult to reproduce in the real world.
In this example, you evaluate a lidar perception algorithm using synthetic lidar data generated from the 3D simulation environment. The example walks you through the following steps:
Record and visualize synthetic lidar sensor data from the 3D simulation environment.
Develop a perception algorithm to build a map in MATLAB®.
Use the perception algorithm within the simulation environment.
First, set up a scenario in the 3D simulation environment that can be used to test the perception algorithm. Use a scene depicting a typical city block with a single vehicle, the vehicle under test. You can use this scene to test the performance of the algorithm in an urban road setting.
Next, select a trajectory for the vehicle to follow in the scene. The Select Waypoints for 3D Simulation example describes how to interactively select a sequence of waypoints from a scene and generate a vehicle trajectory. This example uses a recorded drive segment obtained using the helperSelectSceneWaypoints function, as described in the waypoint selection example.
% Load reference path for recorded drive segment
data = load('sim3d_DriveSegmentUSCityBlock.mat');

% Set up workspace variables used by model
refPosesX = data.driveSegmentUSCityBlock.refPathX;
refPosesY = data.driveSegmentUSCityBlock.refPathY;
refPosesT = data.driveSegmentUSCityBlock.refPathT;

% Display path on scene image
sceneName = 'USCityBlock';
hScene = figure;
helperShowSceneImage(sceneName);
hold on
scatter(refPosesX(:,2), refPosesY(:,2), [], 'filled')

% Adjust axes limits
xlim([-60 30])
ylim([-30 50])
The SimulateLidarSensorPerceptionAlgorithm Simulink model is configured with the US City Block scene using the Simulation 3D Scene Configuration block. The model places a vehicle on the scene using the Simulation 3D Vehicle with Ground Following block. The lidar sensor is attached to the vehicle using the Simulation 3D Lidar block. In the block dialog box, use the Mounting tab to adjust the placement of the sensor, and the Parameters tab to configure properties of the sensor to simulate different lidar sensors. In this example, the lidar is mounted on the center of the roof. The lidar sensor is configured to model a typical Velodyne® HDL-32E sensor.
close(hScene)

if ~ispc
    error(['3D Simulation is only supported on Microsoft', char(174), ...
        ' Windows', char(174), '.']);
end

% Open the model
modelName = 'SimulateLidarSensorPerceptionAlgorithm';
open_system(modelName);
snapnow;
The model uses an enabled subsystem to operate in one of two modes, based on the status of the recordMode workspace variable.
Record Mode: When recordMode is set to true, the model records and visualizes the synthetic lidar data.
Algorithm Mode: When recordMode is set to false, the model runs the perception algorithm described later in the Simulate with Perception Algorithm section.
For more details about enabled subsystems, see Using Enabled Subsystems (Simulink).
The rest of the example follows these steps:
Simulate the model in record mode. In this mode, the model records synthetic lidar data generated by the sensor and saves it to the workspace.
Use the sensor data saved to the workspace to develop a perception algorithm in MATLAB. The perception algorithm builds a map of the surroundings.
Use the developed perception-in-the-loop algorithm by simulating the model in algorithm mode.
The Record and Visualize subsystem records the synthetic lidar data to the workspace using a To Workspace block. The Visualize Point Cloud MATLAB Function block uses a pcplayer object to visualize the streaming point clouds.
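As a self-contained sketch of this kind of streaming visualization, the loop below creates a pcplayer and feeds it a few synthetic point clouds. The axis limits and random point data are illustrative values only, not taken from the model; the Visualize Point Cloud block performs the equivalent operation on the live sensor signal.

```matlab
% Minimal sketch of streaming point cloud visualization with pcplayer
% (illustrative synthetic data, not output from the simulation model)
player = pcplayer([-50 50], [-50 50], [-10 20]);

for k = 1:10
    % Create a synthetic frame of 1000 random points in a 100 m cube
    xyz = 100*rand(1000, 3) - 50;
    ptCloud = pointCloud(xyz);

    % Stop streaming if the display window has been closed
    if ~isOpen(player)
        break
    end
    view(player, ptCloud);
end
```

In the model, each incoming lidar frame takes the place of the synthetic cloud, so the display updates once per simulation step.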
subsystemName = [modelName, '/', 'Record and Visualize'];
open_system(subsystemName, 'force');
snapnow;
Simulate the model in record mode and observe the streaming point cloud display, which shows the synthetic lidar sensor data. Once the model has completed simulation, the simOut variable holds a structure with the variables written to the workspace. The helperGetPointCloud function extracts the sensor data into an array of pointCloud objects. The pointCloud object is the fundamental data structure used to hold lidar data and perform point cloud processing in MATLAB.
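As a minimal illustration of the pointCloud object itself (using small synthetic values rather than data from the model), an M-by-N-by-3 array of Cartesian coordinates can be wrapped as an organized point cloud:

```matlab
% Build a small organized point cloud from a synthetic 4-by-4-by-3
% array of XYZ coordinates (illustrative values only)
[X, Y] = meshgrid(1:4, 1:4);
Z = zeros(4, 4);
xyzPoints = cat(3, X, Y, Z);

ptCloud = pointCloud(xyzPoints);

% Inspect basic properties of the object
disp(ptCloud.Count)      % number of points in the cloud
disp(ptCloud.XLimits)    % range of x-coordinates
```

Lidar sensors typically produce organized point clouds like this, where the M-by-N arrangement mirrors the sensor's scan pattern.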
% Close the Record and Visualize subsystem
close_system(subsystemName);

% Set mode to record data
recordMode = true; %#ok<NASGU>

% Update simulation stop time to end when reference path is completed
simStopTime = refPosesX(end,1);
set_param(gcs, 'StopTime', num2str(simStopTime));

% Run the simulation
simOut = sim(modelName);

% Create a pointCloud array from the recorded data
ptCloudArr = helperGetPointCloud(simOut);
The synthetic lidar sensor data can be used to develop, experiment with, and verify a perception algorithm in different scenarios. This example uses an algorithm to build a 3D map of the environment from streaming lidar data. Such an algorithm is a building block for applications like simultaneous localization and mapping (SLAM). This algorithm can also be used to create high-definition (HD) maps for geographic regions, which can then be used for online localization. The map building algorithm is encapsulated in the helperLidarMapBuilder class. This class uses point cloud and lidar processing capabilities in MATLAB. For more details, see Lidar and Point Cloud Processing (Computer Vision Toolbox).
The helperLidarMapBuilder class takes incoming point clouds from a lidar sensor and progressively builds a map using the following steps:
Preprocess each incoming point cloud to remove the ground and the ego vehicle.
Register the incoming point cloud with the previous point cloud using the normal-distributions transform (NDT). The pcregisterndt function performs the registration. To improve the accuracy and efficiency of registration, the pcdownsample function is used to downsample the point clouds prior to registration.
Use the estimated transformation obtained from registration to transform the incoming point cloud to the frame of reference of the map.
For simplicity, this example uses a lidar-only mapping algorithm, with no external cues from other sensors. Such an algorithm is susceptible to drift while accumulating a map over long sequences. Moreover, NDT-based registration is sensitive to initialization. It is typical to use external cues such as dead reckoning or an inertial measurement unit (IMU) to initialize the registration.
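The steps above can be sketched as a simplified registration loop. This is an illustrative approximation, not the actual implementation of helperLidarMapBuilder: preprocessing (ground and ego-vehicle removal) is omitted, the grid steps and merge size are placeholder values, and the sketch assumes pcregisterndt returns a rigid3d transform (older releases return an affine3d).

```matlab
% Simplified lidar map-building loop (illustrative sketch only; not the
% internals of helperLidarMapBuilder)
gridStep  = 2;       % NDT voxel size, illustrative value
mergeSize = 0.1;     % map merge grid size, illustrative value

ptCloudMap = ptCloudArr(1);                                % seed map with first frame
prevCloud  = pcdownsample(ptCloudArr(1), 'gridAverage', 0.5);
accumTform = rigid3d;                                      % identity: frame 1 = map frame

for n = 2 : numel(ptCloudArr)
    % Downsample to improve registration accuracy and speed
    currCloud = pcdownsample(ptCloudArr(n), 'gridAverage', 0.5);

    % Register the current frame to the previous one using NDT
    relTform = pcregisterndt(currCloud, prevCloud, gridStep);

    % Compose the relative transform into the map frame of reference
    % (post-multiply convention used by rigid3d)
    accumTform = rigid3d(relTform.T * accumTform.T);

    % Transform the incoming cloud into the map frame and merge it
    alignedCloud = pctransform(ptCloudArr(n), accumTform);
    ptCloudMap   = pcmerge(ptCloudMap, alignedCloud, mergeSize);

    prevCloud = currCloud;
end
```

A production map builder would also drop redundant frames, remove the ground plane before registration, and seed each registration with an initial transform estimate to reduce drift.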
% Create a lidar map builder
mapBuilder = helperLidarMapBuilder();

% Loop through the point cloud array and progressively build a map
skipFrames = 5;
numFrames  = numel(ptCloudArr);
exitLoop   = false;

for n = 1 : skipFrames : numFrames-skipFrames
    % Update map with new lidar frame
    updateMap(mapBuilder, ptCloudArr(n));

    % Update top-view display
    isDisplayOpen = updateDisplay(mapBuilder, exitLoop);

    % Check and exit if needed
    exitLoop = ~isDisplayOpen;
end
snapnow;

% Close display
closeDisplay = true;
updateDisplay(mapBuilder, closeDisplay);
Visualize the accumulated map computed using the recorded data.
hFigRecorded = figure;
pcshow(mapBuilder.Map)
title('Point Cloud Map - Recorded Data')

% Customize axes labels
xlabel('X (m)')
ylabel('Y (m)')
zlabel('Z (m)')
After developing the perception algorithm using recorded data, you can use the algorithm in the simulation environment.
To switch the model to algorithm mode, set the recordMode variable to false. This mode enables the Build Map from Lidar subsystem, which uses a MATLAB Function block to run the perception algorithm during simulation.
close(hFigRecorded)

subsystemName = [modelName, '/', 'Build Map from Lidar'];
open_system(subsystemName, 'force');
snapnow;

% Set record mode to false to enable perception in the loop
recordMode = false; %#ok<NASGU>

% Simulate with perception
sim(modelName);
snapnow;
In this example, you used the Simulink interface to the 3D simulation environment to:
Record synthetic sensor data for a lidar sensor.
Develop a perception algorithm using recorded data.
Test the perception algorithm during simulation.
By changing the scene, placing more vehicles in the scene, or updating the sensor mounting and parameters, you can stress-test the perception algorithm under different scenarios. This approach can increase test coverage for scenarios that are difficult to reproduce in the real world.
% Close windows
close_system(subsystemName)
close_system(modelName)

% Reset record mode
recordMode = true;
helperGetPointCloud extracts an array of pointCloud objects from the simulation output.
function ptCloudArr = helperGetPointCloud(simOut)

% Extract signal
ptCloudData = simOut.ptCloudData.signals.values;

% Create a pointCloud array
ptCloudArr = pointCloud(ptCloudData(:,:,:,1));
for n = 2 : size(ptCloudData,4)
    ptCloudArr(end+1) = pointCloud(ptCloudData(:,:,:,n)); %#ok<AGROW>
end

end
Simulation 3D Lidar | Simulation 3D Scene Configuration | Simulation 3D Vehicle with Ground Following