Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for designing, simulating, and testing systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. Reference examples provide a starting point for multi-object tracking and sensor fusion development for surveillance and autonomous systems, including airborne, spaceborne, ground-based, shipborne, and underwater systems.
You can fuse data from real-world sensors, including active and passive radar, sonar, lidar, EO/IR, IMU, and GPS. You can also generate synthetic data from virtual sensors to test your algorithms under different scenarios. The toolbox includes multi-object trackers and estimation filters for evaluating architectures that combine grid-level, detection-level, and object- or track-level fusion. It also provides metrics, including OSPA and GOSPA, for validating performance against ground truth scenes.
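As an illustration of the metric workflow, the sketch below scores one track against one truth object with trackGOSPAMetric; the track state, truth position, and cutoff distance are made-up values, not output from a real tracker.

    % Score tracks against truth with the GOSPA metric (lower is better).
    % Distance 'posabserr' compares track position against truth position.
    metric = trackGOSPAMetric('Distance', 'posabserr', 'CutoffDistance', 30);

    % One hypothetical track and one truth near the same location.
    track = objectTrack('State', [10; 1; 20; 1; 0; 0], ...  % [x;vx;y;vy;z;vz]
                        'StateCovariance', eye(6));
    truth = struct('PlatformID', 1, 'Position', [10.5 20.2 0]);

    gospa = metric(track, truth)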
For simulation acceleration or rapid prototyping, the toolbox supports C code generation.
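As a sketch, assuming a user-written entry-point function (trackerEntryFcn here is hypothetical) that wraps a toolbox tracker in a persistent variable, C code can be generated with the codegen command from MATLAB Coder:

    % Hypothetical entry point (in its own file, trackerEntryFcn.m):
    %   function tracks = trackerEntryFcn(detections, time)
    %       persistent tracker
    %       if isempty(tracker)
    %           tracker = trackerGNN('FilterInitializationFcn', @initcvekf);
    %       end
    %       tracks = tracker(detections, time);
    %   end
    sampleDets = {objectDetection(0, [0; 0; 0])};   % example input types
    codegen trackerEntryFcn -args {sampleDets, 0} -report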
Learn the basics of Sensor Fusion and Tracking Toolbox
Examples for autonomous system tracking, surveillance system tracking, localization, and hardware connectivity
Quaternions, Euler angles, rotation matrices, and conversions
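For instance, the quaternion object converts among these representations; the angles below are arbitrary:

    % Euler angles (ZYX sequence, degrees) -> quaternion -> back again.
    q = quaternion([30 20 10], 'eulerd', 'ZYX', 'frame');
    R = rotmat(q, 'frame');            % equivalent 3-by-3 rotation matrix
    eul = eulerd(q, 'ZYX', 'frame');   % recovers [30 20 10]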
Ground-truth waypoint- and rate-based trajectories and scenarios
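For example, waypointTrajectory produces ground-truth motion from waypoints and arrival times; the waypoints and sample rate below are arbitrary:

    % Ground-truth trajectory through three waypoints at the given times.
    wps = [0 0 0; 10 0 0; 10 10 0];    % waypoints (m)
    toa = [0; 5; 10];                  % time of arrival at each waypoint (s)
    traj = waypointTrajectory(wps, toa, 'SampleRate', 10);
    [pos, orient, vel] = traj();       % next sample of the trajectory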
IMU, GPS, radar, ESM, and EO/IR
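For instance, the imuSensor model converts ground-truth motion into noisy accelerometer and gyroscope readings; the motion inputs here are made up:

    % Simulate IMU readings for a body rotating slowly about z.
    imu = imuSensor('accel-gyro', 'SampleRate', 100);
    linAcc = [0 0 0];                  % acceleration (m/s^2), navigation frame
    angVel = [0 0 0.1];                % angular velocity (rad/s)
    [accel, gyro] = imu(linAcc, angVel);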
IMU and GPS sensor fusion to determine orientation and position
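A minimal sketch, assuming an insfilterMARG with default settings; the IMU readings, GPS fix, and covariances below are placeholders, and magnetometer fusion is omitted:

    % Loosely coupled INS/GPS: predict with IMU, correct with GPS.
    filt = insfilterMARG;
    filt.IMUSampleRate = 100;

    accelReadings = [0 0 -9.81];       % placeholder IMU sample (m/s^2)
    gyroReadings  = [0 0 0];           % placeholder IMU sample (rad/s)
    predict(filt, accelReadings, gyroReadings);    % run per IMU sample

    lla = [42.3 -71.1 50];             % placeholder GPS fix [lat lon alt]
    fusegps(filt, lla, 3, [0 0 0], 0.1);           % run per GPS fix

    [position, orientation, velocity] = pose(filt);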
Kalman and particle filters, linearization functions, and motion models
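As a small example using the built-in constant-velocity motion model (the measurement values are arbitrary):

    % Constant-velocity EKF: predict ahead, then correct with a position fix.
    state = zeros(6, 1);               % [x; vx; y; vy; z; vz]
    ekf = trackingEKF(@constvel, @cvmeas, state);
    predict(ekf, 1.0);                 % propagate the state 1 s forward
    correct(ekf, [1.1; 0.2; 0]);       % position measurement [x; y; z]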
Multi-sensor multi-object trackers, data association, and track fusion
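For instance, trackerGNN maintains tracks from objectDetection inputs; the single detection here is illustrative:

    % Global nearest neighbor tracker fed one position detection.
    tracker = trackerGNN('FilterInitializationFcn', @initcvekf);
    det = objectDetection(0, [10; 5; 0]);   % time 0, measurement [x; y; z]
    tracks = tracker({det}, 0);             % tracks at time 0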
Multi-object theater plots, detection and object tracks, and track metrics
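For example, theaterPlot displays detections and tracks with dedicated plotters; the positions below are arbitrary:

    % Theater plot with separate plotters for detections and tracks.
    tp = theaterPlot('XLimits', [0 50], 'YLimits', [0 50]);
    detP = detectionPlotter(tp, 'DisplayName', 'Detections');
    trkP = trackPlotter(tp, 'DisplayName', 'Tracks');
    plotDetection(detP, [10 5 0]);          % one detection position (m)
    plotTrack(trkP, [10.2 5.1 0]);          % one track position (m)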