Real-Time Image Acquisition, Image Processing, and Fixed-Point Blob Analysis for Target Practice Analysis

This example shows how to acquire real-time images from a webcam, process the images using fixed-point blob analysis, and determine world coordinates to score a laser pistol target.

The technology featured in this example is used in a wide range of applications, such as estimating the distance to objects in front of a car or analyzing cells in medical images. Key features of this example include:

  • Fixed-point blob analysis for collecting measurements

  • Real-time image acquisition

  • Camera calibration to determine the world coordinates of image points by mapping pixel locations to locations in real-world units

  • Lens distortion correction to ensure the accuracy of measurements collected in world units

All code for this example is stored in the examples folder. To edit the code, navigate to this folder.

cd(fullfile(docroot,'toolbox','fixpoint',...
    'examples','laser_target_example'));

Copy the +LaserTargetExample folder to a writable location.

Hardware Setup

Cameras

Image Acquisition Toolbox™ enables you to acquire images and video from cameras and frame grabbers directly into MATLAB® and Simulink®. Using the Image Acquisition Toolbox Support Package for GigE Vision® Hardware or the MATLAB® Support Package for USB Webcams, set up a camera to acquire the real-time images to perform the analysis.

For more information on setting up the camera, see Device Connection (Image Acquisition Toolbox).

Target

Use the following commands to create a target to print for use in the exercise. The code generates a PostScript file that you can open and print double-sided, with the target on one side and the checkerboard for camera calibration on the other.

distance = 10; % meters
offset_mm = 0; % mm
print_target = true;
LaserTargetExample.make_target_airpistol10m(distance, ...
    offset_mm, print_target)

You can find pre-made targets in the +LaserTargetExample/targets_for_printing folder.

Setup

Set up the camera so that it faces the checkerboard side of the target. The shooter faces the target. You can keep the target and camera in fixed positions by mounting them on a board.

Algorithm

Calibrating the Image

Camera calibration is the process of estimating the parameters of the lens and the image sensor. You can use these parameters to measure objects captured by the camera. Use the Camera Calibrator (Computer Vision Toolbox) app to detect the checkerboard pattern on the back of the target and remove any lens distortion. Determine the threshold of ambient light on the target; you may need to adjust the camera settings or the lighting so that the image is not saturated. Use the pointsToWorld function to determine the world coordinates of the image points.

For more information, see What Is Camera Calibration? (Computer Vision Toolbox).
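As a sketch, the calibration workflow might look like the following. The variable names rawFrame, squareSizeMM, and shotCenterPixel are illustrative, and cameraParams is assumed to have been exported from the Camera Calibrator app; the example's actual code differs.

```matlab
% Undistort a frame using the parameters estimated during calibration
frame = undistortImage(rawFrame, cameraParams);

% Locate the printed checkerboard and relate it to its known geometry
[imagePoints, boardSize] = detectCheckerboardPoints(frame);
boardPoints = generateCheckerboardPoints(boardSize, squareSizeMM);
[R, t] = extrinsics(imagePoints, boardPoints, cameraParams);

% Map a pixel location (for example, a detected shot center) to world units
worldXY = pointsToWorld(cameraParams, R, t, shotCenterPixel);
```

Because the target and camera stay in fixed positions, the extrinsics R and t only need to be estimated once per session.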

Finding and Scoring the Shot

The algorithm scores shots by detecting the bright light of the laser pistol. While you shoot, the algorithm acquires frames and checks each one for a bright spot. If a frame contains a bright spot above the specified threshold, the algorithm processes that frame.

Use blob analysis to find the center of the bright spot, and translate the location from pixel coordinates to world coordinates. The blob analysis is done in fixed point because the image is stored as an 8-bit signed integer. After finding the center of the bright spot in world coordinates, calculate its distance from the bullseye at the origin and assign a point value to the shot.
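The detection and scoring steps can be sketched as follows. The threshold, ring width, and scoring formula here are illustrative assumptions, not the exact values used by the example, and cameraParams, R, and t are assumed to come from the calibration step.

```matlab
% Threshold the frame to isolate the bright laser spot
bw = frame > threshold;

% Find the centroid of the bright spot with blob analysis
blobAnalyzer = vision.BlobAnalysis('AreaOutputPort', false, ...
    'BoundingBoxOutputPort', false, 'CentroidOutputPort', true);
centroidPixels = blobAnalyzer(bw);

% Translate the centroid from pixel coordinates to world coordinates
worldXY = pointsToWorld(cameraParams, R, t, centroidPixels);

% Score by distance from the bullseye at the origin
distanceFromBullseye = hypot(worldXY(1), worldXY(2));
score = max(0, 10 - floor(distanceFromBullseye/ringWidthMM));
```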

Run the Example

Add the example code to the path.

addpath(fullfile(docroot,'toolbox','fixpoint',...
    'examples','laser_target_example'));

Start the simulation by executing the run script stored in the +LaserTargetExample folder.

LaserTargetExample.run
(1) gigecam
(2) webcam
(3) simulation
Enter the number of the source type:

The script prompts you to select the source to use for the simulation. Enter 3 to watch a simulation of a previously recorded session. Eight recorded sessions are available. Enter a number (1 through 8) to begin the simulation.

(1) saved_shots_20170627T201451
(2) saved_shots_20170627T201814
(3) saved_shots_20170702T153245
(4) saved_shots_20170702T153418
(5) saved_shots_20170702T162503
(6) saved_shots_20170702T162625
(7) saved_shots_20170702T162743
(8) saved_shots_20170702T162908
Enter number of file from list:

Entering 1 or 2 prompts you to set up a GigE Vision camera or a webcam, respectively. The example then prompts you to enter the distance from the shooter to the target (in meters) and the name of the shooter.

Use a Different Camera

To set up the example using your own camera, use the Camera Calibrator (Computer Vision Toolbox) app to detect the checkerboard on the back of the target, and remove distortion. Save the calibration variables in a MAT-file. The calibration variables for the GigE Vision camera and a webcam are saved in the following MAT-files.

  • +LaserTargetExample/gigecam_240x240_calibration_parameters.mat

  • +LaserTargetExample/webcam_LifeCam_480x480_camera_parameters.mat

Edit one of the following files, substituting the settings with values appropriate for your camera.

  • +LaserTargetExample/gigecam_setup.m

  • +LaserTargetExample/webcam_setup.m
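A camera setup script along these lines might contain settings like the following. This is only an illustrative sketch assuming the MATLAB Support Package for USB Webcams; the actual setup files differ, and the camera name, resolution, and MAT-file name are placeholders.

```matlab
% Connect to the webcam (the camera name is a placeholder)
cam = webcam('Microsoft LifeCam');
cam.Resolution = '640x480';   % choose a resolution your camera supports

% Load the calibration parameters saved from the Camera Calibrator app
load('my_camera_calibration_parameters.mat', 'cameraParams');
```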

Explore Data

Shot Database

Each time you shoot, the hits are recorded in a file named ShotDatabase.csv. You can load the data into a table using the readtable function and visualize it. For example, after shooting (which populates the ShotDatabase.csv file), the following code plots the center of a group of shots.

T = readtable('ShotDatabase.csv');
LaserTargetExample.make_target_airpistol10m;
LaserTargetExample.plot_shot_points(T.X, T.Y);
ax = gca;
% Draw crosshairs through the mean point of impact
line(mean(T.X)*[1,1], ax.YLim);
line(ax.XLim, mean(T.Y)*[1,1]);
grid on;

Simulation Recordings

Each time you shoot, the video frames in which shots were detected are stored in files in a folder named simulation_recordings. You can load these files and explore the raw data from the shots. You can also edit the algorithm.

The variable frames contains the first frame, which was used for calibration, plus ten frames for each detected shot. The first frame in each run of ten is the frame in which a shot was detected. You can see the movement of your hand in the subsequent frames. You can make a short animation of the data using the following code.

d = dir(fullfile('simulation_recordings','*.mat'));
record = load(fullfile(d(1).folder, d(1).name));
t = LaserTargetExample.SerialDateNumber_to_seconds(...
    record.times);
t = t-t(1);
figure
for k = 1:size(record.frames, 3)
    imshow(record.frames(:,:,k), ...
        'InitialMagnification','fit');
    title(sprintf('Time since beginning of round: %.3f seconds',...
        t(k)))
    drawnow
end
