Feature extraction by sparse filtering
SparseFiltering uses sparse filtering to learn a transformation that maps input predictors to new predictors. Create a SparseFiltering object using the sparsefilt function.
FitInfo — Fitting history
This property is read-only.
Fitting history, returned as a structure with two fields:
Iteration — Iteration numbers from 0 through the final iteration.
Objective — Objective function value at each corresponding iteration. Iteration 0 corresponds to the initial values, before any fitting.
Data Types: struct
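For example, you can plot the objective value against the iteration number to check how the fit converged. The following is a minimal sketch, assuming a fitted SparseFiltering object named obj.
% Sketch: plot the optimization history stored in FitInfo
% (assumes a fitted SparseFiltering object named obj).
plot(obj.FitInfo.Iteration,obj.FitInfo.Objective,'-o')
xlabel('Iteration')
ylabel('Objective function value')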
InitialTransformWeights — Initial feature transformation weights
p-by-q matrix
This property is read-only.
Initial feature transformation weights, returned as a p-by-q matrix, where p is the number of predictors passed in X and q is the number of features that you want. These weights are the initial weights passed to the creation function. The data type is single when the training data X is single.
Data Types: single | double
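As a sketch, you can supply initial weights to warm-start a new fit. This example assumes hypothetical training data X and feature count q, and assumes that sparsefilt accepts the 'InitialTransformWeights' name-value pair.
% Sketch (hypothetical X and q): reuse the weights learned by a short fit
% as the starting point for a longer fit. The 'InitialTransformWeights'
% name-value pair is assumed here.
obj1 = sparsefilt(X,q,'IterationLimit',20);
obj2 = sparsefilt(X,q,'InitialTransformWeights',obj1.TransformWeights, ...
    'IterationLimit',200);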
ModelParameters — Parameters used for training model
This property is read-only.
Parameters used for training the model, returned as a structure. The structure contains a subset of the fields that correspond to the sparsefilt name-value pairs that were in effect during model creation:
IterationLimit
VerbosityLevel
Lambda
Standardize
GradientTolerance
StepTolerance
For details, see the sparsefilt name-value pairs in the documentation.
Data Types: struct
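For example, you can inspect the stored settings after fitting. The following is a minimal sketch, assuming a fitted SparseFiltering object named obj.
% Sketch: display the settings that were in effect at model creation
% (assumes a fitted SparseFiltering object named obj).
params = obj.ModelParameters;
fieldnames(params)   % e.g., IterationLimit, Lambda, Standardize, ...
disp(params)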
Mu — Predictor means when standardizing
p-by-1 vector
This property is read-only.
Predictor means when standardizing, returned as a p-by-1 vector. This property is nonempty when the Standardize name-value pair is true at model creation. The value is the vector of predictor means in the training data. The data type is single when the training data X is single.
Data Types: single | double
NumLearnedFeatures — Number of output features
This property is read-only.
Number of output features, returned as a positive integer. This value is the q argument passed to the creation function, which is the requested number of features to learn.
Data Types: double
NumPredictors — Number of input predictors
This property is read-only.
Number of input predictors, returned as a positive integer. This value is the number of predictors passed in X to the creation function.
Data Types: double
Sigma — Predictor standard deviations when standardizing
p-by-1 vector
This property is read-only.
Predictor standard deviations when standardizing, returned as a p-by-1 vector. This property is nonempty when the Standardize name-value pair is true at model creation. The value is the vector of predictor standard deviations in the training data. The data type is single when the training data X is single.
Data Types: single | double
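As a sketch of how Mu and Sigma relate to standardization, the following assumes hypothetical training data X, a feature count q, and new data Xnew with the same predictors; it also assumes that the centering and scaling shown here is equivalent to the standardization the object applies internally.
% Sketch (hypothetical X, Xnew, and q): fit with standardization and
% inspect the stored means and standard deviations.
obj = sparsefilt(X,q,'Standardize',true);
mu = obj.Mu;        % p-by-1 vector of training-data predictor means
sigma = obj.Sigma;  % p-by-1 vector of training-data standard deviations
% Center and scale new data the same way (assumed to match what the
% object does internally when transforming).
Xstd = (Xnew - mu')./sigma';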
TransformWeights — Feature transformation weights
p-by-q matrix
This property is read-only.
Feature transformation weights, returned as a p-by-q matrix, where p is the number of predictors passed in X and q is the number of features that you want. The data type is single when the training data X is single.
Data Types: single | double
transform — Transform predictors into extracted features
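As a sketch, the learned weights in TransformWeights map the p input predictors to q features through the transform function. The following assumes a fitted SparseFiltering object named obj and new data Xnew with the same predictors as the training data.
% Sketch (assumes a fitted object obj and data Xnew with p columns):
W = obj.TransformWeights;   % p-by-q matrix of learned weights
Y = transform(obj,Xnew);    % n-by-q matrix of extracted features
size(Y)                     % rows of Xnew by NumLearnedFeatures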
Create a SparseFiltering object by using the sparsefilt function.
Load the SampleImagePatches image patches.
data = load('SampleImagePatches');
size(data.X)
ans = 1×2
5000 363
There are 5,000 image patches, each containing 363 features.
Extract 100 features from the data.
rng default % For reproducibility
Q = 100;
obj = sparsefilt(data.X,Q,'IterationLimit',100)
Warning: Solver LBFGS was not able to converge to a solution.
obj = 
  SparseFiltering
            ModelParameters: [1x1 struct]
              NumPredictors: 363
         NumLearnedFeatures: 100
                         Mu: []
                      Sigma: []
                    FitInfo: [1x1 struct]
           TransformWeights: [363x100 double]
    InitialTransformWeights: []
sparsefilt issues a warning because it stopped due to reaching the iteration limit, instead of reaching a step-size limit or a gradient-size limit. You can still use the learned features in the returned object by calling the transform function.
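For example, as a sketch, you could then extract the learned features from the same patches. The expected output size is stated as an assumption, based on the 5,000 patches and the 100 requested features.
% Sketch: apply the learned transformation to the training patches.
Y = transform(obj,data.X);
size(Y)   % expected: 5000 100 (one column per learned feature)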