Tune fuzzy inference system or tree of fuzzy inference systems
fisout = tunefis(fisin,paramset,custcostfcn) tunes the fuzzy inference system using a function handle to a custom cost function, custcostfcn.
fisout = tunefis(___,options) tunes the fuzzy inference system with additional options from the object options, created using tunefisOptions.
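For example, the following sketch combines these syntaxes. It assumes that fisin already exists and that myCustomCost is a hypothetical user-defined cost function; it is an illustration only, not part of the reference syntax.

% Sketch of the calling patterns above (myCustomCost is hypothetical).
[in,out,rule] = getTunableSettings(fisin);
paramset = [in;out;rule];
custcostfcn = @(fis)myCustomCost(fis);        % handle to a custom cost function
options = tunefisOptions("Method","ga");      % tuning options

fisout = tunefis(fisin,paramset,custcostfcn);          % custom cost, default options
fisout = tunefis(fisin,paramset,custcostfcn,options);  % custom cost with options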
Create the initial fuzzy inference system using genfis.
x = (0:0.1:10)';
y = sin(2*x)./exp(x/5);
options = genfisOptions('GridPartition');
options.NumMembershipFunctions = 5;
fisin = genfis(x,y,options);
Obtain the tunable settings of inputs, outputs, and rules of the fuzzy inference system.
[in,out,rule] = getTunableSettings(fisin);
Tune the membership function parameters with "anfis".
fisout = tunefis(fisin,[in;out],x,y,tunefisOptions("Method","anfis"));
ANFIS info:
   Number of nodes: 24
   Number of linear parameters: 10
   Number of nonlinear parameters: 15
   Total number of parameters: 25
   Number of training data pairs: 101
   Number of checking data pairs: 0
   Number of fuzzy rules: 5

Start training ANFIS ...

1          0.0694086
2          0.0680259
3          0.066663
4          0.0653198
Step size increases to 0.011000 after epoch 5.
5          0.0639961
6          0.0626917
7          0.0612787
8          0.0598881
Step size increases to 0.012100 after epoch 9.
9          0.0585193
10         0.0571712

Designated epoch number reached. ANFIS training completed at epoch 10.

Minimal training RMSE = 0.0571712
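As an optional check, not part of the original example, you can evaluate the tuned FIS at the training inputs and compare the result with the training output, similar to the plot in the k-fold example below.

yTuned = evalfis(fisout,x);         % evaluate the tuned FIS at the training inputs
plot(x,y,x,yTuned)
legend("Training output","Tuned FIS output")
xlabel("x")
ylabel("y")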
Create the initial fuzzy inference system using genfis.
x = (0:0.1:10)';
y = sin(2*x)./exp(x/5);
options = genfisOptions('GridPartition');
options.NumMembershipFunctions = 5;
fisin = genfis(x,y,options);
Obtain the tunable settings of inputs, outputs, and rules of the fuzzy inference system.
[in,out,rule] = getTunableSettings(fisin);
Tune only the rule parameters. In this example, the pattern search method is used.
fisout = tunefis(fisin,rule,x,y,tunefisOptions("Method","patternsearch"));
Iter     Func-count          f(x)      MeshSize     Method
    0           1         0.346649            1
    1          19         0.346649          0.5      Refine Mesh
    2          37         0.346649         0.25      Refine Mesh
    3          55         0.346649        0.125      Refine Mesh
    4          73         0.346649       0.0625      Refine Mesh
    5          91         0.346649      0.03125      Refine Mesh
    6         109         0.346649      0.01562      Refine Mesh
    7         127         0.346649     0.007812      Refine Mesh
    8         145         0.346649     0.003906      Refine Mesh
    9         163         0.346649     0.001953      Refine Mesh
   10         181         0.346649    0.0009766      Refine Mesh
   11         199         0.346649    0.0004883      Refine Mesh
   12         217         0.346649    0.0002441      Refine Mesh
   13         235         0.346649    0.0001221      Refine Mesh
   14         253         0.346649    6.104e-05      Refine Mesh
   15         271         0.346649    3.052e-05      Refine Mesh
   16         289         0.346649    1.526e-05      Refine Mesh
   17         307         0.346649    7.629e-06      Refine Mesh
   18         325         0.346649    3.815e-06      Refine Mesh
   19         343         0.346649    1.907e-06      Refine Mesh
   20         361         0.346649    9.537e-07      Refine Mesh
Optimization terminated: mesh size less than options.MeshTolerance.
Create the initial fuzzy inference system using genfis.
x = (0:0.1:10)';
y = sin(2*x)./exp(x/5);
options = genfisOptions('GridPartition');
options.NumMembershipFunctions = 5;
fisin = genfis(x,y,options);
Obtain the tunable settings of inputs, outputs, and rules of the fuzzy inference system.
[in,out,rule] = getTunableSettings(fisin);
You can customize the tunable parameter settings using setTunable or dot notation.
Do not tune input 1.
in(1) = setTunable(in(1),false);
For output 1:
do not tune membership functions 1 and 2,
set the parameters of membership function 3 as non-tunable,
set the minimum parameter range of membership function 4 to -2,
and set the maximum parameter range of membership function 5 to 2.
out(1).MembershipFunctions(1:2) = setTunable(out(1).MembershipFunctions(1:2),false);
out(1).MembershipFunctions(3).Parameters.Free = false;
out(1).MembershipFunctions(4).Parameters.Minimum = -2;
out(1).MembershipFunctions(5).Parameters.Maximum = 2;
For the rule settings,
do not tune rules 1 and 2,
set the antecedent of rule 3 to non-tunable,
allow NOT logic in the antecedent of rule 4,
and do not ignore any outputs in rule 3.
rule(1:2) = setTunable(rule(1:2),false);
rule(3).Antecedent.Free = false;
rule(4).Antecedent.AllowNot = true;
rule(3).Consequent.AllowEmpty = false;
Set the maximum number of iterations to 20 and tune the fuzzy inference system.
opt = tunefisOptions("Method","particleswarm");
opt.MethodOptions.MaxIterations = 20;
fisout = tunefis(fisin,[in;out;rule],x,y,opt);
                                 Best            Mean     Stall
Iteration     f-count            f(x)            f(x)    Iterations
    0              90          0.3265           1.857        0
    1             180          0.3265           4.172        0
    2             270          0.3265           3.065        1
    3             360          0.3265           3.839        2
    4             450          0.3265           3.386        3
    5             540          0.3265           3.249        4
    6             630          0.3265           3.311        5
    7             720          0.3265           2.901        6
    8             810          0.3265           2.868        7
    9             900          0.3181            2.71        0
   10             990          0.3181           2.068        1
   11            1080          0.3181           2.692        2
   12            1170          0.3165           2.146        0
   13            1260          0.3165           1.869        1
   14            1350          0.3165           2.364        2
   15            1440          0.3165            2.07        0
   16            1530          0.3164           1.678        0
   17            1620          0.2978           1.592        0
   18            1710          0.2977           1.847        0
   19            1800          0.2954           1.666        0
   20            1890          0.2947           1.608        0
Optimization ended: number of iterations exceeded OPTIONS.MaxIterations.
To prevent overfitting of your tuned FIS to your training data, use k-fold cross validation.
Load training data. This training data set has one input and one output.
load fuzex1trnData.dat
Create a fuzzy inference system for the training data.
opt = genfisOptions('GridPartition');
opt.NumMembershipFunctions = 4;
opt.InputMembershipFunctionType = "gaussmf";
inputData = fuzex1trnData(:,1);
outputData = fuzex1trnData(:,2);
fis = genfis(inputData,outputData,opt);
For reproducibility, set the random number generator seed.
rng('default')
Configure the options for tuning the FIS. Use the default tuning method with a maximum of 30
iterations.
tuningOpt = tunefisOptions;
tuningOpt.MethodOptions.MaxGenerations = 30;
Configure the following options for using k-fold cross validation.
Use a k-fold value of 3.
Compute the moving average of the validation cost using a window of length 2.
Stop each training-validation iteration when the average cost is 5% greater than the current minimum cost.
tuningOpt.KFoldValue = 3;
tuningOpt.ValidationWindowSize = 2;
tuningOpt.ValidationTolerance = 0.05;
Obtain the settings for tuning the membership function parameters of the FIS.
[in,out] = getTunableSettings(fis);
Tune the FIS.
[outputFIS,info] = tunefis(fis,[in;out],inputData,outputData,tuningOpt);
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400          0.2421         0.5109        0
    2              590          0.2292         0.4688        0
    3              780          0.2292         0.4443        1
    4              970          0.2256         0.4145        0
    5             1160          0.2165         0.3957        0
    6             1350          0.2165         0.3835        1
    7             1540          0.2077         0.3548        0
    8             1730          0.2077         0.3435        1
    9             1920          0.2012         0.3414        0
   10             2110          0.1857          0.316        0
Optimization terminated: validation tolerance exceeded.

Cross validation iteration 1: Minimum validation cost 0.294718 found at training cost 0.207704

                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400          0.2089         0.3924        0
    2              590          0.2059         0.3655        0
Optimization terminated: validation tolerance exceeded.

Cross validation iteration 2: Minimum validation cost 0.306682 found at training cost 0.220498

                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400          0.2489         0.3936        0
    2              590          0.2438         0.3837        0
    3              780          0.2438         0.3779        1
    4              970          0.2067         0.3476        0
Optimization terminated: validation tolerance exceeded.

Cross validation iteration 3: Minimum validation cost 0.220104 found at training cost 0.255407
Evaluate the FIS for each of the training input values.
outputTuned = evalfis(outputFIS,inputData);
Plot the output of the tuned FIS along with the expected training output.
plot([outputData,outputTuned])
legend("Expected Output","Tuned Output","Location","southeast")
xlabel("Data Index")
ylabel("Output value")
fisin — Fuzzy inference system
mamfis object | sugfis object | mamfistype2 object | sugfistype2 object | fistree object

Fuzzy inference system, specified as one of the following:
mamfis object — Mamdani fuzzy inference system
sugfis object — Sugeno fuzzy inference system
mamfistype2 object — Type-2 Mamdani fuzzy inference system
sugfistype2 object — Type-2 Sugeno fuzzy inference system
fistree object — Tree of interconnected fuzzy inference systems
paramset — Tunable parameter settings

Tunable parameter settings, specified as an array of input, output, and rule parameter settings in the input FIS. To obtain these parameter settings, use the getTunableSettings function with the input fisin.
paramset can be the input, output, or rule parameter settings, or any combination of these settings.
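For example, after calling getTunableSettings, any of the following combinations is a valid paramset. This sketch, based on the examples above, assumes that fisin, training data x and y, and a tunefisOptions object named options already exist.

[in,out,rule] = getTunableSettings(fisin);
fisout = tunefis(fisin,in,x,y,options);             % tune input membership functions only
fisout = tunefis(fisin,[in;out],x,y,options);       % tune input and output membership functions
fisout = tunefis(fisin,rule,x,y,options);           % tune rules only
fisout = tunefis(fisin,[in;out;rule],x,y,options);  % tune all parameters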
in — Input training data

Input training data, specified as an m-by-n matrix, where m is the total number of input datasets and n is the number of inputs. The number of input and output datasets must be the same.
out — Output training data

Output training data, specified as an m-by-q matrix, where m is the total number of output datasets and q is the number of outputs. The number of input and output datasets must be the same.
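For illustration only (not from the original examples), a FIS with two inputs and one output trained on 100 samples would use data with these dimensions:

numSamples = 100;
in  = rand(numSamples,2);   % 100-by-2 input training data (n = 2 inputs)
out = rand(numSamples,1);   % 100-by-1 output training data (q = 1 output)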
options — FIS tuning options
tunefisOptions option set

FIS tuning options, specified as a tunefisOptions object. You can specify the tuning algorithm method and other options for the tuning process.
custcostfcn — Custom cost function

Custom cost function, specified as a function handle. The custom cost function evaluates fisout to calculate its cost with respect to an evaluation criterion, such as input/output data. custcostfcn must accept at least one input argument for fisout and return a cost value. You can provide an anonymous function handle to attach additional data for cost calculation, as described in this example:
function fitness = custcost(fis,trainingData)
    ...
end
custcostfcn = @(fis)custcost(fis,trainingData);
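For instance, a custom cost function could compute the root-mean-squared error between the FIS output and reference training data. The following is a sketch only; the function name, variable names, and RMSE criterion are illustrative and not part of the original documentation.

% Attach training data x,y to a hypothetical RMSE-based cost function.
custcostfcn = @(fis)custcost(fis,x,y);
fisout = tunefis(fisin,paramset,custcostfcn,tunefisOptions("Method","ga"));

% Local function (place at the end of the script or in its own file).
function fitness = custcost(fis,trainingInput,trainingOutput)
    actualOutput = evalfis(fis,trainingInput);                 % evaluate candidate FIS
    fitness = sqrt(mean((trainingOutput - actualOutput).^2));  % RMSE cost
end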
fisout — Tuned fuzzy inference system
mamfis object | sugfis object | mamfistype2 object | sugfistype2 object | fistree object

Tuned fuzzy inference system, returned as one of the following:
mamfis object — Mamdani fuzzy inference system
sugfis object — Sugeno fuzzy inference system
mamfistype2 object — Type-2 Mamdani fuzzy inference system
sugfistype2 object — Type-2 Sugeno fuzzy inference system
fistree object — Tree of interconnected fuzzy inference systems
fisout is the same type of FIS as fisin.
summary — Tuning algorithm summary

Tuning algorithm summary, returned as a structure containing the following fields:
tuningOutputs — Algorithm-specific tuning information
totalFunctionCount — Total number of evaluations of the optimization cost function
totalRuntime — Total execution time of the tuning process, in seconds
errorMessage — Any error message generated when updating fisin with new parameter values
tuningOutputs is a structure that contains tuning information for the algorithm specified in options. The fields in tuningOutputs depend on the specified tuning algorithm. When using k-fold cross validation, tuningOutputs is an array of k structures, each containing the tuning information for one training-validation iteration.
When using k-fold cross validation, totalFunctionCount and totalRuntime contain the total number of cost function evaluations and the total run time across all k training-validation iterations.
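For example, you can request the summary as a second output argument and inspect its fields. This sketch reuses the variable names from the syntax descriptions above.

[fisout,summary] = tunefis(fisin,paramset,x,y,options);
summary.tuningOutputs        % algorithm-specific tuning information
summary.totalFunctionCount   % total number of cost function evaluations
summary.totalRuntime         % total tuning time, in seconds
summary.errorMessage         % any error message from updating fisin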