Tuning Fuzzy Inference Systems

Designing a complex fuzzy inference system (FIS) with a large number of inputs and membership functions (MFs) is a challenging problem due to the large number of MF parameters and rules. To design such a FIS, you can use a data-driven approach to learn rules and tune FIS parameters. To tune a fuzzy system, use the tunefis function and configure the tuning process using a tunefisOptions object.
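For example, the following sketch (not one of the documented examples) tunes all tunable settings of a simple Mamdani system; the training data loader and the option values are placeholders.

    % Minimal sketch of the tunefis workflow. loadMyTrainingData and the option
    % values are placeholders, not part of the toolbox.
    fisin = mamfis("NumInputs",2,"NumOutputs",1);     % initial FIS structure to tune
    [x,y] = loadMyTrainingData();                     % hypothetical training data

    % Collect the tunable settings and configure the tuning process.
    [inSet,outSet,ruleSet] = getTunableSettings(fisin);
    options = tunefisOptions("Method","particleswarm");
    options.MethodOptions.MaxIterations = 20;         % keep the sketch short

    % Tune input, output, and rule parameters against the training data.
    fisout = tunefis(fisin,[inSet;outSet;ruleSet],x,y,options);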

Using Fuzzy Logic Toolbox™ software, you can tune both type-1 and type-2 FISs as well as FIS trees. For examples, see Predict Chaotic Time Series Using Type-2 FIS and Tune FIS Tree for Gas Mileage Prediction.

During training, the optimization algorithm generates candidate FIS parameter sets. The fuzzy system is updated with each parameter set and then evaluated using the input training data.

If you have input/output training data, the cost for each solution is computed based on the difference between the output of the fuzzy system and the expected output values from the training data. For an example that uses this approach, see Tune Mamdani Fuzzy Inference System.

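For illustration, the following sketch computes such a cost outside the tuning process as a root-mean-square error. tunefis evaluates its cost internally, so this is only a conceptual sketch; candidateFIS, x, and y are placeholders for a candidate fuzzy system, the training inputs, and the expected outputs.

    % Conceptual sketch: score a candidate FIS against training data.
    yPred = evalfis(candidateFIS,x);        % evaluate the candidate FIS on the inputs
    cost  = sqrt(mean((y - yPred).^2));     % root-mean-square difference from y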

If you do not have input/output training data, you can specify a custom model and cost function for evaluating candidate FIS parameter sets. The cost measurement function sends an input to the fuzzy system and receives the evaluated output. The cost is based on the difference between the evaluated output and the output expected by the model. For more information and an example that uses this approach, see Tune Fuzzy Robot Obstacle Avoidance System Using Custom Cost Function.

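A sketch of this calling pattern, where modelCost is a hypothetical function that exercises the candidate FIS against your custom model and returns a scalar cost to minimize:

    % Sketch: tune without input/output data by supplying a custom cost function.
    [inSet,outSet,ruleSet] = getTunableSettings(fisin);
    options = tunefisOptions("Method","ga");
    fisout = tunefis(fisin,[inSet;outSet;ruleSet],@(fis)modelCost(fis),options);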

For more information on tuning fuzzy systems, see the examples referenced above.

Tuning Methods

The following table shows the tuning methods supported by the tunefis function. These tuning methods optimize the FIS parameters with respect to the tuning cost.

Method | Description | More Information
Genetic algorithm | Population-based global optimization method that searches randomly using mutation and crossover among population members | What Is the Genetic Algorithm? (Global Optimization Toolbox)
Particle swarm optimization | Population-based global optimization method in which population members step through a search region | What Is Particle Swarm Optimization? (Global Optimization Toolbox)
Pattern search | Direct-search local optimization method that searches a set of points near the current point to find a new optimum | What Is Direct Search? (Global Optimization Toolbox)
Simulated annealing | Local optimization method that simulates a heating and cooling process to find a new optimal point near the current point | What Is Simulated Annealing? (Global Optimization Toolbox)
Adaptive neuro-fuzzy inference | Backpropagation algorithm that tunes membership function parameters. Alternatively, you can use the anfis function. | Neuro-Adaptive Learning and ANFIS

The first four tuning methods require Global Optimization Toolbox software.
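You select a tuning method by name when creating the tuning options. The following sketch shows this selection; the identifier-to-method mapping in the comments assumes the standard tunefisOptions method names, with the simulated annealing value following the Global Optimization Toolbox function name.

    % Sketch: the Method option selects the tuning algorithm (assumed identifiers).
    %   "ga"             - genetic algorithm
    %   "particleswarm"  - particle swarm optimization
    %   "patternsearch"  - pattern search
    %   "simulannealbnd" - simulated annealing
    %   "anfis"          - adaptive neuro-fuzzy inference
    options = tunefisOptions("Method","patternsearch");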

Global optimization methods, such as genetic algorithms and particle swarm optimization, perform better for large parameter tuning ranges. These algorithms are useful for both the rule-learning and parameter-tuning stages of FIS optimization.

On the other hand, local search methods, such as pattern search and simulated annealing, perform better for small parameter ranges. If a FIS is generated from training data using genfis, or if a rule base has already been added to a FIS using training data, then these algorithms can converge faster than global optimization methods.
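For example, the following sketch generates an initial FIS from hypothetical training data x and y using grid partitioning and then refines its MF parameters with pattern search; the partition settings are placeholder values.

    % Sketch: start from a data-driven initial FIS, then refine it with a local method.
    genOpt = genfisOptions("GridPartition","NumMembershipFunctions",3);
    fisin  = genfis(x,y,genOpt);                        % initial FIS fitted to the data

    [inSet,outSet] = getTunableSettings(fisin);         % tune MF parameters only,
    tuneOpt = tunefisOptions("Method","patternsearch"); % keeping the generated rules
    fisout  = tunefis(fisin,[inSet;outSet],x,y,tuneOpt);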

Prevent Overfitting of Tuned System

Data overfitting is a common problem in FIS parameter optimization. When overfitting occurs, the tuned FIS produces optimized results for the training data set but performs poorly for a test data set. To overcome the data overfitting problem, a tuning process can stop early based on an unbiased evaluation of the model using a separate validation dataset.

When tuning using the tunefis function, you can prevent overfitting using k-fold cross validation. For more information and an example, see FIS Parameter Optimization with K-fold Cross Validation.
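A sketch of enabling this behavior with placeholder values; the validation settings shown here are assumptions about the relevant tunefisOptions properties.

    % Sketch: k-fold cross validation during tuning (placeholder values).
    options = tunefisOptions("Method","particleswarm","KFoldValue",4);
    options.ValidationTolerance = 0.02;   % tolerated increase in validation cost
    options.ValidationWindowSize = 2;     % validation windows checked before stopping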

Improve Tuning Results

To improve the performance of your tuned fuzzy systems, consider the following guidelines.

  • Use multiple phases in your tuning process. For example, first learn the rules of a fuzzy system, and then tune input/output MF parameters using the learned rule base. For a sketch of this workflow, see the example after this list.

  • Increase the number of iterations in both the rule-learning and parameter-tuning phases. Doing so increases the duration of the optimization process and can also increase validation error due to parameters that are overtuned to the training data. To avoid overfitting, train your system using k-fold cross validation.

  • Change the clustering technique used by genfis. Depending on the clustering technique, the generated rules can differ in their representation of the training data. Hence, the use of different clustering techniques can affect the performance of tunefis.

  • Change FIS properties. Try changing properties such as the type of FIS, number of inputs, number of input/output MFs, MF types, and number of rules. A Sugeno system has fewer output MF parameters (assuming constant MFs) and faster defuzzification. Therefore, for fuzzy systems with a large number of inputs, a Sugeno FIS generally converges faster than a Mamdani FIS. Small numbers of MFs and rules reduce the number of parameters to tune, producing a faster tuning process. Furthermore, a large number of rules might overfit the training data.

  • Modify tunable parameter settings for MFs and rules. For example, you can tune the support of a triangular MF without changing its peak location. Doing so reduces the number of tunable parameters and can produce a faster tuning process for specific applications. For rules, you can exclude zero MF indices by setting the AllowEmpty tunable setting to false, which reduces the overall number of rules during the learning phase.
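The following sketch illustrates the first and last guidelines, assuming hypothetical training data x and y and placeholder option values. Passing an empty parameter set during the learning phase and freezing an individual MF parameter through the Free setting are assumptions about the tuning interface, not documented calls from this page.

    % Phase 1: learn a rule base with a global method. An empty parameter set with
    % the "learning" optimization type focuses the run on finding rules.
    learnOpt = tunefisOptions("Method","ga","OptimizationType","learning", ...
        "NumMaxRules",10);                              % placeholder rule limit
    fisRules = tunefis(fisin,[],x,y,learnOpt);

    % Phase 2: tune MF parameters with the learned rule base, using a local method.
    [inSet,outSet] = getTunableSettings(fisRules);

    % Optionally restrict what is tunable; for example, freeze the peak (second
    % parameter) of the first triangular input MF so only its support is tuned.
    inSet(1).MembershipFunctions(1).Parameters.Free(2) = false;

    tuneOpt  = tunefisOptions("Method","patternsearch");
    fisTuned = tunefis(fisRules,[inSet;outSet],x,y,tuneOpt);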

To improve the tuning results for fuzzy trees, consider the following guidelines.

  • You can separately tune the parameters of each FIS in a FIS tree. You can then tune all the fuzzy systems together to generalize the parameter values. For a sketch of this workflow, see the example after this list.

  • Change FIS tree properties, such as the number of fuzzy systems and the connections between the fuzzy systems.

  • Use different rankings and groupings of the inputs to a FIS tree. For more information about creating FIS trees, see Fuzzy Trees.
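A sketch of the separate-then-together workflow, assuming two hypothetical component systems fis1 and fis2 (with unique Name properties), per-FIS training data x1/y1 and x2/y2, overall training data x and y, and a single hypothetical connection between the systems:

    % Sketch: tune each FIS in a tree separately, then tune the assembled tree.
    options = tunefisOptions("Method","particleswarm");

    [in1,out1,rule1] = getTunableSettings(fis1);
    fis1 = tunefis(fis1,[in1;out1;rule1],x1,y1,options);   % tune first FIS alone

    [in2,out2,rule2] = getTunableSettings(fis2);
    fis2 = tunefis(fis2,[in2;out2;rule2],x2,y2,options);   % tune second FIS alone

    % Connect the tuned systems into a FIS tree (hypothetical connection names)
    % and tune all parameters together to generalize the values.
    con  = ["fis1/output1" "fis2/input1"];
    fisT = fistree([fis1 fis2],con);
    [inT,outT,ruleT] = getTunableSettings(fisT);
    fisT = tunefis(fisT,[inT;outT;ruleT],x,y,options);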

See Also


Related Topics