
Adaptive Fuzzy Fitness Granulation (AFFG): A novel method for accelerating the convergence rate of evolutionary computing

 

Abstract: Fitness approximation in evolutionary computation

In many real-world optimization problems, including engineering problems, the optimization cost is dominated by the number of fitness function evaluations needed to obtain a good solution. To obtain efficient optimization algorithms, it is crucial to use the information gained during the optimization process. This leads to building a model of the fitness function in order to choose 'smart' search steps. The model replaces expensive fitness function evaluations with cheap model evaluations. AFFG is a novel granulation-based fitness approximation scheme proposed to approximate the fitness function, substituting the time-consuming large-scale problem analysis (L-SPA) performed by finite element analysis (FEA).

Introduction:

Optimization of L-SPA, such as sizing optimization of multi-storey 3-D frames or shape optimization of 2-D mechanical parts, is a computationally intensive task. In sizing optimization the aim is to minimize the weight of the structure under certain restrictions imposed by design codes, whereas in shape optimization the goal is to improve a given topology by minimizing an objective function subject to certain constraints. Automatic methods of design include stochastic combinatorial search techniques, such as genetic algorithms (GAs), that do not require gradient information. Such combinatorial optimization techniques are in general more robust and exhibit better global behavior than mathematical programming methods. They may, however, suffer from a slow rate of convergence towards the global optimum, see e.g. [1]. Unfortunately, repeated fitness function evaluation for such complex problems is often the most prohibitive and limiting segment of evolutionary algorithms. Finding the optimal solution to complex, high-dimensional, multimodal problems often requires very expensive fitness function evaluations. In real-world problems such as structural optimization, a single function evaluation may require several hours to several days of complete simulation. Typical optimization methods cannot deal with such problems. In this case, it may be necessary to forgo an exact evaluation and use an approximated fitness that is computationally efficient. The amalgamation of approximate models appears to be one of the most promising approaches to convincingly apply EAs to complex real-life problems, especially where: (i) the fitness computation time of a single solution is extremely high; (ii) a precise model for fitness computation is missing; (iii) the fitness function is uncertain or noisy as a result of simulation/measurement errors or of changes in the design variables and/or the environmental parameters.
To alleviate this problem, a variety of techniques for the construction of surrogates (often also referred to as metamodels or approximation models) for computationally expensive optimization problems have been investigated [2, 3].

A popular subclass of fitness function approximation methods is fitness inheritance, where fitness is simply inherited from the parents. A similar approach has also been suggested in which the fitness of a child individual is a weighted sum of the fitnesses of its parents. Coello and his colleagues incorporated the concept of fitness inheritance into a multi-objective particle swarm optimizer to reduce the number of fitness evaluations, and tested their approach on a well-known suite of multi-objective test problems. They generally reported lower computation cost, and the quality of their results improved in higher-dimensional spaces. However, it has also been shown that the performance of parents may not be a good predictor of that of their children for sufficiently complex and multi-objective problems, rendering fitness inheritance inappropriate under such circumstances.
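The weighted-sum form of fitness inheritance mentioned above can be sketched in a few lines. The function name and the default equal weighting below are illustrative assumptions, not the exact formulation of any cited author:

```python
def inherited_fitness(parent_fitnesses, weights=None):
    """Estimate a child's fitness as a weighted sum of its parents' fitnesses.

    By default all parents are weighted equally; in general the weights
    could reflect, e.g., how much genetic material each parent contributed.
    """
    if weights is None:
        weights = [1.0 / len(parent_fitnesses)] * len(parent_fitnesses)
    return sum(w * f for w, f in zip(weights, parent_fitnesses))

# A child of two parents with fitnesses 3.0 and 5.0 inherits their average.
child = inherited_fitness([3.0, 5.0])  # -> 4.0
```

The appeal is that this estimate costs essentially nothing; the drawback, as noted above, is that on complex landscapes the parents' fitness may tell us little about the child's.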

Other common approaches based on learning and interpolation from the known fitness values of a small population (e.g. low-order polynomials with least-squares estimation, artificial neural networks (ANNs), including multi-layer perceptrons and radial basis function networks, and support vector machine (SVM) regression models) have also been employed.
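As a minimal sketch of the low-order-polynomial, least-squares approach, the following fits a quadratic surrogate to one design variable by solving the normal equations directly. This is purely illustrative (real surrogates are multi-dimensional and would use a library routine such as numpy.polyfit or an RBF/SVM regressor):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the normal equations."""
    s = [sum(x ** k for x in xs) for k in range(5)]               # sums of x^0..x^4
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[4], s[3], s[2]],
         [s[3], s[2], s[1]],
         [s[2], s[1], s[0]]]
    rhs = [t[2], t[1], t[0]]
    # Gaussian elimination (well-spread sample points keep the pivots nonzero).
    for i in range(3):
        for j in range(i + 1, 3):
            factor = A[j][i] / A[i][i]
            A[j] = [aj - factor * ai for aj, ai in zip(A[j], A[i])]
            rhs[j] -= factor * rhs[i]
    beta = [0.0, 0.0, 0.0]
    for i in reversed(range(3)):
        beta[i] = (rhs[i] - sum(A[i][j] * beta[j] for j in range(i + 1, 3))) / A[i][i]
    return beta  # (a, b, c)

# Fitting exact samples of y = x^2 should recover a=1, b=0, c=0.
a, b, c = fit_quadratic([-2, -1, 0, 1, 2], [4, 1, 0, 1, 4])
```

Once fitted, the surrogate is evaluated instead of the expensive simulation wherever the model is trusted.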

Because of the limited number of training samples and the high dimensionality encountered in engineering design optimization, constructing a globally valid approximate model remains difficult. Evolutionary algorithms using such approximate fitness functions may converge to local optima. Therefore, it can be beneficial to selectively use the original fitness function together with the approximate model.

The basic idea behind the proposed AFFG framework is to:

        Avoid initial training.

        Use guided approximation to speed up the search process.

        Exploit the design variable space by means of Fuzzy Similarity Analysis (FSA), a type of fuzzy surrogate, to avoid premature convergence.

        Gradually set up a model independent of the initial training data, to compensate for the lack of sufficient training data and to reach a model with sufficient approximation accuracy.

        Dynamically update the approximate model with no or negligible overhead cost.

        Avoid using the model in design variable regions not represented in the training set.
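At the heart of the FSA idea is a membership function measuring how similar a new individual is to a stored granule. The Gaussian form and the averaging over dimensions below are assumptions of this sketch; the exact membership and aggregation used in [4, 5] may differ:

```python
import math

def fuzzy_similarity(x, center, radius):
    """Average Gaussian membership of individual x in a granule.

    Each design variable contributes a Gaussian membership centred on the
    corresponding component of the granule's centre; averaging over the
    dimensions gives an overall similarity in [0, 1].
    """
    memberships = [math.exp(-((xi - ci) ** 2) / (radius ** 2))
                   for xi, ci in zip(x, center)]
    return sum(memberships) / len(memberships)

# An individual coinciding with the granule centre has similarity 1.0,
# and similarity decays as the individual moves away.
same = fuzzy_similarity([1.0, 2.0], [1.0, 2.0], radius=0.5)   # -> 1.0
near = fuzzy_similarity([1.5, 2.0], [1.0, 2.0], radius=0.5)   # < 1.0
```

The radius controls how generously a granule's fitness is reused: a large radius accepts many individuals cheaply, a small radius forces exact evaluations.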

In the proposed algorithm, as explained in detail in [4, 5], an adaptive pool of solutions (fuzzy granules) with exactly computed fitness values is maintained. If a new individual is sufficiently similar to a known fuzzy granule, that granule’s fitness is used instead as a crude estimate. Otherwise, the individual is added to the pool as a new fuzzy granule. In this fashion, regardless of the competition’s outcome, the fitness of the new individual is always a physically realizable one, even if it is a “crude” estimate rather than an exact measurement. The pool size as well as each granule’s radius of influence is adaptive and will grow/shrink depending on the utility of each granule and the overall population fitness. To encourage fewer function evaluations, each granule’s radius of influence is initially large and is gradually shrunk in later stages of evolution. This encourages more exact fitness evaluations when competition is fierce among more similar and converging solutions. Furthermore, to prevent the pool from growing too large, granules that are not used are gradually eliminated. This fuzzy granulation scheme has been applied to engineering optimization problems, including detecting hidden information in a watermarked signal, in addition to several structural optimization problems.
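The pool-maintenance logic described above can be sketched as a single fitness lookup. The field names, the linear "life" counter, and the fixed shrink factor below are illustrative assumptions, not the exact update rules of [4, 5]:

```python
import math

def affg_fitness(x, granules, exact_fitness, threshold,
                 shrink=0.95, max_pool=50):
    """One fitness lookup under a simplified AFFG scheme.

    `granules` is a list of dicts {center, fitness, radius, life}.
    If x is similar enough to an existing granule, that granule's stored
    fitness is reused; otherwise x is evaluated exactly and becomes a
    new granule, while all radii shrink and unused granules decay.
    """
    def similarity(a, g):
        d2 = sum((ai - ci) ** 2 for ai, ci in zip(a, g["center"]))
        return math.exp(-d2 / g["radius"] ** 2)

    # Reuse the fitness of the most similar granule if it is similar enough.
    if granules:
        best = max(granules, key=lambda g: similarity(x, g))
        if similarity(x, best) >= threshold:
            best["life"] += 1          # reward a useful granule
            return best["fitness"]

    # Otherwise pay for an exact evaluation and store a new granule.
    f = exact_fitness(x)
    for g in granules:
        g["radius"] *= shrink          # encourage exact evaluations later on
        g["life"] -= 1                 # unused granules gradually die off
    granules[:] = [g for g in granules if g["life"] > 0][:max_pool]
    granules.append({"center": list(x), "fitness": f,
                     "radius": 1.0, "life": 5})
    return f

# Demo: the second evaluation of the same point reuses the granule,
# so the expensive function is called only once.
calls = {"n": 0}
def sphere(x):
    calls["n"] += 1
    return sum(v * v for v in x)

pool = []
f1 = affg_fitness([0.5, 0.5], pool, sphere, threshold=0.9)
f2 = affg_fitness([0.5, 0.5], pool, sphere, threshold=0.9)
# f1 == f2 == 0.5, with calls["n"] == 1
```

Note that either way the returned fitness is one that was exactly computed for some physically realizable solution, which is the property emphasized above.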

Figure 1: Flowchart of the Proposed AFFG Algorithm

Conclusion:

Evolutionary algorithms (EAs) are now well known and popular for solving many kinds of hard optimisation problems, such as designing optimal structural beams, factory production schedules, electrical antennae, and so on. A drawback is that they can often take too long to converge to good solutions, which limits their applicability in some industries. Meanwhile, recent developments in computer science point to ways in which information can be handled more conveniently and quickly. One such approach is 'granular computing'. AFFG is a novel granulation-based fitness approximation method designed to accelerate the convergence rate of EAs. The main idea is to automatically recast an optimisation problem into a smaller, more manageable form via the granular computing approach, and then to develop EAs that work with this new form, leading to a much faster optimisation paradigm.

References:

[1]      M. Olhofer, T. Arima, T. Sonoda, and B. Sendhoff. “Optimisation of a stator blade used in a transonic compressor cascade with evolution strategies”. In Adaptive Computation in Design and Manufacture, pages 45--54. Springer, 2000.

[2]      Ratle, A., “Accelerating the convergence of evolutionary algorithms by fitness landscape approximation”, Parallel Problem Solving from Nature-PPSN V, Springer-Verlag, pp. 87--96, 1998.

[3]      El-Beltagy M. A. and Keane A. J., “Evolutionary optimization for computationally expensive problems using Gaussian processes”, Proc. Int. Conf. on Artificial Intelligence (IC-AI'2001), CSREA Press, Las Vegas, pp. 708--714, 2001.

[4]      M. Davarynejad, "Fuzzy Fitness Granulation in Evolutionary Algorithms for complex optimization", M.Sc. Thesis. Ferdowsi University of Mashhad/Iran, Department of Electrical Engineering, 2007.

[5]      M. Davarynejad, M.-R. Akbarzadeh-T, N. Pariz, “A Novel General Framework for Evolutionary Optimization: Adaptive Fuzzy Fitness Granulation”, Proceedings of the 2007 IEEE International Conference on Evolutionary Computing, pp. 951--956, Singapore, September 25-28, 2007.

 

