The Feedback Artificial Tree Algorithm (FAT): Great Potential to Solve a Wide Range of Practical Optimization Problems

1. Introduction

The Feedback Artificial Tree (FAT) algorithm employs a tree-based recovery mechanism that maintains reachability information across three generations of the family tree [1]. The complete material-exchange process means that both the transfer of organic matter and the feedback of moisture are taken into account. With the moisture feedback mechanism, two new branch-update operators, the self-propagating operator and the dispersive propagation operator, are implemented [2]. Results show that FAT performs well in providing reliable service in ad hoc networks in terms of reliability, scalability, and delivery efficiency [3]. The computational results presented in this work demonstrate that FAT has great potential to solve a wide range of optimization problems efficiently.

2. The Feedback Mechanism

A feedback mechanism is a loop system in which the system responds to a perturbation. The response may act in the same direction (positive feedback) or in the opposite direction (negative feedback) [4]. In a biological sense, a feedback mechanism is a process, signal, or mechanism that tends to initiate or accelerate a process, or to inhibit or slow it down.

A familiar example of feedback is the high-pitched noise returned from a speaker when a microphone gets too close to it. FAT accordingly contains two coupled processes: one is the transfer of organic matter and the feedback of moisture, and the other is the absorption of moisture from the soil through the roots and its delivery to the thickest tree trunk.

3. Artificial tree algorithm

Organic matter is primarily produced in the leaves and then spreads from top to bottom through all branches. The transfer direction of organic matter is the same as the update direction of the branches, and the transfer of organic matter depends on the update of the branches; the branch update therefore drives the implementation of the AT algorithm [5]. A thicker branch denotes a better solution, and the thickest tree trunk is the best solution. The tiny branches connected to the leaves denote the initial branch population. Each branch has its own territory, which is the growth space of that branch; a thicker branch tends to have a larger territory.

Fig 1: Feedback Artificial Tree Algorithm

4. Inspiration of FAT

FAT is inspired by the transport of organic matter and by theories of branch updating. Organic matter refers to the wide range of carbon-based compounds found in natural environments. In FAT, besides the transfer of organic matter, a feedback mechanism for moisture is introduced, and the self-propagating and dispersive propagation operators are implemented. In nature, the material exchange of a tree, that is, the transmission of organic matter from leaves to roots and the transport of moisture from roots to leaves, ensures its normal growth. The delivery of moisture is the feedback of the transport of organic matter: branches that spread more organic matter receive more moisture in return [6]. The material-exchange process is therefore modeled as a feedback mechanism in FAT.

Fig 2: Inspiration of FAT

5. The basic theory of artificial tree algorithm

The artificial tree optimization process can be summarized as follows. Initially, the branch population is randomly assigned in the design space. These branches are then updated by the operators described below, and the branch population evolves from generation to generation; the best solution of each generation is recorded [7]. Finally, when the maximum number of function evaluations is reached, the global best solution is returned.
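The generate-update-record loop above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' reference implementation: the placeholder update step simply interpolates toward the best-so-far branch, standing in for the crossover, self-evolution, and random operators, and all names and defaults are assumptions.

```python
import numpy as np

def fat_sketch(objective, dim, pop_size=20, max_evals=2000, lb=-5.0, ub=5.0):
    """Illustrative outline of the FAT optimization loop (not the real operators)."""
    rng = np.random.default_rng(0)
    # Initially, the branch population is randomly assigned in the design space.
    branches = rng.uniform(lb, ub, size=(pop_size, dim))
    values = np.array([objective(b) for b in branches])
    evals = pop_size
    best = branches[values.argmin()].copy()
    best_val = float(values.min())
    while evals < max_evals:
        for i in range(pop_size):
            # Placeholder update: interpolate toward the best-so-far branch
            # (FAT itself chooses among several branch-update operators).
            trial = branches[i] + rng.uniform(0.0, 1.0) * (best - branches[i])
            trial = np.clip(trial, lb, ub)
            v = objective(trial)
            evals += 1
            if v < values[i]:  # greedy replacement, generation to generation
                branches[i], values[i] = trial, v
                if v < best_val:
                    best, best_val = trial.copy(), float(v)
    return best, best_val
```

For example, `fat_sketch(lambda x: float(np.sum(x**2)), dim=3)` drives the sphere function toward its minimum at the origin.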

5.1 Branch territory

Each branch has one territory, and the range of the territory depends on the thickness of the branch; a thicker branch tends to have a larger territory [8]. The branch territory is defined as follows:

B_a = (L + L * fit(m_a)) * 2                                                    (1)

B_a = 2L(1 + fit(m_a))                                                          (2)

where L is the territory parameter, defined in advance to calculate the branch territory. A value of L between 0 and 1 is recommended.
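The territory in Eq. (2) is a one-line computation; the helper below is a hypothetical sketch (the function name is mine, not from the source).

```python
def branch_territory(L, fit_value):
    """Branch territory, Eq. (2): B_a = 2 * L * (1 + fit(m_a)).

    L is the territory parameter, recommended between 0 and 1; a branch
    with a higher fitness value (a thicker branch) gets a larger territory.
    """
    return 2.0 * L * (1.0 + fit_value)
```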

m_a = (m_a1, m_a2, …, m_a,dim)                                                  (3)

where dim is the dimension of the design space. This work focuses on solving the minimization problem, which is described as follows:

min f(m_a),  m_a ∈ R^dim                                                        (4)

where f(·) is the objective value of a solution, and a better solution has a higher value of fit(·).
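The text does not reproduce the exact objective-to-fitness mapping, only the requirement that smaller objective values give higher fitness. The sketch below uses one mapping that is common in related swarm algorithms (e.g. artificial bee colony [7]); it is an assumption, not necessarily the formula used in FAT.

```python
def fitness(f_value):
    """Assumed objective-to-fitness mapping (not taken from the source text).

    Smaller objective values yield higher fitness, so a better solution of
    the minimization problem has a higher fit() value, as required above.
    """
    if f_value >= 0:
        return 1.0 / (1.0 + f_value)
    return 1.0 + abs(f_value)
```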

5.2 Crowd distance

The crowd distance is applied to evaluate the spacing between branches. The crowd distance between branch a and branch b is calculated as follows:

D_ab = ||m_a − m_b||_2                                                          (5)

where m_a and m_b are the positions of branch a and branch b, respectively, and D_ab is the Euclidean distance between them.
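Eq. (5) is the standard Euclidean (L2) norm of the difference between two branch positions; a minimal sketch (function name is mine):

```python
import numpy as np

def crowd_distance(m_a, m_b):
    """Crowd distance, Eq. (5): Euclidean distance between two branch positions."""
    return float(np.linalg.norm(np.asarray(m_a) - np.asarray(m_b)))
```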

5.3 Crossover operator

Fig 3: Ring Crossover

A branch is randomly created within half of the branch territory, and it is combined with the current branch by linear interpolation to generate a new branch [9]. The mathematical model of the crossover operator is presented as follows:

m_0b = m_ab + rand(−1, 1) * B_a / 2
     = m_ab + rand(−1, 1) * L * (1 + fit(m_a))                                  (6)

m_new = rand(0, 1) * m_0 + rand(0, 1) * m_a                                     (7)

where b = 1, 2, …, dim, and rand(−1, 1) is a random number between −1 and 1. Writing the two random weights in Eq. (7) as y1 and y2, each component of the new branch is:

m_new,b = (y1 + y2) * m_ab + y1 * rand(−1, 1) * L * (1 + fit(m_a))              (8)

where y1 = rand(0, 1) and y2 = rand(0, 1).
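The crossover operator of Eqs. (6)-(7) can be sketched directly: sample a point inside half of the branch territory, then interpolate it with the current branch using two random weights. Function name and random-generator handling are my own choices, not the authors' implementation.

```python
import numpy as np

def crossover(m_a, fit_a, L, rng):
    """Crossover operator sketch for branch position m_a (a NumPy vector)."""
    # Eq. (6): random point m_0 inside half of the branch territory of m_a,
    # using a fresh rand(-1, 1) for each component b.
    m_0 = m_a + rng.uniform(-1.0, 1.0, size=m_a.shape) * L * (1.0 + fit_a)
    # Eq. (7): linear interpolation of m_0 and m_a with random weights y1, y2.
    y1, y2 = rng.uniform(0.0, 1.0), rng.uniform(0.0, 1.0)
    return y1 * m_0 + y2 * m_a
```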

5.4 Self-evolution operator

The mathematical derivation of the self-evolution operator can be written as

m_new = m_a + rand(0, 1) * (m_best − m_a)                                       (9)

where m_best is the best branch position found so far.
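Eq. (9) moves a branch a random fraction of the way toward the best position found so far; a minimal sketch (function name is mine):

```python
import numpy as np

def self_evolution(m_a, m_best, rng):
    """Self-evolution operator, Eq. (9): step toward the best-so-far branch."""
    # A single rand(0, 1) scales the whole displacement (m_best - m_a).
    return m_a + rng.uniform(0.0, 1.0) * (m_best - m_a)
```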

5.5 Random operator

The new branch produced by the crossover operator or the self-evolution operator is compared with the original branch [10]. If the new branch is better than the original one, it replaces the old one. Otherwise, another new branch is generated and a new comparison with the original branch is carried out. This process repeats until a better branch is found or a limit, called the maximum search number, is reached. Different branches have different maximum search numbers, calculated as follows:

T_a(m_a) = C * fit(m_a) + C                                                     (10)

T_a(m_a) = C * (1 + fit(m_a))                                                   (11)

where C is a constant search parameter, and T_a(m_a), the maximum search number of branch position m_a, is proportional to the fitness value fit(m_a).

If fit(m_a) = 0,                                                                (12)

the maximum search number of m_a is

T_a(m_a) = C                                                                    (13)
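Eqs. (10)-(11) are one expression, and Eqs. (12)-(13) are its special case at zero fitness; a one-line sketch (function name is mine):

```python
def max_search_number(C, fit_value):
    """Maximum search number, Eqs. (10)-(11): T_a(m_a) = C * (1 + fit(m_a)).

    When fit(m_a) = 0 this reduces to T_a(m_a) = C, matching Eqs. (12)-(13).
    """
    return C * (1.0 + fit_value)
```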

6. Flowchart of FAT

Fig 4: Flowchart of FAT

7. Advantages of FAT

Fig 5: Advantages of FAT

8. Pseudo code of FAT

Fig 6: Pseudo code of FAT

9. Applications of FAT

FAT has been applied in various areas, listed below:

  • Structural optimization [10]
  • Load identification [11]
  • Image processing [12]
  • Decision trees [13]
  • Linear regression [14]

Fig 7: Applications of FAT


References

[1] Li Q, He Z, Li E (2020) The feedback artificial tree (FAT) algorithm. Soft Computing. doi: 10.1007/s00500-020-04758-2

[2] Chen K, Zhou F, Yin L et al. (2018) A hybrid particle swarm optimizer with sine cosine acceleration coefficients. Information Sciences 422:218-241. doi: 10.1016/j.ins.2017.09.015

[3] Li R, Emmerich M, Eggermont J et al. (2013) Mixed Integer Evolution Strategies for Parameter Optimization. Evolutionary Computation 21:29-64. doi: 10.1162/evco_a_00059

[4] Derrac J, García S, Molina D, Herrera F (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation 1:3-18. doi: 10.1016/j.swevo.2011.02.002

[5] Dorigo M, Caro G, Gambardella L (1999) Ant Algorithms for Discrete Optimization. Artificial Life 5:137-172. doi: 10.1162/106454699568728

[6] Duan L, Jiang H, Cheng A et al. (2018) Multi-objective reliability-based design optimization for the VRB-VCS FLB under front-impact collision. Structural and Multidisciplinary Optimization 59:1835-1851. doi: 10.1007/s00158-018-2142-9

[7] Gao W, Liu S (2012) A modified artificial bee colony algorithm. Computers & Operations Research 39:687-697. doi: 10.1016/j.cor.2011.06.007

[8] Ghambari S, Rahati A (2018) An improved artificial bee colony algorithm and its application to reliability optimization problems. Applied Soft Computing 62:736-767. doi: 10.1016/j.asoc.2017.10.040

[9] Hamzaçebi C (2008) Improving genetic algorithms’ performance by local search for continuous function optimization. Applied Mathematics and Computation 196:309-317. doi: 10.1016/j.amc.2007.05.068

[10] Holland J (1992) Genetic Algorithms. Scientific American 267:66-72. doi: 10.1038/scientificamerican0792-66

[11] Karaboga D, Basturk B (2008) On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing 8:687-697. doi: 10.1016/j.asoc.2007.05.007

[12] (1997) 1997 Index IEEE Transactions on Systems, Man, and Cybernetics Part B: Cybernetics Vol. 27. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 27:1-13. doi: 10.1109/tsmcb.1997.650066

[13] Li Q, He Z, Li E, Cheng A (2019) Improved impact responses of a honeycomb sandwich panel structure with internal resonators. Engineering Optimization 52:731-752. doi: 10.1080/0305215x.2019.1613389

[14] Lin Q, Hu B, Tang Y et al. (2017) A local search enhanced differential evolutionary algorithm for sparse recovery. Applied Soft Computing 57:144-163. doi: 10.1016/j.asoc.2017.03.034
