**1. Introduction **

The Genetic Algorithm (GA), developed in 1975 by Prof. John Holland, was inspired by Darwin's theory of evolution, which holds that the survival of an organism is governed by the rule "survival of the fittest". Darwin also observed that a species maintains itself through the processes of reproduction, crossover and mutation. A chromosome is composed of genes, whose values may be numerical, binary, symbolic or character-based, depending on the problem to be solved. The number of chromosomes that undergo crossover and mutation is controlled by the crossover rate and the mutation rate. A chromosome with a higher fitness value has a greater probability of being selected again in the next generation. After several generations, the chromosome values converge to a certain value, which is the best solution for the problem [1].

**2. Genetic Algorithm**

GA is a heuristic search method used in artificial intelligence and computing. It is used for finding optimized solutions to search problems based on the theory of natural selection and evolutionary biology. Genetic algorithms are excellent for searching through large and complex data sets. They are considered capable of finding reasonable solutions to complex problems, as they can solve both constrained and unconstrained optimization problems [2].

**2.1. Background **

Fig 1 shows the general process of GA. The algorithm starts with a set of solutions (represented by chromosomes) called a population. Solutions from one population are taken and used to form a new population, in the hope that the new population will be better than the old one. Solutions selected to form new solutions (offspring) are chosen according to their fitness: the more suitable they are, the more chances they have to reproduce.

**2.2. Steps of the Genetic Algorithm **

- Initialization
- Selection
- Crossover
- Mutation
- Termination
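The five steps above can be sketched as a minimal generational GA loop in Python. The fitness function, operators and parameters below are illustrative assumptions, not part of the original text:

```python
import random

def run_ga(fitness, init_individual, crossover, mutate,
           pop_size=6, generations=50, mutation_rate=0.1):
    """Minimal generational GA loop following the five steps above."""
    # Initialization: random starting population
    population = [init_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in population]
        # Selection: fitness-proportionate (roulette wheel)
        parents = random.choices(population, weights=scores, k=pop_size)
        # Crossover: pair up parents to produce offspring
        offspring = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[(i + 1) % pop_size]
            offspring.extend(crossover(a, b))
        # Mutation: perturb offspring genes
        offspring = [mutate(ind, mutation_rate) for ind in offspring[:pop_size]]
        population = offspring
    # Termination: fixed number of generations
    return max(population, key=fitness)

# Example usage (assumed toy problem): maximize the number of 1-bits
random.seed(42)
best = run_ga(
    fitness=lambda ind: sum(ind) + 1,  # +1 keeps all selection weights positive
    init_individual=lambda: [random.randint(0, 1) for _ in range(10)],
    crossover=lambda a, b: (a[:5] + b[5:], b[:5] + a[5:]),
    mutate=lambda ind, r: [1 - g if random.random() < r else g for g in ind],
)
```

The termination condition here is the simplest one (a fixed generation count); any of the conditions listed in section 2.3.5 could be substituted.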

**2.3. GA Process**

Fig 2 shows the process of GA. A population of individuals (elephants, in the figure) is maintained within the search space. Each individual represents a solution in the search space for the given problem and is coded as a finite-length vector of components. These variable components are analogous to genes; thus a chromosome is composed of several genes (variable components).

A fitness score is assigned to each individual and indicates that individual's ability to "compete". Individuals with optimal (or near-optimal) fitness scores are sought. The GA maintains a population of n individuals along with their fitness scores, and individuals with better fitness scores are given more chance to reproduce than others: they are selected to mate and produce better offspring by combining the chromosomes of both parents. According to Darwin's theory of evolution, the best individuals should survive and create new offspring.

Since the population size is static, room has to be made for new arrivals. Some individuals die and are replaced by newcomers, eventually creating a new generation once all the mating opportunities of the old population are exhausted. The hope is that over successive generations better solutions arrive while the least fit die out. Each new generation has, on average, more "good genes" than the previous generation, and thus better partial solutions. Once the offspring produced show no significant difference from the offspring of previous populations, the population has converged, and the algorithm is said to have converged to a set of solutions for the problem.

In summary: from an initial population of solutions, individuals are ranked according to their fitness, the worst are eliminated, and the best are used to produce new solutions; the process then repeats with the new solutions.

**2.3.1. Initialization**

Initially, many individual solutions are (usually) randomly generated to form an initial population. The population size depends on the nature of the problem, but typically contains several hundred or thousands of candidate solutions. Fig 3 shows the initialization step.
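A minimal sketch of random initialization in Python, assuming integer genes drawn from an illustrative range (the bounds and sizes are assumptions, not from the text):

```python
import random

def init_population(pop_size, genes_per_chromosome, low=0, high=30):
    """Randomly generate the initial population; each chromosome is a
    list of integer genes drawn uniformly from [low, high]."""
    return [[random.randint(low, high) for _ in range(genes_per_chromosome)]
            for _ in range(pop_size)]

# Six 2-gene chromosomes, matching the shape used in the numerical example later
population = init_population(pop_size=6, genes_per_chromosome=2)
```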

**2.3.2. Selection**

In each successive generation, a proportion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected. Certain selection methods rate the fitness of each solution and preferentially select the best solutions. Fig 4 shows the selection step.
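One common fitness-based scheme is roulette-wheel (fitness-proportionate) selection, sketched below; the example individuals and weights are illustrative assumptions:

```python
import random

def roulette_select(population, fitnesses, k):
    """Fitness-proportionate selection: each individual's chance of being
    picked equals its fitness divided by the total fitness."""
    total = sum(fitnesses)
    probs = [f / total for f in fitnesses]
    return random.choices(population, weights=probs, k=k)

# A much fitter individual should dominate the draws
random.seed(1)
picks = roulette_select(["strong", "weak"], [0.99, 0.01], k=100)
```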

**2.3.3. Crossover**

The crossover operator is analogous to reproduction and biological crossover. In this step, more than one parent is selected and one or more offspring are produced using the genetic material of the parents. Fig 5 shows the crossover step.
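A sketch of one-point crossover (the variant used in the numerical example later in this document): both parents are cut at the same random point and their tails are swapped. With two-gene chromosomes there is only one possible cut point, so the result below is deterministic.

```python
import random

def one_point_crossover(parent_a, parent_b):
    """Cut both parents at the same random point and swap tails,
    producing two offspring."""
    point = random.randint(1, len(parent_a) - 1)  # never cut at the ends
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

# With 2-gene parents the cut point is always 1
child_1, child_2 = one_point_crossover([12, 5], [4, 22])
print(child_1)  # → [12, 22]
```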

**2.3.4. Mutation**

Mutation is a genetic operator used to maintain genetic diversity from one generation of a population of GA chromosomes to the next. It is analogous to biological mutation. Mutation alters one or more gene values in a chromosome from its initial state. Fig 6 shows the mutation.
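A minimal mutation operator in Python: each gene is independently replaced with a fresh random value with probability `mutation_rate`. The gene range is an illustrative assumption.

```python
import random

def mutate(chromosome, mutation_rate, low=0, high=30):
    """With probability mutation_rate, replace each gene with a new
    random value; otherwise leave it unchanged."""
    return [random.randint(low, high) if random.random() < mutation_rate else gene
            for gene in chromosome]

unchanged = mutate([12, 5], mutation_rate=0.0)  # rate 0: no gene changes
changed = mutate([12, 5], mutation_rate=1.0)    # rate 1: every gene resampled
```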

**2.3.5. Termination**

This generational process is repeated until a termination condition has been reached. Common terminating conditions are:

- A solution is found which satisfies the objective(s)

- Fixed number of generations
- Maximum allocated budget/resource is reached
- Maximum time specified is reached
- The highest ranking solution’s fitness is reached
- Successive iterations no longer produce better results
- Manual inspection
- Combination of the above

**2.4. Numerical Example**

Here is an example of applying a genetic algorithm to a combinatorial problem. Suppose we have the equation f(x) = 2x + y; GA will be used to find values of x and y that satisfy it. First we formulate the objective function: for this problem the objective is to minimize the value of f(x), where f(x) = 2x + y. Since there are two variables in the equation, namely x and y, we can compose the chromosome as follows:

**Step 1: Initialization **

There are 6 chromosomes used in this process:

Chromosome (1) = [x, y] = [12, 05]

Chromosome (2) = [x, y] = [04, 22]

Chromosome (3) = [x, y] = [15, 06]

Chromosome (4) = [x, y] = [20, 04]

Chromosome (5) = [x, y] = [05, 13]

Chromosome (6) = [x, y] = [17, 04]

**Step 2: Evaluation**

We compute the objective function value for each chromosome produced in initialization step:

f(x_{1}) = (2*12) + 5 = 24 + 5 = 29

f(x_{2}) = (2*4) + 22 = 8 + 22 = 30

f(x_{3}) = (2*15) + 6 = 30 + 6 = 36

f(x_{4}) = (2*20) + 4 = 40 + 4 = 44

f(x_{5}) = (2*5) + 13 = 10 + 13 = 23

f(x_{6}) = (2*17) + 4 = 34 + 4 = 38
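The evaluation step can be checked with a few lines of Python; the chromosome values come straight from the initialization step above:

```python
# Chromosomes [x, y] from the initialization step
chromosomes = [[12, 5], [4, 22], [15, 6], [20, 4], [5, 13], [17, 4]]

def objective(chromosome):
    x, y = chromosome
    return 2 * x + y  # f(x) = 2x + y

values = [objective(c) for c in chromosomes]
print(values)  # → [29, 30, 36, 44, 23, 38]
```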

**Step 3: Selection**

The fittest chromosomes have a higher probability of being selected for the next generation. To compute the selection probability we must first compute the fitness of each chromosome. To avoid a divide-by-zero problem, 1 is added to the value of f(x):

Fitness(1) = 1 / (1 + f(x_{1})) = 1/30 = 0.033

Fitness(2) = 1 / (1 + f(x_{2})) = 1/31 = 0.032

Fitness(3) = 1 / (1 + f(x_{3})) = 1/37 = 0.027

Fitness(4) = 1 / (1 + f(x_{4})) = 1/45 = 0.022

Fitness(5) = 1 / (1 + f(x_{5})) = 1/24 = 0.041

Fitness(6) = 1 / (1 + f(x_{6})) = 1/39 = 0.026

Total = 0.033 + 0.032 + 0.027 + 0.022 + 0.041 + 0.026 = 0.181

The selection probability for each chromosome is given by: P[i] = Fitness(i) / Total

P(1) = 0.033 / 0.181 = 0.182

P(2) = 0.032 / 0.181 = 0.176

P(3) = 0.027 / 0.181 = 0.149

P(4) = 0.022 / 0.181 = 0.121

P(5) = 0.041 / 0.181 = 0.226

P(6) = 0.026 / 0.181 = 0.143
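The fitness and probability calculations above can be reproduced in Python. Note that the code works at full floating-point precision, so the total comes out as about 0.182 rather than the 0.181 obtained from the hand-truncated three-decimal values above:

```python
chromosomes = [[12, 5], [4, 22], [15, 6], [20, 4], [5, 13], [17, 4]]
values = [2 * x + y for x, y in chromosomes]  # objective values
fitnesses = [1 / (1 + v) for v in values]     # 1/(1 + f) avoids division by zero
total = sum(fitnesses)
probabilities = [f / total for f in fitnesses]
```

Chromosome 5 has the smallest objective value (23), so it gets the highest selection probability, matching P(5) being the largest value above.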

**Step 4: Crossover**

After chromosome selection, the next process is determining the position of the crossover point.

**Chromosome (1) × Chromosome (2)**

**Chromosome (2) × Chromosome (5)**

**Chromosome (5) × Chromosome (1)**

Then, for the first, second and third crossovers, the parents' genes are cut at the crossover point and recombined, e.g.

**Chromosome (1) = Chromosome (1) × Chromosome (2)**

= [12, 05] × [04, 22] = [12, 22]

**Chromosome (2) = Chromosome (2) × Chromosome (5)**

= [04, 22] × [05, 13] = [04, 13]

**Chromosome (5) = Chromosome (5) × Chromosome (1)**

= [05, 13] × [12, 05] = [05, 05]

The population after the crossover process is:

Chromosome (1) = [12, 22]

Chromosome (2) = [04, 13]

Chromosome (3) = [15, 06]

Chromosome (4) = [20, 04]

Chromosome (5) = [05, 05]

Chromosome (6) = [17, 04]

**Step 5: Mutation**

Total genes = Number of genes per chromosome × Population size = 2 × 6 = 12

The gene positions are numbered 1 to 12.

Mutation rate = 0.9

Number of mutations = 0.9 × 12 = 10.8 ≈ 11
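The mutation bookkeeping above can be written out in Python; the step of picking which gene positions to mutate is an assumed continuation of the example, since the original text stops at the mutation count:

```python
import random

genes_per_chromosome = 2
population_size = 6
total_genes = genes_per_chromosome * population_size      # 2 * 6 = 12
mutation_rate = 0.9                                       # rate used in the example
number_of_mutations = round(mutation_rate * total_genes)  # 10.8 -> 11

# Choose which gene positions (numbered 1..12) will be mutated
positions = random.sample(range(1, total_genes + 1), number_of_mutations)
```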

**2.5. Applications of GA**

Fig 7 shows the applications of GA and they are listed as follows:

- Neural networks [3, 4]
- Multimodal Optimization [5]
- Multiple criteria production scheduling [6]
- Rare event analysis [7]
- Stochastic Optimization [8]
- Wireless sensor/ad-hoc networks [9]
- Finding hardware bugs [10]
- Power electronics design [11]
- Image processing [12]
- Global temperature changes [13]
- Ground water monitoring network [14]
- Real options valuation [15]
- Airlines revenue management [16]
- Vehicle routing problem [17]
- Computer automated design [18]
- Protein folding [19]

**2.6. Advantages of GA**

- Does not require any derivative information which may not be available for many real-world problems.
- Faster and more efficient than traditional methods.
- Optimizes both continuous and discrete functions, as well as multi-objective problems.
- Provides a list of "good" solutions, not just a single solution, and always returns an answer, which gets better over time.
- Useful when the search space is very large and there are a large number of parameters involved.

**References**

[1] F. Pezzella, G. Morganti and G. Ciaschetti, “A genetic algorithm for the Flexible Job-shop Scheduling Problem”, Computers & Operations Research, vol. 35, no. 10, pp. 3202-3212, 2008. Available: 10.1016/j.cor.2007.02.014 [Accessed 19 June 2019].

[2] J. Tsai, T. Liu and J. Chou, “Hybrid Taguchi-Genetic Algorithm for Global Numerical Optimization”, IEEE Transactions on Evolutionary Computation, vol. 8, no. 4, pp. 365-377, 2004. Available: 10.1109/tevc.2004.826895 [Accessed 19 June 2019].

[3] “Applying genetic algorithms to neural network problems”, Neural Networks, vol. 1, p. 230, 1988. Available: 10.1016/0893-6080(88)90267-5.

[4] P. Festa, “A biased random-key genetic algorithm for data clustering”, Mathematical Biosciences, vol. 245, no. 1, pp. 76-85, 2013. Available: 10.1016/j.mbs.2013.07.011.

[5] K. Wong, C. Wu, R. Mok, C. Peng and Z. Zhang, “Evolutionary multimodal optimization using the principle of locality”, Information Sciences, vol. 194, pp. 138-170, 2012. Available: 10.1016/j.ins.2011.12.016.

[6] J. MOU, “Multi-objective Genetic Algorithm for Solving Multi-objective Flow-shop Inverse Scheduling Problems”, Journal of Mechanical Engineering, vol. 52, no. 22, p. 186, 2016. Available: 10.3901/jme.2016.22.186.

[7] P. Del Moral and J. Garnier, “Genealogical particle analysis of rare events”, The Annals of Applied Probability, vol. 15, no. 4, pp. 2496-2534, 2005. Available: 10.1214/105051605000000566.

[8] X. Ruiz del Portal, “Non-smooth monotonicity constraints in optimal control problems: Some economic applications”, Optimal Control Applications and Methods, vol. 32, no. 4, pp. 396-413, 2010. Available: 10.1002/oca.948.

[9] W. Jia, “Communicating object group and protocols for distributed systems”, Journal of Systems and Software, vol. 45, no. 2, pp. 113-126, 1999. Available: 10.1016/s0164-1212(98)10072-9.

[10] K. Yasuda, O. Yamazaki and T. Watanabe, “Proposal of a cannibalism bug-based search strategy using genetic algorithms (C-BUGS) and its application to multi objective optimization problems”, Electrical Engineering in Japan, vol. 139, no. 1, pp. 51-64, 2002. Available: 10.1002/eej.1146.

[11] Jun Zhang, H. Chung and W. Lo, “Pseudo co evolutionary genetic algorithms for power electronic circuits optimization”, IEEE Transactions on Systems, Man and Cybernetics, Part C (Applications and Reviews), vol. 36, no. 4, pp. 590-598, 2006. Available: 10.1109/tsmcc.2005.855497.

[12] S. Suvarna, M. K, P. N, P. R and V. Kumar, “MAP-V Secucam Using Pixel to Pixel Matching Algorithm”, IJIREEICE, vol. 5, no. 2, pp. 128-130, 2017. Available: 10.17148/ijireeice/ncaee.2017.29.

[13] K. Stanislawska, K. Krawiec and Z. Kundzewicz, “Modeling global temperature changes with genetic programming”, Computers & Mathematics with Applications, vol. 64, no. 12, pp. 3717-3728, 2012. Available: 10.1016/j.camwa.2012.02.049.

[14] J. Ryu, B. Contor, G. Johnson, R. Allen and J. Tracy, “System Dynamics to Sustainable Water Resources Management in the Eastern Snake Plain Aquifer Under Water Supply Uncertainty1”, JAWRA Journal of the American Water Resources Association, vol. 48, no. 6, pp. 1204-1220, 2012. Available: 10.1111/j.1752-1688.2012.00681.x.

[15] S. Zhang and V. Babovic, “An evolutionary real options framework for the design and management of projects and systems with complex real options and exercising conditions”, Decision Support Systems, vol. 51, no. 1, pp. 119-129, 2011. Available: 10.1016/j.dss.2010.12.001.

[16] H. Kilic, S. Zaim and D. Delen, “Development of a hybrid methodology for ERP system selection: The case of Turkish Airlines”, Decision Support Systems, vol. 66, pp. 82-92, 2014. Available: 10.1016/j.dss.2014.06.011.

[17] T. Vidal, T. Crainic, M. Gendreau, N. Lahrichi and W. Rei, “A Hybrid Genetic Algorithm for Multidepot and Periodic Vehicle Routing Problems”, Operations Research, vol. 60, no. 3, pp. 611-624, 2012. Available: 10.1287/opre.1120.1048.

[18] Y. Li, K. Ang, G. Chong, W. Feng, K. Tan and H. Kashiwagi, “CAutoCSD-evolutionary search and optimization enabled computer automated control system design”, International Journal of Automation and Computing, vol. 1, no. 1, pp. 76-88, 2004. Available: 10.1007/s11633-004-0076-8.

[19] P. Willett, “Genetic algorithms in molecular recognition and design”, Trends in Biotechnology, vol. 13, no. 12, pp. 516-521, 1995. Available: 10.1016/S0167-7799(00)89015-0.
