The Power of Heuristic Algorithms
In science, as well as in everyday life, we face various problems that cannot be solved precisely, either because that is impossible or because it would be impractical.
Let us imagine that we have to put gold nuggets into a knapsack. Each nugget contains a certain weight of gold and has a certain volume, where the weight in general does not correspond to the volume. The task is to fill the knapsack so as to collect as much gold as possible. This is a well-known optimization problem called the knapsack problem. Another example is to find the shortest possible route that visits each spot exactly once (see the section 'Travelling salesman problem' below). Although these instances might look theoretical and impractical, it is possible to model real problems with these 'prototypes'.
In everyday life we often behave heuristically rather than deterministically. For example, when filling the trunk of a car with several pieces of luggage, we usually decide only roughly on the order of putting the luggage in. This means that we usually get stuck with free space left in the trunk, then take out some pieces, try again with a new arrangement, and repeat the procedure until all the luggage is in the trunk.
An optimization problem can be defined as a problem whose solutions lie in a finite set X, called the solution space. A member of this set is called a feasible solution if it satisfies the basic conditions to be a possible solution to the given problem. We define the objective function (also called the profit or fitness function), which assigns a real number to each element of the solution space. The optimal solution is a feasible solution for which the objective function takes the maximum value (or minimum, depending on the direction of optimization).
Heuristic algorithms solve optimization problems by searching the solution space with a creative, heuristic strategy that is often based on the principle of trial and error. There is no guarantee that the optimal (best possible) solution will be found, but practice shows that heuristic algorithms can be very precise and effective. Depending on the problem and the applied metaheuristic, there is a high probability that the obtained solution is very close to the optimal one.
Local and global optimum
However, within heuristic algorithms we don't use a pure trial-and-error approach but refined techniques called metaheuristics. Hill climbing, simulated annealing, tabu search and genetic algorithms are some of the most famous. There are also other very efficient metaheuristics, including ant colony optimization. The basic terms within every heuristic search are the local and the global optimum, and the goal of the search is to find a local optimum that is at the same time the global optimum. We can illustrate this with the task of finding the highest mountain peak in a given area. As long as we succeed in finding a spot higher than our current position, we move to this new position, progressing toward a peak. However, this peak might not be the highest one we are looking for. So, a heuristic search usually has two mechanisms: one for seeking a local optimum and another one for escaping from a local optimum.
First we are going to explain the four mentioned metaheuristics: hill climbing, simulated annealing, tabu search and the genetic algorithm. In addition, a case study below explains a chosen problem in detail.
Hill climbing
Hill climbing is a heuristic algorithm based on the strategy of steepest ascent. The algorithm works as long as the heuristic function generates a solution that is better than the previous one. When a new candidate is not 'better' (closer to the optimal solution) than the previous one, the algorithm finishes its work. Some implementations of this algorithm allow a new candidate to be equally as good as the previous one, so that the algorithm ends only when a new solution is worse than the previous one.
The hill climbing metaheuristic has no mechanism for escaping from a local optimum. Thus, it is perfect for problems with only one local optimum, which is then the global optimum at the same time. However, it is possible to use this metaheuristic even for problems with more local optima. The following is the pseudocode of hill climbing. The fitness function is denoted by P() and the heuristic function by h().
Algorithm HillClimbing()
external h(), P()
select a feasible solution x
x_{best} = x
searching = true
while searching
    y = h(x)
    if P(y) > P(x_{best}) then
        x = y
        x_{best} = x
    else searching = false
return (x_{best})
Simulated annealing
Simulated annealing is a heuristic algorithm whose name and idea originate from the annealing process in metallurgy. The annealing technique increases the strength and hardness of a material, which is achieved by reducing defects in the crystal lattice and bringing the system into a state of lower internal energy than the initial state.
In general, at the beginning of annealing the structure of the material is in a local optimum of internal energy. Because of the heating, atoms leave their original positions in the crystal lattice. During slow cooling they have a chance to settle into different, better places. This process brings the whole system into a state of lower internal energy than the initial one. The ideal case is to reach the lowest possible energy state, just as the ideal case for a heuristic algorithm is to find the global optimum. The following is the pseudocode of the simulated annealing algorithm.
There are three input parameters: the number of iterations c_{max}, the cooling coefficient α, and the initial temperature T_{0}. It turns out that the algorithm is most sensitive to the cooling coefficient. The temperature is decreased in each iteration by the formula T = αT. The value of the coefficient α is greater than 0 and less than 1, but usually close to 1; a typical value is 0.998. The number of iterations varies from around 1000 to several thousand.
Heuristic algorithms are the art of learning by mistakes.
Within heuristic algorithms there is no guarantee that the found solution is the best possible. However, in practice heuristic algorithms present themselves as very efficient and precise.
A vital part of any heuristic algorithm is the heuristic function. In general, it is designed creatively, but often the same heuristic function is able to solve completely different problems.
There is a wide variety of problems in science, engineering and other areas that can be solved by heuristic algorithms. Pattern recognition and finding the optimal shape of an object are only some of them.
The genetic algorithm is one of the most popular metaheuristics.
Algorithm SimulatedAnnealing(c_{max}, α, T_{0})
set T to T_{0}
c = 1
select initial solution X
P_{x} = Fitness(X)
X_{best} = X
P_{best} = P_{x}
while (c <= c_{max})
    Y = Heuristic(X)
    if (Fitness(Y) > P_{x}) then
        X = Y
        P_{x} = Fitness(Y)
        if (P_{x} > P_{best}) then
            X_{best} = X
            P_{best} = P_{x}
    else
        r = Random(1)
        if r < e^{(Fitness(Y) − Fitness(X))/T} then
            X = Y
            P_{x} = Fitness(Y)
    T = αT
    c = c + 1
return (X_{best})
Tabu search
The tabu search algorithm is a heuristic algorithm whose heuristic function returns the best candidate from the neighborhood of the current solution. The element of the current neighborhood with the extreme value of the objective function is passed to the next cycle. Contrary to hill climbing and simulated annealing, tabu search always accepts the new candidate, whether it is better than the previous one or not. This procedure provides the searching of new parts of the solution space, while the heuristic function has the role of finding the best candidate in the current neighborhood.
In order to avoid searching the same neighborhood more than once, a 'tabu list' is formed: moves that have already been made are written in the tabu list and are forbidden for a while. The pseudocode of the tabu search algorithm follows. The input parameters are the length of the tabu list L and the number of iterations c_{max}. A typical value of L is 10. After any change of the current solution (X to Y), the move is written in the tabu list, meaning that it is forbidden.
Travelling salesman problem
Given a list of cities with known distances between each pair, one has to find the shortest possible route that visits each city exactly once. Mathematically formulated, this problem is known as the Travelling Salesman Problem and serves as a prototype for other concrete problems. It would be very impractical to find the solution deterministically: for n cities there are (n−1)!/2 distinct routes, so for relatively many cities it is practically impossible, since the calculation can't finish in realistic time even on the fastest computers. Such problems are a typical job for a heuristic algorithm.
Algorithm TabuSearch(c_{max}, L)
select initial solution X
X_{best} = X
P_{best} = Fitness(X)
c = 1
while (c <= c_{max})
    N = N(X) \ TabuList[d], d = c−L, …, c−1
    find Y in N such that Fitness(Y) is maximum
    TabuList[c] = Change(Y, X)
    X = Y
    if (Fitness(X) > P_{best}) then
        X_{best} = X
        P_{best} = Fitness(X)
    c = c + 1
return (X_{best})
Video 1. Genetic algorithm in action: a popular demo of the genetic algorithm. The task is to draw as good an approximation of the Mona Lisa as possible with a certain number of circles. The colour and radius of each circle vary through the iterations. Here we see an impressive approximation of the Mona Lisa with 250 circles.
Genetic algorithm
The genetic algorithm is probably the most popular metaheuristic. It is a heuristic algorithm that imitates the process of natural evolution as described by Darwin's theory.
Within the genetic algorithm, the solution space consists of individuals, represented by chromosomes. Thus, a chromosome is a feasible solution. The specimens are then left to the process of evolution. This means that the individuals with the best properties, i.e. those for which the objective function has a maximum value, are transferred into the next generation. The duration of the process is given in advance in several possible ways, including by the number of iterations. The basic mechanisms for selecting better and better chromosomes are, as in Darwin's theory: selection, crossover and mutation.
The selection operator selects a certain number of the best chromosomes for the next generation (iteration). These chromosomes will form new candidates by the crossover operator. Finally, the mutation operator has the task of forming candidates from unexplored parts of the solution space. Thus, while the main role of crossover is to converge towards the optimum, mutation provides a search of the overall solution space and an escape from a local optimum.
Genetic algorithms are known as very robust algorithms, capable of solving a wide range of problems. Contrary to the other heuristic algorithms, genetic algorithms deal with a set of candidates in each iteration instead of only one. Because of that, genetic algorithms usually require a lot of computational resources.
Algorithm GeneticAlgorithm()
generate initial population P(0)
evaluate the fitness of each individual in the population
t = 0
repeat
    t = t + 1
    select best-ranking individuals to reproduce
    create the new generation P(t) from P(t−1) through crossover and mutation
    evaluate individual fitnesses
until (terminating condition)
return (best chromosome)
Case study
As an illustration of the described ideas, we are going to find the maximum of the function
f(x) = x^cos(x)
in the interval [0, 50] (see the figure above). To solve this problem we will use the hill climbing metaheuristic. The figure below shows a simple implementation in the programming language C. The implementation is without code optimization, and we take no care of boundary conditions (it is possible for the best solution to 'escape' slightly from the given interval). In the first part of the code we see the heuristic function, whose task is a heuristic search for the function's maximum on the current interval. In every subsequent iteration, the current interval becomes the neighborhood of the currently best solution.
With this procedure, the solution space is searched in more and more detail with every iteration. Of course, there is the risk that the procedure will skip the global optimum. However, with a reasonable choice of the input parameter 'step' n, it is possible to reach the global optimum in a few iterations. Table 1 shows the results of the implemented algorithm for several choices of n.
Table 1. Results of the implemented algorithm for several choices of n.

n      numiter   x_best      f(x_best)
50     7         43.988305   43.985301
100    6         43.988305   43.985301
500    5         43.988305   43.985301
4