Metaheuristics
In the map coloring example we examined in the previous sections, all our local searches were of the greedy variety: they made only moves that improved the objective. This greedy technique is efficient, but it can run into difficulties even on small problems. Although there may be no improving move from the current solution, this does not mean that no better solutions exist; moreover, there is no bound on how much better such a solution could be.
For example, consider the problem:
IloEnv env;
IloModel mdl(env);
IloIntVarArray a(env, 2, 0, 1);              // two 0-1 decision variables
IloInt C = 1;
IloNumVar cost(env, 0, 2 * C + 1);           // cost ranges from 0 to 2C+1
mdl.add(cost == C * (a[0] + a[1]) + (C + 1.0) * (a[0] != a[1]));
The costs of different configurations are given below:
a[0] | a[1] | cost
-----|------|-------
0    | 0    | 0
0    | 1    | 2C + 1
1    | 0    | 2C + 1
1    | 1    | 2C
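For example, in the configuration a[0]=0, a[1]=1 the first term contributes C·(0 + 1) = C and, because the two variables differ, the penalty term contributes a further C + 1, giving a total of 2C + 1.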
If the current solution happens to be at state a[0]=1, a[1]=1 (which we shall term 11 by omitting reference to the array a), and we are making moves by flipping single bits, then there is no move that will improve the cost: moves to states 01 and 10 each increase the cost by one. However, the state 00 has a cost that is 2C less than that of state 11. Obviously, we can make this disparity arbitrarily large by increasing C. The configurations 01 and 10 act as a barrier to local search, "guarding" the global minimum.
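To see the trap concretely, here is a minimal, self-contained C++ sketch of such a greedy bit-flip search on this cost function. It does not use the Solver API: the cost expression is simply re-evaluated by hand (with C = 1), and the neighborhood consists of the two single-bit flips. Started from state 11, the search stops immediately, because neither flip improves the cost.

#include <iostream>

const int C = 1;

// Cost expression of the example model, evaluated by hand; its values
// reproduce the table above.
int cost(int a0, int a1) {
    return C * (a0 + a1) + (C + 1) * (a0 != a1 ? 1 : 0);
}

int main() {
    int a[2] = {1, 1};                 // start at state 11, cost 2C
    bool improved = true;
    while (improved) {
        improved = false;
        for (int i = 0; i < 2; ++i) {  // neighborhood: the two single-bit flips
            int before = cost(a[0], a[1]);
            a[i] = 1 - a[i];           // try the flip
            if (cost(a[0], a[1]) < before)
                improved = true;       // keep an improving move
            else
                a[i] = 1 - a[i];       // otherwise undo it
        }
    }
    // Prints "stuck at 11 with cost 2", even though state 00 has cost 0.
    std::cout << "stuck at " << a[0] << a[1]
              << " with cost " << cost(a[0], a[1]) << std::endl;
    return 0;
}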
The well-known problem just described is the local minimum problem: greedy searches become trapped in local minima, and the quality of the solutions they obtain can suffer as a result. For this reason, metaheuristics are commonly used to escape from local minima. A variety of escape techniques can be used, the simplest of which is to allow moves that degrade the cost of the solution, in the hope that subsequent moves that improve the cost can then be found.
Care should be taken, however, over the way in which the cost is allowed to degrade. Simply accepting any move, without guidance, leads to a random walk through the neighborhood space and is unlikely to produce a good solution. Instead, metaheuristics accept degrading moves in a controlled way, in order both to escape local minima and then to go on to find better solutions.
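As an illustration of such controlled acceptance, the sketch below replaces the greedy acceptance test used above with a simulated-annealing-style (Metropolis) rule: a degrading move is accepted with probability exp(-delta / temperature), where delta is the increase in cost and the temperature is gradually lowered. This is a generic sketch of the idea, not the Solver library's metaheuristic interface; the parameters temperature and coolingRate are illustrative choices.

#include <cmath>
#include <cstdlib>
#include <iostream>

const int C = 1;

// Cost expression of the example model, evaluated by hand.
int cost(int a0, int a1) {
    return C * (a0 + a1) + (C + 1) * (a0 != a1 ? 1 : 0);
}

int main() {
    std::srand(0);
    int a[2] = {1, 1};                 // start in the local minimum 11
    double temperature = 2.0;          // illustrative starting temperature
    const double coolingRate = 0.95;   // illustrative geometric cooling

    for (int iter = 0; iter < 100; ++iter) {
        int i = std::rand() % 2;       // propose flipping a random bit
        int before = cost(a[0], a[1]);
        a[i] = 1 - a[i];
        int delta = cost(a[0], a[1]) - before;

        // Improving moves are always kept; degrading moves are kept
        // with probability exp(-delta / temperature).
        double u = std::rand() / (RAND_MAX + 1.0);
        if (delta > 0 && u >= std::exp(-delta / temperature))
            a[i] = 1 - a[i];           // reject: undo the flip

        temperature *= coolingRate;    // cool down over time
    }
    std::cout << "final state " << a[0] << a[1]
              << ", cost " << cost(a[0], a[1]) << std::endl;
    return 0;
}

With this rule the search can cross the barrier states 01 and 10 early on, while the temperature is high, and then settle in the global minimum 00 as the temperature falls and degrading moves become increasingly unlikely to be accepted.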