A typical local search algorithm requires that, with each solution x_i ∈ R^d, there is associated a predefined neighborhood N(x_i) ⊆ R^d. Given a current solution point x_i ∈ R^d, the set N(x_i) is searched for a point x_{i+1} with f(x_{i+1}) < f(x_i) (for minimization) or f(x_{i+1}) > f(x_i) (for maximization). If such a point exists, it becomes the new current solution point (x_i ← x_{i+1}), and the process is iterated. Otherwise, x_i is retained as a local optimum with respect to N(x_i). Then a set of feasible solution points is generated, and each of them is locally improved within its neighborhood.
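The iteration described above can be sketched in a few lines. This is a minimal illustration for minimization, not the book's own implementation; the function names (local_search) and the discrete example neighborhood are hypothetical choices for the sketch.

```python
def local_search(f, x0, neighbors, max_iters=1000):
    """Generic local search for minimization: repeatedly replace the
    current point x_i with an improving point x_{i+1} from N(x_i)."""
    x = x0
    for _ in range(max_iters):
        # Search the neighborhood N(x) for strictly better points.
        better = [y for y in neighbors(x) if f(y) < f(x)]
        if not better:
            return x  # x is a local optimum with respect to N(x)
        x = min(better, key=f)  # steepest-descent variant
    return x

# Usage: minimize f(x) = (x - 3)^2 over the integers,
# with neighborhood N(x) = {x - 1, x + 1}.
result = local_search(lambda x: (x - 3) ** 2, 10,
                      lambda x: [x - 1, x + 1])
```

Starting from x = 10 the search walks downhill to x = 3, where neither neighbor improves f, so 3 is returned as a local (here also global) optimum.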
Local search methods check only the neighborhood of the current feasible solution x_i, so the search range is dramatically reduced and convergence is accelerated. A major shortcoming of local search is that the algorithm tends to get stuck at a locally optimal configuration, i.e., a local minimum point, such as point 2 or 4 in Fig. 7.1. Different neighborhood structures result in different terrain surfaces of the search space and produce different numbers of local minimum points. The effectiveness of a local search algorithm depends on the number of local minimum points in the search space [16]; that is, local minimum points make a search problem hard. The smaller the number of local minimum points, the more effective a local search algorithm is. To reduce the influence of local minima on a local search algorithm, some local minimum traps must be filled. Gu and Huang [16] called this method of filling minimum traps "smoothing the search space"; it dynamically reconstructs the problem structure and smooths the rugged terrain surface of the search space. The smoothed search space can hide some local minimum points, thereby improving the performance of the traditional local search algorithm. Figure 7.2.1 illustrates the smoothing of the search space.
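The trap-filling effect can be sketched numerically. The following is a hedged one-dimensional analogue of the smoothing idea, not the scheme from [16] itself: deviations of a sampled objective from its mean are normalized and raised to a power alpha, which shrinks small fluctuations (shallow traps) more than large ones. The function name smooth and the choice of transform are assumptions made for illustration.

```python
import numpy as np

def smooth(f_values, alpha):
    """Flatten a sampled rugged objective toward its mean (alpha >= 1).

    Illustrative analogue of search-space smoothing: normalize the
    deviations from the mean to |dev| <= 1, then raise them to the
    power alpha. alpha = 1 returns the original landscape; larger
    alpha gives a flatter surface with fewer shallow minimum traps.
    """
    f = np.asarray(f_values, dtype=float)
    mean = f.mean()
    dev = f - mean
    scale = np.abs(dev).max() or 1.0
    dev = dev / scale  # now |dev| <= 1
    return mean + scale * np.sign(dev) * np.abs(dev) ** alpha

rugged = [3.0, 1.0, 2.5, 0.5, 2.0, 1.5, 3.5]   # many small traps
flatter = smooth(rugged, alpha=3.0)            # traps partially filled
```

With alpha = 3 the shallow dips move much closer to the mean while the deepest points remain the most extreme, which mirrors the intent of the solid-versus-dashed curves in Fig. 7.2.1.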
From Fig. 7.2.1, we can see that many local minimum traps are filled after applying a smoothing operator. The solid curve shows the original search space, which has many minimum traps, and the dashed curve shows the smoothed search space with fewer minimum traps. From the preceding discussion, we find that the many local minimum points embedded in the search space make a local search problem hard and sensitive to initialization. Iterative refinement clustering algorithms are in essence local search methods, so the same cause underlies their initialization sensitivity problem. The main idea of smoothing the search space is always the same, but different application areas smooth the search space in different ways. In the clustering area, clustering is defined as the process of partitioning unlabeled data objects into meaningful groups (clusters) so that the value of a cluster criterion Q(c) is minimized. Minimizing the value of Q(c) means that the intra-cluster similarities of all clusters are maximized, or that the distance of each data object to its cluster center is minimized. So the cluster criterion Q(c) is closely related to the similarity or distance between data objects.
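A common concrete choice for such a criterion, shown here as a sketch rather than the chapter's specific Q(c), is the within-cluster sum of squared distances (the k-means objective); the function name q_criterion and the toy data are assumptions for illustration.

```python
import numpy as np

def q_criterion(X, labels, centers):
    """Cluster criterion Q(c): sum of squared distances from each
    data object to its cluster center. Smaller values mean tighter
    clusters, i.e., higher intra-cluster similarity."""
    X = np.asarray(X, dtype=float)
    centers = np.asarray(centers, dtype=float)
    diffs = X - centers[labels]          # each point minus its own center
    return float((diffs ** 2).sum())

# Two well-separated clusters with their exact centroids as centers.
X = [[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]]
labels = [0, 0, 1, 1]
centers = [[0.0, 0.5], [5.0, 5.5]]
print(q_criterion(X, labels, centers))   # prints 1.0
```

Any repartitioning of these four points increases Q(c), which is exactly the sense in which minimizing the criterion yields meaningful groups.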