
Research Article

An Improved Animal Migration Optimization Algorithm for Clustering Analysis

Mingzhi Ma,1 Qifang Luo,1,2 Yongquan Zhou,1,2 Xin Chen,1 and Liangliang Li1

1 College of Information Science and Engineering, Guangxi University for Nationalities, Nanning 530006, China
2 Guangxi High School Key Laboratory of Complex System and Intelligent Computing, Nanning 530006, China

Correspondence should be addressed to Qifang Luo; lqf@163.com

Received 14 June 2014; Revised 17 December 2014; Accepted 17 December 2014

Academic Editor: Josef Diblík

Copyright © 2015 Mingzhi Ma et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Animal migration optimization (AMO) is one of the most recently introduced algorithms based on the behavior of animal swarm migration. This paper presents an improved AMO algorithm (IAMO), which significantly improves the original AMO in solving complex optimization problems. Clustering is a popular data analysis and data mining technique used in many fields. The well-known method for solving clustering problems is the k-means clustering algorithm; however, it highly depends on the initial solution and easily falls into local optima. To improve on the defects of the k-means method, this paper uses IAMO for the clustering problem and experiments on synthetic and real-life data sets. The simulation results show that the algorithm performs better than the k-means, PSO, CPSO, ABC, CABC, and AMO algorithms for solving the clustering problem.

1. Introduction

Data clustering is the process of grouping data into a number of clusters. The goal of data clustering is to make the data in the same cluster share a high degree of similarity while being very dissimilar to data from other clusters. It is a main task of exploratory data mining and a common technique for statistical data analysis used in many fields, including machine learning, pattern recognition, image analysis, information retrieval, and bioinformatics. Cluster analysis originated in anthropology with Driver and Kroeber in 1932, was introduced to psychology by Zubin in 1938 and Tryon in 1939, and was famously used by Cattell beginning in 1943 [1] for trait theory classification in personality psychology. Many clustering methods have been proposed; they are divided into two main categories: hierarchical and partitional. The k-means clustering method [2] is one of the most commonly used partitional methods. However, the results of k-means on the clustering problem highly depend on the initial solution, and it easily falls into local optimal solutions. Zhang et al. proposed an improved k-means clustering algorithm called k-harmonic means [3], but the accuracy of the results obtained by that method is not high enough.

In recent years, many studies have been inspired by animal behavior phenomena to develop optimization techniques, such as the firefly algorithm (FA) [4], cuckoo search (CS) [5], the bat algorithm (BA) [6], artificial bee colony (ABC) [7], and particle swarm optimization (PSO) [8]. Because of their advantages of global and parallel efficiency, robustness, and universality, these bioinspired algorithms have been widely used in constrained optimization and engineering optimization [9, 10], scientific computing, automatic control, and the clustering problem [11-21]. Niknam et al. proposed an efficient hybrid evolutionary algorithm based on combining ACO and SA for the clustering problem [15, 16] in 2008. In 1991, Colorni et al. presented the ant colony optimization (ACO) algorithm based on the behavior of ants seeking a path between their colony and a source of food [22]. Then Shelokar et al. and Kao and Cheng solved the clustering problem using the ACO algorithm [17, 18] in 2004 and 2006. Eberhart and Kennedy proposed the particle swarm optimizer (PSO) algorithm, which simulates the movement of organisms in a bird flock or fish school [8], in 1995, and the algorithm has also been adopted to solve this problem by Omran et al. and van der Merwe and Engelbrecht [19, 23] in 2005 and 2003. Kao et al. presented a hybrid approach based on a combination



of the k-means algorithm, Nelder-Mead simplex search, and PSO for clustering analysis [14] in 2008. Niknam et al. presented a hybrid evolutionary algorithm based on PSO and SA (simulated annealing, 1989 [24]) to solve the clustering problem [13] in 2009. Zou et al. proposed a cooperative artificial bee colony algorithm to solve the clustering problem and experimented on synthetic and real-life data sets to evaluate its performance [11] in 2010. Niknam and Amiri proposed an efficient hybrid approach based on PSO, ACO, and k-means, called the PSO-ACO-K approach, for clustering analysis [12] in 2010. The artificial bee colony (ABC) algorithm was described by Karaboga [25] in 2005, and it was adopted to solve the clustering problem by Karaboga and Ozturk [20] in 2011. Voges and Pope used an evolutionary-based rough clustering algorithm for the clustering problem [21] in 2012. Chen et al. used the monkey search algorithm for clustering analysis [26] in 2014.

Animal migration optimization (AMO) is a new bioinspired intelligent optimization algorithm simulating animal migration behavior, proposed by Li et al. [27] in 2013. AMO simulates the widespread migration phenomenon in the animal kingdom: through changes of position and replacement of individuals, the optimal solution is found gradually. AMO has obtained good experimental results on many optimization problems. This paper presents an algorithm to improve the performance of AMO. We propose a new migration method based on a shrinking living-area operator; this method helps AMO converge rapidly to the global optimum. By selecting the better solution space around the current solution, it improves the search ability, accelerates the convergence velocity, and gives more chance to find the global optima.

The structure of the paper is as follows. In Section 2, the traditional k-means method for clustering is presented. In Section 3, the original AMO algorithm is introduced. Section 4 describes our proposed novel migration process. Section 5 elaborates the improved AMO for clustering, and some biological foundations of animal behaviors are explained. Section 6 illustrates the experiments and discusses the results. Section 7 studies the extent to which different sizes of the shrinkage coefficient impact the proposed algorithm. At the end of the paper, we conclude with future directions and developments for the improved AMO.

2. The k-Means Clustering Algorithm

The target of data clustering is grouping data into a number of clusters. k-means is one of the simplest unsupervised learning algorithms that solve the clustering problem; it was proposed by MacQueen in 1967 [28]. The procedure follows a simple and easy way to classify a given data set D = {x_1, x_2, ..., x_n} through a certain number of clusters G_1, G_2, ..., G_K (assume K clusters) fixed a priori; each data vector is a p-dimensional vector satisfying the following conditions [29, 30]:

(1) G_i ≠ ∅, i = 1, 2, ..., K;

(2) G_i ∩ G_j = ∅, i, j = 1, 2, ..., K, i ≠ j;

(3) ⋃_{i=1}^{K} G_i = {x_1, x_2, ..., x_n}.

The k-means clustering algorithm is as follows.

(1) Set the number of clusters K and the data set D = {x_1, x_2, ..., x_n}.

(2) Randomly choose K points c_1, c_2, ..., c_K as the cluster centroids from x_1, x_2, ..., x_n.

(3) Assign each object x_i to the group that has the closest centroid. The principle of division is as follows: if d(x_i, c_j) < d(x_i, c_k) for k = 1, 2, ..., K and k ≠ j, the data x_i will be divided into the classified collection G_j.

(4) When all objects have been assigned, recalculate the positions of the K centroids c*_1, c*_2, ..., c*_K:

\[
c_i^* = \frac{1}{\lvert G_i \rvert} \sum_{x_j \in G_i} x_j, \quad i = 1, 2, \ldots, K, \tag{1}
\]

where |G_i| is the number of points in the classified collection G_i.

(5) Repeat Steps (3) and (4) until the centroids no longer move.

The main idea is to define K centroids, one for each cluster. These centroids should be placed in a cunning way, because different locations cause different results, so the better choice is to place them as far away from each other as possible. In this study we use the Euclidean metric as the distance metric:

\[
d(x_i, c_j) = \sqrt{\sum_{k=1}^{p} \left( x_{ik} - c_{jk} \right)^2}. \tag{2}
\]

Finally, this algorithm aims at minimizing an objective function, in this case a squared error function. The objective function is

\[
f(X, C) = \sum_{i=1}^{n} \min\left\{ \lVert x_i - c_k \rVert^2 \mid k = 1, 2, \ldots, K \right\}. \tag{3}
\]
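To make the procedure above concrete, here is a minimal sketch of the k-means loop in Python with NumPy (the paper's experiments were implemented in MATLAB; the function and variable names below are ours, not the authors'):

```python
import numpy as np

def kmeans(D, K, max_iter=100, seed=0):
    """Minimal k-means sketch: D is an (n, p) data matrix, K the number of clusters."""
    rng = np.random.default_rng(seed)
    # Step (2): choose K data points as the initial centroids.
    centroids = D[rng.choice(len(D), size=K, replace=False)].copy()
    for _ in range(max_iter):
        # Step (3): assign each point to its nearest centroid (Euclidean metric, Eq. (2)).
        dists = np.linalg.norm(D[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step (4): recompute each centroid as the mean of its collection G_i (Eq. (1)).
        new_centroids = np.array([D[labels == k].mean(axis=0) if np.any(labels == k)
                                  else centroids[k] for k in range(K)])
        # Step (5): stop when the centroids no longer move.
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    # Objective (3): sum of squared distances to the closest centroid.
    dists = np.linalg.norm(D[:, None, :] - centroids[None, :, :], axis=2)
    return centroids, dists.argmin(axis=1), (dists.min(axis=1) ** 2).sum()
```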

3. Animal Migration Optimization (AMO)

The animal migration algorithm can be divided into an animal migration process and a population updating process. In the migration process, the algorithm simulates how groups of animals move from the current position to a new position. During the population updating process, the algorithm simulates how animals are updated by a probabilistic method.

3.1. Animal Migration Process. During the animal migration process, an animal should obey three rules: (1) avoid collisions with your neighbors; (2) move in the same direction as your neighbors; and (3) remain close to your neighbors. In order to define the concept of the local neighborhood of an individual, we use a topological ring, as illustrated in Figure 1. For the sake of simplicity, we set the length of the neighborhood to be five for each dimension of the individual. Note that, in our algorithm, the neighborhood topology is static and is defined on the set of indices of vectors.


If the index of an animal is i, then its neighborhood consists of the animals having indices i − 2, i − 1, i, i + 1, i + 2; if the index of the animal is 1, the neighborhood consists of the animals having indices NP − 1, NP, 1, 2, 3, and so forth. Once the neighborhood topology has been constructed, we select one neighbor randomly and update the position of the individual according to this neighbor, as can be seen in the following formula:

\[
X_{i,G+1} = X_{i,G} + \delta \cdot \left( X_{\mathrm{neighborhood},G} - X_{i,G} \right), \tag{4}
\]

where X_{neighborhood,G} is the current position of the neighbor, δ is produced by a random number generator controlled by a Gaussian distribution, X_{i,G} is the current position of the ith individual, and X_{i,G+1} is the new position of the ith individual.
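As a sketch of this migration step, assuming a population array X of shape (NP, dim); treating δ as drawn per dimension is our reading, since the paper does not state whether δ is a scalar or a vector:

```python
import numpy as np

def migrate(X, rng):
    """AMO migration (Eq. (4)): move each animal toward a randomly chosen ring neighbor."""
    NP, dim = X.shape
    X_new = X.copy()
    for i in range(NP):
        # Ring neighborhood of length five: indices i-2 .. i+2, wrapping around.
        ring = [(i + off) % NP for off in (-2, -1, 0, 1, 2)]
        neighbor = X[rng.choice(ring)]
        delta = rng.normal(size=dim)  # Gaussian step factor (per-dimension assumption)
        X_new[i] = X[i] + delta * (neighbor - X[i])
    return X_new
```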

3.2. Population Updating Process. During the population updating process, the algorithm simulates how some animals leave the group and some join the new population. Individuals will be replaced by some new animals with a probability P_a. The probability is set according to the quality of the fitness: we sort fitness in descending order, so the probability for the individual with the best fitness is 1/NP and that for the individual with the worst fitness, by contrast, is 1. The process is shown in Algorithm 1.

In Algorithm 1, r_1, r_2 ∈ [1, NP] are randomly chosen integers with r_1 ≠ r_2 ≠ i. After producing the new solution X_{i,G+1}, it will be evaluated and compared with X_{i,G}, and we choose the individual with the better objective fitness:

\[
X_i =
\begin{cases}
X_{i,G}, & \text{if } f\left(X_{i,G}\right) \text{ is better than } f\left(X_{i,G+1}\right), \\
X_{i,G+1}, & \text{otherwise}.
\end{cases}
\tag{5}
\]
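A sketch of Algorithm 1 together with the greedy selection of equation (5); we assume a minimization problem (smaller fitness is better, as in the clustering objective later), and Pa is the per-individual, rank-based replacement probability described above:

```python
import numpy as np

def update_population(X, fitness, X_best, Pa, rng, evaluate):
    """AMO population updating (Algorithm 1) plus greedy selection (Eq. (5))."""
    NP, dim = X.shape
    for i in range(NP):
        trial = X[i].copy()
        for j in range(dim):
            if rng.random() > Pa[i]:  # replace component j, as in Algorithm 1
                r1, r2 = rng.choice([r for r in range(NP) if r != i],
                                    size=2, replace=False)
                trial[j] = (X[r1, j]
                            + rng.random() * (X_best[j] - X[i, j])
                            + rng.random() * (X[r2, j] - X[i, j]))
        f_trial = evaluate(trial)
        if f_trial < fitness[i]:  # Eq. (5): keep the better of parent and offspring
            X[i], fitness[i] = trial, f_trial
    return X, fitness
```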

4. The New Migration Process Method

The AMO algorithm uses the migration process and the population updating process to find a satisfactory solution. The proposed algorithm uses a new migration process: a living area is established around the leader animal (the individual with the best fitness value), and animals migrate from their current locations into this new living area, simulating the animal migration process.

At first, there are NP animals living in the living area, as shown in Figure 2(a), moving, eating, drinking, reproducing, and so on; some individuals move randomly, their positions are updated, and then we calculate the best position of the animals by the fitness function and record it. But the amount of food or water gradually diminishes as time wears on, as shown in Figure 2(b), and some animals migrate from the current areas, which have no food and water, to a new area with abundant food and water, as shown in Figure 2(c). In Figure 2, the green parts represent the living areas with abundant food and water, where animals can live; the yellow parts represent the areas that lack food or water, where animals can no longer live and from which they must migrate to a new living area (the green parts in Figure 2(c)). We shrink the living area after a period of time (as shown in Figures 2(a) and 2(c)), and then the animals migrate to the new living area ceaselessly. As a rule of thumb, the globally optimal solution is always near the current best solution; in IAMO the animals' living area becomes smaller and smaller (by formula (6)) after each iteration, and the individuals get closer and closer to the globally optimal solution, so we can accelerate the convergence velocity and improve the precision of the algorithm to some extent.

The boundary of the living area is established by

\[
\mathrm{low} = X_{\mathrm{best}} - R, \qquad \mathrm{up} = X_{\mathrm{best}} + R, \qquad R = \rho \cdot R, \tag{6}
\]

where X_best is the leader animal (the current best solution), low and up are the lower and upper bounds of the living area, R is the living area radius, ρ ∈ (0, 1) is the shrinkage coefficient, and low, up, and R are all 1 × D row vectors. In general, the original value of R depends on the size of the search space. As the iterations go on, a big value of R improves the exploration ability of the algorithm and a small value of R improves the exploitation ability of the algorithm.
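A small sketch of the living-area construction and shrinkage of formula (6); clamping the area to the search bounds a and b is our own safeguard, not stated in the paper:

```python
import numpy as np

def living_area(X_best, R, a, b, rho=0.92):
    """Build the new living area around the leader and shrink its radius (Eq. (6))."""
    low = np.maximum(X_best - R, a)  # clamp to the search space (our assumption)
    up = np.minimum(X_best + R, b)
    return low, up, rho * R          # R = rho * R, with rho in (0, 1)
```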

5. The IAMO Algorithm for Solving the Clustering Problem

5.1. Initializing the Population. During the initialization process, the algorithm begins with initializing a set of NP animal positions X_1, X_2, X_3, ..., X_NP; each animal position X_i is a 1 × (K × D)-dimensional vector, where K is the number of clustering centers and D is the dimension of the test set G_{N×D}. The cluster centers are x*_i = (x_{i1}, x_{i2}, ..., x_{iD}) (i = 1, 2, ..., K); each center x*_i is a 1 × D-dimensional vector. The lower bound of the centers is the minimum of each column in the test set G_{N×D}, namely, a*_i = min{G_1, G_2, ..., G_D}, and the upper bound of the centers is b*_i = max{G_1, G_2, ..., G_D}. So we can initialize the position of an individual X_i = {x*_1, x*_2, ..., x*_K} = {(x_{11}, x_{12}, ..., x_{1D}), (x_{21}, x_{22}, ..., x_{2D}), ..., (x_{K1}, x_{K2}, ..., x_{KD})} = {x_{11}, x_{12}, ..., x_{KD}}, and then the lower and upper bounds of the solution space are a = (a*_1, a*_2, ..., a*_K) and b = (b*_1, b*_2, ..., b*_K).

Animals are randomly and uniformly distributed between the prespecified lower initial parameter bound a and the upper initial parameter bound b. So the jth component of the ith vector is as follows:

\[
x_{ij} = a_j + \mathrm{rand}_{ij}[0, 1] \cdot \left( b_j - a_j \right), \quad i = 1, \ldots, NP, \; j = 1, \ldots, K \times D, \tag{7}
\]

where rand_{ij}[0, 1] is a uniformly distributed random number between 0 and 1.
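Formula (7) amounts to uniform sampling inside the box [a, b]; a sketch with our own helper name:

```python
import numpy as np

def init_population(NP, K, D, data):
    """Initialize NP animals, each a flat vector of K cluster centers (Eq. (7)).
    `data` is the (N, D) test set G; the bounds are its column-wise min and max."""
    a = np.tile(data.min(axis=0), K)      # lower bound a, repeated for each center
    b = np.tile(data.max(axis=0), K)      # upper bound b
    rand = np.random.random((NP, K * D))  # rand_ij[0, 1]
    return a + rand * (b - a), a, b
```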

5.2. Animals Migration. During the migration process, because of animals hunting, foraging, or drinking in the living area, some parts of the living area come to lack food or water, or the climate conditions change, and some animals migrate from the current living area to a new living area which has abundant food and water or climate conditions suitable for living. We assume that there is only one living area, and the animals outside the new living area migrate into it, as depicted in Section 4.


(1) For i = 1 to NP do
(2)   For j = 1 to D do
(3)     If rand > P_a then
(4)       X_{i,G+1} = X_{r1,G} + rand · (X_{best,G} − X_{i,G}) + rand · (X_{r2,G} − X_{i,G})
(5)     End If
(6)   End For
(7) End For

Algorithm 1: Population updating process.

(1) Begin
(2)   Set the generation counter G, the living area radius R, and the shrinkage coefficient ρ, and randomly initialize X_i with a population of NP animals in the solution space
(3)   Evaluate the fitness for each individual X_i; record the best individual X_best
(4)   While the stopping criterion is not satisfied do
(5)     Establish a new living area by low = X_best − R, up = X_best + R
(6)     Animals migrate into the new living area
(7)     For i = 1 to NP do
(8)       For j = 1 to D do
(9)         Select randomly r_1 ≠ r_2 ≠ i
(10)        If rand > P_a then
(11)          X_{i,G+1} = X_{r1,G} + rand · (X_{best,G} − X_{i,G}) + rand · (X_{r2,G} − X_{i,G})
(12)        End If
(13)      End For
(14)    End For
(15)    For i = 1 to NP do
(16)      Evaluate the offspring X_{i,G+1}
(17)      If X_{i,G+1} is better than X_i then
(18)        X_i = X_{i,G+1}
(19)      End If
(20)    End For
(21)    Memorize the best solution achieved so far
(22)    R = R · ρ
(23)  End While
(24) End

Algorithm 2: An improved animal migration optimization algorithm (IAMO).

We calculate the distance between the cluster centers x*_1, x*_2, ..., x*_K and the test data set, then classify the test data set into K categories according to the distance, and finally obtain the fitness according to the fitness function

\[
f(X, G) = \sum_{i=1}^{N} \min\left\{ \lVert G_i - x_k \rVert^2 \mid k = 1, 2, \ldots, K \right\}. \tag{8}
\]

According to the fitness function, we obtain the best individual X_best, and the new living area can be established by X_best and R.

5.3. Individuals in Population Updating. During the population updating process, the algorithm simulates how some animals are preyed on by their enemies, how some animals leave the group and some join the group from other groups, or how some new animals are born. In IAMO, we assume that the number of available animals is fixed and every animal will be replaced with probability P_a, as shown in Section 3.2.

The specific implementation steps of the improved animal migration optimization algorithm (IAMO) are shown in Algorithm 2.

6. Numerical Simulation Experiments

All of the algorithms were programmed in MATLAB R2008a; the numerical experiments were run on an AMD Athlon(tm) II X4 640 processor with 2 GB of memory.

Experimental results comparing the IAMO clustering algorithm with six typical stochastic algorithms, namely, PSO [31], CPSO [32], ABC [20], CABC [11], AMO [27], and the k-means algorithm, are provided for two artificial data sets and eight real-life data sets (Iris, teaching assistant evaluation (TAE), wine, seeds, StatLog (heart), Haberman's survival, balance scale, and Wisconsin breast cancer), which are selected from the UCI machine learning repository [33].

Figure 1: The concept of the neighborhood of an animal (ring topology over indices i − 2, i − 1, i, i + 1, i + 2).

Artificial Data Set One (N = 250, d = 3, and K = 5). This is a three-featured problem with five classes, where every feature of the classes was distributed according to Class 1: Uniform(85, 100), Class 2: Uniform(70, 85), Class 3: Uniform(55, 70), Class 4: Uniform(40, 55), and Class 5: Uniform(25, 40) [12, 14]. The data set is illustrated in Figure 3.

Artificial Data Set Two (N = 600, d = 2, and K = 4). This is a two-featured problem with four unique classes. A total of 600 patterns were drawn from four independent bivariate normal distributions, where the classes were distributed according to

\[
N_2\!\left( \mu = \begin{pmatrix} m_i \\ m_i \end{pmatrix}, \; \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix} \right), \tag{9}
\]

where i = 1, 2, 3, 4, m_1 = −3, m_2 = 0, m_3 = 3, and m_4 = 6; μ and Σ are the mean vector and covariance matrix, respectively [12, 14]. The data set is illustrated in Figure 4.

Iris Data (N = 150, d = 4, and K = 3). This data set consists of 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, collected by Anderson [34]. From each species there are 50 observations for sepal length, sepal width, petal length, and petal width, in cm. This data set was used by Fisher [35] in his initiation of the linear-discriminant-function technique [11, 12, 33].

Teaching Assistant Evaluation (N = 151, d = 5, and K = 3). The data consist of evaluations of teaching performance over three regular semesters and two summer semesters of 151 teaching assistant (TA) assignments at the Statistics Department of the University of Wisconsin-Madison. The scores were divided into 3 roughly equal-sized categories ("low", "medium", and "high") to form the class variable [33].

Wine Data (N = 178, d = 13, and K = 3). This is the wine data set, which is also taken from the MCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous. There is no missing attribute value [11, 12, 33].

Seeds Data (N = 210, d = 7, and K = 3). This data set consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each species there are 70 observations for area A, perimeter P, compactness C (C = 4πA/P²), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [33].

StatLog (Heart) Data (N = 270, d = 13, and K = 2). This data set is a heart disease database similar to a database already present in the repository (heart disease databases), but in a slightly different form [33].

Haberman's Survival (N = 306, d = 3, and K = 2). The data set contains cases from a study that was conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [33].

Balance Scale Data (N = 625, d = 4, and K = 3). This data set was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is the greater of (left-distance × left-weight) and (right-distance × right-weight). If they are equal, it is balanced [33].

Wisconsin Breast Cancer (N = 683, d = 9, and K = 2). It consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [11, 12, 33].

Here we set the parameters of AMO and IAMO as follows. The population size of AMO and IAMO is 100. In IAMO, the original living area radius is R = 0.3(b − a) and the shrinkage coefficient is ρ = 0.92. For PSO, the inertia weight is w = 0.729, the acceleration coefficients are c_1 = 2 and c_2 = 2, and the population size is M = 100. The population size of CPSO is 20. The population sizes of ABC and CABC are 50 and 10, respectively. In order to compare with the other algorithms, the maximum number of generations for all algorithms is 100.

For every data set, each algorithm is applied 20 times individually, each run with a random initial solution. For the Art1 and Art2 data sets, once the randomly generated parameters are determined, the same parameters are used to test the performance of the algorithms. We ranked each algorithm according to the mean result. The results are kept to four digits after the decimal point. The mean value, the best value, the worst value, the standard deviation, and the rank value are recorded in Tables 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10.


Figure 2: Animals' migration process: (a) the Gth iteration living area; (b) animals begin to migrate; (c) the (G + 1)th iteration living area.

Figure 3: The distribution image of Art1.

Figure 4: The distribution image of Art2.

Table 1: Results obtained by the algorithms for 20 different runs on Art1 data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 2106.6059 | 1720.7627 | 2456.0468 | 358.8518 | 7
PSO | 1991.3796 | 1721.6508 | 2448.1993 | 192.7566 | 5
CPSO | 1860.5699 | 1718.6937 | 2417.5543 | 305.7752 | 4
ABC | 1718.5496 | 1718.2939 | 1718.9832 | 0.1955 | 3
CABC | 1718.4434 | 1718.2544 | 1720.3302 | 0.5488 | 2
AMO | 2062.1954 | 1974.4561 | 2199.1275 | 54.8921 | 6
IAMO | 1718.2540 | 1718.2538 | 1718.2540 | 1.0990e-05 | 1

Table 2: Results obtained by the algorithms for 20 different runs on Art2 data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 644.2100 | 514.6623 | 896.4104 | 181.1884 | 7
PSO | 514.7965 | 513.9191 | 523.8308 | 2.3765 | 5
CPSO | 513.9046 | 513.9035 | 513.9085 | 1.3621e-03 | 4
ABC | 513.9037 | 513.9035 | 513.9045 | 3.2720e-04 | 3
CABC | 513.9037 | 513.9035 | 513.9696 | 1.5235e-04 | 2
AMO | 525.0879 | 516.1389 | 546.2325 | 8.4842 | 6
IAMO | 513.9035 | 513.9035 | 513.9035 | 9.3111e-06 | 1

Tables 1-10 show that IAMO is more precise than the other algorithms in solving the ten data sets. As seen from the results, the IAMO algorithm provides the best values and small standard deviations in comparison with the other methods. For the Art1 and Art2 data sets in Tables 1 and 2, which were randomly generated, IAMO obtained the best mean and the smallest standard deviation compared to the other algorithms.


Table 3: Results obtained by the algorithms for 20 different runs on Iris data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 102.8412 | 99.4582 | 123.4678 | 8.7805 | 7
PSO | 98.3345 | 96.6567 | 104.2224 | 2.2431 | 5
CPSO | 96.9721 | 96.6580 | 97.5211 | 2.966e-01 | 4
ABC | 96.6659 | 96.6566 | 96.7547 | 2.1388e-02 | 3
CABC | 96.6561 | 96.6555 | 96.6599 | 1.1685e-03 | 2
AMO | 99.0055 | 97.0751 | 100.5484 | 1.1202 | 6
IAMO | 96.6555 | 96.6555 | 96.6555 | 1.2155e-06 | 1

Table 4: Results obtained by the algorithms for 20 different runs on TAE data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 1533.5441 | 1503.9418 | 1605.1206 | 29.7491 | 7
PSO | 1501.5895 | 1490.2455 | 1529.2453 | 16.3170 | 6
CPSO | 1499.8073 | 1492.1980 | 1532.4523 | 17.4859 | 5
ABC | 1491.4434 | 1490.9775 | 1492.4754 | 0.5128 | 3
CABC | 1491.3099 | 1490.9276 | 1497.3575 | 2.7356 | 2
AMO | 1499.0215 | 1493.3564 | 1509.4512 | 6.8524 | 4
IAMO | 1491.0900 | 1490.9321 | 1492.5707 | 0.4482 | 1

Table 5: Results obtained by the algorithms for 20 different runs on wine data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 17853.2412 | 16374.4353 | 18484.3458 | 1003.6327 | 7
PSO | 16307.0584 | 16302.8364 | 16314.0563 | 5.8844 | 5
CPSO | 16304.9829 | 16299.0526 | 16312.1291 | 5.9887 | 3
ABC | 16305.6668 | 16299.1970 | 16321.9535 | 9.8439 | 4
CABC | 16292.1982 | 16292.1858 | 16292.2094 | 1.0563e-02 | 2
AMO | 16359.7965 | 16319.9935 | 16400.5533 | 30.0025 | 6
IAMO | 16292.1855 | 16292.1849 | 16292.1862 | 5.0627e-04 | 1

The mean value obtained by IAMO in solving Art1 is 1718.2540, while ABC and CABC obtained 1718.5496 and 1718.4434, and the standard deviation of IAMO is 4 orders of magnitude better than those of ABC and CABC. Similarly, in solving Art2, IAMO obtained 513.9035 while CPSO, ABC, and CABC obtained 513.9046, 513.9037, and 513.9037, respectively, but the standard deviation of IAMO is at least 2 orders of magnitude better than theirs. For the Iris data set, the mean value, the optimum value, and the worst value of IAMO are all 96.6555, and the standard deviation is 1.2155e-06, which reveals the robustness of IAMO. CABC also found the best solution 96.6555, but its standard deviation is bigger than that of IAMO, while the best solutions of AMO, PSO, CPSO, ABC, and k-means are 97.0751, 96.6567, 96.6580, 96.6566, and 99.4582, respectively. Table 4 shows the results of the algorithms on the TAE data set.

Table 6: Results obtained by the algorithms for 20 different runs on seeds data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 313.4977 | 313.1428 | 313.7343 | 2.6879e-01 | 5
PSO | 326.5250 | 318.3185 | 335.2944 | 6.0131 | 7
CPSO | 312.1138 | 311.9116 | 312.3788 | 0.2899 | 4
ABC | 312.0382 | 311.8520 | 312.2110 | 6.7210e-02 | 3
CABC | 311.7980 | 311.7980 | 311.7982 | 1.4865e-04 | 2
AMO | 319.3922 | 313.8100 | 327.9267 | 3.1572 | 6
IAMO | 311.7980 | 311.7980 | 311.7980 | 3.3686e-05 | 1

Table 7: Results obtained by the algorithms for 20 different runs on StatLog (heart) data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 10695.7851 | 10682.5524 | 10703.8544 | 8.2080 | 6
PSO | 10760.0684 | 10644.8965 | 11024.2641 | 25.9460 | 7
CPSO | 10651.1354 | 10624.2168 | 10747.7609 | 55.0264 | 4
ABC | 10627.4760 | 10626.7154 | 10629.6472 | 1.0354 | 3
CABC | 10622.9904 | 10622.9824 | 10623.6762 | 1.7830e-02 | 2
AMO | 10675.1758 | 10658.6325 | 10696.1824 | 17.6918 | 5
IAMO | 10622.9824 | 10622.9824 | 10622.9825 | 5.0093e-05 | 1

Table 8: Results obtained by the algorithms for 20 different runs on Haberman's survival data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 2640.1277 | 2610.5245 | 3180.5211 | 126.7500 | 7
PSO | 2567.4479 | 2566.9899 | 2569.1620 | 0.9587 | 6
CPSO | 2567.3233 | 2566.9889 | 2567.8257 | 0.4578 | 5
ABC | 2566.9892 | 2566.9888 | 2566.9895 | 2.3919e-04 | 2
CABC | 2566.9903 | 2566.9888 | 2566.9955 | 2.8709e-03 | 3
AMO | 2566.9985 | 2566.9907 | 2567.0096 | 9.1598e-03 | 4
IAMO | 2566.9888 | 2566.9888 | 2566.9888 | 2.3022e-06 | 1

The mean value of IAMO is 1491.0900, which is smaller than those of AMO, PSO, CPSO, ABC, CABC, and k-means within 20 runs. For the wine data set, IAMO reached the mean value 16292.1855, while CABC reached the mean value 16292.1982. The best and worst values of IAMO are 16292.1849 and 16292.1862, which are also better than the 16292.1858 and 16292.2094 obtained by CABC, and the standard deviation of IAMO is also the smallest one.


Table 9: Results obtained by the algorithms for 20 different runs on balance scale data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 1426.7522 | 1423.8570 | 1433.8423 | 3.1208 | 5
PSO | 1430.1546 | 1426.4237 | 1447.6403 | 9.3451 | 7
CPSO | 1424.8260 | 1423.5525 | 1425.5460 | 0.9048 | 4
ABC | 1423.9238 | 1423.8308 | 1425.9821 | 4.5452e-02 | 3
CABC | 1423.9109 | 1423.8206 | 1425.2445 | 1.4053e-02 | 2
AMO | 1429.1499 | 1427.9670 | 1430.8177 | 1.1641 | 6
IAMO | 1423.8204 | 1423.8204 | 1423.8204 | 4.9763e-06 | 1

Table 10: Results obtained by the algorithms for 20 different runs on cancer data.

Algorithm | Mean | Best | Worst | Standard deviation | Rank
k-means | 2981.2564 | 2976.4687 | 2988.4277 | 4.8661 | 5
PSO | 3001.2685 | 2969.7475 | 3114.3658 | 63.9370 | 6
CPSO | 2964.8268 | 2964.4167 | 2965.2941 | 0.3773 | 3
ABC | 2965.5369 | 2964.9734 | 2966.4785 | 0.8081 | 4
CABC | 2964.4138 | 2964.3870 | 2964.5222 | 6.0177e-02 | 2
AMO | 3001.6328 | 2974.0974 | 3050.9651 | 32.7322 | 7
IAMO | 2964.3870 | 2964.3870 | 2964.3870 | 2.2260e-05 | 1

Table 6 provides the results of the algorithms on the seeds data set; the IAMO algorithm and the CABC algorithm are superior to the others. Although IAMO and CABC reached the same mean value 311.7980, the standard deviation of IAMO is 1 order of magnitude better than that of CABC. On the StatLog (heart) data set results given in Table 7, IAMO gets the best value 10622.9824, the same as CABC, while the mean values of the two algorithms are 10622.9824 and 10622.9904, so IAMO is better than the CABC algorithm. For Haberman's survival data set, the optimum value 2566.9888 can be obtained by IAMO, ABC, and CABC, but the standard deviations of ABC and CABC are 2.3919e-04 and 2.8709e-03, which are worse than the 2.3022e-06 obtained by IAMO. The standard deviation of PSO is a little bigger than that of CPSO. For the balance scale data set in Table 9, as seen from the results, the mean, best, and worst values of IAMO are all 1423.8204, which reflects the stable characteristics of IAMO. The three best algorithms on this test data are IAMO, CABC, and ABC, and their best results are 1423.8204, 1423.8206, and 1423.8308. For the Wisconsin breast cancer data set in Table 10, the mean value, the best value, and the worst value of IAMO are all 2964.3870, which is obviously superior to k-means, PSO, CPSO, ABC, and AMO.

Figure 5: The convergence curve of the Art1 data (fitness value versus evolvement generation for PSO, CPSO, ABC, CABC, AMO, and IAMO).

Figure 6: The convergence curve of the Art2 data.

As seen from Tables 1 to 10, we can conclude that, although the convergence rate of IAMO is not quick enough at the beginning of the iterations compared to ABC and CABC, its final results are the best compared to the other algorithms on all test data sets. Most results of ABC and CABC are better than those of PSO and CPSO, and the k-means algorithm is the worst for most of the test data sets.

Figures 5, 6, 7, 8, 9, 10, 11, 12, 13, and 14 show the convergence curves of the different data sets for the various algorithms. Figures 15 and 16 show the original data distribution of the Iris data set and the clustering result obtained by the IAMO algorithm.


Figure 7: The convergence curve of the Iris data.

Figure 8: The convergence curve of the TAE data.

7. Living Area Radius Evaluation

The performance and results of the proposed algorithm are greatly affected by the size of the living area. At the beginning of the iterations, a big value of R improves the exploration ability of the algorithm, and at the end of the iterations a small value of R improves its exploitation ability. We adopted a fixed shrinkage coefficient ρ = 0.92 to change the living area radius after each iteration, as shown in formula (6). To study the extent to which R impacts the proposed algorithm, we selected the Art1 data set and the Iris data set and used different values of ρ to evaluate the performance of the proposed algorithm.

Figure 9: The convergence curve of the wine data.

Figure 10: The convergence curve of the seeds data.

Figure 17 shows the results of an experiment on Art1: we can conclude that choosing ρ between 0.6 and 0.9 gives a better convergence precision than ρ = 0.99 or ρ = 0.40. If we choose ρ = 0.40, the IAMO algorithm plunges into local optima, and if we choose ρ = 0.99, the IAMO algorithm has a very low convergence rate. Likewise, in Figure 18, for the Iris test data set, the IAMO algorithm quickly converges to the global optimum before 30 iterations if we choose ρ = 0.80, while IAMO cannot escape from poor local optima to the global optimum if we choose ρ = 0.70, ρ = 0.60, or ρ = 0.40.


Figure 11: The convergence curve of the heart data.

Figure 12: The convergence curve of the survival data.

So the best ρ for solving the Iris data set must lie between 0.7 and 0.99.

The results suggest that a proper ρ can greatly improve the convergence velocity and convergence precision of the algorithm, while an improper ρ may lead IAMO to fall into a local optimum.

8. Conclusions

In this paper, to improve the deficiencies of the AMO algorithm, we improved the algorithm by using a new migration method based on shrinking the animals' living area.

Figure 13: The convergence curve of the balance scale data.

Figure 14: The convergence curve of the cancer data.

Simulations on 10 typical standard test data sets show that the IAMO algorithm generally has strong global searching ability and local optimization ability and can effectively avoid the deficiency of conventional algorithms of easily falling into local optima. IAMO improves the convergence precision of AMO and ranks first on all test data sets; therefore, it is very practical and effective for solving clustering problems. Finally, how to define a proper and unified radius of the living area needs to be considered in subsequent work.


Figure 15: The Iris data distribution.

Figure 16: The Iris data clustering result.

Figure 17: The convergence curves of the Art1 data with different ρ (ρ = 0.99, 0.90, 0.80, 0.70, 0.60, 0.40).

Figure 18: The convergence curves of the Iris data with different ρ (ρ = 0.99, 0.90, 0.80, 0.70, 0.60, 0.40).

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Foundation of China under Grants nos. 61165015 and 61463007, the Key Project of Guangxi Science Foundation under Grant no. 2012GXNSFDA053028, and the Key Project of Guangxi High School Science Foundation under Grant no. 20121ZD008.

References

[1] R. B. Cattell, "The description of personality: basic traits resolved into clusters," Journal of Abnormal and Social Psychology, vol. 38, no. 4, pp. 476-506, 1943.

[2] K. R. Zalik, "An efficient k-means clustering algorithm," Pattern Recognition Letters, vol. 29, no. 8, pp. 1385-1391, 2008.

[3] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means: a data clustering algorithm," Tech. Rep. HPL-1999-124, Hewlett-Packard Laboratories, 1999.

[4] X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2008.

[5] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, December 2009.

[6] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization, vol. 284 of Studies in Computational Intelligence, pp. 65-74, Springer, Berlin, Germany, 2010.

[7] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.

[8] R. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39-43, Nagoya, Japan, October 1995.

[9] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239-1255, 2013.

[10] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17-35, 2013.

[11] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.

[12] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183-197, 2010.

[13] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512-519, 2009.

[14] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754-1762, 2008.

[15] T. Niknam, J. Olamaei, and B. Amiri, "A hybrid evolutionary algorithm based on ACO and SA for cluster analysis," Journal of Applied Sciences, vol. 8, no. 15, pp. 2695-2702, 2008.

[16] T. Niknam, B. Bahmani Firouzi, and M. Nayeripour, "An efficient hybrid evolutionary algorithm for cluster analysis," World Applied Sciences Journal, vol. 4, no. 2, pp. 300-307, 2008.

[17] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187-195, 2004.

[18] Y. Kao and K. Cheng, An ACO-Based Clustering Algorithm, Springer, Berlin, Germany, 2006.

[19] M. Omran, A. P. Engelbrecht, and A. Salman, "Particle swarm optimization method for image clustering," International Journal of Pattern Recognition and Artificial Intelligence, vol. 19, no. 3, pp. 297-321, 2005.

[20] D. Karaboga and C. Ozturk, "A novel clustering approach: Artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652-657, 2011.

[21] K. E. Voges and N. K. L. Pope, "Rough clustering using an evolutionary algorithm," in Proceedings of the 45th Hawaii International Conference on System Sciences (HICSS '12), pp. 1138-1145, IEEE, January 2012.

[22] A. Colorni, M. Dorigo, and V. Maniezzo, Distributed Optimization by Ant Colonies, Elsevier Publishing, Paris, France, 1991.

[23] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215-220, Canberra, Australia, December 2003.

[24] E. H. L. Aarts and J. H. Korst, Simulated Annealing and Boltzmann Machines, John Wiley & Sons, 1989.

[25] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University Press, Erciyes, Turkey, 2005.

[26] X. Chen, Y. Zhou, and Q. Luo, "A hybrid monkey search algorithm for clustering analysis," The Scientific World Journal, vol. 2014, Article ID 938239, 16 pages, 2014.

[27] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867-1877, 2014.

[28] J. MacQueen, "Some methods for classification and analysis of multivariate observations," in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pp. 281-297, University of California Press, Berkeley, Calif, USA, 1967.

[29] X. Chen and J. Zhang, "Clustering algorithm based on improved particle swarm optimization," Journal of Computer Research and Development, pp. 287-291, 2012.

[30] X. Liu, Q. Sha, Y. Liu, and X. Duan, "Analysis of classification using particle swarm optimization," Computer Engineering, vol. 32, no. 6, pp. 201-213, 2006.

[31] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.

[32] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.

[33] C. L. Blake and C. J. Merz, UCI Repository of Machine Learning Databases, http://archive.ics.uci.edu/ml/datasets.html.

[34] E. Anderson, "The irises of the Gaspé Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2-5, 1935.

[35] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, part 2, pp. 179-188, 1936.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 2: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

2 Discrete Dynamics in Nature and Society

of the 119896-means algorithm Nelder-Mead simplex search andPSO for clustering analysis [14] in 2008 Niknam et al havepresented a hybrid evolutionary algorithm based on PSO andSA (simulated annealing algorithm 1989 [24]) to solve theclustering problem [13] in 2009 Zou et al have proposeda cooperative artificial bee colony algorithm to solve theclustering problem and experiment on synthetic and real lifedata sets to evaluate the performance [11] in 2010 Niknamand Amiri have proposed an efficient hybrid approach basedon PSO ACO and 119896-means called PSO-ACO-K approachfor clustering analysis [12] in 2010 The artificial bee colony(ABC) algorithm is described by Karaboga [25] in 2005and it has been adopted to solve clustering problem byKaraboga and Ozturk [20] in 2011 Voges and Pope haveused an evolutionary-based rough clustering algorithm forthe clustering problem [21] in 2012 Chen et al have usedmonkey search algorithm for clustering analysis [26] in 2014

Animal migration algorithm (AMO) is a new bioinspiredintelligent optimization algorithm by simulating animalmigration behavior proposed by Li et al [27] in 2013 AMOsimulates the widespread migration phenomenon in the ani-mal kingdom through the change of position replacementof individual and finding the optimal solution graduallyAMO has obtained good experimental results on manyoptimization problems This paper presents an algorithmto improve the performance of AMO We proposed a newmigration method to modify the performance of AMO themigration process based on shrinking animals living areaoperator this method guarantees AMO rapid convergenceto global optimum By means of selecting the better solutionspace around the current solution it improves search abilityand accelerates convergence velocity and it has more chanceto find the global optima

The structure of the paper is as follows In Section 2 thetraditional method 119896-means for clustering is presented InSection 3 the original AMO algorithm is introduced Sec-tion 4 describes our proposed novel approach of migrationprocess Section 5 elaborates the improved AMO and somebiological foundations of animal behaviors are explainedSection 6 illustrates experiments and discusses the resultsSection 7 studies the extent of different size of shrinkagecoefficient impact of the proposed algorithm At the endof the paper we conclude it with future directions anddevelopments with the improved AMO

2 The 119896-Means Clustering Algorithm

The target of data clustering is grouping data into a number ofclusters 119896-means is one of the simplest unsupervised learningalgorithms that solve the clustering problem It is proposedby MacQueen in 1967 [28] The procedure follows a simpleand easy way to classify a given data set 119863 = 119909

1 1199092 119909

119899

through a certain number of clusters 1198661 1198662 119866

119870(assume

119870 clusters) fixed a priori each data vector is a 119901-dimensionalvector satisfying the following conditions [29 30]

(1) 119866119894= 119894 = 1 2 119870

(2) 119866119894cap 119866119895= 119894 119895 = 1 2 119870 119894 = 119895

(3) ⋃119870

119894=1119866119894= 1199091 1199092 119909

119899

The k-means clustering algorithm is as follows.

(1) Set the number of clusters $K$ and the data set $D = \{x_1, x_2, \ldots, x_n\}$.

(2) Randomly choose $K$ points $c_1, c_2, \ldots, c_K$ from $\{x_1, x_2, \ldots, x_n\}$ as the cluster centroids.

(3) Assign each object $x_i$ to the group that has the closest centroid. The principle of division is as follows: if $d(x_i, c_i) < d(x_i, c_k)$ for all $k = 1, 2, \ldots, K$ with $k \neq i$, the data point $x_i$ is assigned to the classified collection $G_i$.

(4) When all objects have been assigned, recalculate the positions of the $K$ centroids $c_1^*, c_2^*, \ldots, c_K^*$:

$$c_i^* = \frac{1}{|G_i|} \sum_{x_j \in G_i} x_j, \quad i = 1, 2, \ldots, K, \qquad (1)$$

where $|G_i|$ is the number of points in the classified collection $G_i$.

(5) Repeat Steps (3) and (4) until the centroids no longer move.

The main idea is to define $K$ centroids, one for each cluster. These centroids should be placed in a cunning way, because different locations cause different results; the better choice is to place them as far away from each other as possible. In this study we use the Euclidean metric as the distance measure, given as follows:

$$d(x_i, c_j) = \sqrt{\sum_{k=1}^{p} (x_{ik} - c_{jk})^2}. \qquad (2)$$

Finally, this algorithm aims at minimizing an objective function, in this case a squared error function. The objective function is as follows:

$$f(X, C) = \sum_{i=1}^{n} \min \left\{ \|x_i - c_k\|^2 \;\middle|\; k = 1, 2, \ldots, K \right\}. \qquad (3)$$
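To make the procedure above concrete, a minimal sketch of k-means in Python/NumPy follows (the paper's experiments use MATLAB; the function and variable names here are ours, for illustration only):

```python
import numpy as np

def kmeans(data, K, max_iter=100, seed=0):
    """Basic k-means, Steps (1)-(5); data is an (n, p) array."""
    rng = np.random.default_rng(seed)
    # Step (2): pick K distinct data points as the initial centroids.
    centroids = data[rng.choice(len(data), size=K, replace=False)]
    for _ in range(max_iter):
        # Step (3): assign each point to its nearest centroid, Eq. (2).
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step (4): move each centroid to the mean of its cluster, Eq. (1).
        new_centroids = np.array([
            data[labels == k].mean(axis=0) if np.any(labels == k) else centroids[k]
            for k in range(K)
        ])
        # Step (5): stop when the centroids no longer move.
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    # Objective value of Eq. (3): squared distance of each point to its nearest centroid.
    dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    sse = (dists.min(axis=1) ** 2).sum()
    return centroids, labels, sse
```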

3. Animal Migration Optimization (AMO)

The animal migration algorithm can be divided into an animal migration process and a population updating process. In the migration process, the algorithm simulates how groups of animals move from the current position to a new position. During the population updating process, the algorithm simulates how animals are replaced by a probabilistic method.

3.1. Animal Migration Process. During the animal migration process, an animal should obey three rules: (1) avoid collisions with your neighbors; (2) move in the same direction as your neighbors; and (3) remain close to your neighbors. In order to define the concept of the local neighborhood of an individual, we use a topological ring, as illustrated in Figure 1. For the sake of simplicity, we set the length of the neighborhood to be five for each dimension of the individual. Note that in our algorithm the neighborhood topology is static and is defined on the set of indices of vectors.


If the index of an animal is $i$, then its neighborhood consists of the animals having indices $i-2$, $i-1$, $i$, $i+1$, $i+2$; if the index of an animal is 1, the neighborhood consists of the animals having indices $NP-1$, $NP$, 1, 2, 3, and so forth. Once the neighborhood topology has been constructed, we select one neighbor randomly and update the position of the individual according to this neighbor, as can be seen in the following formula:

$$X_{i,G+1} = X_{i,G} + \delta \cdot (X_{\text{neighborhood},G} - X_{i,G}), \qquad (4)$$

where $X_{\text{neighborhood},G}$ is the current position of the selected neighbor, $\delta$ is produced by a random number generator controlled by a Gaussian distribution, $X_{i,G}$ is the current position of the $i$th individual, and $X_{i,G+1}$ is the new position of the $i$th individual.
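A sketch of this migration step in Python/NumPy is shown below (ring neighborhood of length five, Eq. (4)); the function name and population shapes are ours:

```python
import numpy as np

def amo_migration(X, rng):
    """One AMO migration pass over a population X of shape (NP, D):
    each animal moves toward a randomly chosen ring neighbor, Eq. (4)."""
    NP, _ = X.shape
    X_new = X.copy()
    for i in range(NP):
        # Ring neighborhood {i-2, i-1, i, i+1, i+2}, wrapping at the ends.
        neighborhood = [(i + off) % NP for off in (-2, -1, 0, 1, 2)]
        j = rng.choice(neighborhood)
        delta = rng.normal()  # Gaussian-distributed step factor
        X_new[i] = X[i] + delta * (X[j] - X[i])
    return X_new

rng = np.random.default_rng(0)
X = amo_migration(rng.uniform(-5, 5, size=(10, 3)), rng)  # 10 animals, D = 3
```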

3.2. Population Updating Process. During the population updating process, the algorithm simulates how some animals leave the group and some join the new population. Individuals are replaced by new animals with a probability $P_a$, which is assigned according to the quality of the fitness. We sort the fitness in descending order, so the probability for the individual with the best fitness is $1/NP$ and that for the individual with the worst fitness, by contrast, is 1; the process is shown in Algorithm 1.

In Algorithm 1, $r_1, r_2 \in [1, NP]$ are randomly chosen integers with $r_1 \neq r_2 \neq i$. After producing the new solution $X_{i,G+1}$, it is evaluated and compared with $X_{i,G}$, and we choose the individual with the better objective fitness:

$$X_i = \begin{cases} X_{i,G}, & \text{if } f(X_{i,G}) \text{ is better than } f(X_{i,G+1}), \\ X_{i,G+1}, & \text{otherwise}. \end{cases} \qquad (5)$$
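A sketch of this updating step in Python/NumPy follows (Algorithm 1 plus the greedy selection of Eq. (5)); treating fitness as a cost to be minimized is our own convention:

```python
import numpy as np

def amo_population_update(X, fitness, rng):
    """Population updating (Algorithm 1) with the selection of Eq. (5)."""
    NP, D = X.shape
    costs = np.array([fitness(x) for x in X])
    best = X[costs.argmin()]
    # Rank-based replacement probability: 1/NP for the best animal, 1 for the worst.
    Pa = np.empty(NP)
    Pa[costs.argsort()] = np.arange(1, NP + 1) / NP
    X_new = X.copy()
    for i in range(NP):
        r1, r2 = rng.choice([k for k in range(NP) if k != i], size=2, replace=False)
        for j in range(D):
            if rng.random() > Pa[i]:
                X_new[i, j] = (X[r1, j]
                               + rng.random() * (best[j] - X[i, j])
                               + rng.random() * (X[r2, j] - X[i, j]))
        if fitness(X_new[i]) >= costs[i]:  # Eq. (5): keep the parent if not improved
            X_new[i] = X[i]
    return X_new
```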

4. The New Migration Process Method

The AMO algorithm uses the migration process and the population updating process to find a satisfactory solution. The proposed algorithm uses a new migration process: a living area is established around the leader animal (the individual with the best fitness value), and animals migrate from their current locations into this new living area, simulating the animal migration process.

At first there are $NP$ animals living in the living area, as shown in Figure 2(a), moving, eating, drinking, reproducing, and so on; some individuals move randomly, their positions are updated, and then we calculate the best position of the animals by the fitness function and record it. But the amount of food or water gradually diminishes as time wears on, as shown in Figure 2(b), and some animals migrate from the current areas, which no longer have food and water, to a new area with abundant food and water, as shown in Figure 2(c). In Figure 2 the green parts represent the living areas with abundant food and water, where animals can live; the yellow parts represent the areas lacking food or water, where animals can no longer live and from which they must migrate to the new living area (the green parts in Figure 2(c)). We shrink the living area after a period of time (compare Figures 2(a) and 2(c)), and the animals then migrate ceaselessly into the new living area. As a rule of thumb, the globally optimal solution is always near the current best solution; in IAMO the animals' living area becomes smaller and smaller (by formula (6)) after each iteration, and the individuals get closer and closer to the globally optimal solution, so we can accelerate the convergence velocity and improve the precision of the algorithm to some extent.

The boundary of the living area is established by

$$\text{low} = X_{\text{best}} - R, \qquad \text{up} = X_{\text{best}} + R, \qquad R = \rho \cdot R, \qquad (6)$$

where $X_{\text{best}}$ is the leader animal (the current best solution), low and up are the lower and upper bounds of the living area, $R$ is the living-area radius, and $\rho \in (0, 1)$ is the shrinkage coefficient; low, up, and $R$ are all $1 \times D$ row vectors. In general, the original value of $R$ depends on the size of the search space. As iterations go on, a big value of $R$ improves the exploration ability of the algorithm, and a small value of $R$ improves the exploitation ability of the algorithm.
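A sketch of this living-area update (formula (6)) in Python/NumPy, with names of our own choosing:

```python
import numpy as np

def shrink_living_area(x_best, R, rho=0.92):
    """Re-center the living area on the current best solution and shrink
    its radius, formula (6); x_best and R are 1 x D arrays."""
    R = rho * R           # shrink the radius by the coefficient rho
    low = x_best - R      # lower bound of the new living area
    up = x_best + R       # upper bound of the new living area
    return low, up, R
```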

5. The IAMO Algorithm for Solving the Clustering Problem

5.1. Initializing the Population. During the initialization process, the algorithm begins by initializing a set of $NP$ animal positions $X_1, X_2, X_3, \ldots, X_{NP}$; each animal position $X_i$ is a $1 \times (K \times D)$-dimensional vector, where $K$ is the number of clustering centers and $D$ is the dimension of the test set $G_{N \times D}$. The cluster centers are $x_i^* = (x_{i1}, x_{i2}, \ldots, x_{iD})$ $(i = 1, 2, \ldots, K)$; each center $x_i^*$ is a $1 \times D$-dimensional vector. The lower bound of the centers is the minimum of each column in the test set $G_{N \times D}$, namely $a_i^* = \min\{G_1, G_2, \ldots, G_D\}$, and the upper bound of the centers is $b_i^* = \max\{G_1, G_2, \ldots, G_D\}$. So we can initialize the position of an individual as $X_i = \{x_1^*, x_2^*, \ldots, x_K^*\} = \{(x_{11}, x_{12}, \ldots, x_{1D}), (x_{21}, x_{22}, \ldots, x_{2D}), \ldots, (x_{K1}, x_{K2}, \ldots, x_{KD})\} = \{x_{11}, x_{12}, \ldots, x_{KD}\}$, and then the lower and upper bounds of the solution space are $a = (a_1^*, a_2^*, \ldots, a_K^*)$ and $b = (b_1^*, b_2^*, \ldots, b_K^*)$.

Animals are randomly and uniformly distributed between the prespecified lower initial parameter bound $a$ and the upper initial parameter bound $b$, so the $j$th component of the $i$th vector is

$$x_{ij} = a_j + \text{rand}_{ij}[0, 1] \cdot (b_j - a_j), \quad i = 1, \ldots, NP, \; j = 1, \ldots, K \times D, \qquad (7)$$

where $\text{rand}_{ij}[0, 1]$ is a uniformly distributed random number between 0 and 1.
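A sketch of this initialization (formula (7)) for the clustering encoding in Python/NumPy; the helper name is ours:

```python
import numpy as np

def init_population(data, K, NP, rng):
    """Each animal encodes K candidate cluster centers, flattened into one
    1 x (K*D) vector drawn uniformly between the column bounds, Eq. (7)."""
    a = np.tile(data.min(axis=0), K)  # lower bound a: column minima, repeated K times
    b = np.tile(data.max(axis=0), K)  # upper bound b: column maxima, repeated K times
    X = a + rng.random((NP, a.size)) * (b - a)
    return X, a, b
```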

5.2. Animals Migration. During the migration process, because animals hunt, forage, or drink in the living area, some parts of the living area come to lack food or water, or the climate conditions change, and some animals migrate from the current living area to the new living area, which has abundant food and water or climate conditions suitable for living. We assume that there is only one living area, and animals outside the new living area migrate into it, as depicted in Section 4. We calculate the


(1) For $i = 1$ to $NP$ do
(2)   For $j = 1$ to $D$ do
(3)     If rand $> P_a$ then
(4)       $X_{i,G+1} = X_{r_1,G} + \text{rand} \cdot (X_{\text{best},G} - X_{i,G}) + \text{rand} \cdot (X_{r_2,G} - X_{i,G})$
(5)     End If
(6)   End For
(7) End For

Algorithm 1: Population updating process.

(1) Begin
(2) Set the generation counter $G$, the living-area radius $R$, and the shrinkage coefficient $\rho$; randomly initialize $X_i$ with a population of $NP$ animals in the solution space
(3) Evaluate the fitness of each individual $X_i$; record the best individual $X_{\text{best}}$
(4) While the stopping criterion is not satisfied do
(5)   Establish a new living area by low $= X_{\text{best}} - R$, up $= X_{\text{best}} + R$
(6)   Animals migrate into the new living area
(7)   For $i = 1$ to $NP$ do
(8)     For $j = 1$ to $D$ do
(9)       Randomly select $r_1 \neq r_2 \neq i$
(10)      If rand $> P_a$ then
(11)        $X_{i,G+1} = X_{r_1,G} + \text{rand} \cdot (X_{\text{best},G} - X_{i,G}) + \text{rand} \cdot (X_{r_2,G} - X_{i,G})$
(12)      End If
(13)    End For
(14)  End For
(15)  For $i = 1$ to $NP$ do
(16)    Evaluate the offspring $X_{i,G+1}$
(17)    If $X_{i,G+1}$ is better than $X_i$ then
(18)      $X_i = X_{i,G+1}$
(19)    End If
(20)  End For
(21)  Memorize the best solution achieved so far
(22)  $R = R \cdot \rho$
(23) End While
(24) End

Algorithm 2: An improved animal migration optimization algorithm (IAMO).
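One plausible end-to-end reading of Algorithm 2 in Python/NumPy is sketched below. The out-of-area migration rule used here (uniform re-sampling inside the new living area) and all names are our interpretation, not code from the paper:

```python
import numpy as np

def iamo(fitness, a, b, NP=100, max_gen=100, rho=0.92, rng=None):
    """IAMO sketch (Algorithm 2); `fitness` is a cost to be minimized,
    and a, b are 1-D lower/upper bounds of the solution space."""
    rng = rng or np.random.default_rng()
    D = a.size
    X = a + rng.random((NP, D)) * (b - a)      # initialization, Eq. (7)
    costs = np.array([fitness(x) for x in X])
    best, best_cost = X[costs.argmin()].copy(), costs.min()
    R = 0.3 * (b - a)                           # initial living-area radius
    for _ in range(max_gen):
        # Living area around the leader (formula (6)); out-of-area animals
        # migrate in (here: re-sampled uniformly inside the area).
        low, up = best - R, best + R
        outside = (X < low) | (X > up)
        X = np.where(outside, low + rng.random((NP, D)) * (up - low), X)
        costs = np.array([fitness(x) for x in X])
        # Rank-based replacement probability: 1/NP for the best, 1 for the worst.
        Pa = np.empty(NP)
        Pa[costs.argsort()] = np.arange(1, NP + 1) / NP
        for i in range(NP):
            r1, r2 = rng.choice([k for k in range(NP) if k != i], size=2, replace=False)
            trial = X[i].copy()
            mask = rng.random(D) > Pa[i]
            trial[mask] = (X[r1, mask]
                           + rng.random(mask.sum()) * (best[mask] - X[i, mask])
                           + rng.random(mask.sum()) * (X[r2, mask] - X[i, mask]))
            c = fitness(trial)
            if c < costs[i]:                    # greedy selection, Eq. (5)
                X[i], costs[i] = trial, c
        if costs.min() < best_cost:             # memorize the best solution so far
            best, best_cost = X[costs.argmin()].copy(), costs.min()
        R = rho * R                             # shrink the living area
    return best, best_cost
```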

distance between the cluster centers $x_1^*, x_2^*, \ldots, x_K^*$ and the test data set; then we classify the test data set into $K$ categories according to this distance, and finally we obtain the fitness according to the fitness function

$$f(X, G) = \sum_{i=1}^{N} \min \left\{ \|G_i - x_k\|^2 \;\middle|\; k = 1, 2, \ldots, K \right\}. \qquad (8)$$

According to the fitness function, we obtain the best individual $X_{\text{best}}$, and the new living area can be established from $X_{\text{best}}$ and $R$.
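A sketch of this fitness evaluation (Eq. (8)) in Python/NumPy, reusable as the `fitness` argument of the `iamo` sketch above; the decoding convention is ours:

```python
import numpy as np

def clustering_fitness(position, data, K):
    """Decode one animal into K cluster centers and sum each data point's
    squared distance to its nearest center, Eq. (8)."""
    centers = position.reshape(K, -1)  # K centers, each of dimension D
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).sum()

# Usage on a data set G (N x D), clustered into K groups:
# a, b = np.tile(G.min(axis=0), K), np.tile(G.max(axis=0), K)
# best, cost = iamo(lambda x: clustering_fitness(x, G, K), a, b)
```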

5.3. Individuals in Population Updating. During the population updating process, the algorithm simulates how some animals are preyed on by their enemies, some animals leave the group and some join it from other groups, and some new animals are born. In IAMO we assume that the number of available animals is fixed, and every animal is replaced with probability $P_a$, as shown in Section 3.2.

The specific implementation steps of the improved animal migration optimization algorithm (IAMO) are shown in Algorithm 2.

6. Numerical Simulation Experiments

All of the algorithms were programmed in MATLAB R2008a, and the numerical experiments were run on an AMD Athlon(tm) II X4 640 processor with 2 GB of memory.

The experimental results compare the IAMO clustering algorithm with six typical stochastic algorithms, namely, the PSO [31], CPSO [32], ABC [20], CABC [11], AMO [27], and k-means algorithms, on two artificial data sets and eight real-life data sets (Iris, teaching assistant evaluation (TAE), wine, seeds, StatLog (heart), Hagerman's survival, balance scale, and Wisconsin breast cancer), which are selected from the UCI machine learning repository [33].


Figure 1: The concept of the neighborhood of an animal.

Artificial Data Set One ($N = 250$, $d = 3$, and $K = 5$). This is a three-featured problem with five classes, where every feature of the classes was distributed according to Class 1: Uniform(85, 100), Class 2: Uniform(70, 85), Class 3: Uniform(55, 70), Class 4: Uniform(40, 55), and Class 5: Uniform(25, 40) [12, 14]. The data set is illustrated in Figure 3.

Artificial Data Set Two ($N = 600$, $d = 2$, and $K = 4$). This is a two-featured problem with four unique classes. A total of 600 patterns were drawn from four independent bivariate normal distributions, where the classes were distributed according to

$$N_2\!\left(\mu = \begin{pmatrix} m_i \\ m_i \end{pmatrix},\; \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix}\right), \qquad (9)$$

where $i = 1, 2, 3, 4$, $m_1 = -3$, $m_2 = 0$, $m_3 = 3$, and $m_4 = 6$; $\mu$ and $\Sigma$ are the mean vector and covariance matrix, respectively [12, 14]. The data set is illustrated in Figure 4.

Iris Data ($N = 150$, $d = 4$, and $K = 3$). This data set, with 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, was collected by Anderson [34]. From each species there are 50 observations of sepal length, sepal width, petal length, and petal width in cm. This data set was used by Fisher [35] in his initiation of the linear-discriminant-function technique [11, 12, 33].

Teaching Assistant Evaluation ($N = 151$, $d = 5$, and $K = 3$). The data consist of evaluations of teaching performance over three regular semesters and two summer semesters of 151 teaching assistant (TA) assignments at the Statistics Department of the University of Wisconsin-Madison. The scores were divided into three roughly equal-sized categories ("low", "medium", and "high") to form the class variable [33].

Wine Data ($N = 178$, $d = 13$, and $K = 3$). This is the wine data set, also taken from the MCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there is no missing attribute value [11, 12, 33].

Seeds Data ($N = 210$, $d = 7$, and $K = 3$). This data set consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each variety there are 70 observations of area $A$, perimeter $P$, compactness $C$ ($C = 4\pi A/P^2$), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [33].

StatLog (Heart) Data ($N = 270$, $d = 13$, and $K = 2$). This data set is a heart disease database similar to a database already present in the repository (the heart disease databases) but in a slightly different form [33].

Hagerman's Survival ($N = 306$, $d = 3$, and $K = 2$). The data set contains cases from a study conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, together with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [33].

Balance Scale Data ($N = 625$, $d = 4$, and $K = 3$). This data set was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or stay balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is to compare (left distance × left weight) with (right distance × right weight); the greater product wins, and if they are equal, the scale is balanced [33].

Wisconsin Breast Cancer ($N = 683$, $d = 9$, and $K = 2$). It consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [11, 12, 33].

Here we set the parameters of AMO and IAMO as follows. The population size of AMO and IAMO is 100. In IAMO the original living-area radius is $R = 0.3(b - a)$ and the shrinkage coefficient is $\rho = 0.92$. For the PSO, the inertia weight is $w = 0.729$, the acceleration coefficients are $c_1 = 2$ and $c_2 = 2$, and the population size is $M = 100$. The population size of the CPSO is 20. The population sizes of the ABC and CABC are 50 and 10, respectively. In order to compare with the other algorithms, the maximum number of generations for all algorithms is 100.
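For reference, these reported settings can be collected in one place; a sketch, with key names of our own choosing:

```python
# Experimental settings as reported in the text (key names are ours).
SETTINGS = {
    "AMO":  {"pop_size": 100},
    "IAMO": {"pop_size": 100, "R0": "0.3 * (b - a)", "rho": 0.92},
    "PSO":  {"pop_size": 100, "w": 0.729, "c1": 2, "c2": 2},
    "CPSO": {"pop_size": 20},
    "ABC":  {"pop_size": 50},
    "CABC": {"pop_size": 10},
    "max_generations": 100,
    "runs_per_data_set": 20,
}
```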

For every data set, each algorithm is applied 20 times individually, each time with a random initial solution. For the Art1 and Art2 data sets, once the randomly generated parameters are determined, the same parameters are used to test the performance of the algorithms. We ranked each algorithm according to its mean result. The results are kept to four digits after the decimal point. The mean value, the best value, the worst value, the standard deviation, and the rank are recorded in Tables 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10.


Figure 2: Animal migration process. (a) The $G$th-iteration living area; (b) animals begin to migrate; (c) the $(G+1)$th-iteration living area.

Figure 3: The distribution image of Art1.

Figure 4: The distribution image of Art2.

Table 1: Results obtained by the algorithms for 20 different runs on Art1 data.

Algorithm | Mean      | Best      | Worst     | Standard deviation | Rank
k-means   | 2106.6059 | 1720.7627 | 2456.0468 | 358.8518           | 7
PSO       | 1991.3796 | 1721.6508 | 2448.1993 | 192.7566           | 5
CPSO      | 1860.5699 | 1718.6937 | 2417.5543 | 305.7752           | 4
ABC       | 1718.5496 | 1718.2939 | 1718.9832 | 0.1955             | 3
CABC      | 1718.4434 | 1718.2544 | 1720.3302 | 0.5488             | 2
AMO       | 2062.1954 | 1974.4561 | 2199.1275 | 54.8921            | 6
IAMO      | 1718.2540 | 1718.2538 | 1718.2540 | 1.0990e-05         | 1

Table 2: Results obtained by the algorithms for 20 different runs on Art2 data.

Algorithm | Mean     | Best     | Worst    | Standard deviation | Rank
k-means   | 644.2100 | 514.6623 | 896.4104 | 181.1884           | 7
PSO       | 514.7965 | 513.9191 | 523.8308 | 2.3765             | 5
CPSO      | 513.9046 | 513.9035 | 513.9085 | 1.3621e-03         | 4
ABC       | 513.9037 | 513.9035 | 513.9045 | 3.2720e-04         | 3
CABC      | 513.9037 | 513.9035 | 513.9696 | 1.5235e-04         | 2
AMO       | 525.0879 | 516.1389 | 546.2325 | 8.4842             | 6
IAMO      | 513.9035 | 513.9035 | 513.9035 | 9.3111e-06         | 1

Tables 1–10 show that IAMO is more precise than the other algorithms in solving the ten data sets. As seen from the results, the IAMO algorithm provides the best value and a small standard deviation in comparison with the other methods. For the Art1 and Art2 data sets in Tables 1 and 2, which were randomly generated, IAMO obtained the best mean and the smallest standard deviation compared to the other algorithms. The mean


Table 3: Results obtained by the algorithms for 20 different runs on Iris data.

Algorithm | Mean     | Best    | Worst    | Standard deviation | Rank
k-means   | 102.8412 | 99.4582 | 123.4678 | 8.7805             | 7
PSO       | 98.3345  | 96.6567 | 104.2224 | 2.2431             | 5
CPSO      | 96.9721  | 96.6580 | 97.5211  | 2.966e-01          | 4
ABC       | 96.6659  | 96.6566 | 96.7547  | 2.1388e-02         | 3
CABC      | 96.6561  | 96.6555 | 96.6599  | 1.1685e-03         | 2
AMO       | 99.0055  | 97.0751 | 100.5484 | 1.1202             | 6
IAMO      | 96.6555  | 96.6555 | 96.6555  | 1.2155e-06         | 1

Table 4: Results obtained by the algorithms for 20 different runs on TAE data.

Algorithm | Mean      | Best      | Worst     | Standard deviation | Rank
k-means   | 1533.5441 | 1503.9418 | 1605.1206 | 29.7491            | 7
PSO       | 1501.5895 | 1490.2455 | 1529.2453 | 16.3170            | 6
CPSO      | 1499.8073 | 1492.1980 | 1532.4523 | 17.4859            | 5
ABC       | 1491.4434 | 1490.9775 | 1492.4754 | 0.5128             | 3
CABC      | 1491.3099 | 1490.9276 | 1497.3575 | 2.7356             | 2
AMO       | 1499.0215 | 1493.3564 | 1509.4512 | 6.8524             | 4
IAMO      | 1491.0900 | 1490.9321 | 1492.5707 | 0.4482             | 1

Table 5: Results obtained by the algorithms for 20 different runs on wine data.

Algorithm | Mean       | Best       | Worst      | Standard deviation | Rank
k-means   | 17853.2412 | 16374.4353 | 18484.3458 | 1003.6327          | 7
PSO       | 16307.0584 | 16302.8364 | 16314.0563 | 5.8844             | 5
CPSO      | 16304.9829 | 16299.0526 | 16312.1291 | 5.9887             | 3
ABC       | 16305.6668 | 16299.1970 | 16321.9535 | 9.8439             | 4
CABC      | 16292.1982 | 16292.1858 | 16292.2094 | 1.0563e-02         | 2
AMO       | 16359.7965 | 16319.9935 | 16400.5533 | 30.0025            | 6
IAMO      | 16292.1855 | 16292.1849 | 16292.1862 | 5.0627e-04         | 1

value obtained by IAMO is 1718.2540 on Art1, while ABC and CABC obtained 1718.5496 and 1718.4434, and the standard deviation of IAMO is 4 orders of magnitude better than those of ABC and CABC. Likewise, on Art2, IAMO obtained 513.9035 while CPSO, ABC, and CABC obtained 513.9046, 513.9037, and 513.9037, respectively, and the standard deviation of IAMO is at least 2 orders of magnitude better than theirs. For the Iris data set, the mean value, the optimum value, and the worst value of IAMO are all 96.6555 and the standard deviation is 1.2155e-06, which reveals the robustness of IAMO. CABC also found the best solution 96.6555, but its standard deviation is bigger than that of IAMO, while the best solutions of AMO, PSO, CPSO, ABC, and k-means are 97.0751, 96.6567, 96.6580, 96.6566, and 99.4582, respectively. Table 4 shows the results of the algorithms on the TAE data set. The mean value

Table 6: Results obtained by the algorithms for 20 different runs on seeds data.

Algorithm | Mean     | Best     | Worst    | Standard deviation | Rank
k-means   | 313.4977 | 313.1428 | 313.7343 | 2.6879e-01         | 5
PSO       | 326.5250 | 318.3185 | 335.2944 | 6.0131             | 7
CPSO      | 312.1138 | 311.9116 | 312.3788 | 0.2899             | 4
ABC       | 312.0382 | 311.8520 | 312.2110 | 6.7210e-02         | 3
CABC      | 311.7980 | 311.7980 | 311.7982 | 1.4865e-04         | 2
AMO       | 319.3922 | 313.8100 | 327.9267 | 3.1572             | 6
IAMO      | 311.7980 | 311.7980 | 311.7980 | 3.3686e-05         | 1

Table 7: Results obtained by the algorithms for 20 different runs on StatLog (heart) data.

Algorithm | Mean       | Best       | Worst      | Standard deviation | Rank
k-means   | 10695.7851 | 10682.5524 | 10703.8544 | 8.2080             | 6
PSO       | 10760.0684 | 10644.8965 | 11024.2641 | 25.9460            | 7
CPSO      | 10651.1354 | 10624.2168 | 10747.7609 | 55.0264            | 4
ABC       | 10627.4760 | 10626.7154 | 10629.6472 | 1.0354             | 3
CABC      | 10622.9904 | 10622.9824 | 10623.6762 | 1.7830e-02         | 2
AMO       | 10675.1758 | 10658.6325 | 10696.1824 | 17.6918            | 5
IAMO      | 10622.9824 | 10622.9824 | 10622.9825 | 5.0093e-05         | 1

Table 8: Results obtained by the algorithms for 20 different runs on Hagerman's survival data.

Algorithm | Mean      | Best      | Worst     | Standard deviation | Rank
k-means   | 2640.1277 | 2610.5245 | 3180.5211 | 126.7500           | 7
PSO       | 2567.4479 | 2566.9899 | 2569.1620 | 0.9587             | 6
CPSO      | 2567.3233 | 2566.9889 | 2567.8257 | 0.4578             | 5
ABC       | 2566.9892 | 2566.9888 | 2566.9895 | 2.3919e-04         | 2
CABC      | 2566.9903 | 2566.9888 | 2566.9955 | 2.8709e-03         | 3
AMO       | 2566.9985 | 2566.9907 | 2567.0096 | 9.1598e-03         | 4
IAMO      | 2566.9888 | 2566.9888 | 2566.9888 | 2.3022e-06         | 1

of IAMO is 1491.0900, which is smaller than those of AMO, PSO, CPSO, ABC, CABC, and k-means over 20 runs. For the wine data set, IAMO reached the mean value 16292.1855, while CABC reached the mean value 16292.1982. The best and worst values of IAMO are 16292.1849 and 16292.1862, which are also better than the 16292.1858 and 16292.2094 obtained by CABC, and the standard deviation of IAMO is also the smallest. Table 6 provides the results of the algorithms on the


Table 9: Results obtained by the algorithms for 20 different runs on balance scale data.

Algorithm | Mean      | Best      | Worst     | Standard deviation | Rank
k-means   | 1426.7522 | 1423.8570 | 1433.8423 | 3.1208             | 5
PSO       | 1430.1546 | 1426.4237 | 1447.6403 | 9.3451             | 7
CPSO      | 1424.8260 | 1423.5525 | 1425.5460 | 0.9048             | 4
ABC       | 1423.9238 | 1423.8308 | 1425.9821 | 4.5452e-02         | 3
CABC      | 1423.9109 | 1423.8206 | 1425.2445 | 1.4053e-02         | 2
AMO       | 1429.1499 | 1427.9670 | 1430.8177 | 1.1641             | 6
IAMO      | 1423.8204 | 1423.8204 | 1423.8204 | 4.9763e-06         | 1

Table 10: Results obtained by the algorithms for 20 different runs on cancer data.

Algorithm | Mean      | Best      | Worst     | Standard deviation | Rank
k-means   | 2981.2564 | 2976.4687 | 2988.4277 | 4.8661             | 5
PSO       | 3001.2685 | 2969.7475 | 3114.3658 | 63.9370            | 6
CPSO      | 2964.8268 | 2964.4167 | 2965.2941 | 0.3773             | 3
ABC       | 2965.5369 | 2964.9734 | 2966.4785 | 0.8081             | 4
CABC      | 2964.4138 | 2964.3870 | 2964.5222 | 6.0177e-02         | 2
AMO       | 3001.6328 | 2974.0974 | 3050.9651 | 32.7322            | 7
IAMO      | 2964.3870 | 2964.3870 | 2964.3870 | 2.2260e-05         | 1

seeds data set; the IAMO and CABC algorithms are superior to the others. Although IAMO and CABC reached the same mean value 311.7980, the standard deviation of IAMO is one order of magnitude better than that of CABC. On the StatLog (heart) data set, whose results are given in Table 7, IAMO gets the best value 10622.9824, the same as CABC, while the mean values of the two algorithms are 10622.9824 and 10622.9904, so IAMO is better than the CABC algorithm. For Hagerman's survival data set, the optimum value 2566.9888 can be obtained by IAMO, ABC, and CABC, but the standard deviations of ABC and CABC are 2.3919e-04 and 2.8709e-03, which are worse than the 2.3022e-06 obtained by IAMO; the standard deviation of PSO is a little bigger than that of CPSO. For the balance scale data set in Table 9, as seen from the results, the mean, best, and worst values of IAMO are all 1423.8204, which reflects the stability of IAMO; the three best algorithms on this data set are IAMO, CABC, and ABC, whose best results are 1423.8204, 1423.8206, and 1423.8308. For the Wisconsin breast cancer data set in Table 10, the mean value, the best value, and the worst value of IAMO are all 2964.3870, which is obviously superior to k-means, PSO, CPSO, ABC, and AMO.

Figure 5: The convergence curve of the Art1 data.

Figure 6: The convergence curve of the Art2 data.

As seen from Tables 1–10, we can conclude that although the convergence rate of IAMO is not quick enough at the beginning of the iterations compared to ABC and CABC, its final results are the best compared to the other algorithms on all test data sets. Most results of ABC and CABC are better than those of PSO and CPSO, and the k-means algorithm is the worst on most of the test data sets.

Figures 5, 6, 7, 8, 9, 10, 11, 12, 13, and 14 show the convergence curves of the different data sets for the various algorithms. Figures 15 and 16 show the original data distribution of the Iris data set and the clustering result obtained by the IAMO algorithm.


Figure 7: The convergence curve of the Iris data.

Figure 8: The convergence curve of the TAE data.

7. Living Area Radius Evaluation

The performance and results of the proposed algorithm are greatly affected by the size of the living area. At the beginning of the iterations, a big value of $R$ improves the exploration ability of the algorithm, and at the end of the iterations, a small value of $R$ improves its exploitation ability. We adopted a fixed shrinkage coefficient $\rho = 0.92$ to change the living-area radius after each iteration, as shown in formula (6). To study the extent to which $R$ impacts the proposed algorithm, we selected the Art1 and Iris data sets and used different values of $\rho$ to evaluate the performance of the proposed algorithm.

Figure 9: The convergence curve of the wine data.

Figure 10: The convergence curve of the seeds data.

Figure 17 shows the results of an experiment on Art1. We can conclude that choosing $\rho$ between 0.6 and 0.9 gives a better convergence precision than $\rho = 0.99$ or $\rho = 0.40$: with $\rho = 0.40$ the IAMO algorithm plunges into a local optimum, and with $\rho = 0.99$ the IAMO algorithm has a very low convergence rate. Likewise, in Figure 18, for the Iris test data set, the IAMO algorithm quickly converges to the global optimum within 30 iterations when $\rho = 0.80$, while it cannot escape from poor local optima and reach the global optimum when $\rho = 0.70$, $\rho = 0.60$, or $\rho = 0.40$.


Figure 11: The convergence curve of the heart data.

Figure 12: The convergence curve of the survival data.

So the best $\rho$ for solving the Iris data set must lie between 0.7 and 0.99.

The results suggest that a proper $\rho$ can greatly improve the convergence velocity and convergence precision of the algorithm, while an improper $\rho$ may cause IAMO to fall into a local optimum.

8. Conclusions

In this paper, to remedy the deficiencies of the AMO algorithm, we improved the algorithm by using a new migration

Figure 13: The convergence curve of the balance scale data.

Figure 14: The convergence curve of the cancer data.

method based on shrinking the animals' living area. Simulations on 10 typical standard test data sets show that the IAMO algorithm generally has a strong global searching ability and local optimization ability and can effectively avoid the tendency of conventional algorithms to easily fall into local optima. IAMO improves the convergence precision of AMO and ranks first on all test data sets; therefore it is very practical and effective for solving clustering problems. Finally, how to define a proper and unified radius of the living area needs to be considered in subsequent work.


Figure 15: The Iris data distribution.

Figure 16: The Iris data clustering result.

Figure 17: The convergence curve of the Art1 data with different $\rho$.

Figure 18: The convergence curve of the Iris data with different $\rho$.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Foundation of China under Grants nos. 61165015 and 61463007, the Key Project of Guangxi Science Foundation under Grant no. 2012GXNSFDA053028, and the Key Project of Guangxi High School Science Foundation under Grant no. 20121ZD008.

References

[1] R. B. Cattell, "The description of personality: basic traits resolved into clusters," Journal of Abnormal and Social Psychology, vol. 38, no. 4, pp. 476–506, 1943.

[2] K. R. Zalik, "An efficient k-means clustering algorithm," Pattern Recognition Letters, vol. 29, no. 8, pp. 1385–1391, 2008.

[3] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means – a data clustering algorithm," Tech. Rep. HPL-1999-124, Hewlett-Packard Laboratories, 1999.

[4] X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2008.

[5] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, December 2009.

[6] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization, vol. 284 of Studies in Computational Intelligence, pp. 65–74, Springer, Berlin, Germany, 2010.

[7] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.

[8] R. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.

[9] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[10] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.

[11] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.

[12] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.

[13] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.

[14] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.

[15] T. Niknam, J. Olamaei, and B. Amiri, "A hybrid evolutionary algorithm based on ACO and SA for cluster analysis," Journal of Applied Sciences, vol. 8, no. 15, pp. 2695–2702, 2008.

[16] T. Niknam, B. Bahmani Firouzi, and M. Nayeripour, "An efficient hybrid evolutionary algorithm for cluster analysis," World Applied Sciences Journal, vol. 4, no. 2, pp. 300–307, 2008.

[17] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.

[18] Y. Kao and K. Cheng, An ACO-Based Clustering Algorithm, Springer, Berlin, Germany, 2006.

[19] M. Omran, A. P. Engelbrecht, and A. Salman, "Particle swarm optimization method for image clustering," International Journal of Pattern Recognition and Artificial Intelligence, vol. 19, no. 3, pp. 297–321, 2005.

[20] D. Karaboga and C. Ozturk, "A novel clustering approach: Artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.

[21] K. E. Voges and N. K. L. Pope, "Rough clustering using an evolutionary algorithm," in Proceedings of the 45th Hawaii International Conference on System Sciences (HICSS '12), pp. 1138–1145, IEEE, January 2012.

[22] A. Colorni, M. Dorigo, and V. Maniezzo, Distributed Optimization by Ant Colonies, Elsevier Publishing, Paris, France, 1991.

[23] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, Canberra, Australia, December 2003.

[24] E. H. L. Aarts and J. H. Korst, Simulated Annealing and Boltzmann Machines, John Wiley & Sons, 1989.

[25] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University Press, Erciyes, Turkey, 2005.

[26] X. Chen, Y. Zhou, and Q. Luo, "A hybrid monkey search algorithm for clustering analysis," The Scientific World Journal, vol. 2014, Article ID 938239, 16 pages, 2014.

[27] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867–1877, 2014.

[28] J. MacQueen, "Some methods for classification and analysis of multivariate observations," in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pp. 281–297, University of California Press, Berkeley, Calif, USA, 1967.

[29] X. Chen and J. Zhang, "Clustering algorithm based on improved particle swarm optimization," Journal of Computer Research and Development, pp. 287–291, 2012.

[30] X. Liu, Q. Sha, Y. Liu, and X. Duan, "Analysis of classification using particle swarm optimization," Computer Engineering, vol. 32, no. 6, pp. 201–213, 2006.

[31] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.

[32] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[33] C. L. Blake and C. J. Merz, UCI Repository of Machine Learning Databases, http://archive.ics.uci.edu/ml/datasets.html.

[34] E. Anderson, "The irises of the Gaspé Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.

[35] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, part 2, pp. 179–188, 1936.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 3: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

Discrete Dynamics in Nature and Society 3

If the index of animal is 119894 then its neighborhood consistsof animal having indices 119894 minus 2 119894 minus 1 119894 119894 + 1 119894 + 2 if theindex of animal is 1 the neighborhood consists of animalhaving indices 119873119875 minus 1119873119875 1 2 3 and so forth Once theneighborhood topology has been constructed we select oneneighbor randomly and update the position of the individualaccording to this neighbor as can be seen in the followingformula

119883119894119866+1

= 119883119894119866

+ 120575 sdot (119883neiborhood119866 minus 119883119894119866

) (4)

where 119883neiborhood119866 is the current position of the neighbor-hood 120575 is produced by using a random number generatorcontrolled by a Gaussian distribution 119883

119894119866is the current

position of 119894th individual and 119883119894119866+1

is the new position of119894th individual

32 Population Updating Process During the populationupdating process the algorithm simulates how some animalsleave the group and some join in the new populationIndividuals will be replaced by some new animals with aprobability 119875119886 The probability is used according to thequality of the fitness We sort fitness in descending order sothe probability of the individual with best fitness is 1119873119875 andthe individual with worst fitness by contrast is 1 and theprocess can be shown in Algorithm 1

In Algorithm 1 1199031 1199032isin [1 119873119875] are randomly chosen

integers 1199031

= 1199032

= 119894 After producing the new solution119883119894119866+1

it will be evaluated and compared with the 119883119894119866 and

we choose the individual with a better objective fitness

119883119894=

119883119894119866

if 119891 (119883119894119866

) is better than 119891 (119883119894119866+1

)

119883119894119866+1

otherwise(5)

4 The New Migration Process Method

In AMO algorithm uses migration process and populationupdating process to find a satisfactory solutionThe proposedalgorithm used a new migration process by establishing aliving area by the leader animal (the individuals with bestfitness value) and animalsmigrate from current locations intothis new living area to simulate animal migration process

At first there are 119873119875 animals that live in living area asshown in Figure 2(a) moving eating drinking reproducingand so on some individuals move randomly and theirposition be updated and then we calculate the best positionof animals by fitness function and record it But the amountof food or water gradually diminished as the time wore onas shown in Figure 2(b) and some animals migrate fromthe current areas which have no food and water to a newarea with abundant food and water as shown in Figure 2(c)In Figure 2 the green parts represent the living areas withabundant food and water animals can live in these areasAnd the yellow parts represent the areas that lack food orwater animals can no longer live in these areas and theymustmigrate to a new living area (the green parts in Figure 2(c))We shrink the living area after a period of time (as shown inFigures 2(a) and 2(c)) and then animals migrate to the newliving area ceaselessly As a rule of thumb the globally optimal

solution always nearby is the current best solution in IAMOthe animals living area is smaller and smaller (by formula(6)) after each iteration and the individuals get closer andcloser to the globally optimal solution so we can acceleratethe convergence velocity and precision of the algorithm tosome extent

The boundary of the living area is established by

low = 119883best minus 119877 up = 119883best + 119877

119877 = 120588 sdot 119877

(6)

where 119883best is the leader animal (the current best solution)low and up are the lower and upper bound of the living area119877 is living area radius 120588 is shrinkage coefficient 120588 isin (0 1)and low up and 119877 are all 1 times 119863 row vector In general theoriginal value of 119877 depends on the size of the search spaceAs iterations go on a big value of 119877 improves the explorationability of the algorithm and a small value of 119877 improves theexploitation ability of the algorithm

5 The IAMO Algorithm forSolving Clustering Problem

51 Initializing the Population During the initialization pro-cess the algorithm begins with initializing a set of119873119875 animalpositions 119883

1 1198832 1198833 119883

119873119875 each animal position 119883

119894is a

1 times (119870 times 119863)-dimensional vector where 119870 is the number ofclustering center and119863 is the dimension of the test set119866

119873times119863

The cluster centers 119909lowast

119894= (1199091198941 1199091198942 119909

119894119863) (119894 = 1 2 119870)

each center 119909lowast

119894is 1 times 119863-dimensional vector and the lower

bound of the centers is the minimum of each column in testset 119866119873times119863

namely 119886lowast119894

= min1198661 1198662 119866

119863 and the upper

bound of the centers is 119887lowast

119894= max119866

1 1198662 119866

119863 So we can

initialize the position of an individual 119883119894= 119909lowast

1 119909lowast

2 119909lowast

119870=

(11990911 11990912 119909

1119863) (11990921 11990922 119909

2119863) (119909

1198701 1199091198702

119909119870119863

) =

11990911 11990912 119909

119870119863 and then the lower and upper bounds of

the solution space are 119886 = (119886lowast

1 119886lowast

2 119886

lowast

119870) and 119887 = (119887

lowast

1 119887lowast

2

119887lowast

119870)

Animals are randomly and uniformly distributedbetween the prespecified lower initial parameter bound 119886

and the upper initial parameter bound 119887 So the 119895th compo-nent of the 119894th vector is as follows

119909119894119895

= 119886119895+ rand

119894119895 [0 1] sdot (119887119895 minus 119886119895)

119894 = 1 119873119875 119895 = 1 119870 times 119863

(7)

where rand119894119895[0 1] is a uniform distribution random number

between 0 and 1

52 Animals Migration During the migration processbecause of animals hunting foraging or drinking in theliving area some parts of the living area are lacking food orwater or climate condition change and some animalsmigratefrom the current living area to the new living area whichhas abundant food and water or climate condition suitablefor living We assume that there is only one living area andanimals out of the new living area would be migrating intothe new living area as depicted in Section 4 We calculate the

4 Discrete Dynamics in Nature and Society

(1) For 119894 = 1 to 119873119875 do(2) For 119895 = 1 to 119863 do(3) If rand gt 119875

119886then

(4) 119883119894119866+1

= 1198831199031119866

+ rand sdot (119883best119866 minus 119883119894119866

) + rand sdot (1198831199032119866

minus 119883119894119866

)

(5) End If(6) End For(7) End For

Algorithm 1 Population updating process

(1) Begin(2) Set the generation counter 119866 living area radius 119877 shrinkage coefficient 120588 and randomly initialize 119883

119894

with a population of 119873119875 animals in solution space(3) Evaluate the fitness for each individual 119883

119894 record the best individual 119883best

(4)While stopping criteria is not satisfied do(5) Establish a new living area by low = 119883best minus 119877 up = 119883best + 119877

(6) Animals migrate into the new living area(7) For 119894 = 1 to 119873119875 do(8) For 119895 = 1 to 119863 do(9) Select randomly 1199031 = 1199032 = 119894

(10) If rand gt 119875119886then

(11) 119883119894119866+1

= 1198831199031119866

+ rand sdot (119883best119866 minus 119883119894119866

) + rand sdot (1198831199032119866

minus 119883119894119866

)

(12) End If(13) End For(14) End For(15) For 119894 = 1 to 119873119875 do(16) Evaluate the offspring119883

119894119866+1

(17) If 119883119894119866+1

is better than 119883119894then

(18) 119883119894= 119883119894119866+1

(19) End If(20) End for(21) Memorize the best solution achieved so far(22) 119877 = 119877 sdot 120588

(23) End while(24) End

Algorithm 2 An improved animalrsquos migration optimization algorithm (IAMO)

distance between cluster centers 119909lowast1 119909lowast

2 sdot sdot sdot 119909

lowast

119870and text data

set thenwe classify test data set into119870 categories according tothe distance and finally we can obtain the fitness accordingthe fitness function

119891 (119883 119866) =

119873

sum

119894=1

min 1003817100381710038171003817119866119894 minus 119909

119896

10038171003817100381710038172

| 119896 = 1 2 119870 (8)

According to the fitness function we obtain the bestindividual 119883best and the new living area can be establishedby 119883best and 119877

53 Individuals in Population Updating During the popula-tion updating process algorithm simulates some animals thatare preyed by their enemies or some animals leave the groupand some join in the group from other groups or some newanimals are born In IAMO we assume that the number of

available animals is fixed and every animal will be replacedby 119875119886 as shown in Section 32

Specific implementation steps of the improved animalsmigration optimization algorithm (IAMO) can be shown asin Algorithm 2

6 Numerical Simulation Experiments

All of the algorithm was programmed in MATLAB R2008anumerical experiment was set up on AMD Athlon(tm)II lowast4640 processor and 2GB memory

The experimental results comparing the IAMO clusteringalgorithm with six typical stochastic algorithms includingthe PSO [31] CPSO [32] ABC [20] CABC [11] AMO [27]and 119896-means algorithms are provided for two artificial datasets and eight real life data sets (Iris teaching assistantevaluation (TAE) wine seeds StatLog (heart) Hagermanrsquos

Discrete Dynamics in Nature and Society 5

i i minus 2

i minus 1

i + 1

i + 2

middot middot middot

Figure 1 The concept of the neighborhood of an animal

survival balance scale and Wisconsin breast cancer) whichare selected from the UCI machine learning repository [33]

Artificial Data Set One (119873 = 250 119889 = 3 and 119870 = 5) This is athree-featured problem with five classes where every featureof the classes was distributed according to Class 1-Uniform(85 100) Class 2-Uniform (70 85) Class 3-Uniform (55 70)Class 4-Uniform (40 55) and Class 5-Uniform (25 40) [1214] The data set is illustrated in Figure 3

Artificial Data Set Two (119873 = 600 119889 = 2 and 119870 = 4) This is atwo-featured problemwith four unique classes A total of 600patterns were drawn from four independent bivariate normaldistributions where classes were distributed according to

1198732(120583 = (

119898119894

119898119894

) Σ = [[05 005

005 05]]) (9)

where 119894 = 1 2 3 4 gt 4 1198981

= minus3 1198982

= 0 1198983

= 3 and1198984= 6120583 and Σ are mean vector and covariance matrix respec-

tively [12 14] The data set is illustrated in Figure 4

Iris Data (119873 = 150 119889 = 4 and 119870 = 3) This data set with150 random samples of flowers from the Iris species setosaversicolor and virginica collected by Anderson [34] Fromeach species there are 50 observations for sepal length sepalwidth petal length and petal width in cm This data set wasused by Fisher [35] in his initiation of the linear-discriminant-function technique [11 12 33]

Teaching Assistant Evaluation (119873 = 151 119889 = 5 and 119870 =

3) The data consist of evaluations of teaching performanceover three regular semesters and two summer semestersof 151 teaching assistant (TA) assignments at the StatisticsDepartment of the University of Wisconsin-Madison Thescores were divided into 3 roughly equal-sized categories(ldquolowrdquo ldquomediumrdquo and ldquohighrdquo) to form the class variable [33]

Wine Data (119873 = 178 119889 = 13 and 119870 = 3) This isthe wine data set which is also taken from MCI laboratoryThese data are the results of a chemical analysis of winesgrown in the same region in Italy but derived from three

different cultivars The analysis determined the quantities of13 constituents found in each of the three types of winesThere are 178 instanceswith 13 numeric attributes inwine dataset All attributes are continuousThere is nomissing attributevalue [11 12 33]

Seeds Data (119873 = 210 119889 = 7 and119870 = 3)This data set consistsof 210 patterns belonging to three different varieties of wheatKama Rosa and Canadian From each species there are 70observations for area 119860 perimeter 119875 compactness 119862 (119862 =

4 lowast 119901119894 lowast 1198601198752) length of kernel width of kernel asymmetry

coefficient and length of kernel groove [33]

StatLog (Heart) Data (119873 = 270 119889 = 13 and119870 = 2)This dataset is a heart disease database similar to a database alreadypresent in the repository (heart disease databases) but in aslightly different form [33]

Hagermanrsquos Survival (119873 = 306 119889 = 3 and 119870 = 2) The dataset contains cases from a study that was conducted between1958 and 1970 at the University of Chicagorsquos Billings Hospitalon the survival of patients who had undergone surgery forbreast cancer It records two survival status patients with theage of patient at time of operation patientrsquos year of operationand number of positive axillary nodes detected [33]

Balance Scale Data (119873 = 625 119889 = 4 and119870 = 3)This data setwas generated to model psychological experimental resultsEach example is classified as having the balance scale tip to theright to the left or balancedThe attributes are the leftweightthe left distance the right weight and the right distance Thecorrect way to find the class is the greater of (left-distance lowast

left-weight) and (right-distance lowast right-weight) If they areequal it is balanced [33]

Wisconsin Breast Cancer (119873 = 683 119889 = 9 and 119870 =

2) It consists of 683 objects characterized by nine featuresclump thickness cell size uniformity cell shape uniformitymarginal adhesion single epithelial cell size bare nucleibland chromatin normal nucleoli andmitosesThere are twocategories in the data malignant (444 objects) and benign(239 objects) [11 12 33]

Herewe set the parameters of AMOand IAMOas followsThe population size of the AMO and IAMO is 100 In IAMOthe original living area radius 119877 = 03(119887 minus 119886) and shrinkagecoefficient 120588 = 092 For the PSO inertia weight 119908 = 0729acceleration coefficients 1198881 = 2 1198882 = 2 and population size119872 = 100 The population size of the CPSO is 20 The popu-lation size of the ABC and CABC are 50 and 10 respectivelyIn order to compare with other algorithms the maximumgenerations of all algorithms are 100

For every data set each algorithm is applied 20 times indi-vidually with random initial solution For the Art1 and Art2data set once the randomly generated parameters are deter-mined the same parameters are used to test the perform-ance of three algorithms We ranked each algorithm accord-ing to the mean result The results are kept four digits afterthe decimal point The mean value the best value the worstvalue the standard deviation and the rank value are recordedin Tables 1 2 3 4 5 6 7 8 9 and 10

6 Discrete Dynamics in Nature and Society

(a) The 119866th iteration living area (b) Animals begin to migrate (c) The 119866+ 1th iteration living area

Figure 2 Animals migration process

0

50

100

050

10020

40

60

80

100

Art1 data distribution

Figure 3 The distribution image of Art1

minus5 0 5 10minus6

minus4

minus2

0

2

4

6

8

10 Art2 data distribution

Figure 4 The distribution image of Art2

Table 1 Results obtained by the algorithms for 20 different runs onArt1 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 21066059 17207627 24560468 3588518 7PSO 19913796 17216508 24481993 1927566 5CPSO 18605699 17186937 24175543 3057752 4ABC 17185496 17182939 17189832 01955 3CABC 17184434 17182544 17203302 05488 2AMO 20621954 19744561 21991275 548921 6

IAMO 17182540 17182538 17182540 10990119890 minus

051

Table 2 Results obtained by the algorithms for 20 different runs onArt2 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 6442100 5146623 8964104 1811884 7PSO 5147965 5139191 5238308 23765 5

CPSO 5139046 5139035 5139085 13621119890 minus

034

ABC 5139037 5139035 5139045 32720119890 minus

043

CABC 5139037 5139035 5139696 15235119890 minus

042

AMO 5250879 5161389 5462325 84842 6

IAMO 5139035 5139035 5139035 93111119890 minus

061

Tables 1ndash10 show that IAMO is very precise than otheralgorithms in solving the ten data sets As seen from theresults the IAMOalgorithmprovides the best value and smallstandard deviation in comparison with other methods Forthe Art1 and Art2 data set in Tables 1 and 2 which were ran-domly generated IAMOobtained the best mean and smalleststandard deviation compared to other algorithms The mean

Discrete Dynamics in Nature and Society 7

Table 3 Results obtained by the algorithms for 20 different runs onIris data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 1028412 994582 1234678 87805 7PSO 983345 966567 1042224 22431 5CPSO 969721 966580 975211 2966119890 minus 01 4ABC 966659 966566 967547 21388119890 minus 02 3CABC 966561 966555 966599 11685119890 minus 03 2AMO 990055 970751 1005484 11202 6IAMO 966555 966555 966555 12155119890 minus 06 1

Table 4 Results obtained by the algorithms for 20 different runs onTAE data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 15335441 15039418 16051206 297491 7PSO 15015895 14902455 15292453 163170 6CPSO 14998073 14921980 15324523 174859 5ABC 14914434 14909775 14924754 05128 3CABC 14913099 14909276 14973575 27356 2AMO 14990215 14933564 15094512 68524 4IAMO 14910900 14909321 14925707 04482 1

Table 5: Results obtained by the algorithms for 20 different runs on wine data.

Algorithm   Mean         Best         Worst        Std. dev.    Rank
k-means     17853.2412   16374.4353   18484.3458   1003.6327    7
PSO         16307.0584   16302.8364   16314.0563   5.8844       5
CPSO        16304.9829   16299.0526   16312.1291   5.9887       3
ABC         16305.6668   16299.1970   16321.9535   9.8439       4
CABC        16292.1982   16292.1858   16292.2094   1.0563e-02   2
AMO         16359.7965   16319.9935   16400.5533   30.0025      6
IAMO        16292.1855   16292.1849   16292.1862   5.0627e-04   1

The mean value obtained by IAMO in solving Art1 is 1718.2540, while ABC and CABC obtained 1718.5496 and 1718.4434, and the standard deviation of IAMO is about four orders of magnitude smaller than those of ABC and CABC. Similarly, in solving Art2 IAMO obtained 513.9035, while CPSO, ABC, and CABC obtained 513.9046, 513.9037, and 513.9037, respectively, and the standard deviation of IAMO is at least two orders of magnitude better than theirs. For the Iris data set, the mean, best, and worst values of IAMO are all 96.6555 and the standard deviation is 1.2155e-06, which reveals the robustness of IAMO. CABC also found the best solution 96.6555, but its standard deviation is larger than IAMO's, while the best solutions of AMO, PSO, CPSO, ABC, and k-means are 97.0751, 96.6567, 96.6580, 96.6566, and 99.4582, respectively. Table 4 shows the results of the algorithms on the TAE data set.

Table 6: Results obtained by the algorithms for 20 different runs on seeds data.

Algorithm   Mean       Best       Worst      Std. dev.    Rank
k-means     313.4977   313.1428   313.7343   2.6879e-01   5
PSO         326.5250   318.3185   335.2944   6.0131       7
CPSO        312.1138   311.9116   312.3788   0.2899       4
ABC         312.0382   311.8520   312.2110   6.7210e-02   3
CABC        311.7980   311.7980   311.7982   1.4865e-04   2
AMO         319.3922   313.8100   327.9267   3.1572       6
IAMO        311.7980   311.7980   311.7980   3.3686e-05   1

Table 7: Results obtained by the algorithms for 20 different runs on StatLog (heart) data.

Algorithm   Mean         Best         Worst        Std. dev.    Rank
k-means     10695.7851   10682.5524   10703.8544   8.2080       6
PSO         10760.0684   10644.8965   11024.2641   25.9460      7
CPSO        10651.1354   10624.2168   10747.7609   55.0264      4
ABC         10627.4760   10626.7154   10629.6472   1.0354       3
CABC        10622.9904   10622.9824   10623.6762   1.7830e-02   2
AMO         10675.1758   10658.6325   10696.1824   17.6918      5
IAMO        10622.9824   10622.9824   10622.9825   5.0093e-05   1

Table 8: Results obtained by the algorithms for 20 different runs on Hagerman's survival data.

Algorithm   Mean        Best        Worst       Std. dev.    Rank
k-means     2640.1277   2610.5245   3180.5211   126.7500     7
PSO         2567.4479   2566.9899   2569.1620   0.9587       6
CPSO        2567.3233   2566.9889   2567.8257   0.4578       5
ABC         2566.9892   2566.9888   2566.9895   2.3919e-04   2
CABC        2566.9903   2566.9888   2566.9955   2.8709e-03   3
AMO         2566.9985   2566.9907   2567.0096   9.1598e-03   4
IAMO        2566.9888   2566.9888   2566.9888   2.3022e-06   1

The mean value of IAMO is 1491.0900, which is smaller than those of AMO, PSO, CPSO, ABC, CABC, and k-means over 20 runs. For the wine data set, IAMO reached the mean value 16292.1855, while CABC reached the mean value 16292.1982. The best and worst values of IAMO are 16292.1849 and 16292.1862, which are also better than the 16292.1858 and 16292.2094 obtained by CABC, and the standard deviation of IAMO is likewise the smallest. Table 6 provides the results of the algorithms on the seeds data set.


Table 9: Results obtained by the algorithms for 20 different runs on balance scale data.

Algorithm   Mean        Best        Worst       Std. dev.    Rank
k-means     1426.7522   1423.8570   1433.8423   3.1208       5
PSO         1430.1546   1426.4237   1447.6403   9.3451       7
CPSO        1424.8260   1423.5525   1425.5460   0.9048       4
ABC         1423.9238   1423.8308   1425.9821   4.5452e-02   3
CABC        1423.9109   1423.8206   1425.2445   1.4053e-02   2
AMO         1429.1499   1427.9670   1430.8177   1.1641       6
IAMO        1423.8204   1423.8204   1423.8204   4.9763e-06   1

Table 10: Results obtained by the algorithms for 20 different runs on cancer data.

Algorithm   Mean        Best        Worst       Std. dev.    Rank
k-means     2981.2564   2976.4687   2988.4277   4.8661       5
PSO         3001.2685   2969.7475   3114.3658   63.9370      6
CPSO        2964.8268   2964.4167   2965.2941   0.3773       3
ABC         2965.5369   2964.9734   2966.4785   0.8081       4
CABC        2964.4138   2964.3870   2964.5222   6.0177e-02   2
AMO         3001.6328   2974.0974   3050.9651   32.7322      7
IAMO        2964.3870   2964.3870   2964.3870   2.2260e-05   1

On the seeds data set, the results of the IAMO and CABC algorithms are superior to those obtained by the others. Although IAMO and CABC reached the same mean value 311.7980, the standard deviation of IAMO is one order of magnitude better than that of CABC. On the StatLog (heart) data set (Table 7), IAMO obtains the best value 10622.9824, the same as CABC, while the mean values of the two algorithms are 10622.9824 and 10622.9904, so IAMO is better than the CABC algorithm. For Hagerman's survival data set, the optimum value 2566.9888 is obtained by IAMO, ABC, and CABC, but the standard deviations of ABC and CABC are 2.3919e-04 and 2.8709e-03, which are worse than the 2.3022e-06 obtained by IAMO; the standard deviation of PSO is a little bigger than that of CPSO. For the balance scale data set in Table 9, the mean, best, and worst values of IAMO are all 1423.8204, which reflects the stability of IAMO; the three best algorithms on this data set are IAMO, CABC, and ABC, whose best results are 1423.8204, 1423.8206, and 1423.8308. For the Wisconsin breast cancer data set in Table 10, the mean, best, and worst values of IAMO are all 2964.3870, which is clearly superior to k-means, PSO, CPSO, ABC, and AMO.

Figure 5: The convergence curve of the Art1 data (fitness value versus evolvement generation for PSO, CPSO, ABC, CABC, AMO, and IAMO; likewise in Figures 6–14).

Figure 6: The convergence curve of the Art2 data.

From Table 1 to Table 10 we can conclude that, although the convergence rate of IAMO at the beginning of the iterations is not as quick as that of ABC and CABC, its final results are the best among all the compared algorithms on all test data sets. Most results of ABC and CABC are better than those of PSO and CPSO, and the k-means algorithm is the worst on most of the test data sets.

Figures 5–14 show the convergence curves of the various algorithms on the different data sets. Figures 15 and 16 show the original data distribution of the Iris data set and the clustering result obtained by the IAMO algorithm.


Figure 7: The convergence curve of the Iris data.

Figure 8: The convergence curve of the TAE data.

7. Living Area Radius Evaluation

The performance and results of the proposed algorithm are greatly affected by the size of the living area. At the beginning of the iterations, a large value of R improves the exploration ability of the algorithm, and at the end of the iterations a small value of R improves its exploitation ability. We adopted a fixed shrinkage coefficient ρ = 0.92 to reduce the living area radius after each iteration, as shown in formula (6). To study the extent to which R impacts the proposed algorithm, we selected the Art1 and Iris data sets and used different values of ρ to evaluate its performance.
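Formula (6) is the geometric update R ← ρ · R applied once per iteration, so after G iterations the radius is R_G = ρ^G · R_0. The short sketch below (our illustration, not the authors' code) makes the exploration/exploitation trade-off concrete:

```python
def living_area_radius(r0, rho, generations):
    """Radius of the living area after each iteration under R <- rho * R (formula (6))."""
    radii = [r0]
    for _ in range(generations):
        radii.append(radii[-1] * rho)
    return radii

# With rho = 0.92, after 100 iterations the radius shrinks to 0.92**100 ~ 2.4e-4 of R0:
# large early radii favor exploration, small late radii favor exploitation.
for rho in (0.99, 0.92, 0.40):
    print(f"rho = {rho}: R_100 / R_0 = {living_area_radius(1.0, rho, 100)[-1]:.3e}")
```

Under this schedule, ρ = 0.99 leaves the radius at roughly 37% of its initial value after 100 generations, while ρ = 0.40 drives it toward zero almost immediately, which matches the behavior observed in Figures 17 and 18 below.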

Figure 9: The convergence curve of the wine data.

Figure 10: The convergence curve of the seeds data.

Figure 17 shows the results of an experiment on Art1. We can conclude that choosing ρ between 0.60 and 0.90 gives better convergence precision than ρ = 0.99 or ρ = 0.40. If we choose ρ = 0.40, the IAMO algorithm plunges into a local optimum, since the radius collapses almost immediately (0.40^20 is about 10^-8 of the initial value), and if we choose ρ = 0.99, the algorithm has a very low convergence rate, since after 100 iterations the radius still retains about 37% of its initial value. Likewise, in Figure 18 for the Iris test data set, the IAMO algorithm quickly converges to the global optimum within 30 iterations when ρ = 0.80, while it cannot escape from poor local optima to the global optimum when ρ = 0.70, 0.60, or 0.40.


Figure 11: The convergence curve of the heart data.

Figure 12: The convergence curve of the survival data.

Hence the best ρ for solving the Iris data set must lie between 0.70 and 0.99. The results suggest that a proper ρ can greatly improve the convergence velocity and convergence precision of the algorithm, while an improper ρ may cause IAMO to fall into a local optimum.

Figure 13: The convergence curve of the balance scale data.

Figure 14: The convergence curve of the cancer data.

8. Conclusions

In this paper, to remedy the deficiencies of the AMO algorithm, we improved it with a new migration method based on shrinking the animals' living area. Simulations on 10 typical standard test data sets show that the IAMO algorithm generally has strong global searching and local optimization abilities and can effectively avoid the tendency of conventional algorithms to fall into local optima. IAMO improves the convergence precision of AMO and ranks first on all test data sets; it is therefore practical and effective for solving clustering problems. How to define a proper and unified radius of the living area remains to be considered in subsequent work.


Figure 15: The Iris data distribution.

Figure 16: The Iris data clustering result.

Figure 17: The convergence curves on the Art1 data with different ρ (ρ = 0.99, 0.90, 0.80, 0.70, 0.60, 0.40).

Figure 18: The convergence curves on the Iris data with different ρ (ρ = 0.99, 0.90, 0.80, 0.70, 0.60, 0.40).

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Foundation of China under Grants nos. 61165015 and 61463007, the Key Project of Guangxi Science Foundation under Grant no. 2012GXNSFDA053028, and the Key Project of Guangxi High School Science Foundation under Grant no. 20121ZD008.

References

[1] R. B. Cattell, "The description of personality: basic traits resolved into clusters," Journal of Abnormal and Social Psychology, vol. 38, no. 4, pp. 476–506, 1943.
[2] K. R. Zalik, "An efficient k-means clustering algorithm," Pattern Recognition Letters, vol. 29, no. 8, pp. 1385–1391, 2008.
[3] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means—a data clustering algorithm," Tech. Rep. HPL-1999-124, Hewlett-Packard Laboratories, 1999.
[4] X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2008.
[5] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, December 2009.
[6] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization, vol. 284 of Studies in Computational Intelligence, pp. 65–74, Springer, Berlin, Germany, 2010.
[7] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
[8] R. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[9] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.
[10] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[11] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[12] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.
[13] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.
[14] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.
[15] T. Niknam, J. Olamaei, and B. Amiri, "A hybrid evolutionary algorithm based on ACO and SA for cluster analysis," Journal of Applied Sciences, vol. 8, no. 15, pp. 2695–2702, 2008.
[16] T. Niknam, B. Bahmani Firouzi, and M. Nayeripour, "An efficient hybrid evolutionary algorithm for cluster analysis," World Applied Sciences Journal, vol. 4, no. 2, pp. 300–307, 2008.
[17] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.
[18] Y. Kao and K. Cheng, An ACO-Based Clustering Algorithm, Springer, Berlin, Germany, 2006.
[19] M. Omran, A. P. Engelbrecht, and A. Salman, "Particle swarm optimization method for image clustering," International Journal of Pattern Recognition and Artificial Intelligence, vol. 19, no. 3, pp. 297–321, 2005.
[20] D. Karaboga and C. Ozturk, "A novel clustering approach: Artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.
[21] K. E. Voges and N. K. L. Pope, "Rough clustering using an evolutionary algorithm," in Proceedings of the 45th Hawaii International Conference on System Sciences (HICSS '12), pp. 1138–1145, IEEE, January 2012.
[22] A. Colorni, M. Dorigo, and V. Maniezzo, Distributed Optimization by Ant Colonies, Elsevier Publishing, Paris, France, 1991.
[23] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, Canberra, Australia, December 2003.
[24] E. H. L. Aarts and J. H. Korst, Simulated Annealing and Boltzmann Machines, John Wiley & Sons, 1989.
[25] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University Press, Erciyes, Turkey, 2005.
[26] X. Chen, Y. Zhou, and Q. Luo, "A hybrid monkey search algorithm for clustering analysis," The Scientific World Journal, vol. 2014, Article ID 938239, 16 pages, 2014.
[27] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867–1877, 2014.
[28] J. MacQueen, "Some methods for classification and analysis of multivariate observations," in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pp. 281–297, University of California Press, Berkeley, Calif, USA, 1967.
[29] X. Chen and J. Zhang, "Clustering algorithm based on improved particle swarm optimization," Journal of Computer Research and Development, pp. 287–291, 2012.
[30] X. Liu, Q. Sha, Y. Liu, and X. Duan, "Analysis of classification using particle swarm optimization," Computer Engineering, vol. 32, no. 6, pp. 201–213, 2006.
[31] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[32] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[33] C. L. Blake and C. J. Merz, UCI Repository of Machine Learning Databases, http://archive.ics.uci.edu/ml/datasets.html.
[34] E. Anderson, "The irises of the Gaspe Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.
[35] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, part 2, pp. 179–188, 1936.


Page 4: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

4 Discrete Dynamics in Nature and Society

(1) For 119894 = 1 to 119873119875 do(2) For 119895 = 1 to 119863 do(3) If rand gt 119875

119886then

(4) 119883119894119866+1

= 1198831199031119866

+ rand sdot (119883best119866 minus 119883119894119866

) + rand sdot (1198831199032119866

minus 119883119894119866

)

(5) End If(6) End For(7) End For

Algorithm 1 Population updating process

(1) Begin(2) Set the generation counter 119866 living area radius 119877 shrinkage coefficient 120588 and randomly initialize 119883

119894

with a population of 119873119875 animals in solution space(3) Evaluate the fitness for each individual 119883

119894 record the best individual 119883best

(4)While stopping criteria is not satisfied do(5) Establish a new living area by low = 119883best minus 119877 up = 119883best + 119877

(6) Animals migrate into the new living area(7) For 119894 = 1 to 119873119875 do(8) For 119895 = 1 to 119863 do(9) Select randomly 1199031 = 1199032 = 119894

(10) If rand gt 119875119886then

(11) 119883119894119866+1

= 1198831199031119866

+ rand sdot (119883best119866 minus 119883119894119866

) + rand sdot (1198831199032119866

minus 119883119894119866

)

(12) End If(13) End For(14) End For(15) For 119894 = 1 to 119873119875 do(16) Evaluate the offspring119883

119894119866+1

(17) If 119883119894119866+1

is better than 119883119894then

(18) 119883119894= 119883119894119866+1

(19) End If(20) End for(21) Memorize the best solution achieved so far(22) 119877 = 119877 sdot 120588

(23) End while(24) End

Algorithm 2 An improved animalrsquos migration optimization algorithm (IAMO)

distance between cluster centers 119909lowast1 119909lowast

2 sdot sdot sdot 119909

lowast

119870and text data

set thenwe classify test data set into119870 categories according tothe distance and finally we can obtain the fitness accordingthe fitness function

119891 (119883 119866) =

119873

sum

119894=1

min 1003817100381710038171003817119866119894 minus 119909

119896

10038171003817100381710038172

| 119896 = 1 2 119870 (8)

According to the fitness function we obtain the bestindividual 119883best and the new living area can be establishedby 119883best and 119877

53 Individuals in Population Updating During the popula-tion updating process algorithm simulates some animals thatare preyed by their enemies or some animals leave the groupand some join in the group from other groups or some newanimals are born In IAMO we assume that the number of

available animals is fixed and every animal will be replacedby 119875119886 as shown in Section 32

Specific implementation steps of the improved animalsmigration optimization algorithm (IAMO) can be shown asin Algorithm 2

6 Numerical Simulation Experiments

All of the algorithm was programmed in MATLAB R2008anumerical experiment was set up on AMD Athlon(tm)II lowast4640 processor and 2GB memory

The experimental results comparing the IAMO clusteringalgorithm with six typical stochastic algorithms includingthe PSO [31] CPSO [32] ABC [20] CABC [11] AMO [27]and 119896-means algorithms are provided for two artificial datasets and eight real life data sets (Iris teaching assistantevaluation (TAE) wine seeds StatLog (heart) Hagermanrsquos

Discrete Dynamics in Nature and Society 5

i i minus 2

i minus 1

i + 1

i + 2

middot middot middot

Figure 1 The concept of the neighborhood of an animal

survival balance scale and Wisconsin breast cancer) whichare selected from the UCI machine learning repository [33]

Artificial Data Set One (119873 = 250 119889 = 3 and 119870 = 5) This is athree-featured problem with five classes where every featureof the classes was distributed according to Class 1-Uniform(85 100) Class 2-Uniform (70 85) Class 3-Uniform (55 70)Class 4-Uniform (40 55) and Class 5-Uniform (25 40) [1214] The data set is illustrated in Figure 3

Artificial Data Set Two (119873 = 600 119889 = 2 and 119870 = 4) This is atwo-featured problemwith four unique classes A total of 600patterns were drawn from four independent bivariate normaldistributions where classes were distributed according to

1198732(120583 = (

119898119894

119898119894

) Σ = [[05 005

005 05]]) (9)

where 119894 = 1 2 3 4 gt 4 1198981

= minus3 1198982

= 0 1198983

= 3 and1198984= 6120583 and Σ are mean vector and covariance matrix respec-

tively [12 14] The data set is illustrated in Figure 4

Iris Data (119873 = 150 119889 = 4 and 119870 = 3) This data set with150 random samples of flowers from the Iris species setosaversicolor and virginica collected by Anderson [34] Fromeach species there are 50 observations for sepal length sepalwidth petal length and petal width in cm This data set wasused by Fisher [35] in his initiation of the linear-discriminant-function technique [11 12 33]

Teaching Assistant Evaluation (119873 = 151 119889 = 5 and 119870 =

3) The data consist of evaluations of teaching performanceover three regular semesters and two summer semestersof 151 teaching assistant (TA) assignments at the StatisticsDepartment of the University of Wisconsin-Madison Thescores were divided into 3 roughly equal-sized categories(ldquolowrdquo ldquomediumrdquo and ldquohighrdquo) to form the class variable [33]

Wine Data (119873 = 178 119889 = 13 and 119870 = 3) This isthe wine data set which is also taken from MCI laboratoryThese data are the results of a chemical analysis of winesgrown in the same region in Italy but derived from three

different cultivars The analysis determined the quantities of13 constituents found in each of the three types of winesThere are 178 instanceswith 13 numeric attributes inwine dataset All attributes are continuousThere is nomissing attributevalue [11 12 33]

Seeds Data (119873 = 210 119889 = 7 and119870 = 3)This data set consistsof 210 patterns belonging to three different varieties of wheatKama Rosa and Canadian From each species there are 70observations for area 119860 perimeter 119875 compactness 119862 (119862 =

4 lowast 119901119894 lowast 1198601198752) length of kernel width of kernel asymmetry

coefficient and length of kernel groove [33]

StatLog (Heart) Data (119873 = 270 119889 = 13 and119870 = 2)This dataset is a heart disease database similar to a database alreadypresent in the repository (heart disease databases) but in aslightly different form [33]

Hagermanrsquos Survival (119873 = 306 119889 = 3 and 119870 = 2) The dataset contains cases from a study that was conducted between1958 and 1970 at the University of Chicagorsquos Billings Hospitalon the survival of patients who had undergone surgery forbreast cancer It records two survival status patients with theage of patient at time of operation patientrsquos year of operationand number of positive axillary nodes detected [33]

Balance Scale Data (119873 = 625 119889 = 4 and119870 = 3)This data setwas generated to model psychological experimental resultsEach example is classified as having the balance scale tip to theright to the left or balancedThe attributes are the leftweightthe left distance the right weight and the right distance Thecorrect way to find the class is the greater of (left-distance lowast

left-weight) and (right-distance lowast right-weight) If they areequal it is balanced [33]

Wisconsin Breast Cancer (119873 = 683 119889 = 9 and 119870 =

2) It consists of 683 objects characterized by nine featuresclump thickness cell size uniformity cell shape uniformitymarginal adhesion single epithelial cell size bare nucleibland chromatin normal nucleoli andmitosesThere are twocategories in the data malignant (444 objects) and benign(239 objects) [11 12 33]

Herewe set the parameters of AMOand IAMOas followsThe population size of the AMO and IAMO is 100 In IAMOthe original living area radius 119877 = 03(119887 minus 119886) and shrinkagecoefficient 120588 = 092 For the PSO inertia weight 119908 = 0729acceleration coefficients 1198881 = 2 1198882 = 2 and population size119872 = 100 The population size of the CPSO is 20 The popu-lation size of the ABC and CABC are 50 and 10 respectivelyIn order to compare with other algorithms the maximumgenerations of all algorithms are 100

For every data set each algorithm is applied 20 times indi-vidually with random initial solution For the Art1 and Art2data set once the randomly generated parameters are deter-mined the same parameters are used to test the perform-ance of three algorithms We ranked each algorithm accord-ing to the mean result The results are kept four digits afterthe decimal point The mean value the best value the worstvalue the standard deviation and the rank value are recordedin Tables 1 2 3 4 5 6 7 8 9 and 10

6 Discrete Dynamics in Nature and Society

(a) The 119866th iteration living area (b) Animals begin to migrate (c) The 119866+ 1th iteration living area

Figure 2 Animals migration process

0

50

100

050

10020

40

60

80

100

Art1 data distribution

Figure 3 The distribution image of Art1

minus5 0 5 10minus6

minus4

minus2

0

2

4

6

8

10 Art2 data distribution

Figure 4 The distribution image of Art2

Table 1 Results obtained by the algorithms for 20 different runs onArt1 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 21066059 17207627 24560468 3588518 7PSO 19913796 17216508 24481993 1927566 5CPSO 18605699 17186937 24175543 3057752 4ABC 17185496 17182939 17189832 01955 3CABC 17184434 17182544 17203302 05488 2AMO 20621954 19744561 21991275 548921 6

IAMO 17182540 17182538 17182540 10990119890 minus

051

Table 2 Results obtained by the algorithms for 20 different runs onArt2 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 6442100 5146623 8964104 1811884 7PSO 5147965 5139191 5238308 23765 5

CPSO 5139046 5139035 5139085 13621119890 minus

034

ABC 5139037 5139035 5139045 32720119890 minus

043

CABC 5139037 5139035 5139696 15235119890 minus

042

AMO 5250879 5161389 5462325 84842 6

IAMO 5139035 5139035 5139035 93111119890 minus

061

Tables 1ndash10 show that IAMO is very precise than otheralgorithms in solving the ten data sets As seen from theresults the IAMOalgorithmprovides the best value and smallstandard deviation in comparison with other methods Forthe Art1 and Art2 data set in Tables 1 and 2 which were ran-domly generated IAMOobtained the best mean and smalleststandard deviation compared to other algorithms The mean

Discrete Dynamics in Nature and Society 7

Table 3 Results obtained by the algorithms for 20 different runs onIris data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 1028412 994582 1234678 87805 7PSO 983345 966567 1042224 22431 5CPSO 969721 966580 975211 2966119890 minus 01 4ABC 966659 966566 967547 21388119890 minus 02 3CABC 966561 966555 966599 11685119890 minus 03 2AMO 990055 970751 1005484 11202 6IAMO 966555 966555 966555 12155119890 minus 06 1

Table 4 Results obtained by the algorithms for 20 different runs onTAE data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 15335441 15039418 16051206 297491 7PSO 15015895 14902455 15292453 163170 6CPSO 14998073 14921980 15324523 174859 5ABC 14914434 14909775 14924754 05128 3CABC 14913099 14909276 14973575 27356 2AMO 14990215 14933564 15094512 68524 4IAMO 14910900 14909321 14925707 04482 1

Table 5 Results obtained by the algorithms for 20 different runs onwine data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 178532412 163744353 184843458 10036327 7PSO 163070584 163028364 163140563 58844 5CPSO 163049829 162990526 163121291 59887 3ABC 163056668 162991970 163219535 98439 4

CABC 162921982 162921858 162922094 10563119890 minus

022

AMO 163597965 163199935 164005533 300025 6

IAMO 162921855 162921849 162921862 50627119890 minus

041

value of IAMO obtained is 17182540 in solving Art1 whileABC and CABC obtained 17185496 and 17184434 andIAMO gives 4 orders of magnitude better than ABC andCABC Same to solving Art2 IAMOobtained 5139035 whileCPSO ABC and CABC obtained 5139046 5139037 and5139037 respectively but the standard deviation of IAMOis at least 2 orders of magnitude better than them For Irisdata set the mean value the optimum value and the worstvalue of IAMO are all 966555 and the standard deviationis 12155119890 minus 06 which revealed the robustness of IAMOCABC also sought the best solution 966555 but the standarddeviation is bigger than IAMO when the best solutions ofAMO PSO CPSO ABC and 119896-means are 970751 966567966580 966566 and 994582 respectively Table 4 shows theresults of algorithms on the TAE data set The mean value

Table 6 Results obtained by the algorithms for 20 different runs onseeds data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 3134977 3131428 3137343 26879119890 minus

015

PSO 3265250 3183185 3352944 60131 7CPSO 3121138 3119116 3123788 02899 4

ABC 3120382 3118520 3122110 67210119890 minus

023

CABC 3117980 3117980 3117982 14865119890 minus

042

AMO 3193922 3138100 3279267 31572 6

IAMO 3117980 3117980 3117980 33686119890 minus

051

Table 7 Results obtained by the algorithms for 20 different runs onStatLog (heart) data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 106957851 106825524 107038544 82080 6PSO 107600684 106448965 110242641 259460 7CPSO 106511354 106242168 107477609 550264 4ABC 106274760 106267154 106296472 10354 3

CABC 106229904 106229824 106236762 17830119890 minus

022

AMO 106751758 106586325 106961824 176918 5

IAMO 106229824 106229824 106229825 50093119890 minus

051

Table 8 Results obtained by the algorithms for 20 different runs onHagermanrsquos survival data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 26401277 26105245 31805211 1267500 7PSO 25674479 25669899 25691620 09587 6CPSO 25673233 25669889 25678257 04578 5

ABC 25669892 25669888 25669895 23919119890 minus

042

CABC 25669903 25669888 25669955 28709119890 minus

033

AMO 25669985 25669907 25670096 91598119890 minus

034

IAMO 25669888 25669888 25669888 23022119890 minus

061

of IAMO is 14910900 which is smaller than that of AMOPSO CPSO ABC CABC and 119896-means within 20 runs Forwine data set IAMOreached themean value 162921855whileCABC reached themean value 162921982The best value andworst value of IAMO are 162921849 and 162921862 whichare also better than 162921858 and 162922094 obtained byCABC and the standard deviation value of IAMO is also thesmallest one Table 6 provides the results of algorithms on the

8 Discrete Dynamics in Nature and Society

Table 9 Results obtained by the algorithms for 20 different runs onbalance scale data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 14267522 14238570 14338423 31208 5PSO 14301546 14264237 14476403 93451 7CPSO 14248260 14235525 14255460 09048 4

ABC 14239238 14238308 14259821 45452119890 minus

023

CABC 14239109 14238206 14252445 14053119890 minus

022

AMO 14291499 14279670 14308177 11641 6

IAMO 14238204 14238204 14238204 49763119890 minus

061

Table 10 Results obtained by the algorithms for 20 different runson cancer data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 29812564 29764687 29884277 48661 5PSO 30012685 29697475 31143658 639370 6CPSO 29648268 29644167 29652941 03773 3ABC 29655369 29649734 29664785 08081 4

CABC 29644138 29643870 29645222 60177119890 minus

022

AMO 30016328 29740974 30509651 327322 7

IAMO 29643870 29643870 29643870 22260119890 minus

051

seeds data set the IAMO algorithm and CABC algorithm aresuperior to those obtained by the others Although IAMOandCABC reached the same mean value 3117980 the standarddeviation of IAMO is 1 order ofmagnitude better thanCABCOn StatLog (heart) data set results given in Table 7 IAMOgets the best value is 106229824 and the same as CABC whilethe mean values of the two algorithms are 106229824 and106229904 so the IAMO is better than CABC algorithm ForHagermanrsquos survival data set the optimum value 25669888can be obtained by IAMO ABC and CABC but the standarddeviations ofABCandCABCare 23919119890minus04 and 28709119890minus03

which is worse than that of 23022119890 minus 06 obtained by IAMOThe standard deviation of PSO is a little bigger than that ofCPSO For balance scale data set in Table 9 as seen fromthe results the mean best and worst ones are all 14238204which reflect the stable characteristics of IAMO The threebest algorithms in this test data are IAMO CABC and ABCand the best results of them are 14238204 14238206 and14238308 For Wisconsin breast cancer data set in Table 10the mean value the best value and the worst value are all29643870 which are obviously superior to 119896-means PSOCPSO ABC and AMO

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

2400

2500

Evolvement generation

Fitn

ess v

alue

Art1 data

PSOCPSOABC

CABCAMOIAMO

Figure 5 The convergence curve of the Art1 data

0 10 20 30 40 50 60 70 80 90 100500

550

600

650

700

750

800

850

900

Evolvement generation

Fitn

ess v

alue

Art2 data

PSOCPSOABC

CABCAMOIAMO

Figure 6 The convergence curve of the Art2 data

As seen from Table 1 to Table 10 we can conclude thatalthough the convergence rate is not quick enough at thebeginning of the iteration compared to ABC and CABC thefinal results are the best compared to other algorithms in alltest data sets The most results of ABC and CABC are betterthan PSO and CPSO and the 119896-means algorithm is the worstfor most of test data sets

Figures 5 6 7 8 9 10 11 12 13 and 14 show the con-vergence curves of different data sets for various algorithmsFigures 15 and 16 show the original data distribution of Irisdata set and the clustering result by IAMO algorithm

Discrete Dynamics in Nature and Society 9

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

110

112

Evolvement generation

Fitn

ess v

alue

Iris data

PSOCPSOABC

CABCAMOIAMO

Figure 7 The convergence curve of the Iris data

0 10 20 30 40 50 60 70 80 90 1001480

1500

1520

1540

1560

1580

1600

1620

1640

Evolvement generation

Fitn

ess v

alue

TAE data

PSOCPSOABC

CABCAMOIAMO

Figure 8 The convergence curve of the TAE data

7 Living Area Radius Evaluation

The performance and results of the proposed algorithms aregreatly affected by the size of living area At the beginning ofthe iteration a big value of 119877 improves the exploration abilityof the algorithm and at the end of iteration a small valueof 119877 improves the exploitation ability of the algorithm Weadopted a fixed shrinking coefficient 120588 = 092 to change theliving area radius after each iteration as shown in formula (6)To study the extent of 119877 impacts on the proposed algorithmwe selected Art1 data set and Iris data set using different 120588 toevaluate the performance of the proposed algorithm

0 10 20 30 40 50 60 70 80 90 100162

164

166

168

17

172

174

176

Evolvement generation

Fitn

ess v

alue

Wine datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 9 The convergence curve of the wine data

0 10 20 30 40 50 60 70 80 90 100310

320

330

340

350

360

370

380

390

Evolvement generation

Fitn

ess v

alue

Seeds data

PSOCPSOABC

CABCAMOIAMO

Figure 10 The convergence curve of the seeds data

Figure 17 shows the results of an experiment on Art1 wecan conclude that if we choose 120588 between 06 and 09 it hasa better convergence precision than that of 120588 = 099 or 120588 =

040 If we choose 120588 = 040 IAMO algorithm plunges intolocal optima and if we choose 120588 = 099 the IAMO algorithmhas a very low convergence rate And likewise in Figure 18for Iris test data set IAMO algorithm quickly converged atglobal optimum before 30 iterations if we choose 120588 = 080while IAMO could not escape from poor local optima and toglobal optimum if we choose 120588 = 070 120588 = 060 or 120588 = 040

10 Discrete Dynamics in Nature and Society

0 10 20 30 40 50 60 70 80 90 100106

108

11

112

114

116

118

Evolvement generation

Fitn

ess v

alue

Heart datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 11 The convergence curve of the heart data

0 10 20 30 40 50 60 70 80 90 1002560

2580

2600

2620

2640

2660

2680

2700

2720

2740

Evolvement generation

Fitn

ess v

alue

Survival data

PSOCPSOABC

CABCAMOIAMO

Figure 12 The convergence curve of the survival data

So the best 120588 for solving Iris data set must exist between 07and 099

The results suggest that a proper 120588 can greatly improve thealgorithm convergence velocity and convergence precisionand an improper 120588 may lead the IAMO fall into localoptimum

8 Conclusions

In this paper to improve the deficiencies of the AMO algo-rithm we improved the algorithm by using a new migration

0 10 20 30 40 50 60 70 80 90 1001420

1430

1440

1450

1460

1470

1480

1490

1500

1510

Evolvement generation

Fitn

ess v

alue

Balance scale data

PSOCPSOABC

CABCAMOIAMO

Figure 13 The convergence curve of the balance scale data

0 10 20 30 40 50 60 70 80 90 1002900

3000

3100

3200

3300

3400

3500

3600

Evolvement generation

Fitn

ess v

alue

Cancer data

PSOCPSOABC

CABCAMOIAMO

Figure 14 The convergence curve of the cancer data

method based on shrinking animals living area By 10 typicalstandard test data sets simulation the results show thatIAMO algorithm generally has strong global searching abilityand local optimization ability and can effectively avoid thedeficiencies that conventional algorithms easily fall into localoptimum IAMO has improved the convergence precision ofAMO and rank 1st in all test data sets therefore it is verypractical and effective to solve clustering problems At lasthow to define a proper and unified radius of living area needsto be considered in subsequent work

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R B Cattell ldquoThe description of personality basic traitsresolved into clustersrdquo Journal of Abnormal and Social Psychol-ogy vol 38 no 4 pp 476ndash506 1943

[2] K R Zalik ldquoAn efficient k-means clustering algorithmrdquo PatternRecognition Letters vol 29 no 8 pp 1385ndash1391 2008

[3] B Zhang M Hsu and U Dayal ldquoK-harmonic meansmdashadata clustering algorithmrdquo Tech Rep HPL-1999-124 Hewlett-Packard Laboratories 1999

[4] X-S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress 2008

[5] X-S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 IEEE December2009

[6] X-S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization vol 284of Studies in Computational Intelligence pp 65ndash74 SpringerBerlin Germany 2010

[7] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony

12 Discrete Dynamics in Nature and Society

(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[8] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995

[9] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing and Applications vol 22 no 6 pp 1239ndash1255 2013

[10] A H Gandomi X-S Yang and A H Alavi ldquoCuckoo searchalgorithm a metaheuristic approach to solve structural opti-mization problemsrdquo Engineering with Computers vol 29 no 1pp 17ndash35 2013

[11] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo Discrete Dy-namics in Nature and Society vol 2010 Article ID 459796 16pages 2010

[12] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and 119896-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[13] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[14] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[15] T Niknam J Olamaei and B Amiri ldquoA hybrid evolutionaryalgorithm based on ACO and SA for cluster analysisrdquo Journal ofApplied Sciences vol 8 no 15 pp 2695ndash2702 2008

[16] T Niknam B Bahmani Firouzi and M Nayeripour ldquoAnefficient hybrid evolutionary algorithm for cluster analysisrdquoWorld Applied Sciences Journal vol 4 no 2 pp 300ndash307 2008

[17] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[18] Y Kao and K Cheng An ACO-Based Clustering AlgorithmSpringer Berlin Germany 2006

[19] M Omran A P Engelbrecht and A Salman ldquoParticle swarmoptimization method for image clusteringrdquo International Jour-nal of Pattern Recognition and Artificial Intelligence vol 19 no3 pp 297ndash321 2005

[20] D Karaboga and C Ozturk ldquoA novel clustering approachArtificial BeeColony (ABC) algorithmrdquoApplied SoftComputingJournal vol 11 no 1 pp 652ndash657 2011

[21] K E Voges andNK L Pope ldquoRough clustering using an evolu-tionary algorithmrdquo in Proceedings of the 45th Hawaii Interna-tional Conference on System Sciences (HICSS rsquo12) pp 1138ndash1145IEEE January 2012

[22] A Colorni M Dorigo and V Maniezzo Distributed Optimiza-tion by Ant Colonies Elsevier Publishing Paris France 1991

[23] D W van der Merwe and A P Engelbrecht ldquoData clusteringusing particle swarm optimizationrdquo in Proceedings of the Con-gress on EvolutionaryComputation (CEC rsquo03) vol 1 pp 215ndash220Canberra Australia December 2003

[24] E H L Aarts and J H Korst Simulated Annealing andBoltzmann Machines John Wiley amp Sons 1989

[25] D Karaboga ldquoAn idea based on honey bee swarm for numer-ical optimizationrdquo Tech Rep TR06 Erciyes University PressErciyes Turkey 2005

[26] X Chen Y Zhou and Q Luo ldquoA hybrid monkey search algo-rithm for clustering analysisrdquo The Scientific World Journal vol2014 Article ID 938239 16 pages 2014

[27] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications vol 24 no 7-8 pp 1867ndash1877 2014

[28] J MacQueen ldquoSome methods for classification and analysis ofmultivariate observationsrdquo in Proceedings of the Fifth BerkeleySymposium on Mathematical Statistics and Probability Volume1 Statistics pp 281ndash297 University of California Press BerkeleyCalif USA 1967

[29] X Chen and J Zhang ldquoClustering algorithmbased on improvedparticle swarmoptimizationrdquo Journal of Computer Research andDevelopment pp 287ndash291 2012

[30] X Liu Q Sha Y Liu and X Duan ldquoAnalysis of classificationusing particle swarm optimizationrdquo Computer Engineering vol32 no 6 pp 201ndash213 2006

[31] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[32] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[33] C L Blake andC JMerz UCI Repository ofMachine LearningDatabases httparchiveicsuciedumldatasetshtml

[34] E Anderson ldquoThe irises of the gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[35] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 part 2 Article ID 179188pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 5: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

Discrete Dynamics in Nature and Society 5

i i minus 2

i minus 1

i + 1

i + 2

middot middot middot

Figure 1 The concept of the neighborhood of an animal

survival balance scale and Wisconsin breast cancer) whichare selected from the UCI machine learning repository [33]

Artificial Data Set One (119873 = 250 119889 = 3 and 119870 = 5) This is athree-featured problem with five classes where every featureof the classes was distributed according to Class 1-Uniform(85 100) Class 2-Uniform (70 85) Class 3-Uniform (55 70)Class 4-Uniform (40 55) and Class 5-Uniform (25 40) [1214] The data set is illustrated in Figure 3

Artificial Data Set Two (119873 = 600 119889 = 2 and 119870 = 4) This is atwo-featured problemwith four unique classes A total of 600patterns were drawn from four independent bivariate normaldistributions where classes were distributed according to

1198732(120583 = (

119898119894

119898119894

) Σ = [[05 005

005 05]]) (9)

where 119894 = 1 2 3 4 gt 4 1198981

= minus3 1198982

= 0 1198983

= 3 and1198984= 6120583 and Σ are mean vector and covariance matrix respec-

tively [12 14] The data set is illustrated in Figure 4

Iris Data (119873 = 150 119889 = 4 and 119870 = 3) This data set with150 random samples of flowers from the Iris species setosaversicolor and virginica collected by Anderson [34] Fromeach species there are 50 observations for sepal length sepalwidth petal length and petal width in cm This data set wasused by Fisher [35] in his initiation of the linear-discriminant-function technique [11 12 33]

Teaching Assistant Evaluation (119873 = 151 119889 = 5 and 119870 =

3) The data consist of evaluations of teaching performanceover three regular semesters and two summer semestersof 151 teaching assistant (TA) assignments at the StatisticsDepartment of the University of Wisconsin-Madison Thescores were divided into 3 roughly equal-sized categories(ldquolowrdquo ldquomediumrdquo and ldquohighrdquo) to form the class variable [33]

Wine Data (119873 = 178 119889 = 13 and 119870 = 3) This isthe wine data set which is also taken from MCI laboratoryThese data are the results of a chemical analysis of winesgrown in the same region in Italy but derived from three

different cultivars The analysis determined the quantities of13 constituents found in each of the three types of winesThere are 178 instanceswith 13 numeric attributes inwine dataset All attributes are continuousThere is nomissing attributevalue [11 12 33]

Seeds Data (119873 = 210 119889 = 7 and119870 = 3)This data set consistsof 210 patterns belonging to three different varieties of wheatKama Rosa and Canadian From each species there are 70observations for area 119860 perimeter 119875 compactness 119862 (119862 =

4 lowast 119901119894 lowast 1198601198752) length of kernel width of kernel asymmetry

coefficient and length of kernel groove [33]

StatLog (Heart) Data (119873 = 270 119889 = 13 and119870 = 2)This dataset is a heart disease database similar to a database alreadypresent in the repository (heart disease databases) but in aslightly different form [33]

Hagermanrsquos Survival (119873 = 306 119889 = 3 and 119870 = 2) The dataset contains cases from a study that was conducted between1958 and 1970 at the University of Chicagorsquos Billings Hospitalon the survival of patients who had undergone surgery forbreast cancer It records two survival status patients with theage of patient at time of operation patientrsquos year of operationand number of positive axillary nodes detected [33]

Balance Scale Data (119873 = 625 119889 = 4 and119870 = 3)This data setwas generated to model psychological experimental resultsEach example is classified as having the balance scale tip to theright to the left or balancedThe attributes are the leftweightthe left distance the right weight and the right distance Thecorrect way to find the class is the greater of (left-distance lowast

left-weight) and (right-distance lowast right-weight) If they areequal it is balanced [33]

Wisconsin Breast Cancer (119873 = 683 119889 = 9 and 119870 =

2) It consists of 683 objects characterized by nine featuresclump thickness cell size uniformity cell shape uniformitymarginal adhesion single epithelial cell size bare nucleibland chromatin normal nucleoli andmitosesThere are twocategories in the data malignant (444 objects) and benign(239 objects) [11 12 33]

Herewe set the parameters of AMOand IAMOas followsThe population size of the AMO and IAMO is 100 In IAMOthe original living area radius 119877 = 03(119887 minus 119886) and shrinkagecoefficient 120588 = 092 For the PSO inertia weight 119908 = 0729acceleration coefficients 1198881 = 2 1198882 = 2 and population size119872 = 100 The population size of the CPSO is 20 The popu-lation size of the ABC and CABC are 50 and 10 respectivelyIn order to compare with other algorithms the maximumgenerations of all algorithms are 100

For every data set each algorithm is applied 20 times indi-vidually with random initial solution For the Art1 and Art2data set once the randomly generated parameters are deter-mined the same parameters are used to test the perform-ance of three algorithms We ranked each algorithm accord-ing to the mean result The results are kept four digits afterthe decimal point The mean value the best value the worstvalue the standard deviation and the rank value are recordedin Tables 1 2 3 4 5 6 7 8 9 and 10

6 Discrete Dynamics in Nature and Society

(a) The 119866th iteration living area (b) Animals begin to migrate (c) The 119866+ 1th iteration living area

Figure 2 Animals migration process

0

50

100

050

10020

40

60

80

100

Art1 data distribution

Figure 3 The distribution image of Art1

minus5 0 5 10minus6

minus4

minus2

0

2

4

6

8

10 Art2 data distribution

Figure 4 The distribution image of Art2

Table 1 Results obtained by the algorithms for 20 different runs onArt1 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 21066059 17207627 24560468 3588518 7PSO 19913796 17216508 24481993 1927566 5CPSO 18605699 17186937 24175543 3057752 4ABC 17185496 17182939 17189832 01955 3CABC 17184434 17182544 17203302 05488 2AMO 20621954 19744561 21991275 548921 6

IAMO 17182540 17182538 17182540 10990119890 minus

051

Table 2: Results obtained by the algorithms for 20 different runs on Art2 data.

Algorithm   Mean       Best       Worst      Standard deviation   Rank
k-means     644.2100   514.6623   896.4104   181.1884             7
PSO         514.7965   513.9191   523.8308   2.3765               5
CPSO        513.9046   513.9035   513.9085   1.3621e-03           4
ABC         513.9037   513.9035   513.9045   3.2720e-04           3
CABC        513.9037   513.9035   513.9696   1.5235e-04           2
AMO         525.0879   516.1389   546.2325   8.4842               6
IAMO        513.9035   513.9035   513.9035   9.3111e-06           1

Tables 1-10 show that IAMO is more precise than the other algorithms in solving the ten data sets. As seen from the results, the IAMO algorithm provides the best values and small standard deviations in comparison with the other methods. For the Art1 and Art2 data sets in Tables 1 and 2, which were randomly generated, IAMO obtained the best mean and the smallest standard deviation of all the algorithms.


Table 3: Results obtained by the algorithms for 20 different runs on Iris data.

Algorithm   Mean       Best      Worst      Standard deviation   Rank
k-means     102.8412   99.4582   123.4678   8.7805               7
PSO         98.3345    96.6567   104.2224   2.2431               5
CPSO        96.9721    96.6580   97.5211    2.966e-01            4
ABC         96.6659    96.6566   96.7547    2.1388e-02           3
CABC        96.6561    96.6555   96.6599    1.1685e-03           2
AMO         99.0055    97.0751   100.5484   1.1202               6
IAMO        96.6555    96.6555   96.6555    1.2155e-06           1

Table 4: Results obtained by the algorithms for 20 different runs on TAE data.

Algorithm   Mean        Best        Worst       Standard deviation   Rank
k-means     1533.5441   1503.9418   1605.1206   29.7491              7
PSO         1501.5895   1490.2455   1529.2453   16.3170              6
CPSO        1499.8073   1492.1980   1532.4523   17.4859              5
ABC         1491.4434   1490.9775   1492.4754   0.5128               3
CABC        1491.3099   1490.9276   1497.3575   2.7356               2
AMO         1499.0215   1493.3564   1509.4512   6.8524               4
IAMO        1491.0900   1490.9321   1492.5707   0.4482               1

Table 5: Results obtained by the algorithms for 20 different runs on wine data.

Algorithm   Mean         Best         Worst        Standard deviation   Rank
k-means     17853.2412   16374.4353   18484.3458   1003.6327            7
PSO         16307.0584   16302.8364   16314.0563   5.8844               5
CPSO        16304.9829   16299.0526   16312.1291   5.9887               3
ABC         16305.6668   16299.1970   16321.9535   9.8439               4
CABC        16292.1982   16292.1858   16292.2094   1.0563e-02           2
AMO         16359.7965   16319.9935   16400.5533   30.0025              6
IAMO        16292.1855   16292.1849   16292.1862   5.0627e-04           1

The mean value obtained by IAMO in solving Art1 is 1718.2540, while ABC and CABC obtained 1718.5496 and 1718.4434, and the standard deviation of IAMO is about four orders of magnitude smaller than those of ABC and CABC. Likewise, in solving Art2, IAMO obtained 513.9035, while CPSO, ABC, and CABC obtained 513.9046, 513.9037, and 513.9037, respectively, and the standard deviation of IAMO is more than an order of magnitude smaller than theirs. For the Iris data set, the mean, best, and worst values of IAMO are all 96.6555 and the standard deviation is 1.2155e-06, which reveals the robustness of IAMO. CABC also found the best solution 96.6555, but its standard deviation is larger than that of IAMO, while the best solutions of AMO, PSO, CPSO, ABC, and k-means are 97.0751, 96.6567, 96.6580, 96.6566, and 99.4582, respectively. Table 4 shows the results of the algorithms on the TAE data set.

Table 6: Results obtained by the algorithms for 20 different runs on seeds data.

Algorithm   Mean       Best       Worst      Standard deviation   Rank
k-means     313.4977   313.1428   313.7343   2.6879e-01           5
PSO         326.5250   318.3185   335.2944   6.0131               7
CPSO        312.1138   311.9116   312.3788   0.2899               4
ABC         312.0382   311.8520   312.2110   6.7210e-02           3
CABC        311.7980   311.7980   311.7982   1.4865e-04           2
AMO         319.3922   313.8100   327.9267   3.1572               6
IAMO        311.7980   311.7980   311.7980   3.3686e-05           1

Table 7: Results obtained by the algorithms for 20 different runs on StatLog (heart) data.

Algorithm   Mean         Best         Worst        Standard deviation   Rank
k-means     10695.7851   10682.5524   10703.8544   8.2080               6
PSO         10760.0684   10644.8965   11024.2641   25.9460              7
CPSO        10651.1354   10624.2168   10747.7609   55.0264              4
ABC         10627.4760   10626.7154   10629.6472   1.0354               3
CABC        10622.9904   10622.9824   10623.6762   1.7830e-02           2
AMO         10675.1758   10658.6325   10696.1824   17.6918              5
IAMO        10622.9824   10622.9824   10622.9825   5.0093e-05           1

Table 8: Results obtained by the algorithms for 20 different runs on Haberman's survival data.

Algorithm   Mean        Best        Worst       Standard deviation   Rank
k-means     2640.1277   2610.5245   3180.5211   126.7500             7
PSO         2567.4479   2566.9899   2569.1620   0.9587               6
CPSO        2567.3233   2566.9889   2567.8257   0.4578               5
ABC         2566.9892   2566.9888   2566.9895   2.3919e-04           2
CABC        2566.9903   2566.9888   2566.9955   2.8709e-03           3
AMO         2566.9985   2566.9907   2567.0096   9.1598e-03           4
IAMO        2566.9888   2566.9888   2566.9888   2.3022e-06           1

On the TAE data set, the mean value of IAMO is 1491.0900, which is smaller than those of AMO, PSO, CPSO, ABC, CABC, and k-means over the 20 runs. For the wine data set, IAMO reached the mean value 16292.1855, while CABC reached 16292.1982. The best and worst values of IAMO are 16292.1849 and 16292.1862, which are also better than the 16292.1858 and 16292.2094 obtained by CABC, and the standard deviation of IAMO is again the smallest. Table 6 provides the results of the algorithms on the seeds data set.


Table 9: Results obtained by the algorithms for 20 different runs on balance scale data.

Algorithm   Mean        Best        Worst       Standard deviation   Rank
k-means     1426.7522   1423.8570   1433.8423   3.1208               5
PSO         1430.1546   1426.4237   1447.6403   9.3451               7
CPSO        1424.8260   1423.5525   1425.5460   0.9048               4
ABC         1423.9238   1423.8308   1425.9821   4.5452e-02           3
CABC        1423.9109   1423.8206   1425.2445   1.4053e-02           2
AMO         1429.1499   1427.9670   1430.8177   1.1641               6
IAMO        1423.8204   1423.8204   1423.8204   4.9763e-06           1

Table 10: Results obtained by the algorithms for 20 different runs on cancer data.

Algorithm   Mean        Best        Worst       Standard deviation   Rank
k-means     2981.2564   2976.4687   2988.4277   4.8661               5
PSO         3001.2685   2969.7475   3114.3658   63.9370              6
CPSO        2964.8268   2964.4167   2965.2941   0.3773               3
ABC         2965.5369   2964.9734   2966.4785   0.8081               4
CABC        2964.4138   2964.3870   2964.5222   6.0177e-02           2
AMO         3001.6328   2974.0974   3050.9651   32.7322              7
IAMO        2964.3870   2964.3870   2964.3870   2.2260e-05           1

On the seeds data set, the results obtained by the IAMO and CABC algorithms are superior to those obtained by the others. Although IAMO and CABC reached the same mean value 311.7980, the standard deviation of IAMO is one order of magnitude better than that of CABC. For the StatLog (heart) data set in Table 7, IAMO attains the best value 10622.9824, the same as CABC, but the mean values of the two algorithms are 10622.9824 and 10622.9904, so IAMO is better than the CABC algorithm. For the Haberman's survival data set, the optimum value 2566.9888 is obtained by IAMO, ABC, and CABC, but the standard deviations of ABC and CABC are 2.3919e-04 and 2.8709e-03, which are worse than the 2.3022e-06 obtained by IAMO; the standard deviation of PSO is a little bigger than that of CPSO. For the balance scale data set in Table 9, the mean, best, and worst values of IAMO are all 1423.8204, which reflects the stability of IAMO; the three best algorithms on this data set are IAMO, CABC, and ABC, whose best results are 1423.8204, 1423.8206, and 1423.8308. For the Wisconsin breast cancer data set in Table 10, the mean, best, and worst values of IAMO are all 2964.3870, which is clearly superior to k-means, PSO, CPSO, ABC, and AMO.

Figure 5: The convergence curve of the Art1 data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Figure 6: The convergence curve of the Art2 data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

From Tables 1-10 we can conclude that, although the convergence rate of IAMO at the beginning of the iteration is not as quick as that of ABC and CABC, its final results are the best among the compared algorithms on all test data sets. Most results of ABC and CABC are better than those of PSO and CPSO, and the k-means algorithm is the worst on most of the test data sets.

Figures 5-14 show the convergence curves of the various algorithms on the different data sets. Figures 15 and 16 show the original data distribution of the Iris data set and the clustering result obtained by the IAMO algorithm.


Figure 7: The convergence curve of the Iris data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Figure 8: The convergence curve of the TAE data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

7. Living Area Radius Evaluation

The performance and results of the proposed algorithm are greatly affected by the size of the living area. At the beginning of the iteration, a large value of R improves the exploration ability of the algorithm, and at the end of the iteration, a small value of R improves its exploitation ability. We adopted a fixed shrinking coefficient ρ = 0.92 to change the living area radius after each iteration, as shown in formula (6). To study the extent to which R impacts the proposed algorithm, we selected the Art1 and Iris data sets and used different values of ρ to evaluate the performance of the proposed algorithm.
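A minimal sketch of this schedule, assuming formula (6) is the geometric update R ← ρ·R applied once per generation (formula (6) itself is defined earlier in the paper and not reproduced here):

def radius_schedule(a, b, rho=0.92, generations=100):
    # Initial living-area radius R = 0.3 * (b - a), as set in the
    # experiments, shrunk by the fixed coefficient rho each iteration.
    R = 0.3 * (b - a)
    for _ in range(generations):
        yield R
        R *= rho

With ρ = 0.92, the radius decays to roughly 0.92^100 ≈ 2.4e-4 of its initial value over the 100 generations used here, which is what shifts the search from exploration toward exploitation.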

Figure 9: The convergence curve of the wine data (fitness value, ×10^4, versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Figure 10: The convergence curve of the seeds data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Figure 17 shows the results of an experiment on Art1. We can conclude that choosing ρ between 0.6 and 0.9 gives better convergence precision than ρ = 0.99 or ρ = 0.40: with ρ = 0.40 the IAMO algorithm plunges into local optima, and with ρ = 0.99 it converges very slowly. Likewise, in Figure 18 for the Iris test data set, the IAMO algorithm quickly converges to the global optimum within 30 iterations when ρ = 0.80, while it cannot escape from poor local optima and reach the global optimum when ρ = 0.70, ρ = 0.60, or ρ = 0.40.


Figure 11: The convergence curve of the heart data (fitness value, ×10^4, versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Figure 12: The convergence curve of the survival data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

So the best ρ for solving the Iris data set must lie between 0.7 and 0.99.

The results suggest that a proper ρ can greatly improve the convergence velocity and convergence precision of the algorithm, while an improper ρ may make IAMO fall into a local optimum.
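The sensitivity experiment behind Figures 17 and 18 amounts to a simple sweep over ρ; in the sketch below, run_iamo is a hypothetical handle that runs IAMO once and returns its convergence curve (best fitness per generation).

# Sweep the shrinkage coefficient as in Figures 17 and 18.
for rho in (0.99, 0.90, 0.80, 0.70, 0.60, 0.40):
    curve = run_iamo(data="Iris", rho=rho, generations=100)
    print(f"rho = {rho:.2f}: final fitness = {curve[-1]:.4f}")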

8. Conclusions

In this paper, to remedy the deficiencies of the AMO algorithm, we improved it with a new migration method based on shrinking the animals' living area.

Figure 13: The convergence curve of the balance scale data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Figure 14: The convergence curve of the cancer data (fitness value versus evolvement generation) for PSO, CPSO, ABC, CABC, AMO, and IAMO.

Simulations on 10 typical standard test data sets show that the IAMO algorithm generally has strong global search ability and local optimization ability and can effectively avoid the tendency of conventional algorithms to fall into local optima. IAMO improves the convergence precision of AMO and ranks first on all test data sets; it is therefore practical and effective for solving clustering problems. How to define a proper and unified radius of the living area remains to be considered in subsequent work.


Figure 15: The Iris data distribution.

Figure 16: The Iris data clustering result.

Figure 17: The convergence curve of the Art1 data with different ρ (ρ = 0.99, 0.90, 0.80, 0.70, 0.60, and 0.40; fitness value versus evolvement generation).

Figure 18: The convergence curve of the Iris data with different ρ (ρ = 0.99, 0.90, 0.80, 0.70, 0.60, and 0.40; fitness value versus evolvement generation).

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Foundation of China under Grant nos. 61165015 and 61463007, the Key Project of Guangxi Science Foundation under Grant no. 2012GXNSFDA053028, and the Key Project of Guangxi High School Science Foundation under Grant no. 20121ZD008.

References

[1] R. B. Cattell, "The description of personality: basic traits resolved into clusters," Journal of Abnormal and Social Psychology, vol. 38, no. 4, pp. 476–506, 1943.
[2] K. R. Zalik, "An efficient k-means clustering algorithm," Pattern Recognition Letters, vol. 29, no. 8, pp. 1385–1391, 2008.
[3] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means - a data clustering algorithm," Tech. Rep. HPL-1999-124, Hewlett-Packard Laboratories, 1999.
[4] X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2008.
[5] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, December 2009.
[6] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization, vol. 284 of Studies in Computational Intelligence, pp. 65–74, Springer, Berlin, Germany, 2010.
[7] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
[8] R. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[9] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.
[10] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[11] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[12] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.
[13] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.
[14] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.
[15] T. Niknam, J. Olamaei, and B. Amiri, "A hybrid evolutionary algorithm based on ACO and SA for cluster analysis," Journal of Applied Sciences, vol. 8, no. 15, pp. 2695–2702, 2008.
[16] T. Niknam, B. Bahmani Firouzi, and M. Nayeripour, "An efficient hybrid evolutionary algorithm for cluster analysis," World Applied Sciences Journal, vol. 4, no. 2, pp. 300–307, 2008.
[17] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.
[18] Y. Kao and K. Cheng, An ACO-Based Clustering Algorithm, Springer, Berlin, Germany, 2006.
[19] M. Omran, A. P. Engelbrecht, and A. Salman, "Particle swarm optimization method for image clustering," International Journal of Pattern Recognition and Artificial Intelligence, vol. 19, no. 3, pp. 297–321, 2005.
[20] D. Karaboga and C. Ozturk, "A novel clustering approach: Artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.
[21] K. E. Voges and N. K. L. Pope, "Rough clustering using an evolutionary algorithm," in Proceedings of the 45th Hawaii International Conference on System Sciences (HICSS '12), pp. 1138–1145, IEEE, January 2012.
[22] A. Colorni, M. Dorigo, and V. Maniezzo, Distributed Optimization by Ant Colonies, Elsevier Publishing, Paris, France, 1991.
[23] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, Canberra, Australia, December 2003.
[24] E. H. L. Aarts and J. H. Korst, Simulated Annealing and Boltzmann Machines, John Wiley & Sons, 1989.
[25] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University Press, Erciyes, Turkey, 2005.
[26] X. Chen, Y. Zhou, and Q. Luo, "A hybrid monkey search algorithm for clustering analysis," The Scientific World Journal, vol. 2014, Article ID 938239, 16 pages, 2014.
[27] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867–1877, 2014.
[28] J. MacQueen, "Some methods for classification and analysis of multivariate observations," in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pp. 281–297, University of California Press, Berkeley, Calif, USA, 1967.
[29] X. Chen and J. Zhang, "Clustering algorithm based on improved particle swarm optimization," Journal of Computer Research and Development, pp. 287–291, 2012.
[30] X. Liu, Q. Sha, Y. Liu, and X. Duan, "Analysis of classification using particle swarm optimization," Computer Engineering, vol. 32, no. 6, pp. 201–213, 2006.
[31] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[32] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[33] C. L. Blake and C. J. Merz, UCI Repository of Machine Learning Databases, http://archive.ics.uci.edu/ml/datasets.html.
[34] E. Anderson, "The irises of the Gaspé Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.
[35] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, part 2, pp. 179–188, 1936.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 6: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

6 Discrete Dynamics in Nature and Society

(a) The 119866th iteration living area (b) Animals begin to migrate (c) The 119866+ 1th iteration living area

Figure 2 Animals migration process

0

50

100

050

10020

40

60

80

100

Art1 data distribution

Figure 3 The distribution image of Art1

minus5 0 5 10minus6

minus4

minus2

0

2

4

6

8

10 Art2 data distribution

Figure 4 The distribution image of Art2

Table 1 Results obtained by the algorithms for 20 different runs onArt1 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 21066059 17207627 24560468 3588518 7PSO 19913796 17216508 24481993 1927566 5CPSO 18605699 17186937 24175543 3057752 4ABC 17185496 17182939 17189832 01955 3CABC 17184434 17182544 17203302 05488 2AMO 20621954 19744561 21991275 548921 6

IAMO 17182540 17182538 17182540 10990119890 minus

051

Table 2 Results obtained by the algorithms for 20 different runs onArt2 data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 6442100 5146623 8964104 1811884 7PSO 5147965 5139191 5238308 23765 5

CPSO 5139046 5139035 5139085 13621119890 minus

034

ABC 5139037 5139035 5139045 32720119890 minus

043

CABC 5139037 5139035 5139696 15235119890 minus

042

AMO 5250879 5161389 5462325 84842 6

IAMO 5139035 5139035 5139035 93111119890 minus

061

Tables 1ndash10 show that IAMO is very precise than otheralgorithms in solving the ten data sets As seen from theresults the IAMOalgorithmprovides the best value and smallstandard deviation in comparison with other methods Forthe Art1 and Art2 data set in Tables 1 and 2 which were ran-domly generated IAMOobtained the best mean and smalleststandard deviation compared to other algorithms The mean

Discrete Dynamics in Nature and Society 7

Table 3 Results obtained by the algorithms for 20 different runs onIris data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 1028412 994582 1234678 87805 7PSO 983345 966567 1042224 22431 5CPSO 969721 966580 975211 2966119890 minus 01 4ABC 966659 966566 967547 21388119890 minus 02 3CABC 966561 966555 966599 11685119890 minus 03 2AMO 990055 970751 1005484 11202 6IAMO 966555 966555 966555 12155119890 minus 06 1

Table 4 Results obtained by the algorithms for 20 different runs onTAE data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 15335441 15039418 16051206 297491 7PSO 15015895 14902455 15292453 163170 6CPSO 14998073 14921980 15324523 174859 5ABC 14914434 14909775 14924754 05128 3CABC 14913099 14909276 14973575 27356 2AMO 14990215 14933564 15094512 68524 4IAMO 14910900 14909321 14925707 04482 1

Table 5 Results obtained by the algorithms for 20 different runs onwine data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 178532412 163744353 184843458 10036327 7PSO 163070584 163028364 163140563 58844 5CPSO 163049829 162990526 163121291 59887 3ABC 163056668 162991970 163219535 98439 4

CABC 162921982 162921858 162922094 10563119890 minus

022

AMO 163597965 163199935 164005533 300025 6

IAMO 162921855 162921849 162921862 50627119890 minus

041

value of IAMO obtained is 17182540 in solving Art1 whileABC and CABC obtained 17185496 and 17184434 andIAMO gives 4 orders of magnitude better than ABC andCABC Same to solving Art2 IAMOobtained 5139035 whileCPSO ABC and CABC obtained 5139046 5139037 and5139037 respectively but the standard deviation of IAMOis at least 2 orders of magnitude better than them For Irisdata set the mean value the optimum value and the worstvalue of IAMO are all 966555 and the standard deviationis 12155119890 minus 06 which revealed the robustness of IAMOCABC also sought the best solution 966555 but the standarddeviation is bigger than IAMO when the best solutions ofAMO PSO CPSO ABC and 119896-means are 970751 966567966580 966566 and 994582 respectively Table 4 shows theresults of algorithms on the TAE data set The mean value

Table 6 Results obtained by the algorithms for 20 different runs onseeds data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 3134977 3131428 3137343 26879119890 minus

015

PSO 3265250 3183185 3352944 60131 7CPSO 3121138 3119116 3123788 02899 4

ABC 3120382 3118520 3122110 67210119890 minus

023

CABC 3117980 3117980 3117982 14865119890 minus

042

AMO 3193922 3138100 3279267 31572 6

IAMO 3117980 3117980 3117980 33686119890 minus

051

Table 7 Results obtained by the algorithms for 20 different runs onStatLog (heart) data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 106957851 106825524 107038544 82080 6PSO 107600684 106448965 110242641 259460 7CPSO 106511354 106242168 107477609 550264 4ABC 106274760 106267154 106296472 10354 3

CABC 106229904 106229824 106236762 17830119890 minus

022

AMO 106751758 106586325 106961824 176918 5

IAMO 106229824 106229824 106229825 50093119890 minus

051

Table 8 Results obtained by the algorithms for 20 different runs onHagermanrsquos survival data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 26401277 26105245 31805211 1267500 7PSO 25674479 25669899 25691620 09587 6CPSO 25673233 25669889 25678257 04578 5

ABC 25669892 25669888 25669895 23919119890 minus

042

CABC 25669903 25669888 25669955 28709119890 minus

033

AMO 25669985 25669907 25670096 91598119890 minus

034

IAMO 25669888 25669888 25669888 23022119890 minus

061

of IAMO is 14910900 which is smaller than that of AMOPSO CPSO ABC CABC and 119896-means within 20 runs Forwine data set IAMOreached themean value 162921855whileCABC reached themean value 162921982The best value andworst value of IAMO are 162921849 and 162921862 whichare also better than 162921858 and 162922094 obtained byCABC and the standard deviation value of IAMO is also thesmallest one Table 6 provides the results of algorithms on the

8 Discrete Dynamics in Nature and Society

Table 9 Results obtained by the algorithms for 20 different runs onbalance scale data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 14267522 14238570 14338423 31208 5PSO 14301546 14264237 14476403 93451 7CPSO 14248260 14235525 14255460 09048 4

ABC 14239238 14238308 14259821 45452119890 minus

023

CABC 14239109 14238206 14252445 14053119890 minus

022

AMO 14291499 14279670 14308177 11641 6

IAMO 14238204 14238204 14238204 49763119890 minus

061

Table 10 Results obtained by the algorithms for 20 different runson cancer data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 29812564 29764687 29884277 48661 5PSO 30012685 29697475 31143658 639370 6CPSO 29648268 29644167 29652941 03773 3ABC 29655369 29649734 29664785 08081 4

CABC 29644138 29643870 29645222 60177119890 minus

022

AMO 30016328 29740974 30509651 327322 7

IAMO 29643870 29643870 29643870 22260119890 minus

051

seeds data set the IAMO algorithm and CABC algorithm aresuperior to those obtained by the others Although IAMOandCABC reached the same mean value 3117980 the standarddeviation of IAMO is 1 order ofmagnitude better thanCABCOn StatLog (heart) data set results given in Table 7 IAMOgets the best value is 106229824 and the same as CABC whilethe mean values of the two algorithms are 106229824 and106229904 so the IAMO is better than CABC algorithm ForHagermanrsquos survival data set the optimum value 25669888can be obtained by IAMO ABC and CABC but the standarddeviations ofABCandCABCare 23919119890minus04 and 28709119890minus03

which is worse than that of 23022119890 minus 06 obtained by IAMOThe standard deviation of PSO is a little bigger than that ofCPSO For balance scale data set in Table 9 as seen fromthe results the mean best and worst ones are all 14238204which reflect the stable characteristics of IAMO The threebest algorithms in this test data are IAMO CABC and ABCand the best results of them are 14238204 14238206 and14238308 For Wisconsin breast cancer data set in Table 10the mean value the best value and the worst value are all29643870 which are obviously superior to 119896-means PSOCPSO ABC and AMO

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

2400

2500

Evolvement generation

Fitn

ess v

alue

Art1 data

PSOCPSOABC

CABCAMOIAMO

Figure 5 The convergence curve of the Art1 data

0 10 20 30 40 50 60 70 80 90 100500

550

600

650

700

750

800

850

900

Evolvement generation

Fitn

ess v

alue

Art2 data

PSOCPSOABC

CABCAMOIAMO

Figure 6 The convergence curve of the Art2 data

As seen from Table 1 to Table 10 we can conclude thatalthough the convergence rate is not quick enough at thebeginning of the iteration compared to ABC and CABC thefinal results are the best compared to other algorithms in alltest data sets The most results of ABC and CABC are betterthan PSO and CPSO and the 119896-means algorithm is the worstfor most of test data sets

Figures 5 6 7 8 9 10 11 12 13 and 14 show the con-vergence curves of different data sets for various algorithmsFigures 15 and 16 show the original data distribution of Irisdata set and the clustering result by IAMO algorithm

Discrete Dynamics in Nature and Society 9

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

110

112

Evolvement generation

Fitn

ess v

alue

Iris data

PSOCPSOABC

CABCAMOIAMO

Figure 7 The convergence curve of the Iris data

0 10 20 30 40 50 60 70 80 90 1001480

1500

1520

1540

1560

1580

1600

1620

1640

Evolvement generation

Fitn

ess v

alue

TAE data

PSOCPSOABC

CABCAMOIAMO

Figure 8 The convergence curve of the TAE data

7 Living Area Radius Evaluation

The performance and results of the proposed algorithms aregreatly affected by the size of living area At the beginning ofthe iteration a big value of 119877 improves the exploration abilityof the algorithm and at the end of iteration a small valueof 119877 improves the exploitation ability of the algorithm Weadopted a fixed shrinking coefficient 120588 = 092 to change theliving area radius after each iteration as shown in formula (6)To study the extent of 119877 impacts on the proposed algorithmwe selected Art1 data set and Iris data set using different 120588 toevaluate the performance of the proposed algorithm

0 10 20 30 40 50 60 70 80 90 100162

164

166

168

17

172

174

176

Evolvement generation

Fitn

ess v

alue

Wine datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 9 The convergence curve of the wine data

0 10 20 30 40 50 60 70 80 90 100310

320

330

340

350

360

370

380

390

Evolvement generation

Fitn

ess v

alue

Seeds data

PSOCPSOABC

CABCAMOIAMO

Figure 10 The convergence curve of the seeds data

Figure 17 shows the results of an experiment on Art1 wecan conclude that if we choose 120588 between 06 and 09 it hasa better convergence precision than that of 120588 = 099 or 120588 =

040 If we choose 120588 = 040 IAMO algorithm plunges intolocal optima and if we choose 120588 = 099 the IAMO algorithmhas a very low convergence rate And likewise in Figure 18for Iris test data set IAMO algorithm quickly converged atglobal optimum before 30 iterations if we choose 120588 = 080while IAMO could not escape from poor local optima and toglobal optimum if we choose 120588 = 070 120588 = 060 or 120588 = 040

10 Discrete Dynamics in Nature and Society

0 10 20 30 40 50 60 70 80 90 100106

108

11

112

114

116

118

Evolvement generation

Fitn

ess v

alue

Heart datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 11 The convergence curve of the heart data

0 10 20 30 40 50 60 70 80 90 1002560

2580

2600

2620

2640

2660

2680

2700

2720

2740

Evolvement generation

Fitn

ess v

alue

Survival data

PSOCPSOABC

CABCAMOIAMO

Figure 12 The convergence curve of the survival data

So the best 120588 for solving Iris data set must exist between 07and 099

The results suggest that a proper 120588 can greatly improve thealgorithm convergence velocity and convergence precisionand an improper 120588 may lead the IAMO fall into localoptimum

8 Conclusions

In this paper to improve the deficiencies of the AMO algo-rithm we improved the algorithm by using a new migration

0 10 20 30 40 50 60 70 80 90 1001420

1430

1440

1450

1460

1470

1480

1490

1500

1510

Evolvement generation

Fitn

ess v

alue

Balance scale data

PSOCPSOABC

CABCAMOIAMO

Figure 13 The convergence curve of the balance scale data

0 10 20 30 40 50 60 70 80 90 1002900

3000

3100

3200

3300

3400

3500

3600

Evolvement generation

Fitn

ess v

alue

Cancer data

PSOCPSOABC

CABCAMOIAMO

Figure 14 The convergence curve of the cancer data

method based on shrinking animals living area By 10 typicalstandard test data sets simulation the results show thatIAMO algorithm generally has strong global searching abilityand local optimization ability and can effectively avoid thedeficiencies that conventional algorithms easily fall into localoptimum IAMO has improved the convergence precision ofAMO and rank 1st in all test data sets therefore it is verypractical and effective to solve clustering problems At lasthow to define a proper and unified radius of living area needsto be considered in subsequent work

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R B Cattell ldquoThe description of personality basic traitsresolved into clustersrdquo Journal of Abnormal and Social Psychol-ogy vol 38 no 4 pp 476ndash506 1943

[2] K R Zalik ldquoAn efficient k-means clustering algorithmrdquo PatternRecognition Letters vol 29 no 8 pp 1385ndash1391 2008

[3] B Zhang M Hsu and U Dayal ldquoK-harmonic meansmdashadata clustering algorithmrdquo Tech Rep HPL-1999-124 Hewlett-Packard Laboratories 1999

[4] X-S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress 2008

[5] X-S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 IEEE December2009

[6] X-S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization vol 284of Studies in Computational Intelligence pp 65ndash74 SpringerBerlin Germany 2010

[7] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony

12 Discrete Dynamics in Nature and Society

(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[8] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995

[9] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing and Applications vol 22 no 6 pp 1239ndash1255 2013

[10] A H Gandomi X-S Yang and A H Alavi ldquoCuckoo searchalgorithm a metaheuristic approach to solve structural opti-mization problemsrdquo Engineering with Computers vol 29 no 1pp 17ndash35 2013

[11] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo Discrete Dy-namics in Nature and Society vol 2010 Article ID 459796 16pages 2010

[12] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and 119896-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[13] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[14] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[15] T Niknam J Olamaei and B Amiri ldquoA hybrid evolutionaryalgorithm based on ACO and SA for cluster analysisrdquo Journal ofApplied Sciences vol 8 no 15 pp 2695ndash2702 2008

[16] T Niknam B Bahmani Firouzi and M Nayeripour ldquoAnefficient hybrid evolutionary algorithm for cluster analysisrdquoWorld Applied Sciences Journal vol 4 no 2 pp 300ndash307 2008

[17] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[18] Y Kao and K Cheng An ACO-Based Clustering AlgorithmSpringer Berlin Germany 2006

[19] M Omran A P Engelbrecht and A Salman ldquoParticle swarmoptimization method for image clusteringrdquo International Jour-nal of Pattern Recognition and Artificial Intelligence vol 19 no3 pp 297ndash321 2005

[20] D Karaboga and C Ozturk ldquoA novel clustering approachArtificial BeeColony (ABC) algorithmrdquoApplied SoftComputingJournal vol 11 no 1 pp 652ndash657 2011

[21] K E Voges andNK L Pope ldquoRough clustering using an evolu-tionary algorithmrdquo in Proceedings of the 45th Hawaii Interna-tional Conference on System Sciences (HICSS rsquo12) pp 1138ndash1145IEEE January 2012

[22] A Colorni M Dorigo and V Maniezzo Distributed Optimiza-tion by Ant Colonies Elsevier Publishing Paris France 1991

[23] D W van der Merwe and A P Engelbrecht ldquoData clusteringusing particle swarm optimizationrdquo in Proceedings of the Con-gress on EvolutionaryComputation (CEC rsquo03) vol 1 pp 215ndash220Canberra Australia December 2003

[24] E H L Aarts and J H Korst Simulated Annealing andBoltzmann Machines John Wiley amp Sons 1989

[25] D Karaboga ldquoAn idea based on honey bee swarm for numer-ical optimizationrdquo Tech Rep TR06 Erciyes University PressErciyes Turkey 2005

[26] X Chen Y Zhou and Q Luo ldquoA hybrid monkey search algo-rithm for clustering analysisrdquo The Scientific World Journal vol2014 Article ID 938239 16 pages 2014

[27] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications vol 24 no 7-8 pp 1867ndash1877 2014

[28] J MacQueen ldquoSome methods for classification and analysis ofmultivariate observationsrdquo in Proceedings of the Fifth BerkeleySymposium on Mathematical Statistics and Probability Volume1 Statistics pp 281ndash297 University of California Press BerkeleyCalif USA 1967

[29] X Chen and J Zhang ldquoClustering algorithmbased on improvedparticle swarmoptimizationrdquo Journal of Computer Research andDevelopment pp 287ndash291 2012

[30] X Liu Q Sha Y Liu and X Duan ldquoAnalysis of classificationusing particle swarm optimizationrdquo Computer Engineering vol32 no 6 pp 201ndash213 2006

[31] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[32] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[33] C L Blake andC JMerz UCI Repository ofMachine LearningDatabases httparchiveicsuciedumldatasetshtml

[34] E Anderson ldquoThe irises of the gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[35] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 part 2 Article ID 179188pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 7: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

Discrete Dynamics in Nature and Society 7

Table 3 Results obtained by the algorithms for 20 different runs onIris data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 1028412 994582 1234678 87805 7PSO 983345 966567 1042224 22431 5CPSO 969721 966580 975211 2966119890 minus 01 4ABC 966659 966566 967547 21388119890 minus 02 3CABC 966561 966555 966599 11685119890 minus 03 2AMO 990055 970751 1005484 11202 6IAMO 966555 966555 966555 12155119890 minus 06 1

Table 4 Results obtained by the algorithms for 20 different runs onTAE data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 15335441 15039418 16051206 297491 7PSO 15015895 14902455 15292453 163170 6CPSO 14998073 14921980 15324523 174859 5ABC 14914434 14909775 14924754 05128 3CABC 14913099 14909276 14973575 27356 2AMO 14990215 14933564 15094512 68524 4IAMO 14910900 14909321 14925707 04482 1

Table 5 Results obtained by the algorithms for 20 different runs onwine data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 178532412 163744353 184843458 10036327 7PSO 163070584 163028364 163140563 58844 5CPSO 163049829 162990526 163121291 59887 3ABC 163056668 162991970 163219535 98439 4

CABC 162921982 162921858 162922094 10563119890 minus

022

AMO 163597965 163199935 164005533 300025 6

IAMO 162921855 162921849 162921862 50627119890 minus

041

value of IAMO obtained is 17182540 in solving Art1 whileABC and CABC obtained 17185496 and 17184434 andIAMO gives 4 orders of magnitude better than ABC andCABC Same to solving Art2 IAMOobtained 5139035 whileCPSO ABC and CABC obtained 5139046 5139037 and5139037 respectively but the standard deviation of IAMOis at least 2 orders of magnitude better than them For Irisdata set the mean value the optimum value and the worstvalue of IAMO are all 966555 and the standard deviationis 12155119890 minus 06 which revealed the robustness of IAMOCABC also sought the best solution 966555 but the standarddeviation is bigger than IAMO when the best solutions ofAMO PSO CPSO ABC and 119896-means are 970751 966567966580 966566 and 994582 respectively Table 4 shows theresults of algorithms on the TAE data set The mean value

Table 6 Results obtained by the algorithms for 20 different runs onseeds data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 3134977 3131428 3137343 26879119890 minus

015

PSO 3265250 3183185 3352944 60131 7CPSO 3121138 3119116 3123788 02899 4

ABC 3120382 3118520 3122110 67210119890 minus

023

CABC 3117980 3117980 3117982 14865119890 minus

042

AMO 3193922 3138100 3279267 31572 6

IAMO 3117980 3117980 3117980 33686119890 minus

051

Table 7 Results obtained by the algorithms for 20 different runs onStatLog (heart) data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 106957851 106825524 107038544 82080 6PSO 107600684 106448965 110242641 259460 7CPSO 106511354 106242168 107477609 550264 4ABC 106274760 106267154 106296472 10354 3

CABC 106229904 106229824 106236762 17830119890 minus

022

AMO 106751758 106586325 106961824 176918 5

IAMO 106229824 106229824 106229825 50093119890 minus

051

Table 8 Results obtained by the algorithms for 20 different runs onHagermanrsquos survival data

Algorithm Mean Best Worst Standarddeviation Rank

119896-means 26401277 26105245 31805211 1267500 7PSO 25674479 25669899 25691620 09587 6CPSO 25673233 25669889 25678257 04578 5

ABC 25669892 25669888 25669895 23919119890 minus

042

CABC 25669903 25669888 25669955 28709119890 minus

033

AMO 25669985 25669907 25670096 91598119890 minus

034

IAMO 25669888 25669888 25669888 23022119890 minus

061

of IAMO is 14910900 which is smaller than that of AMOPSO CPSO ABC CABC and 119896-means within 20 runs Forwine data set IAMOreached themean value 162921855whileCABC reached themean value 162921982The best value andworst value of IAMO are 162921849 and 162921862 whichare also better than 162921858 and 162922094 obtained byCABC and the standard deviation value of IAMO is also thesmallest one Table 6 provides the results of algorithms on the

8 Discrete Dynamics in Nature and Society

Table 9 Results obtained by the algorithms for 20 different runs onbalance scale data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 14267522 14238570 14338423 31208 5PSO 14301546 14264237 14476403 93451 7CPSO 14248260 14235525 14255460 09048 4

ABC 14239238 14238308 14259821 45452119890 minus

023

CABC 14239109 14238206 14252445 14053119890 minus

022

AMO 14291499 14279670 14308177 11641 6

IAMO 14238204 14238204 14238204 49763119890 minus

061

Table 10 Results obtained by the algorithms for 20 different runson cancer data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 29812564 29764687 29884277 48661 5PSO 30012685 29697475 31143658 639370 6CPSO 29648268 29644167 29652941 03773 3ABC 29655369 29649734 29664785 08081 4

CABC 29644138 29643870 29645222 60177119890 minus

022

AMO 30016328 29740974 30509651 327322 7

IAMO 29643870 29643870 29643870 22260119890 minus

051

seeds data set the IAMO algorithm and CABC algorithm aresuperior to those obtained by the others Although IAMOandCABC reached the same mean value 3117980 the standarddeviation of IAMO is 1 order ofmagnitude better thanCABCOn StatLog (heart) data set results given in Table 7 IAMOgets the best value is 106229824 and the same as CABC whilethe mean values of the two algorithms are 106229824 and106229904 so the IAMO is better than CABC algorithm ForHagermanrsquos survival data set the optimum value 25669888can be obtained by IAMO ABC and CABC but the standarddeviations ofABCandCABCare 23919119890minus04 and 28709119890minus03

which is worse than that of 23022119890 minus 06 obtained by IAMOThe standard deviation of PSO is a little bigger than that ofCPSO For balance scale data set in Table 9 as seen fromthe results the mean best and worst ones are all 14238204which reflect the stable characteristics of IAMO The threebest algorithms in this test data are IAMO CABC and ABCand the best results of them are 14238204 14238206 and14238308 For Wisconsin breast cancer data set in Table 10the mean value the best value and the worst value are all29643870 which are obviously superior to 119896-means PSOCPSO ABC and AMO

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

2400

2500

Evolvement generation

Fitn

ess v

alue

Art1 data

PSOCPSOABC

CABCAMOIAMO

Figure 5 The convergence curve of the Art1 data

0 10 20 30 40 50 60 70 80 90 100500

550

600

650

700

750

800

850

900

Evolvement generation

Fitn

ess v

alue

Art2 data

PSOCPSOABC

CABCAMOIAMO

Figure 6 The convergence curve of the Art2 data

As seen from Table 1 to Table 10 we can conclude thatalthough the convergence rate is not quick enough at thebeginning of the iteration compared to ABC and CABC thefinal results are the best compared to other algorithms in alltest data sets The most results of ABC and CABC are betterthan PSO and CPSO and the 119896-means algorithm is the worstfor most of test data sets

Figures 5 6 7 8 9 10 11 12 13 and 14 show the con-vergence curves of different data sets for various algorithmsFigures 15 and 16 show the original data distribution of Irisdata set and the clustering result by IAMO algorithm

Discrete Dynamics in Nature and Society 9

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

110

112

Evolvement generation

Fitn

ess v

alue

Iris data

PSOCPSOABC

CABCAMOIAMO

Figure 7 The convergence curve of the Iris data

0 10 20 30 40 50 60 70 80 90 1001480

1500

1520

1540

1560

1580

1600

1620

1640

Evolvement generation

Fitn

ess v

alue

TAE data

PSOCPSOABC

CABCAMOIAMO

Figure 8 The convergence curve of the TAE data

7 Living Area Radius Evaluation

The performance and results of the proposed algorithms aregreatly affected by the size of living area At the beginning ofthe iteration a big value of 119877 improves the exploration abilityof the algorithm and at the end of iteration a small valueof 119877 improves the exploitation ability of the algorithm Weadopted a fixed shrinking coefficient 120588 = 092 to change theliving area radius after each iteration as shown in formula (6)To study the extent of 119877 impacts on the proposed algorithmwe selected Art1 data set and Iris data set using different 120588 toevaluate the performance of the proposed algorithm

0 10 20 30 40 50 60 70 80 90 100162

164

166

168

17

172

174

176

Evolvement generation

Fitn

ess v

alue

Wine datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 9 The convergence curve of the wine data

0 10 20 30 40 50 60 70 80 90 100310

320

330

340

350

360

370

380

390

Evolvement generation

Fitn

ess v

alue

Seeds data

PSOCPSOABC

CABCAMOIAMO

Figure 10 The convergence curve of the seeds data

Figure 17 shows the results of an experiment on Art1 wecan conclude that if we choose 120588 between 06 and 09 it hasa better convergence precision than that of 120588 = 099 or 120588 =

040 If we choose 120588 = 040 IAMO algorithm plunges intolocal optima and if we choose 120588 = 099 the IAMO algorithmhas a very low convergence rate And likewise in Figure 18for Iris test data set IAMO algorithm quickly converged atglobal optimum before 30 iterations if we choose 120588 = 080while IAMO could not escape from poor local optima and toglobal optimum if we choose 120588 = 070 120588 = 060 or 120588 = 040

10 Discrete Dynamics in Nature and Society

0 10 20 30 40 50 60 70 80 90 100106

108

11

112

114

116

118

Evolvement generation

Fitn

ess v

alue

Heart datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 11 The convergence curve of the heart data

0 10 20 30 40 50 60 70 80 90 1002560

2580

2600

2620

2640

2660

2680

2700

2720

2740

Evolvement generation

Fitn

ess v

alue

Survival data

PSOCPSOABC

CABCAMOIAMO

Figure 12 The convergence curve of the survival data

So the best 120588 for solving Iris data set must exist between 07and 099

The results suggest that a proper 120588 can greatly improve thealgorithm convergence velocity and convergence precisionand an improper 120588 may lead the IAMO fall into localoptimum

8 Conclusions

In this paper to improve the deficiencies of the AMO algo-rithm we improved the algorithm by using a new migration

0 10 20 30 40 50 60 70 80 90 1001420

1430

1440

1450

1460

1470

1480

1490

1500

1510

Evolvement generation

Fitn

ess v

alue

Balance scale data

PSOCPSOABC

CABCAMOIAMO

Figure 13 The convergence curve of the balance scale data

0 10 20 30 40 50 60 70 80 90 1002900

3000

3100

3200

3300

3400

3500

3600

Evolvement generation

Fitn

ess v

alue

Cancer data

PSOCPSOABC

CABCAMOIAMO

Figure 14 The convergence curve of the cancer data

method based on shrinking animals living area By 10 typicalstandard test data sets simulation the results show thatIAMO algorithm generally has strong global searching abilityand local optimization ability and can effectively avoid thedeficiencies that conventional algorithms easily fall into localoptimum IAMO has improved the convergence precision ofAMO and rank 1st in all test data sets therefore it is verypractical and effective to solve clustering problems At lasthow to define a proper and unified radius of living area needsto be considered in subsequent work

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R. B. Cattell, "The description of personality: basic traits resolved into clusters," Journal of Abnormal and Social Psychology, vol. 38, no. 4, pp. 476–506, 1943.

[2] K. R. Zalik, "An efficient k-means clustering algorithm," Pattern Recognition Letters, vol. 29, no. 8, pp. 1385–1391, 2008.

[3] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means – a data clustering algorithm," Tech. Rep. HPL-1999-124, Hewlett-Packard Laboratories, 1999.

[4] X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2008.

[5] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, December 2009.

[6] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization, vol. 284 of Studies in Computational Intelligence, pp. 65–74, Springer, Berlin, Germany, 2010.

[7] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.

[8] R. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.

[9] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[10] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.

[11] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.

[12] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.

[13] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.

[14] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.

[15] T. Niknam, J. Olamaei, and B. Amiri, "A hybrid evolutionary algorithm based on ACO and SA for cluster analysis," Journal of Applied Sciences, vol. 8, no. 15, pp. 2695–2702, 2008.

[16] T. Niknam, B. Bahmani Firouzi, and M. Nayeripour, "An efficient hybrid evolutionary algorithm for cluster analysis," World Applied Sciences Journal, vol. 4, no. 2, pp. 300–307, 2008.

[17] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.

[18] Y. Kao and K. Cheng, An ACO-Based Clustering Algorithm, Springer, Berlin, Germany, 2006.

[19] M. Omran, A. P. Engelbrecht, and A. Salman, "Particle swarm optimization method for image clustering," International Journal of Pattern Recognition and Artificial Intelligence, vol. 19, no. 3, pp. 297–321, 2005.

[20] D. Karaboga and C. Ozturk, "A novel clustering approach: Artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.

[21] K. E. Voges and N. K. L. Pope, "Rough clustering using an evolutionary algorithm," in Proceedings of the 45th Hawaii International Conference on System Sciences (HICSS '12), pp. 1138–1145, IEEE, January 2012.

[22] A. Colorni, M. Dorigo, and V. Maniezzo, Distributed Optimization by Ant Colonies, Elsevier Publishing, Paris, France, 1991.

[23] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, Canberra, Australia, December 2003.

[24] E. H. L. Aarts and J. H. Korst, Simulated Annealing and Boltzmann Machines, John Wiley & Sons, 1989.

[25] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Tech. Rep. TR06, Erciyes University Press, Erciyes, Turkey, 2005.

[26] X. Chen, Y. Zhou, and Q. Luo, "A hybrid monkey search algorithm for clustering analysis," The Scientific World Journal, vol. 2014, Article ID 938239, 16 pages, 2014.

[27] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867–1877, 2014.

[28] J. MacQueen, "Some methods for classification and analysis of multivariate observations," in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Statistics, pp. 281–297, University of California Press, Berkeley, Calif, USA, 1967.

[29] X. Chen and J. Zhang, "Clustering algorithm based on improved particle swarm optimization," Journal of Computer Research and Development, pp. 287–291, 2012.

[30] X. Liu, Q. Sha, Y. Liu, and X. Duan, "Analysis of classification using particle swarm optimization," Computer Engineering, vol. 32, no. 6, pp. 201–213, 2006.

[31] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.

[32] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[33] C. L. Blake and C. J. Merz, UCI Repository of Machine Learning Databases, http://archive.ics.uci.edu/ml/datasets.html.

[34] E. Anderson, "The irises of the Gaspé Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.

[35] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, part 2, pp. 179–188, 1936.


Page 8: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

8 Discrete Dynamics in Nature and Society

Table 9 Results obtained by the algorithms for 20 different runs onbalance scale data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 14267522 14238570 14338423 31208 5PSO 14301546 14264237 14476403 93451 7CPSO 14248260 14235525 14255460 09048 4

ABC 14239238 14238308 14259821 45452119890 minus

023

CABC 14239109 14238206 14252445 14053119890 minus

022

AMO 14291499 14279670 14308177 11641 6

IAMO 14238204 14238204 14238204 49763119890 minus

061

Table 10 Results obtained by the algorithms for 20 different runson cancer data

Algorithm Mean Best Worst Standarddeviation

Rank

119896-means 29812564 29764687 29884277 48661 5PSO 30012685 29697475 31143658 639370 6CPSO 29648268 29644167 29652941 03773 3ABC 29655369 29649734 29664785 08081 4

CABC 29644138 29643870 29645222 60177119890 minus

022

AMO 30016328 29740974 30509651 327322 7

IAMO 29643870 29643870 29643870 22260119890 minus

051

seeds data set the IAMO algorithm and CABC algorithm aresuperior to those obtained by the others Although IAMOandCABC reached the same mean value 3117980 the standarddeviation of IAMO is 1 order ofmagnitude better thanCABCOn StatLog (heart) data set results given in Table 7 IAMOgets the best value is 106229824 and the same as CABC whilethe mean values of the two algorithms are 106229824 and106229904 so the IAMO is better than CABC algorithm ForHagermanrsquos survival data set the optimum value 25669888can be obtained by IAMO ABC and CABC but the standarddeviations ofABCandCABCare 23919119890minus04 and 28709119890minus03

which is worse than that of 23022119890 minus 06 obtained by IAMOThe standard deviation of PSO is a little bigger than that ofCPSO For balance scale data set in Table 9 as seen fromthe results the mean best and worst ones are all 14238204which reflect the stable characteristics of IAMO The threebest algorithms in this test data are IAMO CABC and ABCand the best results of them are 14238204 14238206 and14238308 For Wisconsin breast cancer data set in Table 10the mean value the best value and the worst value are all29643870 which are obviously superior to 119896-means PSOCPSO ABC and AMO

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

2400

2500

Evolvement generation

Fitn

ess v

alue

Art1 data

PSOCPSOABC

CABCAMOIAMO

Figure 5 The convergence curve of the Art1 data

0 10 20 30 40 50 60 70 80 90 100500

550

600

650

700

750

800

850

900

Evolvement generation

Fitn

ess v

alue

Art2 data

PSOCPSOABC

CABCAMOIAMO

Figure 6 The convergence curve of the Art2 data

As seen from Table 1 to Table 10 we can conclude thatalthough the convergence rate is not quick enough at thebeginning of the iteration compared to ABC and CABC thefinal results are the best compared to other algorithms in alltest data sets The most results of ABC and CABC are betterthan PSO and CPSO and the 119896-means algorithm is the worstfor most of test data sets

Figures 5 6 7 8 9 10 11 12 13 and 14 show the con-vergence curves of different data sets for various algorithmsFigures 15 and 16 show the original data distribution of Irisdata set and the clustering result by IAMO algorithm

Discrete Dynamics in Nature and Society 9

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

110

112

Evolvement generation

Fitn

ess v

alue

Iris data

PSOCPSOABC

CABCAMOIAMO

Figure 7 The convergence curve of the Iris data

0 10 20 30 40 50 60 70 80 90 1001480

1500

1520

1540

1560

1580

1600

1620

1640

Evolvement generation

Fitn

ess v

alue

TAE data

PSOCPSOABC

CABCAMOIAMO

Figure 8 The convergence curve of the TAE data

7 Living Area Radius Evaluation

The performance and results of the proposed algorithms aregreatly affected by the size of living area At the beginning ofthe iteration a big value of 119877 improves the exploration abilityof the algorithm and at the end of iteration a small valueof 119877 improves the exploitation ability of the algorithm Weadopted a fixed shrinking coefficient 120588 = 092 to change theliving area radius after each iteration as shown in formula (6)To study the extent of 119877 impacts on the proposed algorithmwe selected Art1 data set and Iris data set using different 120588 toevaluate the performance of the proposed algorithm

0 10 20 30 40 50 60 70 80 90 100162

164

166

168

17

172

174

176

Evolvement generation

Fitn

ess v

alue

Wine datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 9 The convergence curve of the wine data

0 10 20 30 40 50 60 70 80 90 100310

320

330

340

350

360

370

380

390

Evolvement generation

Fitn

ess v

alue

Seeds data

PSOCPSOABC

CABCAMOIAMO

Figure 10 The convergence curve of the seeds data

Figure 17 shows the results of an experiment on Art1 wecan conclude that if we choose 120588 between 06 and 09 it hasa better convergence precision than that of 120588 = 099 or 120588 =

040 If we choose 120588 = 040 IAMO algorithm plunges intolocal optima and if we choose 120588 = 099 the IAMO algorithmhas a very low convergence rate And likewise in Figure 18for Iris test data set IAMO algorithm quickly converged atglobal optimum before 30 iterations if we choose 120588 = 080while IAMO could not escape from poor local optima and toglobal optimum if we choose 120588 = 070 120588 = 060 or 120588 = 040

10 Discrete Dynamics in Nature and Society

0 10 20 30 40 50 60 70 80 90 100106

108

11

112

114

116

118

Evolvement generation

Fitn

ess v

alue

Heart datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 11 The convergence curve of the heart data

0 10 20 30 40 50 60 70 80 90 1002560

2580

2600

2620

2640

2660

2680

2700

2720

2740

Evolvement generation

Fitn

ess v

alue

Survival data

PSOCPSOABC

CABCAMOIAMO

Figure 12 The convergence curve of the survival data

So the best 120588 for solving Iris data set must exist between 07and 099

The results suggest that a proper 120588 can greatly improve thealgorithm convergence velocity and convergence precisionand an improper 120588 may lead the IAMO fall into localoptimum

8 Conclusions

In this paper to improve the deficiencies of the AMO algo-rithm we improved the algorithm by using a new migration

0 10 20 30 40 50 60 70 80 90 1001420

1430

1440

1450

1460

1470

1480

1490

1500

1510

Evolvement generation

Fitn

ess v

alue

Balance scale data

PSOCPSOABC

CABCAMOIAMO

Figure 13 The convergence curve of the balance scale data

0 10 20 30 40 50 60 70 80 90 1002900

3000

3100

3200

3300

3400

3500

3600

Evolvement generation

Fitn

ess v

alue

Cancer data

PSOCPSOABC

CABCAMOIAMO

Figure 14 The convergence curve of the cancer data

method based on shrinking animals living area By 10 typicalstandard test data sets simulation the results show thatIAMO algorithm generally has strong global searching abilityand local optimization ability and can effectively avoid thedeficiencies that conventional algorithms easily fall into localoptimum IAMO has improved the convergence precision ofAMO and rank 1st in all test data sets therefore it is verypractical and effective to solve clustering problems At lasthow to define a proper and unified radius of living area needsto be considered in subsequent work

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R B Cattell ldquoThe description of personality basic traitsresolved into clustersrdquo Journal of Abnormal and Social Psychol-ogy vol 38 no 4 pp 476ndash506 1943

[2] K R Zalik ldquoAn efficient k-means clustering algorithmrdquo PatternRecognition Letters vol 29 no 8 pp 1385ndash1391 2008

[3] B Zhang M Hsu and U Dayal ldquoK-harmonic meansmdashadata clustering algorithmrdquo Tech Rep HPL-1999-124 Hewlett-Packard Laboratories 1999

[4] X-S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress 2008

[5] X-S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 IEEE December2009

[6] X-S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization vol 284of Studies in Computational Intelligence pp 65ndash74 SpringerBerlin Germany 2010

[7] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony

12 Discrete Dynamics in Nature and Society

(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[8] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995

[9] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing and Applications vol 22 no 6 pp 1239ndash1255 2013

[10] A H Gandomi X-S Yang and A H Alavi ldquoCuckoo searchalgorithm a metaheuristic approach to solve structural opti-mization problemsrdquo Engineering with Computers vol 29 no 1pp 17ndash35 2013

[11] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo Discrete Dy-namics in Nature and Society vol 2010 Article ID 459796 16pages 2010

[12] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and 119896-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[13] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[14] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[15] T Niknam J Olamaei and B Amiri ldquoA hybrid evolutionaryalgorithm based on ACO and SA for cluster analysisrdquo Journal ofApplied Sciences vol 8 no 15 pp 2695ndash2702 2008

[16] T Niknam B Bahmani Firouzi and M Nayeripour ldquoAnefficient hybrid evolutionary algorithm for cluster analysisrdquoWorld Applied Sciences Journal vol 4 no 2 pp 300ndash307 2008

[17] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[18] Y Kao and K Cheng An ACO-Based Clustering AlgorithmSpringer Berlin Germany 2006

[19] M Omran A P Engelbrecht and A Salman ldquoParticle swarmoptimization method for image clusteringrdquo International Jour-nal of Pattern Recognition and Artificial Intelligence vol 19 no3 pp 297ndash321 2005

[20] D Karaboga and C Ozturk ldquoA novel clustering approachArtificial BeeColony (ABC) algorithmrdquoApplied SoftComputingJournal vol 11 no 1 pp 652ndash657 2011

[21] K E Voges andNK L Pope ldquoRough clustering using an evolu-tionary algorithmrdquo in Proceedings of the 45th Hawaii Interna-tional Conference on System Sciences (HICSS rsquo12) pp 1138ndash1145IEEE January 2012

[22] A Colorni M Dorigo and V Maniezzo Distributed Optimiza-tion by Ant Colonies Elsevier Publishing Paris France 1991

[23] D W van der Merwe and A P Engelbrecht ldquoData clusteringusing particle swarm optimizationrdquo in Proceedings of the Con-gress on EvolutionaryComputation (CEC rsquo03) vol 1 pp 215ndash220Canberra Australia December 2003

[24] E H L Aarts and J H Korst Simulated Annealing andBoltzmann Machines John Wiley amp Sons 1989

[25] D Karaboga ldquoAn idea based on honey bee swarm for numer-ical optimizationrdquo Tech Rep TR06 Erciyes University PressErciyes Turkey 2005

[26] X Chen Y Zhou and Q Luo ldquoA hybrid monkey search algo-rithm for clustering analysisrdquo The Scientific World Journal vol2014 Article ID 938239 16 pages 2014

[27] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications vol 24 no 7-8 pp 1867ndash1877 2014

[28] J MacQueen ldquoSome methods for classification and analysis ofmultivariate observationsrdquo in Proceedings of the Fifth BerkeleySymposium on Mathematical Statistics and Probability Volume1 Statistics pp 281ndash297 University of California Press BerkeleyCalif USA 1967

[29] X Chen and J Zhang ldquoClustering algorithmbased on improvedparticle swarmoptimizationrdquo Journal of Computer Research andDevelopment pp 287ndash291 2012

[30] X Liu Q Sha Y Liu and X Duan ldquoAnalysis of classificationusing particle swarm optimizationrdquo Computer Engineering vol32 no 6 pp 201ndash213 2006

[31] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[32] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[33] C L Blake andC JMerz UCI Repository ofMachine LearningDatabases httparchiveicsuciedumldatasetshtml

[34] E Anderson ldquoThe irises of the gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[35] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 part 2 Article ID 179188pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 9: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

Discrete Dynamics in Nature and Society 9

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

110

112

Evolvement generation

Fitn

ess v

alue

Iris data

PSOCPSOABC

CABCAMOIAMO

Figure 7 The convergence curve of the Iris data

0 10 20 30 40 50 60 70 80 90 1001480

1500

1520

1540

1560

1580

1600

1620

1640

Evolvement generation

Fitn

ess v

alue

TAE data

PSOCPSOABC

CABCAMOIAMO

Figure 8 The convergence curve of the TAE data

7 Living Area Radius Evaluation

The performance and results of the proposed algorithms aregreatly affected by the size of living area At the beginning ofthe iteration a big value of 119877 improves the exploration abilityof the algorithm and at the end of iteration a small valueof 119877 improves the exploitation ability of the algorithm Weadopted a fixed shrinking coefficient 120588 = 092 to change theliving area radius after each iteration as shown in formula (6)To study the extent of 119877 impacts on the proposed algorithmwe selected Art1 data set and Iris data set using different 120588 toevaluate the performance of the proposed algorithm

0 10 20 30 40 50 60 70 80 90 100162

164

166

168

17

172

174

176

Evolvement generation

Fitn

ess v

alue

Wine datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 9 The convergence curve of the wine data

0 10 20 30 40 50 60 70 80 90 100310

320

330

340

350

360

370

380

390

Evolvement generation

Fitn

ess v

alue

Seeds data

PSOCPSOABC

CABCAMOIAMO

Figure 10 The convergence curve of the seeds data

Figure 17 shows the results of an experiment on Art1 wecan conclude that if we choose 120588 between 06 and 09 it hasa better convergence precision than that of 120588 = 099 or 120588 =

040 If we choose 120588 = 040 IAMO algorithm plunges intolocal optima and if we choose 120588 = 099 the IAMO algorithmhas a very low convergence rate And likewise in Figure 18for Iris test data set IAMO algorithm quickly converged atglobal optimum before 30 iterations if we choose 120588 = 080while IAMO could not escape from poor local optima and toglobal optimum if we choose 120588 = 070 120588 = 060 or 120588 = 040

10 Discrete Dynamics in Nature and Society

0 10 20 30 40 50 60 70 80 90 100106

108

11

112

114

116

118

Evolvement generation

Fitn

ess v

alue

Heart datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 11 The convergence curve of the heart data

0 10 20 30 40 50 60 70 80 90 1002560

2580

2600

2620

2640

2660

2680

2700

2720

2740

Evolvement generation

Fitn

ess v

alue

Survival data

PSOCPSOABC

CABCAMOIAMO

Figure 12 The convergence curve of the survival data

So the best 120588 for solving Iris data set must exist between 07and 099

The results suggest that a proper 120588 can greatly improve thealgorithm convergence velocity and convergence precisionand an improper 120588 may lead the IAMO fall into localoptimum

8 Conclusions

In this paper to improve the deficiencies of the AMO algo-rithm we improved the algorithm by using a new migration

0 10 20 30 40 50 60 70 80 90 1001420

1430

1440

1450

1460

1470

1480

1490

1500

1510

Evolvement generation

Fitn

ess v

alue

Balance scale data

PSOCPSOABC

CABCAMOIAMO

Figure 13 The convergence curve of the balance scale data

0 10 20 30 40 50 60 70 80 90 1002900

3000

3100

3200

3300

3400

3500

3600

Evolvement generation

Fitn

ess v

alue

Cancer data

PSOCPSOABC

CABCAMOIAMO

Figure 14 The convergence curve of the cancer data

method based on shrinking animals living area By 10 typicalstandard test data sets simulation the results show thatIAMO algorithm generally has strong global searching abilityand local optimization ability and can effectively avoid thedeficiencies that conventional algorithms easily fall into localoptimum IAMO has improved the convergence precision ofAMO and rank 1st in all test data sets therefore it is verypractical and effective to solve clustering problems At lasthow to define a proper and unified radius of living area needsto be considered in subsequent work

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R B Cattell ldquoThe description of personality basic traitsresolved into clustersrdquo Journal of Abnormal and Social Psychol-ogy vol 38 no 4 pp 476ndash506 1943

[2] K R Zalik ldquoAn efficient k-means clustering algorithmrdquo PatternRecognition Letters vol 29 no 8 pp 1385ndash1391 2008

[3] B Zhang M Hsu and U Dayal ldquoK-harmonic meansmdashadata clustering algorithmrdquo Tech Rep HPL-1999-124 Hewlett-Packard Laboratories 1999

[4] X-S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress 2008

[5] X-S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 IEEE December2009

[6] X-S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization vol 284of Studies in Computational Intelligence pp 65ndash74 SpringerBerlin Germany 2010

[7] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony

12 Discrete Dynamics in Nature and Society

(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[8] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995

[9] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing and Applications vol 22 no 6 pp 1239ndash1255 2013

[10] A H Gandomi X-S Yang and A H Alavi ldquoCuckoo searchalgorithm a metaheuristic approach to solve structural opti-mization problemsrdquo Engineering with Computers vol 29 no 1pp 17ndash35 2013

[11] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo Discrete Dy-namics in Nature and Society vol 2010 Article ID 459796 16pages 2010

[12] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and 119896-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[13] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[14] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[15] T Niknam J Olamaei and B Amiri ldquoA hybrid evolutionaryalgorithm based on ACO and SA for cluster analysisrdquo Journal ofApplied Sciences vol 8 no 15 pp 2695ndash2702 2008

[16] T Niknam B Bahmani Firouzi and M Nayeripour ldquoAnefficient hybrid evolutionary algorithm for cluster analysisrdquoWorld Applied Sciences Journal vol 4 no 2 pp 300ndash307 2008

[17] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[18] Y Kao and K Cheng An ACO-Based Clustering AlgorithmSpringer Berlin Germany 2006

[19] M Omran A P Engelbrecht and A Salman ldquoParticle swarmoptimization method for image clusteringrdquo International Jour-nal of Pattern Recognition and Artificial Intelligence vol 19 no3 pp 297ndash321 2005

[20] D Karaboga and C Ozturk ldquoA novel clustering approachArtificial BeeColony (ABC) algorithmrdquoApplied SoftComputingJournal vol 11 no 1 pp 652ndash657 2011

[21] K E Voges andNK L Pope ldquoRough clustering using an evolu-tionary algorithmrdquo in Proceedings of the 45th Hawaii Interna-tional Conference on System Sciences (HICSS rsquo12) pp 1138ndash1145IEEE January 2012

[22] A Colorni M Dorigo and V Maniezzo Distributed Optimiza-tion by Ant Colonies Elsevier Publishing Paris France 1991

[23] D W van der Merwe and A P Engelbrecht ldquoData clusteringusing particle swarm optimizationrdquo in Proceedings of the Con-gress on EvolutionaryComputation (CEC rsquo03) vol 1 pp 215ndash220Canberra Australia December 2003

[24] E H L Aarts and J H Korst Simulated Annealing andBoltzmann Machines John Wiley amp Sons 1989

[25] D Karaboga ldquoAn idea based on honey bee swarm for numer-ical optimizationrdquo Tech Rep TR06 Erciyes University PressErciyes Turkey 2005

[26] X Chen Y Zhou and Q Luo ldquoA hybrid monkey search algo-rithm for clustering analysisrdquo The Scientific World Journal vol2014 Article ID 938239 16 pages 2014

[27] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications vol 24 no 7-8 pp 1867ndash1877 2014

[28] J MacQueen ldquoSome methods for classification and analysis ofmultivariate observationsrdquo in Proceedings of the Fifth BerkeleySymposium on Mathematical Statistics and Probability Volume1 Statistics pp 281ndash297 University of California Press BerkeleyCalif USA 1967

[29] X Chen and J Zhang ldquoClustering algorithmbased on improvedparticle swarmoptimizationrdquo Journal of Computer Research andDevelopment pp 287ndash291 2012

[30] X Liu Q Sha Y Liu and X Duan ldquoAnalysis of classificationusing particle swarm optimizationrdquo Computer Engineering vol32 no 6 pp 201ndash213 2006

[31] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[32] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[33] C L Blake andC JMerz UCI Repository ofMachine LearningDatabases httparchiveicsuciedumldatasetshtml

[34] E Anderson ldquoThe irises of the gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[35] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 part 2 Article ID 179188pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 10: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

10 Discrete Dynamics in Nature and Society

0 10 20 30 40 50 60 70 80 90 100106

108

11

112

114

116

118

Evolvement generation

Fitn

ess v

alue

Heart datatimes104

PSOCPSOABC

CABCAMOIAMO

Figure 11 The convergence curve of the heart data

0 10 20 30 40 50 60 70 80 90 1002560

2580

2600

2620

2640

2660

2680

2700

2720

2740

Evolvement generation

Fitn

ess v

alue

Survival data

PSOCPSOABC

CABCAMOIAMO

Figure 12 The convergence curve of the survival data

So the best 120588 for solving Iris data set must exist between 07and 099

The results suggest that a proper 120588 can greatly improve thealgorithm convergence velocity and convergence precisionand an improper 120588 may lead the IAMO fall into localoptimum

8 Conclusions

In this paper to improve the deficiencies of the AMO algo-rithm we improved the algorithm by using a new migration

0 10 20 30 40 50 60 70 80 90 1001420

1430

1440

1450

1460

1470

1480

1490

1500

1510

Evolvement generation

Fitn

ess v

alue

Balance scale data

PSOCPSOABC

CABCAMOIAMO

Figure 13 The convergence curve of the balance scale data

0 10 20 30 40 50 60 70 80 90 1002900

3000

3100

3200

3300

3400

3500

3600

Evolvement generation

Fitn

ess v

alue

Cancer data

PSOCPSOABC

CABCAMOIAMO

Figure 14 The convergence curve of the cancer data

method based on shrinking animals living area By 10 typicalstandard test data sets simulation the results show thatIAMO algorithm generally has strong global searching abilityand local optimization ability and can effectively avoid thedeficiencies that conventional algorithms easily fall into localoptimum IAMO has improved the convergence precision ofAMO and rank 1st in all test data sets therefore it is verypractical and effective to solve clustering problems At lasthow to define a proper and unified radius of living area needsto be considered in subsequent work

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R B Cattell ldquoThe description of personality basic traitsresolved into clustersrdquo Journal of Abnormal and Social Psychol-ogy vol 38 no 4 pp 476ndash506 1943

[2] K R Zalik ldquoAn efficient k-means clustering algorithmrdquo PatternRecognition Letters vol 29 no 8 pp 1385ndash1391 2008

[3] B Zhang M Hsu and U Dayal ldquoK-harmonic meansmdashadata clustering algorithmrdquo Tech Rep HPL-1999-124 Hewlett-Packard Laboratories 1999

[4] X-S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress 2008

[5] X-S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 IEEE December2009

[6] X-S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization vol 284of Studies in Computational Intelligence pp 65ndash74 SpringerBerlin Germany 2010

[7] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony

12 Discrete Dynamics in Nature and Society

(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[8] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995

[9] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing and Applications vol 22 no 6 pp 1239ndash1255 2013

[10] A H Gandomi X-S Yang and A H Alavi ldquoCuckoo searchalgorithm a metaheuristic approach to solve structural opti-mization problemsrdquo Engineering with Computers vol 29 no 1pp 17ndash35 2013

[11] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo Discrete Dy-namics in Nature and Society vol 2010 Article ID 459796 16pages 2010

[12] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and 119896-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[13] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[14] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[15] T Niknam J Olamaei and B Amiri ldquoA hybrid evolutionaryalgorithm based on ACO and SA for cluster analysisrdquo Journal ofApplied Sciences vol 8 no 15 pp 2695ndash2702 2008

[16] T Niknam B Bahmani Firouzi and M Nayeripour ldquoAnefficient hybrid evolutionary algorithm for cluster analysisrdquoWorld Applied Sciences Journal vol 4 no 2 pp 300ndash307 2008

[17] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[18] Y Kao and K Cheng An ACO-Based Clustering AlgorithmSpringer Berlin Germany 2006

[19] M Omran A P Engelbrecht and A Salman ldquoParticle swarmoptimization method for image clusteringrdquo International Jour-nal of Pattern Recognition and Artificial Intelligence vol 19 no3 pp 297ndash321 2005

[20] D Karaboga and C Ozturk ldquoA novel clustering approachArtificial BeeColony (ABC) algorithmrdquoApplied SoftComputingJournal vol 11 no 1 pp 652ndash657 2011

[21] K E Voges andNK L Pope ldquoRough clustering using an evolu-tionary algorithmrdquo in Proceedings of the 45th Hawaii Interna-tional Conference on System Sciences (HICSS rsquo12) pp 1138ndash1145IEEE January 2012

[22] A Colorni M Dorigo and V Maniezzo Distributed Optimiza-tion by Ant Colonies Elsevier Publishing Paris France 1991

[23] D W van der Merwe and A P Engelbrecht ldquoData clusteringusing particle swarm optimizationrdquo in Proceedings of the Con-gress on EvolutionaryComputation (CEC rsquo03) vol 1 pp 215ndash220Canberra Australia December 2003

[24] E H L Aarts and J H Korst Simulated Annealing andBoltzmann Machines John Wiley amp Sons 1989

[25] D Karaboga ldquoAn idea based on honey bee swarm for numer-ical optimizationrdquo Tech Rep TR06 Erciyes University PressErciyes Turkey 2005

[26] X Chen Y Zhou and Q Luo ldquoA hybrid monkey search algo-rithm for clustering analysisrdquo The Scientific World Journal vol2014 Article ID 938239 16 pages 2014

[27] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications vol 24 no 7-8 pp 1867ndash1877 2014

[28] J MacQueen ldquoSome methods for classification and analysis ofmultivariate observationsrdquo in Proceedings of the Fifth BerkeleySymposium on Mathematical Statistics and Probability Volume1 Statistics pp 281ndash297 University of California Press BerkeleyCalif USA 1967

[29] X Chen and J Zhang ldquoClustering algorithmbased on improvedparticle swarmoptimizationrdquo Journal of Computer Research andDevelopment pp 287ndash291 2012

[30] X Liu Q Sha Y Liu and X Duan ldquoAnalysis of classificationusing particle swarm optimizationrdquo Computer Engineering vol32 no 6 pp 201ndash213 2006

[31] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[32] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[33] C L Blake andC JMerz UCI Repository ofMachine LearningDatabases httparchiveicsuciedumldatasetshtml

[34] E Anderson ldquoThe irises of the gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[35] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 part 2 Article ID 179188pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 11: Research Article An Improved Animal Migration Optimization ...complex optimization problems. Clustering is a popular data analysis and data mining technique and it is used in many

Discrete Dynamics in Nature and Society 11

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data distribution

Figure 15 The Iris data distribution

4 45 5 55 6 65 7 75 82

25

3

35

4

45Iris data result

Figure 16 The Iris data clustering result

0 10 20 30 40 50 60 70 80 90 1001700

1800

1900

2000

2100

2200

2300

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Art1 data

Figure 17 The convergence curve of the Art1 with different 120588

0 10 20 30 40 50 60 70 80 90 10096

98

100

102

104

106

108

Evolvement generation

Fitn

ess v

alue

120588 = 099

120588 = 090

120588 = 080

120588 = 070

120588 = 060

120588 = 040

Different 120588 for Iris data

Figure 18 The convergence curve of the Iris with different 120588

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by National Science Foundationof China under Grant nos 61165015 and 61463007 KeyProject of Guangxi Science Foundation under Grant no2012GXNSFDA053028 and Key Project of Guangxi HighSchool Science Foundation under Grant no 20121ZD008

References

[1] R B Cattell ldquoThe description of personality basic traitsresolved into clustersrdquo Journal of Abnormal and Social Psychol-ogy vol 38 no 4 pp 476ndash506 1943

[2] K R Zalik ldquoAn efficient k-means clustering algorithmrdquo PatternRecognition Letters vol 29 no 8 pp 1385ndash1391 2008

[3] B Zhang M Hsu and U Dayal ldquoK-harmonic meansmdashadata clustering algorithmrdquo Tech Rep HPL-1999-124 Hewlett-Packard Laboratories 1999

[4] X-S Yang Nature-Inspired Metaheuristic Algorithms LuniverPress 2008

[5] X-S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 IEEE December2009

[6] X-S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization vol 284of Studies in Computational Intelligence pp 65ndash74 SpringerBerlin Germany 2010

[7] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony

12 Discrete Dynamics in Nature and Society

(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[8] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium onMicroMachine and Human Science pp 39ndash43 NagoyaJapan October 1995

[9] A H Gandomi X-S Yang A H Alavi and S Talatahari ldquoBatalgorithm for constrained optimization tasksrdquo Neural Comput-ing and Applications vol 22 no 6 pp 1239ndash1255 2013

[10] A H Gandomi X-S Yang and A H Alavi ldquoCuckoo searchalgorithm a metaheuristic approach to solve structural opti-mization problemsrdquo Engineering with Computers vol 29 no 1pp 17ndash35 2013

[11] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo Discrete Dy-namics in Nature and Society vol 2010 Article ID 459796 16pages 2010

[12] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and 119896-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[13] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[14] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[15] T Niknam J Olamaei and B Amiri ldquoA hybrid evolutionaryalgorithm based on ACO and SA for cluster analysisrdquo Journal ofApplied Sciences vol 8 no 15 pp 2695ndash2702 2008

[16] T Niknam B Bahmani Firouzi and M Nayeripour ldquoAnefficient hybrid evolutionary algorithm for cluster analysisrdquoWorld Applied Sciences Journal vol 4 no 2 pp 300ndash307 2008

[17] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[18] Y Kao and K Cheng An ACO-Based Clustering AlgorithmSpringer Berlin Germany 2006

[19] M Omran A P Engelbrecht and A Salman ldquoParticle swarmoptimization method for image clusteringrdquo International Jour-nal of Pattern Recognition and Artificial Intelligence vol 19 no3 pp 297ndash321 2005

[20] D Karaboga and C Ozturk ldquoA novel clustering approachArtificial BeeColony (ABC) algorithmrdquoApplied SoftComputingJournal vol 11 no 1 pp 652ndash657 2011

[21] K E Voges andNK L Pope ldquoRough clustering using an evolu-tionary algorithmrdquo in Proceedings of the 45th Hawaii Interna-tional Conference on System Sciences (HICSS rsquo12) pp 1138ndash1145IEEE January 2012

[22] A Colorni M Dorigo and V Maniezzo Distributed Optimiza-tion by Ant Colonies Elsevier Publishing Paris France 1991

[23] D W van der Merwe and A P Engelbrecht ldquoData clusteringusing particle swarm optimizationrdquo in Proceedings of the Con-gress on EvolutionaryComputation (CEC rsquo03) vol 1 pp 215ndash220Canberra Australia December 2003

[24] E H L Aarts and J H Korst Simulated Annealing andBoltzmann Machines John Wiley amp Sons 1989

[25] D Karaboga ldquoAn idea based on honey bee swarm for numer-ical optimizationrdquo Tech Rep TR06 Erciyes University PressErciyes Turkey 2005

[26] X Chen Y Zhou and Q Luo ldquoA hybrid monkey search algo-rithm for clustering analysisrdquo The Scientific World Journal vol2014 Article ID 938239 16 pages 2014

[27] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications vol 24 no 7-8 pp 1867ndash1877 2014

[28] J MacQueen ldquoSome methods for classification and analysis ofmultivariate observationsrdquo in Proceedings of the Fifth BerkeleySymposium on Mathematical Statistics and Probability Volume1 Statistics pp 281ndash297 University of California Press BerkeleyCalif USA 1967

[29] X Chen and J Zhang ldquoClustering algorithmbased on improvedparticle swarmoptimizationrdquo Journal of Computer Research andDevelopment pp 287ndash291 2012

[30] X Liu Q Sha Y Liu and X Duan ldquoAnalysis of classificationusing particle swarm optimizationrdquo Computer Engineering vol32 no 6 pp 201ndash213 2006

[31] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[32] F van den Bergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[33] C L Blake andC JMerz UCI Repository ofMachine LearningDatabases httparchiveicsuciedumldatasetshtml

[34] E Anderson ldquoThe irises of the gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[35] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 part 2 Article ID 179188pp 179ndash188 1936




