Hindawi, Computational Intelligence and Neuroscience, Volume 2019, Article ID 2981282, 13 pages. https://doi.org/10.1155/2019/2981282

Research Article

An Improved Grey Wolf Optimization Algorithm with Variable Weights

Zheng-Ming Gao (1) and Juan Zhao (2)

(1) School of Computer Engineering, Jingchu University of Technology, Jingmen, Hubei 448000, China
(2) School of Electronics and Information Engineering, Jingchu University of Technology, Jingmen, Hubei 448000, China

Correspondence should be addressed to Zheng-Ming Gao; gaozming@jcut.edu.cn

Received 11 December 2018; Revised 19 February 2019; Accepted 13 March 2019; Published 2 June 2019

Academic Editor: Raşit Köker

Copyright © 2019 Zheng-Ming Gao and Juan Zhao. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

With the hypothesis that the social hierarchy of the grey wolves is also followed in their searching positions, an improved grey wolf optimization (GWO) algorithm with variable weights (VW-GWO) is proposed. To reduce the probability of being trapped in local optima, a new governing equation for the controlling parameter is also proposed. Simulation experiments are carried out, and comparisons are made. The results show that the proposed VW-GWO algorithm works better than the standard GWO, the ant lion optimization (ALO) algorithm, the particle swarm optimization (PSO) algorithm, and the bat algorithm (BA). The novel VW-GWO algorithm is also verified on high-dimensional problems.

1. Introduction

Many problems with huge numbers of variables, massive complexity, or no analytical solutions are met while human beings explore, exploit, and conquer nature. Optimization methods are proposed to solve them. Unfortunately, because of the no-free-lunch rule [1], it is always hard to find a universally efficient method for almost all problems. Therefore, scientists and engineers around the world are still seeking more optimization algorithms and more suitable methods.

Traditionally, optimization algorithms are divided into two classes: deterministic algorithms and stochastic algorithms [2]. Deterministic algorithms are easily trapped in local optima, while stochastic algorithms are capable of avoiding local solutions through randomness. Thus, more attention is paid to stochastic algorithms, and more and more of them are proposed. Among the research on stochastic algorithms, the presentation, improvement, and application of nature-inspired computing (NIC) algorithms have become a hot spot.

NIC algorithms are inspired by nature, and they have been proved efficient in solving the problems humans meet [3, 4]. One of the most important branches of NIC algorithms is the bionic algorithms, and most bionic algorithms are metaheuristic [5–7]. They can solve problems with parallel computing and global searching. Metaheuristic algorithms divide the swarms between global and local searching by various methods. They cannot guarantee globally optimal solutions; thus, most metaheuristic algorithms introduce randomness to avoid local optima. The individuals in swarms are controlled to separate, align, and cohere [8] with randomness; their current velocities are composed of their former velocities, random multipliers of the frequency [9], or Euclidean distances to specific individuals' positions [10–14]. Some improvements are made with inertia weight modification [15–17], hybridization with invasive weed optimization [18], chaos [19], and binary vectors [20], among others. Most of these improvements yield slightly better performance for the specific algorithms, but the overall structures remain unchanged.

Almost all of the metaheuristic algorithms and their improvements so far are inspired directly by the behaviors of organisms, such as searching, hunting [11, 21], pollinating [13], and flashing [14]. In the old metaheuristic algorithms, such as the genetic algorithm (GA) [22], simulated annealing (SA) [23], and the ant colony optimization (ACO) algorithm [24], the individuals are treated in the same way, and the final results are the best fitness values. Metaheuristic algorithms perform their behavior under the same governing equations. To achieve better performance and decrease the possibility of being trapped in local optima, random walks or Lévy flights are introduced to the individuals when specific conditions are met [25, 26]. These mostly mean that the swarms perform their behavior in less controlled ways. Furthermore, as organisms living in swarms in nature, most of them have social hierarchies as long as they are even slightly intelligent. For example, in an ant colony, the queen is the commander despite her reproductive role, the dinergates are soldiers that guard the colony, while the ergates take charge of building, gathering, and breeding. It can be concluded that the hierarchy of the ant colony is queen → dinergates → ergates if they are classified by their jobs. The ergates' behavior could be directed by their elders' experience, their queen, or the dinergates. If the ergates are commanded by the queen, some dinergates, or elders, and such operations are mathematically described and introduced into the ant colony optimization (ACO) algorithm in some way, will the ACO algorithm perform better in solving problems? In other words, what if the social hierarchy of the swarms is considered in the metaheuristic algorithms? This work was done by Mirjalili et al., and a new optimization method called the grey wolf optimization (GWO) algorithm was proposed [27].

The GWO algorithm considers the searching and hunting behavior and the social hierarchy of the grey wolves. Owing to less randomness and varying numbers of individuals assigned to the global and local searching procedures, the GWO algorithm is easier to use and converges more rapidly. It has been proved to be more efficient than the PSO algorithm [27] and other bionic algorithms [28–32]. More attention has been paid to its applications due to its better performance. Efforts have been made in feature and band selection [33, 34], automatic control [29, 35], power dispatching [32, 36], parameter estimation [31], shop scheduling [28], and multiobjective optimization [37, 38]. However, the standard GWO algorithm was formulated with equal importance given to the dominant grey wolves' positions, which is not strictly consistent with their social hierarchy. Recent developments of the GWO algorithm, such as the binary GWO algorithm [34], the multiobjective GWO algorithm [37], and hybrids with other algorithms [39], together with their applications [40–43], keep this formulation unchanged. If the searching and hunting positions of the grey wolves also agree with the social hierarchy, the GWO algorithm can possibly be improved. With the hypothesis that the social hierarchy of the grey wolves is also functional in their searching procedure, we report an improvement of the original GWO algorithm in this paper. Moreover, considering applications in engineering, where a maximum admissible error (MAE) is usually specified for given problems, an exponentially declining governing equation of the controlling parameter is introduced to avoid requiring a known maximum iteration number. The rest of this paper is organized as follows.

Section 2 presents the inspiration for the improvement and the revision of the controlling equations to meet the needs of the later experiments. The experiment setup is described in Section 3, and results are compared in Section 4. Finally, Section 5 concludes the work, and further research suggestions are made.

2. Algorithms

According to Mirjalili et al. [27], grey wolves live together and hunt in groups. The searching and hunting process can be described as follows: (1) if a prey is found, they first track, chase, and approach it; (2) if the prey runs, the grey wolves pursue, encircle, and harass the prey until it stops moving; (3) finally, the attack begins.

2.1. Standard GWO Algorithm. Mirjalili designed the optimization algorithm by imitating the searching and hunting process of grey wolves. In the mathematical model, the fittest solution is called the alpha (α), the second best is the beta (β), and consequently the third best is named the delta (δ). The rest of the candidate solutions are all assumed to be omegas (ω). All of the omegas are guided by these three dominant grey wolves during the searching (optimizing) and hunting.

When a prey is found, the iteration begins (t = 1). Thereafter, the alpha, beta, and delta wolves lead the omegas to pursue and eventually encircle the prey. Three coefficient vectors, \(\vec{A}\), \(\vec{C}\), and \(\vec{D}\), are proposed to describe the encircling behavior:

\[
\vec{D}_\alpha = \left|\vec{C}_1 \cdot \vec{X}_\alpha - \vec{X}(t)\right|, \qquad
\vec{D}_\beta = \left|\vec{C}_2 \cdot \vec{X}_\beta - \vec{X}(t)\right|, \qquad
\vec{D}_\delta = \left|\vec{C}_3 \cdot \vec{X}_\delta - \vec{X}(t)\right|, \tag{1}
\]

where t indicates the current iteration, \(\vec{X}(t)\) is the position vector of a grey wolf, and \(\vec{X}_\alpha\), \(\vec{X}_\beta\), and \(\vec{X}_\delta\) are the position vectors of the alpha, beta, and delta wolves. The candidate positions \(\vec{X}_1\), \(\vec{X}_2\), and \(\vec{X}_3\) and the new position \(\vec{X}\) are computed as follows:

\[
\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha, \tag{2}
\]
\[
\vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta, \tag{3}
\]
\[
\vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta, \tag{4}
\]
\[
\vec{X}(t+1) = \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3}. \tag{5}
\]

The parameters \(\vec{A}\) and \(\vec{C}\) are combinations of the controlling parameter a and the random vectors \(\vec{r}_1\) and \(\vec{r}_2\) [27]:

\[
\vec{A} = 2a\,\vec{r}_1 - a, \qquad \vec{C} = 2\,\vec{r}_2. \tag{6}
\]


The controlling parameter a changes \(\vec{A}\) and finally causes the omega wolves to approach or run away from the dominant wolves, i.e., the alpha, beta, and delta. Theoretically, if \(|\vec{A}| > 1\), the grey wolves run away from the dominants; this means the omega wolves run away from the prey and explore more of the space, which is called a global search in optimization. If \(|\vec{A}| < 1\), they approach the dominants, which means the omega wolves follow the dominants approaching the prey; this is called a local search in optimization.

The controlling parameter a is defined to decline linearly from a maximum value of 2 to zero as the iterations are carried out:

\[
a = 2\left(1 - \frac{\mathrm{it}}{N}\right), \tag{7}
\]

where N is the maximum iteration number, initialized at the beginning by the users, and it is the cumulative iteration number. The application procedure can be divided into three parts: (1) the given problems are understood and mathematically described, and some elemental parameters are then known; (2) a pack of grey wolves is randomly initialized throughout the space domain; (3) the alpha and the other dominant grey wolves lead the pack to search, pursue, and encircle the prey. When the prey is encircled by the grey wolves and it stops moving, the search finishes, and the attack begins. The pseudocode is listed in Table 1.
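To make the procedure concrete, the standard GWO loop of Table 1 can be sketched as follows (a minimal NumPy implementation for minimization; the function name, defaults, and bound handling are our own illustrative assumptions, not the authors' code):

```python
import numpy as np

def gwo(fitness, dim, lb, ub, n_wolves=30, max_iter=200):
    """Minimize `fitness` over [lb, ub]^dim with the standard GWO (sketch)."""
    X = np.random.uniform(lb, ub, (n_wolves, dim))  # randomly initialize the pack
    alpha = beta = delta = None                     # best-so-far dominant positions
    a_score = b_score = d_score = np.inf
    for t in range(max_iter):
        # Refresh alpha, beta, and delta with the three best fitness values so far
        for i in range(n_wolves):
            s = fitness(X[i])
            if s < a_score:
                a_score, b_score, d_score = s, a_score, b_score
                alpha, beta, delta = X[i].copy(), alpha, beta
            elif s < b_score:
                b_score, d_score = s, b_score
                beta, delta = X[i].copy(), beta
            elif s < d_score:
                d_score, delta = s, X[i].copy()
        a = 2 * (1 - t / max_iter)                  # equation (7): linear decline 2 -> 0
        # Move every wolf toward the average of the three dominant-guided candidates
        for i in range(n_wolves):
            X_new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = np.random.rand(dim), np.random.rand(dim)
                A = 2 * a * r1 - a                  # equation (6)
                C = 2 * r2
                D = np.abs(C * leader - X[i])       # equation (1)
                X_new += leader - A * D             # equations (2)-(4)
            X[i] = np.clip(X_new / 3, lb, ub)       # equation (5), kept inside the domain
    return alpha, a_score
```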

2.2. Proposed Variable Weights and Their Governing Equations. We can see from governing equation (5) that the dominants play the same role in the searching process: each of the grey wolves approaches or runs away from the dominants with an equal weighting of the alpha, beta, and delta. However, although the alpha is the nearest to the prey at the beginning of the search, it might be far away from the final result, let alone the beta and delta. Therefore, at the beginning of the searching procedure, only the position of the alpha should be considered in equation (5), or its weight should be much larger than those of the other dominants. Moreover, the averaging weight in equation (5) is also against the social hierarchy hypothesis of the grey wolves. If the social hierarchy is strictly followed in the pack, the alpha is the leader, and he or she should always be the nearest one to the prey. The alpha wolf should be the most important, which means that the weight of the alpha's position in equation (5) should always be no less than those of the beta and the delta. Consequently, the weight of the beta's position should always be no less than that of the delta. Based on these considerations, we further hypothesize the following:

(1) The searching and hunting processes are always governed by the alpha; the beta plays a less important role, and the delta plays an even smaller one. Any of the other grey wolves transfers his or her position to the alpha upon becoming the best. It should be noted that in the real searching and hunting procedures, the best position is the one nearest to the prey, while in optimization for the global optimum of a given problem, the best position is the maximum or minimum of the fitness value under the given restrictions.

(2) During the searching process, a hypothesized prey is always surrounded by the dominants, while in the hunting process, a real prey is encircled. The dominant grey wolves are at positions surrounding the prey in the order of their social hierarchy. This means that the alpha is the nearest one among the grey wolves, the beta is the nearest one in the pack except for the alpha, and the delta ranks third. The omega wolves are involved in both processes, and they transfer their better positions to the dominants.

Under the hypotheses mentioned hereinbefore, the positions in equation (5) should no longer be updated with identical weights.

When the search begins, the alpha is the nearest, and the rest are all unimportant, so his or her position should contribute to the new searching individuals while all of the others could be ignored. This means that the weight of the alpha should be near 1.0 at the beginning, while the weights of the beta and delta could be near zero at this time. In the final state, the alpha, beta, and delta wolves should encircle the prey, which means they have equal weights, as in equation (5). Along the searching procedure from the beginning to the end, the beta catches up with the alpha, as he or she always ranks second, and the delta catches up with the beta due to his or her third rank. This means that the weights of the beta and delta rise along with the cumulative iteration number, so the weight of the alpha should be reduced while the weights of the beta and delta rise.

The above ideas can be formulated mathematically. First of all, the weights should be variable, and their sum should be limited to 1.0. Equation (5) is then changed as follows:

\[
\vec{X}(t+1) = w_1\vec{X}_1 + w_2\vec{X}_2 + w_3\vec{X}_3, \qquad w_1 + w_2 + w_3 = 1. \tag{8}
\]

Table 1: Pseudocode of the GWO algorithm.

Description           Pseudocode
Set up optimization   Dimension of the given problems
                      Limitations of the given problems
                      Population size
                      Controlling parameter
                      Stop criterion (maximum iteration times or admissible errors)
Initialization        Positions of all of the grey wolves, including the α, β, and δ wolves
Searching             While not the stop criterion:
                          Calculate the new fitness function
                          Update the positions
                          Limit the scope of positions
                          Refresh α, β, and δ
                          Update the stop criterion
                      End


Secondly, the weight of the alpha, w1, that of the beta, w2, and that of the delta, w3, should always satisfy w1 ≥ w2 ≥ w3. Mathematically speaking, the weight of the alpha would change from 1.0 to 1/3 along the searching procedure, and at the same time, the weights of the beta and delta would increase from 0.0 to 1/3. Generally speaking, a cosine function can be introduced to describe w1 if we restrict an angle θ to vary in [0, arccos(1/3)].

Thirdly, the weights should vary with the cumulative iteration number "it." We know that w2, w3 → 0 when it = 0 and w1 = w2 = w3 → 1/3 when it → ∞. So we introduce an arc-tangent function of it, which varies from 0.0 to π/2. Conveniently, sin(π/4) = cos(π/4) = √2/2, so another angular parameter φ is introduced as follows:

\[
\varphi = \frac{1}{2}\arctan(\mathrm{it}). \tag{9}
\]

Considering that w2 increases from 0.0 to 1/3 along with it, we hypothesize that it contains sin θ and cos φ, with θ → arccos(1/3) when it → ∞; therefore,

\[
\theta = \frac{2}{\pi}\arccos\!\left(\frac{1}{3}\right)\arctan(\mathrm{it}). \tag{10}
\]

When it → ∞, θ → arccos(1/3) and w2 → 1/3, so w2 can be formulated in detail. Based on these considerations, a new update method of the positions with variable weights is proposed as follows:

\[
w_1 = \cos\theta, \qquad
w_2 = \frac{1}{2}\sin\theta\cos\varphi, \qquad
w_3 = 1 - w_1 - w_2. \tag{11}
\]

The curve of the variable weights is drawn in Figure 1. We can then find that the variable weights satisfy the hypothesis: the social hierarchy of the grey wolves functions in their searching behavior.
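The weight schedule of equations (9)-(11) can be transcribed directly (a minimal sketch; the function name is ours). At it = 0 it gives w1 = 1 and w2 = w3 = 0, and all three weights tend to 1/3 as it → ∞; note that w3 dips slightly below zero for very small it under this schedule, turning positive as the iterations proceed:

```python
import math

def variable_weights(it):
    """Variable weights of equations (9)-(11); `it` is the cumulative iteration number."""
    phi = 0.5 * math.atan(it)                                  # equation (9)
    theta = (2 / math.pi) * math.acos(1 / 3) * math.atan(it)   # equation (10)
    w1 = math.cos(theta)                                       # equation (11)
    w2 = 0.5 * math.sin(theta) * math.cos(phi)
    w3 = 1.0 - w1 - w2                                         # weights always sum to 1
    return w1, w2, w3
```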

2.3. Proposed Exponentially Declining Governing Equation of the Controlling Parameter. In equation (7), the controlling parameter declines linearly from two to zero as the iterations proceed from zero to the maximum N. However, in engineering, an optimization is usually terminated by a maximum admissible error (MAE). This also means that the maximum iteration number N is unknown.

Furthermore, the controlling parameter is a restriction parameter for A, which is responsible for a grey wolf approaching or running away from the dominants. In other words, the controlling parameter governs whether the grey wolves search globally or locally in the optimizing process. The global search probability is expected to be larger when the search begins, and consequently, the local search probability is expected to be larger when the algorithm is approaching the optimum. Therefore, to obtain a better performance of the GWO algorithm, the controlling parameter is expected to decrease quickly when the optimization starts, so that the algorithm converges to the optimum very fast. At the same time, some grey wolves are expected to keep searching globally to avoid being trapped in local optima. For these reasons, an exponentially declining controlling parameter [44] is introduced, as described below:

\[
a = a_m e^{-\mathrm{it}/M}, \tag{12}
\]

where a_m is the maximum value and M is an admissible maximum iteration number. The parameter M restricts the algorithm to avoid long running times and nonconvergence. It is expected to be larger than 10^4 or 10^5 based on the computing hardware used in most laboratories nowadays.
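Putting the pieces together, a single VW-GWO position update under equations (8)-(12) might be sketched as follows (our own illustrative code, not the authors' implementation; the signature, defaults, and in-place update are assumptions):

```python
import math
import numpy as np

def vw_gwo_step(X, alpha, beta, delta, it, a_max=2.0, M=1e5, lb=-100.0, ub=100.0):
    """One VW-GWO position update (sketch of equations (8)-(12)).
    X: (n, dim) wolf positions; alpha/beta/delta: dominant positions."""
    a = a_max * math.exp(-it / M)                              # equation (12)
    phi = 0.5 * math.atan(it)                                  # equation (9)
    theta = (2 / math.pi) * math.acos(1 / 3) * math.atan(it)   # equation (10)
    w1 = math.cos(theta)                                       # equation (11)
    w2 = 0.5 * math.sin(theta) * math.cos(phi)
    w3 = 1.0 - w1 - w2
    n, dim = X.shape
    for i in range(n):
        cand = []
        for leader in (alpha, beta, delta):
            r1, r2 = np.random.rand(dim), np.random.rand(dim)
            A = 2 * a * r1 - a                                 # equation (6)
            C = 2 * r2
            D = np.abs(C * leader - X[i])                      # equation (1)
            cand.append(leader - A * D)                        # equations (2)-(4)
        # Equation (8): weighted, not averaged, combination of the candidates
        X[i] = np.clip(w1 * cand[0] + w2 * cand[1] + w3 * cand[2], lb, ub)
    return X
```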

3. Empirical Studies and the Experiments Prerequisite

The goal of the experiments in this paper is to verify the advantages of the improved GWO algorithm with variable weights (VW-GWO) by comparison with the standard GWO algorithm and other metaheuristic algorithms. Classically, optimization algorithms are applied to benchmark functions, which are used to describe the real problems humans meet.

3.1. Empirical Study of the GWO Algorithm. Although the GWO algorithm has fewer parameters than other algorithms such as the ALO, the PSO, and the bat algorithm (BA) [45], suitable parameter values remain important for the algorithm to be efficient and economic. An empirical study has been carried out, and the results show that the population size should be 20-50, balancing the computing complexity and the convergence rate. In an empirical study of the maximum value a_m, the sphere function (F1) and Schwefel's problems 2.22 (F2) and 1.2 (F3) are optimized to find the relationship between a_m and the mean least iteration times (MLIT) with a given error tolerance of 10^−25, as shown in Figure 2.

Figure 1: The variable weights vs. iterations.


We can learn the following from Figure 2: (1) the maximum value a_m of the controlling parameter a influences the MLIT under a given MAE; when a_m is smaller than 1.0, the smaller a_m is, the more MLIT is needed; on the contrary, if a_m is larger than 2.5, the larger a_m is, the more MLIT is needed. (2) a_m should vary in [1.0, 2.5], and a_m is found to be best at 1.6 or 1.7.

3.2. Benchmark Functions. Benchmark functions are standard functions derived from research on nature. They are usually diverse, unbiased, and difficult to solve with analytical expressions. Benchmark functions have been an essential way to test the reliability, efficiency, and validation of optimization algorithms. They vary in the number of ambiguous peaks in the function landscape, the shape of the basins or valleys, separability, and dimensionality. Mathematically speaking, the benchmark functions can be classified with the following five attributes [46]:

(a) Continuous or discontinuous: most of the functions are continuous, but some of them are not.

(b) Differentiable or nondifferentiable: some of the functions can be differentiated, but some of them cannot.

(c) Separable or nonseparable: some of the functions can be separated, but some of them cannot.

(d) Scalable or nonscalable: some of the functions can be expanded to any dimensionality, but some of them are fixed to two or three dimensions.

(e) Unimodal or multimodal: some of the functions have only one peak in their landscape, but some of them have many peaks. The former attribute is called unimodal, and the latter multimodal.

There are 175 benchmark functions summarized in the literature [46]. In this paper, we choose 11 benchmark functions, from simple to complex, covering all of the above five characteristics. They are fit to test the capability of the involved algorithms, as listed in Table 2, and they are all scalable.

The functions are all n-dimensional, and their input vectors x = (x1, x2, …, xn) are limited by the domain. Values in the domain have a maximum of ub and a minimum of lb. The theoretical optima are all zero, for simplicity.
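For illustration, a few of the benchmark functions of Table 2 can be transcribed directly (a sketch in our own code; x is a NumPy vector, and the function names are ours). Each has its theoretical optimum of zero at the origin:

```python
import numpy as np

def sphere(x):                      # F1: De Jong's sphere
    return float(np.sum(x ** 2))

def schwefel_222(x):                # F2: Schwefel's problem 2.22
    a = np.abs(x)
    return float(np.sum(a) + np.prod(a))

def schwefel_12(x):                 # F3: Schwefel's problem 1.2 (nested partial sums)
    return float(np.sum(np.cumsum(x) ** 2))

def griewank(x):                    # F9: Griewank's function
    i = np.arange(1, len(x) + 1)
    return float(np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)
```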

4. Results and Discussion

There are 11 benchmark functions involved in this study. Comparisons are made with the standard grey wolf optimization algorithm (std GWO) and three other bionic methods: the ant lion optimization (ALO) algorithm, the PSO algorithm, and the BA.

4.1. General Reviews of the Algorithms. Randomness is involved in all the algorithms studied in this paper, for example, random positions, random velocities, and random controlling parameters. The randomness causes the fitness values obtained during the optimization procedure to fluctuate. So when an individual of the swarm is initialized

Figure 2: Relationship between the MLIT and the maximum value a_m (error tolerance E = 1e−25, functions F1-F3).


or it randomly jumps to a position quite near the optimum, the best fitness value will be met. Table 3 lists the best and worst fitness results of some chosen benchmark functions and their corresponding algorithms. In this experiment, 100 Monte Carlo (MC) simulations are carried out for every benchmark function. The results show that the randomness indeed introduces some random variation, but most of the time, the final results depend more on the algorithms.

At first glance of Table 3, the GWO algorithms always work best: either the VW-GWO or the std GWO algorithm optimizes the benchmark functions closest to their optima, with the smallest absolute errors, and the proposed VW-GWO algorithm is almost always the best one. The other compared algorithms, namely, the PSO and ALO algorithms and the BA, most often lead to the worst results. This means that the GWO algorithms are more capable, and the proposed VW-GWO algorithm indeed improves the capability of the std GWO algorithm. A plot of the absolute errors, averaged over the 100 MC runs, versus iterations also leads to this conclusion, as shown in Figure 3.

The convergence rate curve during the iterations of the F3 benchmark function is demonstrated in Figure 3. It shows that the proposed VW-GWO algorithm results in faster convergence, low residual errors, and stable convergence.

4.2. Comparison, Statistical Analysis, and Test. A general acquaintance with the metaheuristic algorithms can be got from Table 3 and Figure 3. However, optimization problems often demand statistical analysis and tests. To do this, 100 MC simulations are carried out on the benchmark functions. The benchmark functions are all two-dimensional, and they are optimized by the newly proposed VW-GWO and the other four algorithms over 100 runs. Since the benchmark functions all have optima at zero, the simulated fitness results are also their absolute errors. The mean values of the absolute errors and the standard deviations of the final results are listed in Table 4; some of the values are quoted from published works, and the references are listed correspondingly.

The proposed VW-GWO algorithm and its compared algorithms are almost all capable of searching for the global optima of the benchmark functions. The detailed values in Table 4 show that the standard deviations of the 100 MC simulations are all small. We can further draw the following conclusions:

(1) All of the algorithms involved in this study were able to find the optimum.

(2) All of the benchmark functions tested in this experiment could be optimized, whether they are unimodal or multimodal, under symmetric or unsymmetric domains.

(3) Comparatively speaking, although the bat algorithm incorporates much more randomness, it did the

Table 2: Benchmark functions to be fitted.

Label  Function name             Expression                                                                    Domain [lb, ub]
F1     De Jong's sphere          y = Σ_{i=1..n} x_i^2                                                          [−100, 100]
F2     Schwefel's problem 2.22   y = Σ_{i=1..n} |x_i| + Π_{i=1..n} |x_i|                                       [−100, 100]
F3     Schwefel's problem 1.2    y = Σ_{i=1..n} (Σ_{j=1..i} x_j)^2                                             [−100, 100]
F4     Schwefel's problem 2.21   y = max_{1≤i≤n} |x_i|                                                         [−100, 100]
F5     Chung Reynolds function   y = (Σ_{i=1..n} x_i^2)^2                                                      [−100, 100]
F6     Schwefel's problem 2.20   y = Σ_{i=1..n} |x_i|                                                          [−100, 100]
F7     Csendes function          y = Σ_{i=1..n} x_i^6 (2 + sin(1/x_i))                                         [−1, 1]
F8     Exponential function      y = −exp(−0.5 Σ_{i=1..n} x_i^2)                                               [−1, 1]
F9     Griewank's function       y = Σ_{i=1..n} x_i^2/4000 − Π_{i=1..n} cos(x_i/√i) + 1                        [−100, 100]
F10    Salomon function          y = 1 − cos(2π √(Σ_{i=1..n} x_i^2)) + 0.1 √(Σ_{i=1..n} x_i^2)                 [−100, 100]
F11    Zakharov function         y = Σ_{i=1..n} x_i^2 + (Σ_{i=1..n} 0.5 i x_i)^2 + (Σ_{i=1..n} 0.5 i x_i)^4    [−5, 10]

Table 3: The best and worst simulation results and their corresponding algorithms (dim = 2).

Function   Value          Corresponding algorithm
Best fitness
F1         1.4238e−70     VW-GWO
F2         3.2617e−36     VW-GWO
F3         3.6792e−68     VW-GWO
F4         3.3655e−66     Std GWO
F7         7.8721e−222    VW-GWO
F8         0              VW-GWO, Std GWO, PSO, BA
F9         0              VW-GWO, Std GWO
F11        2.6230e−69     VW-GWO
Worst fitness
F1         1.0213e−07     BA
F2         4.1489e−04     BA
F3         5.9510e−08     BA
F4         2.4192e−06     PSO
F7         1.0627e−24     BA
F8         5.7010e−13     BA
F9         1.0850e−01     ALO
F11        9.9157e−09     BA


worst job. The PSO and ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better in optimizing the benchmark functions than the std GWO algorithm, as well as the ALO and PSO algorithms and the BA, which can also be seen from the Wilcoxon rank sum test [47] results listed in Table 5.

In Table 5, the p values of the Wilcoxon rank sum test are reported; they show that the proposed VW-GWO algorithm has superiority on most of the benchmark functions except F5.
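For reference, a Wilcoxon rank sum test like the one reported in Table 5 can be computed on two samples of final absolute errors with a short self-contained routine (our own normal-approximation sketch without tie correction; the sample data below are invented placeholders, not the paper's results):

```python
import math
import numpy as np

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank sum test (normal approximation, no tie correction)."""
    x, y = np.asarray(x), np.asarray(y)
    n1, n2 = len(x), len(y)
    combined = np.concatenate([x, y])
    ranks = np.empty(n1 + n2)
    ranks[np.argsort(combined)] = np.arange(1, n1 + n2 + 1)  # ranks 1..n1+n2
    w = ranks[:n1].sum()                       # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2              # mean of W under the null hypothesis
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))       # two-sided p value
    return z, p
```

A p value below 0.05 rejects the null hypothesis that the two algorithms produce errors with the same distribution.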

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the std GWO algorithm, the proposed VW-GWO algorithm does not generate additional uncontrolled parameters; furthermore, it improves the feasibility of the std GWO algorithm by introducing an admissible maximum iteration number. On the contrary, there is a large amount of randomness in the compared bionic algorithms, such as the ALO and PSO algorithms and the BA. Therefore, the proposed algorithm is expected to be favored by engineers who need the fastest convergence and the most precise results under the most control. Thus, there is a need to verify that the proposed algorithm converges fast, beyond the brief acquaintance given by Figure 3.

Generally speaking, optimization algorithms are usually used to find optima under constrained conditions. The optimization procedure must end in reality, and it is expected to be as fast as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, but the algorithm is still expected to end quickly under the given conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error: the absolute values of the errors are constrained to be less than the MAE of 1.0 × 10^-3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out, and for simplicity, not all classical benchmark functions are involved. The final statistical results are listed in Tables 6-8. Note that the complexity of the ALO algorithm is very large, and it is time exhausting on the current simulation hardware described in the Appendix, so it is not included in this experiment.
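As a hedged illustration of the MLIT protocol described above, the sketch below uses a toy one-dimensional optimizer and our own names (not the paper's Matlab code): each run stops at the first iteration whose absolute error falls below the MAE of 1.0e-3, runs that hit the cap M are discarded, and the kept iteration counts are averaged.

```python
import random

MAE = 1.0e-3          # maximum admissible error
M = int(1.0e5)        # admissible maximum iteration number
MC = 100              # Monte Carlo repetitions

def run_once(seed):
    """One MC run: iterate a stand-in optimizer until |error| < MAE or the cap M."""
    rng = random.Random(seed)
    x = rng.uniform(-100.0, 100.0)      # random initial position
    for it in range(1, M + 1):
        x *= 0.5                        # toy contraction toward the optimum at 0
        if abs(x * x) < MAE:            # absolute error of the fitness f(x) = x^2
            return it
    return None                         # reached the cap: discard this run

its = [run_once(s) for s in range(MC)]
kept = [t for t in its if t is not None]
mlit = sum(kept) / len(kept)
print(f"MLIT = {mlit:.2f} over {len(kept)} kept runs")
```

The "Number" column of Tables 6-8 corresponds to `len(kept)` here: only runs that finish before the cap contribute to the mean.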

Table 6 lists the MLIT data when the VW-GWO, std GWO, PSO algorithm, and BA are applied to the unimodal benchmark function F1. The best, worst, and standard deviation MLIT values are listed; the mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the remaining number of MC simulations after discarding all runs in which the searching process reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared to the other three algorithms involved. The data in Tables 7 and 8 are obtained under the same conditions; the only difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetrical domain, whereas Table 8 lists the data obtained when the algorithms are applied to a multimodal benchmark function with an unsymmetrical domain. The same conclusion can be drawn.

Note that in this experiment, the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that if the dimension of the benchmark functions is raised, the MLIT values increase dramatically. This phenomenon raises the question of whether the algorithm still performs the best and is capable of solving high-dimensional problems.

4.4. High-Dimensional Availability Test. Tables 6-8 show that the larger the dimension is, the more MLITs are needed to meet the experiment constraints. However, as described in the first section, optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions. Thus, high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merits to solve large-scale problems. An experiment with dim = 200 is carried out to test the capability of the algorithms in solving high-dimensional problems. For simplicity, three classical benchmark functions, F4 (Schwefel's problem 2.21 function), F8 (exponential function), and F11 (Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments are evaluated and counted, and

Figure 3: F3 convergence (absolute errors) vs. iterations (dim = 2) for VW-GWO, Std GWO, ALO, PSO, and BA.


Table 4: Statistical analysis of the absolute errors of the selected functions (dim = 2).

Functions | VW-GWO (Mean, Std deviation) | Std GWO (Mean, Std deviation) | ALO (Mean, Std deviation) | PSO (Mean, Std deviation) | BA (Mean, Std deviation)
F1 | 7.2039e-66, 3.5263e-65 | 6.59E-28, 6.34E-5 [27] | 2.59E-10, 1.65E-10 [2] | 1.36E-4, 2.02E-4 [27] | 0.773622, 0.528134 [2]
F2 | 1.3252e-34, 3.5002e-34 | 7.18E-17, 0.02901 [27] | 1.84241E-6, 6.58E-7 [2] | 0.042144, 0.04542 [27] | 0.334583, 3.186022 [2]
F3 | 3.7918e-60, 1.1757e-59 | 3.29E-6, 79.1496 [27] | 6.0685E-10, 6.34E-10 [2] | 70.12562, 22.1192 [27] | 0.115303, 0.766036 [2]
F4 | 2.2262e-46, 2.8758e-46 | 5.61E-7, 1.31509 [27] | 1.36061E-8, 1.81E-9 [2] | 0.31704, 7.3549 [27] | 0.192185, 0.890266 [2]
F5 | 3.6015e-131, 9.0004e-131 | 7.8319e-97, 2.4767e-96 | 2.1459e-20, 2.8034e-20 | 8.4327e-20, 1.7396e-19 | 1.7314e-17, 4.9414e-17
F9 | 0.0047, 0.0040 | 0.00449, 0.00666 [27] | 0.0301, 0.0329 | 0.00922, 0.00772 [27] | 0.0436, 0.0294
F10 | 0.0200, 0.0421 | 0.0499, 0.0526 | 0.01860449, 0.009545 [2] | 0.273674, 0.204348 [2] | 1.451575, 0.570309 [2]
F11 | 1.2999e-60, 4.1057e-60 | 6.8181e-35, 1.5724e-34 | 1.1562e-13, 1.2486e-13 | 2.3956e-12, 3.6568e-12 | 5.0662e-09, 4.9926e-09


each time, the search procedure is also iterated a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.

To test its capability even further, we also carry out an experiment to verify the capability of solving some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimension from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank-sum test for VW-GWO over the benchmark functions (dim = 2).

Algorithm | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11
Std GWO | 0.000246 | 0.00033 | 0.000183 | 0.00044 | 0.000183 | 0 | 0.000183 | - | 0.466753 | 0.161972 | 0.000183
PSO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.167489 | 0.004435 | 0.025748 | 0.000183
ALO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.36812 | 0.790566 | 0.025748 | 0.000183
BA | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0 | 0.000183 | 0.000747 | 0.004435 | 0.01133 | 0.000183
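The p values of Table 5 come from the Wilcoxon rank-sum test of Derrac et al. [47]. A minimal standard-library sketch of that test, using the normal approximation with mid-ranks for ties (and no tie correction of the variance), might look like the following; the function name and sample data are ours, not the paper's.

```python
import math
from statistics import NormalDist

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum p value via the normal approximation."""
    pooled = sorted(list(x) + list(y))
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2.0   # mid-rank shared by tied values
        i = j
    n1, n2 = len(x), len(y)
    w = sum(rank[v] for v in x)               # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2.0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mean) / sd
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))
```

In practice the two samples would be the absolute errors of VW-GWO and of the compared algorithm over the MC runs; a small p value rejects the hypothesis that both algorithms draw their errors from the same distribution.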

Table 6: MLITs and statistical results for F1.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 6 | 12 | 9.90 | 1.7180e-193 | 1.0493 | 100
2 | Std GWO | 7 | 13 | 10.38 | 1.8380e-22 | 1.2291 | 100
2 | PSO | 48 | 1093 | 357.97 | 3.2203e-22 | 205.3043 | 100
2 | BA | 29 | 59 | 41.00 | 1.3405e-101 | 5.8517 | 100
10 | VW-GWO | 53 | 66 | 59.97 | 4.1940e-177 | 2.7614 | 100
10 | Std GWO | 74 | 89 | 80.40 | 1.9792e-80 | 2.7614 | 100
10 | PSO | 5713 | 11510 | 9279.22 | 2.9716e-76 | 1300.8485 | 88
10 | BA | 6919 | 97794 | 44999.04 | 7.5232e-26 | 25133.3096 | 78
30 | VW-GWO | 55 | 67 | 59.85 | 1.2568e-122 | 2.4345 | 100
30 | Std GWO | 71 | 86 | 80.07 | 2.6197e-79 | 3.3492 | 100
30 | PSO | 5549 | 12262 | 9314.78 | 9.6390e-83 | 1316.3384 | 96
30 | BA | 7238 | 92997 | 44189.16 | 5.2685e-26 | 24831.7443 | 79

Table 7: MLITs and statistical results for F7.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 1 | 3 | 1.46 | 6.3755e-226 | 0.5397 | 100
2 | Std GWO | 1 | 2 | 1.41 | 1.0070e-229 | 0.4943 | 100
2 | PSO | 2 | 2 | 2.00 | 0 | 0 | 100
2 | BA | 1 | 3 | 1.02 | 8.5046e-269 | 0.200 | 100
10 | VW-GWO | 5 | 9 | 7.65 | 5.7134e-199 | 0.9468 | 100
10 | Std GWO | 5 | 11 | 7.48 | 5.1288e-191 | 1.1413 | 100
10 | PSO | 4 | 65 | 24.23 | 1.6196e-85 | 10.9829 | 100
10 | BA | 13 | 49 | 25.29 | 5.9676e-109 | 6.2366 | 100
30 | VW-GWO | 13 | 22 | 17.14 | 9.6509e-167 | 1.7980 | 100
30 | Std GWO | 15 | 30 | 20.80 | 1.3043e-148 | 2.6208 | 100
30 | PSO | 54 | 255 | 133.32 | 5.7600e-12 | 42.5972 | 100
30 | BA | 40 | 101 | 62.68 | 1.8501e-53 | 11.8286 | 100

Table 8: MLITs and statistical results for F11.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 3 | 9 | 6.63 | 5.6526e-188 | 1.2363 | 100
2 | Std GWO | 4 | 10 | 6.66 | 3.5865e-186 | 1.2888 | 100
2 | PSO | 6 | 125 | 46.35 | 1.6006e-37 | 26.0835 | 100
2 | BA | 5 | 62 | 27.58 | 1.6166e-83 | 11.0080 | 100
10 | VW-GWO | 10 | 200 | 65.57 | 2.8562e-12 | 43.2281 | 100
10 | Std GWO | 14 | 246 | 68.68 | 2.6622e-11 | 41.7104 | 100
10 | PSO | 15 | 1356 | 231.74 | 1.2116e-6 | 257.1490 | 94
10 | BA | 15 | 214 | 113.19 | 5.1511e-2 | 66.9189 | 100
30 | VW-GWO | 49 | 1179 | 312.24 | 1.2262e-18 | 194.7643 | 100
30 | Std GWO | 65 | 945 | 294.45 | 3.1486e-21 | 160.7119 | 100
30 | PSO | 32 | 5005 | 1086.11 | 6.0513e-13 | 980.3386 | 72
30 | BA | 66 | 403 | 221.60 | 1.9072e-51 | 40.5854 | 100


Table 9: Statistical analysis of the absolute errors of the selected functions (dim = 200).

Functions | VW-GWO (Mean, Std deviation) | Std GWO (Mean, Std deviation) | ALO (Mean, Std deviation) | PSO (Mean, Std deviation) | BA (Mean, Std deviation)
F4 | 3.3556e-53, 8.7424e-53 | 1.6051e-46, 2.2035e-46 | 4.2333e-07, 2.9234e-07 | 3.0178e-07, 6.5449e-07 | 1.6401e-07, 2.1450e-07
F8 | 0, 0 | 3.3307e-17, 7.4934e-17 | 1.1102e-17, 3.5108e-17 | 1.4466e-14, 1.9684e-14 | -, -
F11 | 0.0115, 0.0193 | 0.0364, 0.0640 | 8.3831, 10.3213 | 12.6649, 13.0098 | 4.7528e+16, 2.8097e+16


absolute errors averaged over the MC times, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are then introduced into their searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced so that it declines exponentially from the maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm, with comparisons made to the original GWO algorithm, the ALO, the PSO algorithm, and the BA.

All the selected experiment results show that the proposed VW-GWO algorithm works better under different conditions than the others. Varying the dimensions does not change its first position among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge; they lead to faster convergence and wide applications, but the algorithm is not found to be superior on all of the benchmark functions. Further work is needed to explain the reasons mathematically. Other initializing algorithms might be needed to let the initial swarm individuals spread all through the domain, and new searching rules for when the individuals are in the basins would be another hot spot of future work.

Appendix

The simulation platform, as described in Section 3.3, runs on an assembled desktop computer configured as follows: CPU, Xeon E3-1231 v3; GPU, NVidia GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software for this paper can be downloaded from http://d.dl.escience.cn/f/Erl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors' Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea on the GWO algorithm and programmed the work with Matlab. Her major contributions are the programmed work and the proposed exponentially declined governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by the research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67-82, 1997.
[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[3] X.-S. Yang, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.
[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232-S237, 2010.
[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1 - bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.
[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1-12, Berlin, Germany, December 2012.
[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273-278, 2013.
[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25-34, 1987.
[9] J. Zhao and Z.-M. Gao, "The bat algorithm and its parameters," in Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimensions based on VW-GWO (dimensions from 100 to 1000; curves for F1, F2, F3, and F11).


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614-627, 2015.
[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1-3, pp. 1-7, 2014.
[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333-336, Zhangjiajie, China, January 2014.
[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737-740, 2014.
[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301-305, 2014.
[15] C.-M. Yan, B.-L. Guo, and X.-X. Wu, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1-8, 2012.
[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99, Cat. No. 99TH8406), pp. 345-357, Washington, DC, USA, July 1999.
[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259-275, 2015.
[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645-6668, 2013.
[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591-604, 2015.
[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423-438, 2015.
[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451-6458, 2013.
[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.
[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087-1092, 1953.
[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28-39, 2006.
[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.
[26] H. Haklı and H. Uguz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333-345, 2014.
[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853-862, 2015.
[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411-429, 2015.
[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147-157, 2015.
[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252-264, 2016.
[33] S. A. Medjahed, T. A. Saadi, A. Benyettou, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178-186, 2016.
[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371-381, 2016.
[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97-115, 2016.
[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286-292, 2015.
[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106-119, 2016.
[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623-632, 2015.
[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55-60, 2016.
[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031-2046, 2016.
[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243-4255, 2016.
[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a Robust Grey Wolf Optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321-1336, 2018.
[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453-1460, 2013.
[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239-1255, 2013.
[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150-194, 2013.
[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.




In algorithms such as the genetic algorithm (GA) [22], simulated annealing (SA) [23], and the ant colony optimization (ACO) algorithm [24], the individuals are treated in the same way, and the final results are the best fitness values. Metaheuristic algorithms perform their behavior under the same governing equations. To achieve a better performance and decrease the possibility of being trapped in local optima, random walks or Levy flights are introduced to the individuals when specific conditions are met [25, 26]. These mostly mean that the swarms would perform their behavior in less controlled ways. Furthermore, as organisms living in swarms in nature, most of them have social hierarchies as long as they are slightly intelligent. For example, in an ant colony, the queen is the commander despite its reproduction role, the dinergates are soldiers that guard the colony, while the ergates are tasked with building, gathering, and breeding. It can be concluded that the hierarchy of the ant colony is queen → dinergates → ergates if they are classified by jobs. The ergates' behavior could be directed by their elders' experience, their queen, or the dinergates. If the ergates are commanded by the queen, some dinergates, or elders, and such operations are mathematically described and introduced to the ant colony optimization (ACO) algorithm in some way, will the ACO algorithm perform better in solving the problems? In other words, what if the social hierarchy of the swarms is considered in the metaheuristic algorithms? This work was done by Mirjalili et al., and a new optimization method called the grey wolf optimization (GWO) algorithm was proposed [27].

The GWO algorithm considers the searching and hunting behavior and the social hierarchy of the grey wolves. Due to less randomness and varying numbers of individuals assigned to the global and local searching procedures, the GWO algorithm is easier to use and converges more rapidly. It has been proved to be more efficient than the PSO algorithm [27] and other bionic algorithms [28-32]. More attention has been paid to its applications due to its better performance: efforts have been made in feature and band selection [33, 34], automatic control [29, 35], power dispatching [32, 36], parameter estimation [31], shop scheduling [28], and multiobjective optimization [37, 38]. However, the standard GWO algorithm was formulated with equal importance of the grey wolves' positions, which is not strictly consistent with their social hierarchy. Recent developments of the GWO algorithm, such as the binary GWO algorithm [34], the multiobjective GWO algorithm [37], and hybrids with other algorithms [39], together with their applications [40-43], keep this formulation unchanged. If the searching and hunting positions of the grey wolves also agree with the social hierarchy, the GWO algorithm can possibly be improved. With a hypothesis that the social hierarchy of the grey wolves would also be functional in the grey wolves' searching procedure, we report an improvement of the original GWO algorithm in this paper. And considering applications in engineering, where a maximum admissible error (MAE) is usually prescribed for given problems, an exponentially declined governing equation of the controlling parameter is introduced to avoid the unknown maximum

iteration number. The rest of this paper is organized as follows.

Section 2 presents the inspiration for the improvement and the revision of the controlling equations to meet the needs of the latter experiments. The experiment setup is described in Section 3, and results are compared in Section 4. Finally, Section 5 concludes the work, and further research suggestions are made.

2. Algorithms

According to Mirjalili et al. [27], the grey wolves live together and hunt in groups. The searching and hunting process can be described as follows: (1) if a prey is found, they first track, chase, and approach it; (2) if the prey runs, the grey wolves pursue, encircle, and harass the prey until it stops moving; (3) finally, the attack begins.

2.1. Standard GWO Algorithm. Mirjalili designed the optimization algorithm by imitating the searching and hunting process of grey wolves. In the mathematical model, the fittest solution is called the alpha (α), the second best is the beta (β), and consequently the third best is named the delta (δ). The rest of the candidate solutions are all assumed to be omegas (ω). All of the omegas are guided by these three grey wolves during the searching (optimizing) and hunting.

When a prey is found, the iteration begins (t = 1). Thereafter, the alpha, beta, and delta wolves lead the omegas to pursue and eventually encircle the prey. Three coefficients, A, C, and D, are proposed to describe the encircling behavior:

D_α = |C_1 · X_α − X(t)|,
D_β = |C_2 · X_β − X(t)|,
D_δ = |C_3 · X_δ − X(t)|,     (1)

where t indicates the current iteration, X(t) is the position vector of a grey wolf, and X_α, X_β, and X_δ are the position vectors of the alpha, beta, and delta wolves. The candidate positions X_1, X_2, and X_3 and the new position X are computed as follows:

X_1 = X_α − A_1 · D_α,     (2)

X_2 = X_β − A_2 · D_β,     (3)

X_3 = X_δ − A_3 · D_δ,     (4)

X(t + 1) = (X_1 + X_2 + X_3) / 3.     (5)

The parameters A and C are combinations of the controlling parameter a and the random vectors r_1 and r_2 [27]:

A = 2a · r_1 − a,
C = 2 · r_2.     (6)


The controlling parameter a changes A and finally causes the omega wolves to approach or run away from the dominant wolves, that is, the alpha, beta, and delta. Theoretically, if |A| > 1, the grey wolves run away from the dominants; this means the omega wolves run away from the prey and explore more space, which is called a global search in optimization. And if |A| < 1, they approach the dominants, which means the omega wolves follow the dominants in approaching the prey; this is called a local search in optimization.

The controlling parameter a is defined to decline linearly from a maximum value of 2 to zero while the iterations are carried on:

a = 2(1 − it/N),     (7)

where N is the maximum iteration number, which is initialized at the beginning by users, and it is defined as the cumulative iteration number. The application procedure can be divided into three parts: (1) the given problems are understood and mathematically described, and some elemental parameters are then known; (2) a pack of grey wolves is randomly initialized all through the space domain; (3) the alpha and the other dominant grey wolves lead the pack to search, pursue, and encircle the prey. When the prey is encircled by the grey wolves and it stops moving, the search finishes and the attack begins. The pseudocode is listed in Table 1.

2.2. Proposed Variable Weights and Their Governing Equations. We can see from the governing equation (5) that the dominants play the same role in the searching process: every one of the grey wolves approaches or runs away from the dominants with an average weight of the alpha, beta, and delta. However, although the alpha is the nearest to the prey at the beginning of the search, it might be far away from the final result, let alone the beta and delta. Therefore, at the beginning of the searching procedure, only the position of the alpha should be considered in equation (5), or its weight should be much larger than those of the other dominants. On the other hand, the averaging weight in equation (5) is also against the social hierarchy hypothesis of the grey wolves. If the social hierarchy is strictly followed in the pack, the alpha is the leader, and he/she might always be the nearest one to the prey. The alpha wolf should be the most important, which means that the weight of the alpha's position in equation (5) should always be no less than those of the beta and the delta. Consequently, the weight of the beta's position should always be no less than that of the delta. Based on these considerations, we further hypothesize the following.

(1) The searching and hunting processes are always governed by the alpha; the beta plays a less important role, and the delta plays a much less important one. Any of the other grey wolves transfers his/her position to the alpha if he/she obtains the best one. It should be noted that in real searching and hunting procedures, the best position is the one nearest to the prey, while in optimization for a global optimum of a given problem, the best position is the maximum or minimum of the fitness value under the given restrictions.

(2) During the searching process, a hypothesized prey is always surrounded by the dominants, while in the hunting process, a real prey is encircled. The dominant grey wolves are at positions surrounding the prey in order of their social hierarchy. This means that the alpha is the nearest one among the grey wolves, the beta is the nearest one in the pack except for the alpha, and the delta ranks third. The omega wolves are involved in the processes, and they transfer their better positions to the dominants.

With hypothesis mentioned hereinbefore the updatemethod of the positions should not be considered the samein equation (5)

When the search begins the alpha is the nearest and therest are all not important So hisher position should becontributed to the new searching individuals while all of theothers could be ignored 0is means that the weight of thealpha should be near to 10 at the beginning while theweights of the beta and delta could be near zero at this timeAt the final state the alpha beta and the delta wolves shouldencircle the prey which means they have an equal weight asmentioned in equation (5) Along with the searching pro-cedure from the beginning to the end the beta comes upwith the alpha as heshe always rank the second and thedelta comes up with the beta due to hisher third rank 0ismeans that the weights of the beta and delta arise along withthe cumulative iteration number So the weight of the alphashould be reduced and the weights of the beta and deltaarise

The above ideas can be formulated mathematically. First of all, all of the weights should be variable and limited to 1.0 when summed up. Equation (5) is then changed as follows:

\vec{X}(t+1) = w_1\vec{X}_1 + w_2\vec{X}_2 + w_3\vec{X}_3, \quad w_1 + w_2 + w_3 = 1. (8)

Table 1: Pseudocode of the GWO algorithm.

Set up optimization: dimension of the given problem; limitations (bounds) of the given problem; population size; controlling parameter; stop criterion (maximum iteration times or admissible errors).
Initialization: positions of all of the grey wolves, including the α, β, and δ wolves.
Searching: while not the stop criterion — calculate the new fitness function; update the positions; limit the scope of the positions; refresh α, β, and δ; update the stop criterion — end.
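The pseudocode of Table 1 can be sketched as a minimal runnable loop. The Python sketch below is our own illustration of the standard GWO (not the authors' Matlab code); the function names are ours, and the controlling parameter follows the standard linear schedule of equation (7).

```python
import random

def gwo(fitness, dim, lb, ub, pop_size=30, max_iter=200):
    """Minimal standard GWO following the pseudocode of Table 1."""
    # Initialization: random positions for the whole pack
    wolves = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(pop_size)]
    alpha, beta, delta = (list(w) for w in sorted(wolves, key=fitness)[:3])
    for it in range(max_iter):
        a = 2.0 * (1.0 - it / max_iter)  # linearly declining controlling parameter
        for w in wolves:
            new_pos = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A = 2.0 * a * r1 - a
                    C = 2.0 * r2
                    D = abs(C * leader[d] - w[d])
                    x += leader[d] - A * D       # accumulate X1, X2, X3
                # average of X1..X3 (equation (5)); limit the scope of positions
                new_pos.append(min(max(x / 3.0, lb), ub))
            w[:] = new_pos
        # refresh alpha, beta, and delta
        alpha, beta, delta = (list(w) for w in sorted(wolves, key=fitness)[:3])
    return alpha, fitness(alpha)

if __name__ == "__main__":
    random.seed(7)
    best, best_fit = gwo(lambda x: sum(v * v for v in x), dim=2, lb=-100, ub=100)
    print(best_fit)  # close to the sphere optimum of 0
```

The leaders are copied before each generation so that the whole pack moves toward a fixed alpha/beta/delta snapshot rather than toward positions mutating mid-sweep.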

Computational Intelligence and Neuroscience 3

Secondly, the weight of the alpha w1, that of the beta w2, and that of the delta w3 should always satisfy w1 ≥ w2 ≥ w3. Mathematically speaking, the weight of the alpha would change from 1.0 to 1/3 along the searching procedure, while at the same time, the weights of the beta and delta would increase from 0.0 to 1/3. Generally speaking, a cosine function can be introduced to describe w1 when we restrict an angle θ to vary in [0, arccos(1/3)].

Thirdly, the weights should vary with the cumulative iteration number, or "it". We know that w2, w3 → 0 when it = 0 and w1 = w2 = w3 → 1/3 when it → ∞. So we introduce an arc-tangent function of it, which varies from 0.0 to π/2. Conveniently, sin(π/4) = cos(π/4) = √2/2, so another angular parameter φ is introduced as follows:

\varphi = \frac{1}{2}\arctan(it). (9)

Considering that w2 should increase from 0.0 to 1/3 along with it, we hypothesize that it contains sin θ and cos φ, with θ → arccos(1/3) when it → ∞; therefore,

\theta = \frac{2}{\pi}\arccos\left(\frac{1}{3}\right)\arctan(it). (10)

When it → ∞, θ → arccos(1/3) and w2 → 1/3, so w2 can be formulated in detail. Based on these considerations, a new update method of the positions with variable weights is proposed as follows:

w_1 = \cos\theta, \quad w_2 = \frac{1}{2}\sin\theta\cos\varphi, \quad w_3 = 1 - w_1 - w_2. (11)

The curves of the variable weights are drawn in Figure 1. We can then find that the variable weights satisfy the hypothesis that the social hierarchy of the grey wolves functions in their searching behavior.
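As a cross-check of equations (9)–(11), the weights can be computed directly for a given cumulative iteration number. This is our own illustrative sketch, not the authors' code:

```python
import math

ARCCOS_ONE_THIRD = math.acos(1.0 / 3.0)

def variable_weights(it):
    """Variable weights of equations (9)-(11): the alpha dominates at it = 0,
    and all three weights tend to 1/3 as it grows."""
    phi = 0.5 * math.atan(it)                                    # equation (9)
    theta = (2.0 / math.pi) * ARCCOS_ONE_THIRD * math.atan(it)   # equation (10)
    w1 = math.cos(theta)                        # alpha weight, starts at 1.0
    w2 = 0.5 * math.sin(theta) * math.cos(phi)  # beta weight, starts at 0.0
    w3 = 1.0 - w1 - w2                          # delta weight closes the sum to 1
    return w1, w2, w3
```

At it = 0 this returns (1, 0, 0); for very large it, all three weights approach 1/3, matching the limits discussed above.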

2.3. Proposed Exponentially Declining Governing Equation of the Controlling Parameter. In equation (7), the controlling parameter declines linearly from two to zero as the iterations proceed from zero to the maximum N. However, an optimization is usually ended with a maximum admissible error (MAE), which is what is requested in engineering. This also means that the maximum iteration number N is unknown.

Furthermore, the controlling parameter is a restriction parameter for A, which is responsible for the grey wolf approaching or running away from the dominants. In other words, the controlling parameter governs whether the grey wolves search globally or locally in the optimizing process. The global search probability is expected to be larger when the search begins, and consequently, the local search probability is expected to be larger when the algorithm approaches the optimum. Therefore, to obtain better performance of the GWO algorithm, the controlling parameter is expected to decrease quickly when the optimization starts, so that the algorithm converges to the optimum very fast. On the contrary, some grey wolves are expected to keep searching globally to avoid being trapped in local optima. Considering these reasons, an exponentially declining controlling parameter [44] is introduced as described below:

a = a_m e^{-it/M}, (12)

where a_m is the maximum value and M is an admissible maximum iteration number. The parameter M restricts the algorithm to avoid long running times and nonconvergence. It is expected to be larger than 10^4 or 10^5 based on the computing hardware used in most laboratories nowadays.
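Equation (12) translates directly into code. The sketch below is our own illustration; the default a_m = 1.7 is an assumed value taken from within the effective range reported by the empirical study in Section 3.1, and M = 10^5 follows the text.

```python
import math

def controlling_parameter(it, a_m=1.7, m=100000):
    """Exponentially declining controlling parameter of equation (12):
    a = a_m * exp(-it / M). The defaults a_m and m are illustrative choices."""
    return a_m * math.exp(-it / m)
```

Unlike the linear schedule of equation (7), this value never quite reaches zero, so the wolves keep a nonzero chance of global search even late in the run.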

3. Empirical Studies and the Experiment Prerequisites

The goal of the experiments in this paper is to verify the advantages of the improved GWO algorithm with variable weights (VW-GWO) by comparison with the standard GWO algorithm and other metaheuristic algorithms. Classically, optimization algorithms are applied to optimize benchmark functions, which are used to describe the real problems humans meet.

3.1. Empirical Study of the GWO Algorithm. Although there are fewer parameters in the GWO algorithm than in other algorithms such as the ALO, PSO, and bat algorithm (BA) [45], suitable values of the parameters remain important for the algorithm to be efficient and economic. An empirical study has been carried out, and the results show that the population size is expected to be 20–50, balancing the computing complexity and the convergence rate. In an empirical study on the maximum value a_m, the sphere function (F1) and Schwefel's problems 2.22 (F2) and 1.2 (F3) are optimized to find the relationship between a_m and the mean least iteration times with a given error tolerance of 10^−25, as shown in Figure 2.

Figure 1: The variable weights (w1, w2, w3) vs. iterations.


We can learn the following from Figure 2: (1) the maximum value a_m of the controlling parameter a influences the MLIT under a given MAE; when a_m is smaller than 1.0, the smaller a_m is, the more MLIT is needed, and on the contrary, if a_m is larger than 2.5, the larger a_m is, the more MLIT is needed; (2) a_m should be varied in [1.0, 2.5], and a_m is found to be best at 1.6 or 1.7.

3.2. Benchmark Functions. Benchmark functions are standard functions derived from research on nature. They are usually diverse, unbiased, and difficult to solve with analytical expressions. Benchmark functions have become an essential way to test the reliability, efficiency, and validity of optimization algorithms. They vary in the number of ambiguous peaks in the function landscape, the shape of the basins or valleys, separability, and dimensionality. Mathematically speaking, the benchmark functions can be classified with the following five attributes [46]:

(a) Continuous or discontinuous: most of the functions are continuous, but some of them are not.

(b) Differentiable or nondifferentiable: some of the functions can be differentiated, but some cannot.

(c) Separable or nonseparable: some of the functions can be separated, but some cannot.

(d) Scalable or nonscalable: some of the functions can be expanded to any dimensionality, but some are fixed to two or three dimensions.

(e) Unimodal or multimodal: some of the functions have only one peak in their landscape, but some have many peaks. The former attribute is called unimodal, and the latter multimodal.

There are 175 benchmark functions summarized in the literature [46]. In this paper, we choose 11 benchmark functions ranging from simple to complex and covering all five of the above attributes. They are suited to testing the capability of the involved algorithms, as listed in Table 2, and they are all scalable.

The functions are all n-dimensional, and their input vectors x = (x1, x2, ..., xn) are limited by the domain, with values at most ub and at least lb. For simplicity, their global minimum values are all zero theoretically.
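Several of the scalable benchmark functions listed in Table 2 are one-liners. The sketch below (our own illustration) shows F1, F2, and F9, each with a theoretical minimum of zero at the origin:

```python
import math

def f1_sphere(x):
    """F1, De Jong's sphere: sum of squares, domain [-100, 100]."""
    return sum(v * v for v in x)

def f2_schwefel_2_22(x):
    """F2, Schwefel's problem 2.22: sum plus product of absolute values."""
    return sum(abs(v) for v in x) + math.prod(abs(v) for v in x)

def f9_griewank(x):
    """F9, Griewank's function: many local minima around one global minimum."""
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i)) for i, v in enumerate(x, start=1))
    return s - p + 1.0
```

All three accept a vector of any length, matching the scalability attribute discussed above (math.prod requires Python 3.8+).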

4. Results and Discussion

There are 11 benchmark functions involved in this study. Comparisons are made with the standard grey wolf optimization algorithm (std GWO) and three other bionic methods: the ant lion optimization algorithm (ALO), the PSO algorithm, and the BA.

4.1. General Review of the Algorithms. Randomness is involved in all of the algorithms studied in this paper, for example, random positions, random velocities, and random controlling parameters. The randomness causes the fitness values obtained during the optimization procedure to fluctuate. So, when an individual of the swarm is initialized

Figure 2: Relationship between the MLIT and the maximum value a_m (minimum iterations at E = 1e−25, for F1, F2, and F3).


or it randomly jumps to a position quite near the optimum, the best fitness value would be met. Table 3 lists the best and worst fitness results of some chosen benchmark functions and their corresponding algorithms. In this experiment, 100 Monte Carlo (MC) simulations are carried out for every benchmark function. The results show that the randomness indeed introduces some chance into the outcomes, but most of the time, the final results depend more on the algorithms.

At first glance of Table 3, the GWO algorithms always work best: either the VW-GWO or the std GWO algorithm can optimize the benchmark functions toward their optima with little absolute error, and the proposed VW-GWO algorithm is almost always the best one. The other compared algorithms, such as the PSO and ALO algorithms and the BA, lead to the worst results most of the time. This means that the GWO algorithms are more capable and that the proposed VW-GWO algorithm indeed improves the capability of the std GWO algorithm. A plot of the absolute errors averaged over the 100 MC runs versus iterations also leads to this conclusion, as shown in Figure 3.

The convergence rate curve during the iterations of the F3 benchmark function is demonstrated in Figure 3. It shows that the proposed VW-GWO algorithm results in faster convergence, low residual errors, and stable convergence.

4.2. Comparison, Statistical Analysis, and Test. A general acquaintance with the metaheuristic algorithms may be obtained from Table 3 and Figure 3. However, optimization problems often demand statistical analysis and testing. To do this, 100 MC simulations are carried out on the benchmark functions. The benchmark functions are all two-dimensional, and they are optimized by the newly proposed VW-GWO and the other four algorithms over 100 runs. Because the benchmark functions all have theoretical optima of zero, the simulated fitness results are also their absolute errors. The mean values of the absolute errors and the standard deviations of the final results are listed in Table 4; some of the values are quoted from published work, and the corresponding references are listed.

The proposed VW-GWO algorithm and the compared algorithms are almost all capable of finding the global optima of the benchmark functions. The detailed values in Table 4 show that the standard deviations of the 100 MC simulations are all small. We can further draw the following conclusions:

(1) All of the algorithms involved in this study were able to find the optimum.

(2) All of the benchmark functions tested in this experiment could be optimized, whether they are unimodal or multimodal, under a symmetric or unsymmetric domain.

(3) Comparatively speaking, although the bat algorithm incorporates much more randomness, it did the

Table 2: Benchmark functions to be fitted.

F1. De Jong's sphere: y = Σ_{i=1}^{n} x_i^2, domain [−100, 100].
F2. Schwefel's problem 2.22: y = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i|, domain [−100, 100].
F3. Schwefel's problem 1.2: y = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)^2, domain [−100, 100].
F4. Schwefel's problem 2.21: y = max_{1≤i≤n} |x_i|, domain [−100, 100].
F5. Chung Reynolds function: y = (Σ_{i=1}^{n} x_i^2)^2, domain [−100, 100].
F6. Schwefel's problem 2.20: y = Σ_{i=1}^{n} |x_i|, domain [−100, 100].
F7. Csendes function: y = Σ_{i=1}^{n} x_i^6 (2 + sin(1/x_i)), domain [−1, 1].
F8. Exponential function: y = −exp(−0.5 Σ_{i=1}^{n} x_i^2), domain [−1, 1].
F9. Griewank's function: y = Σ_{i=1}^{n} x_i^2/4000 − Π_{i=1}^{n} cos(x_i/√i) + 1, domain [−100, 100].
F10. Salomon function: y = 1 − cos(2π√(Σ_{i=1}^{n} x_i^2)) + 0.1√(Σ_{i=1}^{n} x_i^2), domain [−100, 100].
F11. Zakharov function: y = Σ_{i=1}^{n} x_i^2 + (Σ_{i=1}^{n} 0.5 i x_i)^2 + (Σ_{i=1}^{n} 0.5 i x_i)^4, domain [−5, 10].

Table 3: The best and worst simulation results and their corresponding algorithms (dim = 2).

Best fitness: F1, 1.4238e−70 (VW-GWO); F2, 3.2617e−36 (VW-GWO); F3, 3.6792e−68 (VW-GWO); F4, 3.3655e−66 (std GWO); F7, 7.8721e−222 (VW-GWO); F8, 0 (VW-GWO, std GWO, PSO, BA); F9, 0 (VW-GWO, std GWO); F11, 2.6230e−69 (VW-GWO).
Worst fitness: F1, 1.0213e−07 (BA); F2, 4.1489e−04 (BA); F3, 5.9510e−08 (BA); F4, 2.4192e−06 (PSO); F7, 1.0627e−24 (BA); F8, 5.7010e−13 (BA); F9, 1.0850e−01 (ALO); F11, 9.9157e−09 (BA).


worst job. The PSO and ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better at optimizing the benchmark functions than the std GWO algorithm, as well as the ALO and PSO algorithms and the BA, which can also be seen from the Wilcoxon rank sum test [47] results listed in Table 5.

In Table 5, the p values of the Wilcoxon rank sum test are reported; they show that the proposed VW-GWO algorithm has superiority on most of the benchmark functions except F5.
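For reference, a two-sided rank sum p value can be reproduced with the usual normal approximation. The helper below is our own dependency-free sketch (scipy.stats.ranksums implements the same approximation); the tie correction of the variance is omitted for brevity.

```python
import math

def rank_sum_p(sample_a, sample_b):
    """Two-sided Wilcoxon rank sum test p value via the normal approximation."""
    n1, n2 = len(sample_a), len(sample_b)
    pooled = sorted(list(sample_a) + list(sample_b))
    # Average rank for each distinct value (ties share their mean rank).
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0  # mean of ranks i+1 .. j
        i = j
    w = sum(ranks[v] for v in sample_a)       # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2.0
    var = n1 * n2 * (n1 + n2 + 1) / 12.0      # no tie correction, for brevity
    z = (w - mean) / math.sqrt(var)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
```

Two clearly separated samples give a p value far below the common 0.05 threshold, while identical samples give p = 1.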

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the std GWO algorithm, the proposed VW-GWO algorithm does not introduce additional uncontrolled parameters; furthermore, it improves the feasibility of the std GWO algorithm by introducing an admissible maximum iteration number. On the contrary, there is a large amount of randomness in the compared bionic algorithms, such as the ALO and PSO algorithms and the BA. Therefore, the proposed algorithm is expected to be favored by engineers who need the fastest convergence and the most precise results under the most control. Thus, there is a need to verify that the proposed algorithm converges fast, beyond the brief acquaintance given by Figure 3.

Generally speaking, optimization algorithms are usually used to find the optima under constrained conditions. The optimization procedure must end in reality, and it is expected to be as fast as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, but the algorithm is still expected to end quickly under the given conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error. The absolute values of the MAE are constrained to be less than 1.0 × 10^−3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out, and for simplicity, not all classical benchmark functions are involved. The final statistical results are listed in Tables 6–8. Note that the complexity of the ALO algorithm is very large, and it is time-exhausting on the current simulation hardware described in the Appendix, so it is not included in this experiment.
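The MLIT bookkeeping of this experiment can be sketched as follows; this is our own illustration, where error_traces is a hypothetical list of per-iteration absolute-error traces, one per MC run:

```python
def mean_least_iteration_times(error_traces, mae=1.0e-3, m=100000):
    """For each MC run, find the first iteration whose absolute error drops
    below the MAE; runs that never do so within m iterations are discarded."""
    least_iters = []
    for trace in error_traces:
        hit = next((it for it, err in enumerate(trace[:m], start=1)
                    if abs(err) < mae), None)
        if hit is not None:
            least_iters.append(hit)
    mean = sum(least_iters) / len(least_iters) if least_iters else float("nan")
    return mean, len(least_iters)  # MLIT and the number of runs kept
```

The second return value corresponds to the last column of Tables 6–8: the number of MC simulations remaining after discarding runs that hit the iteration cap.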

Table 6 lists the MLIT data when the VW-GWO and std GWO algorithms, the PSO algorithm, and the BA are applied to the unimodal benchmark function F1. The best, worst, and standard deviation MLIT values are listed. The mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the number of remaining MC simulations after discarding all runs in which the searching process reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared to the other algorithms involved. The data in Tables 7 and 8 are obtained under the same conditions; the only difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetrical domain, while Table 8 lists the data obtained for a multimodal benchmark function with an unsymmetrical domain. The same conclusion can be drawn.

Note that in this experiment, the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that when the dimensions of the benchmark functions are raised, the MLIT values increase dramatically. This phenomenon raises the question of whether the algorithm still performs best and is capable of solving high-dimensional problems.

4.4. High-Dimensional Availability Test. Tables 6–8 show that the larger the dimensions are, the more MLIT is needed to meet the experiment constraints. However, as described in the first section, optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions. Thus, high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merit of solving large-scale problems. An experiment with dim = 200 is carried out to find the capability of the algorithms in solving high-dimensional problems. For simplicity, three classical benchmark functions, F4 (Schwefel's problem 2.21), F8 (the exponential function), and F11 (the Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments are evaluated and counted, and

Figure 3: F3 convergence vs. iterations (dim = 2): absolute errors for VW-GWO, std GWO, ALO, PSO, and BA.


Table 4: Statistical analysis of the absolute errors of the selected functions (dim = 2); entries are mean / standard deviation.

F1: VW-GWO 7.2039e−66 / 3.5263e−65; std GWO 6.59E−28 / 6.34E−5 [27]; ALO 2.59E−10 / 1.65E−10 [2]; PSO 1.36E−4 / 2.02E−4 [27]; BA 0.773622 / 0.528134 [2].
F2: VW-GWO 1.3252e−34 / 3.5002e−34; std GWO 7.18E−17 / 0.02901 [27]; ALO 1.84241E−6 / 6.58E−7 [2]; PSO 0.042144 / 0.04542 [27]; BA 0.334583 / 3.186022 [2].
F3: VW-GWO 3.7918e−60 / 1.1757e−59; std GWO 3.29E−6 / 79.1496 [27]; ALO 6.0685E−10 / 6.34E−10 [2]; PSO 70.12562 / 22.1192 [27]; BA 0.115303 / 0.766036 [2].
F4: VW-GWO 2.2262e−46 / 2.8758e−46; std GWO 5.61E−7 / 1.31509 [27]; ALO 1.36061E−8 / 1.81E−9 [2]; PSO 0.31704 / 7.3549 [27]; BA 0.192185 / 0.890266 [2].
F5: VW-GWO 3.6015e−131 / 9.0004e−131; std GWO 7.8319e−97 / 2.4767e−96; ALO 2.1459e−20 / 2.8034e−20; PSO 8.4327e−20 / 1.7396e−19; BA 1.7314e−17 / 4.9414e−17.
F9: VW-GWO 0.0047 / 0.0040; std GWO 0.00449 / 0.00666 [27]; ALO 0.0301 / 0.0329; PSO 0.00922 / 0.00772 [27]; BA 0.0436 / 0.0294.
F10: VW-GWO 0.0200 / 0.0421; std GWO 0.0499 / 0.0526; ALO 0.0186045 / 0.009545 [2]; PSO 0.273674 / 0.204348 [2]; BA 1.451575 / 0.570309 [2].
F11: VW-GWO 1.2999e−60 / 4.1057e−60; std GWO 6.8181e−35 / 1.5724e−34; ALO 1.1562e−13 / 1.2486e−13; PSO 2.3956e−12 / 3.6568e−12; BA 5.0662e−09 / 4.9926e−09.


each time, the search procedure is also iterated a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving the large-scale problems.

To test its capability even further, we also carry out an experiment verifying the capability of solving some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimensions from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank sum test for VW-GWO over the benchmark functions (dim = 2).

Std GWO: F1 0.000246, F2 0.00033, F3 0.000183, F4 0.00044, F5 0.000183, F6 0, F7 0.000183, F8 —, F9 0.466753, F10 0.161972, F11 0.000183.
PSO: F1 0.000183, F2 0.000183, F3 0.000183, F4 0.000183, F5 0.472676, F6 0, F7 0.000183, F8 0.167489, F9 0.004435, F10 0.025748, F11 0.000183.
ALO: F1 0.000183, F2 0.000183, F3 0.000183, F4 0.000183, F5 0.472676, F6 0, F7 0.000183, F8 0.36812, F9 0.790566, F10 0.025748, F11 0.000183.
BA: F1 0.000183, F2 0.000183, F3 0.000183, F4 0.000183, F5 0.000183, F6 0, F7 0.000183, F8 0.000747, F9 0.004435, F10 0.01133, F11 0.000183.

Table 6: MLITs and statistical results for F1 (best / worst / mean / t-test (α = 0.05) / std deviation / number).

dim = 2: VW-GWO 6 / 12 / 9.90 / 1.7180e−193 / 1.0493 / 100; std GWO 7 / 13 / 10.38 / 1.8380e−22 / 1.2291 / 100; PSO 48 / 1093 / 357.97 / 3.2203e−22 / 205.3043 / 100; BA 29 / 59 / 41.00 / 1.3405e−101 / 5.8517 / 100.
dim = 10: VW-GWO 53 / 66 / 59.97 / 4.1940e−177 / 2.7614 / 100; std GWO 74 / 89 / 80.40 / 1.9792e−80 / 2.7614 / 100; PSO 5713 / 11510 / 9279.22 / 2.9716e−76 / 1300.8485 / 88; BA 6919 / 97794 / 44999.04 / 7.5232e−26 / 25133.3096 / 78.
dim = 30: VW-GWO 55 / 67 / 59.85 / 1.2568e−122 / 2.4345 / 100; std GWO 71 / 86 / 80.07 / 2.6197e−79 / 3.3492 / 100; PSO 5549 / 12262 / 9314.78 / 9.6390e−83 / 1316.3384 / 96; BA 7238 / 92997 / 44189.16 / 5.2685e−26 / 24831.7443 / 79.

Table 7: MLITs and statistical results for F7 (best / worst / mean / t-test (α = 0.05) / std deviation / number).

dim = 2: VW-GWO 1 / 3 / 1.46 / 6.3755e−226 / 0.5397 / 100; std GWO 1 / 2 / 1.41 / 1.0070e−229 / 0.4943 / 100; PSO 2 / 2 / 2.00 / 0 / 0 / 100; BA 1 / 3 / 1.02 / 8.5046e−269 / 0.200 / 100.
dim = 10: VW-GWO 5 / 9 / 7.65 / 5.7134e−199 / 0.9468 / 100; std GWO 5 / 11 / 7.48 / 5.1288e−191 / 1.1413 / 100; PSO 4 / 65 / 24.23 / 1.6196e−85 / 10.9829 / 100; BA 13 / 49 / 25.29 / 5.9676e−109 / 6.2366 / 100.
dim = 30: VW-GWO 13 / 22 / 17.14 / 9.6509e−167 / 1.7980 / 100; std GWO 15 / 30 / 20.80 / 1.3043e−148 / 2.6208 / 100; PSO 54 / 255 / 133.32 / 5.7600e−12 / 42.5972 / 100; BA 40 / 101 / 62.68 / 1.8501e−53 / 11.8286 / 100.

Table 8: MLITs and statistical results for F11 (best / worst / mean / t-test (α = 0.05) / std deviation / number).

dim = 2: VW-GWO 3 / 9 / 6.63 / 5.6526e−188 / 1.2363 / 100; std GWO 4 / 10 / 6.66 / 3.5865e−186 / 1.2888 / 100; PSO 6 / 125 / 46.35 / 1.6006e−37 / 26.0835 / 100; BA 5 / 62 / 27.58 / 1.6166e−83 / 11.0080 / 100.
dim = 10: VW-GWO 10 / 200 / 65.57 / 2.8562e−12 / 43.2281 / 100; std GWO 14 / 246 / 68.68 / 2.6622e−11 / 41.7104 / 100; PSO 15 / 1356 / 231.74 / 1.2116e−6 / 257.1490 / 94; BA 15 / 214 / 113.19 / 5.1511e−2 / 66.9189 / 100.
dim = 30: VW-GWO 49 / 1179 / 312.24 / 1.2262e−18 / 194.7643 / 100; std GWO 65 / 945 / 294.45 / 3.1486e−21 / 160.7119 / 100; PSO 32 / 5005 / 1086.11 / 6.0513e−13 / 980.3386 / 72; BA 66 / 403 / 221.60 / 1.9072e−51 / 40.5854 / 100.


Table 9: Statistical analysis of the absolute errors of the selected functions (dim = 200); entries are mean / standard deviation.

F4: VW-GWO 3.3556e−53 / 8.7424e−53; std GWO 1.6051e−46 / 2.2035e−46; ALO 4.2333e−07 / 2.9234e−07; PSO 3.0178e−07 / 6.5449e−07; BA 1.6401e−07 / 2.1450e−07.
F8: VW-GWO 0.0 / 0.0; std GWO 3.3307e−17 / 7.4934e−17; ALO 1.1102e−17 / 3.5108e−17; PSO 1.4466e−14 / 1.9684e−14; BA — / —.
F11: VW-GWO 0.0115 / 0.0193; std GWO 0.0364 / 0.0640; ALO 8.3831 / 10.3213; PSO 12.6649 / 13.0098; BA 4.7528e+16 / 2.8097e+16.


absolute errors averaged over the MC runs, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are therefore introduced into their searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced so that it declines exponentially from its maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm. Comparisons are made with the original GWO, the ALO and PSO algorithms, and the BA.

All of the selected experiment results show that the proposed VW-GWO algorithm works better under different conditions than the others. Varying the dimensions does not change its first-place ranking among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge. They lead to faster convergence and wide applications, but the algorithm is not found to be the most capable on all of the benchmark functions. Further work is needed to explain the reasons mathematically. Other initializing algorithms might be needed to let the initial swarm individuals spread throughout the domain, and new searching rules for when the individuals are in the basins would be another hot spot of future work.

Appendix

The simulation platform described in Section 3.3 runs on an assembled desktop computer configured as follows: CPU, Xeon E3-1231 v3; GPU, NVIDIA GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software for this paper can be downloaded from http://ddl.escience.cn/f/Erl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authorsrsquo Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea on the GWO algorithm and programmed the work with Matlab. Her major contributions are the programming work and the proposed exponentially declining governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by the research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[3] X.-S. Yang, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.

[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.

[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1-Bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.

[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.

[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.

[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.

[9] J. Zhao and Z.-M. Gao, "The bat algorithm and its parameters," in Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimension (100–1000) for F1, F2, F3, and F11, based on VW-GWO.


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] Y. Chun-man, G. Bao-long, and W. Xian-xiang, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uğuz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyettou, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a robust grey wolf optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R El Sehiemy A Shaheen and A Abou El-Ela ldquoMulti-objective fuzzy-based procedure for enhancing reactivepower managementrdquo IET Generation Transmission amp Dis-tribution vol 7 no 12 pp 1453ndash1460 2013

[45] A H Gandomi X-S Yang A H Alavi and S TalataharildquoBat algorithm for constrained optimization tasksrdquo NeuralComputing and Applications vol 22 no 6 pp 1239ndash12552013

[46] M Jamil and X S Yang ldquoA literature survey of benchmarkfunctions for global optimisation problemsrdquo InternationalJournal of Mathematical Modelling and Numerical Optimi-sation vol 4 no 2 pp 150ndash194 2013

[47] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm in-telligence algorithmsrdquo Swarm and Evolutionary Computationvol 1 no 1 pp 3ndash18 2011


The controlling parameter a changes A and so causes the omega wolves to approach or run away from the dominant wolves, namely the alpha, beta, and delta. Theoretically, if |A| > 1, the grey wolves run away from the dominants; the omega wolves thus leave the prey and explore more of the space, which is called a global search in optimization. If |A| < 1, they approach the dominants; the omega wolves follow the dominants toward the prey, which is called a local search in optimization.

The controlling parameter a is defined to decline linearly from a maximum value of 2 to zero while the iterations are carried on:

a = 2(1 - it/N), (7)

where N is the maximum iteration number, initialized at the beginning by users, and it is the cumulative iteration number. The application procedure can be divided into three parts: (1) the given problem is understood and mathematically described, and some elemental parameters become known; (2) a pack of grey wolves is randomly initialized throughout the space domain; (3) the alpha and the other dominant grey wolves lead the pack to search, pursue, and encircle the prey. When the prey is encircled by the grey wolves and it stops moving, the search finishes and the attack begins. The pseudocode is listed in Table 1.

2.2. Proposed Variable Weights and Their Governing Equations. We can see from the governing equation (5) that the dominants play the same role in the searching process: every grey wolf approaches or runs away from the dominants with an average weight over the alpha, beta, and delta. However, although the alpha is the nearest to the prey at the beginning of the search, it might still be far away from the final result, let alone the beta and delta. Therefore, at the beginning of the searching procedure, only the position of the alpha should be considered in equation (5), or its weight should be much larger than those of the other dominants. Moreover, the averaging weight in equation (5) is also against the social hierarchy hypothesis of the grey wolves. If the social hierarchy is strictly followed in the pack, the alpha is the leader and might always be the nearest one to the prey. The alpha wolf should be the most important, which means that the weight of the alpha's position in equation (5) should always be no less than those of the beta and the delta. Consequently, the weight of the beta's position should always be no less than that of the delta. Based on these considerations, we further hypothesize the following:

(1) The searching and hunting processes are always governed by the alpha; the beta plays a less important role, and the delta a still lesser one. Any other grey wolf transfers its position to the alpha if it reaches the best position. It should be noted that in the real searching and hunting procedure, the best position is the one nearest to the prey, while in optimization for the global optimum of a given problem, the best position is the maximum or minimum of the fitness value under the given restrictions.

(2) During the searching process, a hypothesized prey is always surrounded by the dominants, while in the hunting process a real prey is encircled. The dominant grey wolves are at positions surrounding the prey in order of their social hierarchy. This means that the alpha is the nearest one among the grey wolves, the beta is the nearest one in the pack except for the alpha, and the delta ranks third. The omega wolves are involved in both processes, and they transfer their better positions to the dominants.

With the hypotheses mentioned hereinbefore, the positions should no longer be updated with equal contributions as in equation (5).

When the search begins, the alpha is the nearest and the rest are unimportant, so its position should contribute to the newly searching individuals while all of the others can be ignored. This means that the weight of the alpha should be near 1.0 at the beginning, while the weights of the beta and delta can be near zero at this time. In the final state, the alpha, beta, and delta wolves should encircle the prey, which means they have equal weights, as in equation (5). Along the searching procedure from beginning to end, the beta catches up with the alpha, as it always ranks second, and the delta catches up with the beta due to its third rank. This means that the weights of the beta and delta rise along with the cumulative iteration number, while the weight of the alpha falls accordingly.

The above ideas can be formulated mathematically. First of all, all of the weights should be variable and should sum to 1.0. Equation (5) is then changed as follows:

X(t + 1) = w1·X1 + w2·X2 + w3·X3, with w1 + w2 + w3 = 1, (8)

Table 1: Pseudocode of the GWO algorithm.

Set up optimization:
  Dimension of the given problem
  Limitations of the given problem
  Population size
  Controlling parameter
  Stop criterion (maximum iteration times or admissible errors)
Initialization:
  Positions of all of the grey wolves, including the α, β, and δ wolves
Searching:
  While not the stop criterion:
    Calculate the new fitness function
    Update the positions
    Limit the scope of the positions
    Refresh α, β, and δ
    Update the stop criterion
  End
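The procedure in Table 1 can be sketched in code. The following is a minimal, illustrative Python rendering of the standard GWO loop (function and parameter names are our own assumptions, not the authors' Matlab implementation, and only the omega wolves are updated for brevity):

```python
import random

def gwo(f, dim, lb, ub, n_wolves=30, n_iter=200):
    """Minimal sketch of the standard GWO loop; illustrative only."""
    wolves = [[random.uniform(lb, ub) for _ in range(dim)]
              for _ in range(n_wolves)]
    for t in range(n_iter):
        # rank the pack: alpha, beta, delta are the three best wolves
        wolves.sort(key=f)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2 * (1 - t / n_iter)          # equation (7): linear decline 2 -> 0
        for i in range(3, n_wolves):      # omega wolves follow the dominants
            new_pos = []
            for d in range(dim):
                x_d = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A = 2 * a * r1 - a    # |A| > 1: explore, |A| < 1: exploit
                    C = 2 * r2
                    D = abs(C * leader[d] - wolves[i][d])
                    x_d += leader[d] - A * D
                # equation (5): equal-weight average of the leader-guided moves
                new_pos.append(min(max(x_d / 3.0, lb), ub))
            wolves[i] = new_pos
    return min(wolves, key=f)

sphere = lambda x: sum(v * v for v in x)
random.seed(0)
best = gwo(sphere, dim=2, lb=-100, ub=100)
print(sphere(best))  # close to 0
```

The equal division by 3 in the inner loop is exactly the averaging that Section 2.2 replaces with variable weights.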


Secondly, the weight of the alpha, w1, that of the beta, w2, and that of the delta, w3, should always satisfy w1 ≥ w2 ≥ w3. Mathematically speaking, the weight of the alpha changes from 1.0 to 1/3 along the searching procedure, and at the same time the weights of the beta and delta increase from 0.0 to 1/3. Generally speaking, a cosine function can be introduced to describe w1 when we restrict an angle θ to vary in [0, arccos(1/3)].

Thirdly, the weights should vary with the cumulative iteration number it. We know that w2, w3 → 0 when it = 0 and w1 = w2 = w3 → 1/3 when it → ∞. So we introduce an arc-tangent function of it, which varies from 0.0 to π/2. Conveniently, sin(π/4) = cos(π/4) = √2/2, so another angular parameter φ is introduced as follows:

φ = (1/2)·arctan(it). (9)

Considering that w2 increases from 0.0 to 1/3 along with it, we hypothesize that it contains sin θ and cos φ, with θ → arccos(1/3) when it → ∞; therefore,

θ = (2/π)·arccos(1/3)·arctan(it). (10)

When it → ∞, θ → arccos(1/3) and w2 = 1/3, so w2 can be formulated in detail. Based on these considerations, a new update method of the positions with variable weights is proposed as follows:

w1 = cos θ,
w2 = (1/2)·sin θ·cos φ,
w3 = 1 - w1 - w2. (11)

The curves of the variable weights are drawn in Figure 1. We can then find that the variable weights satisfy the hypothesis: the social hierarchy of the grey wolves functions in their searching behavior.
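Equations (9)–(11) can be checked numerically. The sketch below (an illustrative Python rendering, not the authors' Matlab code) confirms the two limits: at it = 0 only the alpha carries weight, and as it grows all three weights approach 1/3:

```python
import math

def variable_weights(it):
    """Weights of alpha, beta, delta vs. the cumulative iteration `it`,
    following equations (9)-(11); an illustrative sketch."""
    phi = 0.5 * math.atan(it)                                  # equation (9)
    theta = (2 / math.pi) * math.acos(1 / 3) * math.atan(it)   # equation (10)
    w1 = math.cos(theta)                                       # alpha weight
    w2 = 0.5 * math.sin(theta) * math.cos(phi)                 # beta weight
    w3 = 1 - w1 - w2                                           # delta weight
    return w1, w2, w3

print(variable_weights(0))      # (1.0, 0.0, 0.0): only the alpha counts at first
print(variable_weights(10**6))  # all three weights approach 1/3
```

Note that the three weights sum to 1 by construction of w3, matching the constraint in equation (8).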

2.3. Proposed Exponentially Declining Governing Equation of the Controlling Parameter. In equation (7), the controlling parameter declines linearly from two to zero while the iterations run from zero to the maximum N. However, an optimization is usually ended by a maximum admissible error (MAE), as requested in engineering. This also means that the maximum iteration number N is unknown.

Furthermore, the controlling parameter is a restriction parameter for A, which determines whether a grey wolf approaches or runs away from the dominants. In other words, the controlling parameter governs whether the grey wolves search globally or locally in the optimizing process. The global search probability is expected to be larger when the search begins, and consequently the local search probability is expected to be larger when the algorithm approaches the optimum. Therefore, to obtain a better performance of the GWO algorithm, the controlling parameter is expected to decrease quickly when the optimization starts so that the search converges fast, while some grey wolves are still expected to keep searching globally to avoid being trapped in local optima. Considering these reasons, an exponentially declining controlling parameter [44] is introduced as described below:

a = a_m·e^(-it/M), (12)

where a_m is the maximum value and M is an admissible maximum iteration number. The parameter M restricts the algorithm to avoid long running times and nonconvergence. It is expected to be larger than 10^4 or 10^5 on the computing hardware used in most laboratories nowadays.
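The exponential decline of equation (12) can be sketched alongside the linear rule of equation (7); this is an illustrative comparison with assumed parameter values, not the authors' code:

```python
import math

def a_linear(it, N, a_max=2.0):
    """Equation (7): a declines linearly and needs the total iteration count N."""
    return a_max * (1 - it / N)

def a_exponential(it, M, a_max=2.0):
    """Equation (12): a declines exponentially; only the admissible maximum
    iteration number M is needed, so no fixed N has to be known in advance."""
    return a_max * math.exp(-it / M)

# a drops quickly at first but never reaches zero, so a few wolves keep
# |A| large enough to continue a global search late in the run.
for it in (0, 100, 1000, 10000):
    print(it, a_exponential(it, M=10**4))
```

At it = M the exponential rule still leaves a = a_m/e ≈ 0.74 (for a_m = 2), whereas the linear rule would already be at zero for N = M.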

3. Empirical Studies and the Experiment Prerequisites

The goal of the experiments is to verify the advantages of the improved GWO algorithm with variable weights (VW-GWO) in comparison with the standard GWO algorithm and other metaheuristic algorithms. Classically, optimization algorithms are applied to optimize benchmark functions, which are used to describe the real problems humans meet.

3.1. Empirical Study of the GWO Algorithm. Although there are fewer parameters in the GWO algorithm than in other algorithms such as the ALO, PSO, and bat algorithm (BA) [45], suitable values of the parameters remain important for the algorithm to be efficient and economic. An empirical study has been carried out, and results show that the population size is expected to be 20–50, balancing the computing complexity and the convergence rate. In an empirical study on the maximum value a_m, the sphere function (F1) and Schwefel's problems 2.22 (F2) and 1.2 (F3) are optimized to find the relationship between a_m and the mean least iteration times (MLIT) with a given error tolerance of 10^-25, as shown in Figure 2.

Figure 1: The variable weights w1, w2, and w3 vs. iterations.


We can learn the following from Figure 2: (1) the maximum value a_m of the controlling parameter a influences the MLIT under a given MAE; when a_m is smaller than 1.0, the smaller a_m is, the more MLIT is needed, and conversely, when a_m is larger than 2.5, the larger a_m is, the more MLIT is needed; (2) a_m should therefore lie in [1.0, 2.5], and it is found to be best at 1.6 or 1.7.

3.2. Benchmark Functions. Benchmark functions are standard functions derived from research on nature. They are usually diverse and unbiased, and difficult to solve with analytical expressions. Benchmark functions have become an essential way to test the reliability, efficiency, and validity of optimization algorithms. They vary in the number of ambiguous peaks in the function landscape, the shape of the basins or valleys, separability, and dimensionality. Mathematically speaking, the benchmark functions can be classified with the following five attributes [46]:

(a) Continuous or discontinuous: most of the functions are continuous, but some of them are not.

(b) Differentiable or nondifferentiable: some of the functions can be differentiated, but some cannot.

(c) Separable or nonseparable: some of the functions can be separated, but some cannot.

(d) Scalable or nonscalable: some of the functions can be expanded to any dimensionality, but some are fixed to two or three dimensions.

(e) Unimodal or multimodal: some of the functions have only one peak in their landscape, but some have many peaks. The former attribute is called unimodal and the latter multimodal.

There are 175 benchmark functions summarized in the literature [46]. In this paper, we choose 11 benchmark functions ranging from simple to complex and covering all five of the above characteristics. They are fit to test the capability of the involved algorithms, as listed in Table 2, and they are all scalable.

The functions are all n-dimensional, and their input vectors x = (x1, x2, ..., xn) are limited by the domain, with values at most ub and at least lb. For simplicity, the theoretical optima are all zero.
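A few of the Table 2 functions, written out as plain Python for concreteness (an illustrative sketch; the study itself was programmed in Matlab):

```python
import math

# Three of the Table 2 benchmarks; each has its minimum value 0 at the
# origin, matching the zero theoretical optima mentioned above.
def sphere(x):                      # F1, De Jong's sphere
    return sum(v * v for v in x)

def schwefel_222(x):                # F2, Schwefel's problem 2.22
    s = sum(abs(v) for v in x)
    p = math.prod(abs(v) for v in x)
    return s + p

def griewank(x):                    # F9, Griewank's function
    s = sum(v * v for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1

for f in (sphere, schwefel_222, griewank):
    print(f.__name__, f([0.0, 0.0]))  # each evaluates to 0.0 at the optimum
```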

4. Results and Discussion

Eleven benchmark functions are involved in this study. Comparisons are made with the standard grey wolf optimization algorithm (std GWO) and three other bionic methods: the ant lion optimization algorithm (ALO), the PSO algorithm, and the BA.

4.1. General Review of the Algorithms. Randomness is involved in all of the algorithms studied in this paper, for example, in the random positions, random velocities, and random controlling parameters. The randomness causes the fitness values obtained during the optimization procedure to fluctuate. So, when an individual of the swarm is initialized

Figure 2: Relationship between the MLIT (with error tolerance 10^-25) and the maximum value a_m, for F1, F2, and F3.


or randomly jumps to a position quite near the optimum, the best fitness value would be met. Table 3 lists the best and worst fitness results of some chosen benchmark functions and their corresponding algorithms. In this experiment, 100 Monte Carlo (MC) simulations are carried out for every benchmark function. The results show that the randomness indeed adds some random variation, but most of the time the final results depend more on the algorithms.

At first glance of Table 3, the GWO algorithms always work best: either the VW-GWO or the std GWO algorithm optimizes the benchmark functions closest to their optima with the smallest absolute errors, and the proposed VW-GWO algorithm is almost always the best one. The other compared algorithms, such as the PSO and ALO algorithms and the BA, lead to the worst results most of the time. These results indicate that the GWO algorithms are more capable and that the proposed VW-GWO algorithm indeed improves on the std GWO algorithm. A figure of the absolute errors averaged over the 100 MC runs versus iterations leads to the same conclusion, as shown in Figure 3.

The convergence curve over the iterations for the F3 benchmark function is demonstrated in Figure 3. It shows that the proposed VW-GWO algorithm results in faster convergence, low residual errors, and stable convergence.

4.2. Comparison, Statistical Analysis, and Test. A general acquaintance with the metaheuristic algorithms can be got from Table 3 and Figure 3. However, optimization problems often demand statistical analysis and testing. To do this, 100 MC simulations are carried out on the benchmark functions. The benchmark functions are all two dimensional, and they are optimized by the newly proposed VW-GWO and the other four algorithms over 100 runs. Because the benchmark functions all have optima of zero, the simulated fitness results are also their absolute errors. The mean values of the absolute errors and the standard deviations of the final results are listed in Table 4; some of the values are quoted from published work, with the corresponding references listed.

The proposed VW-GWO algorithm and its compared algorithms are almost all capable of searching out the global optima of the benchmark functions. The detailed values in Table 4 show that the standard deviations of the 100 MC simulations are all small. We can further draw the following conclusions:

(1) All of the algorithms involved in this study were able to find the optimum.

(2) All of the benchmark functions tested in this experiment could be optimized, whether unimodal or multimodal, under symmetric or unsymmetric domains.

(3) Comparatively speaking, although the bat algorithm involves much more randomness, it did the

Table 2: Benchmark functions to be fitted.

Label | Function name | Expression | Domain [lb, ub]
F1 | De Jong's sphere | y = Σ_{i=1}^n x_i^2 | [-100, 100]
F2 | Schwefel's problem 2.22 | y = Σ_{i=1}^n |x_i| + Π_{i=1}^n |x_i| | [-100, 100]
F3 | Schwefel's problem 1.2 | y = Σ_{i=1}^n (Σ_{j=1}^i x_j)^2 | [-100, 100]
F4 | Schwefel's problem 2.21 | y = max_{1≤i≤n} |x_i| | [-100, 100]
F5 | Chung Reynolds function | y = (Σ_{i=1}^n x_i^2)^2 | [-100, 100]
F6 | Schwefel's problem 2.20 | y = Σ_{i=1}^n |x_i| | [-100, 100]
F7 | Csendes function | y = Σ_{i=1}^n x_i^6 (2 + sin(1/x_i)) | [-1, 1]
F8 | Exponential function | y = -e^(-0.5 Σ_{i=1}^n x_i^2) | [-1, 1]
F9 | Griewank's function | y = Σ_{i=1}^n x_i^2/4000 - Π_{i=1}^n cos(x_i/√i) + 1 | [-100, 100]
F10 | Salomon function | y = 1 - cos(2π√(Σ_{i=1}^n x_i^2)) + 0.1√(Σ_{i=1}^n x_i^2) | [-100, 100]
F11 | Zakharov function | y = Σ_{i=1}^n x_i^2 + (Σ_{i=1}^n 0.5·i·x_i)^2 + (Σ_{i=1}^n 0.5·i·x_i)^4 | [-5, 10]

Table 3: The best and worst simulation results and their corresponding algorithms (dim = 2).

Best fitness:
F1 | 1.4238e-70 | VW-GWO
F2 | 3.2617e-36 | VW-GWO
F3 | 3.6792e-68 | VW-GWO
F4 | 3.3655e-66 | Std GWO
F7 | 7.8721e-222 | VW-GWO
F8 | 0 | VW-GWO, Std GWO, PSO, BA
F9 | 0 | VW-GWO, Std GWO
F11 | 2.6230e-69 | VW-GWO

Worst fitness:
F1 | 1.0213e-07 | BA
F2 | 4.1489e-04 | BA
F3 | 5.9510e-08 | BA
F4 | 2.4192e-06 | PSO
F7 | 1.0627e-24 | BA
F8 | 5.7010e-13 | BA
F9 | 1.0850e-01 | ALO
F11 | 9.9157e-09 | BA


worst job; the PSO and the ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better in optimizing the benchmark functions than the std GWO algorithm as well as the ALO and PSO algorithms and the BA, which can also be seen from the Wilcoxon rank sum test [47] results listed in Table 5.

In Table 5, the p values of the Wilcoxon rank sum test are reported; they show that the proposed VW-GWO algorithm has superiority over most of the benchmark functions except F5, the Chung Reynolds function.
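The rank sum test used here can be sketched as follows; this is a self-contained normal-approximation version (without tie correction), not the paper's implementation, applied to two made-up error samples:

```python
import math

def rank_sum_p(sample_a, sample_b):
    """Two-sided Wilcoxon rank-sum test via a normal approximation;
    an illustrative sketch without tie handling."""
    combined = sorted((v, 0 if k < len(sample_a) else 1)
                      for k, v in enumerate(sample_a + sample_b))
    n1, n2 = len(sample_a), len(sample_b)
    # rank sum of sample_a (ranks are 1-based)
    w = sum(rank + 1 for rank, (_, lab) in enumerate(combined) if lab == 0)
    mean = n1 * (n1 + n2 + 1) / 2
    var = n1 * n2 * (n1 + n2 + 1) / 12
    z = (w - mean) / math.sqrt(var)
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p value

# Clearly separated error samples give a p value far below 0.05:
a = [1e-6 * i for i in range(1, 21)]   # errors of a "better" algorithm
b = [0.1 + 0.01 * i for i in range(1, 21)]
print(rank_sum_p(a, b))
```

A p value below 0.05, as in most cells of Table 5, rejects the hypothesis that the two algorithms' error samples come from the same distribution.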

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the std GWO algorithm, the proposed VW-GWO algorithm does not introduce additional uncontrollable parameters; furthermore, it improves the feasibility of the std GWO algorithm by introducing an admissible maximum iteration number. On the contrary, there is a large amount of randomness in the compared bionic algorithms, such as the ALO and PSO algorithms and the BA. Therefore, the proposed algorithm is expected to be favored by engineers who need the fastest convergence and the most precise results under the most control. Thus, there is a need to verify that the proposed algorithm converges fast, beyond the brief acquaintance given by Figure 3.

Generally speaking, optimization algorithms are usually used to find the optima under constrained conditions. The optimization procedure must end in reality, and it is expected to end as fast as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, but the algorithm is still expected to end quickly under the current conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error. The absolute values of the MAE are constrained to be less than 1.0 × 10^-3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out, and for simplicity not all of the classical benchmark functions are involved. The final statistical results are listed in Tables 6–8. Note that the complexity of the ALO algorithm is very large, and it is time-exhausting on the current simulation hardware described in the Appendix, so it is not included in this experiment.

Table 6 lists the MLIT data when the VW-GWO, std GWO, PSO algorithm, and BA are applied to the unimodal benchmark function F1. The best, worst, and standard deviation MLIT values are listed. The mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the number of remaining MC simulations, discarding all of the runs in which the searching process reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared to the other algorithms involved. The data in Tables 6–8 are obtained under the same conditions; the only difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetrical domain, while Table 8 lists the data obtained with a multimodal benchmark function with an unsymmetrical domain. The same conclusion can be drawn.

Note that in this experiment the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that if the dimensions of the benchmark functions are raised, the MLIT values increase dramatically. This phenomenon raises the question of whether the algorithm still performs best and is capable of solving high-dimensional problems.
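The MLIT measurement itself can be sketched as a small harness; the toy optimizer below is a stand-in assumption (the real experiments use the algorithms above), and only the converged runs are kept, mirroring the last column of Tables 6–8:

```python
import random
import statistics

def mlit(run_once, tol=1e-3, max_iter=10**5, mc=20):
    """MLIT harness sketch: run_once(tol, max_iter) returns the iteration
    count at which the error first drops below tol, or None on failure."""
    counts = [run_once(tol, max_iter) for _ in range(mc)]
    kept = [c for c in counts if c is not None]  # discard non-converged runs
    return statistics.mean(kept), statistics.stdev(kept), len(kept)

def toy_run(tol, max_iter):
    # stand-in optimizer: the error halves each iteration from a random start
    err = random.uniform(10.0, 100.0)
    for it in range(1, max_iter + 1):
        err *= 0.5
        if err < tol:
            return it
    return None

random.seed(1)
mean, std, kept = mlit(toy_run)
print(mean, std, kept)
```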

4.4. High-Dimensional Availability Test. Tables 6–8 show that the larger the dimensions are, the more MLIT is needed to meet the experiment constraints. However, as described in the first part, the optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions. Thus, the high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merit of solving large-scale problems. An experiment with dim = 200 is carried out to test the capability of the algorithms in solving high-dimensional problems. For simplicity, three classical benchmark functions, F4 (Schwefel's problem 2.21), F8 (exponential function), and F11 (Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments are evaluated and counted, and

Figure 3: F3 convergence (absolute errors) vs. iterations for the VW-GWO, std GWO, ALO, and PSO algorithms and the BA (dim = 2).


Table 4: Statistical analysis of the absolute errors of the selected functions (dim = 2). Each cell gives the mean and standard deviation.

Function | VW-GWO (mean / std) | Std GWO (mean / std) | ALO (mean / std) | PSO (mean / std) | BA (mean / std)
F1 | 7.2039e-66 / 3.5263e-65 | 6.59e-28 / 6.34e-5 [27] | 2.59e-10 / 1.65e-10 [2] | 1.36e-4 / 2.02e-4 [27] | 0.773622 / 0.528134 [2]
F2 | 1.3252e-34 / 3.5002e-34 | 7.18e-17 / 0.02901 [27] | 1.84241e-6 / 6.58e-7 [2] | 0.042144 / 0.04542 [27] | 0.334583 / 3.186022 [2]
F3 | 3.7918e-60 / 1.1757e-59 | 3.29e-6 / 7.91496 [27] | 6.0685e-10 / 6.34e-10 [2] | 7.012562 / 2.21192 [27] | 0.115303 / 0.766036 [2]
F4 | 2.2262e-46 / 2.8758e-46 | 5.61e-7 / 1.31509 [27] | 1.36061e-8 / 1.81e-9 [2] | 0.31704 / 7.3549 [27] | 0.192185 / 0.890266 [2]
F5 | 3.6015e-131 / 9.0004e-131 | 7.8319e-97 / 2.4767e-96 | 2.1459e-20 / 2.8034e-20 | 8.4327e-20 / 1.7396e-19 | 1.7314e-17 / 4.9414e-17
F9 | 0.0047 / 0.0040 | 0.00449 / 0.00666 [27] | 0.0301 / 0.0329 | 0.00922 / 0.00772 [27] | 0.0436 / 0.0294
F10 | 0.0200 / 0.0421 | 0.0499 / 0.0526 | 0.1860449 / 0.009545 [2] | 0.273674 / 0.204348 [2] | 1.451575 / 0.570309 [2]
F11 | 1.2999e-60 / 4.1057e-60 | 6.8181e-35 / 1.5724e-34 | 1.1562e-13 / 1.2486e-13 | 2.3956e-12 / 3.6568e-12 | 5.0662e-09 / 4.9926e-09


each time, the search procedure is also iterated a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.

To test its capability even further, we also carry out an experiment verifying the capability of solving some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimensions from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank sum test for VW-GWO over the benchmark functions (dim = 2).

vs. | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11
Std GWO | 0.000246 | 0.00033 | 0.000183 | 0.00044 | 0.000183 | 0 | 0.000183 | — | 0.466753 | 0.161972 | 0.000183
PSO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.167489 | 0.004435 | 0.025748 | 0.000183
ALO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.36812 | 0.790566 | 0.025748 | 0.000183
BA | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0 | 0.000183 | 0.000747 | 0.004435 | 0.01133 | 0.000183

Table 6: MLITs and statistical results for F1.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 6 | 12 | 9.90 | 1.7180e-193 | 1.0493 | 100
2 | Std GWO | 7 | 13 | 10.38 | 1.8380e-22 | 1.2291 | 100
2 | PSO | 48 | 1093 | 357.97 | 3.2203e-22 | 205.3043 | 100
2 | BA | 29 | 59 | 41.00 | 1.3405e-101 | 5.8517 | 100
10 | VW-GWO | 53 | 66 | 59.97 | 4.1940e-177 | 2.7614 | 100
10 | Std GWO | 74 | 89 | 80.40 | 1.9792e-80 | 2.7614 | 100
10 | PSO | 5713 | 11510 | 9279.22 | 2.9716e-76 | 1300.8485 | 88
10 | BA | 6919 | 97794 | 44999.04 | 7.5232e-26 | 25133.3096 | 78
30 | VW-GWO | 55 | 67 | 59.85 | 1.2568e-122 | 2.4345 | 100
30 | Std GWO | 71 | 86 | 80.07 | 2.6197e-79 | 3.3492 | 100
30 | PSO | 5549 | 12262 | 9314.78 | 9.6390e-83 | 1316.3384 | 96
30 | BA | 7238 | 92997 | 44189.16 | 5.2685e-26 | 24831.7443 | 79

Table 7: MLITs and statistical results for F7.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 1 | 3 | 1.46 | 6.3755e-226 | 0.5397 | 100
2 | Std GWO | 1 | 2 | 1.41 | 1.0070e-229 | 0.4943 | 100
2 | PSO | 2 | 2 | 2.00 | 0 | 0 | 100
2 | BA | 1 | 3 | 1.02 | 8.5046e-269 | 0.200 | 100
10 | VW-GWO | 5 | 9 | 7.65 | 5.7134e-199 | 0.9468 | 100
10 | Std GWO | 5 | 11 | 7.48 | 5.1288e-191 | 1.1413 | 100
10 | PSO | 4 | 65 | 24.23 | 1.6196e-85 | 10.9829 | 100
10 | BA | 13 | 49 | 25.29 | 5.9676e-109 | 6.2366 | 100
30 | VW-GWO | 13 | 22 | 17.14 | 9.6509e-167 | 1.7980 | 100
30 | Std GWO | 15 | 30 | 20.80 | 1.3043e-148 | 2.6208 | 100
30 | PSO | 54 | 255 | 133.32 | 5.7600e-12 | 42.5972 | 100
30 | BA | 40 | 101 | 62.68 | 1.8501e-53 | 11.8286 | 100

Table 8: MLITs and statistical results for F11.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 3 | 9 | 6.63 | 5.6526e-188 | 1.2363 | 100
2 | Std GWO | 4 | 10 | 6.66 | 3.5865e-186 | 1.2888 | 100
2 | PSO | 6 | 125 | 46.35 | 1.6006e-37 | 26.0835 | 100
2 | BA | 5 | 62 | 27.58 | 1.6166e-83 | 11.0080 | 100
10 | VW-GWO | 10 | 200 | 65.57 | 2.8562e-12 | 43.2281 | 100
10 | Std GWO | 14 | 246 | 68.68 | 2.6622e-11 | 41.7104 | 100
10 | PSO | 15 | 1356 | 231.74 | 1.2116e-6 | 257.1490 | 94
10 | BA | 15 | 214 | 113.19 | 5.1511e-2 | 66.9189 | 100
30 | VW-GWO | 49 | 1179 | 312.24 | 1.2262e-18 | 194.7643 | 100
30 | Std GWO | 65 | 945 | 294.45 | 3.1486e-21 | 160.7119 | 100
30 | PSO | 32 | 5005 | 1086.11 | 6.0513e-13 | 980.3386 | 72
30 | BA | 66 | 403 | 221.60 | 1.9072e-51 | 40.5854 | 100


Table 9: Statistical analysis of the absolute errors of the selected functions (dim = 200). Each cell gives the mean and standard deviation.

Function | VW-GWO (mean / std) | Std GWO (mean / std) | ALO (mean / std) | PSO (mean / std) | BA (mean / std)
F4 | 3.3556e-53 / 8.7424e-53 | 1.6051e-46 / 2.2035e-46 | 4.2333e-07 / 2.9234e-07 | 3.0178e-07 / 6.5449e-07 | 1.6401e-07 / 2.1450e-07
F8 | 0.0 / 0.0 | 3.3307e-17 / 7.4934e-17 | 1.1102e-17 / 3.5108e-17 | 1.4466e-14 / 1.9684e-14 | —
F11 | 0.0115 / 0.0193 | 0.0364 / 0.0640 | 8.3831 / 10.3213 | 12.6649 / 13.0098 | 4.7528e+16 / 2.8097e+16


absolute errors averaged over the MC runs, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO is capable of solving high-dimensional problems.

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are then introduced into the searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced so that it declines exponentially from the maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm. Comparisons are made with the original GWO, the ALO and PSO algorithms, and the BA.

All of the selected experimental results show that the proposed VW-GWO algorithm works better under different conditions than the others. Varying the dimensions does not change its first position among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge, leading to faster convergence and wide applicability, and the algorithm is not found to be the most capable on all of the benchmark functions. Further work is needed to explain the reasons mathematically. Other initializing algorithms might be needed to let the initial swarm individuals spread throughout the domain, and new searching rules for individuals located in basins would be another hot spot of future work.

Appendix

The simulation platform, as described in Section 3.3, runs on an assembled desktop computer configured as follows: CPU, Xeon E3-1231 v3; GPU, NVidia GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software of this paper can be downloaded from httpddlesciencecnfErl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors' Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea on the GWO algorithm and programmed the work with Matlab. Her major contributions are the programmed work and the proposed declined-exponentially governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by the research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.
[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
[3] X.-S. Yang, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.
[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.
[5] X.-S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1: bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.
[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.
[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.
[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.
[9] J. Zhao and Z.-M. Gao, "The bat algorithm and its parameters," in Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimensions based on VW-GWO (curves for F1, F2, F3, and F11; dimensions from 100 to 1000).


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.
[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.
[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.
[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.
[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.
[15] C.-M. Yan, B.-L. Guo, and X.-X. Wu, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.
[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), pp. 345–357, Washington, DC, USA, July 1999.
[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.
[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.
[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.
[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.
[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.
[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.
[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.
[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.
[26] H. Haklı and H. Uguz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.
[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.
[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.
[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.
[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.
[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.
[33] S. A. Medjahed, T. A. Saadi, A. Benyetto, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.
[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.
[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.
[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.
[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.
[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.
[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.
[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.
[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.
[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.
[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a robust grey wolf optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.
[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.
[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.
[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.
[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.



Secondly, the weight of the alpha, w1, that of the beta, w2, and that of the delta, w3, should always satisfy w1 ≥ w2 ≥ w3. Mathematically speaking, the weight of the alpha would be changed from 1.0 to 1/3 along with the searching procedure, and at the same time, the weights of the beta and delta would be increased from 0.0 to 1/3. Generally speaking, a cosine function could be introduced to describe w1 when we restrict an angle θ to vary in [0, arccos(1/3)].

Thirdly, the weights should vary with the cumulative iteration number, "it". We know that w2, w3 → 0 when it = 0 and that w1, w2, w3 → 1/3 when it → ∞. So we introduce an arc-tangent function of it, which varies from 0.0 to π/2. And since sin(π/4) = cos(π/4) = √2/2, another angular parameter φ is introduced as follows:

φ = (1/2)·arctan(it).  (9)

Considering that w2 would be increased from 0.0 to 1/3 along with it, we hypothesize that it contains sin θ and cos φ, with θ → arccos(1/3) when it → ∞; therefore,

θ = (2/π)·arccos(1/3)·arctan(it).  (10)

When it → ∞, θ → arccos(1/3) and w2 → 1/3; we can then formulate w2 in detail. Based on these considerations, a new update method of the positions with variable weights is proposed as follows:

w1 = cos θ,

w2 = (1/2)·sin θ·cos φ,

w3 = 1 − w1 − w2.  (11)

The curves of the variable weights are drawn in Figure 1. We can then find that the variable weights satisfy the hypothesis: the social hierarchy of the grey wolves functions in their searching behavior.
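To make the weight schedule concrete, equations (9)-(11) can be evaluated directly. A minimal sketch (the function name is our own):

```python
import math

def variable_weights(it):
    """Variable weights of the alpha, beta, and delta wolves as functions
    of the cumulative iteration number `it`, per equations (9)-(11)."""
    phi = 0.5 * math.atan(it)                                        # eq. (9)
    theta = (2.0 / math.pi) * math.acos(1.0 / 3.0) * math.atan(it)   # eq. (10)
    w1 = math.cos(theta)                        # alpha: decays from 1.0 to 1/3
    w2 = 0.5 * math.sin(theta) * math.cos(phi)  # beta: grows from 0.0 to 1/3
    w3 = 1.0 - w1 - w2                          # delta: the remainder
    return w1, w2, w3

# At it = 0 the alpha dominates completely; as it grows, the three
# weights approach the plain average 1/3 used by the standard GWO.
print(variable_weights(0))  # (1.0, 0.0, 0.0)
```

Note that by construction the three weights always sum to one, which is what makes the weighted recombination of the three leader-guided positions a proper convex-like combination in the limit.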

2.3. Proposed Declined-Exponentially Governing Equation of the Controlling Parameter. In equation (7), the controlling parameter declines linearly from two to zero while the iterations run from zero to the maximum N. However, an optimization is usually ended with a maximum admissible error (MAE), which is what is requested in engineering. This also means that the maximum iteration number N is unknown.

Furthermore, the controlling parameter is a restriction parameter for A, which is responsible for the grey wolf approaching or running away from the dominants. In other words, the controlling parameter governs whether the grey wolves search globally or locally in the optimizing process. The global search probability is expected to be larger when the search begins, and consequently, the local search probability is expected to be larger when the algorithm is approaching the optimum. Therefore, to obtain a better performance of the GWO algorithm, the controlling parameter is expected to decrease quickly when the optimization starts, so that the algorithm converges to the optimum very fast. On the contrary, some grey wolves are expected to keep searching globally to avoid being trapped in local optima. Considering these reasons, a controlling parameter declined exponentially [44] is introduced as described below:

a = a_m · e^(−it/M),  (12)

where a_m is the maximum value and M is an admissible maximum iteration number. The parameter M restricts the algorithm to avoid long running times and nonconvergence. It is expected to be larger than 10^4 or 10^5 based on the computing hardware used in most laboratories nowadays.
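As an illustrative sketch (the function names and the side-by-side comparison with the linear rule of the standard GWO are ours; a_m = 1.7 and M = 10^5 follow the recommendations given later in Section 3.1), the two decay rules can be written as:

```python
import math

def a_linear(it, n):
    """Standard GWO: the controlling parameter a declines linearly
    from 2 to 0 over the known maximum iteration number n."""
    return 2.0 * (1.0 - it / n)

def a_exponential(it, m=1e5, a_m=1.7):
    """Proposed rule (eq. (12)): a declines exponentially from a_m;
    M is only an admissible upper bound, not the actual run length."""
    return a_m * math.exp(-it / m)

# Early iterations keep a (and hence |A|) large for global search;
# a never reaches zero, so some global search capability remains.
for it in (0, 1000, 10000, 100000):
    print(it, round(a_linear(it, 100000), 4), round(a_exponential(it), 4))
```

The design point is that the exponential rule needs no knowledge of the total iteration count: the run can stop at the MAE whenever it is reached, while a strictly positive a keeps a residual chance of escaping local optima.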

3. Empirical Studies and the Experiments Prerequisite

The goal of the experiments in this paper is to verify the advantages of the improved GWO algorithm with variable weights (VW-GWO) through comparisons with the standard GWO algorithm and other metaheuristic algorithms. Classically, optimization algorithms are applied to optimize benchmark functions, which were designed to describe the real problems humans meet.

3.1. Empirical Study of the GWO Algorithm. Although there are fewer parameters in the GWO algorithm than in other algorithms such as the ALO, the PSO algorithm, and the bat algorithm (BA) [45], suitable values of the parameters remain important for the algorithm to be efficient and economic. An empirical study has been carried out, and results show that the population size is expected to be 20–50, balancing the computing complexity and the convergence rate. In an empirical study on the maximum value a_m, the sphere function (F1) and Schwefel's problems 2.22 (F2) and 1.2 (F3) are optimized to find the relationship between a_m and the mean least iteration times (MLIT) under a given error tolerance of 10^−25, as shown in Figure 2.

Figure 1: The variable weights vs. iterations (w1, w2, and w3 over iterations 5 to 50).


We can learn the following from Figure 2: (1) the maximum value a_m of the controlling parameter a influences the MLIT under a given MAE; when a_m is smaller than 1.0, the smaller a_m is, the more MLIT are needed; on the contrary, when a_m is larger than 2.5, the larger a_m is, the more MLIT are needed; (2) a_m should therefore be kept in [1.0, 2.5], and it is found to be best at 1.6 or 1.7.

3.2. Benchmark Functions. Benchmark functions are standard functions derived from research on nature. They are usually diverse, unbiased, and difficult to solve with analytical expressions. Benchmark functions have been an essential way to test the reliability, efficiency, and validity of optimization algorithms. They vary in the number of ambiguous peaks in the function landscape, the shape of the basins or valleys, the separability, and the dimensionality. Mathematically speaking, the benchmark functions can be classified with the following five attributes [46]:

(a) Continuous or discontinuous: most of the functions are continuous, but some of them are not.

(b) Differentiable or nondifferentiable: some of the functions can be differentiated, but some of them cannot.

(c) Separable or nonseparable: some of the functions can be separated, but some of them cannot.

(d) Scalable or nonscalable: some of the functions can be expanded to any dimensionality, but some of them are fixed to two or three dimensions.

(e) Unimodal or multimodal: some of the functions have only one peak in their landscape, but some of them have many peaks; the former attribute is called unimodal and the latter multimodal.

There are 175 benchmark functions summarized in the literature [46]. In this paper, we choose 11 benchmark functions, from simplicity to complexity, covering all of the above five characteristics. They are fitted to test the capability of the involved algorithms, as listed in Table 2, and they are all scalable.

The functions are all n-dimensional, and their input vectors x = (x1, x2, …, xn) are limited by the domain. Values in the domain are at most ub and at least lb. The optimal result values are all zeros theoretically, for simplicity.
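To make the procedure concrete, here is a minimal, self-contained Python sketch of the VW-GWO loop described above, applied to the sphere function F1. The population size, iteration budget, random seed, and boundary clipping are our illustrative assumptions; the weight and decay formulas follow equations (9)-(12), with a_m = 1.7 as recommended in Section 3.1:

```python
import math
import random

def vw_gwo(f, dim, lb, ub, n_wolves=30, max_iter=100, a_m=1.7, m=1e5, seed=1):
    """Sketch of VW-GWO: standard GWO position updates, but the three
    leader-guided candidate positions are recombined with the variable
    weights w1, w2, w3 instead of a plain average."""
    rng = random.Random(seed)
    wolves = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_wolves)]
    best_x = min(wolves, key=f)[:]
    best_val = f(best_x)

    for it in range(1, max_iter + 1):
        ranked = sorted(wolves, key=f)
        # copy the leaders so updating the pack does not mutate them mid-pass
        alpha, beta, delta = ranked[0][:], ranked[1][:], ranked[2][:]
        a = a_m * math.exp(-it / m)                                      # eq. (12)
        phi = 0.5 * math.atan(it)                                        # eq. (9)
        theta = (2.0 / math.pi) * math.acos(1.0 / 3.0) * math.atan(it)   # eq. (10)
        w1 = math.cos(theta)                                             # eq. (11)
        w2 = 0.5 * math.sin(theta) * math.cos(phi)
        w3 = 1.0 - w1 - w2
        for wolf in wolves:
            for j in range(dim):
                cand = []
                for leader in (alpha, beta, delta):
                    A = 2.0 * a * rng.random() - a      # A in [-a, a]
                    C = 2.0 * rng.random()              # C in [0, 2]
                    D = abs(C * leader[j] - wolf[j])
                    cand.append(leader[j] - A * D)
                # hierarchy-weighted recombination, clipped to the domain
                x = w1 * cand[0] + w2 * cand[1] + w3 * cand[2]
                wolf[j] = min(ub, max(lb, x))
        for wolf in wolves:                             # track the best-so-far
            v = f(wolf)
            if v < best_val:
                best_val, best_x = v, wolf[:]
    return best_x, best_val

sphere = lambda x: sum(v * v for v in x)                # F1, De Jong's sphere
x, err = vw_gwo(sphere, dim=2, lb=-100.0, ub=100.0)
print(err)
```

The only structural change relative to the standard GWO is the weighted recombination line; setting w1 = w2 = w3 = 1/3 recovers the original algorithm.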

4. Results and Discussion

There are 11 benchmark functions involved in this study. Comparisons are made with the standard grey wolf optimization algorithm (std GWO) and three other bionic methods: the ant lion optimization algorithm (ALO), the PSO algorithm, and the BA.

4.1. General Reviews of the Algorithms. Randomness is involved in all the algorithms studied in this paper, for example, in the random positions, random velocities, and random controlling parameters. The randomness causes the fitness values obtained during the optimization procedure to fluctuate. So, when an individual of the swarm is initialized

Figure 2: Relationship between the MLIT and the maximum value a_m (MLIT with error tolerance 10^−25 for F1, F2, and F3; a_m from 0.5 to 5).


or randomly jumps to a position quite near the optimum, the best fitness value would be met. Table 3 lists the best and worst fitness results of some chosen benchmark functions and their corresponding algorithms. In this experiment, 100 Monte Carlo (MC) simulations are carried out for every benchmark function. The results show that the randomness indeed introduces some random behavior, but most of the time, the final results depend more on the algorithms.

At first glance of Table 3, the GWO algorithms always work the best: either the VW-GWO or the std GWO algorithm optimizes the benchmark functions to their optima with very small absolute errors, and the proposed VW-GWO algorithm is almost always the best one. The other compared algorithms, the PSO and ALO algorithms and the BA, lead to the worst results most of the time. This means that the GWO algorithms are more capable and that the proposed VW-GWO algorithm indeed improves the capability of the std GWO algorithm. A figure of the absolute errors averaged over the 100 MC runs versus iterations leads to the same conclusion, as shown in Figure 3.

The convergence rate curve of the F3 benchmark function during the iterations is demonstrated in Figure 3. It shows that the proposed VW-GWO algorithm results in faster convergence, low residual errors, and stable convergence.

4.2. Comparison, Statistical Analysis, and Test. A general acquaintance with the metaheuristic algorithms might be obtained from Table 3 and Figure 3. However, optimization problems often demand statistical analysis and tests. To do this, 100 MC simulations are carried out on the benchmark functions. The benchmark functions are all two-dimensional, and they are optimized by the newly proposed VW-GWO and the other four algorithms over 100 times. Because the benchmark functions all have optima concentrated at zero, the simulated fitness results are also their absolute errors. The mean values of the absolute errors and the standard deviations of the final results are listed in Table 4; some of the values are quoted from published work, and the references are listed correspondingly.

The proposed VW-GWO algorithm and the compared algorithms are almost all capable of searching the global optima of the benchmark functions. The detailed values in Table 4 show that the standard deviations of the 100 MC simulations are all small. We can further draw the following conclusions:

(1) All of the algorithms involved in this study were able to find the optimum.

(2) All of the benchmark functions tested in this experiment could be optimized, whether unimodal or multimodal, and whether defined on a symmetric or an unsymmetric domain.

(3) Comparatively speaking, although the bat algorithm involves much more randomness, it did the

Table 2: Benchmark functions to be fitted.

Label | Function name | Expression | Domain [lb, ub]
F1 | De Jong's sphere | y = Σ_{i=1}^n x_i^2 | [−100, 100]
F2 | Schwefel's problem 2.22 | y = Σ_{i=1}^n |x_i| + Π_{i=1}^n |x_i| | [−100, 100]
F3 | Schwefel's problem 1.2 | y = Σ_{i=1}^n (Σ_{j=1}^i x_j)^2 | [−100, 100]
F4 | Schwefel's problem 2.21 | y = max_{1≤i≤n} |x_i| | [−100, 100]
F5 | Chung Reynolds function | y = (Σ_{i=1}^n x_i^2)^2 | [−100, 100]
F6 | Schwefel's problem 2.20 | y = Σ_{i=1}^n |x_i| | [−100, 100]
F7 | Csendes function | y = Σ_{i=1}^n x_i^6 (2 + sin(1/x_i)) | [−1, 1]
F8 | Exponential function | y = −e^(−0.5 Σ_{i=1}^n x_i^2) | [−1, 1]
F9 | Griewank's function | y = Σ_{i=1}^n (x_i^2 / 4000) − Π_{i=1}^n cos(x_i / √i) + 1 | [−100, 100]
F10 | Salomon function | y = 1 − cos(2π √(Σ_{i=1}^n x_i^2)) + 0.1 √(Σ_{i=1}^n x_i^2) | [−100, 100]
F11 | Zakharov function | y = Σ_{i=1}^n x_i^2 + (Σ_{i=1}^n 0.5 i x_i)^2 + (Σ_{i=1}^n 0.5 i x_i)^4 | [−5, 10]
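As a concrete cross-check of Table 2, three of the functions are written out below as plain Python (our transcriptions; each evaluates to 0 at the origin):

```python
import math

def f1_sphere(x):
    """F1, De Jong's sphere: sum of squares."""
    return sum(v * v for v in x)

def f9_griewank(x):
    """F9, Griewank's function: sum term minus cosine product, plus one."""
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1.0

def f11_zakharov(x):
    """F11, Zakharov function: quadratic plus squared and fourth powers
    of the weighted sum."""
    s1 = sum(v * v for v in x)
    s2 = sum(0.5 * (i + 1) * v for i, v in enumerate(x))
    return s1 + s2 ** 2 + s2 ** 4

print(f1_sphere([0.0, 0.0]), f9_griewank([0.0] * 5), f11_zakharov([0.0] * 3))
```

Such direct transcriptions are also how the theoretical zero optima claimed above can be verified before running any optimizer on them.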

Table 3: The best and worst simulation results and their corresponding algorithms (dim = 2).

Function | Value | Corresponding algorithm
Best fitness:
F1 | 1.4238e−70 | VW-GWO
F2 | 3.2617e−36 | VW-GWO
F3 | 3.6792e−68 | VW-GWO
F4 | 3.3655e−66 | Std GWO
F7 | 7.8721e−222 | VW-GWO
F8 | 0 | VW-GWO, Std GWO, PSO, BA
F9 | 0 | VW-GWO, Std GWO
F11 | 2.6230e−69 | VW-GWO
Worst fitness:
F1 | 1.0213e−07 | BA
F2 | 4.1489e−04 | BA
F3 | 5.9510e−08 | BA
F4 | 2.4192e−06 | PSO
F7 | 1.0627e−24 | BA
F8 | 5.7010e−13 | BA
F9 | 1.0850e−01 | ALO
F11 | 9.9157e−09 | BA


worst job. The PSO and ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation the best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better in optimizing the benchmark functions than the std GWO algorithm as well as the ALO, the PSO algorithm, and the BA, which can also be seen from the Wilcoxon rank sum test [47] results listed in Table 5.

In Table 5, the p values of the Wilcoxon rank sum test are reported; they show that the proposed VW-GWO algorithm has superiority on most of the benchmark functions, except F5 (the Chung Reynolds function).
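The rank sum test itself is standard; a self-contained sketch using the large-sample normal approximation is given below (our simplified stand-in with average ranks for ties and no further tie correction; in practice one would use a library routine such as scipy.stats.ranksums):

```python
import math
from itertools import chain

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum p value via the normal approximation."""
    data = sorted(chain(((v, 0) for v in a), ((v, 1) for v in b)))
    ranks = [0.0] * len(data)
    i = 0
    while i < len(data):                 # assign average ranks to tied values
        j = i
        while j < len(data) and data[j][0] == data[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + 1 + j) / 2.0
        i = j
    w = sum(r for r, (_, g) in zip(ranks, data) if g == 0)  # rank sum of sample a
    n1, n2 = len(a), len(b)
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# Clearly separated samples give a small p value; identical samples give p = 1.
print(rank_sum_p([1, 2, 3, 4, 5], [10, 11, 12, 13, 14]))
```

Applied to the 100 final absolute errors of two algorithms on one benchmark function, a small p value indicates that the difference between them is statistically significant rather than an artifact of randomness.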

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the std GWO algorithm, the proposed VW-GWO algorithm does not introduce additional uncontrolled parameters; furthermore, it improves the feasibility of the std GWO algorithm by introducing an admissible maximum iteration number. On the contrary, a large amount of randomness is involved in the compared bionic algorithms, such as the ALO and PSO algorithms and the BA. Therefore, the proposed algorithm is expected to be favored by engineers who need the fastest convergence, the most precise results, and a procedure that is under the most control. Thus, there is a need to verify that the proposed algorithm converges fast, beyond the brief impression given by Figure 3.

Generally speaking, optimization algorithms are usually used to find the optima under constrained conditions. The optimization procedure must end in reality, and it is expected to end as fast as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, but the algorithm is expected to end quickly under the given conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error. The absolute values of the MAE are constrained to be less than 1.0 × 10^−3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out, and for simplicity, not all classical benchmark functions are involved. The final statistical results are listed in Tables 6–8. Note that the complexity of the ALO algorithm is very large, and it is time-exhausting on the current simulation hardware described in the Appendix, so it is not included in this experiment.

Table 6 lists the MLIT data when VW-GWO, std GWO, the PSO algorithm, and the BA are applied to the unimodal benchmark function F1. The best and worst MLIT values and their standard deviations are listed. The mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the remaining MC simulation numbers, discarding all runs in which the searching process reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared to the other algorithms involved. The data in Tables 6–8 are obtained under the same conditions; the difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetrical domain, while Table 8 lists the data obtained when the algorithms are applied to a multimodal benchmark function with an unsymmetrical domain. The same conclusion can be drawn.

Note that in this experiment, the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that when the dimension of a benchmark function is raised, the MLIT values increase dramatically. This phenomenon raises the doubt of whether the algorithm still performs the best and is capable of solving high-dimensional problems.

4.4. High-Dimensional Availability Test. Tables 6–8 show that the larger the dimensions are, the more MLIT are needed to meet the experiment constraints. However, as described in the first part, optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions; thus, the high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merits to solve large-scale problems. An experiment with dim = 200 is carried out to find the capability of the algorithms in solving high-dimensional problems. For simplicity, three classical benchmark functions, F4 (Schwefel's problem 2.21), F8 (the exponential function), and F11 (the Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments are evaluated and counted, and

Figure 3: F3 convergence vs. iterations (dim = 2): absolute errors of VW-GWO, Std GWO, ALO, PSO, and BA.


Table 4: Statistical analysis of the absolute errors of the selected functions (dim = 2).

Function | VW-GWO mean / std dev | Std GWO mean / std dev | ALO mean / std dev | PSO mean / std dev | BA mean / std dev
F1 | 7.2039e−66 / 3.5263e−65 | 6.59e−28 / 6.34e−5 [27] | 2.59e−10 / 1.65e−10 [2] | 1.36e−4 / 2.02e−4 [27] | 0.773622 / 0.528134 [2]
F2 | 1.3252e−34 / 3.5002e−34 | 7.18e−17 / 0.02901 [27] | 1.84241e−6 / 6.58e−7 [2] | 0.042144 / 0.04542 [27] | 0.334583 / 3.186022 [2]
F3 | 3.7918e−60 / 1.1757e−59 | 3.29e−6 / 79.1496 [27] | 6.0685e−10 / 6.34e−10 [2] | 70.12562 / 22.1192 [27] | 0.115303 / 0.766036 [2]
F4 | 2.2262e−46 / 2.8758e−46 | 5.61e−7 / 1.31509 [27] | 1.36061e−8 / 1.81e−9 [2] | 0.31704 / 7.3549 [27] | 0.192185 / 0.890266 [2]
F5 | 3.6015e−131 / 9.0004e−131 | 7.8319e−97 / 2.4767e−96 | 2.1459e−20 / 2.8034e−20 | 8.4327e−20 / 1.7396e−19 | 1.7314e−17 / 4.9414e−17
F9 | 0.0047 / 0.0040 | 0.00449 / 0.00666 [27] | 0.0301 / 0.0329 | 0.00922 / 0.00772 [27] | 0.0436 / 0.0294
F10 | 0.0200 / 0.0421 | 0.0499 / 0.0526 | 0.01860449 / 0.009545 [2] | 0.273674 / 0.204348 [2] | 1.451575 / 0.570309 [2]
F11 | 1.2999e−60 / 4.1057e−60 | 6.8181e−35 / 1.5724e−34 | 1.1562e−13 / 1.2486e−13 | 2.3956e−12 / 3.6568e−12 | 5.0662e−09 / 4.9926e−09


each time, the search procedure is also iterated for a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.

To test its capability even further, we also carry out an experiment to verify the capability of solving some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimensions from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank sum test for VW-GWO over the benchmark functions (dim = 2).

Algorithm | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11
Std GWO | 0.000246 | 0.00033 | 0.000183 | 0.00044 | 0.000183 | 0 | 0.000183 | — | 0.466753 | 0.161972 | 0.000183
PSO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.167489 | 0.004435 | 0.025748 | 0.000183
ALO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.36812 | 0.790566 | 0.025748 | 0.000183
BA | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0 | 0.000183 | 0.000747 | 0.004435 | 0.01133 | 0.000183

Table 6: MLITs and statistical results for F1.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 6 | 12 | 9.90 | 1.7180e−193 | 1.0493 | 100
2 | Std GWO | 7 | 13 | 10.38 | 1.8380e−22 | 1.2291 | 100
2 | PSO | 48 | 1093 | 357.97 | 3.2203e−22 | 205.3043 | 100
2 | BA | 29 | 59 | 41.00 | 1.3405e−101 | 5.8517 | 100
10 | VW-GWO | 53 | 66 | 59.97 | 4.1940e−177 | 2.7614 | 100
10 | Std GWO | 74 | 89 | 80.40 | 1.9792e−80 | 2.7614 | 100
10 | PSO | 5713 | 11510 | 9279.22 | 2.9716e−76 | 1300.8485 | 88
10 | BA | 6919 | 97794 | 44999.04 | 7.5232e−26 | 25133.3096 | 78
30 | VW-GWO | 55 | 67 | 59.85 | 1.2568e−122 | 2.4345 | 100
30 | Std GWO | 71 | 86 | 80.07 | 2.6197e−79 | 3.3492 | 100
30 | PSO | 5549 | 12262 | 9314.78 | 9.6390e−83 | 1316.3384 | 96
30 | BA | 7238 | 92997 | 44189.16 | 5.2685e−26 | 24831.7443 | 79

Table 7: MLITs and statistical results for F7.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 1 | 3 | 1.46 | 6.3755e−226 | 0.5397 | 100
2 | Std GWO | 1 | 2 | 1.41 | 1.0070e−229 | 0.4943 | 100
2 | PSO | 2 | 2 | 2.00 | 0 | 0 | 100
2 | BA | 1 | 3 | 1.02 | 8.5046e−269 | 0.200 | 100
10 | VW-GWO | 5 | 9 | 7.65 | 5.7134e−199 | 0.9468 | 100
10 | Std GWO | 5 | 11 | 7.48 | 5.1288e−191 | 1.1413 | 100
10 | PSO | 4 | 65 | 24.23 | 1.6196e−85 | 10.9829 | 100
10 | BA | 13 | 49 | 25.29 | 5.9676e−109 | 6.2366 | 100
30 | VW-GWO | 13 | 22 | 17.14 | 9.6509e−167 | 1.7980 | 100
30 | Std GWO | 15 | 30 | 20.80 | 1.3043e−148 | 2.6208 | 100
30 | PSO | 54 | 255 | 133.32 | 5.7600e−12 | 42.5972 | 100
30 | BA | 40 | 101 | 62.68 | 1.8501e−53 | 11.8286 | 100

Table 8: MLITs and statistical results for F11.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 3 | 9 | 6.63 | 5.6526e−188 | 1.2363 | 100
2 | Std GWO | 4 | 10 | 6.66 | 3.5865e−186 | 1.2888 | 100
2 | PSO | 6 | 125 | 46.35 | 1.6006e−37 | 26.0835 | 100
2 | BA | 5 | 62 | 27.58 | 1.6166e−83 | 11.0080 | 100
10 | VW-GWO | 10 | 200 | 65.57 | 2.8562e−12 | 43.2281 | 100
10 | Std GWO | 14 | 246 | 68.68 | 2.6622e−11 | 41.7104 | 100
10 | PSO | 15 | 1356 | 231.74 | 1.2116e−6 | 257.1490 | 94
10 | BA | 15 | 214 | 113.19 | 5.1511e−2 | 66.9189 | 100
30 | VW-GWO | 49 | 1179 | 312.24 | 1.2262e−18 | 194.7643 | 100
30 | Std GWO | 65 | 945 | 294.45 | 3.1486e−21 | 160.7119 | 100
30 | PSO | 32 | 5005 | 1086.11 | 6.0513e−13 | 980.3386 | 72
30 | BA | 66 | 403 | 221.60 | 1.9072e−51 | 40.5854 | 100

Computational Intelligence and Neuroscience 9

Table 9: Statistical analysis on the absolute errors of the selected functions (dim = 200). For each algorithm, the mean and standard deviation over the 100 MC runs are given.

Functions | VW-GWO (Mean / Std deviation) | Std GWO (Mean / Std deviation) | ALO (Mean / Std deviation) | PSO (Mean / Std deviation) | BA (Mean / Std deviation)
F4 | 3.3556e−53 / 8.7424e−53 | 1.6051e−46 / 2.2035e−46 | 4.2333e−07 / 2.9234e−07 | 3.0178e−07 / 6.5449e−07 | 1.6401e−07 / 2.1450e−07
F8 | 0 / 0 | 3.3307e−17 / 7.4934e−17 | 1.1102e−17 / 3.5108e−17 | – / – | 1.4466e−14 / 1.9684e−14
F11 | 0.0115 / 0.0193 | 0.0364 / 0.0640 | 8.3831 / 10.3213 | 12.6649 / 13.0098 | 4.7528e+16 / 2.8097e+16


absolute errors averaged over the MC runs, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are then introduced into their searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced so that it declines exponentially from its maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm. Comparisons are made with the original GWO, the ALO and PSO algorithms, and the BA.
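The hypothesized weighted searching process summarized above can be sketched in Python (the authors' implementation is in Matlab). The D, A, and C quantities follow the standard GWO update; the weight triple and the helper name `vw_gwo_step` are illustrative assumptions only, since the exact governing equations of the variable weights are given in the methodology sections of the paper.

```python
import random

def vw_gwo_step(wolves, fitness, a, weights=(0.5, 0.3, 0.2)):
    """One iteration of a variable-weight position update.

    The three best wolves (alpha, beta, delta) guide the pack as in the
    standard GWO, but their candidate positions are combined with weights
    w1 >= w2 >= w3 (summing to 1) that follow the social hierarchy,
    instead of the plain average used by the standard GWO.  The weight
    values here are illustrative placeholders.
    """
    ranked = sorted(wolves, key=fitness)
    leaders = ranked[:3]                       # alpha, beta, delta
    dim = len(leaders[0])
    updated = []
    for x in wolves:
        position = []
        for d in range(dim):
            acc = 0.0
            for leader, w in zip(leaders, weights):
                r1, r2 = random.random(), random.random()
                A = 2.0 * a * r1 - a           # coefficient A, driven by a
                C = 2.0 * r2                   # coefficient C
                D = abs(C * leader[d] - x[d])  # distance to the leader
                acc += w * (leader[d] - A * D)
            position.append(acc)
        updated.append(position)
    return updated
```

With `weights = (1/3, 1/3, 1/3)`, this reduces to the standard GWO averaging rule, which makes the role of the hierarchy-following weights explicit.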

All the selected experimental results show that the proposed VW-GWO algorithm works better than the others under different conditions. Varying the dimension does not change its first position among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge. This leads to faster convergence and wide applicability, but the algorithm is not found to be capable on all of the benchmark functions. Further work would be needed to explain the reasons mathematically. Other initialization algorithms might be needed to let the initial swarm individuals spread throughout the domain, and new searching rules for when the individuals are in the basins would be another hot spot of future work.

Appendix

The simulation platform described in Section 3.3 is run on an assembled desktop computer configured as follows: CPU, Xeon E3-1231 v3; GPU, NVidia GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software of this paper can be downloaded from http://d.dl.escience.cn/f/Erl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors' Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea based on the GWO algorithm and programmed the work with Matlab. Her major contributions are the programmed work and the proposed exponentially declining governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by the research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[3] Y. Xin-She, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.

[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.

[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1 - bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.

[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.

[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.

[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.

[9] Z. Juan and G. Zheng-Ming, The Bat Algorithm and Its Parameters, Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimensions based on VW-GWO (F1, F2, F3, and F11; dimensions from 100 to 1000).

[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] Y. Chun-man, G. Bao-long, and W. Xian-xiang, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99) (Cat. No. 99TH8406), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uğuz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyettou, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a Robust Grey Wolf Optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.

[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.

[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.




We can see from Figure 2 the following: (1) the maximum value a_m of the controlling parameter a influences the MLIT under a given MAE; when a_m is smaller than 1.0, the smaller a_m is, the more iterations (MLIT) are needed, and on the contrary, when a_m is larger than 2.5, the larger a_m is, the more iterations are needed; (2) a_m should therefore be varied in [1.0, 2.5], and a_m is found to be best when it is 1.6 or 1.7.
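The exponentially declining controlling parameter discussed above can be sketched as follows. The exact governing equation is given earlier in the paper; the functional form a(t) = a_m · e^(−kt/T) and the decay constant k used in this sketch are illustrative assumptions only, while a_max = 1.7 follows the observation that a_m is best around 1.6 or 1.7.

```python
import math

def controlling_parameter(t, max_iter, a_max=1.7, k=5.0):
    """Exponentially declining controlling parameter a(t).

    Declines exponentially from its maximum a_max at t = 0 toward zero
    as t approaches max_iter.  The decay constant k is an assumed,
    illustrative value, not the one used by the authors.
    """
    return a_max * math.exp(-k * t / max_iter)
```

For example, `controlling_parameter(0, 500)` equals the maximum 1.7, and the value decays monotonically as the iterations proceed, which is what reduces the probability of being trapped in local optima according to the paper's design.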

3.2. Benchmark Functions. Benchmark functions are standard functions derived from research on nature. They are usually diverse, unbiased, and difficult to solve with analytical expressions. Benchmark functions have become an essential way to test the reliability, efficiency, and validity of optimization algorithms. They vary in the number of ambiguous peaks in the function landscape, the shape of the basins or valleys, separability, and dimensionality. Mathematically speaking, benchmark functions can be classified by the following five attributes [46]:

(a) Continuous or discontinuous: most of the functions are continuous, but some are not.

(b) Differentiable or nondifferentiable: some of the functions can be differentiated, but some cannot.

(c) Separable or nonseparable: some of the functions can be separated, but some cannot.

(d) Scalable or nonscalable: some of the functions can be extended to any dimensionality, but some are fixed to two or three dimensions.

(e) Unimodal or multimodal: some of the functions have only one peak in their landscape, while some have many peaks. The former attribute is called unimodal, and the latter multimodal.

There are 175 benchmark functions summarized in the literature [46]. In this paper, we choose 11 benchmark functions, from simple to complex, covering all five of the above characteristics. They are fit to test the capability of the involved algorithms, as listed in Table 2, and they are all scalable.

The functions are all n-dimensional, and their input vectors x = (x1, x2, …, xn) are limited by the domain. Values in the domain have a maximum of ub and a minimum of lb. For simplicity, the theoretical optima of the functions are all zero.
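The selected benchmark functions are straightforward to implement; a minimal Python sketch of four of them, following the expressions in Table 2, is given below (the function names are ours, not the authors'):

```python
import math

# Plain-Python renderings of four benchmark functions from Table 2.
# Each takes a list x of length n; the theoretical optimum of each
# is 0 at the origin, matching the zero optima stated above.

def f1_sphere(x):
    """F1, De Jong's sphere: sum of squares, domain [-100, 100]."""
    return sum(v * v for v in x)

def f2_schwefel_222(x):
    """F2, Schwefel's problem 2.22: sum plus product of |x_i|."""
    total, prod = 0.0, 1.0
    for v in x:
        total += abs(v)
        prod *= abs(v)
    return total + prod

def f9_griewank(x):
    """F9, Griewank's function, domain [-100, 100]."""
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def f11_zakharov(x):
    """F11, Zakharov function, domain [-5, 10]."""
    s1 = sum(v * v for v in x)
    s2 = sum(0.5 * i * v for i, v in enumerate(x, start=1))
    return s1 + s2 ** 2 + s2 ** 4
```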

4. Results and Discussion

Eleven benchmark functions are involved in this study. Comparisons are made with the standard grey wolf optimization algorithm (std GWO) and three other bionic methods: the ant lion optimization (ALO) algorithm, the PSO algorithm, and the BA.

4.1. General Reviews of the Algorithms. Randomness is involved in all of the algorithms studied in this paper, for example, random positions, random velocities, and random controlling parameters. The randomness causes the fitness values obtained during the optimization procedure to fluctuate. So, when an individual of the swarm is initialized

Figure 2: Relationship between the MLIT and the maximum value a_m (minimum iterations to reach E = 1e−25 for F1, F2, and F3, with a_m varied from 0.5 to 5).


or randomly jumps to a position quite near the optimum, the best fitness value will be met. Table 3 lists the best and worst fitness results of some chosen benchmark functions and their corresponding algorithms. During this experiment, 100 Monte Carlo (MC) simulations are carried out for every benchmark function. The results show that the randomness indeed introduces some variation, but most of the time, the final results depend more on the algorithms.

At first glance of Table 3, the GWO algorithms always work best: either the VW-GWO or the std GWO algorithm could optimize the benchmark functions to their optima with little absolute error, while the proposed VW-GWO algorithm is almost always the best one. The other compared algorithms, such as the PSO and ALO algorithms and the BA, lead to the worst results most of the time. This means that the GWO algorithms are more capable and that the proposed VW-GWO algorithm indeed improves the capability of the std GWO algorithm. A plot of the absolute errors averaged over MC = 100 runs versus iterations also leads to this conclusion, as shown in Figure 3.

The convergence curve of the F3 benchmark function over the iterations is demonstrated in Figure 3. It shows that the proposed VW-GWO algorithm results in faster convergence, lower residual errors, and stable convergence behavior.

4.2. Comparison, Statistical Analysis, and Test. A general acquaintance with the metaheuristic algorithms might be got from Table 3 and Figure 3. However, optimization problems often demand statistical analysis and tests. To do this, 100 MC simulations are carried out on the benchmark functions. The benchmark functions are all two-dimensional, and they are optimized by the newly proposed VW-GWO and the other four algorithms over 100 runs. Because the benchmark functions all have optima concentrated at zero, the simulated fitness results are also the absolute errors. The mean values of the absolute errors and the standard deviations of the final results are listed in Table 4; some of the values are quoted from published work, and the corresponding references are listed.

The proposed VW-GWO algorithm and the compared algorithms are almost all capable of searching out the global optima of the benchmark functions. The detailed values in Table 4 show that the standard deviations over the 100 MC simulations are all small. We can further draw the following conclusions:

(1) All of the algorithms involved in this study were able to find the optimum.

(2) All of the benchmark functions tested in this experiment could be optimized, whether they are unimodal or multimodal and whether the domain is symmetric or unsymmetric.

(3) Comparatively speaking, although the bat algorithm involves much more randomness, it did the

Table 2: Benchmark functions to be fitted.

F1, De Jong's sphere: y = Σ_{i=1..n} x_i², domain [−100, 100]
F2, Schwefel's problem 2.22: y = Σ_{i=1..n} |x_i| + Π_{i=1..n} |x_i|, domain [−100, 100]
F3, Schwefel's problem 1.2: y = Σ_{i=1..n} (Σ_{j=1..i} x_j)², domain [−100, 100]
F4, Schwefel's problem 2.21: y = max_{1≤i≤n} |x_i|, domain [−100, 100]
F5, Chung Reynolds function: y = (Σ_{i=1..n} x_i²)², domain [−100, 100]
F6, Schwefel's problem 2.20: y = Σ_{i=1..n} |x_i|, domain [−100, 100]
F7, Csendes function: y = Σ_{i=1..n} x_i² (2 + sin(1/x_i)), domain [−1, 1]
F8, Exponential function: y = −e^(−0.5 Σ_{i=1..n} x_i²), domain [−1, 1]
F9, Griewank's function: y = Σ_{i=1..n} x_i²/4000 − Π_{i=1..n} cos(x_i/√i) + 1, domain [−100, 100]
F10, Salomon function: y = 1 − cos(2π √(Σ_{i=1..n} x_i²)) + 0.1 √(Σ_{i=1..n} x_i²), domain [−100, 100]
F11, Zakharov function: y = Σ_{i=1..n} x_i² + (Σ_{i=1..n} 0.5 i x_i)² + (Σ_{i=1..n} 0.5 i x_i)⁴, domain [−5, 10]

Table 3: The best and worst simulation results and their corresponding algorithms (dim = 2).

Best fitness:
F1: 1.4238e−70 (VW-GWO)
F2: 3.2617e−36 (VW-GWO)
F3: 3.6792e−68 (VW-GWO)
F4: 3.3655e−66 (Std GWO)
F7: 7.8721e−222 (VW-GWO)
F8: 0 (VW-GWO, Std GWO, PSO, BA)
F9: 0 (VW-GWO, Std GWO)
F11: 2.6230e−69 (VW-GWO)

Worst fitness:
F1: 1.0213e−07 (BA)
F2: 4.1489e−04 (BA)
F3: 5.9510e−08 (BA)
F4: 2.4192e−06 (PSO)
F7: 1.0627e−24 (BA)
F8: 5.7010e−13 (BA)
F9: 1.0850e−01 (ALO)
F11: 9.9157e−09 (BA)


worst job. The PSO and ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better in optimizing the benchmark functions than the std GWO algorithm, as well as the ALO and PSO algorithms and the BA, which can also be seen from the Wilcoxon rank sum test [47] results listed in Table 5.

In Table 5, the p values of the Wilcoxon rank sum test are reported; they show that the proposed VW-GWO algorithm is superior on most of the benchmark functions, with the exception of F5, the Chung Reynolds function.
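The Wilcoxon rank sum test used above can be reproduced with a short routine. This is a plain normal-approximation sketch (ties receive average ranks, but no tie correction is applied to the variance), not the statistical code used by the authors:

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank sum test via the normal approximation.

    Pools both samples, assigns average ranks to tied values, sums the
    ranks of sample `a`, and standardizes against the null mean and
    variance of the rank sum.  Simplification: the variance carries no
    tie correction, so p values with many ties are approximate.
    """
    n1, n2 = len(a), len(b)
    pooled = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(pooled):                      # walk over tie groups
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + 1 + j) / 2.0        # average of ranks i+1..j
        i = j
    w = sum(r for r, (_, lab) in zip(ranks, pooled) if lab == 0)
    mu = n1 * (n1 + n2 + 1) / 2.0               # null mean of the rank sum
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))   # two-sided p value
```

Applied to two sets of 100 final absolute errors (one per algorithm), p values below the significance level indicate that the difference between the two algorithms is statistically meaningful, which is how Table 5 should be read.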

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the std GWO algorithm, the proposed VW-GWO algorithm does not introduce additional uncontrolled parameters; furthermore, it improves the feasibility of the std GWO algorithm by introducing an admissible maximum iteration number. On the contrary, there is a large amount of randomness in the compared bionic algorithms, such as the ALO and PSO algorithms and the BA. Therefore, the proposed algorithm is expected to be favored by engineers who need the fastest convergence, the most precise results, and the most control. Thus, there is a need to verify that the proposed algorithm is fast convergent, beyond the brief acquaintance given by Figure 3.

Generally speaking, optimization algorithms are usually used to find the optima under constrained conditions. In reality, the optimization procedure must end, and it is expected to end as fast as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, but the algorithm is still expected to end quickly under the given conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error (MAE). The absolute values of the MAE are constrained to be less than 1.0 × 10^−3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out, and for simplicity, not all of the classical benchmark functions are involved. The final statistical results are listed in Tables 6–8. Note that the complexity of the ALO algorithm is very high, and it is time-exhausting on the current simulation hardware described in the Appendix, so it is not included in this experiment.

Table 6 lists the MLIT data when the VW-GWO, std GWO, and PSO algorithms and the BA are applied to the unimodal benchmark function F1. The best, worst, and standard deviation MLIT values are listed. The mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the number of remaining MC simulations, discarding all of the data for which the searching process reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared to the other algorithms involved. The data in Tables 6–8 are obtained under the same conditions; the only difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetrical domain, whereas Table 8 lists the data obtained when the algorithms are applied to a multimodal benchmark function with an unsymmetrical domain. The same conclusion can be drawn.

Note that in this experiment, the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that, if the dimension of a benchmark function is raised, the MLIT values increase dramatically. This phenomenon raises the doubt of whether the algorithm still performs best and is capable of solving high-dimensional problems.
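The MLIT measurement procedure described in this section can be sketched as follows. The names `least_iterations`, `optimizer_step`, and `mlit` are hypothetical; runs that hit the admissible maximum iteration number M are discarded, as in Tables 6–8:

```python
def least_iterations(optimizer_step, init_swarm, fitness,
                     mae=1e-3, max_iter=100000):
    """Iterations needed before the best absolute error drops below mae.

    optimizer_step stands for one iteration of any of the compared
    algorithms.  Returns None when the admissible maximum iteration
    number M (max_iter) is reached first; such runs are discarded from
    the MLIT statistics.
    """
    swarm = init_swarm
    for t in range(1, max_iter + 1):
        swarm = optimizer_step(swarm, fitness)
        if min(abs(fitness(x)) for x in swarm) < mae:
            return t
    return None

def mlit(run_results):
    """Mean least iteration times over the successful MC runs.

    Returns (mean MLIT, number of runs kept), matching the Mean and
    Number columns of Tables 6-8.
    """
    kept = [r for r in run_results if r is not None]
    return sum(kept) / len(kept), len(kept)
```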

4.4. High-Dimensional Availability Test. Tables 6–8 show that the larger the dimension is, the more MLIT is needed to meet the experimental constraints. However, as described in the first section, optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions; thus, the high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merit of solving large-scale problems. An experiment with dim = 200 is carried out to test the capability of the algorithms in solving high-dimensional problems. For simplicity, three classical benchmark functions, F4 (Schwefel's problem 2.21), F8 (the exponential function), and F11 (the Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments are evaluated and counted, and

Figure 3: F3 convergence vs. iterations (dim = 2), comparing VW-GWO, std GWO, ALO, PSO, and BA.


Table 4: Statistical analysis on the absolute errors of the selected functions (dim = 2). For each algorithm, the mean and standard deviation over the 100 MC runs are given; bracketed values are quoted from the listed references.

Functions | VW-GWO (Mean / Std deviation) | Std GWO (Mean / Std deviation) | ALO (Mean / Std deviation) | PSO (Mean / Std deviation) | BA (Mean / Std deviation)
F1 | 7.2039e−66 / 3.5263e−65 | 6.59e−28 / 6.34e−5 [27] | 2.59e−10 / 1.65e−10 [2] | 1.36e−4 / 2.02e−4 [27] | 0.773622 / 0.528134 [2]
F2 | 1.3252e−34 / 3.5002e−34 | 7.18e−17 / 0.02901 [27] | 1.84241e−6 / 6.58e−7 [2] | 0.042144 / 0.04542 [27] | 0.334583 / 3.186022 [2]
F3 | 3.7918e−60 / 1.1757e−59 | 3.29e−6 / 79.1496 [27] | 6.0685e−10 / 6.34e−10 [2] | 70.12562 / 22.1192 [27] | 0.115303 / 0.766036 [2]
F4 | 2.2262e−46 / 2.8758e−46 | 5.61e−7 / 1.31509 [27] | 1.36061e−8 / 1.81e−9 [2] | 0.31704 / 7.3549 [27] | 0.192185 / 0.890266 [2]
F5 | 3.6015e−131 / 9.0004e−131 | 7.8319e−97 / 2.4767e−96 | 2.1459e−20 / 2.8034e−20 | 8.4327e−20 / 1.7396e−19 | 1.7314e−17 / 4.9414e−17
F9 | 0.0047 / 0.0040 | 0.00449 / 0.00666 [27] | 0.0301 / 0.0329 | 0.00922 / 0.00772 [27] | 0.0436 / 0.0294
F10 | 0.0200 / 0.0421 | 0.0499 / 0.0526 | 0.01860449 / 0.009545 [2] | 0.273674 / 0.204348 [2] | 1.451575 / 0.570309 [2]
F11 | 1.2999e−60 / 4.1057e−60 | 6.8181e−35 / 1.5724e−34 | 1.1562e−13 / 1.2486e−13 | 2.3956e−12 / 3.6568e−12 | 5.0662e−09 / 4.9926e−09


each time, the search procedure is also iterated a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.


Acknowledgments

is work was supported in part by Natural ScienceFoundation of Jingchu University of Technology with grantno ZR201514 and the research project of Hubei ProvincialDepartment of Education with grant no B2018241

References

[1] D H Wolpert and W G Macready ldquoNo free lunch theoremsfor optimizationrdquo IEEE Transactions on Evolutionary Com-putation vol 1 no 1 pp 67ndash82 1997

[2] S Mirjalili ldquo e ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[3] Y Xin-She Nature-Inpsired Optimization AlgorithmsElsevier Amsterdam Netherlands 2014

[4] H Zang S Zhang and K Hapeshi ldquoA review of nature-inspired algorithmsrdquo Journal of Bionic Engineering vol 7no 4 pp S232ndashS237 2010

[5] X S Yang S F Chien and T O Ting ldquoChapter 1-bio-inspired computation and optimization an overviewrdquo in Bio-Inspired Computation in Telecommunications X S YangS F Chien and T O Ting Eds Morgan Kaufmann BostonMA USA 2015

[6] A Syberfeldt and S Lidberg ldquoReal-world simulation-basedmanufacturing optimization using cuckoo search simulationconference (WSC)rdquo in Proceedings of the 2012 Winter Sim-ulation Conference (WSC) pp 1ndash12 Berlin GermanyDecember 2012

[7] L D S Coelho and V CMariani ldquoImproved rey algorithmapproach applied to chiller loading for energy conservationrdquoEnergy and Buildings vol 59 pp 273ndash278 2013

[8] C W Reynolds ldquoFlocks herds and schools a distributedbehavioral modelrdquo ACM SIGGRAPH Computer Graphicsvol 21 no 4 pp 25ndash34 1987

[9] Z Juan and G Zheng-Ming e Bat Algorithm and Its Pa-rameters Electronics Communications and Networks IV CRCPress Boca Raton FL USA 2015

F1F2

F3F11

10ndash80

10ndash60

10ndash40

10ndash20

100

1020

Abso

lute

erro

rs

200 300 400 500 600 700 800 900 1000100Dimension

Figure 4 Absolute errors vs dimensions based on VM-GWO

Computational Intelligence and Neuroscience 11

[10] J J Q Yu and V O K Li ldquoA social spider algorithm for globaloptimizationrdquo Applied Soft Computing vol 30 pp 614ndash6272015

[11] R Azizi ldquoEmpirical study of artificial fish swarm algorithmrdquoInternational Journal of Computing Communications andNetworking vol 3 no 1ndash3 pp 1ndash7 2014

[12] L Yan-Xia L Lin and Zhaoyang ldquoImproved ant colonyalgorithm for evaluation of graduatesrsquo physical conditionsmeasuring technology and mechatronics automation(ICMTMA)rdquo in Proceedings of the 2014 Sixth InternationalConference on Measuring Technology and Mechatronics Au-tomation pp 333ndash336 Zhangjiajie China January 2014

[13] Z Xiu Z Xin S L Ho and W N Fu ldquoA modification ofartificial bee colony algorithm applied to loudspeaker designproblemrdquo IEEE Transactions on Magnetics vol 50 no 2pp 737ndash740 2014

[14] M K Marichelvam T Prabaharan and X S Yang ldquoA dis-crete firefly algorithm for the multi-objective hybrid flowshopscheduling problemsrdquo IEEE Transactions on EvolutionaryComputation vol 18 no 2 pp 301ndash305 2014

[15] Y A N Chun-man G U O Bao-long andW U Xian-xiangldquoEmpirical study of the inertia weight particle swarm opti-mization with constraint factorrdquo International Journal of SoftComputing and Software Engineering [JSCSE] vol 2 no 2pp 1ndash8 2012

[16] Y Shi and R C Eberhart ldquoEmpirical study of particle swarmoptimizationrdquo in Proceedings of the 1999 Congress on Evo-lutionary Computation-CEC99 (Cat No 99TH8406)pp 345ndash357 Washington DC USA July 1999

[17] S Yılmaz and E U Kuccediluksille ldquoA newmodification approachon bat algorithm for solving optimization problemsrdquo AppliedSoft Computing vol 28 pp 259ndash275 2015

[18] A Basak D Maity and S Das ldquoA differential invasive weedoptimization algorithm for improved global numerical op-timizationrdquo Applied Mathematics and Computation vol 219no 12 pp 6645ndash6668 2013

[19] X Yuan T Zhang Y Xiang and X Dai ldquoParallel chaosoptimization algorithm with migration and merging op-erationrdquo Applied Soft Computing vol 35 pp 591ndash6042015

[20] M Kang J Kim and J M Kim ldquoReliable fault diagnosis forincipient low-speed bearings using fault feature analysis basedon a binary bat algorithmrdquo Information Sciences vol 294pp 423ndash438 2015

[21] Z Chen Y Zhou and M Lu ldquoA simplied adaptive bat al-gorithm based on frequencyrdquo Journal of Computational In-formation Systems vol 9 pp 6451ndash6458 2013

[22] J H Holland Adaptation in Natural and Artificial SystemsUniversity of Michigan Press Ann Arbor MI USA 1975

[23] N Metropolis A W Rosenbluth M N Rosenbluth andA H Teller ldquoEquation of state calculations by fast computingmachinesrdquo Journal of Chemical Physics vol 21 no 6pp 1087ndash1092 1953

[24] M Dorigo and M Birattari ldquoAnt colony optimizationrdquo IEEEComputational Intelligence Magazine vol 1 no 4 pp 28ndash392006

[25] X S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization(NICSO 2010) J Gonzalez D Pelta C Cruz et al EdsSpringer Berlin Germany 2010

[26] H Haklı and H Uguz ldquoA novel particle swarm optimizationalgorithm with Levy flightrdquo Applied Soft Computing vol 23pp 333ndash345 2014

[27] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[28] G M Komaki and V Kayvanfar ldquoGrey wolf optimizer al-gorithm for the two-stage assembly flow shop schedulingproblem with release timerdquo Journal of Computational Sciencevol 8 pp 109ndash120 2015

[29] Y Sharma and L C Saikia ldquoAutomatic generation control ofa multi-area ST-thermal power system using grey wolf op-timizer algorithm based classical controllersrdquo InternationalJournal of Electrical Power amp Energy Systems vol 73pp 853ndash862 2015

[30] B Mahdad and K Srairi ldquoBlackout risk prevention in a smartgrid based flexible optimal strategy using grey wolf-patternsearch algorithmsrdquo Energy Conversion and Managementvol 98 pp 411ndash429 2015

[31] X Song L Tang S Zhao et al ldquoGrey wolf optimizer forparameter estimation in surface wavesrdquo Soil Dynamics andEarthquake Engineering vol 75 pp 147ndash157 2015

[32] N Jayakumar S Subramanian S Ganesan andE B Elanchezhian ldquoGrey wolf optimization for combinedheat and power dispatch with cogeneration systemsrdquo In-ternational Journal of Electrical Power amp Energy Systemsvol 74 pp 252ndash264 2016

[33] S A Medjahed T A Saadi A Benyetto and M Ouali ldquoGraywolf optimizer for hyperspectral band selectionrdquo Applied SoftComputing vol 40 pp 178ndash186 2016

[34] E Emary H M Zawbaa and A E Hassanien ldquoBinary greywolf optimization approaches for feature selectionrdquo Neuro-computing vol 172 pp 371ndash381 2016

[35] D Guha P K Roy and S Banerjee ldquoLoad frequency controlof interconnected power system using grey wolf optimiza-tionrdquo Swarm and Evolutionary Computation vol 27pp 97ndash115 2016

[36] M H Sulaiman ZMustaffa M RMohamed andO AlimanldquoUsing the gray wolf optimizer for solving optimal reactivepower dispatch problemrdquo Applied Soft Computing vol 32pp 286ndash292 2015

[37] S Mirjalili S Saremi S M Mirjalili and L D S CoelholdquoMulti-objective grey wolf optimizer a novel algorithm formulti-criterion optimizationrdquo Expert Systems with Applica-tions vol 47 pp 106ndash119 2016

[38] E EmaryW Yamany A E Hassanien and V Snasel ldquoMulti-objective gray-wolf optimization for attribute reductionrdquoProcedia Computer Science vol 65 pp 623ndash632 2015

[39] S Saremi S Z Mirjalili and S M Mirjalili ldquoEvolutionarypopulation dynamics and grey wolf optimizerrdquo NeuralComputing and Applications vol 26 no 5 pp 1257ndash12632015

[40] R E Precup R C David E M Petriu A I Szedlak-Stineanand C A Bojan-Dragos ldquoGrey wolf optimizer-based ap-proach to the tuning of pi-fuzzy controllers with a reducedprocess parametric sensitivityrdquo IFAC-PapersOnLine vol 49no 5 pp 55ndash60 2016

[41] A Noshadi J Shi W S Lee P Shi and A Kalam ldquoOptimalPID-type fuzzy logic controller for a multi-input multi-outputactive magnetic bearing systemrdquo Neural Computing andApplications vol 27 no 7 pp 2031ndash2046 2016

[42] P B de Moura Oliveira H Freire and E J Solteiro PiresldquoGrey wolf optimization for PID controller design withprescribed robustness marginsrdquo Soft Computing vol 20no 11 pp 4243ndash4255 2016

[43] S Khalilpourazari and S Khalilpourazary ldquoOptimization ofproduction time in the multi-pass milling process via a Robust

12 Computational Intelligence and Neuroscience

Grey Wolf Optimizerrdquo Neural Computing and Applicationsvol 29 no 12 pp 1321ndash1336 2018

[44] R El Sehiemy A Shaheen and A Abou El-Ela ldquoMulti-objective fuzzy-based procedure for enhancing reactivepower managementrdquo IET Generation Transmission amp Dis-tribution vol 7 no 12 pp 1453ndash1460 2013

[45] A H Gandomi X-S Yang A H Alavi and S TalataharildquoBat algorithm for constrained optimization tasksrdquo NeuralComputing and Applications vol 22 no 6 pp 1239ndash12552013

[46] M Jamil and X S Yang ldquoA literature survey of benchmarkfunctions for global optimisation problemsrdquo InternationalJournal of Mathematical Modelling and Numerical Optimi-sation vol 4 no 2 pp 150ndash194 2013

[47] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm in-telligence algorithmsrdquo Swarm and Evolutionary Computationvol 1 no 1 pp 3ndash18 2011

Computational Intelligence and Neuroscience 13



or it randomly jumps to a position quite near the optimum, where the best fitness value would be met. Table 3 lists the best and worst fitness results of some chosen benchmark functions and their corresponding algorithms. During this experiment, 100 Monte Carlo (MC) simulations are carried out for every benchmark function. The results show that the randomness indeed introduces some variation, but most of the time the final results depend more on the algorithms themselves.

At first glance of Table 3, the GWO algorithms always work best: either the VW-GWO or the std GWO algorithm optimizes the benchmark functions closest to their optima with little absolute error, and the proposed VW-GWO algorithm is almost always the best one. The other compared algorithms, such as the PSO and ALO algorithms and the BA, lead to the worst results most of the time. This means that the GWO algorithms are more capable and that the proposed VW-GWO algorithm indeed improves the capability of the std GWO algorithm. A figure of the absolute errors averaged over the 100 MC runs versus iterations leads to the same conclusion, as shown in Figure 3.

The convergence rate curve over the iterations for the F3 benchmark function is demonstrated in Figure 3. It shows that the proposed VW-GWO algorithm results in faster convergence, lower residual errors, and stable convergence.

4.2. Comparison, Statistical Analysis, and Test. A general acquaintance with the metaheuristic algorithms can be obtained from Table 3 and Figure 3. However, optimization problems often demand statistical analysis and tests. To do this, 100 MC simulations are carried out on the benchmark functions. The benchmark functions are all two-dimensional, and they are optimized by the newly proposed VW-GWO and the other four algorithms over 100 runs. Since the benchmark functions all have their optima at zero, the simulated fitness results are also their absolute errors. The mean values of the absolute errors and the standard deviations of the final results are listed in Table 4; some of the values are quoted from published work, and the references are listed correspondingly.
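The bookkeeping behind these statistics can be sketched as follows. This is a minimal illustration, not the paper's code (the original experiments were programmed in Matlab): the optimizer here is a hypothetical random-search stand-in, and since the benchmark optima are all zero, the best fitness found is itself the absolute error.

```python
import random
import statistics

def f1_sphere(x):
    """F1, De Jong's sphere; its optimum is 0, so a fitness value is
    directly an absolute error."""
    return sum(v * v for v in x)

def random_search(dim=2, iters=200):
    """Hypothetical stand-in optimizer: plain random search on F1."""
    best = float("inf")
    for _ in range(iters):
        x = [random.uniform(-100.0, 100.0) for _ in range(dim)]
        best = min(best, f1_sphere(x))
    return best

def mc_statistics(optimizer, runs=100):
    """Mean and standard deviation of the absolute errors over the
    Monte Carlo runs, as reported in Table 4."""
    errors = [optimizer() for _ in range(runs)]
    return statistics.mean(errors), statistics.pstdev(errors)

mean_err, std_err = mc_statistics(random_search, runs=20)
assert mean_err >= 0.0 and std_err >= 0.0
```

In the paper's setting the stand-in would be replaced by one full run of VW-GWO (or a compared algorithm) returning its final fitness.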

The proposed VW-GWO algorithm and the compared algorithms are almost all capable of finding the global optima of the benchmark functions. The detailed values in Table 4 show that the standard deviations over the 100 MC simulations are all small. We can further draw the following conclusions:

(1) All of the algorithms involved in this study were able to find the optimum.

(2) All of the benchmark functions tested in this experiment could be optimized, whether they are unimodal or multimodal and whether their domain is symmetric or asymmetric.

(3) Comparatively speaking, although the bat algorithm involves much more randomness, it did the

Table 2: Benchmark functions to be fitted.

Label | Function name | Expression | Domain [lb, ub]
F1 | De Jong's sphere | y = \sum_{i=1}^{n} x_i^2 | [-100, 100]
F2 | Schwefel's problem 2.22 | y = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | [-100, 100]
F3 | Schwefel's problem 1.2 | y = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2 | [-100, 100]
F4 | Schwefel's problem 2.21 | y = \max_{1 \le i \le n} |x_i| | [-100, 100]
F5 | Chung Reynolds function | y = \left( \sum_{i=1}^{n} x_i^2 \right)^2 | [-100, 100]
F6 | Schwefel's problem 2.20 | y = \sum_{i=1}^{n} |x_i| | [-100, 100]
F7 | Csendes function | y = \sum_{i=1}^{n} x_i^2 (2 + \sin(1/x_i)) | [-1, 1]
F8 | Exponential function | y = -e^{-0.5 \sum_{i=1}^{n} x_i^2} | [-1, 1]
F9 | Griewank's function | y = \sum_{i=1}^{n} x_i^2/4000 - \prod_{i=1}^{n} \cos(x_i/\sqrt{i}) + 1 | [-100, 100]
F10 | Salomon function | y = 1 - \cos\left(2\pi \sqrt{\sum_{i=1}^{n} x_i^2}\right) + 0.1 \sqrt{\sum_{i=1}^{n} x_i^2} | [-100, 100]
F11 | Zakharov function | y = \sum_{i=1}^{n} x_i^2 + \left( \sum_{i=1}^{n} 0.5 i x_i \right)^2 + \left( \sum_{i=1}^{n} 0.5 i x_i \right)^4 | [-5, 10]
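Several of these benchmark functions are easy to express directly. The sketch below implements F1, F9, and F11 in Python (the paper's own experiments were programmed in Matlab) and checks that each evaluates to zero at the origin, their common optimum:

```python
import math

def f1_sphere(x):
    """F1, De Jong's sphere: sum of squares."""
    return sum(v * v for v in x)

def f9_griewank(x):
    """F9, Griewank's function."""
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def f11_zakharov(x):
    """F11, Zakharov function."""
    s1 = sum(v * v for v in x)
    s2 = sum(0.5 * i * v for i, v in enumerate(x, start=1))
    return s1 + s2 ** 2 + s2 ** 4

zero = [0.0] * 5
assert f1_sphere(zero) == 0.0
assert abs(f9_griewank(zero)) < 1e-12
assert f11_zakharov(zero) == 0.0
```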

Table 3: The best and worst simulation results and their corresponding algorithms (dim = 2).

Best fitness:
F1 | 1.4238e-70 | VW-GWO
F2 | 3.2617e-36 | VW-GWO
F3 | 3.6792e-68 | VW-GWO
F4 | 3.3655e-66 | Std GWO
F7 | 7.8721e-222 | VW-GWO
F8 | 0 | VW-GWO, Std GWO, PSO, BA
F9 | 0 | VW-GWO, Std GWO
F11 | 2.6230e-69 | VW-GWO

Worst fitness:
F1 | 1.0213e-07 | BA
F2 | 4.1489e-04 | BA
F3 | 5.9510e-08 | BA
F4 | 2.4192e-06 | PSO
F7 | 1.0627e-24 | BA
F8 | 5.7010e-13 | BA
F9 | 1.0850e-01 | ALO
F11 | 9.9157e-09 | BA
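The best/worst bookkeeping behind Table 3 can be sketched as follows; the optimizer here is a hypothetical stand-in returning one final fitness value per Monte Carlo run, not any of the paper's algorithms:

```python
import random

def best_and_worst(optimizer, runs=100):
    """Track the single best and worst final fitness values seen over
    the Monte Carlo runs, as reported in Table 3."""
    results = [optimizer() for _ in range(runs)]
    return min(results), max(results)

# Hypothetical stand-in: a run whose final fitness is uniform on [0, 1).
best, worst = best_and_worst(lambda: random.random(), runs=50)
assert 0.0 <= best <= worst < 1.0
```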

6 Computational Intelligence and Neuroscience

worst job. The PSO and ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better in optimizing the benchmark functions than the std GWO algorithm, as well as the ALO and PSO algorithms and the BA, which can also be seen from the Wilcoxon rank sum test [47] results listed in Table 5.

In Table 5, the p values of the Wilcoxon rank sum test are reported; they show that the proposed VW-GWO algorithm has superiority on most of the benchmark functions, except F5, the Chung Reynolds function.
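As a sketch of how such p values can be produced, the following pure-Python implementation uses the large-sample normal approximation of the Wilcoxon rank sum test (no tie correction); the two error samples are hypothetical stand-ins, not the paper's data:

```python
import math
from statistics import NormalDist

def wilcoxon_rank_sum(x, y):
    """Two-sided Wilcoxon rank sum test via the large-sample normal
    approximation (assumes no ties); returns (z, p)."""
    n, m = len(x), len(y)
    pooled = sorted(x + y)
    ranks = {v: r + 1 for r, v in enumerate(pooled)}  # 1-based ranks
    w = sum(ranks[v] for v in x)                      # rank sum of sample x
    mu = n * (n + m + 1) / 2                          # mean of W under H0
    sigma = math.sqrt(n * m * (n + m + 1) / 12)       # std of W under H0
    z = (w - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

a = [1e-6 + 1e-8 * i for i in range(30)]  # stand-in "VW-GWO" errors
b = [1e-3 + 1e-6 * i for i in range(30)]  # stand-in competitor errors
z, p = wilcoxon_rank_sum(a, b)
assert p < 0.05  # the difference is significant at the 5% level
```

A p value below 0.05, as in most cells of Table 5, rejects the hypothesis that the two algorithms' error distributions are the same.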

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the std GWO algorithm, the proposed VW-GWO algorithm does not introduce additional uncontrolled parameters; furthermore, it improves the feasibility of the std GWO algorithm by introducing an admissible maximum iteration number. On the contrary, there is a large amount of randomness in the compared bionic algorithms such as the ALO and PSO algorithms and the BA. Therefore, the proposed algorithm is expected to be favored by engineers who need the fastest convergence and the most precise results under the most control. Thus, there is a need to verify that the proposed algorithm converges quickly, beyond the brief acquaintance given by Figure 3.

Generally speaking, optimization algorithms are usually used to find optima under constrained conditions. The optimization procedure must end in reality, and it is expected to be as fast as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, while the algorithm is still expected to end quickly under the given conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error: the absolute values of the MAE are constrained to be less than 1.0 × 10^-3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out, and for simplicity not all classical benchmark functions are involved. The final statistical results are listed in Tables 6–8. Note that the complexity of the ALO algorithm is very large, and it is too time-consuming on the current simulation hardware described in the Appendix, so it is not included in this experiment.
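The MLIT measurement described above can be sketched as follows. The optimizer step is a hypothetical stand-in; the constants mirror the stated constraints (admissible error 1.0 × 10^-3, cap M), and runs that hit the cap are discarded, as in the "Number" column of Tables 6–8:

```python
import random

MAX_ADMISSIBLE_ERROR = 1.0e-3  # |MAE| bound from the experiment
M = 100_000                    # admissible maximum iteration number

def iterations_to_converge(step, dim=2):
    """Iterations a (hypothetical) stochastic optimizer needs before its
    absolute error first drops below the bound; returns M if it never does."""
    best = float("inf")
    for t in range(1, M + 1):
        best = min(best, step(t, dim))
        if best < MAX_ADMISSIBLE_ERROR:
            return t
    return M

def mlit(step, mc_runs=100):
    """Mean least iteration times: average over the MC runs that converged,
    discarding runs that reached the iteration cap M."""
    counts = [iterations_to_converge(step) for _ in range(mc_runs)]
    converged = [c for c in counts if c < M]
    return sum(converged) / len(converged), len(converged)

# Stand-in sampler: the error shrinks geometrically with the iteration count.
def toy_step(t, dim):
    return random.uniform(0.0, 1.0) * 0.7 ** t

mean_iters, kept = mlit(toy_step, mc_runs=20)
assert kept == 20 and 1 <= mean_iters <= M
```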

Table 6 lists the MLIT data when the VW-GWO, std GWO, and PSO algorithms and the BA are applied to the unimodal benchmark function F1. The best, worst, and standard deviation MLIT values are listed; the mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the number of remaining MC simulations, discarding all of the data from searching processes that reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared to the other algorithms involved. The data in Tables 6–8 are obtained under the same conditions; the only difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetric domain, while Table 8 lists the data obtained when the algorithms are applied to a benchmark function with an asymmetric domain. The same conclusion can be drawn.

Note that in this experiment, the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that if the dimensions of the benchmark functions are raised, the MLIT values increase dramatically. This phenomenon raises the doubt of whether the algorithm still performs best and is capable of solving high-dimensional problems.

4.4. High-Dimensional Availability Test. Tables 6–8 show that the larger the dimensions are, the more MLITs are needed to meet the experimental constraints. However, as described in the first section, optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions; thus, high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merit of solving large-scale problems. An experiment with dim = 200 is carried out to assess the capability of the algorithms in solving high-dimensional problems. For simplicity, three classical benchmark functions, F4 (Schwefel's problem 2.21), F8 (the exponential function), and F11 (the Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments are evaluated and counted, and

Figure 3: F3 convergence vs. iterations (dim = 2). Absolute errors of VW-GWO, std GWO, ALO, PSO, and BA over the first 30 iterations, plotted on a logarithmic scale from 10^-15 to 10^5.


Table 4: Statistical analysis of the absolute errors of the selected functions (dim = 2). Values are mean (standard deviation); bracketed citations mark values quoted from the literature.

Functions | VW-GWO | Std GWO | ALO | PSO | BA
F1 | 7.2039e-66 (3.5263e-65) | 6.59E-28 (6.34E-5) [27] | 2.59E-10 (1.65E-10) [2] | 1.36E-4 (2.02E-4) [27] | 0.773622 (0.528134) [2]
F2 | 1.3252e-34 (3.5002e-34) | 7.18E-17 (0.02901) [27] | 1.84241E-6 (6.58E-7) [2] | 0.042144 (0.04542) [27] | 0.334583 (3.186022) [2]
F3 | 3.7918e-60 (1.1757e-59) | 3.29E-6 (79.1496) [27] | 6.0685E-10 (6.34E-10) [2] | 70.12562 (22.1192) [27] | 0.115303 (0.766036) [2]
F4 | 2.2262e-46 (2.8758e-46) | 5.61E-7 (1.31509) [27] | 1.36061E-8 (1.81E-9) [2] | 0.31704 (7.3549) [27] | 0.192185 (0.890266) [2]
F5 | 3.6015e-131 (9.0004e-131) | 7.8319e-97 (2.4767e-96) | 2.1459e-20 (2.8034e-20) | 8.4327e-20 (1.7396e-19) | 1.7314e-17 (4.9414e-17)
F9 | 0.0047 (0.0040) | 0.00449 (0.00666) [27] | 0.0301 (0.0329) | 0.00922 (0.00772) [27] | 0.0436 (0.0294)
F10 | 0.0200 (0.0421) | 0.0499 (0.0526) | 0.01860449 (0.009545) [2] | 0.273674 (0.204348) [2] | 1.451575 (0.570309) [2]
F11 | 1.2999e-60 (4.1057e-60) | 6.8181e-35 (1.5724e-34) | 1.1562e-13 (1.2486e-13) | 2.3956e-12 (3.6568e-12) | 5.0662e-09 (4.9926e-09)


each time, the search procedure is also iterated a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.

To test its capability even further, we also carry out an experiment to verify the capability of solving some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimension from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank sum test for VW-GWO over the benchmark functions (dim = 2).

Algorithm | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11
Std GWO | 0.000246 | 0.00033 | 0.000183 | 0.00044 | 0.000183 | 0 | 0.000183 | – | 0.466753 | 0.161972 | 0.000183
PSO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.167489 | 0.004435 | 0.025748 | 0.000183
ALO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.36812 | 0.790566 | 0.025748 | 0.000183
BA | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0 | 0.000183 | 0.000747 | 0.004435 | 0.01133 | 0.000183

Table 6: MLITs and statistical results for F1.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 6 | 12 | 9.90 | 1.7180e-193 | 1.0493 | 100
2 | Std GWO | 7 | 13 | 10.38 | 1.8380e-22 | 1.2291 | 100
2 | PSO | 48 | 1093 | 357.97 | 3.2203e-22 | 205.3043 | 100
2 | BA | 29 | 59 | 41.00 | 1.3405e-101 | 5.8517 | 100
10 | VW-GWO | 53 | 66 | 59.97 | 4.1940e-177 | 2.7614 | 100
10 | Std GWO | 74 | 89 | 80.40 | 1.9792e-80 | 2.7614 | 100
10 | PSO | 5713 | 11510 | 9279.22 | 2.9716e-76 | 1300.8485 | 88
10 | BA | 6919 | 97794 | 44999.04 | 7.5232e-26 | 25133.3096 | 78
30 | VW-GWO | 55 | 67 | 59.85 | 1.2568e-122 | 2.4345 | 100
30 | Std GWO | 71 | 86 | 80.07 | 2.6197e-79 | 3.3492 | 100
30 | PSO | 5549 | 12262 | 9314.78 | 9.6390e-83 | 1316.3384 | 96
30 | BA | 7238 | 92997 | 44189.16 | 5.2685e-26 | 24831.7443 | 79

Table 7: MLITs and statistical results for F7.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 1 | 3 | 1.46 | 6.3755e-226 | 0.5397 | 100
2 | Std GWO | 1 | 2 | 1.41 | 1.0070e-229 | 0.4943 | 100
2 | PSO | 2 | 2 | 2.00 | 0 | 0 | 100
2 | BA | 1 | 3 | 1.02 | 8.5046e-269 | 0.200 | 100
10 | VW-GWO | 5 | 9 | 7.65 | 5.7134e-199 | 0.9468 | 100
10 | Std GWO | 5 | 11 | 7.48 | 5.1288e-191 | 1.1413 | 100
10 | PSO | 4 | 65 | 24.23 | 1.6196e-85 | 10.9829 | 100
10 | BA | 13 | 49 | 25.29 | 5.9676e-109 | 6.2366 | 100
30 | VW-GWO | 13 | 22 | 17.14 | 9.6509e-167 | 1.7980 | 100
30 | Std GWO | 15 | 30 | 20.80 | 1.3043e-148 | 2.6208 | 100
30 | PSO | 54 | 255 | 133.32 | 5.7600e-12 | 42.5972 | 100
30 | BA | 40 | 101 | 62.68 | 1.8501e-53 | 11.8286 | 100

Table 8: MLITs and statistical results for F11.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std deviation | Number
2 | VW-GWO | 3 | 9 | 6.63 | 5.6526e-188 | 1.2363 | 100
2 | Std GWO | 4 | 10 | 6.66 | 3.5865e-186 | 1.2888 | 100
2 | PSO | 6 | 125 | 46.35 | 1.6006e-37 | 26.0835 | 100
2 | BA | 5 | 62 | 27.58 | 1.6166e-83 | 11.0080 | 100
10 | VW-GWO | 10 | 200 | 65.57 | 2.8562e-12 | 43.2281 | 100
10 | Std GWO | 14 | 246 | 68.68 | 2.6622e-11 | 41.7104 | 100
10 | PSO | 15 | 1356 | 231.74 | 1.2116e-6 | 257.1490 | 94
10 | BA | 15 | 214 | 113.19 | 5.1511e-2 | 66.9189 | 100
30 | VW-GWO | 49 | 1179 | 312.24 | 1.2262e-18 | 194.7643 | 100
30 | Std GWO | 65 | 945 | 294.45 | 3.1486e-21 | 160.7119 | 100
30 | PSO | 32 | 5005 | 1086.11 | 6.0513e-13 | 980.3386 | 72
30 | BA | 66 | 403 | 221.60 | 1.9072e-51 | 40.5854 | 100


Table 9: Statistical analysis of the absolute errors of the selected functions (dim = 200). Values are mean (standard deviation); the BA entries for F8 were not recoverable.

Functions | VW-GWO | Std GWO | ALO | PSO | BA
F4 | 3.3556e-53 (8.7424e-53) | 1.6051e-46 (2.2035e-46) | 4.2333e-07 (2.9234e-07) | 3.0178e-07 (6.5449e-07) | 1.6401e-07 (2.1450e-07)
F8 | 0.0 (0.0) | 3.3307e-17 (7.4934e-17) | 1.1102e-17 (3.5108e-17) | 1.4466e-14 (1.9684e-14) | – (–)
F11 | 0.0115 (0.0193) | 0.0364 (0.0640) | 8.3831 (10.3213) | 12.6649 (13.0098) | 4.7528e+16 (2.8097e+16)


absolute errors averaged over the MC runs, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are accordingly introduced into their searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced under which it declines exponentially from its maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm, with comparisons made to the original GWO, the ALO and PSO algorithms, and the BA.
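The exponentially declining controlling parameter can be illustrated with a minimal sketch. This section states only that the parameter declines exponentially from its maximum, so the maximum a_max = 2.0 (the usual GWO convention) and the decay rate 3.0 used here are assumptions, not the paper's governing equation:

```python
import math

def control_parameter(t, max_iter, a_max=2.0, rate=3.0):
    """Hypothetical sketch of an exponentially declining controlling
    parameter: starts at a_max at iteration t = 0 and decays toward 0 as
    t approaches max_iter. Both a_max and rate are assumed values."""
    return a_max * math.exp(-rate * t / max_iter)

assert control_parameter(0, 500) == 2.0
assert control_parameter(500, 500) < control_parameter(250, 500) < 2.0
```

Compared with the standard GWO's linear decline, an exponential schedule keeps the parameter larger for longer early on (favoring exploration) and shrinks it faster late (favoring exploitation).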

All the selected experimental results show that the proposed VW-GWO algorithm works better under different conditions than the others. Varying the dimension does not change its first place among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge. This leads to faster convergence and wide applicability, but the algorithm is not found to be superior on all the benchmark functions; further work is needed to explain the reasons mathematically. Other initialization algorithms might be needed to let the initial swarm individuals spread throughout the domain, and new searching rules for individuals located in basins would be another hot spot of future work.

Appendix

The simulation platform described in Section 3.3 runs on an assembled desktop computer configured as follows: CPU, Xeon E3-1231 v3; GPU, NVidia GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software of this paper can be downloaded from http://ddl.escience.cn/f/Erl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authorsrsquo Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea on the GWO algorithm and programmed the work with Matlab; her major contributions are the programming work and the proposed exponentially declining governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by the research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[3] Y. Xin-She, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.

[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.

[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1 - bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.

[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.

[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.

[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.

[9] Z. Juan and G. Zheng-Ming, The Bat Algorithm and Its Parameters, Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimensions based on VW-GWO. Curves for F1, F2, F3, and F11, with dimensions from 100 to 1000 and absolute errors on a logarithmic scale from 10^-80 to 10^20.


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] Y. A. N. Chun-man, G. U. O. Bao-long, and W. U. Xian-xiang, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering [JSCSE], vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uğuz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyetto, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of pi-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a Robust Grey Wolf Optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.

[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.

[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[21] Z Chen Y Zhou and M Lu ldquoA simplied adaptive bat al-gorithm based on frequencyrdquo Journal of Computational In-formation Systems vol 9 pp 6451ndash6458 2013

[22] J H Holland Adaptation in Natural and Artificial SystemsUniversity of Michigan Press Ann Arbor MI USA 1975

[23] N Metropolis A W Rosenbluth M N Rosenbluth andA H Teller ldquoEquation of state calculations by fast computingmachinesrdquo Journal of Chemical Physics vol 21 no 6pp 1087ndash1092 1953

[24] M Dorigo and M Birattari ldquoAnt colony optimizationrdquo IEEEComputational Intelligence Magazine vol 1 no 4 pp 28ndash392006

[25] X S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization(NICSO 2010) J Gonzalez D Pelta C Cruz et al EdsSpringer Berlin Germany 2010

[26] H Haklı and H Uguz ldquoA novel particle swarm optimizationalgorithm with Levy flightrdquo Applied Soft Computing vol 23pp 333ndash345 2014

[27] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[28] G M Komaki and V Kayvanfar ldquoGrey wolf optimizer al-gorithm for the two-stage assembly flow shop schedulingproblem with release timerdquo Journal of Computational Sciencevol 8 pp 109ndash120 2015

[29] Y Sharma and L C Saikia ldquoAutomatic generation control ofa multi-area ST-thermal power system using grey wolf op-timizer algorithm based classical controllersrdquo InternationalJournal of Electrical Power amp Energy Systems vol 73pp 853ndash862 2015

[30] B Mahdad and K Srairi ldquoBlackout risk prevention in a smartgrid based flexible optimal strategy using grey wolf-patternsearch algorithmsrdquo Energy Conversion and Managementvol 98 pp 411ndash429 2015

[31] X Song L Tang S Zhao et al ldquoGrey wolf optimizer forparameter estimation in surface wavesrdquo Soil Dynamics andEarthquake Engineering vol 75 pp 147ndash157 2015

[32] N Jayakumar S Subramanian S Ganesan andE B Elanchezhian ldquoGrey wolf optimization for combinedheat and power dispatch with cogeneration systemsrdquo In-ternational Journal of Electrical Power amp Energy Systemsvol 74 pp 252ndash264 2016

[33] S A Medjahed T A Saadi A Benyetto and M Ouali ldquoGraywolf optimizer for hyperspectral band selectionrdquo Applied SoftComputing vol 40 pp 178ndash186 2016

[34] E Emary H M Zawbaa and A E Hassanien ldquoBinary greywolf optimization approaches for feature selectionrdquo Neuro-computing vol 172 pp 371ndash381 2016

[35] D Guha P K Roy and S Banerjee ldquoLoad frequency controlof interconnected power system using grey wolf optimiza-tionrdquo Swarm and Evolutionary Computation vol 27pp 97ndash115 2016

[36] M H Sulaiman ZMustaffa M RMohamed andO AlimanldquoUsing the gray wolf optimizer for solving optimal reactivepower dispatch problemrdquo Applied Soft Computing vol 32pp 286ndash292 2015

[37] S Mirjalili S Saremi S M Mirjalili and L D S CoelholdquoMulti-objective grey wolf optimizer a novel algorithm formulti-criterion optimizationrdquo Expert Systems with Applica-tions vol 47 pp 106ndash119 2016

[38] E EmaryW Yamany A E Hassanien and V Snasel ldquoMulti-objective gray-wolf optimization for attribute reductionrdquoProcedia Computer Science vol 65 pp 623ndash632 2015

[39] S Saremi S Z Mirjalili and S M Mirjalili ldquoEvolutionarypopulation dynamics and grey wolf optimizerrdquo NeuralComputing and Applications vol 26 no 5 pp 1257ndash12632015

[40] R E Precup R C David E M Petriu A I Szedlak-Stineanand C A Bojan-Dragos ldquoGrey wolf optimizer-based ap-proach to the tuning of pi-fuzzy controllers with a reducedprocess parametric sensitivityrdquo IFAC-PapersOnLine vol 49no 5 pp 55ndash60 2016

[41] A Noshadi J Shi W S Lee P Shi and A Kalam ldquoOptimalPID-type fuzzy logic controller for a multi-input multi-outputactive magnetic bearing systemrdquo Neural Computing andApplications vol 27 no 7 pp 2031ndash2046 2016

[42] P B de Moura Oliveira H Freire and E J Solteiro PiresldquoGrey wolf optimization for PID controller design withprescribed robustness marginsrdquo Soft Computing vol 20no 11 pp 4243ndash4255 2016

[43] S Khalilpourazari and S Khalilpourazary ldquoOptimization ofproduction time in the multi-pass milling process via a Robust

12 Computational Intelligence and Neuroscience

Grey Wolf Optimizerrdquo Neural Computing and Applicationsvol 29 no 12 pp 1321ndash1336 2018

[44] R El Sehiemy A Shaheen and A Abou El-Ela ldquoMulti-objective fuzzy-based procedure for enhancing reactivepower managementrdquo IET Generation Transmission amp Dis-tribution vol 7 no 12 pp 1453ndash1460 2013

[45] A H Gandomi X-S Yang A H Alavi and S TalataharildquoBat algorithm for constrained optimization tasksrdquo NeuralComputing and Applications vol 22 no 6 pp 1239ndash12552013

[46] M Jamil and X S Yang ldquoA literature survey of benchmarkfunctions for global optimisation problemsrdquo InternationalJournal of Mathematical Modelling and Numerical Optimi-sation vol 4 no 2 pp 150ndash194 2013

[47] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm in-telligence algorithmsrdquo Swarm and Evolutionary Computationvol 1 no 1 pp 3ndash18 2011

Computational Intelligence and Neuroscience 13



worst job. The PSO and the ALO algorithms did a little better.

(4) The GWO algorithms implement the optimization procedure much better. The proposed VW-GWO algorithm optimized most of the benchmark functions involved in this simulation best, and it did much better than the standard algorithm.

Therefore, the proposed VW-GWO algorithm performs better in optimizing the benchmark functions than the standard GWO algorithm, as well as the ALO and PSO algorithms and the BA, which can also be seen from the results of the Wilcoxon rank-sum test [47] listed in Table 5.

Table 5 reports the p values of the Wilcoxon rank-sum test; they show that the proposed VW-GWO algorithm is superior on most of the benchmark functions, the exception being F5, the Rosenbrock function.
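The rank-sum comparison behind Table 5 can be reproduced with any statistics package. As an illustrative sketch only (in Python rather than the Matlab used by the authors), the two-sided Wilcoxon rank-sum p value can be hand-rolled with the large-sample normal approximation; the sample data and names below are our own illustration, not the paper's raw results.

```python
import math

def _ranks(values):
    """Ranks (1-based), with ties replaced by the average rank of the tied block."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0          # average rank of positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum p value via the normal approximation."""
    n1, n2 = len(x), len(y)
    ranks = _ranks(list(x) + list(y))
    w = sum(ranks[:n1])                          # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0                # mean of W under the null hypothesis
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))    # two-sided tail probability

# Clearly separated error samples give p < 0.05; interleaved samples do not.
p_separated = rank_sum_p(list(range(1, 11)), list(range(11, 21)))
p_mixed = rank_sum_p([1, 3, 5, 7], [2, 4, 6, 8])
```

When p falls below the usual 0.05 level, the null hypothesis that both algorithms draw their errors from the same distribution is rejected; this is how the entries of Table 5 should be read.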

4.3. Mean Least Iteration Times (MLIT) Analysis over Multiple Dimensions. Compared with other bionic algorithms, the GWO algorithm has fewer parameters. Compared with the standard GWO algorithm, the proposed VW-GWO algorithm does not introduce additional uncontrolled parameters; furthermore, it improves the feasibility of the standard GWO algorithm by introducing an admissible maximum iteration number. By contrast, the compared bionic algorithms, such as the ALO and PSO algorithms and the BA, involve a large amount of randomness. The proposed algorithm is therefore expected to appeal to engineers who need the fastest convergence, the most precise results, and the most control. Thus, its fast convergence needs to be verified properly, not only through the brief impression given by Figure 3.

Generally speaking, optimization algorithms are used to find optima under constrained conditions. In practice the optimization procedure must terminate, and it is expected to do so as quickly as possible. The admissible maximum iteration number M forbids the algorithm from running endlessly, while the algorithm is still expected to finish quickly under the given conditions. This experiment calculates the mean least iteration times (MLIT) under a maximum admissible error: the absolute values of the MAE are constrained to be less than 1.0 × 10^-3, and M = 1.0 × 10^5. In this experiment, 100 MC simulations are carried out; for simplicity, not all classical benchmark functions are involved. The final statistical results are listed in Tables 6–8. Note that the complexity of the ALO algorithm is very large, and it is too time-consuming on the current simulation hardware described in the Appendix, so it is not included in this experiment.
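The MLIT protocol described above — stop a run as soon as the absolute error falls below the target, cap each run at M iterations, discard capped runs, and average the iteration counts over the MC runs — can be sketched as follows. This is our own illustrative harness with a toy optimizer standing in for VW-GWO; the original experiments were programmed in Matlab.

```python
import random

def toy_run(rng):
    """Toy stochastic 'optimizer': the absolute error halves each iteration
    from a random starting point (a stand-in for one run of a real optimizer)."""
    err = rng.uniform(0.5, 1.0)
    while True:
        yield err
        err *= 0.5

def mlit(make_run, target=1.0e-3, max_iter=10**5, mc_runs=100, seed=0):
    """Mean least iteration time: the average iteration count at which the
    absolute error first drops below `target`, discarding runs that reach
    the admissible maximum iteration number `max_iter`."""
    rng = random.Random(seed)
    counts = []
    for _ in range(mc_runs):
        for t, err in enumerate(make_run(rng), start=1):
            if abs(err) < target:
                counts.append(t)        # run converged after t iterations
                break
            if t >= max_iter:
                break                   # capped run: discarded from the statistics
    mean = sum(counts) / len(counts) if counts else float("inf")
    return mean, len(counts)

mean_iters, kept_runs = mlit(toy_run)
```

The last column of Tables 6–8 ("number") corresponds to `kept_runs` here: the MC runs that converged before hitting M.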

Table 6 lists the MLIT data obtained when VW-GWO, the standard GWO and PSO algorithms, and the BA are applied to the unimodal benchmark function F1. The best, worst, and standard deviation MLIT values are listed; the mean values are also calculated, and t-tests are carried out with α = 0.05. The last column lists the number of MC simulations remaining after discarding all runs in which the searching process reached the admissible maximum iteration number M. The final results demonstrate the best performance of the proposed VW-GWO algorithm on unimodal benchmark functions compared with the other three algorithms involved. The data in Tables 6–8 are obtained under the same conditions; the only difference is that Table 7 lists the data obtained when the algorithms are applied to a multimodal benchmark function with a symmetrical domain, whereas Table 8 lists the data obtained for a multimodal benchmark function with an unsymmetrical domain. The same conclusion can be drawn.

Note that in this experiment the dimensions of the benchmark functions are varied from 2 to 10 and 30. The final results also show that as the dimensions of the benchmark functions are raised, the MLIT values increase dramatically. This phenomenon raises the question of whether the algorithm still performs best and is capable of solving high-dimensional problems.

4.4. High-Dimensional Availability Test. Tables 6–8 show that the larger the dimensions are, the more MLIT values are needed to meet the experimental constraints. However, as described in the first section, optimization algorithms are mostly developed to solve problems with huge numbers of variables, massive complexity, or no analytical solutions; thus, high-dimensional availability is of great interest. As with the standard GWO algorithm, the proposed VW-GWO algorithm should also have the merit of solving large-scale problems. An experiment with dim = 200 is carried out to test the capability of the algorithms for solving high-dimensional problems. For simplicity, three classical benchmark functions, namely F4 (Schwefel's problem 2.21), F8 (the exponential function), and F11 (the Zakharov function), are used to demonstrate the results, as listed in Table 9. The final results of 100 MC experiments will be evaluated and counted, and

Figure 3: F3 convergence vs. iterations (dim = 2). Absolute errors (log scale, 10^-15 to 10^5) over iterations 0–30 for VW-GWO, Std GWO, ALO, PSO, and BA.

Computational Intelligence and Neuroscience 7

Table 4: Statistical analysis on the absolute errors of the selected functions (dim = 2). Entries are mean (standard deviation); values marked [27] or [2] are taken from those references.

F1:  VW-GWO 7.2039e-66 (3.5263e-65); Std GWO 6.59e-28 (6.34e-5) [27]; ALO 2.59e-10 (1.65e-10) [2]; PSO 1.36e-4 (2.02e-4) [27]; BA 0.773622 (0.528134) [2]
F2:  VW-GWO 1.3252e-34 (3.5002e-34); Std GWO 7.18e-17 (0.02901) [27]; ALO 1.84241e-6 (6.58e-7) [2]; PSO 0.042144 (0.04542) [27]; BA 0.334583 (3.186022) [2]
F3:  VW-GWO 3.7918e-60 (1.1757e-59); Std GWO 3.29e-6 (79.1496) [27]; ALO 6.0685e-10 (6.34e-10) [2]; PSO 70.12562 (22.1192) [27]; BA 0.115303 (0.766036) [2]
F4:  VW-GWO 2.2262e-46 (2.8758e-46); Std GWO 5.61e-7 (1.31509) [27]; ALO 1.36061e-8 (1.81e-9) [2]; PSO 0.31704 (7.3549) [27]; BA 0.192185 (0.890266) [2]
F5:  VW-GWO 3.6015e-131 (9.0004e-131); Std GWO 7.8319e-97 (2.4767e-96); ALO 2.1459e-20 (2.8034e-20); PSO 8.4327e-20 (1.7396e-19); BA 1.7314e-17 (4.9414e-17)
F9:  VW-GWO 0.0047 (0.0040); Std GWO 0.00449 (0.00666) [27]; ALO 0.0301 (0.0329); PSO 0.00922 (0.00772) [27]; BA 0.0436 (0.0294)
F10: VW-GWO 0.0200 (0.0421); Std GWO 0.0499 (0.0526); ALO 0.01860449 (0.009545) [2]; PSO 0.273674 (0.204348) [2]; BA 1.451575 (0.570309) [2]
F11: VW-GWO 1.2999e-60 (4.1057e-60); Std GWO 6.8181e-35 (1.5724e-34); ALO 1.1562e-13 (1.2486e-13); PSO 2.3956e-12 (3.6568e-12); BA 5.0662e-09 (4.9926e-09)

each time, the search procedure will also be iterated a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.

To test its capability even further, we also carry out an experiment to verify its ability to solve some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimensions from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank-sum test for VW-GWO over benchmark functions (dim = 2).

          F1        F2        F3        F4        F5        F6  F7        F8        F9        F10       F11
Std GWO   0.000246  0.00033   0.000183  0.00044   0.000183  0   0.000183  —         0.466753  0.161972  0.000183
PSO       0.000183  0.000183  0.000183  0.000183  0.472676  0   0.000183  0.167489  0.004435  0.025748  0.000183
ALO       0.000183  0.000183  0.000183  0.000183  0.472676  0   0.000183  0.36812   0.790566  0.025748  0.000183
BA        0.000183  0.000183  0.000183  0.000183  0.000183  0   0.000183  0.000747  0.004435  0.01133   0.000183

Table 6: MLITs and statistical results for F1 (columns: best, worst, mean, t-test at α = 0.05, standard deviation, number of retained runs).

dim = 2:
  VW-GWO:  6, 12, 9.90, 1.7180e-193, 1.0493, 100
  Std GWO: 7, 13, 10.38, 1.8380e-22, 1.2291, 100
  PSO:     48, 1093, 357.97, 3.2203e-22, 205.3043, 100
  BA:      29, 59, 41.00, 1.3405e-101, 5.8517, 100
dim = 10:
  VW-GWO:  53, 66, 59.97, 4.1940e-177, 2.7614, 100
  Std GWO: 74, 89, 80.40, 1.9792e-80, 2.7614, 100
  PSO:     5713, 11510, 9279.22, 2.9716e-76, 1300.8485, 88
  BA:      6919, 97794, 44999.04, 7.5232e-26, 25133.3096, 78
dim = 30:
  VW-GWO:  55, 67, 59.85, 1.2568e-122, 2.4345, 100
  Std GWO: 71, 86, 80.07, 2.6197e-79, 3.3492, 100
  PSO:     5549, 12262, 9314.78, 9.6390e-83, 1316.3384, 96
  BA:      7238, 92997, 44189.16, 5.2685e-26, 24831.7443, 79

Table 7: MLITs and statistical results for F7 (columns: best, worst, mean, t-test at α = 0.05, standard deviation, number of retained runs).

dim = 2:
  VW-GWO:  1, 3, 1.46, 6.3755e-226, 0.5397, 100
  Std GWO: 1, 2, 1.41, 1.0070e-229, 0.4943, 100
  PSO:     2, 2, 2.00, 0, 0, 100
  BA:      1, 3, 1.02, 8.5046e-269, 0.200, 100
dim = 10:
  VW-GWO:  5, 9, 7.65, 5.7134e-199, 0.9468, 100
  Std GWO: 5, 11, 7.48, 5.1288e-191, 1.1413, 100
  PSO:     4, 65, 24.23, 1.6196e-85, 10.9829, 100
  BA:      13, 49, 25.29, 5.9676e-109, 6.2366, 100
dim = 30:
  VW-GWO:  13, 22, 17.14, 9.6509e-167, 1.7980, 100
  Std GWO: 15, 30, 20.80, 1.3043e-148, 2.6208, 100
  PSO:     54, 255, 133.32, 5.7600e-12, 42.5972, 100
  BA:      40, 101, 62.68, 1.8501e-53, 11.8286, 100

Table 8: MLITs and statistical results for F11 (columns: best, worst, mean, t-test at α = 0.05, standard deviation, number of retained runs).

dim = 2:
  VW-GWO:  3, 9, 6.63, 5.6526e-188, 1.2363, 100
  Std GWO: 4, 10, 6.66, 3.5865e-186, 1.2888, 100
  PSO:     6, 125, 46.35, 1.6006e-37, 26.0835, 100
  BA:      5, 62, 27.58, 1.6166e-83, 11.0080, 100
dim = 10:
  VW-GWO:  10, 200, 65.57, 2.8562e-12, 43.2281, 100
  Std GWO: 14, 246, 68.68, 2.6622e-11, 41.7104, 100
  PSO:     15, 1356, 231.74, 1.2116e-6, 257.1490, 94
  BA:      15, 214, 113.19, 5.1511e-2, 66.9189, 100
dim = 30:
  VW-GWO:  49, 1179, 312.24, 1.2262e-18, 194.7643, 100
  Std GWO: 65, 945, 294.45, 3.1486e-21, 160.7119, 100
  PSO:     32, 5005, 1086.11, 6.0513e-13, 980.3386, 72
  BA:      66, 403, 221.60, 1.9072e-51, 40.5854, 100


Table 9: Statistical analysis on the absolute errors of the selected functions (dim = 200). Entries are mean (standard deviation).

F4:  VW-GWO 3.3556e-53 (8.7424e-53); Std GWO 1.6051e-46 (2.2035e-46); ALO 4.2333e-07 (2.9234e-07); PSO 3.0178e-07 (6.5449e-07); BA 1.6401e-07 (2.1450e-07)
F8:  VW-GWO 0 (0); Std GWO 0 (0); ALO 3.3307e-17 (7.4934e-17); PSO 1.1102e-17 (3.5108e-17); BA 1.4466e-14 (1.9684e-14)
F11: VW-GWO 0.0115 (0.0193); Std GWO 0.0364 (0.0640); ALO 8.3831 (10.3213); PSO 12.6649 (13.0098); BA 4.7528e+16 (2.8097e+16)

absolute errors averaged over the MC runs, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.
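The protocol behind Figure 4 — for each dimension, run the optimizer MC times under a fixed iteration budget, then average the absolute errors — can be sketched as below. This is an illustrative harness only: plain random search on the sphere function stands in for VW-GWO and the real benchmark set, and the function names, bounds, and parameters are our assumptions.

```python
import random

def sphere(x):
    """F1-style sphere benchmark; the global minimum is 0 at the origin."""
    return sum(v * v for v in x)

def random_search(f, dim, iters, rng, lo=-100.0, hi=100.0):
    """Stand-in optimizer: the best value among `iters` uniform random samples."""
    best = float("inf")
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        best = min(best, f(x))
    return best

def sweep(dims, mc=100, iters=500, seed=1):
    """Mean absolute error per dimension, averaged over `mc` independent runs."""
    rng = random.Random(seed)
    results = {}
    for dim in dims:
        errors = [abs(random_search(sphere, dim, iters, rng) - 0.0) for _ in range(mc)]
        results[dim] = sum(errors) / mc          # one point of the Figure-4-style curve
    return results

# Small-scale demonstration; the paper sweeps dimensions 100..1000 with mc=100, iters=500.
curve = sweep([2, 5], mc=3, iters=50)
```

As the tables suggest, the error for a fixed budget grows with the dimension, which is exactly what a Figure-4-style sweep makes visible.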

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are accordingly introduced into the searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced so that it declines exponentially from its maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm, with comparisons made to the original GWO, the ALO and PSO algorithms, and the BA.
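The two ideas summarized above — a weighted combination of the α, β, and δ leaders in place of the standard GWO's plain average, plus an exponentially declining controlling parameter — can be sketched as follows. The specific weight values and the decay form a(t) = 2·exp(−λt/T) are our own illustrative assumptions, not the paper's exact governing equations, and the original implementation was in Matlab.

```python
import math
import random

def vw_gwo(f, dim, lo, hi, n_wolves=30, max_iter=200,
           weights=(0.5, 0.3, 0.2), lam=3.0, seed=0):
    """Sketch of grey wolf optimization with variable weights: positions are
    pulled toward the alpha, beta, and delta wolves with weights following
    the social hierarchy (w_alpha >= w_beta >= w_delta, summing to 1)."""
    rng = random.Random(seed)
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(max_iter):
        wolves.sort(key=f)
        leaders = wolves[:3]                        # alpha, beta, delta
        a = 2.0 * math.exp(-lam * t / max_iter)     # exponentially declining control parameter
        new_wolves = []
        for X in wolves:
            pos = [0.0] * dim
            for w, leader in zip(weights, leaders):
                for d in range(dim):
                    A = a * (2.0 * rng.random() - 1.0)
                    C = 2.0 * rng.random()
                    D = abs(C * leader[d] - X[d])
                    pos[d] += w * (leader[d] - A * D)   # weighted, not plain-averaged
            new_wolves.append([min(hi, max(lo, v)) for v in pos])
        wolves = new_wolves
    return min(wolves, key=f)

best = vw_gwo(lambda x: sum(v * v for v in x), dim=2, lo=-10.0, hi=10.0)
```

Setting weights = (1/3, 1/3, 1/3) together with a linear decay of a recovers the standard GWO update, which is the sense in which the modification leaves the overall structure of the algorithm unchanged.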

All the selected experimental results show that the proposed VW-GWO algorithm works better under different conditions than the others. Varying the dimensions does not change its first position among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge. They lead to faster convergence and wide applicability, but the algorithm is not found to be superior on all of the benchmark functions, and further work is needed to explain the reasons mathematically. Other initialization methods might be needed to spread the initial swarm individuals throughout the domain, and new searching rules for individuals located in basins would be another hot spot of future work.

Appendix

The simulation platform described in Section 3.3 runs on an assembled desktop computer configured as follows: CPU, Xeon E3-1231 v3; GPU, NVIDIA GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software for this paper can be downloaded from httpddlesciencecnfErl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors' Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea on the GWO algorithm and programmed the work in Matlab; her major contributions are the programming and the proposed exponentially declining governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by a research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[3] Y. Xin-She, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.

[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.

[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1 - Bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.

[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.

[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.

[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.

[9] Z. Juan and G. Zheng-Ming, "The bat algorithm and its parameters," in Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimensions based on VW-GWO (functions F1, F2, F3, and F11; absolute errors on a log scale from 10^-80 to 10^20, dimensions 100 to 1000).

[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] Yan Chun-man, Guo Bao-long, and Wu Xian-xiang, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99) (Cat. No. 99TH8406), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uguz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyetto, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of pi-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a Robust Grey Wolf Optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.

[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.

[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.



Table 4: Statistical analysis on the absolute errors of the selected functions (dim = 2). Each cell lists the mean and standard deviation over the Monte-Carlo runs; values marked [27] or [2] are quoted from those references.

Functions | VW-GWO (mean, std. dev.) | Std GWO (mean, std. dev.) | ALO (mean, std. dev.) | PSO (mean, std. dev.) | BA (mean, std. dev.)
F1 | 7.2039e−66, 3.5263e−65 | 6.59e−28, 6.34e−5 [27] | 2.59e−10, 1.65e−10 [2] | 1.36e−4, 2.02e−4 [27] | 0.773622, 0.528134 [2]
F2 | 1.3252e−34, 3.5002e−34 | 7.18e−17, 0.02901 [27] | 1.84241e−6, 6.58e−7 [2] | 0.042144, 0.04542 [27] | 0.334583, 3.186022 [2]
F3 | 3.7918e−60, 1.1757e−59 | 3.29e−6, 79.1496 [27] | 6.0685e−10, 6.34e−10 [2] | 70.12562, 22.1192 [27] | 0.115303, 0.766036 [2]
F4 | 2.2262e−46, 2.8758e−46 | 5.61e−7, 1.31509 [27] | 1.36061e−8, 1.81e−9 [2] | 0.31704, 7.3549 [27] | 0.192185, 0.890266 [2]
F5 | 3.6015e−131, 9.0004e−131 | 7.8319e−97, 2.4767e−96 | 2.1459e−20, 2.8034e−20 | 8.4327e−20, 1.7396e−19 | 1.7314e−17, 4.9414e−17
F9 | 0.0047, 0.0040 | 0.00449, 0.00666 [27] | 0.0301, 0.0329 | 0.00922, 0.00772 [27] | 0.0436, 0.0294
F10 | 0.0200, 0.0421 | 0.0499, 0.0526 | 0.01860449, 0.009545 [2] | 0.273674, 0.204348 [2] | 1.451575, 0.570309 [2]
F11 | 1.2999e−60, 4.1057e−60 | 6.8181e−35, 1.5724e−34 | 1.1562e−13, 1.2486e−13 | 2.3956e−12, 3.6568e−12 | 5.0662e−09, 4.9926e−09


Each time, the search procedure is also iterated for a hundred times.

The data listed in Table 9 show that the GWO algorithms converge quickly and that the proposed algorithm is the best at solving large-scale problems.

To test its capability even further, we also carry out an experiment to verify the capability of solving some benchmark functions in high dimensions, with the restrictions MC = 100 and MLIT = 500. In this experiment, we change the dimensions from 100 to 1000, and the final results, which are also the

Table 5: p values of the Wilcoxon rank sum test for VW-GWO over benchmark functions (dim = 2).

Algorithm | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11
Std GWO | 0.000246 | 0.00033 | 0.000183 | 0.00044 | 0.000183 | 0 | 0.000183 | — | 0.466753 | 0.161972 | 0.000183
PSO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.167489 | 0.004435 | 0.025748 | 0.000183
ALO | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.472676 | 0 | 0.000183 | 0.36812 | 0.790566 | 0.025748 | 0.000183
BA | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0.000183 | 0 | 0.000183 | 0.000747 | 0.004435 | 0.01133 | 0.000183
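The p values above come from pairwise Wilcoxon rank-sum tests on the per-run absolute errors, following the nonparametric methodology of [47]. As a sketch of how such a comparison works, here is a minimal pure-Python rank-sum test using the normal approximation; the error samples in the example are hypothetical stand-ins, since the paper's own experiments were run in Matlab:

```python
import math

def wilcoxon_ranksum_p(x, y):
    """Two-sided Wilcoxon rank-sum p value via the normal approximation,
    the kind of pairwise test behind the p values of Table 5."""
    n1, n2 = len(x), len(y)
    pooled = sorted((v, idx) for idx, v in enumerate(x + y))
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < n1 + n2:
        j = i
        while j + 1 < n1 + n2 and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg_rank = (i + j) / 2.0 + 1.0      # tied values share their average rank
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg_rank
        i = j + 1
    w = sum(ranks[:n1])                     # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# Hypothetical per-run absolute errors of two algorithms:
errs_a = [1.2e-6, 3.4e-6, 2.2e-6, 5.1e-6, 1.9e-6]
errs_b = [4.7e-3, 6.1e-3, 3.9e-3, 5.5e-3, 4.2e-3]
print(wilcoxon_ranksum_p(errs_a, errs_b) < 0.05)  # significant at alpha = 0.05
```

A p value below the α = 0.05 threshold used in the paper indicates a statistically significant difference between the two error samples.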

Table 6: MLITs and statistical results for F1.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std. deviation | Number
2 | VW-GWO | 6 | 12 | 9.90 | 1.7180e−193 | 1.0493 | 100
2 | Std GWO | 7 | 13 | 10.38 | 1.8380e−22 | 1.2291 | 100
2 | PSO | 48 | 1093 | 357.97 | 3.2203e−22 | 205.3043 | 100
2 | BA | 29 | 59 | 41.00 | 1.3405e−101 | 5.8517 | 100
10 | VW-GWO | 53 | 66 | 59.97 | 4.1940e−177 | 2.7614 | 100
10 | Std GWO | 74 | 89 | 80.40 | 1.9792e−80 | 2.7614 | 100
10 | PSO | 5713 | 11510 | 9279.22 | 2.9716e−76 | 1300.8485 | 88
10 | BA | 6919 | 97794 | 44999.04 | 7.5232e−26 | 25133.3096 | 78
30 | VW-GWO | 55 | 67 | 59.85 | 1.2568e−122 | 2.4345 | 100
30 | Std GWO | 71 | 86 | 80.07 | 2.6197e−79 | 3.3492 | 100
30 | PSO | 5549 | 12262 | 9314.78 | 9.6390e−83 | 1316.3384 | 96
30 | BA | 7238 | 92997 | 44189.16 | 5.2685e−26 | 24831.7443 | 79

Table 7: MLITs and statistical results for F7.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std. deviation | Number
2 | VW-GWO | 1 | 3 | 1.46 | 6.3755e−226 | 0.5397 | 100
2 | Std GWO | 1 | 2 | 1.41 | 1.0070e−229 | 0.4943 | 100
2 | PSO | 2 | 2 | 2.00 | 0 | 0 | 100
2 | BA | 1 | 3 | 1.02 | 8.5046e−269 | 0.200 | 100
10 | VW-GWO | 5 | 9 | 7.65 | 5.7134e−199 | 0.9468 | 100
10 | Std GWO | 5 | 11 | 7.48 | 5.1288e−191 | 1.1413 | 100
10 | PSO | 4 | 65 | 24.23 | 1.6196e−85 | 10.9829 | 100
10 | BA | 13 | 49 | 25.29 | 5.9676e−109 | 6.2366 | 100
30 | VW-GWO | 13 | 22 | 17.14 | 9.6509e−167 | 1.7980 | 100
30 | Std GWO | 15 | 30 | 20.80 | 1.3043e−148 | 2.6208 | 100
30 | PSO | 54 | 255 | 133.32 | 5.7600e−12 | 42.5972 | 100
30 | BA | 40 | 101 | 62.68 | 1.8501e−53 | 11.8286 | 100

Table 8: MLITs and statistical results for F11.

dim | Algorithm | Best | Worst | Mean | t-test (α = 0.05) | Std. deviation | Number
2 | VW-GWO | 3 | 9 | 6.63 | 5.6526e−188 | 1.2363 | 100
2 | Std GWO | 4 | 10 | 6.66 | 3.5865e−186 | 1.2888 | 100
2 | PSO | 6 | 125 | 46.35 | 1.6006e−37 | 26.0835 | 100
2 | BA | 5 | 62 | 27.58 | 1.6166e−83 | 11.0080 | 100
10 | VW-GWO | 10 | 200 | 65.57 | 2.8562e−12 | 43.2281 | 100
10 | Std GWO | 14 | 246 | 68.68 | 2.6622e−11 | 41.7104 | 100
10 | PSO | 15 | 1356 | 231.74 | 1.2116e−6 | 257.1490 | 94
10 | BA | 15 | 214 | 113.19 | 5.1511e−2 | 66.9189 | 100
30 | VW-GWO | 49 | 1179 | 312.24 | 1.2262e−18 | 194.7643 | 100
30 | Std GWO | 65 | 945 | 294.45 | 3.1486e−21 | 160.7119 | 100
30 | PSO | 32 | 5005 | 1086.11 | 6.0513e−13 | 980.3386 | 72
30 | BA | 66 | 403 | 221.60 | 1.9072e−51 | 40.5854 | 100
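The t-test column in Tables 6-8 reports significance values for the differences in iteration counts between algorithms. A minimal sketch of the two-sample (Welch) t statistic that underlies such a test follows; the iteration samples are hypothetical, and converting t into a p value requires the t-distribution CDF (e.g., SciPy's), which is omitted here:

```python
import math
import statistics as st

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances), the kind of
    quantity behind the t-test column of Tables 6-8; a large |t| means
    the two mean iteration counts differ strongly."""
    va, vb = st.variance(a), st.variance(b)
    return (st.mean(a) - st.mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical iteration counts in the spirit of Table 6 (dim = 2):
iters_vw_gwo = [6, 8, 9, 10, 12]
iters_pso = [48, 250, 358, 520, 1093]
print(welch_t(iters_vw_gwo, iters_pso))  # strongly negative: far fewer iterations
```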


Table 9: Statistical analysis on the absolute errors of the selected functions (dim = 200). Each cell lists the mean and standard deviation over the Monte-Carlo runs.

Functions | VW-GWO (mean, std. dev.) | Std GWO (mean, std. dev.) | ALO (mean, std. dev.) | PSO (mean, std. dev.) | BA (mean, std. dev.)
F4 | 3.3556e−53, 8.7424e−53 | 1.6051e−46, 2.2035e−46 | 4.2333e−07, 2.9234e−07 | 3.0178e−07, 6.5449e−07 | 1.6401e−07, 2.1450e−07
F8 | 0.0, 0.0 | 3.3307e−17, 7.4934e−17 | 1.1102e−17, 3.5108e−17 | 1.4466e−14, 1.9684e−14 | —, —
F1 | 0.0115, 0.0193 | 0.0364, 0.0640 | 8.3831, 10.3213 | 12.6649, 13.0098 | 4.7528e+16, 2.8097e+16


absolute errors averaged over the MC runs, are shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.
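The protocol behind Figure 4 (MC independent runs per dimension, each capped at MLIT iterations, with the absolute errors averaged over the runs) can be sketched as below. The optimizer is a plain random search standing in for VW-GWO, and F1 is assumed here to be the sphere function; both are placeholders, since the paper's experiments were programmed in Matlab:

```python
import random

def sphere(x):
    """Benchmark F1, assumed here to be the sphere function."""
    return sum(v * v for v in x)

def random_search(f, dim, max_iter=500, bound=100.0, rng=None):
    """Placeholder optimizer standing in for VW-GWO; max_iter plays
    the role of the MLIT restriction."""
    rng = rng or random.Random()
    best = float("inf")
    for _ in range(max_iter):
        x = [rng.uniform(-bound, bound) for _ in range(dim)]
        best = min(best, f(x))
    return best

def mean_abs_error(dim, mc=100, optimum=0.0):
    """Absolute error averaged over mc Monte-Carlo runs: one point of Figure 4."""
    rng = random.Random(42)  # fixed seed for reproducibility
    errors = [abs(random_search(sphere, dim, rng=rng) - optimum) for _ in range(mc)]
    return sum(errors) / mc
```

Sweeping the dimension from 100 to 1000 and plotting `mean_abs_error(dim)` on a logarithmic axis mirrors the shape of the experiment, not the paper's numbers.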

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are therefore introduced into their searching process. To reduce the probability of being trapped in local optima, a new governing equation of the controlling parameter is introduced, whereby the parameter declines exponentially from its maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm, with comparisons made to the original GWO algorithm, the ALO, the PSO algorithm, and the BA.
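The weighted hierarchy update summarized above can be sketched as follows. Since this part of the paper does not reproduce the governing equations, the weight values w1 > w2 > w3 and the exponential decay constant for the controlling parameter a are illustrative assumptions, not the authors' exact coefficients:

```python
import math
import random

def vw_gwo_step(wolves, alpha, beta, delta, t, max_iter,
                w=(0.5, 0.3, 0.2), a_max=2.0):
    """One VW-GWO iteration (sketch). Standard GWO averages the three
    leader-guided candidates equally; here they are blended with
    hierarchy weights w1 > w2 > w3, and the controlling parameter a
    decays exponentially from its maximum (assumed decay law)."""
    a = a_max * math.exp(-4.0 * t / max_iter)
    new_pop = []
    for x in wolves:
        candidate = []
        for d in range(len(x)):
            guided = []
            for leader in (alpha, beta, delta):
                r1, r2 = random.random(), random.random()
                A = 2.0 * a * r1 - a          # shrinks as a decays
                C = 2.0 * r2
                D = abs(C * leader[d] - x[d])
                guided.append(leader[d] - A * D)
            # variable-weight blend instead of the plain mean
            candidate.append(sum(wi * g for wi, g in zip(w, guided)))
        new_pop.append(candidate)
    return new_pop
```

Replacing the weighted sum with the plain mean of the three guided candidates and letting a fall linearly from 2 to 0 recovers the standard GWO update of [27].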

All the selected experiment results show that the proposed VW-GWO algorithm works better under different conditions than the others. Variations in dimension do not change its first position among them, and the proposed VW-GWO algorithm is expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly focus on the ability to converge. They lead to faster convergence and wide applicability, but the algorithm is not found to be superior on all the benchmark functions, and further work is needed to explain the reasons mathematically. Other initializing algorithms might be needed to let the initial swarm individuals spread all through the domain, and new searching rules for when the individuals are in the basins would be another hot spot of future work.

Appendix

The simulation platform described in Section 3.3 runs on an assembled desktop computer configured as follows: CPU, Intel Xeon E3-1231 v3; GPU, NVIDIA GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software of this paper can be downloaded from http://ddl.escience.cn/fErl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authorsrsquo Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea of working on the GWO algorithm and programmed the work in Matlab; her major contributions are the programming work and the proposed exponentially declining governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by a research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[3] Y. Xin-She, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.

[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.

[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1 - Bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.

[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.

[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.

[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.

[9] Z. Juan and G. Zheng-Ming, The Bat Algorithm and Its Parameters, Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

Figure 4: Absolute errors vs. dimensions based on VW-GWO (one curve each for F1, F2, F3, and F11; absolute errors from roughly 10^−80 to 10^20 on a logarithmic axis, over dimensions from 100 to 1000).


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] C. M. Yan, B. L. Guo, and X. X. Wu, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99, Cat. No. 99TH8406), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uguz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyetto, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a Robust Grey Wolf Optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.

[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.

[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.



[46] M Jamil and X S Yang ldquoA literature survey of benchmarkfunctions for global optimisation problemsrdquo InternationalJournal of Mathematical Modelling and Numerical Optimi-sation vol 4 no 2 pp 150ndash194 2013

[47] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm in-telligence algorithmsrdquo Swarm and Evolutionary Computationvol 1 no 1 pp 3ndash18 2011

Computational Intelligence and Neuroscience 13


Table 9: Statistical analysis on the absolute errors of the selected functions (dim = 200).

Functions   VW-GWO                     Std. GWO                   ALO                        PSO                        BA
            Mean        Std. dev.      Mean        Std. dev.      Mean        Std. dev.      Mean        Std. dev.      Mean        Std. dev.
F4          3.3556e-53  8.7424e-53     1.6051e-46  2.2035e-46     4.2333e-07  2.9234e-07     3.0178e-07  6.5449e-07     1.6401e-07  2.1450e-07
F8          0           0              3.3307e-17  7.4934e-17     1.1102e-17  3.5108e-17     1.4466e-14  1.9684e-14
F11         0.0115      0.0193         0.0364      0.0640         8.3831      10.3213        12.6649     13.0098        4.7528e+16  2.8097e+16

Computational Intelligence and Neuroscience

absolute errors averaged over the MC runs shown in Figure 4.

We can see from Figure 4 that the VW-GWO algorithm is capable of solving high-dimensional problems.

5. Conclusions

In this paper, an improved grey wolf optimization (GWO) algorithm with variable weights (the VW-GWO algorithm) is proposed. A hypothesis is made that the social hierarchy of the packs would also be functional in their searching positions, and variable weights are therefore introduced into the searching process. To reduce the probability of being trapped in local optima, a governing equation of the controlling parameter is introduced so that the parameter declines exponentially from its maximum. Finally, three types of experiments are carried out to verify the merits of the proposed VW-GWO algorithm, with comparisons made to the original GWO, the ALO and PSO algorithms, and BA.
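The two ingredients summarized above — a hierarchy-weighted position update and an exponentially declining controlling parameter a — can be sketched as follows. This is a minimal illustrative Python sketch, not the paper's exact governing equations: the function name vw_gwo, the weight triple (0.5, 0.3, 0.2), and the decay constant max_iter/4 are all assumptions made for the example.

```python
import numpy as np

def vw_gwo(f, dim, n_wolves=30, max_iter=500, lb=-100.0, ub=100.0,
           a_max=2.0, weights=(0.5, 0.3, 0.2)):
    """GWO variant: leaders alpha/beta/delta contribute with decreasing
    weights (social hierarchy), and a declines exponentially (illustrative
    forms only; the paper's exact equations may differ)."""
    X = np.random.uniform(lb, ub, (n_wolves, dim))
    fitness = np.apply_along_axis(f, 1, X)
    for t in range(max_iter):
        order = np.argsort(fitness)
        leaders = (X[order[0]].copy(), X[order[1]].copy(), X[order[2]].copy())
        # assumed exponential decline of the controlling parameter from a_max
        a = a_max * np.exp(-t / (max_iter / 4))
        for i in range(n_wolves):
            candidate = np.zeros(dim)
            for w, leader in zip(weights, leaders):
                r1, r2 = np.random.rand(dim), np.random.rand(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                D = np.abs(C * leader - X[i])
                # weighted sum instead of the standard equal average
                candidate += w * (leader - A * D)
            X[i] = np.clip(candidate, lb, ub)
            fitness[i] = f(X[i])
    best = np.argmin(fitness)
    return X[best], fitness[best]
```

With equal weights (1/3, 1/3, 1/3) and a linearly declining a, the sketch reduces to the standard GWO update of Mirjalili et al. [27].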

All the selected experiment results show that the proposed VW-GWO algorithm works better under different conditions than the others. Varying the dimension does not change its first place among them, and the proposed VW-GWO algorithm is therefore expected to be a good choice for solving large-scale problems.

However, the proposed improvements mainly target the ability to converge. They lead to faster convergence and wide applicability, but the algorithm is not found to be superior on all the benchmark functions, and further work is needed to explain the reasons mathematically. Other initialization algorithms might be needed to spread the initial swarm individuals throughout the domain, and new searching rules for individuals located in basins would be another hot spot of future work.

Appendix

The simulation platform described in Section 3.3 runs on an assembled desktop computer configured as follows: CPU, Intel Xeon E3-1231 v3; GPU, NVIDIA GeForce GTX 750 Ti; memory, DDR3 1866 MHz; motherboard, Asus B85-Plus R2.0; hard disk, Kingston SSD.

Data Availability

The associated software for this paper can be downloaded from http://ddl.escience.cn/f/Erl2 with the access code kassof.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors' Contributions

Zheng-Ming Gao formulated the governing equations of the variable weights, constructed the work, and wrote the paper. Juan Zhao proposed the idea on the GWO algorithm and programmed the work with Matlab; her major contributions are the programming and the proposed exponentially declining governing equation of the controlling parameter. Juan Zhao contributed equally to this work.

Acknowledgments

This work was supported in part by the Natural Science Foundation of Jingchu University of Technology under grant no. ZR201514 and by the research project of the Hubei Provincial Department of Education under grant no. B2018241.

References

[1] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[2] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[3] Y. Xin-She, Nature-Inspired Optimization Algorithms, Elsevier, Amsterdam, Netherlands, 2014.

[4] H. Zang, S. Zhang, and K. Hapeshi, "A review of nature-inspired algorithms," Journal of Bionic Engineering, vol. 7, no. 4, pp. S232–S237, 2010.

[5] X. S. Yang, S. F. Chien, and T. O. Ting, "Chapter 1: Bio-inspired computation and optimization: an overview," in Bio-Inspired Computation in Telecommunications, X. S. Yang, S. F. Chien, and T. O. Ting, Eds., Morgan Kaufmann, Boston, MA, USA, 2015.

[6] A. Syberfeldt and S. Lidberg, "Real-world simulation-based manufacturing optimization using cuckoo search," in Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1–12, Berlin, Germany, December 2012.

[7] L. D. S. Coelho and V. C. Mariani, "Improved firefly algorithm approach applied to chiller loading for energy conservation," Energy and Buildings, vol. 59, pp. 273–278, 2013.

[8] C. W. Reynolds, "Flocks, herds and schools: a distributed behavioral model," ACM SIGGRAPH Computer Graphics, vol. 21, no. 4, pp. 25–34, 1987.

[9] Z. Juan and G. Zheng-Ming, "The bat algorithm and its parameters," in Electronics, Communications and Networks IV, CRC Press, Boca Raton, FL, USA, 2015.

[Figure 4: Absolute errors vs. dimensions based on VW-GWO (functions F1, F2, F3, and F11; dimensions 100 to 1000).]


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] Yan Chun-man, Guo Bao-long, and Wu Xian-xiang, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J. M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uguz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyetto, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R. E. Precup, R. C. David, E. M. Petriu, A. I. Szedlak-Stinean, and C. A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a robust grey wolf optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.

[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.

[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

Computational Intelligence and Neuroscience 13

Computer Games Technology

International Journal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Journal ofEngineeringVolume 2018

Advances in

FuzzySystems

Hindawiwwwhindawicom

Volume 2018

International Journal of

ReconfigurableComputing

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

thinspArtificial Intelligence

Hindawiwwwhindawicom Volumethinsp2018

Hindawiwwwhindawicom Volume 2018

Civil EngineeringAdvances in

Hindawiwwwhindawicom Volume 2018

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawiwwwhindawicom Volume 2018

Hindawi

wwwhindawicom Volume 2018

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

RoboticsJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Computational Intelligence and Neuroscience

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Modelling ampSimulationin EngineeringHindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018

Human-ComputerInteraction

Advances in

Hindawiwwwhindawicom Volume 2018

Scientic Programming

Submit your manuscripts atwwwhindawicom

Page 11: AnImprovedGreyWolfOptimizationAlgorithmwith VariableWeightsdownloads.hindawi.com/journals/cin/2019/2981282.pdf · ·X a → −X → (t) , D β → C 2 → ·X β → −X → (t)

absolute errors averaged over MC times being shown inFigure 4

We can see from Figure 4 that the VM-GWO is capableto solve high-dimensional problems

5 Conclusions

In this paper an improved grey wolf optimization (GWO)algorithm with variable weights (VW-GWO algorithm) isproposed A hypothesize is made that the social hierarchy ofthe packs would also be functional in their searching po-sitions And variable weights are then introduced to theirsearching process To reduce the probability of being trappedin local optima a governing equation of the controllingparameter is introduced and thus it is declined exponen-tially from the maximum Finally three types of experimentsare carried out to verify the merits of the proposed VW-GWO algorithm Comparisons are made to the originalGWO and the ALO PSO algorithm and BA

All the selected experiment results show that the pro-posed VW-GWO algorithm works better under dipounderentconditions than the others e variance of dimensionscannot change its rst position among them and the pro-posed VW-GWO algorithm is expected to be a good choiceto solve the large-scale problems

However the proposed improvements are mainly fo-cusing on the ability to converge It leads to faster con-vergence and wide applications But it is not found to becapable for all the benchmark functions Further work wouldbe needed to tell the reasons mathematically Other ini-tializing algorithms might be needed to let the initial swarmindividuals spread all through the domain and newsearching rules when the individuals are at the basins wouldbe another hot spot of future work

Appendix

e simulation platform as described in Section 33 is runon an assembled desktop computer being congured as

follows CPU Xeon E3-1231 v3 GPU NVidia GeForce GTX750 Ti memory DDR3 1866MHz motherboard Asus B85-Plus R20 hard disk Kingston SSD

Data Availability

e associate software of this paper could be downloadedfrom httpddlesciencecnfErl2 with the access codekassof

Conflicts of Interest

e authors declare that they have no conicts of interest

Authorsrsquo Contributions

Zheng-Ming Gao formulated the governing equations ofvariable weights constructed the work and wrote the paperJuan Zhao proposed the idea on the GWO algorithm andprogrammed the work with Matlab Her major contributionis in the programmed work and the proposed declinedexponentially governing equations of the controlling pa-rameter Juan Zhao contributed equally to this work

Acknowledgments

is work was supported in part by Natural ScienceFoundation of Jingchu University of Technology with grantno ZR201514 and the research project of Hubei ProvincialDepartment of Education with grant no B2018241

References

[1] D H Wolpert and W G Macready ldquoNo free lunch theoremsfor optimizationrdquo IEEE Transactions on Evolutionary Com-putation vol 1 no 1 pp 67ndash82 1997

[2] S Mirjalili ldquo e ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[3] Y Xin-She Nature-Inpsired Optimization AlgorithmsElsevier Amsterdam Netherlands 2014

[4] H Zang S Zhang and K Hapeshi ldquoA review of nature-inspired algorithmsrdquo Journal of Bionic Engineering vol 7no 4 pp S232ndashS237 2010

[5] X S Yang S F Chien and T O Ting ldquoChapter 1-bio-inspired computation and optimization an overviewrdquo in Bio-Inspired Computation in Telecommunications X S YangS F Chien and T O Ting Eds Morgan Kaufmann BostonMA USA 2015

[6] A Syberfeldt and S Lidberg ldquoReal-world simulation-basedmanufacturing optimization using cuckoo search simulationconference (WSC)rdquo in Proceedings of the 2012 Winter Sim-ulation Conference (WSC) pp 1ndash12 Berlin GermanyDecember 2012

[7] L D S Coelho and V CMariani ldquoImproved rey algorithmapproach applied to chiller loading for energy conservationrdquoEnergy and Buildings vol 59 pp 273ndash278 2013

[8] C W Reynolds ldquoFlocks herds and schools a distributedbehavioral modelrdquo ACM SIGGRAPH Computer Graphicsvol 21 no 4 pp 25ndash34 1987

[9] Z Juan and G Zheng-Ming e Bat Algorithm and Its Pa-rameters Electronics Communications and Networks IV CRCPress Boca Raton FL USA 2015

F1F2

F3F11

10ndash80

10ndash60

10ndash40

10ndash20

100

1020

Abso

lute

erro

rs

200 300 400 500 600 700 800 900 1000100Dimension

Figure 4 Absolute errors vs dimensions based on VM-GWO

Computational Intelligence and Neuroscience 11

[10] J J Q Yu and V O K Li ldquoA social spider algorithm for globaloptimizationrdquo Applied Soft Computing vol 30 pp 614ndash6272015

[11] R Azizi ldquoEmpirical study of artificial fish swarm algorithmrdquoInternational Journal of Computing Communications andNetworking vol 3 no 1ndash3 pp 1ndash7 2014

[12] L Yan-Xia L Lin and Zhaoyang ldquoImproved ant colonyalgorithm for evaluation of graduatesrsquo physical conditionsmeasuring technology and mechatronics automation(ICMTMA)rdquo in Proceedings of the 2014 Sixth InternationalConference on Measuring Technology and Mechatronics Au-tomation pp 333ndash336 Zhangjiajie China January 2014

[13] Z Xiu Z Xin S L Ho and W N Fu ldquoA modification ofartificial bee colony algorithm applied to loudspeaker designproblemrdquo IEEE Transactions on Magnetics vol 50 no 2pp 737ndash740 2014

[14] M K Marichelvam T Prabaharan and X S Yang ldquoA dis-crete firefly algorithm for the multi-objective hybrid flowshopscheduling problemsrdquo IEEE Transactions on EvolutionaryComputation vol 18 no 2 pp 301ndash305 2014

[15] Y A N Chun-man G U O Bao-long andW U Xian-xiangldquoEmpirical study of the inertia weight particle swarm opti-mization with constraint factorrdquo International Journal of SoftComputing and Software Engineering [JSCSE] vol 2 no 2pp 1ndash8 2012

[16] Y Shi and R C Eberhart ldquoEmpirical study of particle swarmoptimizationrdquo in Proceedings of the 1999 Congress on Evo-lutionary Computation-CEC99 (Cat No 99TH8406)pp 345ndash357 Washington DC USA July 1999

[17] S Yılmaz and E U Kuccediluksille ldquoA newmodification approachon bat algorithm for solving optimization problemsrdquo AppliedSoft Computing vol 28 pp 259ndash275 2015

[18] A Basak D Maity and S Das ldquoA differential invasive weedoptimization algorithm for improved global numerical op-timizationrdquo Applied Mathematics and Computation vol 219no 12 pp 6645ndash6668 2013

[19] X Yuan T Zhang Y Xiang and X Dai ldquoParallel chaosoptimization algorithm with migration and merging op-erationrdquo Applied Soft Computing vol 35 pp 591ndash6042015

[20] M Kang J Kim and J M Kim ldquoReliable fault diagnosis forincipient low-speed bearings using fault feature analysis basedon a binary bat algorithmrdquo Information Sciences vol 294pp 423ndash438 2015

[21] Z Chen Y Zhou and M Lu ldquoA simplied adaptive bat al-gorithm based on frequencyrdquo Journal of Computational In-formation Systems vol 9 pp 6451ndash6458 2013

[22] J H Holland Adaptation in Natural and Artificial SystemsUniversity of Michigan Press Ann Arbor MI USA 1975

[23] N Metropolis A W Rosenbluth M N Rosenbluth andA H Teller ldquoEquation of state calculations by fast computingmachinesrdquo Journal of Chemical Physics vol 21 no 6pp 1087ndash1092 1953

[24] M Dorigo and M Birattari ldquoAnt colony optimizationrdquo IEEEComputational Intelligence Magazine vol 1 no 4 pp 28ndash392006

[25] X S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization(NICSO 2010) J Gonzalez D Pelta C Cruz et al EdsSpringer Berlin Germany 2010

[26] H Haklı and H Uguz ldquoA novel particle swarm optimizationalgorithm with Levy flightrdquo Applied Soft Computing vol 23pp 333ndash345 2014

[27] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[28] G M Komaki and V Kayvanfar ldquoGrey wolf optimizer al-gorithm for the two-stage assembly flow shop schedulingproblem with release timerdquo Journal of Computational Sciencevol 8 pp 109ndash120 2015

[29] Y Sharma and L C Saikia ldquoAutomatic generation control ofa multi-area ST-thermal power system using grey wolf op-timizer algorithm based classical controllersrdquo InternationalJournal of Electrical Power amp Energy Systems vol 73pp 853ndash862 2015

[30] B Mahdad and K Srairi ldquoBlackout risk prevention in a smartgrid based flexible optimal strategy using grey wolf-patternsearch algorithmsrdquo Energy Conversion and Managementvol 98 pp 411ndash429 2015

[31] X Song L Tang S Zhao et al ldquoGrey wolf optimizer forparameter estimation in surface wavesrdquo Soil Dynamics andEarthquake Engineering vol 75 pp 147ndash157 2015

[32] N Jayakumar S Subramanian S Ganesan andE B Elanchezhian ldquoGrey wolf optimization for combinedheat and power dispatch with cogeneration systemsrdquo In-ternational Journal of Electrical Power amp Energy Systemsvol 74 pp 252ndash264 2016

[33] S A Medjahed T A Saadi A Benyetto and M Ouali ldquoGraywolf optimizer for hyperspectral band selectionrdquo Applied SoftComputing vol 40 pp 178ndash186 2016

[34] E Emary H M Zawbaa and A E Hassanien ldquoBinary greywolf optimization approaches for feature selectionrdquo Neuro-computing vol 172 pp 371ndash381 2016

[35] D Guha P K Roy and S Banerjee ldquoLoad frequency controlof interconnected power system using grey wolf optimiza-tionrdquo Swarm and Evolutionary Computation vol 27pp 97ndash115 2016

[36] M H Sulaiman ZMustaffa M RMohamed andO AlimanldquoUsing the gray wolf optimizer for solving optimal reactivepower dispatch problemrdquo Applied Soft Computing vol 32pp 286ndash292 2015

[37] S Mirjalili S Saremi S M Mirjalili and L D S CoelholdquoMulti-objective grey wolf optimizer a novel algorithm formulti-criterion optimizationrdquo Expert Systems with Applica-tions vol 47 pp 106ndash119 2016

[38] E EmaryW Yamany A E Hassanien and V Snasel ldquoMulti-objective gray-wolf optimization for attribute reductionrdquoProcedia Computer Science vol 65 pp 623ndash632 2015

[39] S Saremi S Z Mirjalili and S M Mirjalili ldquoEvolutionarypopulation dynamics and grey wolf optimizerrdquo NeuralComputing and Applications vol 26 no 5 pp 1257ndash12632015

[40] R E Precup R C David E M Petriu A I Szedlak-Stineanand C A Bojan-Dragos ldquoGrey wolf optimizer-based ap-proach to the tuning of pi-fuzzy controllers with a reducedprocess parametric sensitivityrdquo IFAC-PapersOnLine vol 49no 5 pp 55ndash60 2016

[41] A Noshadi J Shi W S Lee P Shi and A Kalam ldquoOptimalPID-type fuzzy logic controller for a multi-input multi-outputactive magnetic bearing systemrdquo Neural Computing andApplications vol 27 no 7 pp 2031ndash2046 2016

[42] P B de Moura Oliveira H Freire and E J Solteiro PiresldquoGrey wolf optimization for PID controller design withprescribed robustness marginsrdquo Soft Computing vol 20no 11 pp 4243ndash4255 2016

[43] S Khalilpourazari and S Khalilpourazary ldquoOptimization ofproduction time in the multi-pass milling process via a Robust

12 Computational Intelligence and Neuroscience

Grey Wolf Optimizerrdquo Neural Computing and Applicationsvol 29 no 12 pp 1321ndash1336 2018

[44] R El Sehiemy A Shaheen and A Abou El-Ela ldquoMulti-objective fuzzy-based procedure for enhancing reactivepower managementrdquo IET Generation Transmission amp Dis-tribution vol 7 no 12 pp 1453ndash1460 2013

[45] A H Gandomi X-S Yang A H Alavi and S TalataharildquoBat algorithm for constrained optimization tasksrdquo NeuralComputing and Applications vol 22 no 6 pp 1239ndash12552013

[46] M Jamil and X S Yang ldquoA literature survey of benchmarkfunctions for global optimisation problemsrdquo InternationalJournal of Mathematical Modelling and Numerical Optimi-sation vol 4 no 2 pp 150ndash194 2013

[47] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm in-telligence algorithmsrdquo Swarm and Evolutionary Computationvol 1 no 1 pp 3ndash18 2011

Computational Intelligence and Neuroscience 13


[10] J. J. Q. Yu and V. O. K. Li, "A social spider algorithm for global optimization," Applied Soft Computing, vol. 30, pp. 614–627, 2015.

[11] R. Azizi, "Empirical study of artificial fish swarm algorithm," International Journal of Computing, Communications and Networking, vol. 3, no. 1–3, pp. 1–7, 2014.

[12] L. Yan-Xia, L. Lin, and Zhaoyang, "Improved ant colony algorithm for evaluation of graduates' physical conditions," in Proceedings of the 2014 Sixth International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), pp. 333–336, Zhangjiajie, China, January 2014.

[13] Z. Xiu, Z. Xin, S. L. Ho, and W. N. Fu, "A modification of artificial bee colony algorithm applied to loudspeaker design problem," IEEE Transactions on Magnetics, vol. 50, no. 2, pp. 737–740, 2014.

[14] M. K. Marichelvam, T. Prabaharan, and X. S. Yang, "A discrete firefly algorithm for the multi-objective hybrid flowshop scheduling problems," IEEE Transactions on Evolutionary Computation, vol. 18, no. 2, pp. 301–305, 2014.

[15] Y. Chun-man, G. Bao-long, and W. Xian-xiang, "Empirical study of the inertia weight particle swarm optimization with constraint factor," International Journal of Soft Computing and Software Engineering (JSCSE), vol. 2, no. 2, pp. 1–8, 2012.

[16] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), pp. 345–357, Washington, DC, USA, July 1999.

[17] S. Yılmaz and E. U. Küçüksille, "A new modification approach on bat algorithm for solving optimization problems," Applied Soft Computing, vol. 28, pp. 259–275, 2015.

[18] A. Basak, D. Maity, and S. Das, "A differential invasive weed optimization algorithm for improved global numerical optimization," Applied Mathematics and Computation, vol. 219, no. 12, pp. 6645–6668, 2013.

[19] X. Yuan, T. Zhang, Y. Xiang, and X. Dai, "Parallel chaos optimization algorithm with migration and merging operation," Applied Soft Computing, vol. 35, pp. 591–604, 2015.

[20] M. Kang, J. Kim, and J.-M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.

[21] Z. Chen, Y. Zhou, and M. Lu, "A simplified adaptive bat algorithm based on frequency," Journal of Computational Information Systems, vol. 9, pp. 6451–6458, 2013.

[22] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI, USA, 1975.

[23] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, and A. H. Teller, "Equation of state calculations by fast computing machines," Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[24] M. Dorigo and M. Birattari, "Ant colony optimization," IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.

[25] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. Gonzalez, D. Pelta, C. Cruz et al., Eds., Springer, Berlin, Germany, 2010.

[26] H. Haklı and H. Uğuz, "A novel particle swarm optimization algorithm with Levy flight," Applied Soft Computing, vol. 23, pp. 333–345, 2014.

[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[28] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[29] Y. Sharma and L. C. Saikia, "Automatic generation control of a multi-area ST-thermal power system using grey wolf optimizer algorithm based classical controllers," International Journal of Electrical Power & Energy Systems, vol. 73, pp. 853–862, 2015.

[30] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using grey wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[31] X. Song, L. Tang, S. Zhao et al., "Grey wolf optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[32] N. Jayakumar, S. Subramanian, S. Ganesan, and E. B. Elanchezhian, "Grey wolf optimization for combined heat and power dispatch with cogeneration systems," International Journal of Electrical Power & Energy Systems, vol. 74, pp. 252–264, 2016.

[33] S. A. Medjahed, T. A. Saadi, A. Benyettou, and M. Ouali, "Gray wolf optimizer for hyperspectral band selection," Applied Soft Computing, vol. 40, pp. 178–186, 2016.

[34] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[35] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of interconnected power system using grey wolf optimization," Swarm and Evolutionary Computation, vol. 27, pp. 97–115, 2016.

[36] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[37] S. Mirjalili, S. Saremi, S. M. Mirjalili, and L. D. S. Coelho, "Multi-objective grey wolf optimizer: a novel algorithm for multi-criterion optimization," Expert Systems with Applications, vol. 47, pp. 106–119, 2016.

[38] E. Emary, W. Yamany, A. E. Hassanien, and V. Snasel, "Multi-objective gray-wolf optimization for attribute reduction," Procedia Computer Science, vol. 65, pp. 623–632, 2015.

[39] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[40] R.-E. Precup, R.-C. David, E. M. Petriu, A.-I. Szedlak-Stinean, and C.-A. Bojan-Dragos, "Grey wolf optimizer-based approach to the tuning of PI-fuzzy controllers with a reduced process parametric sensitivity," IFAC-PapersOnLine, vol. 49, no. 5, pp. 55–60, 2016.

[41] A. Noshadi, J. Shi, W. S. Lee, P. Shi, and A. Kalam, "Optimal PID-type fuzzy logic controller for a multi-input multi-output active magnetic bearing system," Neural Computing and Applications, vol. 27, no. 7, pp. 2031–2046, 2016.

[42] P. B. de Moura Oliveira, H. Freire, and E. J. Solteiro Pires, "Grey wolf optimization for PID controller design with prescribed robustness margins," Soft Computing, vol. 20, no. 11, pp. 4243–4255, 2016.

[43] S. Khalilpourazari and S. Khalilpourazary, "Optimization of production time in the multi-pass milling process via a Robust Grey Wolf Optimizer," Neural Computing and Applications, vol. 29, no. 12, pp. 1321–1336, 2018.

[44] R. El Sehiemy, A. Shaheen, and A. Abou El-Ela, "Multi-objective fuzzy-based procedure for enhancing reactive power management," IET Generation, Transmission & Distribution, vol. 7, no. 12, pp. 1453–1460, 2013.

[45] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[46] M. Jamil and X. S. Yang, "A literature survey of benchmark functions for global optimisation problems," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.

[47] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
