PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier


Page 1: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Alaa Tharwat1,2,5, Aboul Ella Hassanien3,4,5

1 Dept. of Electricity, Faculty of Engineering, Suez Canal University, Ismailia, Egypt.
2 Faculty of Engineering, Ain Shams University, Cairo, Egypt.
3 Faculty of Computers and Information, Cairo University, Cairo, Egypt.
4 Faculty of Computers and Information, Beni Suef University, Egypt.

5Scientific Research Group in Egypt (SRGE) http://www.egyptscience.net.

Swarm Workshop - Nov. 7, 2015


Page 2: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Agenda

Introduction

Theoretical Background

Proposed Model

Experimental Results

Conclusions and Future Work


Page 3: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Introduction

In the machine learning field, there are two main learning approaches, namely supervised and unsupervised learning.

There are two main techniques of supervised learning, namely regression and classification.

In the unsupervised approach, the targets or responses of the input data are not required to build the model.

There are many types of classifiers, but the k-Nearest Neighbour (k-NN) classifier is one of the oldest and simplest classifiers.


Page 4: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Theoretical Background: k-Nearest Neighbour (k-NN) Classifier

k-Nearest Neighbour (k-NN) is one of the most common and simplest methods for pattern classification.

In the k-NN classifier, an unknown pattern is classified based on its similarity to the known samples (i.e. labelled or training samples): the distances from the unknown sample to all labelled samples are computed, and the k nearest samples are selected as the basis for classification.

The unknown sample is assigned to the class containing the most samples among the k nearest samples (i.e. voting); hence, the k parameter is usually chosen to be odd to avoid ties in the vote.
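A minimal sketch of this distance-and-vote rule (not the authors' implementation; the function name and the use of Euclidean distance are assumptions) could look as follows:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k):
    """Classify one unknown sample x by majority vote among its k nearest
    labelled samples, using Euclidean distance."""
    distances = np.linalg.norm(X_train - x, axis=1)   # distance to every labelled sample
    nearest = np.argsort(distances)[:k]               # indices of the k nearest samples
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                  # class with the most votes
```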


Page 5: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Theoretical Background: Particle Swarm Optimization (PSO)

The main objective of the PSO algorithm is to search the search space for positions that are close to the global minimum or maximum of the objective function.

In the PSO algorithm, a number of particles (agents or elements), which represent candidate solutions, are randomly placed in the search space. The number of particles is determined by the user.

The current location or position of each particle is used to calculate the objective or fitness function at that location.

Each particle keeps track of its position (xi ∈ Rn), its velocity (vi), and its previous best position (pi); in addition, G denotes the position with the best fitness value achieved by the whole swarm.


Page 6: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Theoretical Background: Particle Swarm Optimization (PSO)

The velocity of each particle is adjusted in each iteration as shown in Equation (1).

The movement of any particle is then calculated by adding the velocity to the current position of that particle as in Equation (2).

vi(t+1) = current motion + particle memory influence + swarm influence

vi(t+1) = w vi(t) + C1 r1 (pi(t) − xi(t)) + C2 r2 (G − xi(t))    (1)

xi(t+1) = xi(t) + vi(t+1)    (2)

where w represents the inertia weight, C1 is the cognition learning factor, C2 is the social learning factor, and r1, r2 are uniformly distributed random numbers in the range [0, 1].
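As a hedged illustration of Equations (1) and (2), a single particle's update could be coded as below; the default parameter values are placeholders, not the settings used in the experiments:

```python
import numpy as np

def pso_step(x, v, p_best, g_best, w=0.7, c1=2.0, c2=2.0, rng=np.random.default_rng()):
    """One PSO update for a particle; x, v, p_best, g_best are arrays of the same shape."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)   # uniform random numbers in [0, 1]
    # Equation (1): inertia + cognitive (particle memory) + social (swarm) components
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
    # Equation (2): move the particle to its new position
    return x + v_new, v_new
```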


Page 7: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Theoretical Background: Particle Swarm Optimization (PSO)

[Figure content: two particles i and j move from xi(t), xj(t) to xi(t+1), xj(t+1); each resultant velocity v(t+1) combines the original velocity v(t), the velocity toward the personal best P(t), and the velocity toward the global best G.]

Figure: An example showing how two particles move using the PSO algorithm, (a) general movement of the two particles, (b) movement of the two particles in one-dimensional space.


Page 8: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Proposed Model: PSOk-NN

[Figure content: block diagram of the proposed model. The training samples and the k parameter feed the k-NN classifier, whose misclassification rate is used as the fitness function. PSO loop: initialize PSO; for each particle, update the velocity (vi) and position (xi) and evaluate the fitness function F(xi); if F(xi) < F(Pi) then Pi = xi, and if F(xi) < F(G) then G = xi; when the termination criterion is satisfied, return the best solution (G), which is then evaluated on the testing samples.]

Figure: The PSOk-NN algorithm searches for the optimal k parameter which minimizes the misclassification rate of the testing samples.
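A compact sketch of the whole PSOk-NN loop, assuming (as in the figure) that each particle encodes a candidate k and that the fitness is the misclassification rate of a k-NN classifier; the PSO settings, the linear inertia schedule, and the helper names are illustrative assumptions rather than the authors' exact configuration:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fitness(k, X_train, y_train, X_test, y_test):
    """Misclassification rate (%) of k-NN with the given k."""
    k = int(np.clip(np.rint(k), 1, len(X_train)))
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    return 100.0 * np.mean(clf.predict(X_test) != y_test)

def pso_knn(X_train, y_train, X_test, y_test, n_particles=4, n_iter=10,
            k_max=9, w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(1, k_max, n_particles)        # particle positions (candidate k values)
    v = np.zeros(n_particles)                     # particle velocities
    f = np.array([fitness(k, X_train, y_train, X_test, y_test) for k in x])
    p, fp = x.copy(), f.copy()                    # personal best positions and fitnesses
    g, fg = x[np.argmin(f)], f.min()              # global best position and fitness
    for t in range(n_iter):
        w = w_max - (w_max - w_min) * t / n_iter  # linearly decreasing inertia weight
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)   # Equation (1)
        x = np.clip(x + v, 1, k_max)                         # Equation (2), kept inside [1, k_max]
        f = np.array([fitness(k, X_train, y_train, X_test, y_test) for k in x])
        improved = f < fp
        p[improved], fp[improved] = x[improved], f[improved] # update personal bests
        if f.min() < fg:                                      # update global best
            g, fg = x[np.argmin(f)], f.min()
    return int(np.rint(g)), fg                    # best k and its misclassification rate
```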


Page 9: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Simulated Example

Table: Description of the training data used in our simulated example.

Pattern No.   Class 1 (ω1)       Class 2 (ω2)
              f1      f2         f1      f2
1             7       1          3       3
2             5       2          4       4
3             9       2          7       4
4             10      4          5       5
5             8       4          6       5
6             11      4          6       10
7             9       9          4       11
8             9       11         2       11
9             10      9          2       6
10            8       6          5       9


Page 10: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Simulated Example

[Figure content: class 1 and class 2 training and testing patterns plotted in the (f1, f2) plane, with neighbourhoods of a testing sample for k = 1, 3, 5, 7, and 9; the predicted class label per k is k=1: C2 (false), k=3: C2 (false), k=5: C1 (true), k=7: C1 (true), k=9: C2 (false).]

Figure: Example of how the k parameter controls the predicted class label of the unknown sample, and hence the misclassification rate.


Page 11: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Simulated Example

Table: Description of the testing data used in our simulated example and its predicted class labels using the k-NN classifier with different values of k.

Sample No.   f1   f2   True Class Label (yi)   Predicted Class Labels (yi)
                                               k=1   k=3   k=5   k=7   k=9
1            7    9    1                       2     2     1     1     2
2            4    2    2                       1     2     2     2     2
3            9    3    1                       1     1     1     1     1
4            2    7    2                       2     2     2     2     2
Misclassification Rate (%)                     50    25    0     0     25

(Incorrect predictions: sample 1 at k = 1, 3, and 9; sample 2 at k = 1.)
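For reference, the predictions in this table can be checked with a direct sweep over k on the simulated data from the two tables (a sketch using scikit-learn; ties at equal distances may be broken differently by the library, although that does not affect the vote here):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Training patterns (class 1 followed by class 2) from the previous table
X_train = np.array([[7, 1], [5, 2], [9, 2], [10, 4], [8, 4],
                    [11, 4], [9, 9], [9, 11], [10, 9], [8, 6],   # class 1
                    [3, 3], [4, 4], [7, 4], [5, 5], [6, 5],
                    [6, 10], [4, 11], [2, 11], [2, 6], [5, 9]])  # class 2
y_train = np.array([1] * 10 + [2] * 10)

# Testing patterns and their true class labels
X_test = np.array([[7, 9], [4, 2], [9, 3], [2, 7]])
y_test = np.array([1, 2, 1, 2])

for k in (1, 3, 5, 7, 9):
    pred = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train).predict(X_test)
    rate = 100.0 * np.mean(pred != y_test)
    print(f"k={k}: predicted={pred.tolist()}, misclassification rate={rate:.0f}%")
```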


Page 12: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Simulated Example

Particle No.   Position (xi)   Velocity (vi)   Fitness Function (F)   Pi   G

Initial Values
1              1               0               100                    -    -
2              9               0               100                    -    -
3              5               0               100                    -    -
4              3               0               100                    -    -

First Iteration
1              1               5.6             50                     1    -
2              9               -5.6            25                     9    -
3              5               0               0                      5    G
4              3               2.8             25                     3    -

Second Iteration
1              5               3.36            0                      5    G
2              5               -3.36           0                      5    G
3              5               0               0                      5    G
4              5               -1.68           0                      5    G
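As a quick consistency check against Equation (1), and assuming that in the first iteration the cognitive term vanishes (each particle's best position equals its current position) while G = 5, the listed velocities all correspond to a single value of C2 r2:

```latex
v_i(1) = w \cdot 0 + C_1 r_1 (p_i - x_i) + C_2 r_2 (G - x_i) = C_2 r_2 (5 - x_i),
\quad\text{so}\quad
5.6 = C_2 r_2 (5-1),\;\; -5.6 = C_2 r_2 (5-9),\;\; 2.8 = C_2 r_2 (5-3)
\;\Rightarrow\; C_2 r_2 = 1.4 \;(\text{e.g. } C_2 = 2,\ r_2 = 0.7).
```

In the second iteration every particle sits at xi = pi = G = 5, so both the cognitive and social terms vanish and vi(2) = w vi(1); the listed values (3.36 = 0.6 × 5.6, −1.68 = 0.6 × 2.8, and so on) are consistent with an inertia weight of 0.6 at that step.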


Page 13: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Simulated Example

[Figure content: misclassification rate (%) versus k for the four particles in the first and second iterations; F(x1)=50, F(x2)=25, F(x3)=0, F(x4)=25, with velocities v1=5.6, v2=-5.6, v3=0, v4=2.8.]

Figure: Visualization of how the PSO algorithm searches for the best k value which achieves the minimum misclassification rate.


Page 14: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Experiments Using Real Data

Table: Data sets description.

Data set                 Dimension   Samples   Classes
Iris                     4           150       3
Ionosphere               34          351       2
Liver-disorders          6           345       2
Ovarian                  4000        216       2
Breast Cancer            13          683       2
Wine                     13          178       3
Sonar                    60          208       2
Pima Indians Diabetes    8           768       2
ORL32×32                 1024        400       40
Yale32×32                1024        165       15


Page 15: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Experiments Using Real Data

Dataset          PSOk-NN                   GAk-NN                    ACOk-NN
                 Misclassification Rate    Misclassification Rate    Misclassification Rate
Iris             1.4667±0.4216             4±0                       2.6667±0
Iono             13.1429±0                 17.1429±0                 16.9143±0.5521
Liver            30.9302±1.4708            31.9767±0                 35.4651±7.4898×10^−15
Ovarian          13.0556±0.2928            14.2321±0.2145            13.8889±0
Breast Cancer    30.3021±0.8037            31.0850±7.4898×10^−15     32.2581±7.4898×10^−15
Wine             23.0899±0                 24.7191±3.7449×10^−15     28.3146±0.7106
Sonar            17.45±0                   21.1538±0                 17.3077±2.0271
Diabetes         24.7448±0.9025            22.9167±3.7449×10^−15     26.0417±7.4898×10^−15
ORL32×32         8.5±0                     9.5±0                     8.5±0
Yale32×32        21.9512±3.7449×10^−15     21.9512±3.7449×10^−15     25.8537±0.7713


Page 16: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Experiments Using Real Data

[Figure content: total absolute velocity (y-axis) versus number of iterations (x-axis) for the Iono, Iris, and Sonar datasets.]

Figure: Total absolute velocity of the PSOk-NN algorithm using the Iono, Iris, and Sonar datasets.


Page 17: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Experiments Using Real Data

[Figure content: fitness function versus k value for the PSO particles, (a) after the first iteration, (b) after the second iteration, (c) after the tenth iteration.]

Figure: Visualization of the movements of all particles of the PSOk-NN algorithm until it reaches the optimal solution, which achieves the minimum misclassification rate.


Page 18: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Experimental Results: Experiments Using Real Data

[Figure content: Iris samples (setosa, versicolor, virginica) plotted on the first two features, (a) after the first iteration, (b) after the tenth iteration.]

Figure: Misclassified samples after the first and tenth iterations using the PSOk-NN algorithm.


Page 19: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Conclusions

The PSOk-NN algorithm achieved the minimum misclassification error in eight of the ten datasets (80%) compared with the other two algorithms.

The PSOk-NN algorithm converges to the optimal solution faster than the other two algorithms due to the use of a linearly decreasing inertia weight in the PSO algorithm.

GAk-NN fluctuates up and down, while the PSOk-NN algorithm is more stable while converging to the optimal solution, because in PSO the best solution guides all the other particles toward the optimal solution, whereas in GA all agents are changed randomly without guidance from any agent.


Page 20: PSOk-NN: A Particle Swarm Optimization Approach to Optimize k-Nearest Neighbor Classifier

Thank you

Thank You! Questions?
