
Transcript of PhDThesis, Dr Shen Furao

Page 1: PhDThesis, Dr Shen Furao


An Algorithm for Incremental Unsupervised Learning and Topology Representation

Shen Furao Hasegawa Lab Department of Computational Intelligence and Systems Science

Page 2: PhDThesis, Dr Shen Furao


Contents

Chapter 1: Introduction
Chapter 2: Vector Quantization
Chapter 3: Adaptive Incremental LBG
Chapter 4: Experiment of Adaptive Incremental LBG
Chapter 5: Self-Organizing Incremental Neural Network
Chapter 6: Experiment with Artificial Data
Chapter 7: Application
Chapter 8: Conclusion and Discussion

Page 3: PhDThesis, Dr Shen Furao


Introduction

Clustering: construct decision boundaries from unlabeled data.

Topology learning: find a topological structure that closely reflects the topology of the data distribution.

Online incremental learning: adapt to new information without corrupting previously learned information.

Page 4: PhDThesis, Dr Shen Furao


Vector Quantization

Target
To minimize the average distortion through a suitable choice of codewords.

Applications
Data compression, speech recognition.

Approach
Separate the data set into Voronoi regions and find the centroid of each region.

LBG method (Linde, Buzo & Gray, 1980)
• Dependence on the initial starting conditions
• Tendency to get trapped in local minima
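For reference, here is a minimal sketch of the classic LBG (generalized Lloyd) iteration, assuming Euclidean distortion; the function and parameter names are illustrative, not code from the thesis.

```python
import numpy as np

def _sq_dists(data, codebook):
    # Squared Euclidean distances, shape (N, k), without forming (N, k, d).
    d2 = ((data ** 2).sum(1)[:, None]
          + (codebook ** 2).sum(1)[None, :]
          - 2.0 * data @ codebook.T)
    return np.maximum(d2, 0.0)  # clip tiny negatives from rounding

def lbg(data, k, iters=50, seed=0):
    # Classic LBG iteration (illustrative sketch).
    # data: (N, d) array of training vectors; k: codebook size.
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    # Random initial codebook: the very sensitivity criticized above.
    codebook = data[rng.choice(len(data), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each vector to its nearest codeword (Voronoi partition).
        assign = _sq_dists(data, codebook).argmin(axis=1)
        # Move each codeword to the centroid of its Voronoi region.
        for j in range(k):
            members = data[assign == j]
            if len(members) > 0:
                codebook[j] = members.mean(axis=0)
    mqe = _sq_dists(data, codebook).min(axis=1).mean()  # final distortion
    return codebook, mqe
```

The random initialization in the first step is exactly what makes plain LBG depend on its starting conditions, which is the weakness the next chapter targets.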

Page 5: PhDThesis, Dr Shen Furao


Adaptive incremental LBG (Shen & Hasegawa, 2005)

Solves the problem caused by poorly chosen initial conditions: the result is independent of the initial conditions.

With a fixed number of codewords, it finds a suitable codebook that minimizes the mean quantization error (MQE). It works better than, or as well as, ELBG (Patane & Russo, 2001).

With a fixed distortion error, it minimizes the number of codewords while still finding a suitable codebook. Meaning: to reach the same reconstruction quality, different vector sets get codebooks of different sizes, which can save plenty of storage.

Page 6: PhDThesis, Dr Shen Furao


Test Image

Lena (512*512*8) is separated into 4*4 blocks. These blocks are the input vectors; there are 16384 vectors in total.

Peak Signal to Noise Ratio (PSNR) is used to evaluate the resulting images after the quantization process.

$$\mathrm{PSNR} = 10\log_{10}\!\left(\frac{255^2}{\frac{1}{N}\sum_{i=1}^{N}\big(f(i)-g(i)\big)^2}\right)$$

where $f$ is the original image, $g$ the quantized image, and $N$ the number of pixels.

[Figure: Lena (512*512*8)]
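To make the setup concrete, here is a small sketch, assuming the image is an 8-bit grayscale NumPy array; image_to_blocks and psnr are illustrative helper names, not code from the thesis.

```python
import numpy as np

def image_to_blocks(img, b=4):
    # Cut an HxW grayscale image into non-overlapping bxb blocks and
    # flatten each block into a vector: 512x512 with b=4 gives
    # 16384 vectors of dimension 16.
    h, w = img.shape
    blocks = img.reshape(h // b, b, w // b, b).swapaxes(1, 2)
    return blocks.reshape(-1, b * b)

def psnr(f, g):
    # PSNR in dB for 8-bit images, following the formula above.
    mse = np.mean((f.astype(float) - g.astype(float)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)
```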

Page 7: PhDThesis, Dr Shen Furao


Improvement I: Incrementally inserting codewords

The optimal solution of the k-clustering problem can be reached from the solution of the (k-1)-clustering problem.

Page 8: PhDThesis, Dr Shen Furao


Improvement II: Distance measure function

The within-cluster distance must be significantly less than the between-cluster distance.

$$d(x, c_i) = \left(\sum_{l=1}^{p}\big(x(l) - c_i(l)\big)^2\right)^{1/2}$$

Page 9: PhDThesis, Dr Shen Furao


Improvement III: Delete and insert codeword

Delete the codeword with the lowest local distortion error; insert a new codeword near the codeword with the highest local distortion error.
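Taken together with the incremental insertion of Improvement I, this step can be sketched as follows. The sketch is a schematic illustration of the idea under simplified assumptions (per-codeword distortion already accumulated, Gaussian perturbation for the new codeword), not the exact AILBG rule; all names are illustrative.

```python
import numpy as np

def delete_and_insert(codebook, local_err, eps=1e-3, rng=None):
    # One delete-and-insert step. local_err[i] is the distortion
    # accumulated inside codeword i's Voronoi region.
    rng = rng or np.random.default_rng()
    lo = int(np.argmin(local_err))   # codeword doing the least work
    hi = int(np.argmax(local_err))   # codeword with the worst region
    # Move the low-distortion codeword next to the high-distortion one,
    # effectively splitting the worst Voronoi region in two.
    codebook[lo] = codebook[hi] + eps * rng.standard_normal(codebook.shape[1])
    return codebook
```

Repeating such steps while the codebook grows one codeword at a time is how the (k-1)-clustering solution can seed the k-clustering one.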

Page 10: PhDThesis, Dr Shen Furao


Experiment 1

PSNR (dB) by number of codewords:

Number of codewords | LBG (Linde et al., 1980) | Mk (Lee et al., 1997) | ELBG (Patane, 2001) | AILBG
256  | 31.60 | 31.92 | 31.94 | 32.01
512  | 32.49 | 33.09 | 33.14 | 33.22
1024 | 33.37 | 34.42 | 34.59 | 34.71

Meaning: with the same number of codewords, the proposed method achieves the highest PSNR; i.e., at the same compression ratio, it gives the best reconstruction quality.

Page 11: PhDThesis, Dr Shen Furao


Experiment 2

Number of codewords needed to reach each PSNR:

PSNR (dB) | ELBG (Patane, 2001) | AILBG
31.94 | 256  | 244
33.14 | 512  | 488
34.59 | 1024 | 988

Meaning:
• With a predefined reconstruction quality, the proposed method can find a good codebook with a reasonable number of codewords.

Page 12: PhDThesis, Dr Shen Furao


Experiment 3: Original Images

[Figures: Boat; Gray21]

Page 13: PhDThesis, Dr Shen Furao


Results of experiment 3

Number of codewords needed per image:

PSNR (dB) | Gray21 | Lena | Boat
28.0 | 9  | 22  | 54
30.0 | 12 | 76  | 199
33.0 | 15 | 454 | 1018

Meaning:
1. For different images, the same PSNR requires different numbers of codewords.
2. The proposed method can be used to set up an image database with uniform reconstruction quality (PSNR).

Page 14: PhDThesis, Dr Shen Furao


Unsupervised learning

Clustering
• k-means (MacQueen, 1967), ELBG (Patane, 2001), global k-means (Likas, 2003), AILBG (Shen, 2005): the number of clusters k must be determined in advance; suited only to data sets consisting of isotropic clusters
• Single-link (Sneath, 1973), complete-link (King, 1967), CURE (Guha, 1998): heavy computation and memory requirements; unsuitable for large data sets or online data

Topology learning: reflect the topology of a high-dimensional data distribution
• SOM (Kohonen, 1982): predetermined structure and size
• CHL + NG (Martinetz, 1994): a priori decision about the network size
• GNG (Fritzke, 1995): permanent increase in the number of nodes

Online learning
• GNG-U (Fritzke, 1998): destroys learned knowledge
• LLCS (Hamker, 2001): supervised learning

Page 15: PhDThesis, Dr Shen Furao


Self-organizing incremental neural network (Shen & Hasegawa, 2005)

1. Process online non-stationary data.

2. Do unsupervised learning without any prior conditions such as:

• a suitable number of nodes

• a good initial codebook

• how many classes there are

3. Report a suitable number of classes.

4. Represent the topological structure of the input probability density.

5. Separate classes with some low-density overlap.

6. Detect the main structure of clusters polluted by noise.

Page 16: PhDThesis, Dr Shen Furao


The Proposed algorithm

[Diagram: two-layer architecture. The input pattern feeds the first-layer growing network, whose output (the first output) feeds the second-layer growing network, which produces the second output. Each growing network inserts nodes, deletes nodes, and classifies.]

Page 17: PhDThesis, Dr Shen Furao


Algorithms

Insert new nodes
• Criterion: nodes with high accumulated error serve as the criterion for inserting a new node
• The error radius is used to judge whether an insertion is successful

Delete nodes
• Criterion: remove nodes lying in low-probability-density regions
• Realization: delete nodes with no or only one direct topological neighbor

Classify
• Criterion: all nodes linked by edges form one cluster (see the sketch below)
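Because the classification criterion is purely graph-based, it amounts to finding connected components of the node graph. A minimal union-find sketch (illustrative names, not the thesis code):

```python
def classify(n_nodes, edges):
    # Label nodes so that every edge-connected set forms one cluster.
    # edges: iterable of (a, b) node-index pairs.
    parent = list(range(n_nodes))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for a, b in edges:
        parent[find(a)] = find(b)         # merge the two components
    roots = sorted({find(i) for i in range(n_nodes)})
    label = {r: c for c, r in enumerate(roots)}
    return [label[find(i)] for i in range(n_nodes)]
```

For example, classify(5, [(0, 1), (3, 4)]) returns [0, 0, 1, 2, 2]: three clusters.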

Page 18: PhDThesis, Dr Shen Furao


[Flowchart: training loop of the proposed algorithm]
1. Initialize the network.
2. Input a signal and find the winner and the second winner.
3. If the input lies outside the winners' similarity thresholds, perform a between-class insertion (a new node); otherwise connect the winner and second winner and update the weights of the winner and its neighbors.
4. Each time the number of input signals reaches a multiple of the insertion period λ, perform a within-class insertion, judge whether the insertion is successful, and delete overlapped and noise nodes.
5. Each time the number of input signals reaches a multiple of LT: if training the first layer, feed its results to the second layer; otherwise output the results.
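One pass of the inner loop might look roughly like the sketch below. It is heavily condensed: it assumes Euclidean distance, fixed similarity thresholds and learning rate, and omits the error bookkeeping, threshold adaptation, and periodic insertion/deletion steps of the full algorithm; all names are illustrative.

```python
import numpy as np

def soinn_step(x, nodes, edge_age, thresholds, lr=0.1, max_age=50):
    # One online step (assumes at least two nodes already exist).
    # nodes: list of weight vectors; edge_age: dict mapping
    # frozenset({i, j}) -> age of the edge between nodes i and j.
    d = np.array([np.linalg.norm(x - w) for w in nodes])
    s1, s2 = np.argsort(d)[:2]           # winner and second winner
    # Between-class insertion: x lies outside both similarity
    # thresholds, so it starts a new node (a potential new class).
    if d[s1] > thresholds[s1] or d[s2] > thresholds[s2]:
        nodes.append(np.array(x, dtype=float))
        return
    # Connect winner and second winner; reset that edge's age.
    new_edge = frozenset((int(s1), int(s2)))
    edge_age[new_edge] = 0
    # Age the other edges at the winner, dropping stale ones.
    for e in list(edge_age):
        if int(s1) in e and e != new_edge:
            edge_age[e] += 1
            if edge_age[e] > max_age:
                del edge_age[e]
    # Pull the winner toward the input (in the full algorithm its
    # topological neighbors also move, with a smaller rate).
    nodes[s1] += lr * (x - nodes[s1])
```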

Page 19: PhDThesis, Dr Shen Furao


Experiment

Classes present in each environment:

Class | I | II | III | IV | V | VI | VII
A  | 1 | 0 | 1 | 0 | 0 | 0 | 0
B  | 0 | 1 | 0 | 1 | 0 | 0 | 0
C  | 0 | 0 | 1 | 0 | 0 | 1 | 0
D  | 0 | 0 | 0 | 1 | 1 | 0 | 0
E1 | 0 | 0 | 0 | 0 | 1 | 0 | 0
E2 | 0 | 0 | 0 | 0 | 0 | 1 | 0
E3 | 0 | 0 | 0 | 0 | 0 | 0 | 1

[Figure: original data set]

Page 20: PhDThesis, Dr Shen Furao


Experiment: Stationary environment

[Figures: original data set; GNG (Fritzke, 1995)]

Page 21: PhDThesis, Dr Shen Furao


Experiment: Stationary environment

[Figures: proposed method, first layer; proposed method, final results]

Page 22: PhDThesis, Dr Shen Furao


Experiment: Non-stationary environment

[Figures: GNG-U (Fritzke, 1998); GNG (Fritzke, 1995)]

Page 23: PhDThesis, Dr Shen Furao


Experiment: Non-stationary environment

[Figure: proposed method, first layer]

Page 24: PhDThesis, Dr Shen Furao


Experiment: Non-stationary environment

[Figure: proposed method, first layer]

Page 25: PhDThesis, Dr Shen Furao


Experiment: Non-stationary environment

[Figure: proposed method, first layer]

Page 26: PhDThesis, Dr Shen Furao


Experiment: Non-stationary environment

[Figures: proposed method, first layer; proposed method, final output]

Page 27: PhDThesis, Dr Shen Furao


Application: Face recognition

[Figure: facial images (ATT_FACE database)]
(a) 10 classes
(b) 10 samples of class 1

Page 28: PhDThesis, Dr Shen Furao


Face recognition: Feature Vector

[Figures: feature vector of (a); feature vector of (b)]

Page 29: PhDThesis, Dr Shen Furao


Face Recognition: results

10 clusters

Stationary: correct recognition ratio 90%
Non-stationary: correct recognition ratio 86%

Page 30: PhDThesis, Dr Shen Furao


Application: Vector Quantization

[Figures: original Lena (512*512*8); decoded image in the stationary environment, 130 nodes, 0.45 bpp, PSNR = 30.79 dB]

Page 31: PhDThesis, Dr Shen Furao


Vector Quantization: Compare with GNG

Stationary environment:

Method | Number of nodes | bpp | PSNR (dB)
First layer | 130 | 0.45 | 30.79
GNG (Fritzke, 1995) | 130 | 0.45 | 29.98
Second layer | 52 | 0.34 | 29.29
GNG (Fritzke, 1995) | 52 | 0.34 | 28.61

Page 32: PhDThesis, Dr Shen Furao


Vector Quantization: Non-stationary Environment

First layer: 499 nodes, 0.56 bpp, PSNR = 32.91 dB
Second layer: 64 nodes, 0.375 bpp, PSNR = 29.66 dB

Page 33: PhDThesis, Dr Shen Furao


Application: Handwritten character recognition

Optical Recognition of Handwritten Digits database (optdigits) (UCI repository, 1996)
• 10 classes (handwritten digits) written by a total of 43 people
• 30 people contributed the training set: 3823 samples
• A different 13 people contributed the test set: 1797 samples
• The dimension of each sample is 64

Method:
• Train: a separate SOINN to describe each class of data
• Test: classify an unknown data point according to whichever model gives the best match (nearest neighbor); see the sketch below
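The test rule fits in a few lines. A sketch assuming each trained SOINN has been reduced to the array of its node weight vectors (names illustrative):

```python
import numpy as np

def classify_by_best_model(x, prototypes_by_class):
    # prototypes_by_class: dict mapping a digit label to the (n_i, 64)
    # array of prototype vectors the SOINN learned for that class.
    best_label, best_d2 = None, np.inf
    for label, protos in prototypes_by_class.items():
        d2 = ((protos - x) ** 2).sum(axis=1).min()  # nearest prototype
        if d2 < best_d2:
            best_label, best_d2 = label, d2
    return best_label
```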

Page 34: PhDThesis, Dr Shen Furao


Optdigits: Comparison with 1-NN

Metric | 1-NN | Proposed (1) | Proposed (2) | Proposed (3) | Proposed (4)
Recognition ratio | 98% | 98.5% | 97.1% | 96.5% | 96.0%
No. of prototypes | 3823 | 845 | 544 | 415 | 334
Speed-up (times) | 1 | 4.53 | 7.02 | 9.21 | 11.45
Memory | 100% | 22.1% | 14.2% | 10.8% | 8.7%

Page 35: PhDThesis, Dr Shen Furao


Optdigits: Comparison with SVM

Recognition ratio (%) with a Gaussian kernel:

Traditional SVM, one-vs-all: 97.2
Traditional SVM, all-pairs: 97.4
Improved SVM (Passerini, 2002), one-vs-all: 98.2
Improved SVM (Passerini, 2002), all-pairs: 98.1
Proposed method: 98.5

Page 36: PhDThesis, Dr Shen Furao


Application: others

Humanoid robot

Scene recognition

Texture recognition

Semi-supervised learning

Page 37: PhDThesis, Dr Shen Furao


Journal papers (2003~2005)

1. Shen Furao & Osamu Hasegawa, “An adaptive incremental LBG for vector quantization,” Neural Networks, accepted.

2. Shen Furao & Osamu Hasegawa, “An incremental network for on-line unsupervised classification and topology learning,” Neural Networks, accepted.

3. Shen Furao & Osamu Hasegawa, “Fractal image coding with simulated annealing search,” Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 9, No. 1, pp. 80-88, 2005.

4. Shen Furao & Osamu Hasegawa, “A fast no search fractal image coding method,” Signal Processing: Image Communication, Vol. 19, pp. 393-404, 2004.

5. Shen Furao & Osamu Hasegawa, “A growing neural network for online unsupervised learning,” Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 8, No. 2, pp. 121-129, 2004.

Page 38: PhDThesis, Dr Shen Furao


Refereed International Conferences (2003~2005)

1. Shen Furao, Youki Kamiya & Osamu Hasegawa, “An incremental neural network for online supervised learning and topology representation,” 12th International Conference on Neural Information Processing (ICONIP 2005), Taipei, Taiwan, October 30 - November 2, 2005, accepted.

2. Shen Furao & Osamu Hasegawa, “An incremental k-means clustering algorithm with adaptive distance measure,” 12th International Conference on Neural Information Processing (ICONIP 2005), Taipei, Taiwan, October 30 - November 2, 2005, accepted.

3. Shen Furao & Osamu Hasegawa, “An on-line learning mechanism for unsupervised classification and topology representation,” IEEE Computer Society International Conference on Computer Vision and Pattern Recognition (CVPR 2005), San Diego, CA, USA, June 21-26, 2005.

4. Shen Furao & Osamu Hasegawa, “An incremental neural network for non-stationary unsupervised learning,” 11th International Conference on Neural Information Processing (ICONIP 2004), Calcutta, India, November 22-25, 2004.

5. Shen Furao & Osamu Hasegawa, “An effective fractal image coding method without search,” IEEE International Conference on Image Processing (ICIP 2004), Singapore, October 24-27, 2004.

6. Youki Kamiya, Shen Furao & Osamu Hasegawa, “Non-stop learning: a new scheme for continuous learning and recognition,” Joint 2nd SCIS and 5th ISIS, Keio University, Yokohama, Japan, September 21-24, 2004.

7. Osamu Hasegawa & Shen Furao, “A self-structurizing neural network for online incremental learning,” CD-ROM SICE Annual Conference in Sapporo, FAII-5-2, August 4-6, 2004.

8. Shen Furao & Osamu Hasegawa, “A self-organized growing network for on-line unsupervised learning,” 2004 International Joint Conference on Neural Networks (IJCNN 2004), Budapest, Hungary, CD-ROM ISBN 0-7803-8360-5, Vol.1, pp.11-16, 2004.

9. Shen Furao & Osamu Hasegawa, “A fast and less loss fractal image coding method using simulated annealing,” 7th Joint Conference on Information Science (JCIS 2003), Cary, North Carolina, USA, September 26-30, 2003.