Introduction to Probabilistic Image Processing and Bayesian Networks


Transcript of Introduction to Probabilistic Image Processing and Bayesian Networks

Page 1: Introduction to Probabilistic Image Processing and ...

1 October, 2007 — ALT&DS2007 (Sendai, Japan)

Introduction to Probabilistic Image Processing and Bayesian Networks

Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University, Sendai, Japan
[email protected]
http://www.smapip.is.tohoku.ac.jp/~kazu/

Page 2: Introduction to Probabilistic Image Processing and ...


Contents

1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Belief Propagation
5. Other Application
6. Concluding Remarks


Page 4: Introduction to Probabilistic Image Processing and ...


Markov Random Fields for Image Processing

S. Geman and D. Geman (1984): IEEE Transactions on PAMI. Image Processing for Markov Random Fields (MRF) (Simulated Annealing, Line Fields).

J. Zhang (1992): IEEE Transactions on Signal Processing. Image Processing in the EM Algorithm for Markov Random Fields (MRF) (Mean Field Methods).

Markov random fields are one of the probabilistic methods for image processing.

Page 5: Introduction to Probabilistic Image Processing and ...


Markov Random Fields for Image Processing

In Markov random fields, we have to consider not only the states with high probabilities but also those with low probabilities. We also have to estimate not only the image but also the hyperparameters of the probabilistic model. We therefore have to calculate statistical quantities repeatedly:

Hyperparameter Estimation ⇄ Statistical Quantities ⇄ Estimation of Image

We need a deterministic algorithm for calculating statistical quantities: belief propagation.

Page 6: Introduction to Probabilistic Image Processing and ...


Belief Propagation

Belief propagation was proposed in order to realize probabilistic inference systems (Pearl, 1988). It has been suggested that belief propagation has a close relationship to mean field methods in statistical mechanics (Kabashima and Saad, 1998). Generalized belief propagation was proposed on the basis of advanced mean field methods (Yedidia, Freeman and Weiss, 2000). An interpretation of generalized belief propagation has also been presented in terms of information geometry (Ikeda, T. Tanaka and Amari, 2004).

Page 7: Introduction to Probabilistic Image Processing and ...


Probabilistic Model and Belief Propagation

Examples:

Tree: Σ_{x1} Σ_{x2} Σ_{x3} Σ_{x4} Σ_{x5} A(x1,x2) B(x2,x3) C(x2,x4) D(x2,x5)

Cycle: Σ_{x1} Σ_{x2} Σ_{x3} A(x1,x2) B(x2,x3) C(x3,x1)

A function consisting of a product of functions of two variables can be assigned a graph representation.

Belief propagation gives us exact results for the statistical quantities of probabilistic models with tree graph representations. In general, belief propagation cannot give exact results for models whose graph representations contain cycles.

Page 8: Introduction to Probabilistic Image Processing and ...


Application of Belief Propagation

Turbo and LDPC codes in error correcting codes (Berrou and Glavieux: IEEE Trans. Comm., 1996; Kabashima and Saad: J. Phys. A, 2004, Topical Review). CDMA multiuser detection in mobile phone communication (Kabashima: J. Phys. A, 2003). Satisfiability (SAT) problems in computation theory (Mezard, Parisi, Zecchina: Science, 2002). Image processing (Tanaka: J. Phys. A, 2002, Topical Review; Willsky: Proceedings of the IEEE, 2002). Probabilistic inference in AI (Kappen and Wiegerinck: NIPS, 2002).

Applications of belief propagation to many problems formulated as probabilistic models with cycle graph representations have led to many successful results.

Page 9: Introduction to Probabilistic Image Processing and ...


Purpose of My Talk

Review of the formulation of probabilistic models for image processing by means of conventional statistical schemes.
Review of probabilistic image processing using the Gaussian graphical model (Gaussian Markov random field) as the most basic example.
Review of how to construct a belief propagation algorithm for image processing.


Page 11: Introduction to Probabilistic Image Processing and ...


Image Representation in Computer Vision

A digital image is defined on a set of points arranged on a square lattice. The elements of such a digital array are called pixels. We have to treat more than 100,000 pixels even in digital cameras and mobile phones.

Each pixel is indexed by its position (x, y) on the lattice; for example, 640 × 480 = 307,200 pixels.

Page 12: Introduction to Probabilistic Image Processing and ...


Image Representation in Computer Vision

256 × 256 = 65,536 pixels

At each point, the intensity of light is represented as an integer or a real number in the digital image data. A monochrome digital image is then expressed as a two-dimensional light intensity function (x, y) → f(x, y), whose value is proportional to the brightness of the image at the pixel, ranging from f(x, y) = 0 (black) to f(x, y) = 255 (white).

Page 13: Introduction to Probabilistic Image Processing and ...


Noise Reduction by Conventional Filters

Average filter example (3×3 window):

192 202 190
202 219 120
100 218 110

The centre value 219 is replaced by the average of the window, 173:

192 202 190
202 173 120
100 218 110

It is expected that probabilistic algorithms for image processing can be constructed from such aspects of conventional signal processing:

Smoothing Filters → Markov Random Fields → Probabilistic Image Processing Algorithm

A linear filter takes the sum of the products of the mask coefficients and the intensities of the pixels.
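The two smoothing filters above can be sketched in a few lines, using the 3×3 window from the slide (the helper names are my own):

```python
import statistics

# 3x3 neighbourhood around the noisy centre pixel (value 219),
# taken from the slide's example.
mask = [192, 202, 190,
        202, 219, 120,
        100, 218, 110]

def average_filter(window):
    # Linear smoothing: sum of mask coefficients (all 1/9) times intensities.
    return round(sum(window) / len(window))

def median_filter(window):
    # Non-linear smoothing: the middle value of the sorted intensities.
    return statistics.median(window)

print(average_filter(mask))  # the centre 219 is replaced by 173
print(median_filter(mask))
```

Sliding such a window over every pixel gives the full filtered image; the median filter preserves edges better because a single outlier cannot drag the middle value.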

Page 14: Introduction to Probabilistic Image Processing and ...


Bayes Formula and Bayesian Network

Bayes rule:

Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}

Pr{B} is the prior probability and Pr{B|A} is the posterior probability.

Event A is given as the observed data. Event B corresponds to the original information to be estimated. Thus the Bayes formula can be applied to the estimation of the original information from the given data.

Bayesian network (data-generating process): B → A.
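A minimal numeric illustration of the Bayes rule above; the prior and likelihood values are invented for the example:

```python
# Invented numbers: B is the binary "original information" event,
# A is the observed data.
p_B = 0.3                  # prior Pr{B}
p_A_given_B = 0.8          # likelihood Pr{A|B}
p_A_given_notB = 0.2

# Marginal Pr{A} by the law of total probability.
p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)

# Bayes rule: Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}.
p_B_given_A = p_A_given_B * p_B / p_A
print(p_B_given_A)  # ≈ 0.632
```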

Page 15: Introduction to Probabilistic Image Processing and ...


Image Restoration by Probabilistic Model

Original Image → (Transmission / Noise) → Degraded Image

Bayes formula:

Pr{Original Image | Degraded Image} = Pr{Degraded Image | Original Image} Pr{Original Image} / Pr{Degraded Image}

(Posterior = Likelihood × Prior / Marginal)

Assumption 1: The degraded image is randomly generated from the original image according to the degradation process.
Assumption 2: The original image is randomly generated according to the prior probability.

Page 16: Introduction to Probabilistic Image Processing and ...


Image Restoration by Probabilistic Model

f_i: light intensity of pixel i in the original image
g_i: light intensity of pixel i in the degraded image
r_i = (x_i, y_i): position vector of pixel i

The original and degraded images are represented by f = {f_i} and g = {g_i}, respectively.

Page 17: Introduction to Probabilistic Image Processing and ...


Probabilistic Modeling of Image Restoration

Pr{Degraded Image | Original Image} = Pr{G = g | F = f} = Π_{i=1}^{N} Ψ(g_i, f_i)

where F and G are the random fields of the original and degraded images.

Assumption 1: A given degraded image is obtained from the original image by changing the state of each pixel to another state with the same probability, independently of the other pixels.

Page 18: Introduction to Probabilistic Image Processing and ...


Probabilistic Modeling of Image Restoration

Pr{Original Image} = Pr{F = f} = Π_{(i,j)∈B} Φ(f_i, f_j)

where the product runs over all the nearest-neighbour pairs of pixels.

Assumption 2: The original image is generated according to a prior probability that consists of a product of functions defined on the neighbouring pixels.

Page 19: Introduction to Probabilistic Image Processing and ...


Prior Probability for Binary Image

Pr{F = f} = Π_{(i,j)∈B} Φ(f_i, f_j),  f_i = 0, 1

It is important how we assume the function Φ(f_i, f_j) in the prior probability. Here a nearest-neighbour pair of pixels is assigned a higher probability when the two pixels take the same state:

Φ(1,1) = Φ(0,0) > Φ(0,1) = Φ(1,0)

Page 20: Introduction to Probabilistic Image Processing and ...


Prior Probability for Binary Image

The prior probability prefers the configuration with the least number of red lines (unequal nearest-neighbour pairs).

Which state should the centre pixel take when the states of the neighbouring pixels are all fixed to white?

Page 21: Introduction to Probabilistic Image Processing and ...


Prior Probability for Binary Image

Which state should the centre pixel take when the states of the neighbouring pixels are fixed as in the figure?

The prior probability prefers the configuration with the least number of red lines.

Page 22: Introduction to Probabilistic Image Processing and ...


What happens in the case of a large number of pixels?

[Plot: covariance between the nearest-neighbour pairs of pixels versus ln p, sampled by Markov chain Monte Carlo.]

For small p the system is in a disordered state; for large p it is in an ordered state. Near the critical point there are large fluctuations, and patterns containing both ordered and disordered regions are often generated.
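The Markov chain Monte Carlo experiment described above can be sketched with a Metropolis sampler. Here the binary prior is written as Pr{f} ∝ exp(β × number of equal nearest-neighbour pairs); the lattice size, coupling values, sweep counts and periodic boundaries are my own choices:

```python
import math
import random

def sample_prior(beta, size=16, sweeps=200, seed=0):
    # Metropolis sampling from Pr{f} ∝ exp(beta * #{equal nearest-neighbour pairs}),
    # a prior that favours configurations with few unequal pairs.
    rng = random.Random(seed)
    f = [[rng.choice((0, 1)) for _ in range(size)] for _ in range(size)]
    def neighbours(y, x):
        return (((y + 1) % size, x), ((y - 1) % size, x),
                (y, (x + 1) % size), (y, (x - 1) % size))
    for _ in range(sweeps):
        for y in range(size):
            for x in range(size):
                agree = sum(f[yy][xx] == f[y][x] for yy, xx in neighbours(y, x))
                # Flipping the pixel turns agreements into disagreements,
                # changing the log-probability by beta * (4 - 2 * agree).
                if rng.random() < math.exp(min(0.0, beta * (4 - 2 * agree))):
                    f[y][x] = 1 - f[y][x]
    return f

def agreement_rate(f):
    # Fraction of equal nearest-neighbour pairs (a proxy for the covariance).
    size = len(f)
    pairs = [(y, x, (y + 1) % size, x) for y in range(size) for x in range(size)]
    pairs += [(y, x, y, (x + 1) % size) for y in range(size) for x in range(size)]
    return sum(f[a][b] == f[c][d] for a, b, c, d in pairs) / len(pairs)

# Weak coupling gives a disordered sample, strong coupling an ordered one.
print(agreement_rate(sample_prior(0.1)), agreement_rate(sample_prior(1.0)))
```

Sweeping β upward and recording the agreement rate reproduces the qualitative shape of the covariance curve in the plot.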

Page 23: Introduction to Probabilistic Image Processing and ...


Pattern near Critical Point of Prior Probability

[Plot: covariance between the nearest-neighbour pairs of pixels versus ln p; sample patterns shown near the critical point.]

We regard the patterns generated near the critical point as similar to the local patterns in real-world images.

Page 24: Introduction to Probabilistic Image Processing and ...


Bayesian Image Analysis

Pr{F = f | G = g} = Pr{G = g | F = f} Pr{F = f} / Pr{G = g}
∝ ( Π_{i∈Ω} Ψ(g_i, f_i) ) ( Π_{(i,j)∈B} Φ(f_i, f_j) )

Prior probability: Pr{F = f}. Degradation process: Pr{G = g | F = f}. Posterior probability: Pr{F = f | G = g}.

Image processing is reduced to calculations of averages, variances and covariances under the posterior probability.

Ω: set of all the pixels. B: set of all the nearest-neighbour pairs of pixels.

Page 25: Introduction to Probabilistic Image Processing and ...


Estimation of Original Image

We have several choices for estimating the restored image from the posterior probability. In each choice, the computational time is generally of exponential order in the number of pixels.

(1) Thresholded Posterior Mean (TPM) estimation:
f̂_i = arg min_{z_i} ( z_i − Σ_ζ ζ Pr{F_i = ζ | G = g} )²   (threshold the posterior mean)

(2) Maximum Posterior Marginal (MPM) estimation:
f̂_i = arg max_{z_i} Pr{F_i = z_i | G = g}

(3) Maximum A Posteriori (MAP) estimation:
f̂ = arg max_z Pr{F = z | G = g}

where the marginal is Pr{F_i = f_i | G = g} = Σ_{f \ f_i} Pr{F = f | G = g}.
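The estimators can be compared by brute-force enumeration, which is exactly what the exponential-order cost rules out for real images. Here is a sketch on a hypothetical two-pixel binary posterior (all factor values invented):

```python
from itertools import product

# Invented unnormalised posterior over two binary pixels (f1, f2):
# likelihood factors favouring the observed g, a prior factor favouring f1 == f2.
g = (0, 1)
def posterior(f):
    psi = lambda fi, gi: 0.8 if fi == gi else 0.2   # likelihood factor
    phi = 2.0 if f[0] == f[1] else 1.0              # prior factor
    return psi(f[0], g[0]) * psi(f[1], g[1]) * phi

states = list(product((0, 1), repeat=2))
Z = sum(posterior(f) for f in states)

# MAP: jointly most probable configuration.
f_map = max(states, key=posterior)

# MPM: maximise each pixel's marginal separately.
def marginal(i, v):
    return sum(posterior(f) for f in states if f[i] == v) / Z
f_mpm = tuple(max((0, 1), key=lambda v: marginal(i, v)) for i in range(2))

print(f_map, f_mpm)
```

With N pixels the state space has 2^N configurations, so both loops above become infeasible; this is the motivation for the deterministic approximations in the following sections.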

Page 26: Introduction to Probabilistic Image Processing and ...


Statistical Estimation of Hyperparameters

Pr{G = g | α, β} = Σ_z Pr{G = g | F = z, β} Pr{F = z | α}   (marginal likelihood, marginalized with respect to F)

(α̂, β̂) = arg max_{α,β} Pr{G = g | α, β}

The hyperparameters α and β are determined so as to maximize the marginal likelihood Pr{G = g | α, β}.

Page 27: Introduction to Probabilistic Image Processing and ...


Maximization of Marginal Likelihood by EM Algorithm

Marginal likelihood: Pr{G = g | α, σ} = Σ_z Pr{G = g | F = z, σ} Pr{F = z | α}

Q-function: Q(α, σ | α′, σ′, g) = Σ_z Pr{F = z | G = g, α′, σ′} ln Pr{F = z, G = g | α, σ}

E-step: compute Q(α, σ | α(t), σ(t), g)
M-step: (α(t+1), σ(t+1)) ← arg max_{α,σ} Q(α, σ | α(t), σ(t), g)

The E-step and the M-step are iterated until convergence: the EM (Expectation-Maximization) algorithm.


Page 29: Introduction to Probabilistic Image Processing and ...


Bayesian Image Analysis by Gaussian Graphical Model

Prior probability: Pr{F = f} ∝ exp( −(α/2) Σ_{(i,j)∈B} (f_i − f_j)² ),  f_i ∈ (−∞, +∞)

Patterns are generated by the Markov chain Monte Carlo method for α = 0.0001, 0.0005 and 0.0030.

B: set of all the nearest-neighbour pairs of pixels. Ω: set of all the pixels.
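Sampling from this Gaussian prior can be sketched with Gibbs sampling (one form of MCMC): under the prior, f_i conditioned on its four neighbours is Gaussian with mean equal to the neighbour average and variance 1/(4α). Periodic boundaries and the parameter values are my own choices; note the prior is improper, so the overall grey level drifts while α controls only the neighbour differences:

```python
import math
import random

def gibbs_sample_prior(alpha, size=16, sweeps=100, seed=0):
    # Gibbs sampling from Pr{f} ∝ exp(-(alpha/2) * Σ_<ij> (f_i - f_j)^2).
    # Conditioned on its 4 neighbours, f_i ~ N(neighbour average, 1/(4*alpha)).
    rng = random.Random(seed)
    f = [[0.0] * size for _ in range(size)]
    for _ in range(sweeps):
        for y in range(size):
            for x in range(size):
                nbrs = (f[(y + 1) % size][x], f[(y - 1) % size][x],
                        f[y][(x + 1) % size], f[y][(x - 1) % size])
                mean = sum(nbrs) / 4.0
                f[y][x] = rng.gauss(mean, math.sqrt(1.0 / (alpha * 4.0)))
    return f

def roughness(f):
    # Mean squared difference over nearest-neighbour pairs.
    size = len(f)
    diffs = [(f[y][x] - f[(y + 1) % size][x]) ** 2
             for y in range(size) for x in range(size)]
    diffs += [(f[y][x] - f[y][(x + 1) % size]) ** 2
              for y in range(size) for x in range(size)]
    return sum(diffs) / len(diffs)

# Larger alpha enforces smoother patterns (smaller neighbour differences).
print(roughness(gibbs_sample_prior(1.0)), roughness(gibbs_sample_prior(0.01)))
```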

Page 30: Introduction to Probabilistic Image Processing and ...


Bayesian Image Analysis by Gaussian Graphical Model

The degradation process is assumed to be additive white Gaussian noise: the degraded image is obtained by adding white Gaussian noise to the original image,

g_i − f_i ~ N(0, σ²),  f_i, g_i ∈ (−∞, +∞)

Pr{G = g | F = f} = Π_{i∈Ω} (1/√(2πσ²)) exp( −(g_i − f_i)² / (2σ²) )

Ω: set of all the pixels.

Page 31: Introduction to Probabilistic Image Processing and ...


Bayesian Image Analysis by Gaussian Graphical Model

Posterior probability (f_i, g_i ∈ (−∞, +∞)):

P(f | g, α, σ) ∝ exp( −(1/(2σ²)) Σ_{i∈Ω} (g_i − f_i)² − (α/2) Σ_{(i,j)∈B} (f_i − f_j)² )

The average of the posterior probability can be calculated by using the multi-dimensional Gaussian integral formula:

∫ z P(z | g, α, σ) dz = (I + ασ²C)⁻¹ g

where C is the N×N matrix with C_ij = 4 for i = j, C_ij = −1 for (i,j) ∈ B, and C_ij = 0 otherwise.

B: set of all the nearest-neighbour pairs of pixels. Ω: set of all the pixels.
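A small numeric check of the posterior-mean formula, written for a 1-D chain rather than a square lattice for brevity (so the diagonal of C is the node degree instead of 4, an assumption of this sketch): since the posterior is Gaussian, the mean returned by the integral formula must make the gradient of the exponent vanish.

```python
import numpy as np

# 1-D chain version of the matrix C: node degree on the diagonal,
# -1 for each nearest-neighbour pair.
n = 8
C = np.zeros((n, n))
for i in range(n - 1):
    C[i, i] += 1; C[i + 1, i + 1] += 1
    C[i, i + 1] -= 1; C[i + 1, i] -= 1

alpha, sigma2 = 0.5, 4.0
rng = np.random.default_rng(0)
g = rng.normal(size=n)

# Posterior mean from the multi-dimensional Gaussian integral formula.
m = np.linalg.solve(np.eye(n) + alpha * sigma2 * C, g)

# For a Gaussian, the mean maximises the exponent, so the gradient
# (1/sigma2)(m - g) + alpha * C m must vanish.
grad = (m - g) / sigma2 + alpha * (C @ m)
print(np.abs(grad).max())
```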

Page 32: Introduction to Probabilistic Image Processing and ...


Bayesian Image Analysis by Gaussian Graphical Model

Iteration procedure of the EM algorithm in the Gaussian graphical model:

1/α(t+1) ← (1/N) Tr[ σ(t)² C (I + α(t)σ(t)²C)⁻¹ ] + (1/N) gᵀ (I + α(t)σ(t)²C)⁻¹ C (I + α(t)σ(t)²C)⁻¹ g

σ(t+1)² ← (1/N) Tr[ σ(t)² (I + α(t)σ(t)²C)⁻¹ ] + (1/N) α(t)²σ(t)⁴ gᵀ (I + α(t)σ(t)²C)⁻¹ C² (I + α(t)σ(t)²C)⁻¹ g

The restored image is obtained from the posterior mean m(t) = (I + α(t)σ(t)²C)⁻¹ g as

f̂_i = arg min_{z_i} ( z_i − m_i(t) )²

[Plot: convergence of α(t) and σ(t) over about 100 EM iterations.]
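The whole EM iteration can be sketched with the posterior mean and covariance written out explicitly rather than through the trace formulas above (the updates are algebraically the same: the posterior is Gaussian with mean (I + ασ²C)⁻¹g and covariance σ²(I + ασ²C)⁻¹). The lattice size, initial values and toy image below are my own choices:

```python
import numpy as np

def lattice_laplacian(h, w):
    # The matrix C of the model: node degree on the diagonal,
    # -1 for each nearest-neighbour pair (i, j) in B.
    n = h * w
    C = np.zeros((n, n))
    for y in range(h):
        for x in range(w):
            i = y * w + x
            for j in ([i + 1] if x + 1 < w else []) + ([i + w] if y + 1 < h else []):
                C[i, i] += 1; C[j, j] += 1
                C[i, j] -= 1; C[j, i] -= 1
    return C

def em_restore(g, C, alpha=1e-3, sigma2=1000.0, iters=50):
    # EM for the Gaussian graphical model: the E-step expectations are
    # available in closed form because the posterior is Gaussian.
    n = len(g)
    I = np.eye(n)
    for _ in range(iters):
        Ainv = np.linalg.inv(I + alpha * sigma2 * C)
        m = Ainv @ g           # posterior mean (current restoration)
        cov = sigma2 * Ainv    # posterior covariance
        # M-step: 1/alpha <- E[f^T C f]/N,  sigma2 <- E[|g - f|^2]/N
        alpha = n / (m @ C @ m + np.trace(C @ cov))
        sigma2 = ((g - m) @ (g - m) + np.trace(cov)) / n
    return m, alpha, sigma2

# Toy experiment: smooth ramp original, heavy additive white Gaussian noise.
h = w = 8
C = lattice_laplacian(h, w)
rng = np.random.default_rng(1)
f = np.repeat(np.linspace(0.0, 40.0, h), w)   # smooth original image
g = f + rng.normal(0.0, 30.0, h * w)          # degraded image
m, alpha_hat, sigma2_hat = em_restore(g, C)
print(np.mean((g - f) ** 2), np.mean((m - f) ** 2))
```

The explicit inverse is only feasible for tiny images; the point of the belief propagation algorithms in the next section is to approximate these statistical quantities without it.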

Page 33: Introduction to Probabilistic Image Processing and ...


Image Restoration by Markov Random Field Model and Conventional Filters

MSE = (1/|Ω|) Σ_{i∈Ω} (f_i − f̂_i)²

Restoration results (MSE):

Lowpass filter: 388 (3×3), 413 (5×5)
Median filter: 486 (3×3), 445 (5×5)
Statistical method (MRF): 315

Ω: set of all the pixels.


Page 35: Introduction to Probabilistic Image Processing and ...


Graphical Representation for Tractable Models

Tractable model (tree graph), with A, B, C ∈ {T, F}:

Σ_A Σ_B Σ_C f(A,D) g(B,D) h(C,D) = ( Σ_A f(A,D) )( Σ_B g(B,D) )( Σ_C h(C,D) )

It is possible to calculate each summation independently.

Intractable model (cycle graph):

Σ_A Σ_B Σ_C f(A,B) g(B,C) h(C,A)

It is hard to calculate each summation independently.
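A quick numeric check of the two cases, with invented factor tables over {T, F} (encoded as 0/1):

```python
from itertools import product

# Invented factor tables over {T, F}, encoded as 0/1.
f = {(a, d): 1.0 + a + 2 * d for a in (0, 1) for d in (0, 1)}
g = {(b, d): 2.0 + b * d for b in (0, 1) for d in (0, 1)}
h = {(c, d): 1.5 + c for c in (0, 1) for d in (0, 1)}

D = 1  # the shared variable of the tree case, held fixed

# Tree case: the triple summation factorises into independent single sums.
brute = sum(f[a, D] * g[b, D] * h[c, D] for a, b, c in product((0, 1), repeat=3))
factored = (sum(f[a, D] for a in (0, 1))
            * sum(g[b, D] for b in (0, 1))
            * sum(h[c, D] for c in (0, 1)))

# Cycle case: f(A,B) g(B,C) h(C,A) shares each variable between two factors,
# so the sums cannot be pulled apart; only brute force is available here.
cycle = sum(f[a, b] * g[b, c] * h[c, a] for a, b, c in product((0, 1), repeat=3))
print(brute, factored, cycle)
```

With n variables the factored form costs O(n) single sums while the brute-force sum costs O(2ⁿ), which is the whole tractability gap.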

Page 36: Introduction to Probabilistic Image Processing and ...


Belief Propagation for Tree Graphical Model

Consider the tree with edges (1,2), (2,3), (2,4), (2,5). After taking the summations over the red nodes 3, 4 and 5, the function of nodes 1 and 2 can be expressed in terms of messages:

Σ_{f3} Σ_{f4} Σ_{f5} A(f1,f2) B(f2,f3) C(f2,f4) D(f2,f5)
= A(f1,f2) ( Σ_{f3} B(f2,f3) ) ( Σ_{f4} C(f2,f4) ) ( Σ_{f5} D(f2,f5) )
= A(f1,f2) M_{3→2}(f2) M_{4→2}(f2) M_{5→2}(f2)
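The message computation above can be checked numerically on the five-node star tree, with invented pairwise functions:

```python
from itertools import product

states = (0, 1)
# Invented pairwise functions on the star tree with edges 1-2, 2-3, 2-4, 2-5.
A = {(x, y): 1.0 + 0.5 * (x == y) for x in states for y in states}
B = {(x, y): 1.0 + 0.3 * x * y for x in states for y in states}
C = {(x, y): 1.0 + 0.2 * (x + y) for x in states for y in states}
D = {(x, y): 1.0 + 0.7 * (x != y) for x in states for y in states}

# Messages into node 2 from the leaves 3, 4 and 5.
M32 = {f2: sum(B[f2, f3] for f3 in states) for f2 in states}
M42 = {f2: sum(C[f2, f4] for f4 in states) for f2 in states}
M52 = {f2: sum(D[f2, f5] for f5 in states) for f2 in states}

def joint12_messages(f1, f2):
    # Function of nodes 1 and 2 expressed through the three messages.
    return A[f1, f2] * M32[f2] * M42[f2] * M52[f2]

def joint12_brute(f1, f2):
    # Same function by summing over nodes 3, 4 and 5 directly.
    return sum(A[f1, f2] * B[f2, f3] * C[f2, f4] * D[f2, f5]
               for f3, f4, f5 in product(states, repeat=3))

print(joint12_messages(0, 1), joint12_brute(0, 1))
```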

Page 37: Introduction to Probabilistic Image Processing and ...


Belief Propagation for Tree Graphical Model

By taking the summation over all the nodes except node 1, the message from node 2 to node 1 can be expressed in terms of all the messages incoming to node 2 except the one coming from node 1 itself:

M_{2→1}(f1) = Σ_{f2} A(f1,f2) M_{3→2}(f2) M_{4→2}(f2) M_{5→2}(f2)

Page 38: Introduction to Probabilistic Image Processing and ...


Loopy Belief Propagation for Graphical Model in Image Processing

The graphical model for image processing is represented in terms of the square lattice. The square lattice contains many cycles, so belief propagation is applied to the calculation of statistical quantities as an approximate algorithm. Every subgraph consisting of a pixel and its four neighbouring pixels can be regarded as a tree graph: this is the viewpoint of loopy belief propagation.

Page 39: Introduction to Probabilistic Image Processing and ...


Loopy Belief Propagation for Graphical Model in Image Processing

Message passing rule in loopy belief propagation (for pixel 2 with neighbours 1, 3, 4 and 5):

M_{2→1}(f1) = Σ_{z2} Φ(f1, z2) M_{3→2}(z2) M_{4→2}(z2) M_{5→2}(z2) / Σ_{z1} Σ_{z2} Φ(z1, z2) M_{3→2}(z2) M_{4→2}(z2) M_{5→2}(z2)

In vector form the update is a fixed-point equation M ← Ψ(M). Averages, variances and covariances of the graphical model are expressed in terms of the messages.
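For the Gaussian graphical model the loopy BP messages stay Gaussian, so each message can be stored as a (precision, mean) pair. The sketch below is my own continuous-variable version of the message passing rule above: integrating f_i out against exp(−α(f_i − f_j)²/2) turns a Gaussian with precision λ and mean μ into a message with precision αλ/(α+λ) and the same mean. A classical property of Gaussian BP is that, when it converges, the means coincide with the exact posterior means even on the loopy lattice, which the comparison at the end checks:

```python
import numpy as np

def gabp_posterior_mean(g, alpha, sigma2, h, w, iters=200):
    # Loopy Gaussian BP for
    # p(f) ∝ exp(-Σ_i (f_i-g_i)²/(2σ²) - (α/2) Σ_<ij> (f_i-f_j)²).
    def nbrs(i):
        y, x = divmod(i, w)
        out = []
        if y > 0: out.append(i - w)
        if y < h - 1: out.append(i + w)
        if x > 0: out.append(i - 1)
        if x < w - 1: out.append(i + 1)
        return out
    n = h * w
    P = {(i, j): 0.0 for i in range(n) for j in nbrs(i)}
    mu = {(i, j): 0.0 for i in range(n) for j in nbrs(i)}
    for _ in range(iters):
        newP, newmu = {}, {}
        for i in range(n):
            for j in nbrs(i):
                # Combine the node factor with all incoming messages except j's.
                lam = 1.0 / sigma2 + sum(P[k, i] for k in nbrs(i) if k != j)
                m = (g[i] / sigma2 + sum(P[k, i] * mu[k, i]
                                         for k in nbrs(i) if k != j)) / lam
                # Integrating f_i out against exp(-alpha*(f_i-f_j)²/2).
                newP[i, j] = alpha * lam / (alpha + lam)
                newmu[i, j] = m
        P, mu = newP, newmu
    mean = np.empty(n)
    for i in range(n):
        lam = 1.0 / sigma2 + sum(P[k, i] for k in nbrs(i))
        mean[i] = (g[i] / sigma2 + sum(P[k, i] * mu[k, i] for k in nbrs(i))) / lam
    return mean

# Compare with the exact posterior mean (I + alpha*sigma2*C)^{-1} g on a 4x4 lattice.
h = w = 4
rng = np.random.default_rng(3)
g = rng.normal(0.0, 1.0, h * w)
alpha, sigma2 = 0.5, 2.0
bp_mean = gabp_posterior_mean(g, alpha, sigma2, h, w)
C = np.zeros((h * w, h * w))
for i in range(h * w):
    y, x = divmod(i, w)
    for j in ([i + 1] if x + 1 < w else []) + ([i + w] if y + 1 < h else []):
        C[i, i] += 1; C[j, j] += 1; C[i, j] -= 1; C[j, i] -= 1
exact = np.linalg.solve(np.eye(h * w) + alpha * sigma2 * C, g)
print(np.abs(bp_mean - exact).max())
```

Unlike the matrix inversion of the exact method, each BP sweep touches only local neighbourhoods, so the cost per iteration is linear in the number of pixels.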

Page 40: Introduction to Probabilistic Image Processing and ...


Loopy Belief Propagation for Graphical Model in Image Processing

We have four kinds of message passing rules for each pixel. Each message passing rule includes three incoming messages and one outgoing message.

Visualizations of the passing messages.

Page 41: Introduction to Probabilistic Image Processing and ...


EM algorithm by means of Belief Propagation

Input: degraded image g. Output: restored image f̂.

Each EM update for the hyperparameters requires statistical quantities of the posterior, which are supplied by the loopy BP fixed point:

(α(t+1), σ(t+1)) ← arg max_{α,σ} Q(α, σ | α(t), σ(t), g)

Page 42: Introduction to Probabilistic Image Processing and ...


Probabilistic Image Processing by EM Algorithm and Loopy BP for Gaussian Graphical Model

Iteration (α(t+1), σ(t+1)) ← arg max_{α,σ} Q(α, σ | α(t), σ(t), g), with the statistical quantities computed either exactly or by loopy belief propagation:

Loopy Belief Propagation: σ̂ = 36.335, α̂ = 0.000600, MSE: 327
Exact: σ̂ = 37.624, α̂ = 0.000713, MSE: 315

[Plot: convergence of α(t) and σ(t) over about 100 EM iterations.]


Page 44: Introduction to Probabilistic Image Processing and ...


Digital Image Inpainting based on MRF

Input → Markov Random Fields → Output

M. Yasuda, J. Ohkubo and K. Tanaka: Proceedings of CIMCA&IAWTIC 2005.


Page 46: Introduction to Probabilistic Image Processing and ...


Summary

The formulation of probabilistic models for image processing by means of conventional statistical schemes has been summarized. Probabilistic image processing using the Gaussian graphical model has been shown as the most basic example. It has been explained how to construct a belief propagation algorithm for image processing.

Page 47: Introduction to Probabilistic Image Processing and ...


Statistical Mechanics Informatics for Probabilistic Image Processing

S. Geman and D. Geman (1984): IEEE Transactions on PAMI. Image Processing for Markov Random Fields (MRF) (Simulated Annealing, Line Fields).

J. Zhang (1992): IEEE Transactions on Signal Processing. Image Processing in the EM Algorithm for Markov Random Fields (MRF) (Mean Field Methods).

K. Tanaka and T. Morita (1995): Physics Letters A. Cluster Variation Method for MRF in Image Processing.

The original ideas of several of these techniques, namely simulated annealing, mean field methods and belief propagation, come from statistical mechanics.

The mathematical structure of belief propagation is equivalent to the Bethe approximation and the cluster variation method (Kikuchi method), which are advanced mean field methods in statistical mechanics.

Page 48: Introduction to Probabilistic Image Processing and ...


Statistical Mechanical Informatics for Probabilistic Information Processing

It has been suggested that statistical performance estimation for probabilistic information processing is closely related to spin glass theory. The computational techniques of spin glass theory have been applied to many problems in computer science:

Error Correcting Codes (Y. Kabashima and D. Saad: J. Phys. A, 2004, Topical Review). CDMA Multiuser Detection in Mobile Phone Communication (T. Tanaka: IEEE Transactions on Information Theory, 2002). SAT Problems (Mezard, Parisi, Zecchina: Science, 2002). Image Processing (K. Tanaka: J. Phys. A, 2002, Topical Review).

Page 49: Introduction to Probabilistic Image Processing and ...


SMAPIP Project

MEXT Grant-in-Aid for Scientific Research on Priority Areas
Period: 2002-2005. Head Investigator: Kazuyuki Tanaka
Webpage URL: http://www.smapip.eei.metro-u.ac.jp./
Members: K. Tanaka, Y. Kabashima, H. Nishimori, T. Tanaka, M. Okada, O. Watanabe, N. Murata, ......

Page 50: Introduction to Probabilistic Image Processing and ...


DEX-SMI Project (Deepening and Expansion of Statistical Mechanical Informatics)

http://dex-smi.sp.dis.titech.ac.jp/DEX-SMI/

MEXT Grant-in-Aid for Scientific Research on Priority Areas
Period: 2006-2009. Head Investigator: Yoshiyuki Kabashima

Page 51: Introduction to Probabilistic Image Processing and ...


References

K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), J. Phys. A, 35 (2002).

A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of the IEEE, 90 (2002).