1 October, 2007 ALT&DS2007 (Sendai, Japan)
Introduction to Probabilistic Image Processing and Bayesian Networks

Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University, Sendai, Japan
kazu@smapip.is.tohoku.ac.jp
http://www.smapip.is.tohoku.ac.jp/~kazu/
Contents

Introduction
Probabilistic Image Processing
Gaussian Graphical Model
Belief Propagation
Other Application
Concluding Remarks
Markov Random Fields for Image Processing
S. Geman and D. Geman (1986): IEEE Transactions on PAMI
Image Processing for Markov Random Fields (MRF) (Simulated Annealing, Line Fields)

J. Zhang (1992): IEEE Transactions on Signal Processing
Image Processing in EM Algorithm for Markov Random Fields (MRF) (Mean Field Methods)
Markov Random Fields are one of the probabilistic methods for image processing.
Markov Random Fields for Image Processing
In Markov Random Fields, we have to consider not only the states with high probabilities but also ones with low probabilities.
In Markov Random Fields, we have to estimate not only the image but also the hyperparameters in the probabilistic model.
We have to perform the calculations of statistical quantities repeatedly:
Hyperparameter Estimation → Statistical Quantities → Estimation of Image.

We need a deterministic algorithm for calculating statistical quantities: Belief Propagation.
Belief Propagation
Belief Propagation has been proposed in order to achieve probabilistic inference systems (Pearl, 1988).
It has been suggested that Belief Propagation has a close relationship to Mean Field Methods in statistical mechanics (Kabashima and Saad, 1998).
Generalized Belief Propagation has been proposed based on Advanced Mean Field Methods (Yedidia, Freeman and Weiss, 2000).
An interpretation of Generalized Belief Propagation has also been presented in terms of Information Geometry (Ikeda, T. Tanaka and Amari, 2004).
Probabilistic Model and Belief Propagation
A function consisting of a product of functions with two variables can be assigned to a graph representation.

Examples:

Tree (nodes x_1, ..., x_5):
  Σ_{x_1} Σ_{x_2} Σ_{x_3} Σ_{x_4} Σ_{x_5} A(x_1,x_2) B(x_2,x_3) C(x_2,x_4) D(x_2,x_5)

Cycle (nodes x_1, x_2, x_3):
  Σ_{x_1} Σ_{x_2} Σ_{x_3} A(x_1,x_2) B(x_2,x_3) C(x_3,x_1)
Belief Propagation can give us an exact result for the calculation of statistical quantities of probabilistic models with tree graph representations.
Generally, Belief Propagation cannot give us an exact result for the calculation of statistical quantities of probabilistic models with cycle graph representations.
Application of Belief Propagation
Turbo and LDPC codes in Error Correcting Codes (Berrou and Glavieux: IEEE Trans. Comm., 1996; Kabashima and Saad: J. Phys. A, 2004, Topical Review).
CDMA Multiuser Detection in Mobile Phone Communication (Kabashima: J. Phys. A, 2003).
Satisfiability (SAT) Problems in Computation Theory (Mezard, Parisi, Zecchina: Science, 2002).
Image Processing (Tanaka: J. Phys. A, 2002, Topical Review; Willsky: Proceedings of IEEE, 2002).
Probabilistic Inference in AI (Kappen and Wiegerinck: NIPS, 2002).

Applications of belief propagation to many problems which are formulated as probabilistic models with cycle graph representations have led to many successful results.
Purpose of My Talk
Review of the formulation of a probabilistic model for image processing by means of conventional statistical schemes.
Review of probabilistic image processing by using the Gaussian graphical model (Gaussian Markov Random Fields) as the most basic example.
Review of how to construct a belief propagation algorithm for image processing.
Image Representation in Computer Vision
A digital image is defined on a set of points arranged on a square lattice.
The elements of such a digital array are called pixels.
We have to treat more than 100,000 pixels even in digital cameras and mobile phones.

Each pixel is indexed by its position (x, y) on the lattice.
For example, a 640 × 480 image has 640 × 480 = 307,200 pixels.
Image Representation in Computer Vision
A 256 × 256 image has 256 × 256 = 65,536 pixels.

At each point, the intensity of light is represented as an integer number or a real number in the digital image data.
A monochrome digital image is then expressed as a two-dimensional light intensity function (x, y) → f(x, y), and the value is proportional to the brightness of the image at the pixel; for example, f(x, y) = 0 for black and f(x, y) = 255 for white.
Noise Reduction by Conventional Filters

Smoothing Filters: the function of a linear filter is to take the sum of the products of the mask coefficients and the intensities of the pixels.

For example, a 3 × 3 averaging filter replaces the center value 219 of the window

  192 202 190
  202 219 120
  100 218 110

by the average of the nine values, 173:

  192 202 190
  202 173 120
  100 218 110

It is expected that probabilistic algorithms for image processing can be constructed from such aspects of conventional signal processing:
Markov Random Fields → Probabilistic Image Processing Algorithm.
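The averaging operation described above can be sketched in a few lines (a minimal illustration; the function name is ours, not from the talk):

```python
def average_filter_center(window):
    """Replace the center of a 3x3 window by the rounded mean of all nine values."""
    values = [v for row in window for v in row]
    return round(sum(values) / len(values))

window = [[192, 202, 190],
          [202, 219, 120],
          [100, 218, 110]]
print(average_filter_center(window))  # 173: the center value 219 is replaced
```

Sliding this window over every pixel of the image gives the lowpass-filtered result compared against the MRF method later in the talk.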
Bayes Formula and Bayesian Network
Posterior Probability

Bayes Rule:

  Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}

where Pr{B} is the prior probability.

Event A is given as the observed data.
Event B corresponds to the original information to estimate.
Thus the Bayes formula can be applied to the estimation of the original information from the given data.

Bayesian Network (data-generating process): B → A.
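As a minimal numeric illustration of the rule (the probabilities below are made up for the example, not taken from the talk):

```python
# Bayes rule: Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}, where Pr{A} is obtained by
# summing Pr{A|B} Pr{B} over both values of the binary event B.
prior = {True: 0.4, False: 0.6}        # Pr{B}, illustrative numbers
likelihood = {True: 0.9, False: 0.2}   # Pr{A|B}

pr_A = sum(likelihood[b] * prior[b] for b in prior)
posterior = {b: likelihood[b] * prior[b] / pr_A for b in prior}
# posterior[True] is Pr{B=True | A} = 0.9 * 0.4 / 0.48 = 0.75
```

The same mechanics, with images in place of binary events, underlies the restoration formula on the next slide.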
Image Restoration by Probabilistic Model
Original Image → (Transmission with Noise) → Degraded Image

Bayes Formula:

  Pr{Original Image | Degraded Image}
    = Pr{Degraded Image | Original Image} Pr{Original Image} / Pr{Degraded Image}

(Posterior = Likelihood × Prior / Marginal.)

Assumption 1: The degraded image is randomly generated from the original image according to the degradation process.
Assumption 2: The original image is randomly generated according to the prior probability.
Image Restoration by Probabilistic Model
The original image and the degraded image are represented by f = {f_i} and g = {g_i}, respectively:
f_i is the light intensity of pixel i in the original image,
g_i is the light intensity of pixel i in the degraded image,
and r_i = (x_i, y_i) is the position vector of pixel i.
Probabilistic Modeling of Image Restoration

Assumption 1: A given degraded image is obtained from the original image by changing the state of each pixel to another state with the same probability, independently of the other pixels:

  Pr{Degraded Image | Original Image} = Pr{G = g | F = f} = Π_{i=1}^{N} Ψ(g_i, f_i)

where F and G are the random fields of the original and degraded images.
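For binary pixels, a common concrete choice of such a pixel-wise degradation process (a sketch; the talk does not fix this particular Ψ) is the binary symmetric channel, where each pixel flips with probability p:

```python
def bsc_likelihood(g, f, p):
    """Pr{G=g | F=f} when each binary pixel flips independently with probability p."""
    assert len(g) == len(f)
    prob = 1.0
    for gi, fi in zip(g, f):
        prob *= p if gi != fi else (1.0 - p)
    return prob

f = [0, 1, 1, 0]
g = [0, 1, 0, 0]                      # one pixel flipped
value = bsc_likelihood(g, f, 0.1)     # 0.1 * 0.9**3, approximately 0.0729
```

Because the factors are independent across pixels, the likelihood is exactly the product form Π_i Ψ(g_i, f_i) above.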
Probabilistic Modeling of Image Restoration

Assumption 2: The original image is generated according to a prior probability. The prior probability consists of a product of functions defined on the neighbouring pixels:

  Pr{Original Image} = Pr{F = f} = Π_{ij ∈ B} Φ(f_i, f_j)

where the product is taken over all the nearest-neighbour pairs of pixels ij.
Prior Probability for Binary Image
It is important how we should choose the function Φ(f_i, f_j) in the prior probability

  Pr{F = f} = Π_{ij: Nearest Neighbour} Φ(f_i, f_j),    f_i = 0, 1.

We assume that every nearest-neighbour pair of pixels tends to take the same state in the prior probability:

  Φ(1,1) = Φ(0,0) > Φ(1,0) = Φ(0,1).
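A minimal sketch of such a pairwise prior weight on a 1-D chain of binary pixels (the specific form exp(β·[f_i = f_j]) is an illustrative choice satisfying Φ(1,1) = Φ(0,0) > Φ(1,0), not the talk's exact parametrization):

```python
import math

def prior_weight(f, beta=1.0):
    """Unnormalized prior: product over neighbouring pairs of exp(beta) if the
    pair agrees and 1 if it disagrees, so smooth configurations score higher."""
    w = 1.0
    for a, b in zip(f, f[1:]):          # nearest-neighbour pairs on a chain
        w *= math.exp(beta) if a == b else 1.0
    return w

print(prior_weight([0, 0, 0, 0]) > prior_weight([0, 1, 0, 1]))  # True
```

Dividing by the sum of the weights over all configurations would give the normalized prior probability.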
Prior Probability for Binary Image
The prior probability prefers the configuration with the least number of disagreeing nearest-neighbour pairs (red lines).

Which state should the center pixel take when the states of the neighbouring pixels are fixed to the white state? The white state, since it produces no disagreeing pairs.
Prior Probability for Binary Image
Which state should the center pixel take when the states of the neighbouring pixels are fixed as in the figure?

The prior probability prefers the configuration with the least number of disagreeing nearest-neighbour pairs (red lines).
What happens for the case of a large number of pixels?

[Figure: covariance between the nearest-neighbour pairs of pixels, plotted against ln p, estimated by sampling with Markov chain Monte Carlo.]

Small p: Disordered State.
Large p: Ordered State.
Near the Critical Point: large fluctuation; patterns with both ordered states and disordered states are often generated.
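Sampling from such a pairwise prior can be sketched with a Gibbs-type Markov chain Monte Carlo sweep on a small binary lattice (an illustrative implementation; the coupling parametrization and periodic boundary are our assumptions):

```python
import math, random

def gibbs_sample_prior(n, beta, sweeps, seed=0):
    """Gibbs sampler for a binary MRF prior on an n x n lattice where each
    agreeing nearest-neighbour pair contributes a factor exp(beta)."""
    rng = random.Random(seed)
    f = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for x in range(n):
            for y in range(n):
                # Conditional Pr{f_xy = 1 | neighbours} from the pairwise factors.
                nbrs = [f[(x + dx) % n][(y + dy) % n]
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                w1 = math.exp(beta * sum(1 for v in nbrs if v == 1))
                w0 = math.exp(beta * sum(1 for v in nbrs if v == 0))
                f[x][y] = 1 if rng.random() < w1 / (w0 + w1) else 0
    return f

sample = gibbs_sample_prior(16, beta=1.0, sweeps=50)
```

Sweeping with small β yields disordered samples, large β yields nearly uniform ones, and values near the critical coupling produce the mixed patterns described above.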
Pattern near Critical Point of Prior Probability

[Figure: covariance between the nearest-neighbour pairs of pixels against ln p, with sample patterns for small p and large p.]

We regard the patterns generated near the critical point as similar to the local patterns in real-world images.
Bayesian Image Analysis
Original Image f: generated according to the Prior Probability Pr{F = f}.
Degraded Image g: generated according to the Degradation Process Pr{G = g | F = f}.

Posterior Probability:

  Pr{F = f | G = g} = Pr{G = g | F = f} Pr{F = f} / Pr{G = g}
                    ∝ Π_{i=1}^{N} Ψ(g_i, f_i) Π_{ij ∈ B} Φ(f_i, f_j)

where Ω is the set of all the pixels and B is the set of all the nearest-neighbour pairs of pixels.

Image processing is reduced to calculations of averages, variances and covariances in the posterior probability.
Estimation of Original Image
We have some choices to estimate the restored image from the posterior probability.

(1) Thresholded Posterior Mean (TPM) estimation: compute the posterior mean at each pixel, m_i = Σ_z z_i Pr{F = z | G = g}, and threshold it to the nearest admissible value:

  f̂_i = argmin_{z_i} (z_i − m_i)².

(2) Maximum Posterior Marginal (MPM) estimation:

  f̂_i = argmax_{z_i} Pr{F_i = z_i | G = g},
  Pr{F_i = f_i | G = g} = Σ_{f \ f_i} Pr{F = f | G = g}.

(3) Maximum A Posteriori (MAP) estimation:

  f̂ = argmax_z Pr{F = z | G = g}.

In each choice, the computational time is generally of exponential order in the number of pixels.
Statistical Estimation of Hyperparameters
Marginal Likelihood (marginalized with respect to F):

  Pr{G = g | α, β} = Σ_z Pr{G = g | F = z, β} Pr{F = z | α}

where Pr{F = f | α} is the prior for the original image and Pr{G = g | F = f, β} is the degradation process producing the degraded image.

The hyperparameters α and β are determined so as to maximize the marginal likelihood Pr{G = g | α, β} with respect to α and β:

  (α̂, β̂) = argmax_{α,β} Pr{G = g | α, β}.
Maximization of Marginal Likelihood by EM Algorithm

Marginal Likelihood:

  Pr{G = g | α, β} = Σ_z Pr{G = g | F = z, β} Pr{F = z | α}

Q-Function:

  Q(α, β | α', β', g) = Σ_z Pr{F = z | G = g, α', β'} ln Pr{F = z, G = g | α, β}

E-step and M-step are iterated until convergence (EM, Expectation-Maximization, Algorithm):

  E-step: compute Q(α, β | α(t), β(t), g)
  M-step: (α(t+1), β(t+1)) = argmax_{α,β} Q(α, β | α(t), β(t), g)
Bayesian Image Analysis by Gaussian Graphical Model

Prior Probability:

  Pr{F = f | α} ∝ exp( − (α/2) Σ_{ij ∈ B} (f_i − f_j)² ),    f_i ∈ (−∞, +∞)

where B is the set of all the nearest-neighbour pairs of pixels and Ω is the set of all the pixels.

[Figure: patterns generated by the Markov Chain Monte Carlo Method for α = 0.0001, 0.0005, 0.0030.]
Bayesian Image Analysis by Gaussian Graphical Model

The degradation process is assumed to be additive white Gaussian noise: the degraded image is obtained by adding white Gaussian noise to the original image,

  g_i − f_i ~ N(0, σ²),    f_i, g_i ∈ (−∞, +∞),

so that

  Pr{G = g | F = f, σ} = Π_{i ∈ Ω} (1/√(2πσ²)) exp( − (g_i − f_i)²/(2σ²) )

where Ω is the set of all the pixels.

[Figure: original image f, Gaussian noise n, degraded image g = f + n; histogram of Gaussian random numbers.]
Bayesian Image Analysis by Gaussian Graphical Model

Posterior Probability:

  P(f | g, α, σ) ∝ exp( − (1/2σ²) Σ_{i ∈ Ω} (f_i − g_i)² − (α/2) Σ_{ij ∈ B} (f_i − f_j)² )

The average of the posterior probability can be calculated by using the multi-dimensional Gaussian integral formula:

  ẑ = ∫ z P(z | g, α, σ) dz = (I + ασ²C)⁻¹ g

where I is the N×N identity matrix and C is the N×N matrix with

  C_ij = 4 (i = j),  −1 (ij ∈ B),  0 (otherwise).

(B: set of all the nearest-neighbour pairs of pixels; Ω: set of all the pixels.)
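The posterior mean above is just the solution of a linear system, which can be sketched on a small 1-D chain (a simplification of the talk's 2-D lattice; on a chain the interior diagonal of C is 2 rather than 4):

```python
import numpy as np

def posterior_mean(g, alpha, sigma2):
    """Posterior mean m = (I + alpha*sigma2*C)^(-1) g for a Gaussian graphical
    model on a 1-D chain, where C is the graph Laplacian of the chain."""
    n = len(g)
    C = np.zeros((n, n))
    for i in range(n - 1):              # nearest-neighbour pairs (i, i+1)
        C[i, i] += 1.0
        C[i + 1, i + 1] += 1.0
        C[i, i + 1] -= 1.0
        C[i + 1, i] -= 1.0
    A = np.eye(n) + alpha * sigma2 * C
    return np.linalg.solve(A, np.asarray(g, dtype=float))

g = [0.0, 10.0, 0.0, 0.0, 0.0]
m = posterior_mean(g, alpha=1.0, sigma2=1.0)
# The isolated spike at pixel 1 is smoothed toward its neighbours.
```

Since the rows of the Laplacian C sum to zero, the total intensity Σ_i m_i equals Σ_i g_i; the smoothing only redistributes it.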
Bayesian Image Analysis by Gaussian Graphical Model

The EM updates can be written in closed form by means of the multi-dimensional Gaussian integral formula. With the posterior mean

  m(t) = (I + α(t)σ(t)²C)⁻¹ g,

the M-step gives

  1/α(t+1) = (1/N) [ σ(t)² Tr C(I + α(t)σ(t)²C)⁻¹ + m(t)ᵀ C m(t) ]
  σ(t+1)² = (1/N) [ σ(t)² Tr (I + α(t)σ(t)²C)⁻¹ + ‖g − m(t)‖² ]

and the restored image is obtained from the final posterior mean by

  f̂_i = argmin_{z_i} (z_i − m_i(t))².

[Figure: iteration procedure of the EM algorithm in the Gaussian graphical model; α(t) and σ(t) plotted against the iteration number t.]
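Under the same 1-D chain simplification (illustrative; C here is the chain Laplacian rather than the 2-D lattice matrix of the talk), the EM iteration can be sketched as:

```python
import numpy as np

def em_gaussian(g, alpha0=1e-3, sigma2_0=1.0, iters=50):
    """EM updates for the hyperparameters of a Gaussian graphical model on a
    1-D chain: alpha (prior coupling) and sigma^2 (noise variance)."""
    g = np.asarray(g, dtype=float)
    n = len(g)
    C = np.zeros((n, n))
    for i in range(n - 1):              # chain Laplacian
        C[i, i] += 1.0; C[i + 1, i + 1] += 1.0
        C[i, i + 1] -= 1.0; C[i + 1, i] -= 1.0
    alpha, sigma2 = alpha0, sigma2_0
    for _ in range(iters):
        Ainv = np.linalg.inv(np.eye(n) + alpha * sigma2 * C)
        m = Ainv @ g                    # posterior mean at (alpha, sigma2)
        # M-step: both updates use the statistics of the current E-step.
        alpha = n / (sigma2 * np.trace(C @ Ainv) + m @ C @ m)
        sigma2 = (sigma2 * np.trace(Ainv) + np.sum((g - m) ** 2)) / n
    return alpha, sigma2, m
```

Each pass solves the Gaussian posterior exactly and then re-estimates both hyperparameters, mirroring the fixed-point iteration plotted on the slide.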
Image Restoration by Markov Random Field Model and Conventional Filters

  MSE = (1/|Ω|) Σ_{i ∈ Ω} (f_i − f̂_i)²

where Ω is the set of all the pixels.

  Method                     MSE
  Statistical Method (MRF)   315
  Lowpass Filter (3x3)       388
  Lowpass Filter (5x5)       413
  Median Filter (3x3)        486
  Median Filter (5x5)        445

[Figure: original image, degraded image, and restored images by MRF, (3x3) lowpass, and (5x5) median.]
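The MSE above is straightforward to compute (a minimal sketch over flattened pixel lists):

```python
def mse(original, restored):
    """Mean squared error (1/|Omega|) * sum over pixels of (f_i - fhat_i)^2."""
    assert len(original) == len(restored)
    return sum((a - b) ** 2 for a, b in zip(original, restored)) / len(original)

print(mse([10, 20, 30], [10, 22, 27]))  # (0 + 4 + 9) / 3 = 4.333...
```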
Graphical Representation for Tractable Models

Tractable Model (Tree Graph):

  Σ_{A=T,F} Σ_{B=T,F} Σ_{C=T,F} f(A,D) g(B,D) h(C,D)
    = ( Σ_{A=T,F} f(A,D) ) ( Σ_{B=T,F} g(B,D) ) ( Σ_{C=T,F} h(C,D) )

It is possible to calculate each summation independently.

Intractable Model (Cycle Graph):

  Σ_{A=T,F} Σ_{B=T,F} Σ_{C=T,F} f(A,B) g(B,C) h(C,A)

It is hard to calculate each summation independently.
Belief Propagation for Tree Graphical Model

[Figure: the five-node tree; node 2 is linked to nodes 1, 3, 4 and 5, and the messages M_{3→2}, M_{4→2}, M_{5→2} flow into node 2.]

After taking the summations over the red nodes 3, 4 and 5, the function of nodes 1 and 2 can be expressed in terms of some messages:

  Σ_{f_3} Σ_{f_4} Σ_{f_5} A(f_1,f_2) B(f_2,f_3) C(f_2,f_4) D(f_2,f_5)
    = A(f_1,f_2) ( Σ_{f_3} B(f_2,f_3) ) ( Σ_{f_4} C(f_2,f_4) ) ( Σ_{f_5} D(f_2,f_5) )
    = A(f_1,f_2) M_{3→2}(f_2) M_{4→2}(f_2) M_{5→2}(f_2)
Belief Propagation for Tree Graphical Model

By taking the summation over all the nodes except node 1, the message from node 2 to node 1 can be expressed in terms of all the messages incoming to node 2 except its own message:

  M_{2→1}(f_1) = Σ_{f_2} A(f_1,f_2) M_{3→2}(f_2) M_{4→2}(f_2) M_{5→2}(f_2)
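The message computation above can be sketched for the five-node tree (binary variables; the numeric pair functions are illustrative) and checked against the brute-force summation:

```python
import itertools

# Five-node tree: node 2 is linked to nodes 1, 3, 4 and 5.  The pair
# functions A, B, C, D on binary variables are illustrative.
A = {(a, b): 1.5 if a == b else 1.0 for a in (0, 1) for b in (0, 1)}
B = C = D = A

def leaf_message(pair):
    """M_{k->2}(f2) = sum over f_k of pair(f2, f_k): summing out a leaf node."""
    return {f2: sum(pair[(f2, fk)] for fk in (0, 1)) for f2 in (0, 1)}

M32, M42, M52 = leaf_message(B), leaf_message(C), leaf_message(D)

# Message from node 2 to node 1 collects everything behind node 2.
M21 = {f1: sum(A[(f1, f2)] * M32[f2] * M42[f2] * M52[f2] for f2 in (0, 1))
       for f1 in (0, 1)}

# Same quantity by brute-force summation over f2, f3, f4, f5.
brute = {f1: sum(A[(f1, f2)] * B[(f2, f3)] * C[(f2, f4)] * D[(f2, f5)]
                 for f2, f3, f4, f5 in itertools.product((0, 1), repeat=4))
         for f1 in (0, 1)}
assert all(abs(M21[f1] - brute[f1]) < 1e-12 for f1 in (0, 1))  # exact on a tree
```

The assertion holds exactly, illustrating the slide's point that belief propagation is exact on tree graphs.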
Loopy Belief Propagation for Graphical Model in Image Processing

The graphical model for image processing is represented in terms of the square lattice.
The square lattice includes a lot of cycles.
Belief propagation is applied to the calculation of statistical quantities as an approximate algorithm.

Loopy Belief Propagation: every subgraph consisting of a pixel and its four neighbouring pixels can be regarded as a tree graph.
Loopy Belief Propagation for Graphical Model in Image Processing

Message Passing Rule in Loopy Belief Propagation:

  M_{2→1}(z_1) =
      Σ_{z_2} Φ(z_1, z_2) Ψ(g_2, z_2) M_{3→2}(z_2) M_{4→2}(z_2) M_{5→2}(z_2)
    / Σ_{z_1} Σ_{z_2} Φ(z_1, z_2) Ψ(g_2, z_2) M_{3→2}(z_2) M_{4→2}(z_2) M_{5→2}(z_2)

where nodes 3, 4 and 5 are the neighbours of node 2 other than node 1. The set of messages is determined as the solution of the fixed-point equation M = M(M).

Averages, variances and covariances of the graphical model are expressed in terms of messages.
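A minimal illustration of this fixed-point iteration on a graph with a cycle (a 3-node cycle with illustrative binary pair functions and a unary bias, standing in for the lattice model), showing that loopy BP yields an approximate marginal:

```python
import itertools
import math

# 3-node cycle with pairwise function phi and a unary bias psi on node 0
# (the numeric values are illustrative).
phi = {(a, b): 1.6 if a == b else 1.0 for a in (0, 1) for b in (0, 1)}
psi = {0: {0: 1.2, 1: 0.8}, 1: {0: 1.0, 1: 1.0}, 2: {0: 1.0, 1: 1.0}}
neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# Messages M[(j, i)][z_i]: message from node j to node i, initialized uniform.
M = {(j, i): {0: 0.5, 1: 0.5} for j in neighbours for i in neighbours[j]}

for _ in range(100):  # iterate the normalized update rule to a fixed point
    new = {}
    for (j, i) in M:
        raw = {zi: sum(phi[(zi, zj)] * psi[j][zj]
                       * math.prod(M[(k, j)][zj] for k in neighbours[j] if k != i)
                       for zj in (0, 1))
               for zi in (0, 1)}
        norm = raw[0] + raw[1]
        new[(j, i)] = {z: raw[z] / norm for z in (0, 1)}
    M = new

# Approximate (loopy BP) marginal of node 0 from its incoming messages.
raw0 = {z: psi[0][z] * math.prod(M[(j, 0)][z] for j in neighbours[0]) for z in (0, 1)}
b0 = {z: raw0[z] / (raw0[0] + raw0[1]) for z in (0, 1)}

# Exact marginal of node 0 by brute force over the 8 configurations.
w = {c: psi[0][c[0]] * phi[(c[0], c[1])] * phi[(c[1], c[2])] * phi[(c[2], c[0])]
     for c in itertools.product((0, 1), repeat=3)}
exact0 = sum(v for c, v in w.items() if c[0] == 0) / sum(w.values())
# b0[0] is close to, but not exactly equal to, exact0 (about 0.60 here):
# the cycle makes the bias at node 0 circulate back and be counted again.
```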
Loopy Belief Propagation for Graphical Model in Image Processing

We have four kinds of message passing rules for each pixel.
Each message passing rule includes 3 incoming messages and 1 outgoing message.

[Figure: visualizations of passing messages on the square lattice.]
EM Algorithm by means of Belief Propagation

EM Algorithm for Hyperparameter Estimation: Input → Loopy BP → EM Update Rule of Loopy BP → Output, where the M-step is

  (α(t+1), β(t+1)) = argmax_{α,β} Q(α, β | α(t), β(t), g)

and the expectations required in the Q-function are computed by Loopy Belief Propagation.
Probabilistic Image Processing by EM Algorithm and Loopy BP for Gaussian Graphical Model

  (α(t+1), σ(t+1)) = argmax_{α,σ} Q(α, σ | α(t), σ(t), g)

Results for the degraded image g and the restored image f̂:

  Loopy Belief Propagation: α̂_LBP = 0.000600, σ̂_LBP = 36.335, MSE: 327
  Exact: α̂_Exact = 0.000713, σ̂_Exact = 37.624, MSE: 315

[Figure: α(t) and σ(t) plotted against the iteration number t.]
Digital Image Inpainting based on MRF

Input → Markov Random Fields → Output

M. Yasuda, J. Ohkubo and K. Tanaka: Proceedings of CIMCA&IAWTIC 2005.
Summary
The formulation of a probabilistic model for image processing by means of conventional statistical schemes has been summarized.
Probabilistic image processing by using the Gaussian graphical model has been shown as the most basic example.
It has been explained how to construct a belief propagation algorithm for image processing.
Statistical Mechanics Informatics for Probabilistic Image Processing

S. Geman and D. Geman (1986): IEEE Transactions on PAMI
Image Processing for Markov Random Fields (MRF) (Simulated Annealing, Line Fields)

J. Zhang (1992): IEEE Transactions on Signal Processing
Image Processing in EM Algorithm for Markov Random Fields (MRF) (Mean Field Methods)

K. Tanaka and T. Morita (1995): Physics Letters A
Cluster Variation Method for MRF in Image Processing

The original ideas of some techniques, Simulated Annealing, Mean Field Methods and Belief Propagation, are often based on statistical mechanics.

The mathematical structure of Belief Propagation is equivalent to the Bethe Approximation and the Cluster Variation Method (Kikuchi Method), which are among the advanced mean-field methods in statistical mechanics.
Statistical Mechanical Informatics for Probabilistic Information Processing

It has been suggested that statistical performance estimations for probabilistic information processing are closely related to spin glass theory.
The computational techniques of spin glass theory have been applied to many problems in computer science:

Error Correcting Codes (Y. Kabashima and D. Saad: J. Phys. A, 2004, Topical Review).
CDMA Multiuser Detection in Mobile Phone Communication (T. Tanaka: IEEE Information Theory, 2002).
SAT Problems (Mezard, Parisi, Zecchina: Science, 2002).
Image Processing (K. Tanaka: J. Phys. A, 2002, Topical Review).
SMAPIP Project

MEXT Grant-in-Aid for Scientific Research on Priority Areas
Period: 2002 – 2005
Head Investigator: Kazuyuki Tanaka
Webpage URL: http://www.smapip.eei.metro-u.ac.jp./
Members: K. Tanaka, Y. Kabashima, H. Nishimori, T. Tanaka, M. Okada, O. Watanabe, N. Murata, ......
DEX-SMI Project
http://dex-smi.sp.dis.titech.ac.jp/DEX-SMI/

Deepening and Expansion of Statistical Mechanical Informatics (DEX-SMI)

MEXT Grant-in-Aid for Scientific Research on Priority Areas
Period: 2006 – 2009
Head Investigator: Yoshiyuki Kabashima
References

K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), J. Phys. A, 35 (2002).
A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of IEEE, 90 (2002).