Institute of Electronics, NCTU. Advisor: S. J. Wang (王聖智). Student: Jie-Wei Luo (羅介暐).
Markov Random Field
Sites
◦ Ex: pixel, feature (line, surface patch)
Label: an event that happens at a site
◦ Ex: L = {edge, non-edge}, L = {0, . . . , 255}
Prior Knowledge
f = {f1, . . . , fm}
◦ Each fi labels site i in terms of the labels: f : S → L
Labeling Problem
A labeling is called a configuration in a random field.
Prior knowledge (cont.)
To explain the concept of the MRF, we first introduce the following definitions:
1. i: a site (pixel)
2. Ni: the set of sites neighboring i
3. S: the set of sites (the image)
4. fi: the value at site i (intensity)
f1 f2 f3
f4 fi f6
f7 f8 f9
A 3x3 imagined image
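As a minimal sketch of sites, labels, and a labeling f : S → L, the 3x3 image above can be represented in Python (the gray-level values are illustrative):

```python
import numpy as np

# Sites S: the 9 pixel positions of a 3x3 image, indexed (row, col).
S = [(r, c) for r in range(3) for c in range(3)]

# Label set L: gray levels 0..255. A labeling f assigns one label per site.
# The particular values below are illustrative only.
f = np.array([[10, 20, 30],
              [40, 50, 60],
              [70, 80, 90]], dtype=np.uint8)

# f is a configuration: a map f : S -> L.
labels = {site: int(f[site]) for site in S}
print(labels[(1, 1)])  # -> 50, the value at the center site
```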
Neighborhood system
The sites in S are related to one another via a neighborhood system, defined as

N = {Ni | ∀i ∈ S}

where Ni is the set of sites neighboring i. The neighboring relationship has the following properties:
(1) A site is not a neighbor of itself: i ∉ Ni
(2) The neighboring relationship is mutual: i' ∈ Ni ⇔ i ∈ Ni'

f1 f2 f3
f4 fi f6
f7 f8 f9
Example (Regular sites)
First order neighborhood system
Second order neighborhood system
Nth order neighborhood system
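A sketch of the first- and second-order neighborhoods on a regular pixel lattice (the function name and boundary handling are assumptions for illustration; boundary sites simply get fewer neighbors):

```python
def neighbors(i, j, order, shape):
    """Return the order-1 (4-connected) or order-2 (8-connected)
    neighbors of site (i, j) inside an image of the given shape."""
    if order == 1:
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    elif order == 2:
        offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0)]
    else:
        raise ValueError("only orders 1 and 2 are sketched here")
    h, w = shape
    return {(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < h and 0 <= j + dj < w}

print(len(neighbors(1, 1, 1, (3, 3))))  # 4: first-order interior site
print(len(neighbors(1, 1, 2, (3, 3))))  # 8: second-order interior site
print(len(neighbors(0, 0, 1, (3, 3))))  # 2: a corner site has fewer
```

Note that both properties of a neighborhood system hold: no site is its own neighbor, and the relation is mutual.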
Example (Irregular sites)
The neighboring sites of site i are m, n, and f.
The neighboring sites of site j are r and x.
Clique
A clique c is defined as a subset of sites in S in which every pair of distinct sites are neighbors (a single site also counts as a clique).
Some examples:
◦ single-site
◦ pair-site
◦ triple-site
Clique: Example
Take the first-order and second-order neighborhood systems for example:
Neighborhood system
Clique types
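The single-site and pair-site cliques induced by a first-order (4-connected) neighborhood can be enumerated directly; a small sketch (the helper name is hypothetical, and larger clique types only appear from the second-order system up):

```python
def pair_cliques(h, w):
    """All {i, i'} with i' a first-order neighbor of i, each pair listed once."""
    cliques = []
    for r in range(h):
        for c in range(w):
            if c + 1 < w:
                cliques.append(((r, c), (r, c + 1)))  # horizontal pair
            if r + 1 < h:
                cliques.append(((r, c), (r + 1, c)))  # vertical pair
    return cliques

singles = [(r, c) for r in range(3) for c in range(3)]
pairs = pair_cliques(3, 3)
print(len(singles), len(pairs))  # 9 single-site and 12 pair-site cliques on a 3x3 grid
```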
A random field is a list of random numbers whose indices are mapped onto a space (of n dimensions).
Formally, let F = {F1, . . . , Fm} be a family of random variables defined on the set S, in which each random variable Fi takes a value fi in L. The family F is called a random field.
Random field
View the 2D image f as a collection of random variables (a random field).
A Markov random field is a set of random variables having a Markov property.
Markov Random field
F is said to be a Markov random field on S with respect to the neighborhood system N if and only if

(1) P(f) > 0, for all f (positivity)
(2) P(f_i | f_{S-{i}}) = P(f_i | f_{N_i}) (Markovianity)
Gibbs random field (GRF) and Gibbs distribution
A random field is said to be a Gibbs random field if and only if its configuration f obeys Gibbs distribution, that is:
Image configuration f:
f1 f2 f3
f4 fi f6
f7 f8 f9
P(f) = Z^{-1} e^{-U(f)/T}

U(f) = Σ_{c∈C} V_c(f) = Σ_{{i}∈C_1} V_1(f_i) + Σ_{{i,i'}∈C_2} V_2(f_i, f_{i'}) + .....

U(f): energy function; T: temperature; V_c(f): clique potential
Design U for different applications; it serves
(1) as the quantitative measure of the global quality of the solution, and
(2) as a guide to the search for a minimal solution.
By MRF modeling, the task becomes finding the configuration f that minimizes U.
Role of Energy Function
The temperature T controls the sharpness of the distribution.
◦ When the temperature is high, all configurations tend to be equally distributed.
Role of Temperature

P(f) = Z^{-1} e^{-U(f)/T}
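The flattening effect of high temperature can be checked numerically on a toy set of configuration energies (the energy values are illustrative):

```python
import math

def gibbs(energies, T):
    """Gibbs distribution P(f) = exp(-U(f)/T) / Z over listed configurations."""
    w = [math.exp(-u / T) for u in energies]
    Z = sum(w)                      # partition function
    return [x / Z for x in w]

U = [0.0, 1.0, 4.0]                 # energies of three toy configurations
cold = gibbs(U, T=0.1)              # sharp: mass concentrates on the minimum energy
hot = gibbs(U, T=100.0)             # flat: nearly uniform over configurations
print([round(p, 3) for p in cold])
print([round(p, 3) for p in hot])
```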
Markov-Gibbs equivalence
Hammersley-Clifford theorem: A random field F is an MRF if and only if F is a GRF
Proof: Let P(f) be a Gibbs distribution on S with the neighborhood system N.
f1 f2 f3
f4 fi f6
f7 f8 f9
A 3x3 imagined image

P(f_i | f_{S-{i}}) = P(f) / P(f_{S-{i}}) = e^{-Σ_{c∈C} V_c(f)} / Σ_{f_i'} e^{-Σ_{c∈C} V_c(f')}

where f' agrees with f at every site except i, where it takes the value f_i'. It remains to show that

P(f_i | f_{S-{i}}) = P(f_i | f_{N_i})
Markov-Gibbs equivalence
Divide C into two sets A and B, with A consisting of the cliques containing i and B the cliques not containing i:

f1 f2 f3
f4 fi f6
f7 f8 f9
A 3x3 imagined image

P(f_i | f_{S-{i}}) = [e^{-Σ_{c∈A} V_c(f)} · e^{-Σ_{c∈B} V_c(f)}] / Σ_{f_i'} {[e^{-Σ_{c∈A} V_c(f')} · e^{-Σ_{c∈B} V_c(f')}]}
                   = e^{-Σ_{c∈A} V_c(f)} / Σ_{f_i'} e^{-Σ_{c∈A} V_c(f')}

because V_c(f) = V_c(f') for every clique c ∈ B (such cliques do not contain i), so those factors cancel. The remaining expression depends only on the labels of i and its neighbors, hence

P(f_i | f_{S-{i}}) = P(f_i | f_{N_i})
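The cancellation argument above can be verified numerically on a tiny chain-shaped MRF (the Potts-like potential and the 4-site chain layout are illustrative choices):

```python
import itertools
import math

# Chain 0-1-2-3 with pair cliques {0,1}, {1,2}, {2,3}. Conditioning site 1
# on everything else should match conditioning on its neighbors {0, 2} alone,
# because cliques not containing site 1 cancel.

def V(a, b):
    return 0.0 if a == b else 1.0        # pair clique potential (Potts-like)

def U(f):
    return V(f[0], f[1]) + V(f[1], f[2]) + V(f[2], f[3])

configs = list(itertools.product([0, 1], repeat=4))
Z = sum(math.exp(-U(f)) for f in configs)
P = {f: math.exp(-U(f)) / Z for f in configs}

def cond_full(f1, f0, f2, f3):
    """P(f_1 | f_0, f_2, f_3) computed from the full joint distribution."""
    den = P[(f0, 0, f2, f3)] + P[(f0, 1, f2, f3)]
    return P[(f0, f1, f2, f3)] / den

def cond_local(f1, f0, f2):
    """Same conditional using only the cliques A that contain site 1."""
    num = math.exp(-(V(f0, f1) + V(f1, f2)))
    den = sum(math.exp(-(V(f0, g) + V(g, f2))) for g in (0, 1))
    return num / den

print(abs(cond_full(1, 0, 1, 0) - cond_local(1, 0, 1)) < 1e-12)  # True
```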
Optimization-based vision problem
Denoising
Noisy signal d → denoised signal f
When both the prior and the likelihood are known: MAP-MRF labeling
The MAP-MRF Framework
MAP formulation for denoising problem
Assume the observation is the true signal plus independent Gaussian noise, that is

d_i = f_i + e_i,  e_i ~ N(0, σ²)

Under this assumption, the observation model can be expressed as

p(d | f) = (1 / Π_{i=1}^m √(2πσ²)) e^{-U(d|f)},  U(d | f) = Σ_{i=1}^m (f_i - d_i)² / 2σ²

U(d | f): likelihood energy
MAP formulation for denoising problem
Assume the unknown data f is an MRF; the prior model is then

P(f) = Z^{-1} e^{-U(f)/T}

Based on the above, the posterior probability becomes

p(f | d) ∝ p(d | f) · P(f) = (1 / Π_{i=1}^m √(2πσ²)) e^{-Σ_{i=1}^m (f_i - d_i)²/2σ²} · Z^{-1} e^{-U(f)/T}
MAP formulation for denoising problem
The MAP estimator for the problem is:

f* = argmax_f p(f | d) = argmax_f p(d | f) p(f)
   = argmax_f { (1 / Π_{i=1}^m √(2πσ²)) e^{-Σ_{i=1}^m (f_i - d_i)²/2σ²} · Z^{-1} e^{-U(f)/T} }
   = argmin_f { Σ_{i=1}^m (f_i - d_i)² / 2σ² + U(f) }
   = argmin_f { U(d | f) + U(f) }

How should the prior energy U(f) be designed?
U(f) = ∫ [f^(n)(x)]² dx, where the order n determines the number of sites in the cliques involved:
◦ n = 1 (constant gray level)
◦ n = 2 (constant gradient)
◦ n = 3 (constant curvature)
The Smoothness Prior
MAP formulation for denoising problem
Define the smoothness prior:

U(f) = λ Σ_i (f_i - f_{i-1})²

Substituting the above into the MAP estimator, we get:

f* = argmax_f p(f | d) = argmin_f { U(d | f) + U(f) }
   = argmin_f { Σ_{i=1}^m (f_i - d_i)² / 2σ² + λ Σ_{i=1}^m (f_i - f_{i-1})² }
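The resulting posterior energy can be minimized by plain gradient descent on a 1D signal; a minimal sketch, assuming illustrative values for σ, λ, the step size, and the iteration count:

```python
import numpy as np

def denoise(d, sigma=1.0, lam=1.0, step=0.1, iters=500):
    """Gradient descent on E(f) = sum (f_i - d_i)^2 / (2 sigma^2)
    + lam * sum (f_i - f_{i-1})^2."""
    f = d.astype(float).copy()
    for _ in range(iters):
        grad = (f - d) / sigma ** 2      # gradient of the likelihood energy
        diff = np.diff(f)                # f_i - f_{i-1}
        grad[1:] += 2 * lam * diff       # d/df_i     of lam*(f_i - f_{i-1})^2
        grad[:-1] -= 2 * lam * diff      # d/df_{i-1} of the same term
        f -= step * grad
    return f

rng = np.random.default_rng(0)
d = np.ones(50) + 0.3 * rng.standard_normal(50)  # noisy constant signal
f = denoise(d)
print(np.var(f) < np.var(d))  # smoothing reduces fluctuation around the mean
```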
The first term is the observation model (similarity measure); the second is the prior model (reconstruction constraint). Their sum is called the posterior energy function.
Piecewise Continuous Restoration
E(f) = Σ_{i=1}^m (f_i - d_i)² + 2λ Σ_{i=1}^m g(f_i - f_{i-1})
If g(x) = x², the penalty at discontinuities tends to be very large, giving an over-smoothed result.
To encode piecewise smoothness, g should saturate at an asymptotic upper bound so that discontinuities are allowed:

g(x) = min{x², C}
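The difference between g(x) = x² and the truncated quadratic can be seen on a step edge using ICM-style coordinate descent over a small discrete label set (λ, C, the label grid, and the noise level are illustrative):

```python
import numpy as np

def icm(d, g, lam=1.0, labels=np.linspace(-1, 2, 61), sweeps=20):
    """Coordinate descent on E(f) = sum (f_i - d_i)^2 + lam * sum g(f_i - f_{i-1})."""
    f = d.copy()
    for _ in range(sweeps):
        for i in range(len(f)):
            costs = (labels - d[i]) ** 2          # data term
            if i > 0:
                costs = costs + lam * g(labels - f[i - 1])
            if i < len(f) - 1:
                costs = costs + lam * g(labels - f[i + 1])
            f[i] = labels[np.argmin(costs)]       # best label given neighbors
    return f

rng = np.random.default_rng(1)
step = np.r_[np.zeros(20), np.ones(20)]           # clean step edge
d = step + 0.1 * rng.standard_normal(40)

quad = icm(d, lambda x: x ** 2)                   # quadratic prior: over-smooths
trunc = icm(d, lambda x: np.minimum(x ** 2, 0.2)) # truncated quadratic: saturates

# The truncated prior keeps the jump at the edge much sharper.
print(trunc[20] - trunc[19], quad[20] - quad[19])
```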
Result