Edge Detection based on Kernel Density Estimation

Osvaldo Pereira, Esley Torres, Yasel Garcés and Roberto Rodríguez

Abstract—Edges of an image are considered a crucial type of information. They can be extracted by applying edge detectors with different methodologies. Edge detection is a vital step in computer vision tasks, because it is essential for pattern recognition and visual interpretation. In this paper, we propose a new method for edge detection in images based on kernel estimation of the probability density function. In our algorithm, pixels with a minimum value of the density function are labeled as edges. The boundary between two homogeneous regions is defined in two domains: the spatial/lattice domain and the range/color domain. Extensive experimental evaluation showed that our edge detection method is a highly competitive algorithm.

Index Terms—Edge Detection, Probability Density Function, Kernel Density Estimation.

I. INTRODUCTION

In image analysis and processing systems it is essential to distinguish the objects of interest from the rest of the image. The techniques used to determine the objects of interest are known as image segmentation. One of the most common is segmentation by edge detection.

An edge can be defined as a significant change in the value of the pixel intensity in a region of the image [1]. The main purpose of edge detection is to simplify the image data in order to minimize the amount of information to be processed [2].

Generally, an edge is defined as the boundary pixels that connect two separate regions [1], [3], [4]. The detection operation starts with the examination of the local discontinuity at each pixel of the image. Amplitude, orientation and location of a particular subarea are the main characteristics of possible edges [1]. Based on these characteristics, the edge detector must decide whether each of the examined pixels is an edge or not.

Classical edge detection methods label a pixel as an edge according to discontinuities in gray levels, colors or textures. The Roberts [5], Sobel [6], and Prewitt [7] operators detect edges by convolving a grayscale image with local derivative filters. Marr and Hildreth [8] used zero crossings of the Laplacian of Gaussian operator. The Canny detector [2] also models edges as sharp discontinuities in the brightness channel, adding non-maximum suppression and hysteresis thresholding steps.

There are many other techniques in the literature used for edge detection, such as [9] and [10]; some of them are based on histograms, error minimization, maximization of an objective function, fuzzy logic, wavelets, morphology, genetic algorithms, and neural networks, among others.

A method for image segmentation by active contours was proposed in [11]; it is based on a variational analysis, in which the active contour is driven by forces stemming from the minimization of a cost functional. The segmentation method proposed in [11] is based on a distance between probability densities. In particular, the active contours are evolved to maximize the Bhattacharyya distance between nonparametric (kernel-based) estimates of the probability densities of the segmentation classes [12].

One algorithm that makes use of nonparametric density estimation is Mean Shift (MSH) [13], [14], [15]. In essence, MSH is an iterative mode detection algorithm in the density distribution space [14], [16], [17].

In [18], the authors proposed an approach that combines the Mean Shift algorithm with image contours. The technique was applied to CT angiography images. Another approach proposed an algorithm that constructs a kernel function histogram combining intensity and then uses the Mean Shift algorithm with this kernel function in order to automatically detect edges in gray-level images [19].

In [20] we find an adaptive algorithm for edge detection in multidimensional and color images. This is a statistical approach based on local processing and nonparametric kernel density estimation. The location of the edge discontinuity coincides with the minimum of the image density function and is determined by an appropriate resampling of the locally defined probability space.

In this paper, we propose a new method for edge detection based on kernel estimation of the probability density function. As in [20], in our approach the pixels in the image with the minimum value of the density function are labeled as edges.

The main difference between [20] and our method is that in [20] the boundary between two homogeneous regions is defined only in terms of the gray-level domain, while our algorithm uses the range-spatial domain (gray levels and pixel positions).

The remainder of the paper is organized as follows. In Section II, the theoretical aspects concerning kernel density estimation are presented. Section III describes our edge detection algorithm in detail. The experimental results, comparisons and discussion are presented in Section IV. In Section V, the most important conclusions are given.

II. THEORETICAL ASPECTS

A. Kernel density estimation

One of the most popular nonparametric density estimators is kernel density estimation. Mathematically speaking, the



general multivariate kernel density estimate at the point x is defined by

$$ f(x) = \frac{1}{n h^{d}} \sum_{i=1}^{n} K\left(\frac{x - x_{i}}{h}\right) \qquad (1) $$

where the n data points x_i, i = 1, 2, ..., n, represent a population with some unknown density function f(x) [13], [15], [17].
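For illustration, the following Python sketch evaluates (1) at a single query point; the Gaussian kernel, the synthetic sample and the bandwidth value are choices of this sketch and are not prescribed by the formulation above.

```python
import numpy as np

def kde_at_point(x, data, h):
    """Evaluate the multivariate KDE of Eq. (1) at point x.

    x    : query point, shape (d,)
    data : sample points x_i, shape (n, d)
    h    : scalar bandwidth
    A Gaussian kernel K(u) = (2*pi)^(-d/2) exp(-||u||^2 / 2) is used here
    as an illustrative choice; Eq. (1) allows any kernel K.
    """
    n, d = data.shape
    u = (x - data) / h                                   # (x - x_i) / h for every sample
    k = (2 * np.pi) ** (-d / 2) * np.exp(-0.5 * np.sum(u ** 2, axis=1))
    return k.sum() / (n * h ** d)                        # (1 / (n h^d)) * sum_i K(...)

# Example: density of 200 random 2-D points, evaluated at the origin
rng = np.random.default_rng(0)
samples = rng.normal(size=(200, 2))
print(kde_at_point(np.zeros(2), samples, h=0.5))
```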

For image segmentation, the feature space is composed of two independent domains: the spatial/lattice domain and the range/color domain. Due to the different natures of the two domains, the kernel is usually broken into the product of two different radially symmetric kernels:

$$ f(x) = \frac{c}{n\,h_{s}^{p}\,h_{r}^{q}} \sum_{i=1}^{n} k_{s}\!\left(\left\|\frac{x^{s}-x_{i}^{s}}{h_{s}}\right\|^{2}\right) k_{r}\!\left(\left\|\frac{x^{r}-x_{i}^{r}}{h_{r}}\right\|^{2}\right) \qquad (2) $$

where x is a pixel, x^s and x^r denote its spatial and range components, k_s and k_r are the profiles used in the two respective domains, h_s and h_r are the bandwidths employed in the spatial and range domains, and c is the normalization constant.

As shown in (2), there are two main parameters that have to be defined by the user: the spatial bandwidth h_s and the range bandwidth h_r.
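As a concrete, deliberately simplified illustration of (2), the sketch below estimates the range-spatial density at one pixel of a grayscale image. The Gaussian profiles, the local window of radius 3*hs, the dropped normalization constant c and the function name are assumptions of this sketch, not of the method itself.

```python
import numpy as np

def range_spatial_density(img, i, j, hs, hr):
    """Evaluate the product kernel of Eq. (2) at pixel (i, j) of a grayscale image.

    A profile k(t) = exp(-t) is assumed in both the spatial and the range
    domain, and the normalization constant c is dropped, so the result is
    only proportional to f(x). For efficiency the sum runs over a local
    window of radius 3*hs instead of the whole image (an assumption of
    this sketch, not of the paper).
    """
    m, n = img.shape
    r = int(np.ceil(3 * hs))
    i0, i1 = max(0, i - r), min(m, i + r + 1)
    j0, j1 = max(0, j - r), min(n, j + r + 1)
    ii, jj = np.mgrid[i0:i1, j0:j1]

    ds = ((ii - i) ** 2 + (jj - j) ** 2) / hs ** 2            # ||spatial diff / hs||^2
    dr = ((img[i0:i1, j0:j1] - img[i, j]) ** 2) / hr ** 2     # ||range diff / hr||^2
    return (np.exp(-ds) * np.exp(-dr)).sum() / (ii.size * hs ** 2 * hr)   # p = 2, q = 1

# Example on a toy image with a vertical step edge: the density is lower
# at the pixel on the step than at a pixel inside a homogeneous region.
img = np.zeros((20, 20), dtype=float)
img[:, 10:] = 255.0
print(range_spatial_density(img, 10, 10, hs=2.0, hr=20.0))   # pixel on the step edge
print(range_spatial_density(img, 10, 3, hs=2.0, hr=20.0))    # pixel inside a region
```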

B. Validation of Edge Detection

At present, there is no single segmentation method that achieves good results for any type of image. For this reason, it is necessary to quantify the efficiency of an edge detection method by comparing the obtained results with a reference model [24], [25], [26]. However, finding a measure that carries out a correct evaluation of the obtained edges is a complex problem [27].

Many techniques have been proposed for the evaluation of edge detection algorithms. One of them is the Rand Index [28], which was introduced for general clustering evaluation. The Rand Index operates by comparing the compatibility of assignments between pairs of elements in the clusters.

According to [28] and [29], the Rand index is

$$ RI = \frac{a + b}{a + b + c + d} = \frac{a + b}{\binom{n}{2}} \qquad (3) $$

where a + b is the number of agreements between X and Y and c + d is the number of disagreements between X and Y.
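A direct, pairwise evaluation of (3) for two small label images could look as follows; the function name and the toy example are illustrative only.

```python
import numpy as np
from itertools import combinations

def rand_index(labels_x, labels_y):
    """Rand Index of Eq. (3) between two labelings of the same pixels.

    a = pairs placed in the same cluster by both X and Y
    b = pairs placed in different clusters by both X and Y
    The denominator is the total number of pixel pairs, n*(n-1)/2.
    This is the direct O(n^2) definition, suitable only for small images.
    """
    x = np.asarray(labels_x).ravel()
    y = np.asarray(labels_y).ravel()
    agree, pairs = 0, 0
    for i, j in combinations(range(x.size), 2):
        same_x = x[i] == x[j]
        same_y = y[i] == y[j]
        agree += (same_x == same_y)      # counts both a and b
        pairs += 1
    return agree / pairs

# Example: two 4-pixel segmentations, 5 agreeing pairs out of 6
print(rand_index([0, 0, 1, 1], [0, 0, 1, 2]))
```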

Variants of the Rand Index have been proposed to deal with the case of multiple ground-truth segmentations [30], [31]. Given a set of ground-truth segmentations G_k, the Probabilistic Rand Index (PRI) is defined as

$$ PRI(S, \{G_{k}\}) = \frac{1}{T} \sum_{i<j} \left[ c_{ij}\,p_{ij} + (1 - c_{ij})(1 - p_{ij}) \right] \qquad (4) $$

where c_ij is the event that pixels i and j have the same label, p_ij is the probability of that event and T is the total number of pixel pairs.

The PRI has the drawback of suffering from a small dynamic range [30], [31]. In [30], this drawback is resolved with a normalization in order to produce the Normalized Probabilistic Rand Index (NPRI). The NPRI uses a typical normalization scheme: if the baseline index value is the expected value of the index for any given segmentation of a particular image, then

$$ NPRI = \frac{PRI - \text{Expected Index}}{\text{Maximum Index} - \text{Expected Index}} \qquad (5) $$
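Both indices can be sketched in the same pairwise style as the Rand Index above; in the NPRI helper the expected and maximum index values are inputs supplied by the caller, since their estimation is specified in [30] and not reproduced here.

```python
import numpy as np
from itertools import combinations

def probabilistic_rand_index(seg, ground_truths):
    """PRI of Eq. (4): seg is a label image, ground_truths a list of label images.

    c_ij indicates that pixels i and j share a label in the test segmentation;
    p_ij is estimated as the fraction of ground truths in which they share a
    label. Direct pairwise form, practical only for small images.
    """
    s = np.asarray(seg).ravel()
    gts = [np.asarray(g).ravel() for g in ground_truths]
    total, pairs = 0.0, 0
    for i, j in combinations(range(s.size), 2):
        c = float(s[i] == s[j])
        p = np.mean([g[i] == g[j] for g in gts])
        total += c * p + (1.0 - c) * (1.0 - p)
        pairs += 1
    return total / pairs

def normalized_pri(pri, expected_index, maximum_index=1.0):
    """NPRI of Eq. (5); the expected index must be supplied by the caller."""
    return (pri - expected_index) / (maximum_index - expected_index)

# Example with two hand-made ground truths
pri = probabilistic_rand_index([0, 0, 1, 1], [[0, 0, 1, 1], [0, 0, 0, 1]])
print(pri, normalized_pri(pri, expected_index=0.5))
```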

Recently, another metric of similarity between images was proposed in [32]. The Natural Entropy Distance (NED) was introduced for the purpose of comparing two images. NED is an index of similarity between images that uses Z_n rings and the entropy function; it is defined as follows.

Definition 1: Let A and B be two images; then the natural entropy distance is defined by

$$ \nu(A, B) = E\big(A + (-B)\big) \qquad (6) $$

where -B is the additive inverse of B, calculated by taking the inverse of each pixel of B in Z_n.

This index was applied as a new stopping criterion for the Mean Shift Iterative Algorithm (MSHi) with the goal of reaching a better segmentation. The properties of this index were demonstrated in [32]. Among them are non-negativity, symmetry, and invariance under affine transformations such as translation, reflection and rotation. It also fulfills the axiom of identity of indiscernibles.
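One possible reading of Definition 1 for 8-bit images, in which the additive inverse is taken modulo 256 and E is the Shannon entropy of the gray-level histogram of the difference image, is sketched below; this interpretation and the function name are ours, not taken verbatim from [32].

```python
import numpy as np

def natural_entropy_distance(a, b, levels=256):
    """A possible reading of Eq. (6) for 8-bit images.

    -B is the additive inverse of each pixel of B in Z_levels, so A + (-B)
    is the pixelwise difference modulo `levels`; the distance is the Shannon
    entropy (in bits) of the gray-level histogram of that difference image.
    Identical images give a constant difference image and hence entropy 0.
    """
    a = np.asarray(a, dtype=np.int64)
    b = np.asarray(b, dtype=np.int64)
    diff = np.mod(a - b, levels)
    hist = np.bincount(diff.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Example: identical images have distance 0, different images do not
x = np.random.default_rng(1).integers(0, 256, size=(32, 32))
print(natural_entropy_distance(x, x))                  # 0.0
print(natural_entropy_distance(x, np.roll(x, 1, 0)))   # > 0
```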

III. EDGE DETECTION USING A PROBABILITY DENSITY FUNCTION

As pointed out in [33], the most common way to implement an edge detector is by convolving the image with a mask. The response of the mask at any point of the image is given by

$$ R = z_{1} f_{1} + z_{2} f_{2} + \ldots + z_{9} f_{9} = \sum_{i=1}^{9} z_{i} f_{i} \qquad (7) $$

where f_i is the gray level of the pixel in the image directly below coefficient z_i of the mask. The value of R is assigned to the central pixel of the mask in the output image.

If the response of the mask at the central position satisfies expression (8), then we can say that we have found an edge point:

$$ |R| < u \qquad (8) $$

where u is a nonnegative threshold.

Our approach for edge detection, named "Edge Detection by Density" (EDD) (see Algorithm 1), is based on the estimation of the density function at the central pixel of the mask, where the results of the convolution are stored in a new image. The obtained density image has pixel values that belong to the interval [0, 1].

After this process, it is necessary to use a threshold, which is obtained according to the following steps (see Figure 1):

1) Calculate the histogram of the density image.
2) Find the threshold u around the density value associated with the largest frequency in the histogram.
3) Label as edges all pixels with density values lower than the threshold u.

Computationally speaking, a linear implementation of our method on the CPU has an algorithmic complexity of Θ(hn), where h is a kernel bandwidth and n is the number of pixels


in the image. A parallel version of the algorithm on the GPU has Θ(1) algorithmic complexity.
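A minimal sketch of the threshold selection of steps 1)-3) is shown below; the number of bins and the offset around the histogram mode are free choices of this sketch, since the text only states that u is chosen around the mode.

```python
import numpy as np

def edd_threshold(density_img, bins=256, offset=0.05):
    """Automatic threshold selection, steps 1)-3).

    1) build the histogram of the density image (values in [0, 1]);
    2) locate the density value with the largest frequency (the background
       mode) and place the threshold u slightly below it; the `offset` used
       here is an assumption of this sketch;
    3) label as edges every pixel whose density is below u.
    Returns (u, edge_mask).
    """
    hist, bin_edges = np.histogram(density_img, bins=bins, range=(0.0, 1.0))
    k = hist.argmax()
    mode_value = 0.5 * (bin_edges[k] + bin_edges[k + 1])
    u = max(mode_value - offset, 0.0)
    return u, density_img < u
```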

From Figure 2(b) to Figure 2(k), some iterations of the algorithm over the quadrant of Figure 2(a) are shown. Note that there is an implicit edge, which separates two regions with different labels.

Algorithm 1: Edge Detection by Density.
Data:
  I: input image;
  m: image width; n: image height;
  hs: bandwidth in the spatial/lattice domain;
  hr: bandwidth in the range/color domain;
1 Initialize:
2   R = 0;
3 for i = 1, 2, ..., m do
4   for j = 1, 2, ..., n do
5     R(i, j) = f(p(i, j), hs, hr);
6     where p is the pixel of image I at position (i, j) and f is the kernel density estimation function (2).
7 u = threshold(R); R = |R| < u;
Result: R is the edge image.
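Putting the pieces together, a compact Python rendering of Algorithm 1 could read as follows; it assumes the range_spatial_density and edd_threshold sketches given earlier in this paper are in scope, and the normalization of R by its maximum stands in for the constant c of (2).

```python
import numpy as np

def edge_detection_by_density(img, hs, hr):
    """Python rendering of Algorithm 1 (Edge Detection by Density).

    For every pixel the range-spatial density of Eq. (2) is estimated and
    stored in R; pixels whose density falls below the automatically chosen
    threshold u are labeled as edges. `range_spatial_density` and
    `edd_threshold` are the sketches given earlier in this paper.
    """
    m, n = img.shape
    R = np.zeros((m, n), dtype=float)
    for i in range(m):                       # lines 3-6 of Algorithm 1
        for j in range(n):
            R[i, j] = range_spatial_density(img, i, j, hs, hr)
    R /= R.max()                             # keep the density image in [0, 1]
    u, edges = edd_threshold(R)              # line 7: u = threshold(R); R = |R| < u
    return edges

# Example on a toy step-edge image
img = np.zeros((20, 20), dtype=float)
img[:, 10:] = 255.0
edges = edge_detection_by_density(img, hs=2.0, hr=20.0)
print(int(edges.sum()), "pixels labeled as edges")
```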

Figure 2(l) shows the edges obtained at the end of the algorithm. For this example two kernels were combined: the uniform kernel

$$ K_{s}(x) = \begin{cases} 1 & \text{if } \|x\| \le 1 \\ 0 & \text{if } \|x\| > 1 \end{cases} \qquad (9) $$

and the Gaussian kernel

$$ K_{r}(x) = e^{-\|x\|^{2}} \qquad (10) $$
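For reference, the two profiles of (9) and (10) can be transcribed as the following small functions (an illustrative transcription, not part of the original text):

```python
import numpy as np

def uniform_kernel(x):
    """Uniform kernel of Eq. (9): 1 inside the unit ball, 0 outside."""
    return 1.0 if np.linalg.norm(x) <= 1.0 else 0.0

def gaussian_kernel(x):
    """Gaussian kernel of Eq. (10): exp(-||x||^2)."""
    return float(np.exp(-np.linalg.norm(x) ** 2))

print(uniform_kernel([0.5, 0.5]), gaussian_kernel([0.5, 0.5]))
```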

Much research has been carried out on edge detection. For this reason, a great number of procedures and techniques have been proposed in the literature in order to evaluate edge detectors. However, this continues to be an open problem, due to the complexity of images.

The performance of our proposed algorithm is compared with classical algorithms such as Canny, Sobel, Prewitt and Roberts. A limitation of the classical methods for edge detection is that they operate only in the range domain, i.e., on the gray levels. Today, most edge detectors process the image in the range-spatial domain, and the obtained results are better.

Our strategy works in the range-spatial domain and this guarantees the continuity of the edges. This does not happen with the classical methods.

In the next section, we carry out an experimental comparison between the edges detected with the classical algorithms and the edges detected with our approach.

IV. RESULTS AND VALIDATION

In this section, we show the results obtained by applying our edge detection algorithm. The algorithm was applied to segmented images (6303, 41006 and 175083) of the Berkeley database. The performance of the proposed method was tested and compared with different approaches (classical edge detectors).

Firstly, the edges obtained with the newly proposed algorithm were compared with the accepted manual ground-truth segmentations of the Berkeley database. One can note in Figures 3, 4 and 5 that there is not much difference between the edge images obtained with the EDD algorithm and the true edge segmentations of the Berkeley database.

A quantitative validation was carried out using several metrics: the Probabilistic Rand Index (PRI), the Normalized Probabilistic Rand Index (NPRI) and the Natural Entropy Distance (NED). The results presented in Table I show that the edge detection carried out with our EDD algorithm is highly competitive compared with other previously proposed algorithms [2], [9], [10], [11]. In Table II, we can observe that the PRI, NPRI and NED values of the edge image produced by our algorithm are similar to, and sometimes greater than, those of the edge images produced by the classical algorithms.

In Figure 6, one can see the principal problem of the classical methods: edge discontinuities. However, our proposed strategy guarantees edge continuity by operating in the spatial-range domain.

An example of the application of our proposed strategy to a medical image is shown in Figure 7. In this case, we used different values of hs and hr in the MSHi. This is a preliminary result; a more detailed paper about these results will be published.

V. CONCLUSION

In this work, we proposed a new edge detection algorithm based on kernel density estimation. EDD introduces a new strategy to automatically find the threshold value based on the minimum of the density function. The proposed algorithm was applied to ground-truth images of the Berkeley database and to medical images.

Our method has been compared with the classical edge detection algorithms. The qualitative and quantitative results are appropriate according to the criteria of specialists. Our method has a computational complexity similar to the classical edge detectors, with the advantage of preserving edge continuity.

The extensive experimental evaluation showed that our edge detection method is a highly competitive algorithm. In future work, the experimental results related to a real problem in the field of medical imaging will be expanded.

REFERENCES

[1] W. Frei and C. Chen, "Fast Boundary Detection: A Generalization and New Algorithm," IEEE Trans. Computers, vol. C-26, no. 10, pp. 988-998, Oct. 1977.

[2] J. Canny, "A computational approach to edge detection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 8, no. 6, pp. 679-698, Nov. 1986.

[3] R. C. Gonzalez and R. E. Woods, Digital Image Processing. Upper Saddle River, NJ: Prentice-Hall, pp. 572-585, 2001.

[4] W. K. Pratt, Digital Image Processing. New York, NY: Wiley-Interscience, pp. 491-556, 1991.

[5] L. G. Roberts, "Machine perception of three-dimensional solids," in Optical and Electro-Optical Information Processing, J. T. Tippett et al., Eds. Cambridge, MA: MIT Press, 1965.

[6] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis. New York: Wiley, 1973.

[7] J. M. S. Prewitt, "Object enhancement and extraction," in Picture Processing and Psychopictorics, B. Lipkin and A. Rosenfeld, Eds. New York: Academic Press, 1970.

[8] D. C. Marr and E. Hildreth, "Theory of edge detection," Proceedings of the Royal Society of London, 1980.

[9] S. Lakshmi and V. Sankaranarayanan, "A study of Edge Detection Techniques for Segmentation Computing Approaches," IJCA Special Issue on "Computer Aided Soft Computing Techniques for Imaging and Biomedical Applications," CASCT, 2010.

[10] G. Papari and N. Petkov, "Edge and line oriented contour detection: State of the art," Image and Vision Computing, vol. 29, pp. 79-103, 2011.

[11] O. Michailovich, Y. Rathi and A. Tannenbaum, "Image Segmentation Using Active Contours Driven by the Bhattacharyya Gradient Flow," IEEE Transactions on Image Processing, vol. 16, no. 11, November 2007.

[12] T. Kailath, "The divergence and Bhattacharyya distance measures in signal selection," IEEE Trans. Commun. Technol., vol. COM-15, no. 1, pp. 52-60, Feb. 1967.

[13] K. Fukunaga and L. D. Hostetler, "The Estimation of the Gradient of a Density Function," IEEE Trans. Information Theory, vol. 21, pp. 32-40, 1975.

[14] C. Shen and M. J. Brooks, "Fast Global Kernel Density Mode Seeking: Applications to Localization and Tracking," IEEE Transactions on Image Processing, vol. 16, no. 5, pp. 1457-1469, May 2007, ISSN: 1057-7149.

[15] Y. Cheng, "Mean Shift, Mode Seeking, and Clustering," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 17, no. 8, pp. 790-799, 1995.

[16] D. Comaniciu and P. Meer, "Mean shift: A robust approach toward feature space analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 5, May 2002.

[17] D. I. Comaniciu, "Nonparametric robust methods for computer vision," Ph.D. dissertation, Rutgers, The State University of New Jersey, New Brunswick, 2000.

[18] A. Hassan, H. Rostom Mohamed and R. Saaheb Al-Shimsah, "CT Angiography Image Segmentation by Mean Shift Algorithm and Contour with Connected Components Image," International Journal of Scientific and Engineering Research, vol. 3, no. 8, August 2012.

[19] Li Zhengzhou, Liu Mei, Wang Huigai, Yang Yang, Chen Jin and Jin Gang, "Gray-scale Edge Detection and Image Segmentation Algorithm Based on Mean Shift," TELKOMNIKA, vol. 11, no. 3, pp. 1414-1421, March 2013.

[20] G. Economou, A. Fotinos, S. Makrogiannis and S. Fotopoulos, "Color image edge detection based on nonparametric density estimation," Proceedings of the 2001 International Conference on Image Processing, vol. 1, pp. 922-925, 2001.

[21] M. Wand and M. Jones, Kernel Smoothing, Chapman and Hall, p. 95, 1995.

[22] B. W. Silverman, Density Estimation for Statistics and Data Analysis. Boca Raton, FL: CRC, 1986.

[23] J. S. Simonoff, Smoothing Methods in Statistics, New York: Springer, 1996.

[24] V. Barra and A. Lundervold, "A Collaborative Software Tool for the Evaluation of MRI Brain Segmentation Methods," International Symposium on Information Technology Convergence (ISITC 2007), pp. 235-239, 2007.

[25] S. Bouix et al., "On evaluating brain tissue classifiers without a ground truth," NeuroImage, vol. 36, pp. 1207-1224, 2007.

[26] W. R. Crum, "Generalized Overlap Measures for Evaluation and Validation in Medical Image Analysis," IEEE Transactions on Medical Imaging, vol. 25, pp. 1451-1461.

[27] P. Christian Hansen, Discrete Inverse Problems: Insight and Algorithms, SIAM, 2010.

[28] W. M. Rand, "Objective criteria for the evaluation of clustering methods," Journal of the American Statistical Association, vol. 66, pp. 846-850, 1971.

[29] L. Hubert and P. Arabie, "Comparing partitions," Journal of Classification, pp. 193-218, 1985.

[30] R. Unnikrishnan, C. Pantofaru and M. Hebert, "Toward objective evaluation of image segmentation algorithms," PAMI, 2007.

[31] A. Y. Yang, J. Wright, Y. Ma and S. S. Sastry, "Unsupervised segmentation of natural images via lossy data compression," CVIU, 2008.

[32] Y. Garcés, E. Torres, O. Pereira, C. Pérez and R. Rodríguez, "Stopping Criterion for the Mean Shift Iterative Algorithm," Lecture Notes in Computer Science, vol. 8258, pp. 383-390, 2013.

[33] R. Rodríguez and J. H. Sossa, Procesamiento y Análisis Digital de Imágenes ("Processing and Digital Image Analysis"), Ra-Ma Editorial, ISBN: 978-84-9964-007-8, May 2011.

Osvaldo Pereira. Received his bachelor degree in Computer Science Engineering in October 2008. He received his Master of Science degree, with a mention in Applied Computing, in October 2010. He is now part of the Digital Signal Processing Group of the Institute of Cybernetics, Mathematics and Physics (ICIMAF). His research interests include: processing and segmentation of digital images, reconstruction of three-dimensional models from medical imaging, visualization and virtual reality. He is developing his PhD on topics of edge detection in images.

Esley Torres. Received his bachelor degree in Mathematics from Havana University in 2009. Since 2011, he has been part of the Digital Signal Processing Group of the Institute of Cybernetics, Mathematics and Physics (ICIMAF). His research interests include segmentation, restoration, visual pattern recognition, and analysis of images. Since 2009, he has taught mathematics at the Polytechnic University José Antonio Echeverría (ISPJAE). He has published more than 12 articles in international journals and has participated in many international conferences. He has received one national prize.

Yasel Garcés. Received his bachelor degree in Mathematics from Havana University in 2011 and his Master degree in 2014. Since 2011, he has been part of the Digital Signal Processing Group of the Institute of Cybernetics, Mathematics and Physics (ICIMAF). His research interests include segmentation, restoration, visual pattern recognition, and analysis of images. He has published more than 10 articles in international journals and has participated in many international conferences. He has received more than five national prizes.

Roberto Rodríguez. Received his diploma in Physics from the Physics Faculty, Havana University, in 1978 and his PhD degree from the Technical University of Havana in 1995. Since 1998, he has been the head of the Digital Signal Processing Group of the Institute of Cybernetics, Mathematics and Physics (ICIMAF). His research interests include segmentation, restoration, mathematical morphology, visual pattern recognition, analysis and interpretation of images, theoretical studies of Gaussian scale-space and Mean Shift. He has published more than 100 articles in international journals and in many international conferences. He has two published books and has written three chapters in other books related to the speciality. He has received more than ten national and international prizes.


APPENDIX

[Fig. 1 panels: density images of Berkeley images 6303, 41006 and 175083, zoomed grids of density values, and frequency histograms (density on the horizontal axis, frequency on the vertical axis) with the threshold marked between the edge and background modes.]

Fig. 1. Automatic calculation of the threshold value. 1st row: density images obtained with the EDD algorithm. 2nd row: zoom of a region of the density images. 3rd row: frequency histograms of the density images.

TABLE I
INDEX OF SIMILARITY BETWEEN OUR EDGE DETECTION ALGORITHM AND THE GROUND-TRUTH IMAGES.

                Experiment I: Image 6303         Experiment II: Image 41006       Experiment III: Image 175083
Ground-Truth    PRI       NPRI      NED          PRI       NPRI      NED          PRI       NPRI      NED
I.              0.968898  0.871664  0.087767     0.960236  0.835922  0.128089     0.928050  0.703113  0.190817
II.             0.971290  0.881534  0.088066     0.956514  0.820564  0.122966     0.949825  0.792963  0.144619
III.            0.969495  0.874127  0.061267     0.934549  0.729930  0.171914     0.939635  0.750916  0.138976
IV.             0.940551  0.754696  0.146649     0.955903  0.818043  0.118532     0.943945  0.768700  0.179005
V.              0.967766  0.866993  0.080747     0.938192  0.744962  0.189659     0.941919  0.760341  0.183362

Abbreviations: PRI: Probabilistic Rand Index, NPRI: Normalized Probabilistic Rand Index, NED: Natural Entropy Distance.

TABLE II
QUANTITATIVE COMPARISON OF THE EDD METHOD VERSUS CLASSICAL EDGE DETECTION ALGORITHMS.

                Experiment I: Image 6303         Experiment II: Image 41006       Experiment III: Image 175083
Algorithm       PRI       NPRI      NED          PRI       NPRI      NED          PRI       NPRI      NED
EDD             0.969495  0.874127  0.061267     0.934549  0.729930  0.171914     0.939635  0.750916  0.138976
Canny           0.976007  0.900997  0.035261     0.941772  0.759734  0.101104     0.945158  0.773706  0.080776
Prewitt         0.967537  0.866048  0.116485     0.944864  0.772492  0.234304     0.953138  0.806633  0.179450
Roberts         0.972056  0.884695  0.029136     0.946424  0.778929  0.088014     0.953728  0.809068  0.111554
Sobel           0.968576  0.870335  0.113957     0.938396  0.745804  0.242177     0.952704  0.804843  0.179254

Abbreviations: PRI: Probabilistic Rand Index, NPRI: Normalized Probabilistic Rand Index, NED: Natural Entropy Distance.


[Fig. 2 panels (a)-(l): an 8x8 quadrant of gray levels (values 7 and 8), the density values produced at successive iterations of the algorithm, and the final binary edge map.]

Fig. 2. Algorithm iterations. (a) Original image; (b)-(k) algorithm iterations; (l) detected edges.


Experiment I: Image 6303 of Berkeley's Database. Ground-Truth I

Ground-Truth II

Ground-Truth III

Ground-Truth IV

Ground-Truth V

Fig. 3. Comparison of the edge ground-truth segmentations with the density algorithm segmentation. 1st column: region ground-truth segmentation. 2nd column: edge ground-truth segmentation. 3rd column: Edge Detection by Density (EDD).


Experiment II: Image 41006 of Berkeley's Database. Ground-Truth I

Ground-Truth II

Ground-Truth III

Ground-Truth IV

Ground-Truth V

Fig. 4. Comparison of the edge ground-truth segmentations with the density algorithm segmentation. 1st column: region ground-truth segmentation. 2nd column: edge ground-truth segmentation. 3rd column: Edge Detection by Density (EDD).


Experiment III: Image 175083 of Berkeley's Database. Ground-Truth I

Ground-Truth II

Ground-Truth III

Ground-Truth IV

Ground-Truth V

Fig. 5. Comparison of the edge ground-truth segmentations with the density algorithm segmentation. 1st column: region ground-truth segmentation. 2nd column: edge ground-truth segmentation. 3rd column: Edge Detection by Density (EDD).


Image 6303 of Berkeley’s Database Image 41006 of Berkeley’s Database Image 175083 of Berkeley’s Database

EDD

Canny

Prewitt

Roberts

Sobel

Fig. 6. Visual comparison of EDD with the classical edge detectors.


(a) CT Image (b) MSHi Image (c) hs=1 hr=15

(d) hs=3 hr=15 (e) hs=5 hr=15 (f) hs=7 hr=15

(g) hs=1 hr=5 (h) hs=1 hr=7 (i) hs=1 hr=30

Fig. 7. Examples of edge detection by density in a CT image segmented with the MSHi algorithm.