

Research Article

A Voronoi-Based Sensor Handover Protocol for Target Tracking in Distributed Visual Sensor Networks

Tien-Wen Sung and Chu-Sing Yang

Department of Electrical Engineering, Institute of Computer and Communication Engineering, National Cheng Kung University, No. 1 University Road, Tainan 701, Taiwan

Correspondence should be addressed to Tien-Wen Sung; tienwensung@gmail.com

Received 11 December 2013; Accepted 23 January 2014; Published 5 March 2014

Academic Editor: Yue Shan Chang

Copyright © 2014 T.-W. Sung and C.-S. Yang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Target tracking is one of the important applications of wireless sensor networks. For a visual sensor network, the visual sensors have particular characteristics such as directional sensing, a limited angle of view, and a line-of-sight view. Target tracking with visual sensors is different from that with scalar sensors. Moreover, the quality of the sensed visual data can be very important in many applications. In this paper, the concept of the Voronoi diagram is utilized for target tracking in visual sensor networks. The structure of Voronoi cells is suitable for the design of a distributed/localized algorithm, and it possesses a bisector property for dividing a distance line or region. This paper proposes a Voronoi-based distributed sensor handover protocol for visual sensor networks. The simulation results show the benefits of our proposed approach in terms of target-detected latency, target-tracked ratio, and average target distance, which indicates that the quality of the target tracking service can be improved with the proposed approach for visual sensor networks.

1. Introduction

Wireless sensor networks (WSNs) [1], which have the essential capabilities of sensing, computing, and communicating, have attracted a wide range of attention in the past decade. WSNs are well suited to many applications such as surveillance and monitoring [2, 3]. In such applications, sensing coverage is one of the fundamental measurement indexes of QoS (Quality of Service) [4]. A scalar sensor usually has an omnidirectional sensing range (a circular sensing coverage), while a directional sensor has a limited sensing direction and a noncircular sensing range [5]. Advances in image sensor and embedded system technologies have promoted the rapid development of camera/visual sensor networks (VSNs) [6, 7]. A VSN is a kind of directional sensor network and can provide visual image or video data for surveillance and monitoring applications. Numerous related works on object detection, localization, and tracking for WSNs have been proposed [8]. However, most of these approaches are not applicable to VSNs due to the different characteristics of a visual sensor [9]. In other words, a visual sensor has a limited effective sensing range characterized by its directionality and size-specific sensing angle; moreover, the type of data sensed by a visual sensor is image based, which is different from that of wireless scalar sensors. Different approaches are necessary for object detection, localization, and tracking in VSN applications. In this paper, the geometric structure of the Voronoi diagram [10] is utilized in the proposed design of a visual sensor handover protocol for target tracking in VSNs, where a large number of visual sensors are considered to be randomly deployed in a wild field to perform the sensing and tracking tasks. Moreover, the target localization is designed to be completed by a single visual sensor without any prior knowledge of target positions, while other related works performed the localization with multiple visual sensors. The use of the Voronoi diagram concept also facilitates the determination of the nearest visual sensor to a detected target object. To the best of our knowledge, this is the first paper that utilizes the Voronoi diagram for both target localization and target tracking in a distributed VSN. The major contributions of this paper include the following: (1) the target localization for tracking can be completed by only a single visual sensor using the proposed scheme and needs no prior knowledge of the target position. The localization

Hindawi Publishing Corporation, International Journal of Distributed Sensor Networks, Volume 2014, Article ID 586210, 14 pages. http://dx.doi.org/10.1155/2014/586210


is completed neither by collaborative multiple sensors nor by exploiting target objects equipped with an additional signal transmission component; (2) the proposed distributed Voronoi-based target tracking approach lets the target be tracked by only a single sensor at a time, without the need for cooperation among multiple visual sensors; (3) the handover protocol decides when to hand over using only the current tracker, based on the geometric Voronoi cell structure, and needs no cooperation information from multiple sensors to compute and make the decision; and (4) the proposed Voronoi-based tracking scheme can ensure the visual quality of the target because it is a shortest-distance-based tracking and handover scheme.

The remainder of the paper is organized as follows. Section 2 reviews the related works on target localization and tracking for WSNs and VSNs. Section 3 describes the assumptions and preconditions for this study. Section 4 describes the proposed scheme of Voronoi-based target tracking and sensor handover. Section 5 evaluates the performance by simulations. Finally, concluding remarks are made in Section 6.

2. Related Works

For many applications of WSNs, utilization of a localization technology is a requirement to perform certain functions. Localization techniques utilized in WSNs provide useful location information for subjects such as deployment, coverage, routing, tracking, and rescue. They can be categorized into sensor localization and target localization [11]. Both categories of localization in WSNs have drawn considerable attention from researchers, and the solution schemes can be classified as coarse grained or fine grained, GPS free or GPS based, anchor free or anchor based, centralized or distributed, range free or range based, and stationary or mobile sensors [12, 13]. Target localization is specifically for the applications of surveillance and monitoring, because this kind of application is usually interested in not only the existence but also the location of a target object. Most of the target localization schemes used in scalar WSNs are not applicable to VSNs due to the characteristic of the directional FoV of a visual/camera sensor. A sensor selection-based target localization algorithm was proposed in [14]. A set of camera sensors that can detect the target with a high probability is selected as candidates, and then certain of the candidates are selected for the estimation of the target position. The algorithm needs numerous camera sensors deployed with high density in the surveillance region, and the target needs to be observed/detected synchronously by a number of sensors. In [15], the authors also proposed a sensor selection algorithm to select a set of camera sensors for the improvement of target localization accuracy. That algorithm also needs the target to appear in the overlapped area of the FoVs of several camera sensors. In [16], coarse-grained and fine-grained localization procedures are performed, respectively: ordinary wireless sensors are used for a rough estimation of the target position, and camera sensors are used to determine the accurate target location. However, it is assumed that targets are equipped with a wireless transceiver to calculate the Received Signal Strength Indication (RSSI) and transmit information to a server. The calculation of the target position is completed in the server, so it is a centralized scheme. In our study, target localization can be completed by only one sensor, and there is no need to equip the targets with an additional localization device or signal transmitter.

Regarding target tracking, numerous related works aimed at tracking targets in ordinary WSNs [17–20]. Data delivery delay time and network lifetime were considered for target tracking in [17]. A heterogeneous wireless sensor network and a mobile wireless sensor network were used in the studies [18, 19], respectively. In [20], boundary nodes of a monitoring region can be found and used to detect the entry and exit of a target through the monitoring region. For VSNs, the related work [16] used a hybrid sensor network composed of ordinary and camera sensors for target tracking; the sensors are regularly deployed and arranged in an array, which is different from a random deployment. The work [21] focuses on cooperative multiobject tracking among multiple camera sensors in a VSN established with a highly correlated sensing mode; in that approach, global information is needed by each camera sensor. In [22], a centralized 2-camera system was proposed for detecting and tracking humans in a realistic indoor home environment. The work [23] utilizes Graphics Processing Unit (GPU) coprocessors to compare image frames of a camera sensor so as to acquire human position and velocity for tracking and handoff. In [24], multiple camera sensors which have detected the same target cooperate to track it. An optimal sensor is selected from these sensors according to a confidence measurement, and this optimal sensor is mainly responsible for the tracking task. Whenever the target is detected by a new camera sensor or there is a target movement, a new optimal camera sensor is selected; this tracking handoff scheme needs the communication of cooperation data among these sensors. In our Voronoi-based handover protocol, there is no need for cooperation data on target detection and tracking from multiple sensors. In [25], an observation correlation coefficient (OCC) is defined as the ratio of the overlap area of two cameras to the FoV of one camera. It is used to determine a set of cameras to activate so as to improve the accuracy of target tracking. The scheme works under the cooperation of multiple sensors which observed the target, and more camera sensors need to be deployed in the monitoring field, bringing coverage overlaps, for performing the algorithm. Our proposed protocol can be performed without highly overlapped coverage among sensors; on the contrary, our scheme makes camera sensors reduce FoV overlaps to achieve a high overall field coverage, and it can track the target with a single camera sensor. Table 1 shows a summary comparing the proposed scheme with the related works.

3. Preliminaries

3.1. Voronoi Diagram. In this study, the properties of a Voronoi diagram are utilized to propose the sensor handover


Table 1: Summary of comparison with related works.

[16] — Uses a hybrid sensor network composed of ordinary and camera sensors for target tracking.
  Related work: (i) targets are equipped with a radio transceiver for RSSI calculation and location estimation; (ii) sensors are regularly deployed and arranged in an array; (iii) a centralized approach.
  Proposed scheme in this paper: (i) no additional component to be equipped on the targets; (ii) sensors are randomly deployed; (iii) a distributed approach.

[21] — Focuses on the solution of cooperative multiobject tracking among multiple camera sensors.
  Related work: (i) cooperative tracking by relevant multiple sensor nodes, with the VSN established in a highly correlated sensing mode; (ii) assumes that the relative positions and sensing parameters of all sensors are known by each sensor; (iii) multitarget tracking.
  Proposed scheme in this paper: (i) tracks the target by a single camera sensor based on the Voronoi cell structure; (ii) does not need global information; (iii) each sensor aims to track one target, but the algorithm can be extended to track multiple targets with one sensor.

[22] — Detects and tracks humans in a realistic home environment by exploiting both color and depth camera sensors.
  Related work: (i) designed for an indoor smart home environment; (ii) a 2-camera system that fuses images from two channels to achieve a high accuracy rate; (iii) a centralized system.
  Proposed scheme in this paper: (i) designed for large-scale VSNs; (ii) tracks the target by a single camera sensor based on the Voronoi cell structure; (iii) a distributed algorithm.

[23] — Utilizes GPU coprocessors to compare image frames and obtain human position and velocity for target tracking.
  Related work: (i) needs a Graphics Processing Unit (GPU); (ii) projects several hundred grid-based ellipsoids on each image frame to compare with the image.
  Proposed scheme in this paper: (i) uses a general camera sensor without an additional GPU; (ii) uses Voronoi cells for target tracking, not a grid-based structure.

[24] — Presents a cooperative multicamera target tracking method based on a node selection scheme.
  Related work: (i) the tracking handoff scheme needs the communication of cooperation data of multiple sensors to select the optimal tracker; (ii) whenever the target is detected by a new camera sensor or there is a target movement, a new optimal camera sensor is selected.
  Proposed scheme in this paper: (i) the Voronoi-based handover protocol does not need the cooperation data of multiple sensors; (ii) no sensor selection procedure.

[25] — Defines an observation correlation coefficient (OCC) used to determine a set of cameras to activate to improve the accuracy of target tracking.
  Related work: (i) at least two cameras are needed to determine the target location; (ii) tracking under the cooperation of multiple sensors; (iii) more camera sensors need to be deployed with coverage overlaps for performing the algorithm.
  Proposed scheme in this paper: (i) tracks the target by a single camera sensor based on the Voronoi cell structure; (ii) performs without highly overlapped sensor coverage; on the contrary, overlap is reduced to increase the overall coverage ratio.

protocol for target tracking in a VSN. A Voronoi diagram, as shown in Figure 1, has the following properties:

(1) It divides an area into convex polygons, called Voronoi cells, according to a given set of points.

(2) Each of the given points lies in exactly one Voronoi cell.

(3) The common/shared edge of two adjacent Voronoi cells is the bisector line between the two points in those cells.

(4) For any position q which lies in the cell associated with point p, the distance between q and p is shorter than the distance between q and the point associated with any other cell.

(5) For an area with a given set of points, the Voronoi diagram is unique.

A well-known algorithm to construct a Voronoi diagram is Fortune's algorithm [26], which is a centralized algorithm. It can generate the Voronoi diagram with global information on the positions of a set of points in a plane. In a VSN, the deployed visual sensors can be treated as the set of points for the construction of the Voronoi diagram of the surveillance region. For any object which appears in a Voronoi cell, the sensor in the same cell can obtain the optimal sensing quality (the clarity of the picture) of the object and has the strongest reaction capacity to the object. Therefore, the concept of the Voronoi diagram is used in the proposed protocol. However, in this study, the Voronoi diagram is constructed in a distributed


Figure 1: A Voronoi diagram.

and localized manner by the deployed sensors, without global information. It is not constructed by using a centralized algorithm, which needs the global information. Each sensor constructs its own Voronoi cell with local information, and the local Voronoi cells constructed by all the deployed sensors form a complete Voronoi diagram. The details of the construction of a Voronoi cell in this study are described in Section 4.1.
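Property (4) is what makes the Voronoi cell useful for handover: the owner of the cell containing a detected target is exactly the sensor nearest to it, so the owner can be found by a plain nearest-neighbor search. A minimal sketch (the sensor coordinates below are illustrative assumptions, not values from the paper):

```python
import math

def voronoi_owner(sensors, target):
    """Return the index of the sensor whose Voronoi cell contains `target`.
    By property (4) of the Voronoi diagram, this is the nearest sensor."""
    return min(range(len(sensors)),
               key=lambda i: math.dist(sensors[i], target))

# Three hypothetical sensor positions; the target falls in cell 1.
sensors = [(1.0, 1.0), (4.0, 1.0), (2.5, 4.0)]
owner = voronoi_owner(sensors, (3.2, 1.4))
print(owner)  # prints 1
```

This brute-force check is only a reference; the protocol itself never needs a global search, since each sensor decides locally from its own cell.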

3.2. Assumptions. The basic assumptions in this study are described as follows.

(1) The visual sensors are homogeneous and randomly deployed in the surveillance region at the initial phase. They are stationary but direction rotatable.

(2) Each visual sensor can obtain its coordinates by a localization technology and has enough communication range, or can use a multihop transmission method, to transmit information to its neighbor sensors.

(3) The objects to be tracked have no positioning component; their coordinates are originally unknown.

(4) The type of target object in the tracking system is determinate, or it can be simply recognized by the shape of the detected object. Accordingly, the height of the target object can be given approximately.

3.3. Visual Sensing Model. The visual sensors used in this study are directional, and, as mentioned in the assumptions of Section 3.2, they are direction rotatable. The effective sensing area (field of view) of a visual sensor is sector shaped. The sensing model for the visual sensors in the proposed protocol is shown in Figure 2, and the related notations are listed in Table 2.

Figure 2: Sensing model for visual sensors. (The field of view is a sector with its apex at the sensor position (x_s, y_s), radius r, angle of view α, and sensing direction θ_s measured from the positive x-axis.)

Table 2: Notations for the sensing model.

  (x_s, y_s)  Coordinates of the visual sensor
  α           Angle of view (AoV) of the visual sensor
  r           Effective sensing radius of the visual sensor
  θ_s         Sensing direction of the visual sensor (direction angle relative to the positive x-axis)

4. Voronoi-Based Sensor Handover Protocol (VSHP)

4.1. Local Voronoi Cell Construction and Sensing Coverage Enhancement. In this study, the concept of the Voronoi diagram is utilized for dividing the surveillance region into numerous convex polygons, namely, Voronoi cells. The sensors deployed in the surveillance region can be treated as the points in a Voronoi diagram, and each sensor belongs to one and only one cell in the diagram. As mentioned in Section 3.1, a Voronoi diagram can be generated from a given set of points in a plane by the well-known Fortune's algorithm, which is a centralized algorithm. However, since this study aims at proposing a distributed protocol, the sensors only construct their own local Voronoi cells in a distributed fashion, without the global information of the sensor network.

After the random deployment of sensors, each sensor broadcasts its coordinates to, and receives coordinates from, its neighbors. Figure 3 illustrates the construction of a local Voronoi cell with only the local information of neighbor positions. The sensor s_t0 receives coordinates from s_t1, s_t2, ... in sequence and constructs the corresponding bisector line segments one at a time. Finally, s_t0 obtains the structure of its local Voronoi cell, enclosed by these bisector line segments.

Because the visual sensors are randomly deployed, their positions and sensing directions are also random at the initial phase. A coverage enhancement is needed to reduce the overlaps of the sensing coverage of these visual sensors; then the overall coverage ratio of the surveillance region can be improved for the task of target tracking. The algorithm in our previous work [27] is utilized for coverage enhancement in this study.
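The per-sensor construction described above amounts to iterative half-plane clipping: starting from the surveillance region's boundary polygon, the sensor clips away the far side of the perpendicular bisector toward each neighbor it hears from. A minimal sketch under assumed coordinates (the square region and the sensor positions are hypothetical):

```python
def clip_halfplane(poly, a, b, c):
    """Keep the part of the convex polygon `poly` where a*x + b*y <= c."""
    out = []
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        in1 = a * x1 + b * y1 <= c
        in2 = a * x2 + b * y2 <= c
        if in1:
            out.append((x1, y1))
        if in1 != in2:  # edge crosses the bisector: add the intersection point
            t = (c - a * x1 - b * y1) / (a * (x2 - x1) + b * (y2 - y1))
            out.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return out

def local_voronoi_cell(me, neighbors, region):
    """Clip `region` by the bisector toward each neighbor, one at a time,
    as sensor s_t0 does panel by panel in Figure 3."""
    cell = region
    for (nx, ny) in neighbors:
        # Points closer to `me` than to the neighbor satisfy
        # (n - m) . p <= (|n|^2 - |m|^2) / 2.
        a, b = nx - me[0], ny - me[1]
        c = (nx * nx + ny * ny - me[0] ** 2 - me[1] ** 2) / 2.0
        cell = clip_halfplane(cell, a, b, c)
    return cell

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
# One neighbor to the right of (5, 5): the cell is the square cut at x = 7.
cell = local_voronoi_cell((5, 5), [(9, 5)], square)
```

Each additional neighbor adds one clipping pass, so the cell shrinks monotonically toward the final local Voronoi cell.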


Figure 3: Construction of a local Voronoi cell. (Panels 1–8 show sensor s_t0 adding the bisector line segments toward neighbors s_t1 through s_t7 one at a time until the cell is enclosed.)

Figure 4: Dynamic background subtraction for object and movement detection. (Subtracting the background image f(x, y, t_i) from the subsequent image f(x, y, t_{i+1}) yields the detected object; subtracting the updated background from a newer image f(x, y, t_{i+2}) yields the object movement.)

4.2. Object Detection. In a tracking system, object detection is necessary prior to object tracking. For tracking in a VSN, the detection can be done by object segmentation from the camera scenes. Background subtraction [28–30] is an important approach for object segmentation in visual surveillance systems, which compares two images to acquire their difference. A simple illustration of background subtraction is shown in Figure 4. A foreground intruder object can be extracted by comparing the current image frame f(x, y, t_{i+1}) with the previous reference background image f(x, y, t_i). Furthermore, a dynamic background subtraction approach keeps the background image updated for the next comparison with a newer incoming image, so as to detect object movement. Dynamic background subtraction can be defined by the following equation, where f_diff(x, y, t_{i+1}) is the image frame of the difference between the background frame and the subsequent frame:

$$f_{\mathrm{diff}}\left(x, y, t_{i+1}\right) = \left| f\left(x, y, t_i\right) - f\left(x, y, t_{i+1}\right) \right|. \tag{1}$$
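Equation (1) can be sketched as a thresholded absolute difference of frames; the frame sizes, intensities, and threshold below are illustrative assumptions:

```python
import numpy as np

def frame_difference(background, frame, threshold=25):
    """Eq. (1): absolute difference between the background frame and the
    newer frame; pixels above `threshold` are treated as foreground."""
    f_diff = np.abs(background.astype(np.int16) - frame.astype(np.int16))
    return (f_diff > threshold).astype(np.uint8)

# Toy 8-bit grayscale frames: a "target" brightens a 2x2 patch.
bg = np.full((6, 6), 50, dtype=np.uint8)
frame = bg.copy()
frame[2:4, 2:4] = 200
mask = frame_difference(bg, frame)
print(int(mask.sum()))  # prints 4 (four foreground pixels detected)

# Dynamic update: keep the newest frame as the background so that the
# next difference reveals object movement rather than the whole object.
bg = frame
```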


Table 3: Notations for calculation of target object direction.

  θ_t    Angle of the direction from the sensor to the target, relative to the positive x-axis
  β      Included angle between the line of sight from the sensor to the target and the right edge of the field of view (FoV)
  W_R    Horizontal resolution (width in pixels) of the image
  w_t    Width (in pixels) of the distance between the target object center and the right edge of the image
  w_LB   Width (in pixels) of the distance between the left edge of the detected target object and the right edge of the image
  w_RB   Width (in pixels) of the distance between the right edge of the detected target object and the right edge of the image

4.3. Target Localization. Once an incoming object is detected by a visual sensor, the sensor will estimate the actual position of the object on the ground plane. To estimate the object position, first the direction of the object is calculated, and second the distance of the object. Regarding the object direction, as shown in Figure 5, the view from the visual sensor is limited by its angle of view, and the captured scene is shown on the image in a rectangular shape. Things in one line of sight appear in a vertical line on the image. As the illustration shows, the direction of the detected target object (a human) can be calculated by the following equations, with the notations listed in Table 3:

$$\beta = \alpha \cdot \frac{w_t}{W_R} = \alpha \cdot \frac{w_{LB} + w_{RB}}{2 W_R}, \qquad \theta_t = \theta_s - \frac{\alpha}{2} + \beta. \tag{2}$$
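A sketch of (2), assuming angles in radians and the pixel widths of Table 3 (the numeric values in the example call are hypothetical):

```python
import math

def target_direction(theta_s, alpha, w_lb, w_rb, W_R):
    """Eq. (2): direction of the target relative to the positive x-axis.
    theta_s: sensing direction; alpha: angle of view; w_lb, w_rb: pixel
    distances of the object's left/right edges from the image's right edge;
    W_R: horizontal resolution of the image."""
    w_t = (w_lb + w_rb) / 2.0      # pixel offset of the object center
    beta = alpha * w_t / W_R       # angle measured from the FoV's right edge
    return theta_s - alpha / 2.0 + beta

# Object centered in the image (w_t = W_R / 2): theta_t equals theta_s.
theta = target_direction(math.pi / 3, math.radians(60), 380, 260, 640)
```

The centered-object case is a quick sanity check: β collapses to α/2, so θ_t = θ_s.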

Regarding the distance of the object, as shown in Figure 6, the image of the real object is projected reversely onto the internal image sensor (usually a CCD or CMOS sensor) through the camera lens, and then the electronic signals generated by the internal image sensor are processed with an image processor to form the digital image. Since the target object is assumed to have a height of a given value, the distance between the target object and the visual sensor can be calculated by the following equations, with the notations listed in Table 4:

$$\tan\lambda = \frac{h_t}{d_t} = \frac{h_s}{d_s}, \qquad d_t = \frac{h_t d_s}{h_s} = \frac{h_t d_s}{H_S \left( h_{BB} - h_{TB} \right) / H_R} = \frac{h_t d_s H_R}{H_S \left( h_{BB} - h_{TB} \right)}. \tag{3}$$
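A sketch of (3); the camera geometry values below (lens-to-sensor distance, sensor height, resolutions, pixel rows) are hypothetical, not taken from the paper:

```python
def target_distance(h_t, d_s, H_S, H_R, h_bb, h_tb):
    """Eq. (3): d_t = h_t * d_s * H_R / (H_S * (h_BB - h_TB)).
    h_t: assumed real object height (m); d_s: lens-to-sensor distance (m);
    H_S: physical height of the internal image sensor (m);
    H_R: vertical resolution in pixels; h_bb, h_tb: pixel rows of the
    object's bottom/top edges measured from the top of the image."""
    h_s = H_S * (h_bb - h_tb) / H_R  # captured object height on the sensor
    return h_t * d_s / h_s

# Hypothetical example: a 1.70 m person spans 170 px of a 480 px image
# on a 4.8 mm sensor with a 4 mm lens-to-sensor distance.
d = target_distance(1.70, 0.004, 0.0048, 480, 400, 230)  # ~4.0 m
```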

If the height of the target object is not exactly equal to the given assumed value and there is a difference of ε, a calculation error of the distance will occur. As shown in Figure 6, if the actual height of the object is h'_t = h_t + ε and the image

Table 4: Notations for calculation of target object distance.

  h_t    Assumed (given) height of the target object
  d_t    Distance between the visual sensor and the target object
  λ      Included angle of the two lines of sight to the top and bottom of the target object
  H_S    Height of the internal image sensor
  h_s    Height of the captured target object on the internal image sensor
  d_s    Distance between the internal image sensor and the camera lens
  H_R    Vertical resolution (height in pixels) of the image
  h_BB   Height (in pixels) of the distance between the bottom edge of the object and the top edge of the image
  h_TB   Height (in pixels) of the distance between the top edge of the object and the top edge of the image

height of the captured object is the same, the system believes that the object distance is d_t after the calculation. However, the actual distance should be d'_t, which can be calculated by the following equations. Equation (7) shows that there is an error ratio of ε/h_t between the calculated and actual object distances. For instance, if h_t is given as a value of 170 cm and the calculated object distance d_t is 30 m, but the actual height h'_t of the object is 175 cm, then the actual distance differs by an error of (5 cm / 170 cm) × 30 m ≈ 0.88 m. We believe that the distance error ratio of ε/h_t is tolerable.

Let

$$k = \frac{d_s H_R}{H_S \left( h_{BB} - h_{TB} \right)}, \tag{4}$$

$$d_t = h_t k, \tag{5}$$

$$d'_t = \left( h_t + \varepsilon \right) k = h_t k + \varepsilon k = d_t + \varepsilon \frac{d_t}{h_t}, \tag{6}$$

$$d'_t = d_t \left( 1 + \frac{\varepsilon}{h_t} \right). \tag{7}$$
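The error bound of (7) can be checked directly against the 170 cm / 175 cm example above:

```python
def distance_error(d_t, h_t, eps):
    """Eq. (7): with actual height h_t + eps, the true distance is
    d_t * (1 + eps / h_t), i.e. the absolute error is d_t * eps / h_t."""
    return d_t * (eps / h_t)

# h_t given as 170 cm, computed distance 30 m, actual height 175 cm:
err = distance_error(30.0, 170.0, 5.0)
print(round(err, 2))  # prints 0.88 (meters)
```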

As shown in Figure 7, (x_s, y_s) and (x_t, y_t) indicate the coordinates of the visual sensor and the target object, respectively. Once both the direction and the distance of the target object are calculated, the coordinates of the object can be obtained as follows; they will be used in the control of sensor handover:

$$x_t = x_s + d_t \cos\theta_t, \qquad y_t = y_s + d_t \sin\theta_t. \tag{8}$$
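A sketch of (8), projecting the estimated distance along the estimated direction (the coordinates in the example call are hypothetical):

```python
import math

def target_coordinates(x_s, y_s, d_t, theta_t):
    """Eq. (8): ground-plane coordinates of the target from the sensor
    position, the estimated distance d_t, and the direction theta_t."""
    return (x_s + d_t * math.cos(theta_t),
            y_s + d_t * math.sin(theta_t))

# Target 5 m from the sensor along the positive y-axis -> (10.0, 25.0).
x_t, y_t = target_coordinates(10.0, 20.0, 5.0, math.pi / 2)
```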

4.4. Sensor Handover. After the visual sensor obtains the coordinates of the target object, it adjusts its sensing direction toward the target and then detects object movement by the dynamic background subtraction mentioned in Section 4.2. Once a movement of the target object is detected, the new


Figure 5: Calculation of object (target) direction. (Labels: FoV angle α, sensing direction θ_s, target direction θ_t, and the pixel widths w_t, w_LB, w_RB out of the image width W_R.)

Figure 6: Calculation of object (target) distance. (The object of height h_t (actual height h'_t = h_t + ε) at distance d_t is projected through the camera lens onto the internal image sensor (CCD or CMOS) of height H_S at distance d_s, appearing with height h_s; h_TB and h_BB are pixel heights in the H_R-pixel-tall image.)

position of the object is calculated again by the localization scheme described in Section 4.3. This is a routine procedure for local target tracking by the visual sensor until the tracking task is handed over.

To determine whether the visual sensor will hand over the task of tracking the target object, the sensor first utilizes the structure of the constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with the sensor s is divided into several triangular subregions according to the vertices of the cell, and the target object t is located in the subregion Δv_Ri s v_Li, with an included angle of ∠v_Ri s v_Li = φ.

A target object t detected by a sensor s belongs to one of the subregions if and only if all of the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

d_t ≤ r.  (9)

(2) The distance between the object and the sensor is less than or equal to that between the object and the neighbor sensor on the opposite side of the bisector line (edge) facing the included angle of the subregion:

√((x_t − x_s)² + (y_t − y_s)²) ≤ √((x_t − x_si)² + (y_t − y_si)²).  (10)

(3) The object is located inside the two edges of the included angle of the subregion:

(x_t − x_s)(x_Li − x_s) + (y_t − y_s)(y_Li − y_s) ≥ [√((x_t − x_s)² + (y_t − y_s)²) / √((x_Ri − x_s)² + (y_Ri − y_s)²)] × ((x_Li − x_s)(x_Ri − x_s) + (y_Li − y_s)(y_Ri − y_s)),  (11)

(x_t − x_s)(x_Ri − x_s) + (y_t − y_s)(y_Ri − y_s) ≥ [√((x_t − x_s)² + (y_t − y_s)²) / √((x_Li − x_s)² + (y_Li − y_s)²)] × ((x_Li − x_s)(x_Ri − x_s) + (y_Li − y_s)(y_Ri − y_s)).  (12)
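The three membership conditions can be collected into a single test. The sketch below is an assumed helper (names are mine, not the paper's), with points as (x, y) tuples and the subregion vertex pair v_Li, v_Ri passed in explicitly:

```python
import math

def in_subregion(s, t, si, vL, vR, r):
    """Check conditions (9)-(12): does target t lie in the triangular
    subregion of sensor s faced by neighbor si, with vertices vL and vR?
    All points are (x, y) tuples; r is the sensing radius."""
    xs, ys = s
    xt, yt = t
    st = (xt - xs, yt - ys)
    svL = (vL[0] - xs, vL[1] - ys)
    svR = (vR[0] - xs, vR[1] - ys)
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1]
    norm = lambda a: math.hypot(a[0], a[1])
    # (9): the object is within the sensing radius.
    if norm(st) > r:
        return False
    # (10): the object is closer to s than to the facing neighbor si.
    if norm(st) > math.hypot(xt - si[0], yt - si[1]):
        return False
    # (11): the object is not outside the left edge s-vL.
    if dot(st, svL) < norm(st) / norm(svR) * dot(svL, svR):
        return False
    # (12): the object is not outside the right edge s-vR.
    if dot(st, svR) < norm(st) / norm(svL) * dot(svL, svR):
        return False
    return True
```

For example, with s at the origin, neighbor si at (10, 10), and vertices (10, 0) and (0, 10) on their bisector, a target at (3, 3) passes all four tests, while one at (5, −1) falls outside the right edge.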

Equation (11) states that the target object is not located outside the left edge sv_Li, and (12) states that it is not located outside the right edge sv_Ri. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra:

→sv_Li · →sv_Ri = ‖→sv_Li‖ ‖→sv_Ri‖ cos φ,  (13)

cos φ = (→sv_Li · →sv_Ri) / (‖→sv_Li‖ ‖→sv_Ri‖).  (14)

The angle ∠v_Li s t must be less than or equal to φ; therefore,

→st · →sv_Li ≥ ‖→st‖ ‖→sv_Li‖ cos φ,

→st · →sv_Li ≥ ‖→st‖ ‖→sv_Li‖ · (→sv_Li · →sv_Ri) / (‖→sv_Li‖ ‖→sv_Ri‖),

⟨x_t − x_s, y_t − y_s⟩ · ⟨x_Li − x_s, y_Li − y_s⟩ ≥ [‖⟨x_t − x_s, y_t − y_s⟩‖ / ‖⟨x_Ri − x_s, y_Ri − y_s⟩‖] · (⟨x_Li − x_s, y_Li − y_s⟩ · ⟨x_Ri − x_s, y_Ri − y_s⟩).  (15)

Then (11) is derived. Similarly, the angle ∠v_Ri s t must be less than or equal to φ; thus →st · →sv_Ri ≥ ‖→st‖ ‖→sv_Ri‖ cos φ, and (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, four conditions can arise while the sensor keeps tracking the moving target object.

(1) As shown in Figure 9(a), the target object can still be detected and is still located in the same triangular subregion; that is, (9), (10), (11), and (12) are still satisfied under the same pair of vertices v_Li and v_Ri.

Figure 9: Different target tracking conditions: (a) Condition 1; (b) Condition 2; (c) Condition 3; (d) Condition 4.

Under this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. Under this condition, the sensor calculates and obtains a new pair of vertices v_Lj and v_Rj for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line v_Li v_Ri between the two sensors s and s_i; that is, (10) is no longer satisfied, and the object has moved from the original Voronoi cell into the Voronoi cell belonging to the sensor s_i. This means that s_i will obtain a better sensing (image) quality than s and is more suitable to track the target object. Under this condition, the sensor s sends a request message with the object coordinates to s_i and hands over the tracking task. After receiving the request message, s_i adjusts its sensing direction toward the target object and replies with a message to s to confirm taking over the tracking task.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of the sensor s; that is, (9) is no longer satisfied. This means that the sensor has lost track of

Figure 10: Procedures of the proposed VSHP (flowchart nodes: random deployment; broadcast sensor coordinates; receive neighbor coordinates; construct local Voronoi cell; coverage enhancement; object detection; target localization; target movement detection; conditions 1–4 leading to normal tracking, subregion change, handover to the faced sensor, or handover requests to the selected sensors).

the target object. Under this condition, the sensor sends a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message adjust their sensing directions toward the received coordinates and wait (detect) for the possible reappearance of the target object. For example, although the target object has been lost, it may move around and reappear near one of the vertices of the triangular subregion (e.g., v_Ri). Accordingly, the sensor s sends a request message to both s_i and s_j to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for the request message, where (x_tf, y_tf) and V_sk denote the final detected coordinates of the target and the set of local Voronoi vertices of sensor s_k, respectively:

V′ = {v | v ∈ {v_Li, v_Ri}, √((x_v − x_s)² + (y_v − y_s)²) > r},

v″ = argmin_{v∈V′} √((x_v − x_tf)² + (y_v − y_tf)²)  if √((x_si − x_s)² + (y_si − y_s)²)/2 > r,
v″ = V′  otherwise,

S_req = {s_k | v″ ∈ V_sk} − {s}.  (16)

The probability that a target can be detected by a visual sensor is equal to the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of losing the target is e^(−λπr²), where λ is the density of visual sensors and r is the sensing radius. This probability equals 1 − R_omni, where R_omni is the omnidirectional (circular) coverage ratio of the sensors. The reason is that one visual sensor will be notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; the handover fails only if no other sensor can cover the target even after rotating its sensing direction.
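This relation follows from the standard Poisson-deployment coverage formula, R_omni = 1 − e^(−λπr²). A quick numerical check with the Table 5 defaults (illustrative values only; the mapping of 100 sensors on an 800 m × 800 m field to a density λ is my assumption):

```python
import math

def miss_probability(density, radius):
    """P(target lost) = e^(-lambda * pi * r^2) for Poisson-deployed
    sensors with density `density` per unit area and sensing radius r."""
    return math.exp(-density * math.pi * radius ** 2)

# 100 sensors on an 800 m x 800 m field (Table 5 defaults), r = 70 m:
lam = 100 / (800 * 800)
p_miss = miss_probability(lam, 70)
r_omni = 1 - p_miss          # omnidirectional coverage ratio
print(round(p_miss, 3), round(r_omni, 3))
```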

As described in Section 4.1, after the visual sensors are initially deployed, a sensing coverage enhancement algorithm is performed. Whenever a sensor keeps tracking a moving target object, or several sensors are notified to adjust their sensing directions to wait for a possible reappearance of a lost target, their sensing directions change and the overlapped coverage can increase; thus the overall field coverage can be reduced. This has a negative effect on further object detection and tracking. Therefore, in the proposed protocol, sensors that have changed their sensing directions return to their original directions if they have lost the target or have waited for its reappearance for longer than a time threshold t_ret.
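Taken together, conditions 1–4 amount to a small per-detection decision rule for the current tracker. The sketch below abstracts the geometric tests (9)–(12) into booleans and uses action names of my own choosing, so it is an illustration of the control flow rather than the paper's exact implementation:

```python
def handover_decision(in_range, same_cell, same_subregion):
    """Map the four tracking conditions of VSHP onto actions.
    in_range:       condition (9) holds for the current tracker
    same_cell:      condition (10) holds (target on tracker's side)
    same_subregion: (11)/(12) hold for the current vertex pair
    """
    if not in_range:
        # Condition 4: target left the sensing radius; ask the sensors
        # around the vertices selected by eq. (16) to watch for it.
        return "notify_selected_sensors"
    if not same_cell:
        # Condition 3: target crossed the bisector; hand over to the
        # facing neighbor, which now has the shorter viewing distance.
        return "handover_to_facing_sensor"
    if not same_subregion:
        # Condition 2: recompute the vertex pair for the new subregion.
        return "update_vertex_pair"
    # Condition 1: keep tracking locally.
    return "keep_tracking"
```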

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume the energy of a sensor. This can cause energy exhaustion and malfunction in wireless VSN applications. A recent research topic and current trend concerning the energy issue of sensor networks is energy harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee that future sensor networks will consist of sensor devices integrating energy harvesting technology. This paper aims at target localization and tracking in VSNs; therefore, the energy issue is not addressed here. However, the proposed algorithm still performs well when some of the sensors malfunction, because the Voronoi-based design enables visual sensors to easily reconstruct their local Voronoi cells and to keep

Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m).

the VSN tracking operations working until most (or all) of the sensors have failed. The protocol thus exhibits fault tolerance and graceful degradation.

On the other hand, the tracking algorithm described above focuses on tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and each can be tracked by one sensor. If multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain within the FoV of the sensor associated with that Voronoi cell or (2) each individual target is located within the sensing radius of some other visual sensor that can be notified to take it over.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long a sensor takes to detect the target after the target has moved into the surveillance region. The target-tracked ratio indicates the total duration for which the target is tracked during its movement across the surveillance region. The average target distance represents the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the region according to a random waypoint mobility model [34]. Target tracking services with and without the proposed protocol are evaluated and compared using the above-mentioned QoS criteria. Table 5 shows the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under OVSN

Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m).

Figure 13: Average target distance (fixed α = 105°, r = 70 m).

and under the proposed VSHP approach, respectively. The ratio of target-detected latency (r_L) is defined in (17), where t_D represents how long a sensor takes to detect the target after the target has moved into the surveillance region and t_SR is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is less than that of OVSN regardless of the number of deployed sensors (n = 50–100); VSHP shortens the time to detect the target by about 4% of the target's total travel time, owing to the coverage enhancement in VSHP. The simulation result also shows a coverage improvement of about 5.5% with VSHP:

r_L = t_D / t_SR.  (17)

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a higher target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN, owing to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by

Table 5: Parameter settings of the simulations.

Notation | Description | Value | Default
F | Size of the surveillance region | 800 m × 800 m | —
n | Number of the visual sensors | 50–100 (interval 10) | 100
α | Angle of view (AoV) of the visual sensors | 60°–120° (interval 15°) | 105°
r | Sensing radius of the visual sensors | 50 m–90 m (interval 10 m) | 70 m

Figure 14: Coverage and target-detected latency (fixed n = 100).

Figure 15: Target-tracked ratio (fixed n = 100).

the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP has a shorter average target distance than OVSN, because the handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors increases from 50 to 100, which indicates that VSHP can select more appropriate sensors for target tracking from a larger number of deployed sensors.

Figure 16: Average target distance (fixed n = 100).

The results given above use various values of n but fixed values of α and r. Simulation results for fixed n with various α and r are given in Figures 14, 15, and 16, which show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtains higher coverage ratios and lower target-detected latency than OVSN, the target-tracked ratio of VSHP is higher than that of OVSN, and VSHP obtains a shorter average target distance than OVSN. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach, and four QoS criteria for target tracking were examined. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors and to conduct an experimental evaluation of its effectiveness and performance. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based


monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and by the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley and Sons, New York, NY, USA, 2010.
[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.
[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.
[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.
[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.
[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.
[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.
[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.
[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.
[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.
[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.
[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.
[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.
[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.
[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.
[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.
[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.
[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.
[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.
[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.
[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.
[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.
[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.
[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.
[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.
[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.
[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.
[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.
[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.
[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.
[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.
[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.
[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.
[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.


is completed neither by collaborative multiple sensors nor by exploiting target objects equipped with an additional signal transmission component; (2) the proposed distributed Voronoi-based target tracking approach lets the target be tracked by only a single sensor at a time, without the need for cooperation among multiple visual sensors; (3) the handover protocol decides when to hand over using only the current tracker, based on the geometric Voronoi cell structure, and needs no cooperation information from multiple sensors to compute and make the decision; and (4) the proposed Voronoi-based tracking scheme can ensure the visual quality of the target, because it is a shortest-distance-based tracking and handover scheme.

The remainder of the paper is organized as follows. Section 2 briefly reviews related works on target localization and tracking for WSNs and VSNs. Section 3 describes the assumptions and preconditions of this study. Section 4 describes the proposed scheme of Voronoi-based target tracking and sensor handover. Section 5 evaluates the performance by simulations. Finally, concluding remarks are made in Section 6.

2. Related Works

For many applications of WSNs, utilization of a localization technology is required to perform certain functions. Localization techniques utilized in WSNs provide useful location information for subjects such as deployment, coverage, routing, tracking, and rescue. They can be categorized into sensor localization and target localization [11]. Both categories of localization in WSNs have drawn considerable attention from researchers, and the solution schemes can be classified as coarse grained or fine grained, GPS free or GPS based, anchor free or anchor based, centralized or distributed, range free or range based, and stationary or mobile sensors [12, 13]. Target localization is specifically for applications of surveillance and monitoring, because this kind of application is usually interested in not only the existence but also the location of a target object. Most target localization schemes used in scalar WSNs are not applicable to VSNs due to the directional FoV of a visual/camera sensor. A sensor selection-based target localization algorithm was proposed in [14]: a set of camera sensors that can detect the target with a high probability is selected as candidates, and then certain of the candidates are selected for the estimation of the target position. The algorithm needs numerous camera sensors deployed with high density in the surveillance region, and the target needs to be observed/detected synchronously by a number of sensors. In [15], the authors also proposed a sensor selection algorithm to select a set of camera sensors for improving target localization accuracy; it likewise needs the target to occur in the overlapped area of the FoVs of several camera sensors. In [16], coarse-grained and fine-grained localization procedures are performed respectively: ordinary wireless sensors are used for a rough estimation of the target position, and camera sensors are used to determine the accurate target location. However, it is assumed that targets

are equipped with a wireless transceiver to calculate the Received Signal Strength Indication (RSSI) and transmit information to a server. The calculation of the target position is completed in the server, so it is a centralized scheme. In our study, target localization can be completed by only one sensor, and there is no need to equip the targets with an additional localization device or signal transmitter.

Regarding target tracking, numerous related works aimed at tracking targets in ordinary WSNs [17–20]. Data delivery delay time and network lifetime were considered for target tracking in [17]. A heterogeneous wireless sensor network and a mobile wireless sensor network were used in the studies [18, 19], respectively. In [20], boundary nodes of a monitoring region can be found and used to detect the entry and exit of a target through the monitoring region. For VSNs, the related work [16] used a hybrid sensor network composed of ordinary and camera sensors for target tracking; the sensors are regularly deployed and arranged in an array, which differs from a random deployment. The work [21] focuses on cooperative multiobject tracking among multiple camera sensors in a VSN established with a highly correlated sensing mode; in this approach, global information is needed by each camera sensor. In [22], a centralized 2-camera system was proposed for detecting and tracking humans in a realistic indoor home environment. The work [23] utilizes Graphics Processing Unit (GPU) coprocessors to compare image frames of a camera sensor so as to acquire human position and velocity for tracking and handoff. In [24], multiple camera sensors that have detected the same target cooperate to track it: an optimal sensor is selected according to a confidence measurement and is mainly responsible for the tracking task, and a new optimal camera sensor is selected when the target is detected by a new camera sensor or when the target moves. This tracking handoff scheme needs the communication of cooperation data among the sensors, whereas our Voronoi-based handover protocol needs no cooperation data for target detection and tracking from multiple sensors. In [25], an observation correlation coefficient (OCC) is defined as the ratio of the overlap area of two cameras to the FoV of one camera; it is used to determine which set of cameras to activate to improve the accuracy of target tracking. That scheme works under the cooperation of multiple sensors that have observed the target, and performing the algorithm requires more camera sensors to be deployed in the monitoring field, bringing coverage overlaps. Our proposed protocol can be performed without high overlapped coverage among sensors; on the contrary, our scheme makes camera sensors reduce FoV overlaps to achieve a high overall field coverage, and it can track the target with a single camera sensor. Table 1 shows a summary comparing the proposed scheme with the related works.

3. Preliminaries

3.1. Voronoi Diagram. In this study, the properties of a Voronoi diagram are utilized to propose the sensor handover

International Journal of Distributed Sensor Networks 3

Table 1: Summary of comparison with related works.

[16] Uses a hybrid sensor network composed of ordinary and camera sensors for target tracking.
Related work: (i) targets are equipped with a radio transceiver for RSSI calculation and location estimation; (ii) sensors are regularly deployed and arranged in an array; (iii) a centralized approach.
Proposed scheme: (i) no additional component equipped on the targets; (ii) sensors are randomly deployed; (iii) a distributed approach.

[21] Focuses on cooperative multiobject tracking among multiple camera sensors.
Related work: (i) cooperative tracking by relevant multiple sensor nodes, with the VSN established in a highly correlated sensing mode; (ii) assumes that the relative positions and sensing parameters of all sensors are known by each sensor; (iii) multitarget tracking.
Proposed scheme: (i) tracks the target with a single camera sensor based on the Voronoi cell structure; (ii) no global information needed; (iii) each sensor aims to track one target, but the algorithm can be extended to track multiple targets with one sensor.

[22] Detects and tracks humans in a realistic home environment by exploiting both color and depth camera sensors.
Related work: (i) designed for an indoor smart-home environment; (ii) a 2-camera system that fuses images from two channels to achieve a high accuracy rate; (iii) a centralized system.
Proposed scheme: (i) designed for large-scale VSNs; (ii) tracks the target with a single camera sensor based on the Voronoi cell structure; (iii) a distributed algorithm.

[23] Utilizes GPU coprocessors to compare image frames and obtain human position and velocity for target tracking.
Related work: (i) needs a Graphics Processing Unit (GPU); (ii) projects several hundred grid-based ellipsoids onto each image frame for comparison with the image.
Proposed scheme: (i) uses a general camera sensor without an additional GPU; (ii) uses Voronoi cells for target tracking, not a grid-based structure.

[24] Presents a cooperative multicamera target tracking method based on a node selection scheme.
Related work: (i) the tracking handoff scheme needs the communication of cooperation data of multiple sensors to select the optimal tracker; (ii) whenever the target is detected by a new camera sensor or the target moves, a new optimal camera sensor is selected.
Proposed scheme: (i) the Voronoi-based handover protocol does not need the cooperation data of multiple sensors; (ii) no sensor selection procedure.

[25] Defines an observation correlation coefficient (OCC) used to determine which set of cameras to activate to improve the accuracy of target tracking.
Related work: (i) at least two cameras are needed to determine the target location; (ii) tracking requires the cooperation of multiple sensors; (iii) more camera sensors must be deployed with coverage overlaps to perform the algorithm.
Proposed scheme: (i) tracks the target with a single camera sensor based on the Voronoi cell structure; (ii) performs without highly overlapped sensor coverage; on the contrary, overlap is reduced to increase the overall coverage ratio.

protocol for target tracking in a VSN. A Voronoi diagram, as shown in Figure 1, has the following properties:

(1) It divides an area into convex polygons, called Voronoi cells, according to a given set of points.

(2) Each of the given points lies in exactly one Voronoi cell.

(3) The common (shared) edge of two adjacent Voronoi cells is the bisector line between the two points in those cells.

(4) For any position q that lies in the cell associated with point p, the distance between q and p is shorter than the distance between q and the point associated with any other cell.

(5) For an area with a given set of points, the Voronoi diagram is unique.

Figure 1: A Voronoi diagram.

A well-known algorithm to construct a Voronoi diagram is Fortune's algorithm [26], which is centralized: it generates the Voronoi diagram from the global positions of a set of points in a plane. In a VSN, the deployed visual sensors can be treated as the point set for constructing the Voronoi diagram of the surveillance region. For any object that appears in a Voronoi cell, the sensor in that cell obtains the best sensing quality (the clarity of the picture) of the object and has the strongest reaction capacity to it. Therefore, the concept of the Voronoi diagram is used in the proposed protocol. However, in this study the Voronoi diagram is constructed in a distributed and localized manner by the deployed sensors, without global information; it is not built by a centralized algorithm that requires global knowledge. Each sensor constructs its own Voronoi cell with local information, and the local cells constructed by all deployed sensors together form the complete Voronoi diagram. The details of the Voronoi cell construction in this study are described in Section 4.1.

3.2. Assumptions. The basic assumptions in this study are as follows.

(1) The visual sensors are homogeneous and randomly deployed in the surveillance region at the initial phase. They are stationary but direction rotatable.

(2) Each visual sensor can obtain its coordinates by a localization technology and has sufficient communication range, or can use a multihop transmission method, to transmit information to its neighbor sensors.

(3) The objects to be tracked carry no positioning component; their coordinates are originally unknown.

(4) The type of target object in the tracking system is known in advance, or it can be simply recognized by the shape of the detected object. Accordingly, the height of the target object can be given approximately.

3.3. Visual Sensing Model. The visual sensors used in this study are directional and, as mentioned in the assumptions of Section 3.2, direction rotatable. The effective sensing area (field of view) of a visual sensor is sector shaped. The sensing model for the visual sensors in the proposed protocol is shown in Figure 2, and the related notations are listed in Table 2.

Figure 2: Sensing model for visual sensors.

Table 2: Notations for the sensing model.

(x_s, y_s): coordinates of the visual sensor.
α: angle of view (AoV) of the visual sensor.
r: effective sensing radius of the visual sensor.
θ_s: sensing direction of the visual sensor (direction angle relative to the positive x-axis).

4. Voronoi-Based Sensor Handover Protocol (VSHP)

4.1. Local Voronoi Cell Construction and Sensing Coverage Enhancement. In this study, the concept of the Voronoi diagram is utilized to divide the surveillance region into numerous convex polygons, namely, Voronoi cells. The sensors deployed in the surveillance region can be treated as the points of a Voronoi diagram, and each sensor belongs to one and only one cell in the diagram. As mentioned in Section 3.1, a Voronoi diagram can be generated from a given set of points in a plane by the well-known Fortune's algorithm, which is centralized. This study, however, aims at a distributed protocol: each sensor constructs only its own local Voronoi cell, without global information about the sensor network.

After the random deployment of the sensors, each sensor broadcasts its coordinates to, and receives coordinates from, its neighbors. Figure 3 illustrates the construction of a local Voronoi cell using only the local information of neighbor positions. The sensor s_t0 receives coordinates from s_t1, s_t2, ... in sequence and constructs the corresponding bisector line segments one at a time. Finally, s_t0 obtains the structure of its local Voronoi cell, enclosed by these bisector line segments.

Because the visual sensors are randomly deployed, their positions and sensing directions are also random at the initial phase. A coverage enhancement is needed to reduce the overlaps in the sensing coverage of these visual sensors; the overall coverage ratio of the surveillance region can then be improved for the task of target tracking. The algorithm in our previous work [27] is utilized for coverage enhancement in this study.
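The per-neighbor bisector construction above can be sketched as iterative half-plane clipping. The following is a minimal illustration (not from the paper; function names are ours), assuming the surveillance region is given as a convex polygon: each received neighbor coordinate clips the region by the perpendicular-bisector half-plane closer to the sensor, and what remains is the sensor's local Voronoi cell.

```python
import math

def clip_half_plane(polygon, p, q):
    """Sutherland-Hodgman clip of a convex polygon to the half-plane of
    points at least as close to p as to q (p's side of the perpendicular
    bisector of segment pq)."""
    def closer(v):
        return math.dist(v, p) <= math.dist(v, q)

    result = []
    n = len(polygon)
    for i in range(n):
        a, b = polygon[i], polygon[(i + 1) % n]
        if closer(a):
            result.append(a)
        if closer(a) != closer(b):
            # Edge a-b crosses the bisector: intersect with the line
            # through the midpoint m of pq, normal to d = q - p.
            dx, dy = q[0] - p[0], q[1] - p[1]
            mx, my = (p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0
            denom = (b[0] - a[0]) * dx + (b[1] - a[1]) * dy
            t = ((mx - a[0]) * dx + (my - a[1]) * dy) / denom
            result.append((a[0] + t * (b[0] - a[0]),
                           a[1] + t * (b[1] - a[1])))
    return result

def local_voronoi_cell(sensor, neighbor_positions, region):
    """Clip the surveillance region by one bisector half-plane per
    received neighbor coordinate; the remainder is the local cell."""
    cell = list(region)
    for nb in neighbor_positions:
        cell = clip_half_plane(cell, sensor, nb)
    return cell
```

For example, a sensor at (2, 5) with a single neighbor at (8, 5) in a 10 x 10 region keeps the half of the region left of the bisector x = 5.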


Figure 3: Construction of a local Voronoi cell.

Figure 4: Dynamic background subtraction for object and movement detection.

4.2. Object Detection. In a tracking system, object detection is necessary prior to object tracking. For tracking in a VSN, detection can be done by object segmentation from the camera scenes. Background subtraction [28-30] is an important approach for object segmentation in visual surveillance systems; it compares two images to acquire their difference. A simple illustration of background subtraction is shown in Figure 4. A foreground (intruder) object can be extracted by comparing the current image frame f(x, y, t_{i+1}) with the previous reference background image f(x, y, t_i). Furthermore, a dynamic background subtraction approach keeps the background image updated for the next comparison with a newer incoming image, so as to detect object movement. Dynamic background subtraction can be defined by the following equation, where f_diff(x, y, t_{i+1}) is the difference image between the background frame and the subsequent frame:

\[ f_{\text{diff}}(x, y, t_{i+1}) = \bigl| f(x, y, t_i) - f(x, y, t_{i+1}) \bigr| . \tag{1} \]
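Equation (1) amounts to a per-pixel absolute difference followed by thresholding. A minimal sketch (illustrative only; the paper does not prescribe an implementation) on plain nested lists:

```python
def background_subtract(background, frame, threshold=30):
    """Eq. (1): f_diff = |f(t_i) - f(t_{i+1})|, thresholded to a binary
    foreground mask (1 = pixel changed, 0 = background)."""
    return [[1 if abs(b - f) > threshold else 0
             for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

# Dynamic variant: after detection, the reference background would be
# replaced by the newer frame so the next subtraction isolates movement.
background = [[10] * 8 for _ in range(8)]   # flat 8x8 grey scene
frame = [row[:] for row in background]
frame[2][3] = frame[2][4] = 200             # an "object" enters two pixels
mask = background_subtract(background, frame)
```

The threshold value is an assumption; in practice it would be tuned to the camera's noise level.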


Table 3: Notations for the calculation of target object direction.

θ_t: angle of the direction from the sensor to the target, relative to the positive x-axis.
β: included angle between the line of sight from the sensor to the target and the right edge of the field of view (FoV).
W_R: horizontal resolution (width in pixels) of the image.
w_t: width (in pixels) between the target object center and the right edge of the image.
w_LB: width (in pixels) between the left edge of the detected target object and the right edge of the image.
w_RB: width (in pixels) between the right edge of the detected target object and the right edge of the image.

4.3. Target Localization. Once an incoming object is detected by a visual sensor, the sensor estimates the actual position of the object on the ground plane. To estimate the object position, first the direction of the object is calculated and then its distance. Regarding the object direction, as shown in Figure 5, the view from the visual sensor is limited by its angle of view, and the captured scene appears on the image as a rectangle. Objects along one line of sight appear in a vertical line on the image. As illustrated, the direction of the detected target object (a human) can be calculated by the following equations, with the notations listed in Table 3:

\[ \beta = \alpha \cdot \frac{w_t}{W_R} = \alpha \cdot \frac{w_{LB} + w_{RB}}{2 W_R}, \qquad \theta_t = \theta_s - \frac{\alpha}{2} + \beta. \tag{2} \]
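Equation (2) interpolates the bearing from the target's pixel position between the two FoV edges. A minimal sketch (illustrative; names are ours, angles in radians):

```python
import math

def target_direction(theta_s, alpha, w_LB, w_RB, W_R):
    """Eq. (2): w_t = (w_LB + w_RB) / 2 pixels from the right image edge,
    beta = alpha * w_t / W_R, and theta_t = theta_s - alpha / 2 + beta."""
    w_t = (w_LB + w_RB) / 2.0
    beta = alpha * w_t / W_R
    return theta_s - alpha / 2.0 + beta
```

A quick consistency check: a target centered in the image (w_t = W_R / 2) gives beta = alpha / 2, so theta_t equals the sensing direction theta_s, as expected.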

Regarding the distance of the object, as shown in Figure 6, the image of the real object is projected in reverse onto the internal image sensor (usually a CCD or CMOS sensor) through the camera lens, and the electronic signals generated by the internal image sensor are processed by an image processor to form the digital image. Since the target object is assumed to have a given height, the distance between the target object and the visual sensor can be calculated by the following equations, with the notations listed in Table 4:

\[ \tan\lambda = \frac{h_t}{d_t} = \frac{h_s}{d_s}, \qquad d_t = \frac{h_t d_s}{h_s} = \frac{h_t d_s}{\frac{H_S}{H_R}(h_{BB} - h_{TB})} = \frac{h_t d_s H_R}{H_S (h_{BB} - h_{TB})}. \tag{3} \]

Table 4: Notations for the calculation of target object distance.

h_t: assumed (given) height of the target object.
d_t: distance between the visual sensor and the target object.
λ: included angle of the two lines of sight to the top and bottom of the target object.
H_S: height of the internal image sensor.
h_s: height of the captured target object on the internal image sensor.
d_s: distance between the internal image sensor and the camera lens.
H_R: vertical resolution (height in pixels) of the image.
h_BB: height (in pixels) between the bottom edge of the object and the top edge of the image.
h_TB: height (in pixels) between the top edge of the object and the top edge of the image.

If the height of the target object is not exactly equal to the given assumed value, with a difference of ε, a calculation error of distance will occur. As shown in Figure 6, if the actual height of the object is h'_t = h_t + ε while the image height of the captured object is the same, the system believes after the calculation that the object distance is d_t. However, the actual distance is d'_t, which can be calculated by the following equations. Equation (7) shows that there is an error ratio of ε/h_t between the calculated and actual object distances. For instance, if h_t is given as 170 cm and the calculated object distance d_t is 30 m, but the actual height h'_t of the object is 175 cm, then the actual distance deviates by (5 cm / 170 cm) × 30 m = 0.88 m. We believe that a distance error ratio of ε/h_t is tolerable.

Let

\[ k = \frac{d_s H_R}{H_S (h_{BB} - h_{TB})}. \tag{4} \]

Then

\[ d_t = h_t k, \tag{5} \]

\[ d'_t = (h_t + \varepsilon) k = h_t k + \varepsilon k = d_t + \varepsilon \frac{d_t}{h_t}, \tag{6} \]

\[ d'_t = d_t \left( 1 + \frac{\varepsilon}{h_t} \right). \tag{7} \]

As shown in Figure 7, (x_s, y_s) and (x_t, y_t) denote the coordinates of the visual sensor and the target object, respectively. Once both the direction and the distance of the target object are calculated, the coordinates of the object can be obtained as follows; they will be used in the control of sensor handover:

\[ x_t = x_s + d_t \cos\theta_t, \qquad y_t = y_s + d_t \sin\theta_t. \tag{8} \]
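The distance and coordinate steps of (3)-(8) can be sketched as follows (illustrative only; function names and the test values are ours, units consistent but arbitrary):

```python
import math

def target_distance(h_t, d_s, H_S, H_R, h_BB, h_TB):
    """Eqs. (3)-(5): d_t = h_t * k with k = d_s * H_R / (H_S * (h_BB - h_TB))."""
    k = d_s * H_R / (H_S * (h_BB - h_TB))
    return h_t * k

def target_coordinates(x_s, y_s, d_t, theta_t):
    """Eq. (8): project the distance d_t along the direction theta_t."""
    return x_s + d_t * math.cos(theta_t), y_s + d_t * math.sin(theta_t)
```

Note that, per (7), feeding an actual height h_t + ε instead of h_t scales the computed distance by exactly 1 + ε/h_t, which is the error ratio discussed above.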

4.4. Sensor Handover. After the visual sensor obtains the coordinates of the target object, it adjusts its sensing direction toward the target and then detects object movement by the dynamic background subtraction described in Section 4.2. Once a movement of the target object is detected, the new


Figure 5: Calculation of object (target) direction.

Figure 6: Calculation of object (target) distance.

position of the object is calculated again by the localization scheme described in Section 4.3. This routine constitutes local target tracking by the visual sensor until the tracking task is handed over.

To determine whether the visual sensor should hand over the tracking task, the sensor first utilizes the structure of its constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with sensor s is divided into several triangular subregions according to the vertices of the cell; the target object t is located in the subregion Δ v_Ri s v_Li, with an included angle ∠v_Ri s v_Li = φ.

A target object t detected by a sensor s belongs to one of the subregions if and only if all of the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

\[ d_t \le r. \tag{9} \]

(2) The distance between the object and the sensor is less than or equal to the distance between the object and the neighbor sensor on the opposite side of the bisector line (edge) facing the included angle of the subregion:

\[ \sqrt{(x_t - x_s)^2 + (y_t - y_s)^2} \le \sqrt{(x_t - x_{s_i})^2 + (y_t - y_{s_i})^2}. \tag{10} \]

(3) The object is located inside the two edges of the included angle of the subregion:

\[ (x_t - x_s)(x_{L_i} - x_s) + (y_t - y_s)(y_{L_i} - y_s) \ge \sqrt{\frac{(x_t - x_s)^2 + (y_t - y_s)^2}{(x_{R_i} - x_s)^2 + (y_{R_i} - y_s)^2}} \times \bigl( (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \bigr), \tag{11} \]

\[ (x_t - x_s)(x_{R_i} - x_s) + (y_t - y_s)(y_{R_i} - y_s) \ge \sqrt{\frac{(x_t - x_s)^2 + (y_t - y_s)^2}{(x_{L_i} - x_s)^2 + (y_{L_i} - y_s)^2}} \cdot \bigl( (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \bigr). \tag{12} \]
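Conditions (9)-(12) can be checked with a few dot products. A minimal sketch (illustrative; the function name and argument order are ours), where s_i is the neighbor across the cell edge v_L-v_R:

```python
import math

def in_subregion(s, s_i, t, vL, vR, r):
    """Test conditions (9)-(12): target t is within sensing radius r of
    sensor s, closer to s than to neighbor s_i, and inside the included
    angle at s spanned by Voronoi vertices vL and vR."""
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1]
    norm = lambda a: math.hypot(a[0], a[1])
    st = (t[0] - s[0], t[1] - s[1])
    svL = (vL[0] - s[0], vL[1] - s[1])
    svR = (vR[0] - s[0], vR[1] - s[1])
    if norm(st) > r:                              # condition (9)
        return False
    if math.dist(t, s) > math.dist(t, s_i):       # condition (10)
        return False
    c = dot(svL, svR)                             # ||svL|| ||svR|| cos(phi)
    if dot(st, svL) < norm(st) / norm(svR) * c:   # condition (11)
        return False
    if dot(st, svR) < norm(st) / norm(svL) * c:   # condition (12)
        return False
    return True
```

For example, with s at the origin, vertices vL = (0, 3) and vR = (3, 0) (a right included angle), and s_i mirrored across the edge at (3, 3), a target at (1, 1) passes all four tests, while one at (-1, 1) fails the right-edge test.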

Equation (11) states that the target object is not located outside the left edge s v_Li, and (12) states that it is not located outside the right edge s v_Ri. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra:

\[ \overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}} = \bigl\| \overrightarrow{s v_{L_i}} \bigr\| \, \bigl\| \overrightarrow{s v_{R_i}} \bigr\| \cos\phi, \tag{13} \]

\[ \cos\phi = \frac{\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}}}{\bigl\| \overrightarrow{s v_{L_i}} \bigr\| \, \bigl\| \overrightarrow{s v_{R_i}} \bigr\|}. \tag{14} \]

The angle ∠v_Li s t must be less than or equal to φ; therefore

\[ \overrightarrow{st} \cdot \overrightarrow{s v_{L_i}} \ge \bigl\| \overrightarrow{st} \bigr\| \, \bigl\| \overrightarrow{s v_{L_i}} \bigr\| \cos\phi = \bigl\| \overrightarrow{st} \bigr\| \, \bigl\| \overrightarrow{s v_{L_i}} \bigr\| \cdot \frac{\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}}}{\bigl\| \overrightarrow{s v_{L_i}} \bigr\| \, \bigl\| \overrightarrow{s v_{R_i}} \bigr\|}, \]
\[ \langle x_t - x_s, y_t - y_s \rangle \cdot \langle x_{L_i} - x_s, y_{L_i} - y_s \rangle \ge \frac{\bigl\| \langle x_t - x_s, y_t - y_s \rangle \bigr\|}{\bigl\| \langle x_{R_i} - x_s, y_{R_i} - y_s \rangle \bigr\|} \cdot \bigl( \langle x_{L_i} - x_s, y_{L_i} - y_s \rangle \cdot \langle x_{R_i} - x_s, y_{R_i} - y_s \rangle \bigr). \tag{15} \]

Then (11) is derived. Similarly, the angle ∠v_Ri s t must be less than or equal to φ; thus \( \overrightarrow{st} \cdot \overrightarrow{s v_{R_i}} \ge \| \overrightarrow{st} \| \, \| \overrightarrow{s v_{R_i}} \| \cos\phi \), and (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, four conditions can arise while the sensor keeps tracking the moving target object.

(1) As shown in Figure 9(a), the target object can still be detected and is still located in the same triangular subregion; that is, (9), (10), (11), and (12) are still satisfied under the same pair of vertices v_Li and v_Ri.


Figure 9: Different target tracking conditions: (a) condition 1, (b) condition 2, (c) condition 3, and (d) condition 4.

For this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor calculates a new pair of vertices v_Lj and v_Rj for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line v_Li v_Ri between the two sensors s and s_i; that is, (10) is no longer satisfied, and the object has moved from the original Voronoi cell into the Voronoi cell belonging to sensor s_i. This means that s_i will obtain a better sensing (image) quality than s and is more suitable to track the target object. For this condition, the sensor s sends a request message with the object coordinates to s_i and hands the tracking task over to s_i. After receiving the request message, the sensor s_i adjusts its sensing direction toward the target object and replies with a message to s confirming that it has taken over the tracking task.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of the sensor s; that is, (9) is no longer satisfied. This means that the sensor lost track of

Figure 10: Procedures of the proposed VSHP.

the target object. For this condition, the sensor sends a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message adjust their sensing directions toward the received coordinates and wait (watch) for the possible appearance of the target object. For example, although the target object has been lost, it may move around and appear near one of the vertices of the triangular subregion (e.g., v_Ri). Accordingly, the sensor s sends a request message to both s_i and s_j to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of the destination sensors for the request message, where (x_tf, y_tf) are the final detected coordinates of the target and V_sk is the set of local Voronoi vertices of sensor s_k:

\[ V' = \Bigl\{ v \;\Big|\; v \in \{ v_{L_i}, v_{R_i} \},\ \sqrt{(x_v - x_s)^2 + (y_v - y_s)^2} > r \Bigr\}, \]
\[ V'' = \begin{cases} \arg\min_{v \in V'} \sqrt{(x_v - x_{t_f})^2 + (y_v - y_{t_f})^2}, & \text{if } \dfrac{\sqrt{(x_{s_i} - x_s)^2 + (y_{s_i} - y_s)^2}}{2} > r, \\[2ex] V', & \text{otherwise}, \end{cases} \]
\[ S_{\text{req}} = \{ s_k \mid V'' \cap V_{s_k} \ne \emptyset \} - \{ s \}. \tag{16} \]
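The vertex-selection part of (16) can be sketched as follows (a minimal illustration; the function name, argument order, and test geometry are ours). It keeps the subregion vertices lying beyond the sensor's own radius and, when the facing neighbor is far (half the inter-sensor distance exceeds r), narrows the set to the single vertex nearest the last detected position:

```python
import math

def select_notify_vertices(s, s_i, t_f, vL, vR, r):
    """Eq. (16), vertex part: V' keeps vertices of {vL, vR} farther than r
    from sensor s; V'' is the single vertex of V' closest to the final
    detected position t_f when dist(s, s_i)/2 > r, else all of V'."""
    V1 = [v for v in (vL, vR) if math.dist(v, s) > r]
    if math.dist(s_i, s) / 2 > r and V1:
        return [min(V1, key=lambda v: math.dist(v, t_f))]
    return V1
```

The request set S_req then consists of every other sensor whose local Voronoi vertex set contains one of the returned vertices.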

The probability that a target can be detected by a visual sensor equals the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of losing the target is \( e^{-\lambda \pi r^2} \), where λ is the density of visual sensors and r is the sensing radius. This probability equals 1 − R_omni, where R_omni is the omnidirectional coverage ratio (in the shape of a circle) of the sensors. The reason is that one visual sensor is notified to rotate its sensing direction and take over the target when the target is about to move away from its current tracker; this fails only if no other sensor can cover the target even after rotating its sensing direction.
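This missing probability can be sanity-checked numerically. The sketch below (illustrative, not from the paper) estimates the chance that no sensor of a random deployment lies within r of a target point sampled away from the border; for n sensors uniform on an L × L field this is exactly (1 − πr² / L²)ⁿ, which approaches e^{-λπr²} for density λ = n / L² when n is large:

```python
import math
import random

def estimate_miss_probability(n, field, r, trials=4000, seed=7):
    """Monte Carlo estimate of the probability that no sensor lies within
    distance r of a target point (target sampled away from the border to
    avoid edge effects)."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        sensors = [(rng.uniform(0, field), rng.uniform(0, field))
                   for _ in range(n)]
        tx = rng.uniform(r, field - r)
        ty = rng.uniform(r, field - r)
        if all(math.hypot(sx - tx, sy - ty) > r for sx, sy in sensors):
            misses += 1
    return misses / trials
```

With the paper's default parameters (n = 100, an 800 m field, r = 70 m), both the binomial expression and e^{-λπr²} come out near 0.09.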

As described in Section 4.1, after the visual sensors are initially deployed, a sensing coverage enhancement algorithm is performed. Whenever a sensor keeps tracking a moving target object, or several sensors are notified to adjust their sensing directions to await the possible appearance of a lost target, their sensing directions change; the overlapped coverage can increase and the overall field coverage can thus decrease. This has a negative effect on further object detection and tracking. Therefore, in the proposed protocol, sensors that have changed their sensing direction return to their original directions if they have lost the target or have waited for its appearance longer than a time t_ret.

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume sensor energy. This can cause sensor energy exhaustion and malfunction in wireless VSN applications. A recent research topic and current trend concerning the energy issue in sensor networks is energy-harvesting sensors [31-33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee future sensor networks consisting of sensor devices that integrate energy-harvesting technology. This paper aims at target localization and tracking in VSNs; therefore, the energy issue is not addressed. However, the proposed algorithm can still perform well while some of the sensors are malfunctioning, because the Voronoi-based design lets the visual sensors easily reconstruct their local Voronoi cells and keep


Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m).

the VSN tracking operations performing well until most (or all) of the sensors have failed. The protocol thus exhibits fault tolerance and graceful degradation.

On the other hand, the tracking algorithm described above focuses on tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and each can be tracked by one sensor. When multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain within the FoV of the sensor associated with that Voronoi cell, or (2) each individual target lies within the sensing radius of some other visual sensor that can be notified to take it over.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long a sensor takes to detect the target after it has moved into the surveillance region. The target-tracked ratio indicates the total duration for which the target is tracked during its movement across the surveillance region. The average target distance reflects the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the region with a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared by the above-mentioned QoS criteria. Table 5 lists the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under the OVSN

Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m).

Figure 13: Average target distance (fixed α = 105°, r = 70 m).

and the proposed VSHP approach, respectively. The ratio of target-detected latency r_L is defined in (17), where t_D is how long a sensor takes to detect the target after it has moved into the surveillance region and t_SR is the total time the target spends traveling in the surveillance region. In Figure 11, the simulation results show that the target-detected latency of VSHP is lower than that of OVSN regardless of the number of deployed sensors (n = 50-100); VSHP reduces the time to detect the target by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP: the simulation results show a coverage improvement of about 5.5% with VSHP.

\[ r_L = \frac{t_D}{t_{SR}}. \tag{17} \]

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a high target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN. This is due to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by

12 International Journal of Distributed Sensor Networks

Table 5: Parameter settings of the simulations.

Notation | Description                               | Value                      | Default
F        | Size of the surveillance region           | 800 m × 800 m              | —
n        | Number of the visual sensors              | 50–100 (interval 10)       | 100
α        | Angle of view (AoV) of the visual sensors | 60°–120° (interval 15°)    | 105°
r        | Sensing radius of the visual sensors      | 50 m–90 m (interval 10 m)  | 70 m

Figure 14: Coverage and target-detected latency (fixed n = 100).

Figure 15: Target-tracked ratio (fixed n = 100).

the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP has a shorter average target distance than OVSN does. This is due to the fact that the proposed handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors increases from 50 to 100. This indicates that VSHP can select more appropriate sensors for target tracking from a larger number of deployed sensors.

Figure 16: Average target distance (fixed n = 100).

The results given above use various values of n but fixed values of α and r. The simulation results for the cases with fixed n but various α and r are given as follows. Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtained higher coverage ratios and lower target-detected latency in comparison with OVSN; the target-tracked ratio of VSHP is higher than that of OVSN; and VSHP obtained a shorter average target distance than OVSN did. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach, and four QoS criteria for target tracking were examined. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors. An experimental evaluation of the effectiveness and performance of the practical system will be made. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.

[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.


Table 1: Summary of comparison with related works.

[16] Characteristic: uses a hybrid sensor network composed of ordinary and camera sensors for target tracking.
  Related work: (i) targets are equipped with a radio transceiver for RSSI calculation and location estimation; (ii) sensors are regularly deployed and arranged in an array; (iii) a centralized approach.
  Proposed scheme in this paper: (i) no additional component needs to be equipped on the targets; (ii) sensors are randomly deployed; (iii) a distributed approach.

[21] Characteristic: focuses on cooperative multiobject tracking among multiple camera sensors.
  Related work: (i) cooperative tracking by relevant multiple sensor nodes; the VSN is established with a highly correlated sensing mode; (ii) assumes that the relative positions and sensing parameters of all sensors are known by each sensor; (iii) multitarget tracking.
  Proposed scheme in this paper: (i) tracks the target by a single camera sensor based on the Voronoi cell structure; (ii) no need for global information; (iii) each sensor aims to track one target, but the algorithm can be extended to track multiple targets with one sensor.

[22] Characteristic: detects and tracks humans in a realistic home environment by exploiting both color and depth camera sensors.
  Related work: (i) designed for an indoor smart home environment; (ii) a 2-camera system that fuses images from two channels to achieve a high accuracy rate; (iii) a centralized system.
  Proposed scheme in this paper: (i) designed for large-scale VSNs; (ii) tracks the target by a single camera sensor based on the Voronoi cell structure; (iii) a distributed algorithm.

[23] Characteristic: utilizes GPU coprocessors to compare image frames and obtain human position and velocity for target tracking.
  Related work: (i) needs a Graphics Processing Unit (GPU); (ii) projects several hundred grid-based ellipsoids on each image frame to compare with the image.
  Proposed scheme in this paper: (i) uses a general camera sensor without an additional GPU; (ii) uses Voronoi cells, not a grid-based structure, for target tracking.

[24] Characteristic: presents a cooperative multicamera target tracking method based on a node selection scheme.
  Related work: (i) the tracking handoff scheme needs the communication of cooperation data among multiple sensors to select the optimal tracker; (ii) whenever the target is detected by a new camera sensor or the target moves, a new optimal camera sensor is selected.
  Proposed scheme in this paper: (i) the Voronoi-based handover protocol does not need the cooperation data of multiple sensors; (ii) no sensor selection procedure.

[25] Characteristic: defines an observation correlation coefficient (OCC) used to determine the set of cameras to activate to improve the accuracy of target tracking.
  Related work: (i) at least two cameras are needed to determine the target location; (ii) tracking under the cooperation of multiple sensors; (iii) more camera sensors need to be deployed with coverage overlaps to perform the algorithm.
  Proposed scheme in this paper: (i) tracks the target by a single camera sensor based on the Voronoi cell structure; (ii) performs without highly overlapped sensor coverage; on the contrary, overlap is reduced to increase the overall coverage ratio.

protocol for target tracking in a VSN. A Voronoi diagram, as shown in Figure 1, has the following properties:

(1) It divides an area into convex polygons, called Voronoi cells, according to a given set of points.

(2) Each of the given points lies in exactly one Voronoi cell.

(3) The common/shared edge of two adjacent Voronoi cells is the bisector line between the two points in the two cells.

(4) For any position q which lies in the cell associated with point p, the distance between q and p must be shorter than the distance between q and the associated point of any other cell.

(5) For an area with a given set of points, the Voronoi diagram is unique.
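Properties (3) and (4) are the ones the protocol later relies on, and both can be checked numerically. The sketch below is illustrative only; the sites, the `cell_owner` helper, and the test points are hypothetical.

```python
import math

def cell_owner(q, sites):
    """Property (4): a position q lies in the Voronoi cell of the site
    nearest to it, so cell membership is a nearest-neighbor query."""
    return min(sites, key=lambda p: math.dist(q, p))

sites = [(1.0, 1.0), (4.0, 1.0), (2.5, 4.0)]

# Property (3): a point on the bisector of two sites is equidistant to both.
mid = (2.5, 1.0)   # midpoint of the first two sites, on their bisector
d0, d1 = math.dist(mid, sites[0]), math.dist(mid, sites[1])
```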

A well-known algorithm to construct a Voronoi diagram is Fortune's algorithm [26], which is a centralized algorithm. It can generate the Voronoi diagram from the global information of the positions of a set of points in a plane. In a VSN, the deployed visual sensors can be treated as the set of points for the construction of the Voronoi diagram of the surveillance region. For any object which appears in a Voronoi cell, the sensor in the same cell can obtain the optimal sensing quality (the clarity of the picture) of the object and has the strongest reaction capacity to the object. Therefore, the concept of the Voronoi diagram is used in the proposed protocol. However, in this study, the Voronoi diagram is constructed distributedly


Figure 1: A Voronoi diagram.

and locally by the deployed sensors, without global information. It is not constructed by using a centralized algorithm, which needs the global information. Each sensor constructs its own Voronoi cell with local information. The local Voronoi cells constructed by all the deployed sensors form a complete Voronoi diagram. The details of the construction of a Voronoi cell in this study are described in Section 4.1.

3.2. Assumptions. The basic assumptions in this study are described as follows.

(1) The visual sensors are homogeneous and randomly deployed in the surveillance region at the initial phase. They are stationary but direction rotatable.

(2) Each visual sensor can obtain its coordinates by a localization technology and has a sufficient communication range, or it can use a multihop transmission method to transmit information to its neighbor sensors.

(3) The objects to be tracked have no positioning component; their coordinates are originally unknown.

(4) The type of target object in the tracking system is determinate, or it can be simply recognized by the shape of the detected object. Accordingly, the height of the target object can be given approximately.

3.3. Visual Sensing Model. The visual sensors used in this study are directional and, as mentioned in the assumptions of Section 3.2, direction rotatable. The effective sensing area (field of view) of a visual sensor is sector shaped. The sensing model for the visual sensors in the proposed protocol is shown in Figure 2, and the related notations are listed in Table 2.

Figure 2: Sensing model for visual sensors.

Table 2: Notations for the sensing model.

(x_s, y_s): coordinates of the visual sensor
α: angle of view (AoV) of the visual sensor
r: effective sensing radius of the visual sensor
θ_s: sensing direction of the visual sensor (direction angle relative to the positive x-axis)

4. Voronoi-Based Sensor Handover Protocol (VSHP)

4.1. Local Voronoi Cell Construction and Sensing Coverage Enhancement. In this study, the concept of the Voronoi diagram is utilized to divide the surveillance region into numerous convex polygons, namely, Voronoi cells. The sensors deployed in the surveillance region can be treated as the points of a Voronoi diagram, and each sensor belongs to one and only one cell in the diagram. As mentioned in Section 3.1, a Voronoi diagram can be generated from a given set of points in a plane by the well-known Fortune's algorithm, which is a centralized algorithm. However, since this study aims at a distributed protocol, the sensors only construct their own local Voronoi cells distributedly, without global information about the sensor network.

After the random deployment of sensors, each sensor broadcasts its coordinates to, and receives coordinates from, its neighbors. Figure 3 illustrates the construction of a local Voronoi cell with only the local information of neighbor positions. The sensor s_t0 receives coordinates from s_t1, s_t2, ... in sequence and constructs the corresponding bisector line segments one at a time. Finally, s_t0 obtains the structure of its local Voronoi cell, enclosed by these bisector line segments.

Because the visual sensors are randomly deployed, their positions and sensing directions are also random at the initial phase. A coverage enhancement is needed to reduce the overlaps of the sensing coverage of these visual sensors; then the overall coverage ratio of the surveillance region can be improved for the task of target tracking. The algorithm in our previous work [27] is utilized for coverage enhancement in this study.
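A local Voronoi cell of this kind can be built by starting from the whole surveillance region and clipping it by one perpendicular-bisector half-plane per received neighbor coordinate. The following sketch illustrates the idea under simplifying assumptions (2D tuples, a rectangular field, no communication layer); the function names are hypothetical and this is not the authors' implementation.

```python
def clip_halfplane(poly, s, n):
    """Keep the part of polygon `poly` closer to site s than to neighbor n,
    i.e., clip by the perpendicular bisector of segment s-n (Sutherland-Hodgman)."""
    ax, ay = n[0] - s[0], n[1] - s[1]
    c = (ax * (s[0] + n[0]) + ay * (s[1] + n[1])) / 2.0
    inside = lambda p: ax * p[0] + ay * p[1] <= c + 1e-12
    out = []
    for i, p in enumerate(poly):
        q = poly[(i + 1) % len(poly)]
        fp, fq = inside(p), inside(q)
        if fp:
            out.append(p)
        if fp != fq:  # the edge crosses the bisector: add the intersection point
            t = (c - ax * p[0] - ay * p[1]) / (ax * (q[0] - p[0]) + ay * (q[1] - p[1]))
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return out

def local_voronoi_cell(s, neighbors, field):
    """Clip the surveillance region by one bisector per neighbor,
    one at a time, as each neighbor's broadcast coordinates arrive."""
    cell = field
    for n in neighbors:
        cell = clip_halfplane(cell, s, n)
    return cell

field = [(0.0, 0.0), (800.0, 0.0), (800.0, 800.0), (0.0, 800.0)]
cell = local_voronoi_cell((400.0, 400.0), [(600.0, 400.0), (400.0, 600.0)], field)
```

With the two neighbors above, the bisectors are the lines x = 500 and y = 500, so the resulting cell is the 500 m × 500 m corner square.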


Figure 3: Construction of a local Voronoi cell.

Figure 4: Dynamic background subtraction for object and movement detection.

4.2. Object Detection. In a tracking system, object detection is necessary prior to object tracking. For tracking in a VSN, the detection can be done by object segmentation from the camera scenes. Background subtraction [28–30] is an important approach for object segmentation in visual surveillance systems, which compares two images to acquire their difference. A simple illustration of background subtraction is shown in Figure 4. A foreground intruder object can be extracted by comparing the current image frame f(x, y, t_{i+1}) with the previous reference background image f(x, y, t_i). Furthermore, a dynamic background subtraction approach keeps the background image updated for the next comparison with a newer incoming image to detect object movement. Dynamic background subtraction can be defined by the following equation, where f_diff(x, y, t_{i+1}) is the frame of difference between the background frame and the subsequent frame:

f_diff(x, y, t_{i+1}) = |f(x, y, t_i) − f(x, y, t_{i+1})|.  (1)
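Equation (1) together with the background-update step can be sketched in a few lines. The sketch below uses plain nested lists of grayscale values and a hypothetical threshold parameter; it is an illustration of dynamic background subtraction, not the paper's implementation.

```python
def frame_diff(background, frame, threshold=30):
    """Eq. (1): per-pixel absolute difference; pixels whose difference
    exceeds `threshold` are marked as foreground."""
    return [[abs(b - f) > threshold for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

def track_step(background, frame, threshold=30):
    """One dynamic-subtraction step: compute the difference mask, then
    adopt the new frame as the updated background for the next comparison."""
    mask = frame_diff(background, frame, threshold)
    return mask, frame   # the frame becomes the updated background

bg = [[10, 10, 10], [10, 10, 10]]
img = [[10, 200, 10], [10, 10, 10]]   # one bright "intruder" pixel
mask, bg = track_step(bg, img)
```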


Table 3: Notations for calculation of target object direction.

θ_t: angle of the direction from the sensor to the target, relative to the positive x-axis
β: included angle between the line of sight from the sensor to the target and the right edge of the field of view (FoV)
W_R: horizontal resolution (width in pixels) of the image
w_t: width (in pixels) between the target object center and the right edge of the image
w_LB: width (in pixels) between the left edge of the detected target object and the right edge of the image
w_RB: width (in pixels) between the right edge of the detected target object and the right edge of the image

4.3. Target Localization. Once an incoming object is detected by a visual sensor, the sensor estimates the actual position of the object on the ground plane. To estimate the object position, first the direction of the object is calculated, and second its distance. Regarding the object direction, as shown in Figure 5, the view of the visual sensor is limited by its angle of view, and the captured scene appears on the image as a rectangle. Things in one line of sight appear in one vertical line on the image. As illustrated, the direction of the detected target object (a human) can be calculated by the following equations, with the notations listed in Table 3:

β = α · (w_t / W_R) = α · ((w_LB + w_RB) / (2 W_R)),
θ_t = θ_s − α/2 + β.  (2)
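A minimal sketch of (2), assuming angles in degrees and the pixel widths defined in Table 3 (the function name and example values are hypothetical):

```python
def target_direction(theta_s_deg, alpha_deg, w_LB, w_RB, W_R):
    """Eq. (2): beta = alpha * (w_LB + w_RB) / (2 * W_R) is the target's
    angular offset from the right FoV edge; theta_t = theta_s - alpha/2 + beta."""
    beta = alpha_deg * (w_LB + w_RB) / (2.0 * W_R)
    return theta_s_deg - alpha_deg / 2.0 + beta

# A target centered in a 640-pixel-wide image lies exactly on the sensing direction:
theta_t = target_direction(theta_s_deg=90.0, alpha_deg=105.0, w_LB=320, w_RB=320, W_R=640)
```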

Regarding the distance of the object, as shown in Figure 6, the image of the real object is projected reversely onto the internal image sensor (usually a CCD or CMOS sensor) through the camera lens, and then the electronic signals generated by the internal image sensor are processed with an image processor to form the digital image. Since the target object is assumed to have a height of a given value, the distance between the target object and the visual sensor can be calculated by the following equations, with the notations listed in Table 4:

tan λ = h_t / d_t = h_s / d_s,

d_t = (h_t d_s) / h_s = (h_t d_s) / (H_S (h_BB − h_TB) / H_R) = (h_t d_s H_R) / (H_S (h_BB − h_TB)).  (3)

If the height of the target object is not exactly equal to the given assumed value and there is a difference of ε, a calculation error of distance will occur. As shown in Figure 6, if the actual height of the object is h′_t = h_t + ε and the image

Table 4: Notations for calculation of target object distance.

h_t: assumed (given) height of the target object
d_t: distance between the visual sensor and the target object
λ: included angle of the two lines of sight to the top and bottom of the target object
H_S: height of the internal image sensor
h_s: height of the captured target object on the internal image sensor
d_s: distance between the internal image sensor and the camera lens
H_R: vertical resolution (height in pixels) of the image
h_BB: height (in pixels) between the bottom edge of the object and the top edge of the image
h_TB: height (in pixels) between the top edge of the object and the top edge of the image

height of the captured object is the same, the system believes that the object distance is d_t after the calculation. However, the actual distance should be d′_t, which can be calculated by the following equations. Equation (7) shows that there will be an error ratio of ε/h_t between the calculated and actual object distances. For instance, if h_t is given as 170 cm and the calculated object distance d_t is 30 m, but the actual height h′_t of the object is 175 cm, then the actual distance has an error of (5 cm / 170 cm) × 30 m = 0.88 m. We believe that the distance error ratio of ε/h_t is tolerable.

Let

k = (d_s H_R) / (H_S (h_BB − h_TB)),  (4)

d_t = h_t k,  (5)

d′_t = (h_t + ε) k = h_t k + εk = d_t + ε (d_t / h_t),  (6)

d′_t = d_t (1 + ε / h_t).  (7)
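Equations (3)–(7) reduce to two small formulas once the pixel height h_BB − h_TB is known. The sketch below is illustrative; the argument values are hypothetical except for the 170 cm / 5 cm / 30 m example, which follows the text.

```python
def target_distance(h_t, d_s, H_S, H_R, h_BB, h_TB):
    """Eqs. (3)-(5): the object's pixel height (h_BB - h_TB) is scaled to a
    physical height on the image sensor; similar triangles then give d_t."""
    k = (d_s * H_R) / (H_S * (h_BB - h_TB))   # eq. (4)
    return h_t * k                            # eq. (5)

def distance_error_ratio(h_t, eps):
    """Eq. (7): a height mis-estimate of eps yields a relative
    distance error of eps / h_t."""
    return eps / h_t

ratio = distance_error_ratio(h_t=170.0, eps=5.0)  # the paper's 5 cm on 170 cm example
error_m = ratio * 30.0                            # error at a calculated distance of 30 m
```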

As shown in Figure 7, (x_s, y_s) and (x_t, y_t) indicate the coordinates of the visual sensor and the target object, respectively. Once both the direction and the distance of the target object are calculated, the coordinates of the object can be obtained as follows; these will be used in the control of sensor handover:

x_t = x_s + d_t cos θ_t,
y_t = y_s + d_t sin θ_t.  (8)
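Equation (8) in code, assuming θ_t in degrees (the function name is hypothetical):

```python
import math

def target_position(x_s, y_s, d_t, theta_t_deg):
    """Eq. (8): project the estimated distance along the estimated direction."""
    th = math.radians(theta_t_deg)
    return x_s + d_t * math.cos(th), y_s + d_t * math.sin(th)

x_t, y_t = target_position(100.0, 200.0, 30.0, 90.0)  # 30 m due "north" of the sensor
```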

4.4. Sensor Handover. After the visual sensor obtains the coordinates of the target object, it adjusts its sensing direction toward the target and then detects object movement by the dynamic background subtraction mentioned in Section 4.2. Once a movement of the target object is detected, the new


Figure 5: Calculation of object (target) direction.

Figure 6: Calculation of object (target) distance.

position of the object will be calculated again by the localization scheme described in Section 4.3. This is a routine procedure for local target tracking by the visual sensor until the tracking task is handed over.

To determine whether the visual sensor should hand over the task of tracking the target object, the sensor first utilizes the structure of the constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with the sensor s is divided into several triangular subregions according to the vertices of the cell, and the target object t is located in the subregion Δv_Ri s v_Li with an included angle ∠v_Ri s v_Li = φ.

A target object t detected by a sensor s belongs to one of the subregions if and only if all the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

d_t ≤ r.  (9)

(2) The distance between the object and the sensor is less than or equal to the distance between the object and the neighbor sensor s_i on the opposite side of the bisector line (edge) facing the included angle of the subregion:

√((x_t − x_s)² + (y_t − y_s)²) ≤ √((x_t − x_si)² + (y_t − y_si)²).  (10)

(3) The object is located inside the two edges of the included angle of the subregion:

(x_t − x_s)(x_Li − x_s) + (y_t − y_s)(y_Li − y_s)
  ≥ √(((x_t − x_s)² + (y_t − y_s)²) / ((x_Ri − x_s)² + (y_Ri − y_s)²))
  × ((x_Li − x_s)(x_Ri − x_s) + (y_Li − y_s)(y_Ri − y_s)),  (11)

(x_t − x_s)(x_Ri − x_s) + (y_t − y_s)(y_Ri − y_s)
  ≥ √(((x_t − x_s)² + (y_t − y_s)²) / ((x_Li − x_s)² + (y_Li − y_s)²))
  · ((x_Li − x_s)(x_Ri − x_s) + (y_Li − y_s)(y_Ri − y_s)).  (12)
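Conditions (9)–(12) together form a pure membership test that a sensor can evaluate locally. The following sketch implements them using the equivalent cos φ form of (13)–(15); points are 2-tuples and all names are hypothetical.

```python
import math

def in_subregion(t, s, s_i, v_L, v_R, r):
    """Conditions (9)-(12): target t lies in sensor s's triangular subregion
    with cell vertices v_R, v_L, facing the neighbor sensor s_i."""
    d_t = math.dist(t, s)
    if d_t > r:                          # condition (9): within sensing radius
        return False
    if d_t > math.dist(t, s_i):          # condition (10): closer to s than to s_i
        return False
    st = (t[0] - s[0], t[1] - s[1])
    sL = (v_L[0] - s[0], v_L[1] - s[1])
    sR = (v_R[0] - s[0], v_R[1] - s[1])
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1]
    norm = lambda a: math.hypot(a[0], a[1])
    cos_phi = dot(sL, sR) / (norm(sL) * norm(sR))   # eq. (14)
    # (11): t not outside the left edge; (12): t not outside the right edge
    return (dot(st, sL) >= norm(st) * norm(sL) * cos_phi and
            dot(st, sR) >= norm(st) * norm(sR) * cos_phi)
```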

Equation (11) represents that the target object is not located outside the left edge sv_Li, and (12) represents that the target object is not located outside the right edge sv_Ri. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra:

sv_Li · sv_Ri = ‖sv_Li‖ ‖sv_Ri‖ cos φ,  (13)

cos φ = (sv_Li · sv_Ri) / (‖sv_Li‖ ‖sv_Ri‖).  (14)

The angle ∠v_Li s t must be less than or equal to φ; therefore

st · sv_Li ≥ ‖st‖ ‖sv_Li‖ cos φ,

st · sv_Li ≥ ‖st‖ ‖sv_Li‖ · (sv_Li · sv_Ri) / (‖sv_Li‖ ‖sv_Ri‖),

⟨x_t − x_s, y_t − y_s⟩ · ⟨x_Li − x_s, y_Li − y_s⟩
  ≥ (‖⟨x_t − x_s, y_t − y_s⟩‖ / ‖⟨x_Ri − x_s, y_Ri − y_s⟩‖)
  · (⟨x_Li − x_s, y_Li − y_s⟩ · ⟨x_Ri − x_s, y_Ri − y_s⟩).  (15)

Then (11) is derived. Similarly, the angle $\angle v_{R_i} s t$ must be less than or equal to $\phi$; thus $\overrightarrow{st} \cdot \overrightarrow{s v_{R_i}} \ge \| \overrightarrow{st} \| \, \| \overrightarrow{s v_{R_i}} \| \cos\phi$, and (12) can also be derived.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, there are four conditions to handle while the sensor keeps tracking the moving target object.

(1) As shown in Figure 9(a), the target object can still be detected, and it is still located in the same triangular subregion; that is to say, (9), (10), (11), and (12) are still satisfied under the same pair of vertices $v_{L_i}$ and $v_{R_i}$.

International Journal of Distributed Sensor Networks 9

[Figure 9: Different target tracking conditions. (a) Condition 1; (b) Condition 2; (c) Condition 3; (d) Condition 4. Each panel shows the sensor $s$ at $(x_s, y_s)$ with sensing radius $r$ and included angle $\phi$, the target $t$, the subregion vertices $v_{L_i}(x_{L_i}, y_{L_i})$ and $v_{R_i}(x_{R_i}, y_{R_i})$, and a neighboring sensor $s_i$ at $(x_{s_i}, y_{s_i})$.]

For this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor will calculate and obtain a new pair of vertices $v_{L_j}$ and $v_{R_j}$ for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line $\overline{v_{L_i} v_{R_i}}$ between the two sensors $s$ and $s_i$; that is to say, (10) is no longer satisfied, and the object has moved from the original Voronoi cell to the Voronoi cell belonging to sensor $s_i$. This means that sensor $s_i$ will obtain a better sensing (image) quality than sensor $s$, so it is more suitable to track the target object. For this condition, sensor $s$ will send a request message with the object coordinates to $s_i$ and hand over the tracking task to $s_i$. After receiving the request message, sensor $s_i$ will adjust its sensing direction toward the target object and reply with a message to $s$ to confirm taking over the tracking task.

[Figure 10: Procedures of the proposed VSHP. Flow: random deployment; broadcast sensor coordinates; receive neighbor coordinates; construct local Voronoi cell; coverage enhancement; object detection; target localization; target movement detection; then, according to the tracking condition: (1) normal tracking, (2) change sensing direction, (3) send a handover request to the faced sensor (handover), (4) send handover requests to the selected sensors.]

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of sensor $s$; that is to say, (9) is no longer satisfied. This means that the sensor has lost track of the target object. For this condition, the sensor will send a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message will adjust their sensing directions toward the received coordinates and wait for (try to detect) a possible reappearance of the target object. For example, although the target object has been lost, it may move around and appear near one of the vertices of the triangular subregion (e.g., $v_{R_i}$). Accordingly, sensor $s$ will send a request message to both $s_i$ and $s_j$ to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for the request message, where $(x_{tf}, y_{tf})$ and $V_{s_k}$ are the final detected coordinates of the target and the set of local Voronoi vertices of sensor $s_k$, respectively.

$$v' = \left\{ v \mid v \in \{ v_{L_i}, v_{R_i} \},\ \sqrt{(x_v - x_s)^2 + (y_v - y_s)^2} > r \right\}$$

$$v'' = \begin{cases} \arg\min_{v \in v'} \sqrt{(x_v - x_{tf})^2 + (y_v - y_{tf})^2}, & \text{if } \dfrac{\sqrt{(x_{s_i} - x_s)^2 + (y_{s_i} - y_s)^2}}{2} > r \\ v', & \text{otherwise} \end{cases}$$

$$S_{req} = \left\{ s_k \mid v'' \in V_{s_k} \right\} - \{ s \} \quad (16)$$
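As a concrete reading of (16), the selection can be sketched in code. The function and variable names below are ours, and the cell table is a stand-in for whatever neighbor state a real sensor would keep; this is a sketch, not the paper's implementation.

```python
import math

def dist(p, q):
    """Euclidean distance between two points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def select_handover_targets(s, s_i, v_L, v_R, t_f, r, cells):
    """Destination sensors for the condition-4 request message, per Eq. (16).

    s, s_i : coordinates of the tracking sensor and of the facing sensor
    v_L, v_R : the subregion's pair of Voronoi vertices
    t_f    : final detected coordinates of the target
    r      : sensing radius
    cells  : dict mapping sensor id -> (coordinates, set of local Voronoi vertices)
    """
    # v': subregion vertices lying farther than r from the tracking sensor
    v_prime = [v for v in (v_L, v_R) if dist(v, s) > r]
    if not v_prime:
        return set()
    # v'': only the vertex nearest the target's last position when s and s_i
    # are far apart (half their distance exceeds r); otherwise all of v'
    if dist(s_i, s) / 2 > r:
        chosen = [min(v_prime, key=lambda v: dist(v, t_f))]
    else:
        chosen = v_prime
    # S_req: every other sensor whose local Voronoi cell contains a chosen vertex
    return {k for k, (coords, verts) in cells.items()
            if coords != s and any(v in verts for v in chosen)}
```

With two vertices and several neighbors, the sensors whose local cells contain the vertex nearest the target's last known position are the ones notified.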

The probability that a target can be detected by a visual sensor is equal to the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of missing the target is equal to $e^{-\lambda \pi r^2}$, where $\lambda$ is the density of visual sensors and $r$ is the sensing radius. This probability equals $1 - R_{omni}$, where $R_{omni}$ is the omnidirectional coverage ratio (with circular coverage) of the sensors. The reason is that one visual sensor will be notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; the handover fails only if no other sensor can cover the target even after rotating its sensing direction.
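As a quick numerical check of our own (not from the paper), take the simulation defaults of Table 5: $n = 100$ sensors over an 800 m × 800 m field, so $\lambda = 100/640000$, and $r = 70$ m. The miss probability $e^{-\lambda \pi r^2}$ then comes out to roughly 9%:

```python
import math

def miss_probability(sensor_density, sensing_radius):
    """Probability of missing a tracked target: e^(-lambda * pi * r^2)."""
    return math.exp(-sensor_density * math.pi * sensing_radius ** 2)

lam = 100 / (800 * 800)             # density for the default deployment
p_miss = miss_probability(lam, 70)  # approximately 0.09
```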

As described in Section 4.1, after the visual sensors are initially deployed, a sensing coverage enhancement algorithm is performed. Whenever a sensor keeps tracking a moving target object, or several sensors are notified to adjust their sensing directions to await the possible reappearance of a lost target, their sensing directions are changed; the overlapped coverage can thereby increase, and the overall field coverage can be reduced. This has a negative effect on further object detection and tracking. Therefore, in the proposed protocol, a sensor that has changed its sensing direction will return to its original direction if it has lost the target or has waited for the target's appearance longer than a period $t_{ret}$.

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to the issue of energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume the sensor's energy. This could cause sensor energy exhaustion and malfunction in wireless VSN applications. A recent research topic and current trend regarding the energy issue of sensor networks is energy-harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee that a future sensor network can consist of sensor devices integrating energy-harvesting technology. This paper aims at target localization and tracking in VSNs; therefore, the energy issue is not considered here. However, the proposed algorithm can still perform well while some of the sensors are malfunctioning, because the Voronoi-based design enables visual sensors to easily reconstruct their local Voronoi cells and can keep the VSN tracking operations performing well until most (or all) of the sensors have failed. It thus has the characteristics of fault tolerance and graceful degradation.

[Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m). x-axis: number of visual sensors (50–100); series: OVSN-C, VSHP-C, OVSN-L, VSHP-L.]

On the other hand, the tracking algorithm described above focuses on the case of tracking a single target with one visual sensor. It is also applicable to the case in which multiple targets occur in the surveillance region and each can be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain located in the FoV of the sensor associated with that Voronoi cell, or (2) each individual target is located within the sensing radius of some other visual sensor that can be notified to take over the target.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The first, overall coverage, affects the detection of target objects. The second, target-detected latency, indicates how long it takes a sensor to detect the target after the target has moved into the surveillance region. The third, target-tracked ratio, indicates the total duration for which the target is tracked during its movement across the surveillance region. The last, average target distance, represents the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the surveillance region with a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared using the above-mentioned QoS criteria. Table 5 shows the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under the OVSN and the proposed VSHP approach, respectively. The ratio of target-detected latency ($r_L$) is defined in (17), where $t_D$ is the time taken by a sensor to detect the target after the target has moved into the surveillance region, and $t_{SR}$ is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is lower than that of OVSN regardless of the number of deployed sensors (n = 50–100); VSHP reduces the time needed to detect the target by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP; the simulation result shows a coverage improvement of about 5.5% with VSHP.

[Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m). x-axis: number of visual sensors (50–100); series: OVSN, VSHP.]

[Figure 13: Average target distance (fixed α = 105°, r = 70 m). x-axis: number of visual sensors (50–100); series: OVSN, VSHP.]

$$r_L = \frac{t_D}{t_{SR}} \quad (17)$$
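The latency ratio of (17) and the target-tracked ratio are simple fractions of the target's travel time; the sketch below uses illustrative timestamps of our own, not values from the simulations.

```python
def latency_ratio(t_detect, t_total):
    """r_L = t_D / t_SR (Eq. (17)): fraction of the travel time elapsed
    before the target is first detected."""
    return t_detect / t_total

def tracked_ratio(intervals, t_total):
    """Fraction of the travel time during which the target is tracked,
    given a list of (start, end) tracked intervals."""
    return sum(end - start for start, end in intervals) / t_total

# Hypothetical run: target spends 120 s in the region, is first detected
# at t = 5 s, and is tracked during two intervals
r_L = latency_ratio(5, 120)                     # about 0.042
r_T = tracked_ratio([(5, 60), (70, 115)], 120)  # 100/120, about 0.83
```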

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a higher target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN. This is due to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP has a shorter average target distance than OVSN does. This is because the proposed handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors increases from 50 to 100. This indicates that VSHP can select more appropriate sensors for target tracking from a larger number of deployed sensors.

Table 5: Parameter settings of the simulations.

Notation | Description | Value | Default
F | Size of the surveillance region | 800 m × 800 m | n/a
n | Number of the visual sensors | 50–100 (interval 10) | 100
α | Angle of view (AoV) of the visual sensors | 60°–120° (interval 15°) | 105°
r | Sensing radius of the visual sensors | 50 m–90 m (interval 10 m) | 70 m

[Figure 14: Coverage and target-detected latency (fixed n = 100); grouped by sensing radius 50 m–90 m and AoV 60°–120°; series: OVSN-C, VSHP-C, OVSN-L, VSHP-L.]

[Figure 15: Target-tracked ratio (fixed n = 100); grouped by sensing radius 50 m–90 m and AoV 60°–120°; series: OVSN, VSHP.]

[Figure 16: Average target distance (fixed n = 100); grouped by sensing radius 50 m–90 m and AoV 60°–120°; series: OVSN, VSHP.]

The results given above are for various values of n but fixed values of α and r. The simulation results for the cases with fixed n but various α and r are given as follows. Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtained higher coverage ratios and lower target-detected latency in comparison with OVSN; the target-tracked ratio of VSHP is higher than that of OVSN; and VSHP obtained a shorter average target distance than OVSN did. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach, with four QoS criteria for target tracking. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors, and an experimental evaluation of the effectiveness and performance of that practical system will be made. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with cloud-based monitoring services will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and by the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.

[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.


[Figure 1: A Voronoi diagram.]

and localized by the deployed sensors without global information. It is not constructed by a centralized algorithm, which would need global information; instead, each sensor constructs its own Voronoi cell with local information. The local Voronoi cells constructed by all the deployed sensors form a complete Voronoi diagram. The details of the construction of a Voronoi cell in this study are described in Section 4.1.

3.2. Assumptions. The basic assumptions in this study are described as follows.

(1) The visual sensors are homogeneous and randomly deployed in the surveillance region at the initial phase. They are stationary but direction rotatable.

(2) Each visual sensor can obtain its coordinates through a localization technology, and it has sufficient communication range or can use a multihop transmission method to transmit information to its neighbor sensors.

(3) The objects to be tracked have no positioning component; their coordinates are originally unknown.

(4) The type of target object in the tracking system is determinate, or it can be simply recognized by the shape of the detected object. Accordingly, the height of the target object can be given approximately.

3.3. Visual Sensing Model. The visual sensors used in this study are directional and, as mentioned in the assumptions of Section 3.2, direction rotatable. The effective sensing area (field of view) of a visual sensor is sector shaped. The sensing model for the visual sensors in the proposed protocol is shown in Figure 2, and the related notations are listed in Table 2.

[Figure 2: Sensing model for visual sensors. The field of view is a sector with vertex at the sensor position $(x_s, y_s)$, angle of view $\alpha$, sensing radius $r$, and sensing direction $\theta_s$ measured from the positive $x$-axis.]

Table 2: Notations for the sensing model.

Notation | Description
$(x_s, y_s)$ | Coordinates of the visual sensor
$\alpha$ | Angle of view (AoV) of the visual sensor
$r$ | Effective sensing radius of the visual sensor
$\theta_s$ | Sensing direction of the visual sensor (direction angle relative to the positive $x$-axis)

4. Voronoi-Based Sensor Handover Protocol (VSHP)

4.1. Local Voronoi Cell Construction and Sensing Coverage Enhancement. In this study, the concept of the Voronoi diagram is utilized to divide the surveillance region into numerous convex polygons, namely Voronoi cells. The sensors deployed in the surveillance region can be treated as the sites of a Voronoi diagram, and each sensor belongs to one and only one cell in the diagram. As mentioned in Section 3.1, a Voronoi diagram can be generated by the well-known Fortune's algorithm, a centralized algorithm, from a given set of points in a plane. However, since this study aims at a distributed protocol, the sensors distributedly construct only their own local Voronoi cells, without global information about the sensor network.

After the random deployment of sensors, each sensor broadcasts its coordinates to, and receives coordinates from, its neighbors. Figure 3 illustrates the construction of a local Voronoi cell using only the local information of neighbor positions. The sensor $s_{t_0}$ receives coordinates from $s_{t_1}, s_{t_2}, \ldots$ in sequence and constructs the corresponding bisector line segments one at a time. Finally, $s_{t_0}$ obtains the structure of its local Voronoi cell, enclosed by these bisector line segments.

Because the visual sensors are randomly deployed, their positions and sensing directions are also random at the initial phase. A coverage enhancement is needed to reduce the overlaps in sensing coverage of these visual sensors; the overall coverage ratio of the surveillance region can then be improved for the target tracking task. The algorithm from our previous work [27] is utilized for coverage enhancement in this study.
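The per-sensor construction described above is, in effect, successive half-plane clipping by perpendicular bisectors: start from the surveillance region and clip by each neighbor's bisector in turn. A minimal Sutherland-Hodgman-style sketch (all names are ours; it assumes a convex region given counterclockwise):

```python
def clip_by_bisector(cell, s, n):
    """Clip polygon `cell` (list of (x, y) vertices) to the half-plane of
    points closer to sensor s than to neighbor n (the bisector half-plane)."""
    # Half-plane test: (p - m) . (s - n) >= 0, where m is the midpoint of s, n
    mx, my = (s[0] + n[0]) / 2, (s[1] + n[1]) / 2
    ax, ay = s[0] - n[0], s[1] - n[1]

    def side(p):
        return (p[0] - mx) * ax + (p[1] - my) * ay

    out = []
    for i, p in enumerate(cell):
        q = cell[(i + 1) % len(cell)]
        sp, sq = side(p), side(q)
        if sp >= 0:
            out.append(p)
        if sp * sq < 0:  # edge crosses the bisector: add the intersection point
            t = sp / (sp - sq)
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return out

def local_voronoi_cell(s, neighbors, region):
    """Local Voronoi cell of sensor s: the region clipped by every neighbor."""
    cell = list(region)
    for n in neighbors:
        cell = clip_by_bisector(cell, s, n)
    return cell
```

For instance, clipping an 800 × 800 region for a sensor at (200, 400) against a single neighbor at (600, 400) leaves the half of the square left of the bisector x = 400.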


[Figure 3: Construction of a local Voronoi cell. In eight steps, sensor $s_{t_0}$ adds the bisector with each neighbor $s_{t_1}, \ldots, s_{t_7}$ one at a time until its cell is enclosed.]

[Figure 4: Dynamic background subtraction for object and movement detection. Subtracting the background image $f(x, y, t_i)$ from a subsequent image $f(x, y, t_{i+1})$ yields the detected object; after a background update, subtracting the updated background $f(x, y, t_{i+1})$ from a new subsequent image $f(x, y, t_{i+2})$ yields the moved object.]

4.2. Object Detection. In a tracking system, object detection is necessary prior to object tracking. For tracking in a VSN, detection can be done by object segmentation from the camera scenes. Background subtraction [28–30] is an important approach for object segmentation in visual surveillance systems; it compares two images to acquire their difference. A simple illustration of background subtraction is shown in Figure 4: a foreground intruder object can be extracted by comparing the current image frame $f(x, y, t_{i+1})$ with the previous reference background image $f(x, y, t_i)$. Furthermore, a dynamic background subtraction approach keeps the background image updated for the next comparison with a newer incoming image, in order to detect object movement. Dynamic background subtraction can be defined by the following equation, where $f_{diff}(x, y, t_{i+1})$ is the image of the difference between the background frame and the subsequent frame:

$$f_{diff}(x, y, t_{i+1}) = \left| f(x, y, t_i) - f(x, y, t_{i+1}) \right| \quad (1)$$
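Equation (1) is plain per-pixel frame differencing. A toy sketch on grayscale frames stored as nested lists follows; the detection threshold is our illustrative choice, not a value from the paper.

```python
def frame_diff(background, frame):
    """Per-pixel absolute difference |f(x,y,t_i) - f(x,y,t_{i+1})|, Eq. (1)."""
    return [[abs(b - f) for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

def foreground_mask(background, frame, threshold=30):
    """Binary mask of pixels whose difference exceeds a detection threshold."""
    return [[1 if d > threshold else 0 for d in row]
            for row in frame_diff(background, frame)]

# A 3x3 toy example: one bright "object" pixel appears in the new frame
bg  = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
cur = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]
mask = foreground_mask(bg, cur)  # -> [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
```

For dynamic background subtraction, `cur` would then replace `bg` before the next comparison.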


Table 3: Notations for calculation of the target object direction.

Notation | Description
$\theta_t$ | Angle of the direction from the sensor to the target, relative to the positive $x$-axis
$\beta$ | Included angle between the line of sight from the sensor to the target and the right edge of the field of view (FoV)
$W_R$ | Horizontal resolution (width in pixels) of the image
$w_t$ | Width (in pixels) between the target object center and the right edge of the image
$w_{LB}$ | Width (in pixels) between the left edge of the detected target object and the right edge of the image
$w_{RB}$ | Width (in pixels) between the right edge of the detected target object and the right edge of the image

4.3. Target Localization. Once an incoming object is detected by a visual sensor, the sensor estimates the actual position of the object on the ground plane. To estimate the object position, first the direction of the object is calculated, and second its distance. Regarding the object direction, as shown in Figure 5, the view from the visual sensor is limited by its angle of view, and the captured scene is shown in a rectangular image; things along one line of sight appear in a single vertical line on the image. As illustrated, the direction of the detected target object (a human) can be calculated by the following equations, with the notations listed in Table 3.

    β = α · w_t / W_R = α · (w_LB + w_RB) / (2 W_R)

    θ_t = θ_s − α/2 + β    (2)
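A minimal sketch of the direction computation in (2), with symbols following Table 3; the numeric example values are hypothetical:

```python
import math

# Sketch of the object-direction computation in (2); symbols follow Table 3.
# theta_s is the sensor's sensing direction; all angles are in radians.

def target_direction(theta_s, alpha, w_LB, w_RB, W_R):
    """Angle theta_t from the sensor to the target (vs. the positive x-axis)."""
    w_t = (w_LB + w_RB) / 2       # pixel offset of the object center
    beta = alpha * w_t / W_R      # angle from the right edge of the FoV
    return theta_s - alpha / 2 + beta

# A target centered in the image (w_t = W_R / 2) lies exactly on the
# sensing direction theta_s:
centered = target_direction(theta_s=0.0, alpha=math.radians(105),
                            w_LB=320, w_RB=320, W_R=640)
```

As a sanity check, a centered object gives β = α/2, so θ_t collapses to θ_s.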

Regarding the distance of the object, as shown in Figure 6, the image of the real object is projected in reverse onto the internal image sensor (usually a CCD or CMOS sensor) through the camera lens; the electronic signals generated by the internal image sensor are then processed by an image processor to form the digital image. Since the target object is assumed to have a given height, the distance between the target object and the visual sensor can be calculated by the following equations, with the notations listed in Table 4:

    tan λ = h_t / d_t = h_s / d_s

    d_t = h_t · d_s / h_s = h_t · d_s / ((H_S / H_R)(h_BB − h_TB)) = h_t · d_s · H_R / (H_S · (h_BB − h_TB))    (3)
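The distance computation in (3) can be sketched as follows; the numeric values (lens distance, sensor height, resolutions) are hypothetical, chosen only to exercise the formula:

```python
# Sketch of the distance computation in (3); symbols follow Table 4.
# The numeric values below (sensor height, lens distance, resolutions)
# are hypothetical, chosen only to exercise the formula.

def target_distance(h_t, d_s, H_R, H_S, h_BB, h_TB):
    """Distance d_t between the visual sensor and the target, per (3)."""
    h_s = H_S * (h_BB - h_TB) / H_R   # object height on the internal sensor
    return h_t * d_s / h_s

# A 1.7 m object, 8 mm lens distance, 480-pixel-high image, 4.8 mm internal
# sensor, object spanning 100 pixels vertically:
d_t = target_distance(h_t=1.7, d_s=0.008, H_R=480, H_S=0.0048,
                      h_BB=148, h_TB=48)
```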

If the height of the target object is not exactly equal to the given assumed value and there is a difference of ε, a calculation error in the distance will occur. As shown in Figure 6, if the actual height of the object is h′_t = h_t + ε and the image

Table 4: Notations for calculation of target object distance.

    h_t:  Assumed (given) height of the target object
    d_t:  Distance between the visual sensor and the target object
    λ:    Included angle between the two lines of sight to the top and bottom of the target object
    H_S:  Height of the internal image sensor
    h_s:  Height of the captured target object on the internal image sensor
    d_s:  Distance between the internal image sensor and the camera lens
    H_R:  Vertical resolution (height, in pixels) of the image
    h_BB: Height (in pixels) between the bottom edge of the object and the top edge of the image
    h_TB: Height (in pixels) between the top edge of the object and the top edge of the image

height of the captured object is the same, the system believes that the object distance is d_t after the calculation. However, the actual distance should be d′_t, which can be calculated by the following equations. Equation (7) shows that there will be an error ratio of ε/h_t between the calculated and actual object distances. For instance, if h_t is given as 170 cm and the calculated object distance d_t is 30 m, but the actual height h′_t of the object is 175 cm, then the actual distance has an error of (5 cm / 170 cm) × 30 m = 0.88 m. We believe that the distance error ratio of ε/h_t is tolerable.

Let

    k = d_s · H_R / (H_S · (h_BB − h_TB))    (4)

    d_t = h_t · k    (5)

    d′_t = (h_t + ε) · k = h_t · k + ε · k = d_t + ε · d_t / h_t    (6)

    d′_t = d_t · (1 + ε / h_t)    (7)
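The error model in (5)-(7) can be checked numerically with the example from the text:

```python
# Numerical check of the distance-error model in (5)-(7), using the example
# from the text: assumed height 170 cm, actual height 175 cm, d_t = 30 m.
h_t, eps, d_t = 170.0, 5.0, 30.0
d_actual = d_t * (1 + eps / h_t)   # equation (7)
error = d_actual - d_t             # equals (eps / h_t) * d_t
```

The error comes out at about 0.88 m, matching the figure quoted in the text.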

As shown in Figure 7, (x_s, y_s) and (x_t, y_t) denote the coordinates of the visual sensor and the target object, respectively. Once both the direction and distance of the target object are calculated, the coordinates of the object can be obtained as follows; these are used in the control of sensor handover:

    x_t = x_s + d_t · cos θ_t
    y_t = y_s + d_t · sin θ_t    (8)
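A sketch of (8), converting the computed direction and distance into ground-plane coordinates:

```python
import math

# Sketch of (8): given the sensor position, the computed direction theta_t,
# and the computed distance d_t, recover the target's ground-plane coordinates.

def target_coordinates(x_s, y_s, d_t, theta_t):
    return (x_s + d_t * math.cos(theta_t), y_s + d_t * math.sin(theta_t))

# A target 10 m away, straight along the positive y-axis:
x_t, y_t = target_coordinates(0.0, 0.0, 10.0, math.pi / 2)
```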

4.4. Sensor Handover. After the visual sensor obtains the coordinates of the target object, it adjusts its sensing direction toward the target and then detects object movement by the dynamic background subtraction described in Section 4.2. Once a movement of the target object is detected, the new


Figure 5: Calculation of object (target) direction.

Figure 6: Calculation of object (target) distance.

position of the object is calculated again by the localization scheme described in Section 4.3. This routine procedure performs local target tracking by the visual sensor until the tracking task is handed over.

To determine whether the visual sensor should hand over the task of tracking the target object, the sensor first utilizes the structure of the constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with the sensor s is divided into several triangular subregions according to the vertices of the cell, and the target object t is located in the subregion Δv_{R_i} s v_{L_i} with an included angle ∠v_{R_i} s v_{L_i} = φ.

A target object t detected by a sensor s belongs to one of the subregions if and only if all of the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

    d_t ≤ r    (9)

(2) The distance between the object and the sensor is less than or equal to the distance between the object and the neighbor sensor on the opposite side of the bisector line (edge) facing the included angle of the subregion:

    √((x_t − x_s)² + (y_t − y_s)²) ≤ √((x_t − x_{s_i})² + (y_t − y_{s_i})²)    (10)

(3) The object is located inside the two edges of the included angle of the subregion:

    (x_t − x_s)(x_{L_i} − x_s) + (y_t − y_s)(y_{L_i} − y_s)
        ≥ [√((x_t − x_s)² + (y_t − y_s)²) / √((x_{R_i} − x_s)² + (y_{R_i} − y_s)²)]
          × ((x_{L_i} − x_s)(x_{R_i} − x_s) + (y_{L_i} − y_s)(y_{R_i} − y_s))    (11)

    (x_t − x_s)(x_{R_i} − x_s) + (y_t − y_s)(y_{R_i} − y_s)
        ≥ [√((x_t − x_s)² + (y_t − y_s)²) / √((x_{L_i} − x_s)² + (y_{L_i} − y_s)²)]
          × ((x_{L_i} − x_s)(x_{R_i} − x_s) + (y_{L_i} − y_s)(y_{R_i} − y_s))    (12)
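The three membership tests (9), (10), and (11)/(12) can be combined into one predicate. The sketch below assumes points are given as (x, y) tuples; the helper names are illustrative:

```python
import math

# Sketch of the membership tests (9)-(12): does target t lie in the
# triangular subregion of sensor s bounded by vertices vL and vR, facing
# neighbor sensor s_i? Points are (x, y) tuples; r is the sensing radius.

def _dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1])

def _norm(u):
    return math.hypot(u[0], u[1])

def in_subregion(s, s_i, vL, vR, t, r):
    st, svL, svR = _sub(t, s), _sub(vL, s), _sub(vR, s)
    if _norm(st) > r:                        # (9): within sensing radius
        return False
    if _norm(st) > _norm(_sub(t, s_i)):      # (10): on s's side of bisector
        return False
    lr = _dot(svL, svR)
    if _dot(st, svL) < _norm(st) / _norm(svR) * lr:   # (11): left edge
        return False
    if _dot(st, svR) < _norm(st) / _norm(svL) * lr:   # (12): right edge
        return False
    return True
```

With s = (0, 0), s_i = (2, 0), and the shared edge through vL = (1, 1) and vR = (1, −1), a target at (0.5, 0) passes all three tests, while (1.5, 0) fails (10) because it is closer to s_i.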

Equation (11) states that the target object is not located outside the left edge s v_{L_i}, and (12) states that it is not located outside the right edge s v_{R_i}. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra, where sv_{L_i} denotes the vector from s to v_{L_i}:

    sv_{L_i} · sv_{R_i} = ‖sv_{L_i}‖ ‖sv_{R_i}‖ cos φ    (13)

    cos φ = (sv_{L_i} · sv_{R_i}) / (‖sv_{L_i}‖ ‖sv_{R_i}‖)    (14)

The angle ∠v_{L_i} s t must be less than or equal to φ; therefore

    st · sv_{L_i} ≥ ‖st‖ ‖sv_{L_i}‖ cos φ

    st · sv_{L_i} ≥ ‖st‖ ‖sv_{L_i}‖ · (sv_{L_i} · sv_{R_i}) / (‖sv_{L_i}‖ ‖sv_{R_i}‖)

    ⟨x_t − x_s, y_t − y_s⟩ · ⟨x_{L_i} − x_s, y_{L_i} − y_s⟩
        ≥ [‖⟨x_t − x_s, y_t − y_s⟩‖ / ‖⟨x_{R_i} − x_s, y_{R_i} − y_s⟩‖]
          · (⟨x_{L_i} − x_s, y_{L_i} − y_s⟩ · ⟨x_{R_i} − x_s, y_{R_i} − y_s⟩)    (15)

and (11) is derived. Similarly, the angle ∠v_{R_i} s t must be less than or equal to φ; thus st · sv_{R_i} ≥ ‖st‖ ‖sv_{R_i}‖ cos φ, and (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, there are four possible conditions while the sensor keeps tracking the moving target object:

(1) As shown in Figure 9(a), the target object can still be detected and it is still located in the same triangular subregion; that is, (9), (10), (11), and (12) are still satisfied under the same pair of vertices v_{L_i} and v_{R_i}.


Figure 9: Different target tracking conditions: (a) Condition 1; (b) Condition 2; (c) Condition 3; (d) Condition 4.

For this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor calculates a new pair of vertices v_{L_j} and v_{R_j} for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line v_{L_i} v_{R_i} between the two sensors s and s_i; that is, (10) is no longer satisfied and the object has moved from the original Voronoi cell into the Voronoi cell belonging to the sensor s_i. This means that s_i will obtain a better sensing (image) quality than s and is more suitable to track the target object. For this condition, the sensor s sends a request message with the object coordinates to s_i and hands the tracking task over to s_i. After receiving the request message, the sensor s_i adjusts its sensing direction toward the target object and replies with a message to s to confirm taking over the tracking task.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of the sensor s; that is, (9) is no longer satisfied. This means that the sensor has lost track of


Figure 10: Procedures of the proposed VSHP.

the target object. For this condition, the sensor sends a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message adjust their sensing directions toward the received coordinates and wait (detect) for the possible appearance of the target object. For example, although the target object has been lost, it may move around and appear near one of the vertices of the triangular subregion (e.g., v_{R_i}). Accordingly, the sensor s sends a request message to both s_i and s_j to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for the request message, where (x_{tf}, y_{tf}) and V_{s_k} denote the final detected coordinates of the target and the set of local Voronoi vertices of sensor s_k, respectively:

    V′ = { v | v ∈ {v_{L_i}, v_{R_i}}, √((x_v − x_s)² + (y_v − y_s)²) > r }

    v″ = argmin_{v ∈ V′} √((x_v − x_{tf})² + (y_v − y_{tf})²),   if √((x_{s_i} − x_s)² + (y_{s_i} − y_s)²) / 2 > r
    v″ = V′,                                                     otherwise

    S_req = { s_k | v″ ∈ V_{s_k} } − {s}    (16)
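A sketch of the selection rule in (16). The data layout (a mapping from neighbor sensors to their local Voronoi vertex sets) is an assumption for illustration:

```python
import math

# Sketch of the destination-sensor selection in (16) for Condition 4 (target
# lost). voronoi_vertices maps each neighbor sensor to the set of vertices of
# its local Voronoi cell (an assumed data layout); t_final is the last
# detected target position (x_tf, y_tf).

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def handover_candidates(s, s_i, vL, vR, t_final, r, voronoi_vertices):
    # V': subregion vertices lying beyond the sensing radius of s
    v_far = [v for v in (vL, vR) if _dist(v, s) > r]
    # If half the distance to the facing neighbor s_i exceeds r, keep only
    # the vertex nearest to the last known target position (the argmin case)
    if v_far and _dist(s_i, s) / 2 > r:
        v_far = [min(v_far, key=lambda v: _dist(v, t_final))]
    # S_req: every other sensor whose local Voronoi cell owns such a vertex
    return {sk for sk, verts in voronoi_vertices.items()
            if sk != s and any(v in verts for v in v_far)}
```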

The probability that a target can be detected by a visual sensor equals the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of losing the target equals e^{−λπr²}, where λ is the density of visual sensors and r is the sensing radius. This probability equals 1 − R_omni, where R_omni is the omnidirectional (circular) coverage ratio of the sensors. The reason is that one visual sensor is notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; this fails only if no other sensor can cover the target even after rotating its sensing direction.

As described in Section 4.1, after the visual sensors are initially deployed, an algorithm for sensing coverage enhancement is performed. Whenever a sensor keeps tracking a moving target object, or several sensors are notified to adjust their sensing directions while waiting for the possible appearance of a lost target, their sensing directions change; the overlapped coverage can increase and the overall field coverage can thus be reduced. This has a negative effect on further object detection and tracking. Therefore, in the proposed protocol, a sensor that has changed its sensing direction returns to its original direction if it has lost the target or has waited for the target's appearance longer than a time t_ret.

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume the sensor's energy, which can cause energy exhaustion and sensor malfunction in wireless VSN applications. A recent research topic and current trend concerning the energy issue of sensor networks is energy-harvesting sensors [31-33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee future sensor networks consisting of sensor devices integrating energy-harvesting technology. This paper focuses on target localization and tracking in VSNs; therefore the energy issue is not addressed here. Nevertheless, the proposed algorithm can still perform well while some of the sensors are malfunctioning, because the Voronoi-based design enables visual sensors to easily reconstruct their local Voronoi cells and keep


Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m).

the VSN tracking operations performing well until most (or all) of the sensors have failed. The protocol thus exhibits fault tolerance and graceful degradation.

On the other hand, the tracking algorithm described above focuses on tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and each can be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain within the FoV of the sensor associated with that Voronoi cell, or (2) each of the targets is located within the sensing radius of some other visual sensor that can be notified to take it over.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long it takes a sensor to detect the target after the target has moved into the surveillance region. The target-tracked ratio indicates the total duration for which the target is tracked during its movement across the surveillance region. The average target distance reflects the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the region following a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared by the above QoS criteria. Table 5 lists the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under OVSN

Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m).

Figure 13: Average target distance (fixed α = 105°, r = 70 m).

and under the proposed VSHP approach, respectively. The ratio of target-detected latency (r_L) is defined in (17), where t_D is the time taken by a sensor to detect the target after the target has moved into the surveillance region and t_SR is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is less than that of OVSN regardless of the number of deployed sensors (n = 50~100); VSHP reduces the time to detect the target by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP; the simulation result shows a coverage improvement of about 5.5% with VSHP.

    r_L = t_D / t_SR    (17)

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a high target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN, owing to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by


Table 5: Parameter settings of the simulations.

    F: Size of the surveillance region; value 800 m × 800 m
    n: Number of visual sensors; values 50~100 (interval 10); default 100
    α: Angle of view (AoV) of the visual sensors; values 60°~120° (interval 15°); default 105°
    r: Sensing radius of the visual sensors; values 50 m~90 m (interval 10 m); default 70 m

Figure 14: Coverage and target-detected latency (fixed n = 100).

Figure 15: Target-tracked ratio (fixed n = 100).

the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP achieves a shorter average target distance than OVSN, because the handover protocol exploits the structure of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance decreases from 40 m to 35 m as the number of sensors increases from 50 to 100, indicating that VSHP can select more appropriate sensors for target tracking from a larger deployment.

Figure 16: Average target distance (fixed n = 100).

The results above were obtained with various values of n but fixed values of α and r. Simulation results for fixed n with various α and r are given in Figures 14, 15, and 16, which show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those above: VSHP obtained higher coverage ratios and lower target-detected latency than OVSN, a higher target-tracked ratio, and a shorter average target distance. In summary, VSHP provides a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach against four QoS criteria for target tracking. The results show that the approach performs well and improves on target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors and to evaluate its effectiveness and performance experimentally. Moreover, the use of mobile visual sensors (e.g., smartphones) and integration with cloud-based


monitoring services will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and by the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley and Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31-45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251-259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619-632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238-1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68-75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44-49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281-297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562-3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293-301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181-197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318-1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155-164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21-38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1-5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255-263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263-271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1-4, pp. 153-174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202-213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113-128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709-1724, 2011.

[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263-2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748-1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443-461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555-567, 2004.




Figure 3: Construction of a local Voronoi cell (eight steps).

Figure 4: Dynamic background subtraction for object and movement detection.

4.2. Object Detection. In a tracking system, object detection is necessary prior to object tracking. For tracking in a VSN, detection can be done by segmenting objects out of the camera scenes. Background subtraction [28–30] is an important approach to object segmentation in visual surveillance systems; it compares two images to obtain their difference. A simple illustration of background subtraction is shown in Figure 4. A foreground intruder object can be extracted by comparing the current image frame $f(x, y, t_{i+1})$ with the previous reference background image $f(x, y, t_i)$. Furthermore, a dynamic background subtraction approach keeps the background image updated for the next comparison with a newer incoming image, in order to detect object movement. Dynamic background subtraction can be defined by the following equation, where $f_{\mathrm{diff}}(x, y, t_{i+1})$ is the difference image between the background frame and the subsequent frame:

$$f_{\mathrm{diff}}(x, y, t_{i+1}) = \left| f(x, y, t_i) - f(x, y, t_{i+1}) \right|. \tag{1}$$
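As a concrete sketch of (1), the following Python snippet (a minimal illustration, not the authors' implementation; the threshold value is an assumption) computes a thresholded difference mask between a background frame and a subsequent frame:

```python
def background_subtract(background, frame, threshold=25):
    """Dynamic background subtraction, Eq. (1): build the difference
    image |f(x, y, t_i) - f(x, y, t_{i+1})| and threshold it into a
    binary foreground mask (1 = foreground pixel)."""
    return [[1 if abs(b - f) > threshold else 0
             for b, f in zip(b_row, f_row)]
            for b_row, f_row in zip(background, frame)]

# A 3x3 toy example: an "object" appears in the middle column.
background = [[10, 10, 10],
              [10, 10, 10],
              [10, 10, 10]]
frame = [[10, 200, 10],
         [10, 210, 10],
         [10, 10, 10]]
mask = background_subtract(background, frame)
# mask marks the two pixels where the object entered
```

In the dynamic variant, `frame` would then replace `background` before the next comparison.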


Table 3: Notations for calculation of target object direction.

θ_t: Angle of the direction from the sensor to the target, relative to the positive x-axis.
β: Included angle between the line of sight from the sensor to the target and the right edge of the field of view (FoV).
W_R: Horizontal resolution (width in pixels) of the image.
w_t: Width (in pixels) between the target object center and the right edge of the image.
w_LB: Width (in pixels) between the left edge of the detected target object and the right edge of the image.
w_RB: Width (in pixels) between the right edge of the detected target object and the right edge of the image.

4.3. Target Localization. Once an incoming object is detected by a visual sensor, the sensor estimates the actual position of the object on the ground plane. To estimate the object position, the direction of the object is calculated first, and then its distance. Regarding the object direction, as shown in Figure 5, the view of the visual sensor is limited by its angle of view, and the captured scene appears as a rectangular image. Objects along a single line of sight appear in the same vertical line of the image. As illustrated, the direction of the detected target object (a human) can be calculated by the following equations, with the notations listed in Table 3:

$$\beta = \alpha \cdot \frac{w_t}{W_R} = \alpha \cdot \frac{w_{LB} + w_{RB}}{2 W_R}, \qquad \theta_t = \theta_s - \frac{\alpha}{2} + \beta. \tag{2}$$
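A minimal numeric sketch of (2); the resolution and pixel offsets below are hypothetical values, not measurements from the paper:

```python
import math

alpha = math.radians(105)    # angle of view of the sensor
theta_s = math.radians(90)   # current sensing direction (from positive x-axis)
W_R = 640                    # horizontal image resolution in pixels
w_LB, w_RB = 300, 340        # object edges, measured from the right image edge

w_t = (w_LB + w_RB) / 2               # object-center offset from the right edge
beta = alpha * w_t / W_R              # angle from the right FoV edge, Eq. (2)
theta_t = theta_s - alpha / 2 + beta  # direction from sensor to target
```

Here the object center happens to sit in the middle of the image (w_t = W_R/2), so β = α/2 and θ_t reduces to θ_s, a quick sanity check of the formula.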

Regarding the object distance, as shown in Figure 6, the image of the real object is projected reversely onto the internal image sensor (usually a CCD or CMOS sensor) through the camera lens; the electronic signals generated by the internal image sensor are then processed by an image processor to form the digital image. Since the target object is assumed to have a given height, the distance between the target object and the visual sensor can be calculated by the following equations, with the notations listed in Table 4:

$$\tan\lambda = \frac{h_t}{d_t} = \frac{h_s}{d_s}, \qquad d_t = \frac{h_t d_s}{h_s} = \frac{h_t d_s}{H_S (h_{BB} - h_{TB}) / H_R} = \frac{h_t d_s H_R}{H_S (h_{BB} - h_{TB})}. \tag{3}$$

Table 4: Notations for calculation of target object distance.

h_t: Assumed (given) height of the target object.
d_t: Distance between the visual sensor and the target object.
λ: Included angle of the two lines of sight to the top and bottom of the target object.
H_S: Height of the internal image sensor.
h_s: Height of the captured target object on the internal image sensor.
d_s: Distance between the internal image sensor and the camera lens.
H_R: Vertical resolution (height in pixels) of the image.
h_BB: Height (in pixels) between the bottom edge of the object and the top edge of the image.
h_TB: Height (in pixels) between the top edge of the object and the top edge of the image.

If the height of the target object is not exactly equal to the given assumed value, with a difference of ε, a distance calculation error occurs. As shown in Figure 6, if the actual height of the object is h′_t = h_t + ε while the image height of the captured object is the same, the system believes after the calculation that the object distance is d_t. However, the actual distance is d′_t, which can be calculated by the following equations. Equation (7) shows that there is an error ratio of ε/h_t between the calculated and actual object distances. For instance, if h_t is given as 170 cm and the calculated object distance d_t is 30 m, but the actual height h′_t of the object is 175 cm, then the actual distance differs by (5 cm / 170 cm) × 30 m ≈ 0.88 m. We believe that a distance error ratio of ε/h_t is tolerable.

Let

$$k = \frac{d_s H_R}{H_S (h_{BB} - h_{TB})}, \tag{4}$$

$$d_t = h_t k, \tag{5}$$

$$d'_t = (h_t + \varepsilon) k = h_t k + \varepsilon k = d_t + \varepsilon \frac{d_t}{h_t}, \tag{6}$$

$$d'_t = d_t \left( 1 + \frac{\varepsilon}{h_t} \right). \tag{7}$$
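The 0.88 m example above can be checked directly from (7):

```python
h_t = 170.0    # assumed target height (cm)
d_t = 30.0     # calculated target distance (m)
eps = 5.0      # actual height is 175 cm, so eps = 5 cm

d_t_actual = d_t * (1 + eps / h_t)   # Eq. (7)
error = d_t_actual - d_t             # distance error in meters
print(round(error, 2))               # 0.88
```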

As shown in Figure 7, (x_s, y_s) and (x_t, y_t) denote the coordinates of the visual sensor and the target object, respectively. Once both the direction and the distance of the target object are calculated, the coordinates of the object can be obtained as follows; these are used in the control of sensor handover:

$$x_t = x_s + d_t \cos\theta_t, \qquad y_t = y_s + d_t \sin\theta_t. \tag{8}$$
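Putting (3) and (8) together gives a single-sensor localization routine. The sketch below is illustrative only; all camera parameters are hypothetical values chosen to make the example run, not figures from the paper:

```python
import math

def localize(x_s, y_s, theta_t, h_t, d_s, H_S, H_R, h_BB, h_TB):
    """Estimate target coordinates from one visual sensor.
    Eq. (3): distance from the assumed target height and the object's
    pixel height; Eq. (8): coordinates from direction and distance."""
    d_t = h_t * d_s * H_R / (H_S * (h_BB - h_TB))   # Eq. (3)
    x_t = x_s + d_t * math.cos(theta_t)             # Eq. (8)
    y_t = y_s + d_t * math.sin(theta_t)
    return x_t, y_t, d_t

# Hypothetical camera: 4 mm lens-to-sensor distance, 2.7 mm sensor
# height, 480-pixel vertical resolution; target assumed 1.70 m tall.
x_t, y_t, d_t = localize(x_s=0.0, y_s=0.0, theta_t=math.radians(0),
                         h_t=1.70, d_s=0.004, H_S=0.0027, H_R=480,
                         h_BB=300, h_TB=100)
# d_t ≈ 6.04 m, so the target is estimated at roughly (6.04, 0)
```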

4.4. Sensor Handover. After the visual sensor obtains the coordinates of the target object, it adjusts its sensing direction toward the target and then detects object movement by the dynamic background subtraction described in Section 4.2. Once a movement of the target object is detected, the new


Figure 5: Calculation of object (target) direction.

Figure 6: Calculation of object (target) distance.

position of the object is calculated again by the localization scheme described in Section 4.3. This routine procedure of local target tracking by the visual sensor continues until the tracking task is handed over.

To determine whether the visual sensor should hand over the task of tracking the target object, the sensor first utilizes the structure of the constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with the sensor s is divided into triangular subregions according to the vertices of the cell; the target object t is located in the subregion Δv_{R_i} s v_{L_i}, whose included angle is ∠v_{R_i} s v_{L_i} = φ.

A target object t detected by a sensor s belongs to one of the subregions if and only if all of the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

$$d_t \le r. \tag{9}$$

(2) The distance between the object and the sensor is less than or equal to that between the object and the neighbor sensor on the opposite side of the bisector line (edge) facing the included angle of the subregion:

$$\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2} \le \sqrt{(x_t - x_{s_i})^2 + (y_t - y_{s_i})^2}. \tag{10}$$

(3) The object is located inside the two edges of the included angle of the subregion:

$$(x_t - x_s)(x_{L_i} - x_s) + (y_t - y_s)(y_{L_i} - y_s) \ge \sqrt{\frac{(x_t - x_s)^2 + (y_t - y_s)^2}{(x_{R_i} - x_s)^2 + (y_{R_i} - y_s)^2}} \times \left( (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \right), \tag{11}$$

$$(x_t - x_s)(x_{R_i} - x_s) + (y_t - y_s)(y_{R_i} - y_s) \ge \sqrt{\frac{(x_t - x_s)^2 + (y_t - y_s)^2}{(x_{L_i} - x_s)^2 + (y_{L_i} - y_s)^2}} \cdot \left( (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \right). \tag{12}$$
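Conditions (9)–(12) amount to a point-in-subregion test that each sensor can evaluate locally. A direct transcription in Python (a sketch with illustrative names; points are (x, y) tuples):

```python
import math

def in_subregion(s, t, s_i, v_L, v_R, r):
    """True iff target t satisfies conditions (9)-(12) for sensor s,
    neighbor s_i, subregion vertices v_L and v_R, sensing radius r."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]
    def norm(a):
        return math.hypot(a[0], a[1])

    st, sL, sR = sub(t, s), sub(v_L, s), sub(v_R, s)
    lr = dot(sL, sR)                                     # s->v_L dotted with s->v_R
    return (norm(st) <= r                                # condition (9)
            and norm(st) <= norm(sub(t, s_i))            # condition (10)
            and dot(st, sL) >= norm(st) / norm(sR) * lr  # condition (11)
            and dot(st, sR) >= norm(st) / norm(sL) * lr) # condition (12)

# Sensor at the origin, neighbor at (10, 10), vertices on the axes:
print(in_subregion((0, 0), (3, 3), (10, 10), (0, 10), (10, 0), r=8))  # True
print(in_subregion((0, 0), (7, 7), (10, 10), (0, 10), (10, 0), r=8))  # False (outside r)
```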

Equation (11) states that the target object is not located outside the left edge s v_{L_i}, and (12) states that it is not located outside the right edge s v_{R_i}. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra:

$$\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}} = \left\| \overrightarrow{s v_{L_i}} \right\| \left\| \overrightarrow{s v_{R_i}} \right\| \cos\phi, \tag{13}$$

$$\cos\phi = \frac{\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}}}{\left\| \overrightarrow{s v_{L_i}} \right\| \left\| \overrightarrow{s v_{R_i}} \right\|}. \tag{14}$$

The angle ∠v_{L_i} s t must be less than or equal to φ; therefore,

$$\overrightarrow{st} \cdot \overrightarrow{s v_{L_i}} \ge \left\| \overrightarrow{st} \right\| \left\| \overrightarrow{s v_{L_i}} \right\| \cos\phi,$$

$$\overrightarrow{st} \cdot \overrightarrow{s v_{L_i}} \ge \left\| \overrightarrow{st} \right\| \left\| \overrightarrow{s v_{L_i}} \right\| \cdot \frac{\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}}}{\left\| \overrightarrow{s v_{L_i}} \right\| \left\| \overrightarrow{s v_{R_i}} \right\|},$$

$$\langle x_t - x_s, \, y_t - y_s \rangle \cdot \langle x_{L_i} - x_s, \, y_{L_i} - y_s \rangle \ge \frac{\left\| \langle x_t - x_s, \, y_t - y_s \rangle \right\|}{\left\| \langle x_{R_i} - x_s, \, y_{R_i} - y_s \rangle \right\|} \cdot \langle x_{L_i} - x_s, \, y_{L_i} - y_s \rangle \cdot \langle x_{R_i} - x_s, \, y_{R_i} - y_s \rangle. \tag{15}$$

Then (11) is derived. Similarly, the angle ∠v_{R_i} s t must be less than or equal to φ; thus $\overrightarrow{st} \cdot \overrightarrow{s v_{R_i}} \ge \| \overrightarrow{st} \| \| \overrightarrow{s v_{R_i}} \| \cos\phi$, and (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, four conditions can arise while the sensor keeps tracking the moving target object.

(1) As shown in Figure 9(a), the target object can still be detected, and it is still located in the same triangular subregion; that is, (9), (10), (11), and (12) are still satisfied under the same pair of vertices v_{L_i} and v_{R_i}.


Figure 9: Different target tracking conditions: (a) condition 1; (b) condition 2; (c) condition 3; (d) condition 4.

For this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion into another subregion. For this condition, the sensor calculates a new pair of vertices v_{L_j} and v_{R_j} for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line v_{L_i} v_{R_i} between the two sensors s and s_i; that is, (10) is no longer satisfied, and the object has moved from the original Voronoi cell into the Voronoi cell of sensor s_i. This means that s_i will obtain better sensing (image) quality than s and is more suitable for tracking the target object. For this condition, sensor s sends a request message with the object coordinates to s_i and hands over the tracking task. After receiving the request message, sensor s_i adjusts its sensing direction toward the target object and replies with a message to s confirming the takeover of the tracking task.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of sensor s; that is, (9) is no longer satisfied. This means that the sensor has lost track of


Figure 10: Procedures of the proposed VSHP (random deployment; broadcast of sensor coordinates; reception of neighbor coordinates; construction of the local Voronoi cell; coverage enhancement; object detection; target localization; target movement detection; then, depending on tracking conditions 1–4: normal tracking, subregion change, a handover request to the faced sensor, or handover requests to the selected sensors).

the target object. For this condition, the sensor sends a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message adjust their sensing directions toward the received coordinates and wait (detect) for the possible reappearance of the target object. For example, although the target object was lost, it may move around and appear near one of the vertices of the triangular subregion (e.g., v_{R_i}). Accordingly, the sensor s sends a request message to both s_i and s_j to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for the request message, where (x_{t_f}, y_{t_f}) are the final detected coordinates of the target and V_{s_k} is the set of local Voronoi vertices of sensor s_k:

$$V' = \left\{ v \;\middle|\; v \in \{v_{L_i}, v_{R_i}\},\ \sqrt{(x_v - x_s)^2 + (y_v - y_s)^2} > r \right\},$$

$$V'' = \begin{cases} \displaystyle \arg\min_{v \in V'} \sqrt{(x_v - x_{t_f})^2 + (y_v - y_{t_f})^2}, & \text{if } \frac{1}{2}\sqrt{(x_{s_i} - x_s)^2 + (y_{s_i} - y_s)^2} > r, \\ V', & \text{otherwise}, \end{cases}$$

$$S_{\mathrm{req}} = \left\{ s_k \mid V'' \subseteq V_{s_k} \right\} - \{s\}. \tag{16}$$
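The selection in (16) can be sketched as follows. The names are illustrative, `neighbor_cells` is an assumed data structure mapping each candidate sensor id to its set of local Voronoi vertices, and the membership test reflects one reading of (16):

```python
import math

def select_handover_sensors(s, s_i, v_L, v_R, t_f, r, neighbor_cells, self_id):
    """Sketch of Eq. (16): choose the sensors to notify after losing the
    target. s, s_i, v_L, v_R, t_f are (x, y) points; neighbor_cells maps
    sensor id -> set of that sensor's local Voronoi vertices."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # V': subregion vertices lying beyond the sensing radius of s
    v_prime = [v for v in (v_L, v_R) if dist(v, s) > r]
    # V'': only the vertex closest to the final target position t_f when
    # the bisector toward s_i is farther away than r; otherwise all of V'
    if dist(s_i, s) / 2 > r:
        v_pp = [min(v_prime, key=lambda v: dist(v, t_f))]
    else:
        v_pp = v_prime
    # S_req: neighbor sensors whose local Voronoi vertex sets contain V''
    return {k for k, verts in neighbor_cells.items()
            if all(v in verts for v in v_pp) and k != self_id}

cells = {"s_i": {(20, 0), (0, 20)}, "s_j": {(20, 0)}, "s_k": {(0, 20)}}
print(sorted(select_handover_sensors((0, 0), (30, 0), (0, 20), (20, 0),
                                     (18, 2), 10, cells, "s")))
# ['s_i', 's_j']: the sensors whose cells contain the selected vertex
```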

The probability that a target can be detected by a visual sensor equals the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of missing the target equals $e^{-\lambda \pi r^2}$, where λ is the density of visual sensors and r is the sensing radius. This probability equals $1 - R_{\mathrm{omni}}$, where $R_{\mathrm{omni}}$ is the omnidirectional (circular) coverage ratio of the sensors. The reason is that one visual sensor is notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; this fails only if no other sensor can cover the target even after rotating its sensing direction.
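As a quick numeric check (a sketch, not a result reported in the paper), with the default simulation parameters from Table 5 (n = 100 sensors on an 800 m × 800 m field, r = 70 m), this miss probability evaluates to roughly 9%:

```python
import math

n = 100            # number of sensors (Table 5 default)
area = 800 * 800   # surveillance region in m^2
r = 70             # sensing radius in m

lam = n / area                               # sensor density (lambda)
p_miss = math.exp(-lam * math.pi * r ** 2)   # e^(-lambda * pi * r^2)
# p_miss ≈ 0.09: about a 9% chance no other sensor covers the target
```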

As described in Section 4.1, after the visual sensors are initially deployed, an algorithm for sensing coverage enhancement is performed. Whenever a sensor keeps tracking a moving target, or several sensors are notified to adjust their sensing directions to await the possible reappearance of a lost target, their sensing directions change; the overlapped coverage can increase and the overall field coverage can therefore decrease. This has a negative effect on further object detection and tracking. Therefore, in the proposed protocol, a sensor that has changed its sensing direction returns to its original direction if it has lost the target or has waited for the target's appearance longer than a time t_ret.

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving-target tracking.

With regard to energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume the sensor's energy. This can cause energy exhaustion and malfunction of sensors in wireless VSN applications. A recent research topic and current trend concerning the energy issue of sensor networks is energy-harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee future sensor networks consisting of sensor devices that integrate energy-harvesting technology. This paper focuses on target localization and tracking in VSNs; therefore, the energy issue is not addressed. However, the proposed algorithm still performs well while some of the sensors are malfunctioning, because the Voronoi-based design lets visual sensors easily reconstruct their local Voronoi cells and can keep


Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m).

the VSN tracking operations performing well until most (or all) of the sensors have failed. The protocol thus exhibits fault tolerance and graceful degradation.

On the other hand, the tracking algorithm described above focuses on tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and can each be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain in the FoV of the associated sensor of that Voronoi cell, or (2) each individual target is located within the sensing radius of some other visual sensor that can be notified to take it over.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long a sensor takes to detect the target after the target has moved into the surveillance region. The target-tracked ratio indicates the total time during which the target is tracked while it moves across the surveillance region. The average target distance represents the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the region according to a random waypoint mobility model [34]. Target tracking with and without the proposed protocol is evaluated and compared using the above QoS criteria. Table 5 shows the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under OVSN

Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m).

Figure 13: Average target distance (fixed α = 105°, r = 70 m).

and under the proposed VSHP approach, respectively. The ratio of target-detected latency r_L is defined in (17), where t_D is how long a sensor takes to detect the target after the target has moved into the surveillance region, and t_SR is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is lower than that of OVSN regardless of the number of deployed sensors (n = 50–100); VSHP reduces the detection time by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP: the result shows a coverage improvement of about 5.5% with VSHP.

$$r_L = \frac{t_D}{t_{SR}}. \tag{17}$$

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a high target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN, owing to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of targettracking in a VSN is the average distance between the sensorand the target while the target is tracked (monitored) by


Table 5: Parameter settings of the simulations.

F: Size of the surveillance region; value: 800 m × 800 m.
n: Number of visual sensors; value: 50–100 (interval 10); default: 100.
α: Angle of view (AoV) of the visual sensors; value: 60°–120° (interval 15°); default: 105°.
r: Sensing radius of the visual sensors; value: 50 m–90 m (interval 10 m); default: 70 m.

Figure 14: Coverage and target-detected latency (fixed n = 100).

Figure 15: Target-tracked ratio (fixed n = 100).

the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP yields a shorter average target distance than OVSN does, because the handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance decreases from 40 m to 35 m as the number of sensors increases from 50 to 100, indicating that VSHP can select more appropriate sensors from a larger pool of deployed sensors.

Figure 16: Average target distance (fixed n = 100).

The results above use various values of n but fixed values of α and r. Results for fixed n with varying α and r are given in Figures 14, 15, and 16, which show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those above: VSHP obtained higher coverage ratios and lower target-detected latency than OVSN, a higher target-tracked ratio, and a shorter average target distance. In summary, VSHP provides a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach against four QoS criteria for target tracking. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors and to evaluate its effectiveness and performance experimentally. Moreover, the use of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based


monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and by the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O Barnich and M Van Droogenbroeck ldquoViBe a universalbackground subtraction algorithm for video sequencesrdquo IEEETransactions on Image Processing vol 20 no 6 pp 1709ndash17242011

14 International Journal of Distributed Sensor Networks

[31] N M Roscoe and M D Judd ldquoHarvesting energy frommagnetic fields to power condition monitoring sensorsrdquo IEEESensors Journal vol 13 no 6 pp 2263ndash2270 2013

[32] L Xie Y Shi Y T Hou and H D Sherali ldquoMaking sensornetworks immortal an energy-renewal approach with wirelesspower transferrdquo IEEEACMTransactions onNetworking vol 20no 6 pp 1478ndash1761 2012

[33] S Sudevalayam and P Kulkarni ldquoEnergy harvesting sensornodes survey and implicationsrdquo IEEE Communications Surveysand Tutorials vol 13 no 3 pp 443ndash461 2011

[34] C Bettstetter H Hartenstein and X Perez-Costa ldquoStochasticproperties of the random waypoint mobility modelrdquo WirelessNetworks vol 10 no 5 pp 555ndash567 2004


Table 3: Notations for calculation of target object direction.

Notation : Description
θ_t : Angle of the direction from the sensor to the target, relative to the positive x-axis
β : Included angle between the line of sight from the sensor to the target and the right edge of the field of view (FoV)
W_R : Horizontal resolution (width in pixels) of the image
w_t : Width (in pixels) of the distance between the target object center and the right edge of the image
w_LB : Width (in pixels) of the distance between the left edge of the detected target object and the right edge of the image
w_RB : Width (in pixels) of the distance between the right edge of the detected target object and the right edge of the image

4.3. Target Localization. Once an incoming object is detected by a visual sensor, the sensor estimates the actual position of the object on the ground plane. To estimate the object position, the direction of the object is calculated first, followed by its distance. Regarding the object direction, as shown in Figure 5, the view from the visual sensor is limited by its angle of view, and the captured scene appears on the image as a rectangle. Objects on the same line of sight appear on the same vertical line of the image. As illustrated, the direction of the detected target object (a human) can be calculated by the following equations, with the notations listed in Table 3:

β = α · w_t / W_R = α · (w_LB + w_RB) / (2 W_R),
θ_t = θ_s − α/2 + β.  (2)
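The direction computation in (2) maps the target's horizontal pixel position to a bearing. A minimal sketch of this step follows; the function and parameter names are our own, not from the paper:

```python
import math

def target_direction(theta_s, alpha, w_lb, w_rb, w_r):
    """Bearing theta_t of the target per Eq. (2).

    theta_s : sensing direction of the camera (radians)
    alpha   : angle of view (AoV) of the camera (radians)
    w_lb, w_rb : pixel distances from the target's left and right edges
                 to the right edge of the image
    w_r     : horizontal image resolution in pixels (W_R)
    """
    w_t = (w_lb + w_rb) / 2.0       # pixel offset of the target center
    beta = alpha * w_t / w_r        # angle measured from the right FoV edge
    return theta_s - alpha / 2.0 + beta
```

A sanity check on the formula: a target centered in the image has w_t = W_R/2, so β = α/2 and θ_t reduces to the sensing direction θ_s.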

Regarding the object distance, as shown in Figure 6, the image of the real object is projected reversely onto the internal image sensor (usually a CCD or CMOS sensor) through the camera lens, and the electronic signals generated by the internal image sensor are processed by an image processor to form the digital image. Since the target object is assumed to have a given height, the distance between the target object and the visual sensor can be calculated by the following equations, with the notations listed in Table 4:

tan λ = h_t / d_t = h_s / d_s,
d_t = h_t d_s / h_s = h_t d_s / ((H_S / H_R)(h_BB − h_TB)) = h_t d_s H_R / (H_S (h_BB − h_TB)).  (3)

Table 4: Notations for calculation of target object distance.

Notation : Description
h_t : Assumed (given) height of the target object
d_t : Distance between the visual sensor and the target object
λ : Included angle of the two lines of sight to the top and bottom of the target object
H_S : Height of the internal image sensor
h_s : Height of the captured target object on the internal image sensor
d_s : Distance between the internal image sensor and the camera lens
H_R : Vertical resolution (height in pixels) of the image
h_BB : Height (in pixels) of the distance between the bottom edge of the object and the top edge of the image
h_TB : Height (in pixels) of the distance between the top edge of the object and the top edge of the image

If the height of the target object is not exactly equal to the given assumed value and there is a difference of ε, a calculation error in the distance occurs. As shown in Figure 6, if the actual height of the object is h_t′ = h_t + ε and the image height of the captured object is the same, the system believes after the calculation that the object distance is d_t; however, the actual distance is d_t′, which can be calculated by the following equations. Equation (7) shows that there is an error ratio of ε/h_t between the calculated and actual object distances. For instance, if h_t is given as 170 cm and the calculated object distance d_t is 30 m, but the actual height h_t′ of the object is 175 cm, then the actual distance differs by (5 cm / 170 cm) × 30 m ≈ 0.88 m. We consider a distance error ratio of ε/h_t to be tolerable.

Let

k = d_s H_R / (H_S (h_BB − h_TB)).  (4)

Then

d_t = h_t k,  (5)

d_t′ = (h_t + ε) k = h_t k + ε k = d_t + (ε / h_t) d_t,  (6)

d_t′ = d_t (1 + ε / h_t).  (7)
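Equations (3) through (7) condense into two small helpers: one for the pinhole-model distance estimate and one for the height-mismatch error of (7). A hedged sketch follows (function and parameter names are ours, not from the paper):

```python
def target_distance(h_t, d_s, h_sensor, v_res, h_bb, h_tb):
    """Pinhole-model distance estimate d_t per Eqs. (3)-(5).

    h_t      : assumed target height (m)
    d_s      : lens-to-image-sensor distance (m)
    h_sensor : physical height H_S of the internal image sensor (m)
    v_res    : vertical image resolution H_R (pixels)
    h_bb, h_tb : pixel rows of the target's bottom and top edges,
                 measured from the top edge of the image
    """
    k = d_s * v_res / (h_sensor * (h_bb - h_tb))   # Eq. (4)
    return h_t * k                                  # Eq. (5)

def distance_error(d_t, h_t, eps):
    """Absolute distance error when the true height is h_t + eps, Eq. (7)."""
    return d_t * eps / h_t

# The paper's example: assumed h_t = 170 cm, true height 175 cm, d_t = 30 m
print(distance_error(30.0, 170.0, 5.0))   # ≈ 0.88 m
```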

As shown in Figure 7, (x_s, y_s) and (x_t, y_t) indicate the coordinates of the visual sensor and the target object, respectively. Once both the direction and the distance of the target object are calculated, the coordinates of the object can be obtained as follows; these coordinates are used in the control of sensor handover:

x_t = x_s + d_t cos θ_t,
y_t = y_s + d_t sin θ_t.  (8)
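For illustration, Equation (8) converts the estimated direction and distance into ground-plane coordinates (a trivial sketch; the function name is assumed):

```python
import math

def localize(x_s, y_s, theta_t, d_t):
    """Ground-plane coordinates of the target per Eq. (8)."""
    return (x_s + d_t * math.cos(theta_t), y_s + d_t * math.sin(theta_t))
```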

4.4. Sensor Handover. After the visual sensor obtains the coordinates of the target object, it adjusts its sensing direction toward the target and then detects object movement by the dynamic background subtraction mentioned in Section 4.2. Once a movement of the target object is detected, the new

Figure 5: Calculation of object (target) direction.

Figure 6: Calculation of object (target) distance.

position of the object is calculated again by the localization scheme described in Section 4.3. This routine procedure of local target tracking is repeated by the visual sensor until the tracking task is handed over.

To determine whether the visual sensor should hand over the task of tracking the target object, the sensor first utilizes the structure of the constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with the sensor s is divided into several triangular subregions according to the vertices of the cell, and the target object t is located in the subregion Δv_Ri s v_Li with an included angle ∠v_Ri s v_Li = φ.

A target object t detected by a sensor s belongs to one of these subregions if and only if all of the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

d_t ≤ r.  (9)

(2) The distance between the object and the sensor s is less than or equal to the distance between the object and the neighbor sensor s_i on the opposite side of the bisector line (edge) facing the included angle of the subregion:

√((x_t − x_s)² + (y_t − y_s)²) ≤ √((x_t − x_si)² + (y_t − y_si)²).  (10)

(3) The object is located inside the two edges of the included angle of the subregion:

(x_t − x_s)(x_Li − x_s) + (y_t − y_s)(y_Li − y_s) ≥ [√((x_t − x_s)² + (y_t − y_s)²) / √((x_Ri − x_s)² + (y_Ri − y_s)²)] × ((x_Li − x_s)(x_Ri − x_s) + (y_Li − y_s)(y_Ri − y_s)),  (11)

(x_t − x_s)(x_Ri − x_s) + (y_t − y_s)(y_Ri − y_s) ≥ [√((x_t − x_s)² + (y_t − y_s)²) / √((x_Li − x_s)² + (y_Li − y_s)²)] × ((x_Li − x_s)(x_Ri − x_s) + (y_Li − y_s)(y_Ri − y_s)).  (12)
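Conditions (9) through (12) amount to a point-in-sector test plus a nearest-sensor check. The helper below sketches how a sensor might evaluate them; it is illustrative, not the authors' implementation:

```python
import math

def in_subregion(s, t, v_l, v_r, s_i, r):
    """True if target t satisfies conditions (9)-(12) for the triangular
    subregion of sensor s spanned by Voronoi vertices v_l and v_r, whose
    opposite neighbor sensor is s_i. All points are (x, y) tuples."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]
    def norm(a):
        return math.hypot(a[0], a[1])

    st, sl, sr = sub(t, s), sub(v_l, s), sub(v_r, s)
    d_t = norm(st)
    if d_t > r:                                   # Eq. (9): within sensing radius
        return False
    if d_t > norm(sub(t, s_i)):                   # Eq. (10): closer to s than to s_i
        return False
    lr = dot(sl, sr)
    if dot(st, sl) < (d_t / norm(sr)) * lr:       # Eq. (11): inside the left edge
        return False
    if dot(st, sr) < (d_t / norm(sl)) * lr:       # Eq. (12): inside the right edge
        return False
    return True
```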

Equation (11) ensures that the target object is not located outside the left edge of the subregion, and (12) ensures that it is not located outside the right edge. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra, where (s→v_Li) denotes the vector from s to v_Li, (s→v_Ri) the vector from s to v_Ri, and (s→t) the vector from s to t:

(s→v_Li) · (s→v_Ri) = ‖(s→v_Li)‖ ‖(s→v_Ri)‖ cos φ,  (13)

cos φ = ((s→v_Li) · (s→v_Ri)) / (‖(s→v_Li)‖ ‖(s→v_Ri)‖).  (14)

The angle ∠v_Li s t must be less than or equal to φ; therefore

(s→t) · (s→v_Li) ≥ ‖(s→t)‖ ‖(s→v_Li)‖ cos φ,

(s→t) · (s→v_Li) ≥ ‖(s→t)‖ ‖(s→v_Li)‖ · ((s→v_Li) · (s→v_Ri)) / (‖(s→v_Li)‖ ‖(s→v_Ri)‖),

(x_t − x_s, y_t − y_s) · (x_Li − x_s, y_Li − y_s) ≥ [‖(x_t − x_s, y_t − y_s)‖ / ‖(x_Ri − x_s, y_Ri − y_s)‖] × ((x_Li − x_s, y_Li − y_s) · (x_Ri − x_s, y_Ri − y_s)),  (15)

from which (11) is obtained. Similarly, the angle ∠v_Ri s t must be less than or equal to φ; thus (s→t) · (s→v_Ri) ≥ ‖(s→t)‖ ‖(s→v_Ri)‖ cos φ, and (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, there are four possible conditions while the sensor keeps tracking the moving target object.

(1) As shown in Figure 9(a), the target object can still be detected and is still located in the same triangular subregion; that is to say, (9), (10), (11), and (12) are still satisfied under the same pair of vertices v_Li and v_Ri.

Figure 9: Different target tracking conditions: (a) condition 1, (b) condition 2, (c) condition 3, (d) condition 4.

For this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor calculates a new pair of vertices v_Lj and v_Rj for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line v_Li v_Ri between the two sensors s and s_i; that is to say, (10) is no longer satisfied, and the object has moved from the original Voronoi cell into the Voronoi cell belonging to the sensor s_i. This means that s_i will obtain a better sensing (image) quality than s, so it is more suitable to track the target object. For this condition, the sensor s sends a request message with the object coordinates to s_i and hands over the tracking task to s_i. After receiving the request message, s_i adjusts its sensing direction toward the target object and replies with a message to s to confirm taking over the tracking task.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of the sensor s; that is to say, (9) is no longer satisfied. This means that the sensor has lost track of

Figure 10: Procedures of the proposed VSHP (random deployment; broadcast and receive of sensor coordinates; local Voronoi cell construction; coverage enhancement; object detection; target localization; target movement detection; and, depending on conditions 1 through 4, normal tracking, change of sensing direction, or handover via request messages).

the target object. For this condition, the sensor sends a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message adjust their sensing directions toward the received coordinates and wait (detect) for the possible appearance of the target object. For example, although the target object has been lost, it may move around and appear near one of the vertices of the triangular subregion (e.g., v_Ri). Accordingly, the sensor s sends a request message to both s_i and s_j to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for the request message, where (x_tf, y_tf) denotes the final detected coordinates of the target and V_sk the set of local Voronoi vertices of sensor s_k:

V′ = { v | v ∈ {v_Li, v_Ri} and √((x_v − x_s)² + (y_v − y_s)²) > r },

v″ = argmin_{v ∈ V′} √((x_v − x_tf)² + (y_v − y_tf)²)  if √((x_si − x_s)² + (y_si − y_s)²) / 2 > r,
v″ = V′  otherwise,

S_req = { s_k | v″ ∈ V_sk } − {s}.  (16)
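The selection rule (16) can be sketched as follows. The `positions` and `cells` dictionaries are assumed bookkeeping structures for illustration, not part of the protocol specification:

```python
import math

def select_request_sensors(s_id, positions, cells, v_l, v_r, s_i_id, t_f, r):
    """Destination sensors for the condition-4 handover request, per Eq. (16).

    positions : dict, sensor id -> (x, y) coordinates
    cells     : dict, sensor id -> set of that sensor's local Voronoi vertices
    v_l, v_r  : vertices of the subregion where the target was last seen
    t_f       : final detected coordinates (x_tf, y_tf) of the target
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    s = positions[s_id]
    # V': subregion vertices lying beyond the sensing radius of s
    v_prime = [v for v in (v_l, v_r) if dist(v, s) > r]
    # v'': only the vertex nearest to the target's last position when the
    # neighbor across the bisector is more than 2r away; otherwise all of V'
    if dist(positions[s_i_id], s) / 2 > r:
        chosen = [min(v_prime, key=lambda v: dist(v, t_f))]
    else:
        chosen = v_prime
    # S_req: every other sensor whose local Voronoi cell owns a chosen vertex
    return {k for k, verts in cells.items()
            if any(v in verts for v in chosen)} - {s_id}
```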

The probability that a target can be detected by a visual sensor equals the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of losing the target equals e^(−λπr²), where λ is the density of visual sensors and r is the sensing radius. This probability equals 1 − R_omni, where R_omni is the omnidirectional (circular) coverage ratio of the sensors. The reason is that one visual sensor is notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; the handover fails only if no other sensor can cover the target even after rotating its sensing direction.
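Using the default simulation parameters of Table 5 (n = 100 sensors in an 800 m × 800 m field, r = 70 m), the claimed miss probability e^(−λπr²) can be worked out numerically (a worked example, not a result from the paper):

```python
import math

def miss_probability(n_sensors, field_area, r):
    """Probability e^(-lambda * pi * r^2) of losing a departing target,
    where lambda = n_sensors / field_area is the sensor density."""
    lam = n_sensors / field_area
    return math.exp(-lam * math.pi * r * r)

print(miss_probability(100, 800 * 800, 70))   # ≈ 0.09
```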

As described in Section 4.1, a sensing coverage enhancement algorithm is performed after the visual sensors are initially deployed. Whenever a sensor keeps tracking a moving target, or several sensors are notified to adjust their sensing directions to await the possible appearance of a lost target, their sensing directions change; the overlapped coverage can then increase and the overall field coverage can be reduced, which would negatively affect further object detection and tracking. Therefore, in the proposed protocol, sensors that have changed their sensing direction return to their original directions once they have lost the target or have waited for its appearance longer than a time t_ret.

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to energy consumption, the sensor operations of computation, communication, direction rotation, and photographing all consume sensor energy, which can lead to energy exhaustion and malfunction in wireless VSN applications. A recent research topic and current trend regarding the energy issue of sensor networks is energy harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee future sensor networks consisting of sensor devices that integrate energy harvesting technology. This paper aims at target localization and tracking in VSNs; therefore, the energy issue is not addressed. However, the proposed algorithm still performs well while some of the sensors are malfunctioning, because the Voronoi-based design enables visual sensors to easily reconstruct their local Voronoi cells and can keep

Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m). Curves: OVSN-C, VSHP-C (coverage) and OVSN-L, VSHP-L (latency) versus the number of visual sensors (50–100).

the VSN tracking operations performing well until most (or all) of the sensors have failed. The protocol thus has the characteristics of fault tolerance and graceful degradation.

On the other hand, the tracking algorithm described above focuses on tracking a single target with one visual sensor. It is also applicable to the case in which multiple targets occur in the surveillance region and each can be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain within the FoV of the sensor associated with that Voronoi cell, or (2) each of the targets is located within the sensing radius of some other visual sensor that can be notified to take over that target.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long it takes a sensor to detect the target after the target has moved into the surveillance region. The target-tracked ratio indicates the fraction of time for which the target is tracked during its movement across the surveillance region. The average target distance represents the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the region with a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared by the above-mentioned QoS criteria. Table 5 lists the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under OVSN

Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m) versus the number of visual sensors (50–100), for OVSN and VSHP.

Figure 13: Average target distance (fixed α = 105°, r = 70 m) versus the number of visual sensors (50–100), for OVSN and VSHP.

and the proposed VSHP approach, respectively. The ratio of target-detected latency r_L is defined as (17), where t_D is the time taken by a sensor to detect the target after the target has moved into the surveillance region, and t_SR is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is less than that of OVSN regardless of the number of deployed sensors (n = 50–100); VSHP reduces the time to detect the target by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP; the simulation result shows a coverage improvement of about 5.5% with VSHP:

r_L = t_D / t_SR.  (17)

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a high target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN, owing to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by

Table 5: Parameter settings of the simulations.

Notation : Description : Value : Default
F : Size of the surveillance region : 800 m × 800 m : 800 m × 800 m
n : Number of the visual sensors : 50–100 (interval 10) : 100
α : Angle of view (AoV) of the visual sensors : 60°–120° (interval 15°) : 105°
r : Sensing radius of the visual sensors : 50 m–90 m (interval 10 m) : 70 m

Figure 14: Coverage and target-detected latency (fixed n = 100), grouped by sensing radius (50–90 m) and AoV (60°–120°), for OVSN-C, VSHP-C, OVSN-L, and VSHP-L.

Figure 15: Target-tracked ratio (fixed n = 100), grouped by sensing radius (50–90 m) and AoV (60°–120°), for OVSN and VSHP.

the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP yields a shorter average target distance than OVSN does, because the handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance decreases from 40 m to 35 m as the number of sensors increases from 50 to 100, which indicates that VSHP can select more appropriate sensors for target tracking from a larger pool of deployed sensors.

Figure 16: Average target distance (fixed n = 100), grouped by sensing radius (50–90 m) and AoV (60°–120°), for OVSN and VSHP.

The results above were obtained with various values of n but fixed values of α and r. Simulation results for the cases with fixed n but various α and r are given as well: Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those above: VSHP obtained higher coverage ratios and lower target-detected latency than OVSN, its target-tracked ratio is higher than that of OVSN, and its average target distance is shorter. In summary, VSHP provides a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells to propose a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach against four QoS criteria for target tracking. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors and to evaluate its effectiveness and performance experimentally. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based


monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and by the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley and Sons, New York, NY, USA, 2010.
[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.
[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.
[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.
[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.
[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.
[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F Viani P Rocca G Oliveri D Trinchero and A MassaldquoLocalization tracking and imaging of targets in wirelesssensor networks an invited reviewrdquo Radio Science vol 46 no5 Article ID RS5002 2011

[9] Y Charfi N Wakamiya and M Murata ldquoChallenging issues invisual sensor networksrdquo IEEEWireless Communications vol 16no 2 pp 44ndash49 2009

[10] A Okabe B Boots K Sugihara and S N Chiu SpatialTessellations Concepts and Applications of Voronoi DiagramsWiley Chichester UK 2nd edition 2000

[11] L Cheng C Wu Y Zhang H Wu M Li and C Maple ldquoAsurvey of localization in wireless sensor networkrdquo InternationalJournal of Distributed Sensor Networks vol 2012 Article ID962523 12 pages 2012

[12] N A Alrajeh M Bashir and B Shams ldquoLocalization tech-niques in wireless sensor networksrdquo International Journal ofDistributed Sensor Networks vol 2013 Article ID 304628 9pages 2013

[13] E Niewiadomska-Szynkiewicz ldquoLocalization in wireless sensornetworks classification and evaluation of techniquesrdquo Informa-tion International Journal of AppliedMathematics andComputerScience vol 22 no 2 pp 281ndash297 2012

[14] L Liu X Zhang and H Ma ldquoOptimal node selection fortarget localization in wireless camera sensor networksrdquo IEEETransactions on Vehicular Technology vol 59 no 7 pp 3562ndash3576 2010

[15] W Li and W Zhang ldquoSensor selection for improving sccuracyof target localisation in wireless visual sensor networksrdquo IETWireless Sensor Systems vol 2 no 4 pp 293ndash301 2012

[16] D Gao W Zhu X Xu and H C Chao ldquoA hybrid localizationand tracking system in camera sensor networksrdquo InternationalJournal of Communication Systems 2012

[17] L Liu B Hu and L Li ldquoAlgorithms for energy efficient mobileobject tracking inwireless sensor networksrdquoCluster Computingvol 13 no 2 pp 181ndash197 2010

[18] A M Khedr and W Osamy ldquoEffective target tracking mech-anism in a self-organizing wireless sensor networkrdquo Journal ofParallel andDistributedComputing vol 71 no 10 pp 1318ndash13262011

[19] W Chen and Y Fu ldquoCooperative distributed target trackingalgorithm in mobile wireless sensor networksrdquo Journal ofControl Theory and Applications vol 9 no 2 pp 155ndash164 2011

[20] P K Sahoo J P Sheu and K Y Hsieh ldquoTarget trackingand boundary node selection algorithms of wireless sensornetworks for internet servicesrdquo Information Sciences vol 230pp 21ndash38 2013

[21] Z Chu L Zhuo Y Zhao and X Li ldquoCooperative multi-object tracking method for wireless video sensor networksrdquoin Proceedings of the 3rd IEEE International Workshop onMultimedia Signal Processing (MMSP rsquo11) pp 1ndash5 HangzhouChina November 2011

[22] J Han E J Pauwels P M de Zeeuw and P H N deWith ldquoEmploying a RGB-D sensor for real-time tracking ofhumans across multiple Re-entries in a smart environmentrdquoIEEE Transactions on Consumer Electronics vol 58 no 2 pp255ndash263 2012

[23] W Limprasert A Wallace and G Michaelson ldquoReal-timepeople tracking in a camera networkrdquo IEEE Journal on Emergingand Selected Topics in Circuits and Systems vol 3 no 2 pp 263ndash271 2013

[24] YWang DWang andW Fang ldquoAutomatic node selection andtarget tracking in wireless camera sensor networksrdquo Computersand Electrical Engineering 2013

[25] W Li ldquoCamera sensor activation scheme for target trackingin wireless visual sensor networksrdquo International Journal ofDistributed Sensor Networks vol 2013 Article ID 397537 11pages 2013

[26] S Fortune ldquoA sweepline algorithm for Voronoi diagramsrdquoAlgorithmica vol 2 no 1ndash4 pp 153ndash174 1987

[27] T W Sung and C S Yang ldquoVoronoi-based coverage improve-ment approach for wireless directional sensor networksrdquo Jour-nal of Network and Computer Applications vol 39 pp 202ndash2132014

[28] A Elgammal Background Subtraction Theory and PracticeAugmented Vision and Reality Springer Berlin Germany 2013

[29] M Alvar A Sanchez and A Arranz ldquoFast background sub-traction using static and dynamic gatesrdquo Artificial IntelligenceReview vol 41 no 1 pp 113ndash128 2014

[30] O Barnich and M Van Droogenbroeck ldquoViBe a universalbackground subtraction algorithm for video sequencesrdquo IEEETransactions on Image Processing vol 20 no 6 pp 1709ndash17242011

14 International Journal of Distributed Sensor Networks

[31] N M Roscoe and M D Judd ldquoHarvesting energy frommagnetic fields to power condition monitoring sensorsrdquo IEEESensors Journal vol 13 no 6 pp 2263ndash2270 2013

[32] L Xie Y Shi Y T Hou and H D Sherali ldquoMaking sensornetworks immortal an energy-renewal approach with wirelesspower transferrdquo IEEEACMTransactions onNetworking vol 20no 6 pp 1478ndash1761 2012

[33] S Sudevalayam and P Kulkarni ldquoEnergy harvesting sensornodes survey and implicationsrdquo IEEE Communications Surveysand Tutorials vol 13 no 3 pp 443ndash461 2011

[34] C Bettstetter H Hartenstein and X Perez-Costa ldquoStochasticproperties of the random waypoint mobility modelrdquo WirelessNetworks vol 10 no 5 pp 555ndash567 2004


International Journal of Distributed Sensor Networks 7

Figure 5: Calculation of object (target) direction.

Figure 6: Calculation of object (target) distance.

The position of the object will then be calculated by the localization scheme described in Section 4.3 again. This is a routine procedure for local target tracking by the visual sensor until the tracking task is handed over.

To determine whether the visual sensor should hand over the task of tracking the target object, the sensor first utilizes the structure of the constructed local Voronoi cell and divides the cell into several subregions. As shown in Figure 8, the Voronoi cell associated with the sensor $s$ is divided into several triangular subregions according to the vertices of the cell, and the target object $t$ is located in the subregion $\triangle v_{R_i} s v_{L_i}$ with an included angle of $\angle v_{R_i} s v_{L_i} = \phi$.

A target object $t$ is detected by a sensor $s$ and belongs to one of the subregions if and only if all of the following three conditions are satisfied.

Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:

$$d_t \le r. \tag{9}$$

(2) The distance between the object and the sensor is less than or equal to the distance between the object and the neighbor sensor on the opposite side of the bisector line (edge) facing the included angle of the subregion:

$$\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2} \le \sqrt{(x_t - x_{s_i})^2 + (y_t - y_{s_i})^2}. \tag{10}$$

(3) The object is located inside the two edges of the included angle of the subregion:

$$(x_t - x_s)(x_{L_i} - x_s) + (y_t - y_s)(y_{L_i} - y_s) \ge \frac{\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2}}{\sqrt{(x_{R_i} - x_s)^2 + (y_{R_i} - y_s)^2}} \times \left( (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \right), \tag{11}$$

$$(x_t - x_s)(x_{R_i} - x_s) + (y_t - y_s)(y_{R_i} - y_s) \ge \frac{\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2}}{\sqrt{(x_{L_i} - x_s)^2 + (y_{L_i} - y_s)^2}} \cdot \left( (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \right). \tag{12}$$
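Read together, conditions (9)–(12) form a single membership predicate: the target must be within range, on this sensor's side of the bisector edge, and between the two rays from the sensor to the subregion vertices. A minimal sketch in Python (function and variable names are illustrative, not from the paper):

```python
import math

def target_in_subregion(s, t, s_i, vL, vR, r):
    """Check conditions (9)-(12): is target t inside the triangular
    subregion (s, vL, vR) of sensor s that faces neighbor s_i?
    All points are (x, y) tuples; r is the sensing radius."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    def dot(a, b, origin):
        # inner product of vectors origin->a and origin->b
        return ((a[0] - origin[0]) * (b[0] - origin[0])
                + (a[1] - origin[1]) * (b[1] - origin[1]))

    # (9): target within the sensing radius of s
    if dist(t, s) > r:
        return False
    # (10): target on s's side of the bisector edge facing the angle
    if dist(t, s) > dist(t, s_i):
        return False
    # (11)/(12): target between rays s->vL and s->vR, i.e.
    # angle(vL, s, t) <= phi and angle(vR, s, t) <= phi
    if dot(t, vL, s) < dist(t, s) / dist(vR, s) * dot(vL, vR, s):
        return False
    if dot(t, vR, s) < dist(t, s) / dist(vL, s) * dot(vL, vR, s):
        return False
    return True
```

The two angle tests are exactly (11) and (12) with the square roots gathered into a ratio of norms, avoiding any explicit trigonometry.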

Equation (11) states that the target object is not located outside the left edge $\overline{s v_{L_i}}$, and (12) states that the target object is not located outside the right edge $\overline{s v_{R_i}}$. The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra:

$$\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}} = \left\| \overrightarrow{s v_{L_i}} \right\| \left\| \overrightarrow{s v_{R_i}} \right\| \cos \phi, \tag{13}$$

$$\cos \phi = \frac{\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}}}{\left\| \overrightarrow{s v_{L_i}} \right\| \left\| \overrightarrow{s v_{R_i}} \right\|}. \tag{14}$$

The angle $\angle v_{L_i} s t$ must be less than or equal to $\phi$; therefore,

$$\overrightarrow{st} \cdot \overrightarrow{s v_{L_i}} \ge \left\| \overrightarrow{st} \right\| \left\| \overrightarrow{s v_{L_i}} \right\| \cos \phi,$$

$$\overrightarrow{st} \cdot \overrightarrow{s v_{L_i}} \ge \left\| \overrightarrow{st} \right\| \left\| \overrightarrow{s v_{L_i}} \right\| \cdot \frac{\overrightarrow{s v_{L_i}} \cdot \overrightarrow{s v_{R_i}}}{\left\| \overrightarrow{s v_{L_i}} \right\| \left\| \overrightarrow{s v_{R_i}} \right\|},$$

$$\langle x_t - x_s, y_t - y_s \rangle \cdot \langle x_{L_i} - x_s, y_{L_i} - y_s \rangle \ge \frac{\left\| \langle x_t - x_s, y_t - y_s \rangle \right\|}{\left\| \langle x_{R_i} - x_s, y_{R_i} - y_s \rangle \right\|} \cdot \langle x_{L_i} - x_s, y_{L_i} - y_s \rangle \cdot \langle x_{R_i} - x_s, y_{R_i} - y_s \rangle. \tag{15}$$

Then (11) is derived. Similarly, the angle $\angle v_{R_i} s t$ must be less than or equal to $\phi$; thus $\overrightarrow{st} \cdot \overrightarrow{s v_{R_i}} \ge \| \overrightarrow{st} \| \| \overrightarrow{s v_{R_i}} \| \cos \phi$, from which (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, there are four possible conditions while the sensor keeps tracking the moving target object:

(1) As shown in Figure 9(a), the target object can still be detected and is still located in the same triangular subregion; that is, (9), (10), (11), and (12) are still satisfied under the same pair of vertices $v_{L_i}$ and $v_{R_i}$.

Figure 9: Different target tracking conditions: (a) Condition 1, (b) Condition 2, (c) Condition 3, (d) Condition 4.

For this condition, the sensor will normally keep tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor will calculate and obtain a new pair of vertices $v_{L_j}$ and $v_{R_j}$ for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line $\overline{v_{L_i} v_{R_i}}$ between the two sensors $s$ and $s_i$; that is, (10) is no longer satisfied, and the object has moved from the original Voronoi cell into the Voronoi cell belonging to the sensor $s_i$. This means that the sensor $s_i$ will obtain a better sensing (image) quality than the sensor $s$ and is more suitable to track the target object. For this condition, the sensor $s$ will send a request message with the object coordinates to $s_i$ and hand over the tracking task to $s_i$. After receiving the request message, the sensor $s_i$ will adjust its sensing direction toward the target object and reply with a message to $s$ to confirm taking over the tracking task.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of the sensor $s$; that is, (9) is no longer satisfied. This means that the sensor has lost track of


Figure 10: Procedures of the proposed VSHP (random deployment; broadcast and receive sensor coordinates; construct local Voronoi cell; coverage enhancement; object detection; target localization; target movement detection; then, per Conditions 1–4: normal tracking, subregion change, handover request to the faced sensor, or handover requests to the selected sensors).

the target object. For this condition, the sensor will send a request message with the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message will adjust their sensing directions toward the received coordinates and wait (detect) for the possible reappearance of the target object. For example, although the target object has been lost, it may move around and appear near one of the vertices of the triangular subregion (e.g., $v_{R_i}$). Accordingly, the sensor $s$ will send a request message to both $s_i$ and $s_j$ to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for sending the request message, where $(x_{tf}, y_{tf})$ and $V_{s_k}$ denote the final detected coordinates of the target and the set of local Voronoi vertices of sensor $s_k$, respectively:

$$V' = \left\{ v \;\middle|\; v \in \{ v_{L_i}, v_{R_i} \},\ \sqrt{(x_v - x_s)^2 + (y_v - y_s)^2} > r \right\},$$

$$V'' = \begin{cases} \left\{ \underset{v \in V'}{\arg\min} \sqrt{(x_v - x_{tf})^2 + (y_v - y_{tf})^2} \right\} & \text{if } \dfrac{\sqrt{(x_{s_i} - x_s)^2 + (y_{s_i} - y_s)^2}}{2} > r, \\[2ex] V' & \text{otherwise}, \end{cases}$$

$$S_{\mathrm{req}} = \left\{ s_k \;\middle|\; V'' \cap V_{s_k} \neq \emptyset \right\} - \{ s \}. \tag{16}$$
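The selection rule in (16) can be sketched as follows (a sketch with illustrative names; `voronoi_vertices` is assumed to map each sensor to its set of local Voronoi vertices):

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def handover_request_targets(s, s_i, vL, vR, t_final, r, voronoi_vertices):
    """Select destination sensors for the Condition-4 request, per (16).
    s, s_i: tracker and facing neighbor; vL, vR: subregion vertices;
    t_final: final detected target coordinates; r: sensing radius;
    voronoi_vertices: dict mapping sensor -> set of its local Voronoi vertices."""
    # V': the subregion vertices lying beyond the sensing radius of s
    v_prime = {v for v in (vL, vR) if dist(v, s) > r}
    # V'': if even the midpoint of the s-s_i segment (on the bisector)
    # is out of range, keep only the vertex closest to the target's
    # last known position; otherwise keep all of V'
    if v_prime and dist(s_i, s) / 2 > r:
        v_dprime = {min(v_prime, key=lambda v: dist(v, t_final))}
    else:
        v_dprime = v_prime
    # S_req: every other sensor whose local cell shares one of those vertices
    return {k for k, verts in voronoi_vertices.items()
            if v_dprime & verts and k != s}
```

Each sensor in the returned set would then be sent the request message carrying $(x_{tf}, y_{tf})$.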

The probability that a target can be detected by a visual sensor is equal to the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of missing the target is equal to $e^{-\lambda \pi r^2}$, where $\lambda$ is the density of visual sensors and $r$ is the sensing radius. This probability equals $1 - R_{\mathrm{omni}}$, where $R_{\mathrm{omni}}$ is the omnidirectional (circular) coverage ratio of the sensors. This is because one visual sensor will be notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; the handover fails only if no other sensor can cover the target even after rotating its sensing direction.
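For a uniform random deployment, $e^{-\lambda \pi r^2}$ is the chance that no sensor at all lies within distance $r$ of the target, which a quick Monte Carlo check confirms (a sketch; the parameter values are illustrative, taken from the simulation defaults):

```python
import math, random

def miss_probability(n_sensors, field, r, trials=5000, seed=1):
    """Estimate the probability that no sensor lies within distance r
    of a random target point in a square field of side `field`.
    Targets are sampled away from the border to avoid edge effects."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        sensors = [(rng.uniform(0, field), rng.uniform(0, field))
                   for _ in range(n_sensors)]
        tx, ty = rng.uniform(r, field - r), rng.uniform(r, field - r)
        if all(math.hypot(sx - tx, sy - ty) > r for sx, sy in sensors):
            misses += 1
    return misses / trials

# Analytical value e^{-lambda * pi * r^2} with lambda = n / field^2
lam = 100 / 800**2
analytic = math.exp(-lam * math.pi * 70**2)
estimate = miss_probability(100, 800, 70)
```

With $n = 100$, a field of $800 \times 800$, and $r = 70$, both values come out near 0.09, consistent with the expression above.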

As described in Section 4.1, after the visual sensors are initially deployed, an algorithm for sensing coverage enhancement is performed. Whenever a sensor keeps tracking a moving target object, or several sensors are notified to adjust their sensing directions to wait for a possible appearance of a lost target, their sensing directions are changed; the overlapped coverage can thus increase, and the overall field coverage can be reduced. This would have a negative effect on further object detection and tracking. Therefore, in the proposed protocol, a sensor that has changed its sensing direction will return to its original direction if it has lost the target or has waited for the appearance of the target for longer than a time $t_{\mathrm{ret}}$.
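This direction-restore rule amounts to a per-sensor timer check; a minimal sketch (class and field names are illustrative, not from the paper):

```python
class DirectionManager:
    """Restore a sensor's original sensing direction after it has been
    rotated for tracking/waiting but the target stayed absent for t_ret."""

    def __init__(self, original_direction, t_ret):
        self.original = original_direction
        self.current = original_direction
        self.t_ret = t_ret
        self.rotated_since = None  # time of last rotation, if any

    def rotate(self, new_direction, now):
        # called when the sensor turns toward a target or a waiting point
        self.current = new_direction
        self.rotated_since = now

    def tick(self, target_visible, now):
        """Call periodically; returns True if the direction was restored."""
        if target_visible:
            self.rotated_since = now  # still busy: reset the timer
            return False
        if self.rotated_since is not None and now - self.rotated_since >= self.t_ret:
            self.current = self.original
            self.rotated_since = None
            return True
        return False
```

A sensor waiting for a lost target simply keeps calling `tick`; once $t_{\mathrm{ret}}$ elapses without a detection, it swings back and the enhanced field coverage is recovered.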

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to the issue of energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume the sensor's energy. This can cause energy exhaustion and sensor malfunction in wireless VSN applications. A recent research topic and current trend regarding the energy issue of sensor networks is energy harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee that a future sensor network can consist of sensor devices integrating energy harvesting technology. This paper aims at target localization and tracking in VSNs; therefore, the energy issue is not addressed here. However, the proposed algorithm can still perform well while some of the sensors are malfunctioning, because the Voronoi-based design enables the visual sensors to easily reconstruct the local Voronoi cells and can keep

Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m).

the VSN tracking operations performing well until most (or all) of the sensors have failed. The protocol thus has the characteristics of fault tolerance and graceful degradation.

On the other hand, the tracking algorithm described above focuses on the case of tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and can each be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain within the FoV of the sensor associated with that Voronoi cell or (2) each individual target is located within the sensing radius of some other visual sensor that can be notified to take it over.
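The fault-tolerance argument rests on each sensor being able to rebuild its local Voronoi cell from broadcast neighbor coordinates alone. A minimal half-plane-clipping sketch (illustrative names; the field is assumed square, and the paper's own construction may differ in detail):

```python
def local_voronoi_cell(s, neighbors, field):
    """Clip the bounding field polygon by the perpendicular-bisector
    half-plane of each neighbor; the result is sensor s's local cell."""
    # start from the rectangular field as a CCW polygon
    cell = [(0.0, 0.0), (field, 0.0), (field, field), (0.0, field)]
    for n in neighbors:
        # half-plane of points closer to s than to n: a*x + b*y <= c
        a, b = n[0] - s[0], n[1] - s[1]
        c = (n[0]**2 - s[0]**2 + n[1]**2 - s[1]**2) / 2.0
        clipped = []
        for i in range(len(cell)):
            p, q = cell[i], cell[(i + 1) % len(cell)]
            p_in = a * p[0] + b * p[1] <= c
            q_in = a * q[0] + b * q[1] <= c
            if p_in:
                clipped.append(p)
            if p_in != q_in:  # edge crosses the bisector: add intersection
                t = (c - a * p[0] - b * p[1]) / (a * (q[0] - p[0]) + b * (q[1] - p[1]))
                clipped.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
        cell = clipped
    return cell  # list of cell vertices (the v_i used by VSHP)
```

Because each sensor needs only its own neighbors' coordinates, a failed neighbor can simply be dropped from the list and the cell recomputed locally, which is what gives the protocol its graceful degradation.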

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long it takes a sensor to detect the target after the target has moved into the surveillance region. The target-tracked ratio indicates the fraction of time during which the target is tracked as it moves across the surveillance region. The average target distance reflects the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the surveillance region with a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared by the above-mentioned QoS criteria. Table 5 shows the parameter settings of the simulations.
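The target mobility follows the random waypoint model [34]: pick a uniform waypoint, travel to it at a uniform random speed, pause, and repeat. A minimal trajectory generator might look like this (a sketch; parameter names and values are illustrative, not the paper's simulator):

```python
import random

def random_waypoint(field, v_min, v_max, pause, steps, dt=1.0, seed=7):
    """Generate `steps` positions of a random-waypoint walk
    in a square field of side `field`, sampled every dt."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, field), rng.uniform(0, field)
    path = []
    while len(path) < steps:
        # choose the next waypoint and a travel speed
        wx, wy = rng.uniform(0, field), rng.uniform(0, field)
        speed = rng.uniform(v_min, v_max)
        dist = ((wx - x) ** 2 + (wy - y) ** 2) ** 0.5
        n = max(1, int(dist / (speed * dt)))
        # move toward the waypoint in n equal steps
        for i in range(1, n + 1):
            path.append((x + (wx - x) * i / n, y + (wy - y) * i / n))
        x, y = wx, wy
        path.extend([(x, y)] * int(pause / dt))  # pause at the waypoint
    return path[:steps]
```

Feeding such a trajectory through the detection and handover predicates above is enough to reproduce the kind of tracked/untracked time accounting used by the QoS criteria.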

Figure 11 shows the simulation results of both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under OVSN

Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m).

Figure 13: Average target distance (fixed α = 105°, r = 70 m).

and the proposed VSHP approach, respectively. The ratio of target-detected latency ($r_L$) is defined in (17), where $t_D$ is the time taken by a sensor to detect the target after the target has moved into the surveillance region, and $t_{\mathrm{SR}}$ is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is less than that of OVSN regardless of the number of deployed sensors ($n$ = 50–100); VSHP reduces the time to detect the target by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP; the simulation result shows a coverage improvement of about 5.5% with VSHP:

$$r_L = \frac{t_D}{t_{\mathrm{SR}}}. \tag{17}$$

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a higher target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN. This is due to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by


Table 5: Parameter settings of the simulations.

Notation | Description | Value | Default
F | Size of the surveillance region | 800 m × 800 m | —
n | Number of the visual sensors | 50–100 (interval 10) | 100
α | Angle of view (AoV) of the visual sensors | 60°–120° (interval 15°) | 105°
r | Sensing radius of the visual sensors | 50 m–90 m (interval 10 m) | 70 m

Figure 14: Coverage and target-detected latency (fixed n = 100; sensing radius 50–90 m, AoV 60°–120°).

Figure 15: Target-tracked ratio (fixed n = 100; sensing radius 50–90 m, AoV 60°–120°).

the sensor. For a visual (camera) sensor, a shorter target distance brings a clearer (higher-quality) image or video. Figure 13 shows the simulation result of the average target distance while the target is tracked. The proposed VSHP has a shorter average target distance than OVSN does. This is because the proposed handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors is increased from 50 to 100. This indicates that VSHP can select more appropriate sensors for target tracking from a larger number of deployed sensors.

Figure 16: Average target distance (fixed n = 100; sensing radius 50–90 m, AoV 60°–120°).

The results given above use various values of $n$ with fixed values of α and r. The simulation results for the cases with fixed $n$ but varying α and r are given as follows. Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtained higher coverage ratios and less target-detected latency in comparison with OVSN; the target-tracked ratio of VSHP is higher than that of OVSN; and VSHP obtained a shorter average target distance than OVSN did. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach against four QoS criteria for target tracking. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors and to experimentally evaluate its effectiveness and performance. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.

[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.

[32] L Xie Y Shi Y T Hou and H D Sherali ldquoMaking sensornetworks immortal an energy-renewal approach with wirelesspower transferrdquo IEEEACMTransactions onNetworking vol 20no 6 pp 1478ndash1761 2012

[33] S Sudevalayam and P Kulkarni ldquoEnergy harvesting sensornodes survey and implicationsrdquo IEEE Communications Surveysand Tutorials vol 13 no 3 pp 443ndash461 2011

[34] C Bettstetter H Hartenstein and X Perez-Costa ldquoStochasticproperties of the random waypoint mobility modelrdquo WirelessNetworks vol 10 no 5 pp 555ndash567 2004


Figure 7: Calculation of object (target) coordinates.

Figure 8: Subregions divided from a local Voronoi cell according to the vertices.

(1) The object distance is less than or equal to the sensing radius of the sensor:
\[
d_t \le r. \tag{9}
\]

(2) The distance between the object and the sensor is less than or equal to the distance between the object and the neighbor sensor on the opposite side of the bisector line (edge) facing the included angle of the subregion:
\[
\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2} \le \sqrt{(x_t - x_{s_i})^2 + (y_t - y_{s_i})^2}. \tag{10}
\]

(3) The object is located inside the two edges of the included angle of the subregion:
\[
(x_t - x_s)(x_{L_i} - x_s) + (y_t - y_s)(y_{L_i} - y_s)
\ge \frac{\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2}}{\sqrt{(x_{R_i} - x_s)^2 + (y_{R_i} - y_s)^2}}
\left[ (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \right], \tag{11}
\]
\[
(x_t - x_s)(x_{R_i} - x_s) + (y_t - y_s)(y_{R_i} - y_s)
\ge \frac{\sqrt{(x_t - x_s)^2 + (y_t - y_s)^2}}{\sqrt{(x_{L_i} - x_s)^2 + (y_{L_i} - y_s)^2}}
\left[ (x_{L_i} - x_s)(x_{R_i} - x_s) + (y_{L_i} - y_s)(y_{R_i} - y_s) \right]. \tag{12}
\]
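Conditions (9)-(12) together form a local point-in-subregion test that a sensor can evaluate with only its own coordinates, one neighbor, and the shared pair of Voronoi vertices. A minimal sketch in Python (hypothetical names; coordinates as (x, y) tuples; not taken from the paper's implementation):

```python
import math

def in_subregion(s, s_i, v_l, v_r, t, r):
    """Test whether target t satisfies conditions (9)-(12) for the
    triangular subregion of sensor s spanned by vertices v_l and v_r,
    whose far edge is the bisector shared with neighbor s_i."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def dot(a, b, o):
        # inner product of vectors o->a and o->b
        return (a[0] - o[0]) * (b[0] - o[0]) + (a[1] - o[1]) * (b[1] - o[1])

    if dist(t, s) > r:               # (9): inside the sensing radius
        return False
    if dist(t, s) > dist(t, s_i):    # (10): on s's side of the bisector
        return False
    lr = dot(v_l, v_r, s)            # inner product of the two edge vectors
    if dot(t, v_l, s) < dist(t, s) / dist(v_r, s) * lr:   # (11): left edge
        return False
    if dot(t, v_r, s) < dist(t, s) / dist(v_l, s) * lr:   # (12): right edge
        return False
    return True
```

For instance, with s = (0, 0), s_i = (10, 0), and vertices v_l = (5, 5), v_r = (5, -5) on the bisector, a target at (4, 0) satisfies all four conditions, while one at (0, 6) violates (12).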

Equation (11) represents that the target object is not located outside the left edge \(\overrightarrow{sv_{L_i}}\), and (12) represents that the target object is not located outside the right edge \(\overrightarrow{sv_{R_i}}\). The former is derived from the definition of the inner product of two Euclidean vectors in linear algebra:
\[
\overrightarrow{sv_{L_i}} \cdot \overrightarrow{sv_{R_i}} = \|\overrightarrow{sv_{L_i}}\| \, \|\overrightarrow{sv_{R_i}}\| \cos\phi, \tag{13}
\]
\[
\cos\phi = \frac{\overrightarrow{sv_{L_i}} \cdot \overrightarrow{sv_{R_i}}}{\|\overrightarrow{sv_{L_i}}\| \, \|\overrightarrow{sv_{R_i}}\|}. \tag{14}
\]

The angle \(\angle v_{L_i} s t\) must be less than or equal to \(\phi\); therefore
\[
\overrightarrow{st} \cdot \overrightarrow{sv_{L_i}} \ge \|\overrightarrow{st}\| \, \|\overrightarrow{sv_{L_i}}\| \cos\phi,
\]
\[
\overrightarrow{st} \cdot \overrightarrow{sv_{L_i}} \ge \|\overrightarrow{st}\| \, \|\overrightarrow{sv_{L_i}}\| \cdot \frac{\overrightarrow{sv_{L_i}} \cdot \overrightarrow{sv_{R_i}}}{\|\overrightarrow{sv_{L_i}}\| \, \|\overrightarrow{sv_{R_i}}\|},
\]
\[
\langle x_t - x_s, y_t - y_s \rangle \cdot \langle x_{L_i} - x_s, y_{L_i} - y_s \rangle
\ge \frac{\|\langle x_t - x_s, y_t - y_s \rangle\|}{\|\langle x_{R_i} - x_s, y_{R_i} - y_s \rangle\|}
\cdot \left( \langle x_{L_i} - x_s, y_{L_i} - y_s \rangle \cdot \langle x_{R_i} - x_s, y_{R_i} - y_s \rangle \right). \tag{15}
\]
Then (11) is derived. Similarly, the angle \(\angle v_{R_i} s t\) must be less than or equal to \(\phi\); thus \(\overrightarrow{st} \cdot \overrightarrow{sv_{R_i}} \ge \|\overrightarrow{st}\| \, \|\overrightarrow{sv_{R_i}}\| \cos\phi\), and (12) can be derived in the same way.

Once a sensor has detected an object and ascertained that the object is located in one of the subregions of its local Voronoi cell, four conditions can arise while the sensor keeps tracking the moving target object:

(1) As shown in Figure 9(a), the target object can still be detected and is still located in the same triangular subregion; that is to say, (9), (10), (11), and (12) are still satisfied under the same pair of vertices \(v_{L_i}\) and \(v_{R_i}\).

Figure 9: Different target tracking conditions: (a) condition 1; (b) condition 2; (c) condition 3; (d) condition 4.

For this condition, the sensor simply keeps tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor will calculate a new pair of vertices \(v_{L_j}\) and \(v_{R_j}\) for target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line \(\overline{v_{L_i} v_{R_i}}\) between the two sensors \(s\) and \(s_i\); that is to say, (10) is no longer satisfied, and the object has moved from the original Voronoi cell into the Voronoi cell belonging to sensor \(s_i\). This means that sensor \(s_i\) will obtain a better sensing (image) quality than sensor \(s\) and is more suitable to track the target object. For this condition, sensor \(s\) will send a request message with the object coordinates to \(s_i\) and hand over the tracking task to \(s_i\). After receiving the request message, sensor \(s_i\) will adjust its sensing direction toward the target object and reply with a message to \(s\) to confirm taking over the tracking task.

Figure 10: Procedures of the proposed VSHP.

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of sensor \(s\); that is to say, (9) is no longer satisfied. This means that the sensor has lost track of the target object. For this condition, the sensor will send a request message containing the final detected coordinates of the target object to certain of its adjacent sensors. The sensors that receive the request message will adjust their sensing directions toward the received coordinates and watch for the possible reappearance of the target object. For example, although track of the target object was lost, it may move around and reappear near one of the vertices of the triangular subregion (e.g., \(v_{R_i}\)). Accordingly, sensor \(s\) will send a request message to both \(s_i\) and \(s_j\) to notify them that the target object may move into their fields of view and that an adjustment of their sensing directions is needed. The following equations show the selection of destination sensors for the request message, where \((x_{tf}, y_{tf})\) and \(V_{s_k}\) denote the final detected coordinates of the target and the set of local Voronoi vertices of sensor \(s_k\), respectively:

\[
V' = \left\{ v \;\middle|\; v \in \{v_{L_i}, v_{R_i}\},\; \sqrt{(x_v - x_s)^2 + (y_v - y_s)^2} > r \right\},
\]
\[
v'' =
\begin{cases}
\displaystyle \arg\min_{v \in V'} \sqrt{(x_v - x_{tf})^2 + (y_v - y_{tf})^2}, & \text{if } \dfrac{\sqrt{(x_{s_i} - x_s)^2 + (y_{s_i} - y_s)^2}}{2} > r, \\[2mm]
V', & \text{otherwise},
\end{cases}
\]
\[
S_{\mathrm{req}} = \{ s_k \mid v'' \in V_{s_k} \} - \{ s \}. \tag{16}
\]
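Read operationally, (16) first keeps the subregion vertices beyond the sensing radius of \(s\), then narrows to the vertex nearest the last known target position when even the bisector midpoint is out of reach, and finally picks every other sensor whose local Voronoi vertex set contains a chosen vertex. A sketch under those assumptions (hypothetical names; `vertex_sets` maps each sensor position to its local Voronoi vertices):

```python
import math

def select_request_sensors(s, s_i, v_l, v_r, t_f, r, vertex_sets):
    """Sketch of eq. (16): choose the destination sensors S_req for the
    handover request after the target was last seen at t_f."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # V': subregion vertices lying beyond the sensing radius of s
    v_prime = {v for v in (v_l, v_r) if dist(v, s) > r}
    # v'': the single vertex nearest the final target position when the
    # bisector midpoint is farther than r from s, otherwise all of V'
    if v_prime and dist(s_i, s) / 2 > r:
        v_dprime = {min(v_prime, key=lambda v: dist(v, t_f))}
    else:
        v_dprime = v_prime
    # S_req: every other sensor whose local Voronoi cell owns a chosen vertex
    return {k for k, vs in vertex_sets.items()
            if k != s and v_dprime & set(vs)}
```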

The probability that a target can be detected by a visual sensor is equal to the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of missing the target is \(e^{-\lambda \pi r^2}\), where \(\lambda\) is the density of visual sensors and \(r\) is the sensing radius. This probability equals \(1 - R_{\mathrm{omni}}\), where \(R_{\mathrm{omni}}\) is the omnidirectional (circular) coverage ratio of the sensors. This is because one visual sensor will be notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; the handover fails only if no other sensor can cover the target even after rotating its sensing direction.
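As a sanity check on that claim, assuming sensors form a homogeneous spatial Poisson process with density \(\lambda\), the missing probability and \(R_{\mathrm{omni}}\) can be computed directly (a sketch, not from the paper):

```python
import math

def miss_probability(lam, r):
    """Probability that no sensor lies within distance r of the target
    under a homogeneous Poisson deployment of density lam (per m^2)."""
    return math.exp(-lam * math.pi * r ** 2)

# Table 5 defaults: 100 sensors in an 800 m x 800 m field, r = 70 m
lam = 100 / (800.0 * 800.0)
p_miss = miss_probability(lam, 70.0)   # about 0.09
r_omni = 1.0 - p_miss                  # omnidirectional coverage ratio, about 0.91
```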

As described in Section 4.1, after the visual sensors are initially deployed, a sensing coverage enhancement algorithm is performed. Whenever a sensor keeps tracking a moving target, or several sensors are notified to adjust their sensing directions to await the possible reappearance of a lost target, their sensing directions are changed; the overlapped coverage can thereby increase and the overall field coverage can be reduced. This would negatively affect further object detection and tracking. Therefore, in the proposed protocol, a sensor that has changed its sensing direction returns to its original direction if it has lost the target or has waited for the target's reappearance longer than a time \(t_{\mathrm{ret}}\).

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.
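Per tracking step, the four conditions reduce to a dispatch over the outcomes of tests (9), (10), and the subregion-membership check; a simplified sketch (hypothetical function name, omitting the protocol's messaging):

```python
def tracking_condition(in_range, on_s_side, same_subregion):
    """Map the outcomes of tests (9)-(12) to the four VSHP conditions:
    1 normal tracking, 2 subregion change, 3 handover to the neighbor,
    4 target lost (notify selected adjacent sensors)."""
    if not in_range:          # (9) violated: target left the sensing radius
        return 4
    if not on_s_side:         # (10) violated: target crossed the bisector
        return 3
    if same_subregion:        # (11)-(12) hold for the same vertex pair
        return 1
    return 2                  # still in the cell, but in a new subregion
```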

With regard to the issue of energy consumption, sensor operations such as computation, communication, direction rotation, and photographing consume sensor energy. This can cause energy exhaustion and malfunction of sensors in wireless VSN applications. A recent research topic and current trend concerning the energy issue in sensor networks is energy harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee that future sensor networks will consist of sensor devices integrating energy harvesting technology. This paper focuses on target localization and tracking in VSNs; therefore, the energy issue is not considered further. However, the proposed algorithm can still perform well while some of the sensors are malfunctioning, because the Voronoi-based design enables visual sensors to easily reconstruct their local Voronoi cells and keep the VSN tracking operations performing well until most (or all) of the sensors fail. The protocol thus has the characteristics of fault tolerance and graceful degradation.

Figure 11: Coverage and target-detected latency (fixed \(\alpha = 105^\circ\), \(r = 70\) m).

On the other hand, the tracking algorithm described above focuses on the case of tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and can each be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain in the FoV of the sensor associated with that Voronoi cell or (2) each of the targets is located within the sensing radius of another visual sensor that can be notified to take over that target.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The first, overall coverage, affects the detection of target objects. The second, target-detected latency, indicates how long it takes a sensor to detect the target after the target has moved into the surveillance region. The third, target-tracked ratio, indicates the fraction of time during which the target is tracked as it moves across the surveillance region. The last, average target distance, reflects the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the region following a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared using the above QoS criteria. Table 5 shows the parameter settings of the simulations.

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under the OVSN

Figure 12: Target-tracked ratio (fixed \(\alpha = 105^\circ\), \(r = 70\) m).

Figure 13: Average target distance (fixed \(\alpha = 105^\circ\), \(r = 70\) m).

and the proposed VSHP approach, respectively. The ratio of target-detected latency \(r_L\) is defined in (17), where \(t_D\) is the time taken by a sensor to detect the target after the target has moved into the surveillance region and \(t_{SR}\) is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is lower than that of OVSN regardless of the number of deployed sensors (\(n = 50\)–\(100\)); VSHP reduces the detection latency by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP: the simulation result shows a coverage improvement of about 5.5% with VSHP.
\[
r_L = \frac{t_D}{t_{SR}}. \tag{17}
\]

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations, and keeping a high target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN, owing to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by the sensor.

Table 5: Parameter settings of the simulations.

Notation | Description | Value | Default
F | Size of the surveillance region | 800 m × 800 m | —
n | Number of the visual sensors | 50–100 (interval 10) | 100
α | Angle of view (AoV) of the visual sensors | 60°–120° (interval 15°) | 105°
r | Sensing radius of the visual sensors | 50 m–90 m (interval 10 m) | 70 m

Figure 14: Coverage and target-detected latency (fixed \(n = 100\)).

Figure 15: Target-tracked ratio (fixed \(n = 100\)).

For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP achieves a shorter average target distance than OVSN does, because the handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors increases from 50 to 100, indicating that VSHP can select more appropriate sensors from a larger pool of deployed sensors.

Figure 16: Average target distance (fixed \(n = 100\)).

The results given above use various values of \(n\) but fixed values of \(\alpha\) and \(r\). The simulation results for the cases with fixed \(n\) but various \(\alpha\) and \(r\) are given as follows. Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtained higher coverage ratios and lower target-detected latency in comparison with OVSN; the target-tracked ratio of VSHP is higher than that of OVSN; and VSHP obtained a shorter average target distance than OVSN did. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach against four QoS criteria for target tracking. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors, together with an experimental evaluation of its effectiveness and performance. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with cloud-based monitoring services will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.

[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.


Page 9: Research Article A Voronoi-Based Sensor Handover Protocol for Target Tracking …downloads.hindawi.com/journals/ijdsn/2014/586210.pdf · 2015. 11. 22. · for target tracking in [

International Journal of Distributed Sensor Networks 9

[Figure 9: Different target tracking conditions: (a) Condition 1, (b) Condition 2, (c) Condition 3, (d) Condition 4. Each panel shows the tracking sensor s at (x_s, y_s) with angle of view φ and sensing radius r, the target t, a neighboring sensor s_i (or s_j), and the subregion vertices v_L, v_R.]

For this condition, the sensor will normally keep tracking the target object.

(2) As shown in Figure 9(b), the target object can still be detected, but it has just moved from the original subregion to another subregion. For this condition, the sensor will calculate and obtain a new pair of vertices v_{L_j} and v_{R_j} for the target tracking in the new subregion. Equations (9), (10), (11), and (12) are still satisfied for this condition.

(3) As shown in Figure 9(c), the target object can still be detected, but it has just moved beyond the bisector line v_{L_i} v_{R_i} between the two sensors s and s_i; that is to say, (10) is no longer satisfied, and the object has moved from the original Voronoi cell to the Voronoi cell belonging to the sensor s_i. This means that the sensor s_i will obtain a better sensing (image) quality than the sensor s, so it is more suitable to track the target object. For this condition, the sensor s will send a request message with the object coordinates to s_i and hand over the tracking task to s_i. After receiving the request message, the sensor s_i will adjust its sensing direction toward the target object and reply with a message to s to confirm taking over the tracking task.

[Figure 10: Procedures of the proposed VSHP — random deployment; broadcast sensor coordinates; receive neighbor coordinates; construct local Voronoi cell; coverage enhancement; object detection and target localization; target movement detection; then, depending on the condition: normal tracking (Condition 1), change of sensing direction (Condition 2), handover request to the faced sensor (Condition 3), or handover request to the selected sensors (Condition 4).]

(4) As shown in Figure 9(d), the target object is still in the triangular subregion, but it has moved out of the sensing radius of the sensor s; that is to say, (9) is no longer satisfied. This means that the sensor has lost track of the target object. For this condition, the sensor will send a request message with the final detected coordinates of the target object to certain of its adjacent sensors. Those sensors that receive the request message will adjust their sensing directions toward the received coordinates and wait (detect) for the possible reappearance of the target object. For example, although track of the target object was lost, the target may move around and appear near one of the vertices of the triangular subregion (e.g., v_{R_i}). Accordingly, the sensor s will send a request message to both s_i and s_j to notify them that the target object may move into their fields of view and that an adjustment of sensing direction is needed. The following equations show the selection of destination sensors for sending the request message, where (x_{t_f}, y_{t_f}) and V_{s_k} are the final detected coordinates of the target and the set of local Voronoi vertices of sensor s_k, respectively:

\[
V' = \left\{ v \;\middle|\; v \in \{v_{L_i}, v_{R_i}\},\; \sqrt{(x_v - x_s)^2 + (y_v - y_s)^2} > r \right\},
\]
\[
V'' =
\begin{cases}
\displaystyle \operatorname*{arg\,min}_{v \in V'} \sqrt{(x_v - x_{t_f})^2 + (y_v - y_{t_f})^2}, & \text{if } \frac{1}{2}\sqrt{(x_{s_i} - x_s)^2 + (y_{s_i} - y_s)^2} > r, \\[4pt]
V', & \text{otherwise},
\end{cases}
\]
\[
S_{\text{req}} = \left\{ s_k \mid V'' \in V_{s_k} \right\} - \{s\}. \tag{16}
\]
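To make the per-update logic concrete, the sketch below combines the four tracking conditions with the destination-sensor selection of (16). It is an illustrative reconstruction, not the paper's specification: the function names and tuple-based geometry are assumptions, the subregion-membership test is simplified to a caller-supplied flag, and only one faced neighbor is considered.

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_condition(target, s, s_i, r, in_same_subregion):
    """Classify the tracking condition for sensor s (cf. Figure 9).

    target, s, s_i: (x, y) coordinates of the target, the tracking
    sensor, and the faced neighbor sharing the bisector v_Li v_Ri.
    r: sensing radius; in_same_subregion: whether the target is still
    in the triangular subregion where it was last seen.
    """
    if dist(target, s) > r:
        return 4                  # (9) violated: target lost, notify neighbors
    if dist(target, s_i) < dist(target, s):
        return 3                  # (10) violated: crossed bisector, hand over to s_i
    if not in_same_subregion:
        return 2                  # new subregion: recompute v_Lj, v_Rj
    return 1                      # keep tracking normally

def request_destinations(t_f, s, s_i, r, v_Li, v_Ri, cells):
    """Select destination sensors for Condition 4 per (16).

    t_f: final detected target coordinates; v_Li, v_Ri: the local
    Voronoi vertices of the current subregion; cells: mapping from
    sensor id to (position, set of local Voronoi vertices).
    """
    # V': subregion vertices lying beyond the sensing radius of s
    V_prime = [v for v in (v_Li, v_Ri) if dist(v, s) > r]
    # V'': the single vertex of V' nearest to the last target position,
    # used when half the distance between s and s_i exceeds r
    if dist(s_i, s) / 2 > r and V_prime:
        candidates = [min(V_prime, key=lambda v: dist(v, t_f))]
    else:
        candidates = V_prime
    # S_req: every other sensor whose local Voronoi cell owns a candidate vertex
    return {sid for sid, (pos, verts) in cells.items()
            if pos != s and any(v in verts for v in candidates)}
```

In a deployment, Condition 3 would trigger the request/confirm exchange with s_i, while Condition 4 would send the request message to every sensor in the returned set.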

The probability that a target can be detected by a visual sensor is equal to the overall sensing coverage ratio of the VSN. In addition, once a target has been detected and is under tracking, the probability of missing the target is equal to e^{-λπr²}, where λ is the density of visual sensors and r is the sensing radius. This probability equals 1 − R_omni, where R_omni is the omnidirectional (circular) coverage ratio of the sensors. The reason is that one visual sensor will be notified to rotate its sensing direction and take over the target when the target is about to leave its current tracker; the handover fails only if no other sensor can cover the target even after rotating its sensing direction.
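As a quick numeric illustration of this estimate (a sketch only — the paper does not report these values), plugging the Table 5 defaults into the e^{-λπr²} formula gives the probability that no sensor lies within distance r of the tracked target:

```python
import math

def miss_probability(n, field_area, r):
    """Miss-probability estimate e^(-lam * pi * r^2) for a random
    deployment of n sensors on field_area with sensing radius r;
    equivalently 1 - R_omni, the uncovered fraction under
    omnidirectional (circular) coverage."""
    lam = n / field_area          # sensor density (sensors per m^2)
    return math.exp(-lam * math.pi * r ** 2)

# Table 5 defaults: 800 m x 800 m field, n = 100 sensors, r = 70 m
p_miss = miss_probability(100, 800 * 800, 70)   # roughly 0.09
r_omni = 1 - p_miss                             # roughly 0.91
```

So with the default parameters, roughly 9% of handover attempts would find no sensor within radius r of the target, matching the intuition that denser deployments and larger radii make handover failures rare.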

As described in Section 4.1, after the visual sensors are deployed initially, an algorithm for sensing coverage enhancement is performed. Whenever a sensor keeps tracking a moving target object, or several sensors are notified to adjust their sensing directions to await the possible reappearance of a lost target, their sensing directions are changed; the overlapped coverage can thereby increase, and the overall field coverage can be reduced. This would have a negative effect on further object detection and tracking. Therefore, in the proposed protocol, a sensor that has changed its sensing direction will return to its original direction if it has lost the target or has waited for the target's appearance for longer than a period t_ret.

4.5. Procedures of VSHP. Figure 10 summarizes the procedures of the proposed Voronoi-based sensor handover protocol for moving target tracking.

With regard to the issue of energy consumption, the sensor operations of computation, communication, direction rotation, and photographing all consume sensor energy, which can cause energy exhaustion and malfunction in wireless VSN applications. A recent research topic and current trend concerning the energy issue of sensor networks is energy-harvesting sensors [31–33]. Energy harvesting is one of the promising solutions to the problem of limited energy capacity in wireless sensor networks, and it is easy to foresee that a future sensor network can consist of sensor devices integrating energy-harvesting technology. This paper aims at target localization and tracking in VSNs; therefore, the energy issue is not considered here. However, the proposed algorithm can still perform well while some of the sensors are malfunctioning, because the Voronoi-based design enables the visual sensors to easily reconstruct their local Voronoi cells and keeps the VSN tracking operations performing well until most (or all) of the sensors have failed. The protocol thus has the characteristics of fault tolerance and graceful degradation.

[Figure 11: Coverage and target-detected latency (fixed α = 105°, r = 70 m); curves OVSN-C, VSHP-C (coverage) and OVSN-L, VSHP-L (latency) versus the number of visual sensors (50–100).]

On the other hand, the tracking algorithm described above focuses on the case of tracking a single target with one visual sensor. It is applicable to the case in which multiple targets occur in the surveillance region and each can be tracked by one sensor. Once multiple targets occur in one Voronoi cell at the same time, they can all be tracked only if (1) all of them remain within the FoV of the sensor associated with the Voronoi cell, or (2) each individual target is located within the sensing radius of some other visual sensor that can be notified to take it over.

5. Simulation Results

This study evaluates the proposed Voronoi-based approach with simulations. Four QoS criteria are used for the evaluation: (1) the overall sensing field coverage, (2) the target-detected latency, (3) the target-tracked ratio, and (4) the average target distance. The overall coverage affects the detection of target objects. The target-detected latency indicates how long it takes a sensor to detect the target after the target has moved into the surveillance region. The target-tracked ratio indicates the total time during which the target is tracked in its movement across the surveillance region. The average target distance represents the image quality while the target is tracked. In the simulations, a large-scale visual sensor network is deployed randomly in the surveillance region, and the target moves across the surveillance region following a random waypoint mobility model [34]. The target tracking services with and without the proposed protocol are evaluated and compared by the above-mentioned QoS criteria. Table 5 shows the parameter settings of the simulations.
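The target's motion follows the random waypoint model [34]. A minimal sketch of such a trajectory generator is given below; the speed bounds, leg count, and time step are illustrative assumptions (this section does not specify them), and pause times are omitted for brevity.

```python
import math
import random

def random_waypoint(field_w, field_h, speed_range, n_legs, dt=1.0, seed=42):
    """Generate a random-waypoint trace: repeatedly pick a uniform
    random waypoint in the field and a uniform random speed, then move
    toward the waypoint in straight-line steps of dt seconds."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, field_w), rng.uniform(0, field_h)
    trace = [(x, y)]
    for _ in range(n_legs):
        wx, wy = rng.uniform(0, field_w), rng.uniform(0, field_h)  # next waypoint
        v = rng.uniform(*speed_range)                              # speed for this leg
        while True:
            d = math.hypot(wx - x, wy - y)
            if d <= v * dt:                # waypoint reached within one step
                x, y = wx, wy
                trace.append((x, y))
                break
            x += (wx - x) / d * v * dt     # advance v*dt along the leg
            y += (wy - y) / d * v * dt
            trace.append((x, y))
    return trace

# Illustrative use with the Table 5 field size (800 m x 800 m):
trace = random_waypoint(800, 800, speed_range=(1.0, 5.0), n_legs=3)
```

Each position in the trace stays inside the field, and consecutive positions are at most `v * dt` apart, which is what the per-step target-movement detection in the protocol relies on.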

Figure 11 shows the simulation results for both coverage and target-detected latency. OVSN-C and VSHP-C represent the overall field coverage ratios after deployment of an ordinary visual sensor network (OVSN) and after enhancement with the proposed VSHP, respectively. OVSN-L and VSHP-L represent the ratios of target-detected latency under the OVSN and the proposed VSHP approach, respectively. The ratio of target-detected latency (r_L) is defined in (17), where t_D represents how long it takes a sensor to detect the target after the target has moved into the surveillance region and t_SR is the total time the target travels in the surveillance region. In Figure 11, the simulation result shows that the target-detected latency of VSHP is less than that of OVSN regardless of the number of deployed sensors (n = 50–100); VSHP reduces the time to detect the target by about 4% of the target's total travel time. This is due to the coverage enhancement in VSHP: the simulation result shows a coverage improvement of about 5.5% with VSHP.

\[
r_L = \frac{t_D}{t_{SR}}. \tag{17}
\]

[Figure 12: Target-tracked ratio (fixed α = 105°, r = 70 m) versus the number of visual sensors (50–100), for OVSN and VSHP.]

[Figure 13: Average target distance (fixed α = 105°, r = 70 m) versus the number of visual sensors (50–100), for OVSN and VSHP.]

The total travel time of a target in the surveillance region consists of target-tracked and target-untracked durations. Keeping a higher target-tracked time ratio is an important criterion for target tracking services. Figure 12 shows that VSHP provides a higher target-tracked ratio than OVSN. This is due to the efficacy of the Voronoi-based sensor handover mechanism.

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer (higher-quality) image or video. Figure 13 shows the simulation result for the average distance at which the target is tracked. The proposed VSHP has a shorter average target distance than OVSN does. This is due to the fact that the proposed handover protocol utilizes the characteristics of the Voronoi cell to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors increases from 50 to 100. This indicates that VSHP can select more appropriate sensors for target tracking from a larger number of deployed sensors.

Table 5: Parameter settings of the simulations.

Notation   Description                                  Value                         Default
F          Size of the surveillance region              800 m × 800 m                 —
n          Number of the visual sensors                 50–100 (interval 10)          100
α          Angle of view (AoV) of the visual sensors    60°–120° (interval 15°)       105°
r          Sensing radius of the visual sensors         50 m–90 m (interval 10 m)     70 m

[Figure 14: Coverage and target-detected latency (fixed n = 100); OVSN-C, VSHP-C, OVSN-L, and VSHP-L grouped by sensing radius (50–90 m) and AoV (60°–120°).]

[Figure 15: Target-tracked ratio (fixed n = 100); OVSN and VSHP grouped by sensing radius (50–90 m) and AoV (60°–120°).]

[Figure 16: Average target distance (fixed n = 100); OVSN and VSHP grouped by sensing radius (50–90 m) and AoV (60°–120°).]

The results given above use various values of n with fixed values of α and r. Simulation results for the cases with fixed n but various α and r are given as follows. Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtained higher coverage ratios and lower target-detected latency in comparison with OVSN; the target-tracked ratio of VSHP is higher than that of OVSN; and VSHP obtained a shorter average target distance than OVSN did. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach, and four QoS criteria for target tracking were evaluated. The results show that the approach performs well and improves on target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors, together with an experimental evaluation of the effectiveness and performance of that system. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and by the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.
[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.
[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.
[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.
[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.
[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.
[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.
[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.
[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.
[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.
[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.
[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.
[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.
[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.
[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.
[16] D. Gao, W. Zhu, X. Xu, and H. C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.
[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.
[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.
[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.
[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.
[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.
[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.
[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.
[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.
[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.
[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.
[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.
[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.
[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.
[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.
[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.
[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.
[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.
[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.


10 International Journal of Distributed Sensor Networks

Handover request

Randomdeployment

Broadcast sensor coordinates

Receive neighbor coordinates

Construct localVoronoi cell

Coverage enhancement

Object detectionTargetlocalization

Target movementdetection

Normal tracking

Change sensing direction

Handover

Subregion changeSend handover request to the faced sensor

No

Yes

Send handover request to theselected sensors

Condition 1 Condition 2 Condition 3 Condition 4

received

condition

Figure 10 Procedures of the proposed VSHP

the target object For this condition the sensorwill send a request message with the final detectedcoordinates of the target object to certain of its adja-cent sensors Those sensors that received the requestmessage will adjust their sensing directions to thereceived coordinates andwait (detect) for the possibleappearance of the target object For example thoughthe target object was lost track it may move aroundand appear near one of the vertices of the triangularsubregion (eg V

119877119894) Accordingly the sensor s will

send a request message to both 119904119894and 119904

119895to notify

them that the target object may move to and appearin their fields of view and an adjustment of sensingdirection is needed The following equations showthe selection of destination sensors for sending therequest message Notations (119909

119905119891 119910119905119891) and 119881

119904119896are the

final detected coordinates of the target and the set oflocal Voronoi vertices of sensor 119904

119896 respectively

V1015840 = V | V isin V119871 119894 V119877119894

radic(119909V minus 119909119904)2

+ (119910V minus 119910119904)2

gt 119903

V10158401015840 =

argminVisinV1015840

radic(119909V minus 119909119905119891)2

+ (119910V minus 119910119905119891)2

radic(119909119904119894minus 119909119904)2

+ (119910119904119894minus 119910119904)2

2gt 119903

V1015840otherwise

119878req = 119904119896 | V10158401015840

isin 119881119904119896 minus 119904

(16)

The probability of that a target can be detected by a visualsensor is equal to the overall sensing coverage ratio of theVSN In addition once a target has been detected and undertracking the probability of target missing is equal to 119890minus120582120587119903

2

where 120582 is the density of visual sensors and 119903 is the sensingradius This value of probability is 1 minus 119877omni where 119877omni is

the omnidirectional coverage (in the shape of a circle) ratio ofthe sensors It is because one visual sensor will be notified ofrotating the sensing direction and taking over the target whenthe target is going to leave apart from its current trackerThiswill be failed if no other sensor can cover the target even ifthe sensing direction is rotated

As described in Section 41 after visual sensors aredeployed initially an algorithm of sensing coverage enhance-ment is performed Whenever a sensor keeps tracking amoving target object or several sensors are notified of adjust-ing their sensing directions to wait a possible appearanceof the lost target their sensing directions will be changedand the overlapped coverage can be increased thus overallfield coverage can be reduced This will cause a negativeeffect on further object detection and tracking Therefore inthe proposed protocol those sensors that have changed thesensing direction will return to their original directions if thesensors have lost the target or waited the appearance of targetfor a long time 119905ret

45 Procedures of VSHP Figure 10 summarizes the pro-cedures of the proposed Voronoi-based sensor handoverprotocol for moving target tracking

With regard to the issue of energy consumption the sen-sor operations about computation communication directionrotation and photographing will consume energy of thesensor This could cause sensors energy exhaustion andmalfunction in the applications of wireless VSNs A recentresearch topic and current trend about energy issue of sensornetworks are the energy harvesting sensors [31ndash33] Energyharvesting is one of the promising solutions to the problem oflimited energy capacity in wireless sensor networks It is easyto foresee that a future sensor network can consist of sensordevices with the integration of the technology of energy har-vestingThis paper aims at the target localization and trackingin VSNs therefore energy issue is not concerned Howeverour proposed algorithm can still perform well while some ofthe sensors are malfunctioningThis is due to the fact that theproposed Voronoi-based algorithm makes visual sensors beable to easily reconstruct the local Voronoi cells and can keep

International Journal of Distributed Sensor Networks 11

0

10

20

30

40

50

60

50 60 70 80 90 100Number of visual sensors

OVSN-CVSHP-C

OVSN-LVSHP-L

Figure 11 Coverage and target-detected latency (fixed 120572 = 105∘119903 = 70m)

the VSN tracking operations well performing until most (orall) of the sensors are failed It has the characteristic of faulttolerance and graceful degradation

On the other hand the tracking algorithm describedabove focuses on the case of tracking single target withone visual sensor It is applicable to the case of whichmultiple targets occur in the surveillance region and can berespectively tracked by one sensor Once the multiple targetsoccur in one Voronoi cell at the same time they only can beall tracked if (1) all of them keep locating in the FoV of theassociated sensor of the Voronoi cell or (2) each individual ofthe targets is located within the sensing radius of any othervisual sensor that can be notified of taking over the target

5 Simulation Results

This study evaluates the proposed Voronoi-based approachwith simulations There are four QoS criteria to be usedfor the evaluation (1) the overall sensing field coverage (2)the target-detected latency (3) the target-tracked ratio and(4) the average target distance The first overall coveragewill affect the detection of target objects The second target-detected latency indicates how long the time is taken by asensor detecting the target since the target moved into thesurveillance region The third target-tracked ratio indicatesthe total time of durations of which the target is tracked inits movement across the surveillance regionThe last averagetarget distance represents the image quality while the targetis tracked In the simulations a large scale visual sensornetwork is deployed randomly in the surveillance region andthe targetmoves across the surveillance regionwith a randomwaypoint mobility model [34] The target tracking serviceswith and without the proposed protocol are evaluated andcompared by the above-mentioned QoS criteria Table 5shows the parameters setting of the simulations

Figure 11 shows the simulation results of both coverageand target-detected latency OVSN-C and VSHP-C representthe overall field coverage ratios after deployment of ordi-nary visual sensor network (OVSN) and after enhancementwith proposed VSHP respectively OVSN-L and VSHP-Lrepresent the ratios of target-detected latency under OVSN

0102030405060708090

100

50 60 70 80 90 100Number of visual sensors

OVSNVSHP

Figure 12 Target-tracked ratio (fixed 120572 = 105∘ 119903 = 70m)

0

10

20

30

40

50

60

70

50 60 70 80 90 100Number of visual sensors

OVSNVSHP

Figure 13 Average target distance (fixed 120572 = 105∘ 119903 = 70m)

and proposed VSHP approach respectively The ratio oftarget-detected latency (119903

119871) is defined as (17) where 119905

119863

represents how long the time is taken by a sensor detecting thetarget since the targetmoved into the surveillance region and119905SR is the total time of the sensor traveling in the surveillanceregion In Figure 11 the simulation result shows that thetarget-detected latency of VSHP is less than the one of OVSNno matter how many the number of deployed sensors is (n= 50sim100) VSHP reduces about 4 of the total travel timeof the target to detect the target This is due to the coverageenhancement inVSHPThe simulation result shows that thereis a coverage improvement of about 55 with VSHP

119903119871=119905119863

119905SR (17)

The total travel time of a target in the surveillance regionconsists of target-tracked and target-untracked durationsKeeping a higher target-tracked time ratio is an importantcriterion for target tracking services Figure 12 shows thatVSHPprovides a higher target-tracked ratio thanOVSNThisis due to the efficacy of the Voronoi-based sensor handovermechanism

The other criterion for measuring the QoS of target tracking in a VSN is the average distance between the sensor and the target while the target is tracked (monitored) by

12 International Journal of Distributed Sensor Networks

Table 5: Parameter settings of the simulations.

Notation | Description                                | Value                     | Default
F        | Size of the surveillance region            | 800 m × 800 m             | —
n        | Number of the visual sensors               | 50–100 (interval 10)      | 100
α        | Angle of view (AoV) of the visual sensors  | 60°–120° (interval 15°)   | 105°
r        | Sensing radius of the visual sensors       | 50 m–90 m (interval 10 m) | 70 m
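As a rough cross-check of the coverage criterion for the defaults in Table 5, the expected coverage of a random directional deployment can be estimated with the standard independent-placement approximation (an illustrative calculation, not taken from the paper): each sensor covers a sector of area a = (α/360)·πr², and a point of the field is covered with probability 1 − (1 − a/|F|)^n.

```python
import math

def expected_coverage(n=100, aov_deg=105.0, r=70.0, field_area=800.0 * 800.0):
    """Approximate coverage ratio of n randomly placed directional sensors,
    ignoring boundary effects (each sector assumed to lie wholly in the field)
    and treating sensor placements as independent."""
    sector = (aov_deg / 360.0) * math.pi * r * r   # area of one sensing sector
    return 1.0 - (1.0 - sector / field_area) ** n

cov = expected_coverage()   # roughly 0.5 for the default setting
```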

Figure 14: Coverage and target-detected latency (fixed n = 100); bars for OVSN-C, VSHP-C, OVSN-L, and VSHP-L at each sensing radius (50–90 m) and AoV (60°–120°).

Figure 15: Target-tracked ratio (fixed n = 100).

the sensor. For a visual (camera) sensor, a shorter target distance yields a clearer, higher-quality image or video. Figure 13 shows the simulated average distance to the target while it is tracked. The proposed VSHP achieves a shorter average target distance than OVSN does. This is because the proposed handover protocol utilizes the characteristics of Voronoi cells to select an appropriate sensor for the tracking task. Moreover, the average target distance is reduced from 40 m to 35 m as the number of sensors increases from 50 to 100. This indicates that VSHP can select more appropriate sensors for target tracking from a larger number of deployed sensors.
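The cell-based selection rests on the defining property of a Voronoi tessellation: a point lies in the cell of its nearest site, so the sensor associated with the cell the target occupies is also the closest sensor. A minimal sketch of this ownership test follows (a plain nearest-neighbor search; the full handover protocol must additionally check the sensor's field of view and sensing radius, which are omitted here).

```python
def cell_owner(sensors, target):
    """Return the index of the sensor whose Voronoi cell contains the target.
    By the definition of a Voronoi tessellation this is simply the sensor
    closest to the target, so no explicit cell geometry is needed for the
    ownership test; squared distances avoid an unnecessary sqrt."""
    tx, ty = target
    return min(range(len(sensors)),
               key=lambda i: (sensors[i][0] - tx) ** 2 + (sensors[i][1] - ty) ** 2)

# hypothetical deployment: hand the target over to the cell's associated sensor
sensors = [(100, 100), (400, 120), (650, 500)]
owner = cell_owner(sensors, (380, 150))   # sensor 1 is nearest, so it takes over
```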

Figure 16: Average target distance (fixed n = 100).

The results given above are for various values of n but fixed values of α and r. Simulation results for the cases with fixed n but various α and r are given as follows. Figures 14, 15, and 16 show the coverage and target-detected latency, the target-tracked ratio, and the average target distance, respectively. The results are similar to those given above: VSHP obtained higher coverage ratios and lower target-detected latency than OVSN; the target-tracked ratio of VSHP is higher than that of OVSN; and VSHP obtained a shorter average target distance than OVSN did. In summary, VSHP can provide a better quality of target tracking in visual sensor networks.

6. Conclusion and Future Works

This paper utilizes the structure and characteristics of Voronoi cells and proposes a new solution for target tracking in visual/camera sensor networks. The solution contains mechanisms for coverage enhancement, object detection, target localization, and sensor handover. Simulations were used to evaluate the effectiveness of the proposed approach under four QoS criteria for target tracking. The results show that the approach performs well and improves upon target tracking in ordinary visual sensor networks.

Our future work is to implement a practical system with real camera sensors and to evaluate its effectiveness and performance experimentally. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H.-C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J.-P. Sheu, and K.-Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T.-W. Sung and C.-S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.


[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Pérez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 11: Research Article A Voronoi-Based Sensor Handover Protocol for Target Tracking …downloads.hindawi.com/journals/ijdsn/2014/586210.pdf · 2015. 11. 22. · for target tracking in [

International Journal of Distributed Sensor Networks 11

0

10

20

30

40

50

60

50 60 70 80 90 100Number of visual sensors

OVSN-CVSHP-C

OVSN-LVSHP-L

Figure 11 Coverage and target-detected latency (fixed 120572 = 105∘119903 = 70m)

the VSN tracking operations well performing until most (orall) of the sensors are failed It has the characteristic of faulttolerance and graceful degradation

On the other hand the tracking algorithm describedabove focuses on the case of tracking single target withone visual sensor It is applicable to the case of whichmultiple targets occur in the surveillance region and can berespectively tracked by one sensor Once the multiple targetsoccur in one Voronoi cell at the same time they only can beall tracked if (1) all of them keep locating in the FoV of theassociated sensor of the Voronoi cell or (2) each individual ofthe targets is located within the sensing radius of any othervisual sensor that can be notified of taking over the target

5 Simulation Results

This study evaluates the proposed Voronoi-based approachwith simulations There are four QoS criteria to be usedfor the evaluation (1) the overall sensing field coverage (2)the target-detected latency (3) the target-tracked ratio and(4) the average target distance The first overall coveragewill affect the detection of target objects The second target-detected latency indicates how long the time is taken by asensor detecting the target since the target moved into thesurveillance region The third target-tracked ratio indicatesthe total time of durations of which the target is tracked inits movement across the surveillance regionThe last averagetarget distance represents the image quality while the targetis tracked In the simulations a large scale visual sensornetwork is deployed randomly in the surveillance region andthe targetmoves across the surveillance regionwith a randomwaypoint mobility model [34] The target tracking serviceswith and without the proposed protocol are evaluated andcompared by the above-mentioned QoS criteria Table 5shows the parameters setting of the simulations

Figure 11 shows the simulation results of both coverageand target-detected latency OVSN-C and VSHP-C representthe overall field coverage ratios after deployment of ordi-nary visual sensor network (OVSN) and after enhancementwith proposed VSHP respectively OVSN-L and VSHP-Lrepresent the ratios of target-detected latency under OVSN

0102030405060708090

100

50 60 70 80 90 100Number of visual sensors

OVSNVSHP

Figure 12 Target-tracked ratio (fixed 120572 = 105∘ 119903 = 70m)

0

10

20

30

40

50

60

70

50 60 70 80 90 100Number of visual sensors

OVSNVSHP

Figure 13 Average target distance (fixed 120572 = 105∘ 119903 = 70m)

and proposed VSHP approach respectively The ratio oftarget-detected latency (119903

119871) is defined as (17) where 119905

119863

represents how long the time is taken by a sensor detecting thetarget since the targetmoved into the surveillance region and119905SR is the total time of the sensor traveling in the surveillanceregion In Figure 11 the simulation result shows that thetarget-detected latency of VSHP is less than the one of OVSNno matter how many the number of deployed sensors is (n= 50sim100) VSHP reduces about 4 of the total travel timeof the target to detect the target This is due to the coverageenhancement inVSHPThe simulation result shows that thereis a coverage improvement of about 55 with VSHP

119903119871=119905119863

119905SR (17)

The total travel time of a target in the surveillance regionconsists of target-tracked and target-untracked durationsKeeping a higher target-tracked time ratio is an importantcriterion for target tracking services Figure 12 shows thatVSHPprovides a higher target-tracked ratio thanOVSNThisis due to the efficacy of the Voronoi-based sensor handovermechanism

The other criterion for measuring the QoS of targettracking in a VSN is the average distance between the sensorand the target while the target is tracked (monitored) by

12 International Journal of Distributed Sensor Networks

Table 5 Parameters setting of the simulations

Notation Description Value Default119865 Size of the surveillance region 800m times 800m119899 Number of the visual sensors 50sim100 (interval 10) 100120572 Angle of view (AoV) of the visual sensors 60∘ sim120∘ (interval 15∘) 105∘

119903 Sensing radius of the visual sensors 50msim90m (interval 10m) 70m

01020304050607080

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

Sensing radius50m 60m 70m 80m 90m

60∘

75∘

90∘

105∘

120∘

Figure 14 Coverage and target-detected latency (fixed 119899 = 100)

0102030405060708090

100

OVS

N

OVS

N

OVS

N

OVS

N

OVS

N

VSH

P

VSH

P

VSH

P

VSH

P

VSH

P

Sensing radius50m 60m 70m 80m 90m

60∘

75∘

90∘

105∘

120∘

Figure 15 Target-tracked ratio (fixed 119899 = 100)

the sensor For a visual (camera) sensor a shorter targetdistance will bring a clearer (higher) quality of image orvideo Figure 13 shows the simulation result of average targetdistance of which the target is tracked The proposed VSHPhas a shorter average target distance than OVSN doesThis isdue to the fact that the proposed handover protocol utilizesthe characteristic of Voronoi cell to select an appropriatesensor for the tracking task Moreover the average targetdistance is reduced from 40m to 35m while the number ofsensor is increased from 50 to 100 This indicates that theVSHP can select more appropriate sensors from the largernumber of deployed sensors for the target tracking

20

30

40

50

60

70

OVS

N

OVS

N

OVS

N

OVS

N

OVS

N

VSH

P

VSH

P

VSH

P

VSH

P

VSH

P

Sensing radius50m 60m 70m 80m 90m

60∘

75∘

90∘

105∘

120∘

Figure 16 Average target distance (fixed 119899 = 100)

The results given above are with various values of 119899but fixed values of 120572 and 119903 The simulation results of thecases with fixed 119899 but various 120572 and 119903 were also given asfollows Figures 14 15 and 16 show the coverage and target-detected latency the target-tracked ratio and the averagetarget distance respectively The results are similar to thosegiven above VSHP obtained higher coverage ratios andless target-detected latency in comparison with OVSN Thetarget-tracked ratio of VSHP is higher than that of OVSNAnd VSHP obtained shorter average target distance thanOVSN did In summary VSHP can provide a better qualityof target tracking in visual sensor networks

6 Conclusion and Future Works

This paper utilizes the structure and characteristic of Voronoicells and proposes a new solution for target tracking invisualcamera sensor networks The solution contains mech-anisms of coverage enhancement object detection targetlocalization and sensor handover Simulations were used forthe evaluation of effectiveness of the proposed approachFour QoS criteria for target tracking were evaluated Theresults show that the approach performs well and has animprovement in comparison with target tracking in ordinaryvisual sensor networks

Our future work is to implement a practical systemwith real camera sensors An experimental evaluation ofeffectiveness and performance for the practical system willbe made Moreover the utilizations of mobile visual sensors(eg smartphones) and the integration with cloud-based

International Journal of Distributed Sensor Networks 13

monitoring service will be developed and realized for anextension of this study

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

Thiswork is supported byNational ScienceCouncil of Taiwanunder Grant no NSC 102-2219-E-006-001 and ResearchCenter for Energy Technology of National Cheng KungUniversity under Grant no D102-23015

References

[1] I F Akyildiz and M C Vuran Wireless Sensor Networks JohnWiley and Sons New York NY USA 2010

[2] K K Khedo R Perseedoss and A Mungur ldquoA wireless sensornetwork air pollutionmonitoring systemrdquo International Journalof Wireless and Mobile Networks vol 2 no 2 pp 31ndash45 2010

[3] E Felemban ldquoAdvanced border intrusion detection and surveil-lance using wireless sensor network technologyrdquo InternationalJournal of Communications Network and System Sciences vol6 no 5 pp 251ndash259 2013

[4] C Zhu C Zheng L Shu and G Han ldquoA survey on coverageand connectivity issues in wireless sensor networksrdquo Journal ofNetwork and Computer Applications vol 35 no 2 pp 619ndash6322012

[5] M G Guvensan and A G Yavuz ldquoOn coverage issues indirectional sensor networks a surveyrdquo Ad Hoc Networks vol9 no 7 pp 1238ndash1255 2011

[6] M Bramberger A Doblander A Maier B Rinner and HSchwabach ldquoDistributed embedded smart cameras for surveil-lance applicationsrdquo Computer vol 39 no 2 pp 68ndash75 2006

[7] S Soro and W Heinzelman ldquoA survey of visual sensor net-worksrdquo Advances in Multimedia vol 2009 Article ID 64038622 pages 2009

[8] F Viani P Rocca G Oliveri D Trinchero and A MassaldquoLocalization tracking and imaging of targets in wirelesssensor networks an invited reviewrdquo Radio Science vol 46 no5 Article ID RS5002 2011

[9] Y Charfi N Wakamiya and M Murata ldquoChallenging issues invisual sensor networksrdquo IEEEWireless Communications vol 16no 2 pp 44ndash49 2009

[10] A Okabe B Boots K Sugihara and S N Chiu SpatialTessellations Concepts and Applications of Voronoi DiagramsWiley Chichester UK 2nd edition 2000

[11] L Cheng C Wu Y Zhang H Wu M Li and C Maple ldquoAsurvey of localization in wireless sensor networkrdquo InternationalJournal of Distributed Sensor Networks vol 2012 Article ID962523 12 pages 2012

[12] N A Alrajeh M Bashir and B Shams ldquoLocalization tech-niques in wireless sensor networksrdquo International Journal ofDistributed Sensor Networks vol 2013 Article ID 304628 9pages 2013

[13] E Niewiadomska-Szynkiewicz ldquoLocalization in wireless sensornetworks classification and evaluation of techniquesrdquo Informa-tion International Journal of AppliedMathematics andComputerScience vol 22 no 2 pp 281ndash297 2012

[14] L Liu X Zhang and H Ma ldquoOptimal node selection fortarget localization in wireless camera sensor networksrdquo IEEETransactions on Vehicular Technology vol 59 no 7 pp 3562ndash3576 2010

[15] W Li and W Zhang ldquoSensor selection for improving sccuracyof target localisation in wireless visual sensor networksrdquo IETWireless Sensor Systems vol 2 no 4 pp 293ndash301 2012

[16] D Gao W Zhu X Xu and H C Chao ldquoA hybrid localizationand tracking system in camera sensor networksrdquo InternationalJournal of Communication Systems 2012

[17] L Liu B Hu and L Li ldquoAlgorithms for energy efficient mobileobject tracking inwireless sensor networksrdquoCluster Computingvol 13 no 2 pp 181ndash197 2010

[18] A M Khedr and W Osamy ldquoEffective target tracking mech-anism in a self-organizing wireless sensor networkrdquo Journal ofParallel andDistributedComputing vol 71 no 10 pp 1318ndash13262011

[19] W Chen and Y Fu ldquoCooperative distributed target trackingalgorithm in mobile wireless sensor networksrdquo Journal ofControl Theory and Applications vol 9 no 2 pp 155ndash164 2011

[20] P K Sahoo J P Sheu and K Y Hsieh ldquoTarget trackingand boundary node selection algorithms of wireless sensornetworks for internet servicesrdquo Information Sciences vol 230pp 21ndash38 2013

[21] Z Chu L Zhuo Y Zhao and X Li ldquoCooperative multi-object tracking method for wireless video sensor networksrdquoin Proceedings of the 3rd IEEE International Workshop onMultimedia Signal Processing (MMSP rsquo11) pp 1ndash5 HangzhouChina November 2011

[22] J Han E J Pauwels P M de Zeeuw and P H N deWith ldquoEmploying a RGB-D sensor for real-time tracking ofhumans across multiple Re-entries in a smart environmentrdquoIEEE Transactions on Consumer Electronics vol 58 no 2 pp255ndash263 2012

[23] W Limprasert A Wallace and G Michaelson ldquoReal-timepeople tracking in a camera networkrdquo IEEE Journal on Emergingand Selected Topics in Circuits and Systems vol 3 no 2 pp 263ndash271 2013

[24] YWang DWang andW Fang ldquoAutomatic node selection andtarget tracking in wireless camera sensor networksrdquo Computersand Electrical Engineering 2013

[25] W Li ldquoCamera sensor activation scheme for target trackingin wireless visual sensor networksrdquo International Journal ofDistributed Sensor Networks vol 2013 Article ID 397537 11pages 2013

[26] S Fortune ldquoA sweepline algorithm for Voronoi diagramsrdquoAlgorithmica vol 2 no 1ndash4 pp 153ndash174 1987

[27] T W Sung and C S Yang ldquoVoronoi-based coverage improve-ment approach for wireless directional sensor networksrdquo Jour-nal of Network and Computer Applications vol 39 pp 202ndash2132014

[28] A Elgammal Background Subtraction Theory and PracticeAugmented Vision and Reality Springer Berlin Germany 2013

[29] M Alvar A Sanchez and A Arranz ldquoFast background sub-traction using static and dynamic gatesrdquo Artificial IntelligenceReview vol 41 no 1 pp 113ndash128 2014

[30] O Barnich and M Van Droogenbroeck ldquoViBe a universalbackground subtraction algorithm for video sequencesrdquo IEEETransactions on Image Processing vol 20 no 6 pp 1709ndash17242011

14 International Journal of Distributed Sensor Networks

[31] N M Roscoe and M D Judd ldquoHarvesting energy frommagnetic fields to power condition monitoring sensorsrdquo IEEESensors Journal vol 13 no 6 pp 2263ndash2270 2013

[32] L Xie Y Shi Y T Hou and H D Sherali ldquoMaking sensornetworks immortal an energy-renewal approach with wirelesspower transferrdquo IEEEACMTransactions onNetworking vol 20no 6 pp 1478ndash1761 2012

[33] S Sudevalayam and P Kulkarni ldquoEnergy harvesting sensornodes survey and implicationsrdquo IEEE Communications Surveysand Tutorials vol 13 no 3 pp 443ndash461 2011

[34] C Bettstetter H Hartenstein and X Perez-Costa ldquoStochasticproperties of the random waypoint mobility modelrdquo WirelessNetworks vol 10 no 5 pp 555ndash567 2004

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 12: Research Article A Voronoi-Based Sensor Handover Protocol for Target Tracking …downloads.hindawi.com/journals/ijdsn/2014/586210.pdf · 2015. 11. 22. · for target tracking in [

12 International Journal of Distributed Sensor Networks

Table 5 Parameters setting of the simulations

Notation Description Value Default119865 Size of the surveillance region 800m times 800m119899 Number of the visual sensors 50sim100 (interval 10) 100120572 Angle of view (AoV) of the visual sensors 60∘ sim120∘ (interval 15∘) 105∘

119903 Sensing radius of the visual sensors 50msim90m (interval 10m) 70m

01020304050607080

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

OVS

N-C

VSH

P-C

OVS

N-L

VSH

P-L

Sensing radius50m 60m 70m 80m 90m

60∘

75∘

90∘

105∘

120∘

Figure 14 Coverage and target-detected latency (fixed 119899 = 100)

0102030405060708090

100

OVS

N

OVS

N

OVS

N

OVS

N

OVS

N

VSH

P

VSH

P

VSH

P

VSH

P

VSH

P

Sensing radius50m 60m 70m 80m 90m

60∘

75∘

90∘

105∘

120∘

Figure 15 Target-tracked ratio (fixed 119899 = 100)

the sensor For a visual (camera) sensor a shorter targetdistance will bring a clearer (higher) quality of image orvideo Figure 13 shows the simulation result of average targetdistance of which the target is tracked The proposed VSHPhas a shorter average target distance than OVSN doesThis isdue to the fact that the proposed handover protocol utilizesthe characteristic of Voronoi cell to select an appropriatesensor for the tracking task Moreover the average targetdistance is reduced from 40m to 35m while the number ofsensor is increased from 50 to 100 This indicates that theVSHP can select more appropriate sensors from the largernumber of deployed sensors for the target tracking

20

30

40

50

60

70

OVS

N

OVS

N

OVS

N

OVS

N

OVS

N

VSH

P

VSH

P

VSH

P

VSH

P

VSH

P

Sensing radius50m 60m 70m 80m 90m

60∘

75∘

90∘

105∘

120∘

Figure 16 Average target distance (fixed 119899 = 100)

The results given above are with various values of 119899but fixed values of 120572 and 119903 The simulation results of thecases with fixed 119899 but various 120572 and 119903 were also given asfollows Figures 14 15 and 16 show the coverage and target-detected latency the target-tracked ratio and the averagetarget distance respectively The results are similar to thosegiven above VSHP obtained higher coverage ratios andless target-detected latency in comparison with OVSN Thetarget-tracked ratio of VSHP is higher than that of OVSNAnd VSHP obtained shorter average target distance thanOVSN did In summary VSHP can provide a better qualityof target tracking in visual sensor networks

6 Conclusion and Future Works

This paper utilizes the structure and characteristic of Voronoicells and proposes a new solution for target tracking invisualcamera sensor networks The solution contains mech-anisms of coverage enhancement object detection targetlocalization and sensor handover Simulations were used forthe evaluation of effectiveness of the proposed approachFour QoS criteria for target tracking were evaluated Theresults show that the approach performs well and has animprovement in comparison with target tracking in ordinaryvisual sensor networks

Our future work is to implement a practical system with real camera sensors. An experimental evaluation of the effectiveness and performance of the practical system will be made. Moreover, the utilization of mobile visual sensors (e.g., smartphones) and the integration with a cloud-based monitoring service will be developed and realized as an extension of this study.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Science Council of Taiwan under Grant no. NSC 102-2219-E-006-001 and the Research Center for Energy Technology of National Cheng Kung University under Grant no. D102-23015.

References

[1] I. F. Akyildiz and M. C. Vuran, Wireless Sensor Networks, John Wiley & Sons, New York, NY, USA, 2010.

[2] K. K. Khedo, R. Perseedoss, and A. Mungur, "A wireless sensor network air pollution monitoring system," International Journal of Wireless and Mobile Networks, vol. 2, no. 2, pp. 31–45, 2010.

[3] E. Felemban, "Advanced border intrusion detection and surveillance using wireless sensor network technology," International Journal of Communications, Network and System Sciences, vol. 6, no. 5, pp. 251–259, 2013.

[4] C. Zhu, C. Zheng, L. Shu, and G. Han, "A survey on coverage and connectivity issues in wireless sensor networks," Journal of Network and Computer Applications, vol. 35, no. 2, pp. 619–632, 2012.

[5] M. G. Guvensan and A. G. Yavuz, "On coverage issues in directional sensor networks: a survey," Ad Hoc Networks, vol. 9, no. 7, pp. 1238–1255, 2011.

[6] M. Bramberger, A. Doblander, A. Maier, B. Rinner, and H. Schwabach, "Distributed embedded smart cameras for surveillance applications," Computer, vol. 39, no. 2, pp. 68–75, 2006.

[7] S. Soro and W. Heinzelman, "A survey of visual sensor networks," Advances in Multimedia, vol. 2009, Article ID 640386, 22 pages, 2009.

[8] F. Viani, P. Rocca, G. Oliveri, D. Trinchero, and A. Massa, "Localization, tracking, and imaging of targets in wireless sensor networks: an invited review," Radio Science, vol. 46, no. 5, Article ID RS5002, 2011.

[9] Y. Charfi, N. Wakamiya, and M. Murata, "Challenging issues in visual sensor networks," IEEE Wireless Communications, vol. 16, no. 2, pp. 44–49, 2009.

[10] A. Okabe, B. Boots, K. Sugihara, and S. N. Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, Wiley, Chichester, UK, 2nd edition, 2000.

[11] L. Cheng, C. Wu, Y. Zhang, H. Wu, M. Li, and C. Maple, "A survey of localization in wireless sensor network," International Journal of Distributed Sensor Networks, vol. 2012, Article ID 962523, 12 pages, 2012.

[12] N. A. Alrajeh, M. Bashir, and B. Shams, "Localization techniques in wireless sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 304628, 9 pages, 2013.

[13] E. Niewiadomska-Szynkiewicz, "Localization in wireless sensor networks: classification and evaluation of techniques," International Journal of Applied Mathematics and Computer Science, vol. 22, no. 2, pp. 281–297, 2012.

[14] L. Liu, X. Zhang, and H. Ma, "Optimal node selection for target localization in wireless camera sensor networks," IEEE Transactions on Vehicular Technology, vol. 59, no. 7, pp. 3562–3576, 2010.

[15] W. Li and W. Zhang, "Sensor selection for improving accuracy of target localisation in wireless visual sensor networks," IET Wireless Sensor Systems, vol. 2, no. 4, pp. 293–301, 2012.

[16] D. Gao, W. Zhu, X. Xu, and H.-C. Chao, "A hybrid localization and tracking system in camera sensor networks," International Journal of Communication Systems, 2012.

[17] L. Liu, B. Hu, and L. Li, "Algorithms for energy efficient mobile object tracking in wireless sensor networks," Cluster Computing, vol. 13, no. 2, pp. 181–197, 2010.

[18] A. M. Khedr and W. Osamy, "Effective target tracking mechanism in a self-organizing wireless sensor network," Journal of Parallel and Distributed Computing, vol. 71, no. 10, pp. 1318–1326, 2011.

[19] W. Chen and Y. Fu, "Cooperative distributed target tracking algorithm in mobile wireless sensor networks," Journal of Control Theory and Applications, vol. 9, no. 2, pp. 155–164, 2011.

[20] P. K. Sahoo, J. P. Sheu, and K. Y. Hsieh, "Target tracking and boundary node selection algorithms of wireless sensor networks for internet services," Information Sciences, vol. 230, pp. 21–38, 2013.

[21] Z. Chu, L. Zhuo, Y. Zhao, and X. Li, "Cooperative multi-object tracking method for wireless video sensor networks," in Proceedings of the 3rd IEEE International Workshop on Multimedia Signal Processing (MMSP '11), pp. 1–5, Hangzhou, China, November 2011.

[22] J. Han, E. J. Pauwels, P. M. de Zeeuw, and P. H. N. de With, "Employing a RGB-D sensor for real-time tracking of humans across multiple re-entries in a smart environment," IEEE Transactions on Consumer Electronics, vol. 58, no. 2, pp. 255–263, 2012.

[23] W. Limprasert, A. Wallace, and G. Michaelson, "Real-time people tracking in a camera network," IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, pp. 263–271, 2013.

[24] Y. Wang, D. Wang, and W. Fang, "Automatic node selection and target tracking in wireless camera sensor networks," Computers and Electrical Engineering, 2013.

[25] W. Li, "Camera sensor activation scheme for target tracking in wireless visual sensor networks," International Journal of Distributed Sensor Networks, vol. 2013, Article ID 397537, 11 pages, 2013.

[26] S. Fortune, "A sweepline algorithm for Voronoi diagrams," Algorithmica, vol. 2, no. 1–4, pp. 153–174, 1987.

[27] T. W. Sung and C. S. Yang, "Voronoi-based coverage improvement approach for wireless directional sensor networks," Journal of Network and Computer Applications, vol. 39, pp. 202–213, 2014.

[28] A. Elgammal, Background Subtraction: Theory and Practice, Augmented Vision and Reality, Springer, Berlin, Germany, 2013.

[29] M. Alvar, A. Sanchez, and A. Arranz, "Fast background subtraction using static and dynamic gates," Artificial Intelligence Review, vol. 41, no. 1, pp. 113–128, 2014.

[30] O. Barnich and M. Van Droogenbroeck, "ViBe: a universal background subtraction algorithm for video sequences," IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709–1724, 2011.

[31] N. M. Roscoe and M. D. Judd, "Harvesting energy from magnetic fields to power condition monitoring sensors," IEEE Sensors Journal, vol. 13, no. 6, pp. 2263–2270, 2013.

[32] L. Xie, Y. Shi, Y. T. Hou, and H. D. Sherali, "Making sensor networks immortal: an energy-renewal approach with wireless power transfer," IEEE/ACM Transactions on Networking, vol. 20, no. 6, pp. 1748–1761, 2012.

[33] S. Sudevalayam and P. Kulkarni, "Energy harvesting sensor nodes: survey and implications," IEEE Communications Surveys and Tutorials, vol. 13, no. 3, pp. 443–461, 2011.

[34] C. Bettstetter, H. Hartenstein, and X. Perez-Costa, "Stochastic properties of the random waypoint mobility model," Wireless Networks, vol. 10, no. 5, pp. 555–567, 2004.
