
A collaborative and voice configured electric powered wheelchair control system

based on EEG and head movement

Fatma BEN TAHER, Nader BEN AMOR, Mohamed JALLOULI, Aicha BEN HAMMOUDA, Olfa DGHIM

Computer and Embedded Systems (CES), Ecole Nationale d'Ingenieurs de Sfax, University of Sfax, Tunisia

Abstract—Statistics compiled in 2013 by the Ministry of Social Affairs indicate that the rate of disability in Tunisia is estimated at 2%, of which 43.9% is mobility impairment. A large number of these persons have difficulty using, or are unable to use, a standard electric powered wheelchair. As a solution, some researchers replace the standard joystick with eye, voice, or head control. In this paper, the purpose is to control the EPW using EEG signals and head movement to increase system performance. Voice recognition is used to configure, activate, and deactivate the control system.

Keywords—EEG signal; head movement; voice recognition; EPW control; human-computer interaction

I. INTRODUCTION

Thanks to technological progress, the electric powered wheelchair (EPW) has allowed many people with severe motor deficits to overcome their mobility handicap. This has had a positive impact on their quality of life, because various fields of activity become accessible. Despite the importance of wheelchairs, people with some types of motor disability are unable to use a standard EPW. In fact, according to experiments conducted with individuals with severe disabilities in France [1], the use of power wheelchairs is very restricted and requires a personalization of the EPW. To allow these persons to regain independent mobility, many researchers have worked on the development of smart wheelchairs, either by adding automatic features to the chair or by using alternative control methods.

Our approach is to provide EPW users with an alternative way to control the EPW. The control of the EPW is entirely the responsibility of the user; the machine simply translates signals into commands. The proposed control system combines brain signals and head movement to provide simple and reliable control. To configure the control interface and to activate and deactivate the movement of the EPW, we propose voice recognition, which avoids the need for third-party assistance.

This paper is structured as follows: Section 2 describes the state of the art of smart wheelchairs. In Section 3, the proposed methodology is described, and Section 4 reviews experiments and results. Finally, Section 5 presents the conclusion.

II. STATE OF THE ART

The field of robotics for the disabled has seen progress in recent years. The aim is to make life easier for handicapped persons and to help them integrate into society. Much research is devoted to so-called "intelligent" wheelchairs, inspired mainly by mobile robotics, especially since the development of computer systems has provided new opportunities for research in the field of human-computer interaction.

A. EEG based wheelchair system

Electroencephalography (EEG) is the recording of the electrical activity produced by neurons in the brain [2] [3]. The recording is obtained by placing electrodes on the scalp with a gel or a conductive paste. EEG is usually used in the medical field to detect brain anomalies, but it has recently been used in the robotic domain.

Minguez et al. [4] propose to make a virtual reconstruction of the displayed environment with a set of points in the free space that can be selected using a P300 brain-machine interface. The selected destination is then reached automatically. In [5], Carlson et al. propose a shared control architecture that couples the intelligence and desires of the user with the precision of the EPW, based on an asynchronous motor imagery protocol. The user imagines the kinesthetic movement of the left hand, right hand, or both feet, which gives three separate classes. During the training process, the authors select the two most discriminable classes to provide a reliable mapping to control actions. When neither of these navigation commands is provided by the user, a third, non-voluntary implicit control class applies. In [6], EEG signals are used to detect the direction of the eyes to control the EPW: Tran et al. propose an algorithm that processes EEG signals online to control the chair in real time, and experimental results have shown an accuracy rate of 82.6% in an indoor environment. Some researchers believe that EEG wheelchair control is the best and safest way to help completely paralyzed persons regain their independence. These projects can be seen as a good starting point for future applications, although they lack maturity, and the bulkiness of the headset also constitutes a major problem.

B. Head movement based wheelchair

In the field of head movement recognition there are two main techniques: the first is based on computer vision, which needs a camera and preferably a very powerful computer; the second is based on sensors placed on the user's head. Pajkanovic et al. [7] detect the movement of the user's head by placing an accelerometer on it. The system offers the user 4 different directions. First, the gravity component must be removed; to obtain clean values, the authors apply a low-pass filter to the output of the accelerometer. Thresholds representing the four possible directions are set before driving starts. To issue a command, the user leans his head toward the desired direction and then returns it to the initial position (a minimal sketch of this scheme is given below).
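As an illustration only, here is a minimal Python sketch of this accelerometer-based scheme; it is not the authors' implementation, and the axis conventions, filter coefficient, and threshold values are assumptions:

```python
# Minimal sketch of accelerometer-based head-direction detection
# (illustrative only; axis conventions, alpha and threshold are assumed).

def classify(samples, alpha=0.9, threshold=0.25):
    """Classify a stream of (ax, ay, az) samples into a direction."""
    gravity = [0.0, 0.0, 0.0]
    for ax, ay, az in samples:
        # Low-pass filter isolates the slowly varying gravity component.
        gravity = [alpha * g + (1 - alpha) * a
                   for g, a in zip(gravity, (ax, ay, az))]
        # Subtracting it leaves the head-motion (linear) component.
        lx, ly = ax - gravity[0], ay - gravity[1]
        # Compare the tilt against per-direction thresholds.
        if ly > threshold:
            return "FORWARD"
        if ly < -threshold:
            return "BACKWARD"
        if lx > threshold:
            return "RIGHT"
        if lx < -threshold:
            return "LEFT"
    return "STOP"
```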


In [8], Berjon et al. propose to move the chair using a camera connected to a laptop that tracks the face of the person sitting in the wheelchair. For this, Haar-like features are used together with an optical flow algorithm [9]; a minimal sketch of this kind of tracker closes this subsection. Generally, a head movement tracking system based on computer vision must face several constraints:

• What to do when the head comes out of the camera's field of view?

• Can the system be installed on the chair?

• How to deal with light variations?

• How to detect head motion for people with different appearances?

Techniques based on the use of sensors must respect theconstraints of cost, simplicity and robustness.
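For illustration, the following is a minimal, generic Python sketch of a vision-based face tracker in the spirit of [8], not their implementation: a Haar cascade seeds the face region and pyramidal Lucas-Kanade optical flow follows it between frames. It assumes OpenCV and a default camera at index 0.

```python
import cv2
import numpy as np

# Generic face-tracking sketch: Haar cascade to find the face,
# Lucas-Kanade optical flow to follow points between frames.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

prev_gray, points = None, None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if points is None:
        # (Re)detect the face and seed trackable corners inside it.
        faces = cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces):
            x, y, w, h = faces[0]
            mask = np.zeros_like(gray)
            mask[y:y + h, x:x + w] = 255
            points = cv2.goodFeaturesToTrack(gray, 50, 0.01, 5, mask=mask)
    else:
        # Track the seeded points with pyramidal Lucas-Kanade flow.
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, points, None)
        points = new_pts[status.flatten() == 1].reshape(-1, 1, 2)
        if len(points) < 10:   # lost the face: trigger re-detection
            points = None
    prev_gray = gray
```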

C. Voice recognition based wheelchair

In [10], the authors present a voice controlled EPW; the input signal, captured with a headset microphone, is processed with Matlab and then transmitted to the processor. A speed control of a wheelchair for the physically challenged using Arduino was presented in [11]: voice commands are provided to an Arduino microcontroller that controls the motion of the wheelchair, and an HM 2007 chip is used for voice recognition. The Arduino controls the movement of the wheelchair based on the input signal received. Another proposal [12] uses voice to control the EPW through voice commands assisted by a reactive fuzzy logic controller to overcome low sound; the HM 2007 IC is again used for voice recognition. The major drawback of voice based control is noise when navigating in an outdoor environment, which reduces the accuracy of voice based techniques.

D. Multi modal control techniques

Data fusion integrates data from multiple sensors to provide better analysis and decisions; it gives a better result because the strengths of one source can compensate for the weaknesses of another. In the literature, the combination of EEG with another source such as eye tracking has been used to control robotic systems, but without applying data fusion techniques to improve system control [13]. EEG and gyroscope signals can be effectively combined to provide more accurate detection of head-movement artifacts in the EEG. To this end, several methods of combining these physiological and physical signals at the feature, decision, and score fusion levels have been examined; results show that combination at the feature, score, and decision levels improves classifier performance compared to individual EEG or gyroscope classifiers [14]. According to previous work, multi-modal control improves system performance compared to single-source control. However, to our knowledge, data fusion between EEG and head movement has not been used to control an EPW, even though EEG and head tracking have each been used in robotic systems. To configure and to activate/deactivate the control system without third-party assistance, voice recognition is used.

III. WHEELCHAIR NAVIGATION SYSTEM

Our technique offers three different ways to control the wheelchair: EEG signals, head movement, and EEG/head fusion. In this section, each technique is presented. As this system targets persons with severe motor disability, the system configuration is done via voice recognition.

A. The Emotiv EPOC headset

To acquire EEG signals, we used the Emotiv EPOC EEG headset (Figure 1(a)) [15]. It has the ability to provide expressive, cognitive, and emotional information (Figure 1(b)). It does not need the assistance of medical staff or gel application on the scalp. The headset has a lithium battery that provides 12 hours of continuous use. The neuroheadset acquires brain signals with 14 saline sensors placed on the user's scalp. It also integrates two internal gyroscopes to provide information on the user's head position. The device communicates with a PC wirelessly by means of a Bluetooth USB dongle. Emotiv provides software in two ways: (i) developed applications (Control Panel) with a graphical interface to process brain signals, to train the system, and to test the neuroheadset; and (ii) an application programming interface (API) that allows users to develop C or C++ software for the neuroheadset.

Figure 1. (a) Emotiv EPOC neuroheadset; (b) connection between the EPOC headset and the PC

Emotiv delivers the Emotiv API in the form of a dynamically linked library (dll). The Emotiv Control Panel (Figure 2(a)) provides a GUI (graphical user interface) that interfaces with the Emotiv EmoEngine through the Emotiv API. The Control Panel user interface showcases the EmoEngine's capability to decipher brain signals and present them in useful forms using Emotiv's detection suites.

Figure 2. (a) Emotiv Control Panel; (b) Cognitiv Suite panel

The Cognitiv Suite panel (Figure 2(b)) uses a virtual 3D cube to display an animated representation of the cognitive detection output. This 3D cube is also used to assist the user in visualizing the intended action during the training process. The power gauge to the left of the 3D display is an indicator of the "action power", or relative certainty that the user is consciously visualizing the current action.

B. Control technique description

Once our application (Figure 3) is connected to the Emotiv Control Panel via the Emotiv API, the user can:


• see all the information about the signal quality, the battery level, the control mode, etc.;

• choose the control mode from 5 choices: expressive, cognitive, expressive/cognitive, head movement, and fusion mode;

• choose the desired language: Arabic, French, or English.

To avoid third-party assistance when using the control interface, all the needed configuration is done using Windows Speech Recognition, integrated via Visual Studio 2008. Activation and deactivation of the interpretation are also done using voice commands (a simple sketch of the command handling is given below).
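The paper's implementation relies on Windows Speech Recognition; purely to illustrate the command grammar, here is a minimal Python sketch in which `recognize_utterance()` is a hypothetical stand-in for any speech-to-text engine, not an Emotiv or Windows API:

```python
# Sketch of voice-driven configuration (recognize_utterance() is a
# hypothetical stand-in for a real speech-to-text engine).

COMMANDS = {
    "activate":   lambda s: s.update(active=True),
    "deactivate": lambda s: s.update(active=False),
    "expressive": lambda s: s.update(mode="expressive"),
    "cognitive":  lambda s: s.update(mode="cognitive"),
    "head":       lambda s: s.update(mode="head_movement"),
    "fusion":     lambda s: s.update(mode="fusion"),
}

def handle_utterance(text, state):
    """Apply the first known keyword found in the transcribed text."""
    for word in text.lower().split():
        if word in COMMANDS:
            COMMANDS[word](state)
            return word
    return None   # unknown phrase: ignore, never move the chair

state = {"active": False, "mode": "expressive"}
# handle_utterance(recognize_utterance(), state) would run in a loop.
```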

Figure 3. The Control technique interface

1) EEG based modes: The EEG control technique offers 3 control modes [16]: expressive mode, cognitive mode, and expressive/cognitive mode. Each mode offers the possibility to control the electric wheelchair in an easy and safe way.

• Expressive mode: this mode uses eye direction (right and left) and facial expressions such as brow and lip movements to control the EPW. Once the expressive interpretation is activated, the chair can be controlled as follows:
  ◦ to turn left, the user fixes his gaze to the left;
  ◦ to turn right, the user fixes his gaze to the right;
  ◦ to go forward, the user raises or furrows his brow;
  ◦ to go backward, the user smirks to the left or to the right.

• Cognitiv mode: before using this mode, a training step is needed using the Cognitiv Suite panel. This mode uses cognitive data detected via the Emotiv headset to control the wheelchair: the user has to think about the action he wants to perform (turn left/right, go forward/backward). To increase user safety, the concentration rate and the action power must both exceed 50%.

• Expressiv/Cognitiv mode: this mode combines the expressive and cognitive modes. To move the chair forward, for example, the user must raise or furrow his brow and think about the desired action with a concentration rate above 50%. A minimal sketch of this mapping is given after this list.
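To make the mode logic concrete, here is a minimal Python sketch. The event fields (`eye_direction`, `brow`, `smirk`, `cognitive_action`, `power`, `concentration`) are hypothetical stand-ins for the values exposed by the Emotiv detection suites, not the actual API:

```python
# Sketch of the expressive/cognitive mapping; all event fields are
# hypothetical stand-ins for the Emotiv detection-suite outputs.

def expressive_command(ev):
    if ev.get("eye_direction") == "left":
        return "LEFT"
    if ev.get("eye_direction") == "right":
        return "RIGHT"
    if ev.get("brow"):               # raised or furrowed brow
        return "FORWARD"
    if ev.get("smirk"):              # smirk left or right
        return "BACKWARD"
    return "STOP"

def cognitive_command(ev):
    # Safety gate: act only when both certainty measures exceed 50%.
    if ev.get("power", 0) > 0.5 and ev.get("concentration", 0) > 0.5:
        return ev.get("cognitive_action", "STOP")
    return "STOP"

def combined_command(ev):
    # Expressiv/Cognitiv mode: both modalities must agree.
    exp, cog = expressive_command(ev), cognitive_command(ev)
    return exp if exp == cog else "STOP"
```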

2) Head movement command mode: This mode uses the Emotiv headset gyroscope, which indicates the head position at any time. In order to convert the head position into commands for the EPW, the control system must follow several steps:

• Head position calibration: the first step in this technique is the calibration of the head position. For this, the user must keep his head stationary until the end of this step; the end of this stage is indicated by feedback in the graphical interface and by a beep. The aim is to find the position (x, y) = (0, 0) (Figure 4) for the control system through a function of the EPOC headset SDK.

Figure 4. Head position calibration

• Threshold values calibration: in this mode, 8 threshold values must be set, 2 for each direction:
  ◦ thFmin and thFmax: the minimum and maximum thresholds for the forward direction;
  ◦ thBmin and thBmax: the minimum and maximum thresholds for the backward direction;
  ◦ thRmin and thRmax: the minimum and maximum thresholds for the right direction;
  ◦ thLmin and thLmax: the minimum and maximum thresholds for the left direction.
  To set the threshold values, the system prompts the user to lean his head slightly toward a specific direction and then return to the starting position; the user must then lean further in the same direction and return to the starting position again. This is done for each of the 4 possible directions (a sketch of this calibration is given below).
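A minimal sketch of this calibration procedure follows, assuming a hypothetical `read_head_position()` helper that returns the gyroscope-derived (x, y) offset from the calibrated origin:

```python
# Sketch of per-direction threshold calibration; read_head_position()
# is a hypothetical helper returning (x, y) relative to the origin.

def calibrate_direction(read_head_position, axis, sign, n_samples=100):
    """Record a slight lean then a stronger lean; return (thmin, thmax)."""
    def peak():
        # Track the largest excursion along the requested axis.
        best = 0.0
        for _ in range(n_samples):
            pos = read_head_position()
            best = max(best, sign * pos[axis])
        return best

    input("Press Enter, then lean slightly and return to center...")
    thmin = peak()
    input("Press Enter, then lean further and return to center...")
    thmax = peak()
    return thmin, thmax

# Example: thresholds for the four directions (axis 0 = x, 1 = y).
# thF = calibrate_direction(read_head_position, axis=1, sign=+1)
# thB = calibrate_direction(read_head_position, axis=1, sign=-1)
# thR = calibrate_direction(read_head_position, axis=0, sign=+1)
# thL = calibrate_direction(read_head_position, axis=0, sign=-1)
```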

• Reading and interpreting the head position values: once the threshold values are set, the system's interpretation of the head movements begins. On each change of the head position, the x and y values are supplied to the system, which classifies them into three cases (Figure 5 illustrates a forward head movement):
  ◦ if the value of x or y is between 0 and thmin, it is considered valid but does not influence the system behavior, to ensure stability;
  ◦ if the value of x or y is between thmin and thmax, it is considered valid and influences the system behavior. Hence, to control the wheelchair: y ∈ [thFmin, thFmax] to move forward; y ∈ [thBmin, thBmax] to move backward; x ∈ [thRmin, thRmax] to turn right; x ∈ [thLmin, thLmax] to turn left;
  ◦ if the value of x or y is above thmax, it is considered invalid; the system disables the interpretation and requires a new calibration of the head position.

Figure 5. Interpretation sample of head position values

To maintain system stability and hardware performance, we chose to return to the "Stop" state after each movement. For example, if the user moves his head forward to move the chair forward, and then moves his head directly to the right, the system interprets this movement as a stop command. A minimal sketch of this interpretation logic follows.
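Below is a minimal Python sketch of the three-band interpretation with the stop-between-movements rule. The threshold names follow the paper; the I/O side (how x and y are read) is left out, and the exact band handling is an assumption:

```python
# Sketch of the three-band head-position interpreter; thresholds follow
# the paper's naming, everything else is an illustrative assumption.

class HeadInterpreter:
    def __init__(self, th):
        # th maps 'F', 'B', 'R', 'L' to (thmin, thmax) pairs.
        self.th = th
        self.last = "STOP"

    def interpret(self, x, y):
        candidates = [("FORWARD", y, self.th["F"]),
                      ("BACKWARD", -y, self.th["B"]),
                      ("RIGHT", x, self.th["R"]),
                      ("LEFT", -x, self.th["L"])]
        for name, value, (thmin, thmax) in candidates:
            if value > thmax:
                # Beyond thmax: invalid; interpretation is disabled
                # and the head position must be recalibrated.
                return "RECALIBRATE"
            if thmin <= value <= thmax:
                # Valid band; a new direction issued right after a
                # different movement is first read as "Stop".
                if self.last not in ("STOP", name):
                    self.last = "STOP"
                    return "STOP"
                self.last = name
                return name
        # Dead zone [0, thmin): valid but ignored, for stability.
        return self.last
```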

3) Fusion control mode: Data from the different modes can be fused at both the feature level and the decision level. In this paper, we applied these two fusion strategies and evaluated their performance. For feature level fusion, each variable in each of the vectors V1 and V2 (defined below) is considered as a clue about the desired direction and contributes to the final vector Vfusion. Initially, each element of Vfusion is equal to zero; then the corresponding element of Vfusion is increased for every clue. For example, if the eye direction is left then Vfusion(4) = Vfusion(4) + 1. The final decision is the maximum element of Vfusion.

V1 = (Eye direction, BROW state, SMIRK state, Cognitive action, Cognitive action power, Concentration degree)
V2 = (X head position, Y head position, Calibration state)     (1)

Vfusion = (forward, backward, stop, left, right)     (2)

For decision level fusion, each mode operates separately and delivers its own decision. We were inspired by voting theory [17] to find the final decision: if the three modes give three different decisions, then the final decision is "Stop". A minimal sketch of both fusion strategies follows.
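For illustration, here is a minimal Python sketch of both strategies. The index order of Vfusion follows equation (2), under which the example Vfusion(4) = left holds with 1-based indexing; the clue-extraction rules beyond the eye-direction example are assumptions:

```python
# Sketch of the two fusion strategies; clue-extraction rules are
# assumptions beyond the eye-direction example given in the text.

DIRECTIONS = ["forward", "backward", "stop", "left", "right"]

def feature_level_fusion(clues):
    """clues: list of direction strings, one per feature of V1 and V2."""
    v_fusion = {d: 0 for d in DIRECTIONS}
    for direction in clues:          # e.g. eye direction -> "left"
        v_fusion[direction] += 1     # Vfusion(i) = Vfusion(i) + 1
    # Final decision: the maximum element of Vfusion.
    return max(v_fusion, key=v_fusion.get)

def decision_level_fusion(d1, d2, d3):
    """Majority vote over the three modes' individual decisions."""
    decisions = [d1, d2, d3]
    for d in decisions:
        if decisions.count(d) >= 2:
            return d
    return "stop"   # three different decisions: stop for safety

# Example: feature_level_fusion(["left", "left", "forward"]) -> "left"
```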

IV. RESULTS

The objective of this study was to assess the performance and adaptability of the single-source and multi-source control techniques for an EPW. All our candidates were healthy, aged between 25 and 35, and none of them had ever used an electric wheelchair before. A 3D wheelchair movement simulator realized in our laboratory was used for experimental validation. The candidates started by getting used to the 3D simulator (rooms, walls, etc.). The objective was to drive the wheelchair from the main door to the main bedroom (Figure 6) and to evaluate the time and the error rate for each control technique.

Figure 6. The users' objective

The main observation in this experiment was that users could actually control the wheelchair in the 3D simulator, independently of the control mode and with just a small amount of practice. This is indicated by the low error rate (Table I).

Among the single-source control techniques, the EEG cognitive mode was judged significantly more reliable than the others, while the head movement technique was rated slightly easier.

When using the data fusion techniques, especially feature level fusion, the error rate decreased. However, the time needed to reach the destination increased, which may be explained by the time required for the fusion.

TABLE I. Performance evaluation of wheelchair control techniques

Used Technique              % accuracy    Mean Time (sec)
EEG expressive              85            77
EEG cognitive               86.32         90
EEG expressive/cognitive    88.32         90
Head movement               71.65         54
Fusion (feature level)      91.33         95
Fusion (decision level)     90.64         92


TABLE II. Comparison of multi-source techniques for EPW control

Used Technique            % accuracy
[18]                      80
[19]                      88.66
The proposed technique    91.33

Finally, we compared the proposed fusion technique with other multi-source techniques. Table II shows that the proposed fusion-based control technique gives better results than the other techniques.

V. CONCLUSION

In this paper, we have presented different techniques to control an EPW via EEG signals and head movement. For data acquisition, we used a low-cost and non-invasive headset (EPOC). To reduce the error rate of single-source control techniques, we proposed a data fusion technique using two fusion levels: feature and decision. We succeeded in getting better control results and reducing the error rate. Our work aims to realize new techniques to control an EPW for severely disabled persons without risking their safety, giving them the opportunity to gain autonomy.

REFERENCES

[1] S. Brochard, J. Pedelucq, A. Cormerais, M. Thiebaut, and O. Remy-Neris, "Enquête de satisfaction sur l'équipement technologique des personnes tétraplégiques par blessure médullaire," Annales de réadaptation et de médecine physique, vol. 50, 2007, pp. 78–84.

[2] H. Berger, "Über das Elektrenkephalogramm des Menschen," 1929.

[3] E. Niedermeyer and F. L. da Silva, Niedermeyer's Electroencephalography: Basic Principles, Clinical Applications, and Related Fields, 1999.

[4] I. Iturrate, J. Antelis, A. Kübler, and J. Minguez, "Non-invasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation," IEEE Transactions on Robotics, vol. 25, 2009, pp. 614–627.

[5] T. Carlson and J. del R. Millán, "Brain-controlled wheelchairs: A robotic architecture," IEEE Robotics & Automation Magazine, vol. 20, no. 1, 2013, pp. 65–73.

[6] H. Tran, H. Nguyen, H. Phan, V. Van Toi, T. Ngo, and C. Bui-Thu, "An EEG-controlled wheelchair using eye movements," in 5th International Conference on Biomedical Engineering in Vietnam, 2015, vol. 46, pp. 470–473.

[7] A. Pajkanovic and B. Dokic, "Wheelchair control by head motion," Serbian Journal of Electrical Engineering, vol. 10, no. 1, 2013, pp. 135–151.

[8] R. Berjon, M. Mateos, A. Barriuso, I. Muriel, and G. Villarrubia, "Head Tracking System for Wheelchair Movement Control," in Highlights in Practical Applications of Agents and Multiagent Systems, ser. Advances in Intelligent and Soft Computing, vol. 89, pp. 307–315.

[9] Optical Flow guide. [Online]. Available: http://robots.stanford.edu/cs223b05/notes/CS%20223-B%20T1%20stavensopencv optical flow.pdf

[10] S. D. Suryawanshi, J. Chitode, and S. S. Pethakar, "Voice operated intelligent wheelchair," International Journal of Advanced Research in Computer Science and Software Engineering, vol. 3, 2013.

[11] M. Prathyusha, K. S. Roy, and M. A. Shaik, "Voice and touch screen based direction and speed control of wheel chair for physically challenged using Arduino," International Journal of Engineering Trends and Technology (IJETT), vol. 4, 2013.

[12] G. Pires and U. Nunes, "A wheelchair steered through voice commands and assisted by a reactive fuzzy-logic controller," Journal of Intelligent and Robotic Systems, vol. 34, 2002, pp. 301–314.

[13] G. Massimo, S. Giacomo, C. Silvia, S. Maurizio, and D. Tommaso, "Towards a brain-activated and eye-controlled wheelchair," International Journal of Bioelectromagnetism, vol. 13, 2011, pp. 44–45.

[14] S. O'Regan and W. Marnane, "Multimodal detection of head-movement artefacts in EEG," Journal of Neuroscience Methods.

[15] Emotiv EPOC Software Development Kit, 1.0.0.3. [Online]. Available:http://www.emotiv.com/

[16] F. Ben Taher, N. Ben Amor, and M. Jallouli, "EEG control of an electric wheelchair for disabled persons," in International Conference on Individual and Collective Behaviors in Robotics (ICBR), 2013, pp. 27–32.

[17] A. Urken, “Voting theory, data fusion, and explanations of socialbehavior,” 2011.

[18] K. Ahmed, “Wheelchair movement control via human eye blinks,”American Journal of Biomedical Engineering, vol. 1, 2011, pp. 55–58.

[19] F. Ben Taher, N. Ben Amor, and M. Jallouli, "A multimodal wheelchair control system based on EEG signals and eye tracking fusion," in 2015 International Symposium on Innovations in Intelligent Systems and Applications (INISTA), 2015, pp. 1–8.