
Electronic Health Record and Kinect Natural User Interfaces Games for Physiotherapy

Francisco Cary
Instituto Superior Técnico
Lisbon, Portugal
e-mail: [email protected]

Octavian Postolache
Instituto de Telecomunicações, ISCTE-IUL
Lisbon, Portugal
e-mail: [email protected]

Pedro Silva Girão
Instituto de Telecomunicações, DEEC/IST
Lisbon, Portugal
e-mail: [email protected]

Abstract—This document describes two frameworks intended to compose a health monitoring tool with a strong Electronic Health Record component, focused on physiotherapy for stroke patients. Using the low cost Microsoft Kinect sensor for tracking human motion, a technical framework intends to classify the correctness of the patients' exercises. For quantifying the patients' performance, a mechanism for arm pose and arm gesture recognition was built using a sequence of Artificial Neural Networks. Also using Kinect, a second framework applies virtual reality to physiotherapy through the use of a serious game, and provides an objective evaluation of physiotherapy sessions, as well as a complete Electronic Health Record for each patient. Besides including a serious game that captures 3D data regarding the joints' amplitudes and velocities, it also incorporates a web information system application that enables patient data analysis and visualization. Through the web application, each session is personalized by the therapist in order to allow the serious game to adapt to the patient's physical limitations. Hereupon, this framework intends to be a useful and effective tool for remote physiotherapy sessions, allowing for a considerable cost reduction.

Index Terms—electronic health record, motion analysis, artificial neural network, physical rehabilitation, serious game, remote therapy.

I. INTRODUCTION

Despite being the second worldwide leading cause of death according to the World Health Organization (WHO), stroke disease consequences are still overlooked. It is known that half of European stroke survivors make an incomplete recovery and that half of them need continuous assistance in daily life [1].

Although it is indispensable for improving motor disabilities and quality of life, physiotherapy is often a slow, painful and high-cost process requiring motivation that is often difficult to achieve. Some studies have identified serious games as an effective tool against the discouragement provoked by the monotony of the exercises [2], [3]. The most novel approach is the use of virtual and augmented realities, creating a more engaging interaction between the user and the application. In some cases of stroke rehabilitation, it was noticed that there is a higher chance of improvement for patients using this technology, as training appears to be more challenging, intensive, task-specific and motivating [4].

This work was supported by a grant from the Fundação para a Ciência e Tecnologia and Instituto de Telecomunicações through the Grant PTDC/DTP-DES/1661/2012 and the individual scholarship PEst-OE/EEI/LA0008/2013 - A.

Nowadays, the world is moving towards a convergence of data, and the emergence of Smart Cities is a natural consequence of such behavior. In a Smart City background [5], remote therapy can be helpful by allowing monitored sessions in the home environment and even some cost reduction.

Furthermore, physiotherapists' registering and assessment capabilities are very important as, normally, the evaluation of the patients' improvements is made by direct observation. In order to assure an objective assessment of the rehabilitation progress, the use of sensors that extract motion information in an unobtrusive way, followed by appropriate motion analysis, can be a good solution.

In this context, this paper promotes two rehabilitation-related frameworks, developed for physiotherapy patients characterized by historical stroke events, which focus on different problems. Firstly, an Artificial Neural Network (ANN) framework is presented for technically recognizing correct or incorrect arm physiatric gestures. The user is analyzed and tracked through a low cost Microsoft Kinect sensor. The aim of the ANN framework is to quantify the patients' performance and to provide a mechanism for arm pose and arm gesture recognition. It is intended to be a high-accuracy tool used solely for analyzing specific gestures, without entertainment purposes.

Secondly, also using the low cost Microsoft Kinect sensor, this project promotes a physiotherapy assessment framework that includes a 3D serious game for Kinect as well as a web information system application, which offers quantitative data about the patient's performance during physical therapy sessions and materializes a basic electronic health record for physiotherapy. The serious game, developed through Unity 3D, intends to help the patient perform therapeutic movements. The application is responsible for creating an interactive game using some background information regarding the patient's physical limitations. On the other hand, the web information system provides technical data about the patients that can help therapists during physiotherapy assessments, as well as a strong Electronic Health Record (EHR) component which can be edited by the therapist. All the therapy data is stored in a Microsoft SQL Server database and is accessed through a .NET WCF Service.


II. RELATED WORK

A. Serious Games and Motion Capture Approaches

There are several approaches which can be taken for capturing humans' motion data while exercising. Vicon [6] commercializes high-precision systems for optical motion capture, capable of retrieving information about a body's velocity, distance and joint angles. Nevertheless, besides being very expensive [7], this approach requires markers or sensors attached to the user, which are difficult to calibrate and provoke discomfort.

Other intrusive, yet less costly, solutions involve techniques such as electromyography, where the electrical activity in a subject's body is used for muscular monitoring [8], and accelerometry-based ones, where accelerometers are bonded to the user's body enabling the collection of motion data [9]. Arteaga et al. [9] analyzed the posture of stroke survivors with this technique.

A common source of human tracking sensors are the ones developed by the video game industry. Tanaka et al. [10] studied and compared the reliability of Microsoft Kinect, Sony PlayStation Move and Nintendo Wii systems for rehabilitation purposes. Their availability combined with their low price gives them an important advantage in medical applications. Deutsch et al. [11] studied the evolution of a patient with cerebral palsy while performing several sessions playing Nintendo Wii. Although improvements were obvious, these games were not made for rehabilitation purposes, therefore they lack technical data which can help physiotherapy assessments.

Among the low cost solutions, Kinect is a non-intrusive one and seems appropriate for motion analysis in medical applications. Furthermore, several studies have confirmed that, although presenting expected discrepancies, this sensor achieves competitive tracking measurements compared with high-precision optical systems [7], [12], [13].

Regarding Kinect, research has been made on physiotherapy gaming applications using this sensor. Rahman et al. [14] used Kinect to interact with the well-known Second Life virtual world [15] to record physiotherapy sessions. In another work [3], five games from Kinect Adventures were played in order to restore some functionalities in a patient with a history of traumatic brain injury. Although the benefits of virtual reality were highlighted in both, these games were not made for physiotherapy purposes, so they are less efficient for the patient's rehabilitation and do not provide useful information about the patient's performance.

Moreover, some serious games were developed specifically for physiotherapy [2], [16]–[21]. Although these approaches were identified as a source of encouragement for patients, some of them lack integration with a web application enabling remote sessions to be subsequently analyzed [2], [16], [18], [19], and the others do not provide significant technical information which can help the physiotherapy assessment, underestimating the EHR component [17], [20], [21].

Frameworks such as the ones previously referred to can be easily integrated in a Smart Health environment. Therefore, inserted in such a networked context, our solution enables an increase in efficiency and service quality and a consequent decrease in costs [5], [22]–[24].

B. Machine Learning and Motion Analysis

To classify the patient's exercises, it is crucial to effectively recognize human gestures. A lot of research has been done on this topic, as it is useful for a wide range of applications, such as gaming, surveillance, monitoring or motion capture. To analyze the quality of the movements of a given patient in rehabilitation terms, the body joints' amplitudes should be considered. In order to choose an approach, previous work on human pose and movement recognition was studied [25]–[31].

Patsadu et al. [25] compared four approaches for pose classification: Backpropagation Neural Network (BNN), Support Vector Machine (SVM), Decision Tree and Naive Bayes. Although the implemented systems were only built to distinguish among three simple poses, the BNN and SVM stand out over the other approaches. The Naive Bayes classifier assumes that all the features contribute independently to the final classification; however, body joint positions characterizing the same pose are obviously dependent on each other. Furthermore, using Decision Trees for the classification process yielded good results, although worse than for BNN and SVM. Nevertheless, it should not be forgotten that, with a set larger than three poses, these results may not generalize.

Miranda et al. [30] proposed a method for recognizing poses and gestures based on SVM classification. Despite the high recognition rate obtained, the classifier was trained for very different gestures, characterized by low correlated motions. Considering a different approach, Lee [27] used a Hidden Markov Model (HMM) for movement recognition, achieving good results. Hachaj et al. [31] proposed a rule-based approach for pose and movement recognition, introducing a Gesture Description Language. Although it achieves some high recognition rates, this approach might be too complex for a nested space of gestures.

A method for pose and gesture recognition involving machine learning is presented in this document. This system must effectively detect a correct gesture, but should be capable of adapting to users with limited mobility, such as physiotherapy patients. Our approach involves a sequence of ANNs organized in a hierarchy, further explained in detail.

III. KINECT NEURAL NETWORK MOTION ANALYSIS

This application intends to improve physiotherapy assessments by providing information to the therapists regarding the correctness of the exercises performed. It is known that more than half of stroke survivors experience greater impairment in the upper limb than in the lower limb [32]. Furthermore, Kinect has a limited tracking range; for example, tracking a user during gait with Kinect is a difficult task. In summary, it can be said that Kinect may be a better solution for upper limb analysis. For these reasons, this application is focused on arm and shoulder rehabilitation for stroke patients. We chose an ANN approach since it achieved positive results in previous


works [25], [26], is less complex to implement using C#, and works well with dependent variables, such as body joints related to the same body pose. A preliminary work on this subject was previously described by Cary et al. [33].

A. Processing Skeleton Data

Fig. 1. Twenty tracked joints of the user's body using the Kinect Skeletal Tracking.

Recognizing a gesture involves several tasks. Firstly, it is necessary to choose a representation for the skeleton data. Ideally, our skeleton representation should guarantee invariance to the sensor's position, to the body's distance to the sensor and to the user's body structure. The Skeletal Tracking System, from the Microsoft Kinect SDK, provides for each frame a set of twenty joints represented as 3D points, as shown in figure 1. Several approaches for treating Kinect's data were considered, but the one followed is inspired by the work developed by Miranda et al. [30] and Raptis et al. [28].

Torso Joints: The seven joints characterizing the torso are not considered critical for identifying a pose. Besides, torso joints are characterized by a strongly correlated motion and can almost be seen as a rigid body. Principal Component Analysis (PCA) [34] is performed over these points (hip, spine and shoulders). As the skeleton is represented in a three-dimensional space, three orthonormal vectors are obtained, {~u, ~s, ~t}, resulting in a 3D basis called the torso basis. If the most common case is considered, the user is in front of the sensor performing natural gestures and the directions of the principal components can be predicted. The first vector ~u will have the same direction as the longer dimension of the torso, while the second principal component ~s will be aligned with the shoulders. The third principal component is orthogonal to the other two and can be calculated as their cross product.
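A minimal sketch of this step is given below, assuming joint positions already converted to 3D vectors. The full PCA over the seven torso joints is approximated here by the predicted principal directions described above; the class and parameter names are illustrative, not the authors' API.

```csharp
using System.Numerics;

static class TorsoBasis
{
    public static (Vector3 u, Vector3 s, Vector3 t) Compute(
        Vector3 hipCenter, Vector3 shoulderCenter,
        Vector3 shoulderLeft, Vector3 shoulderRight)
    {
        // First component u: the longer dimension of the torso.
        Vector3 u = Vector3.Normalize(shoulderCenter - hipCenter);

        // Second component s: the shoulder line, made orthogonal to u.
        Vector3 s = shoulderRight - shoulderLeft;
        s = Vector3.Normalize(s - Vector3.Dot(s, u) * u);

        // Third component t: orthogonal to the other two, i.e., their cross product.
        Vector3 t = Vector3.Cross(u, s);
        return (u, s, t);
    }
}
```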

The remaining joints are represented in spherical coordinates in relation to a local basis and are divided in two main groups defined hierarchically. The joints directly connected to the torso, such as the elbows and the knees, are called first level joints. Following the same logic, the joints directly connected to the first level ones, such as the wrists and ankles, are called second level joints. The head, the hands and the feet are not considered to influence our targeted physiotherapy movements and are ignored.

1) First Level Joints: First level joints are the ones adjacent to the torso. Both elbows and both knees are included in this group. Considering as an example the right elbow, let ~o be the vector defined between the directly connected torso joint, i.e., the right shoulder, and the joint considered. The vector ~o can be represented in spherical coordinates in relation to the torso basis.

Recalling the spherical coordinate system, a vector can be defined by three variables: the radius r, the zenith angle θ and the azimuth angle φ. For this application, ~u was chosen as the zenith direction and the plane defined by ~s and ~t was chosen as the reference plane. Figure 2 represents the spherical coordinates of a vector ~o in relation to a random configuration of the vectors from the torso basis. The zenith angle θ is the angle between the vector ~o and the zenith direction ~u. Furthermore, the azimuth angle φ is the angle between the vector ~s and the projection of ~o onto the plane defined by ~s and ~t.


Fig. 2. Spherical coordinates of vector ~o in relation to the torso basis, considering a random configuration.

With this approach, each first level joint is represented by two angles, θ and φ. In this case, the radius r = ‖~o‖ is the distance between the right elbow and the right shoulder and is not considered for classifying the joint, as it only depends on the user's body structure.
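The conversion to (θ, φ) can be sketched as follows, under the conventions above: ~u is the zenith direction and the (~s, ~t) plane is the reference plane. The method name is illustrative.

```csharp
using System;
using System.Numerics;

static class FirstLevelJoint
{
    public static (double theta, double phi) Angles(
        Vector3 joint, Vector3 torsoParent, Vector3 u, Vector3 s, Vector3 t)
    {
        // e.g., right shoulder -> right elbow, normalized (radius is discarded).
        Vector3 o = Vector3.Normalize(joint - torsoParent);

        // Zenith angle: angle between o and the zenith direction u.
        double theta = Math.Acos(Vector3.Dot(o, u));

        // Azimuth angle: angle between s and the projection of o onto the
        // (s, t) plane; Atan2 keeps the correct quadrant.
        double phi = Math.Atan2(Vector3.Dot(o, t), Vector3.Dot(o, s));
        return (theta, phi);
    }
}
```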

2) Second Level Joints: Second level joints are the ones adjacent to the first level joints, comprising the wrists and the ankles. Considering as an example the right wrist, let ~p be the vector defined between the directly connected joint, in this case the right elbow, and the joint considered. Following the same logic, the vector ~p can be represented in relation to a specified referential. In most common situations, the second principal component ~s has the same direction as the shoulder line. The torso basis is rotated by the angle α = arccos(~o · ~s) around the axis ~a = ~o × ~s, defining a new basis {~u_new, ~s_new, ~t_new}.

The vector ~s_new is aligned with the vector ~o. For this kind of joint, ~s_new was chosen as the zenith direction and the plane defined by ~u_new and ~t_new as the reference plane. This seems a good approach, as the vector ~s_new is the most descriptive vector of the second level joint's deviation around its natural position. With this configuration, the zenith angle θ is the angle between the vector ~p and the zenith direction ~s_new. Furthermore, the azimuth angle φ is the angle between the projection of ~p onto the plane defined by ~u_new and ~t_new and the vector ~t_new, from the new basis.

As in the previous case, the radius r = ‖~p‖, representing the distance between the right elbow and the right wrist, is not considered for classifying the joint, as it only depends on the user's body structure. Therefore, a second level joint is also defined by two angles, θ and φ.
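The basis rotation can be sketched with Rodrigues' rotation formula, as below. The vectors ~o, ~u, ~s and ~t are assumed to be unit vectors, and the axis sign is chosen so that the active rotation maps ~s onto ~o, the alignment goal stated above.

```csharp
using System;
using System.Numerics;

static class SecondLevelBasis
{
    static Vector3 Rotate(Vector3 v, Vector3 axis, float angle)
    {
        // Rodrigues: v' = v cos(a) + (k x v) sin(a) + k (k . v)(1 - cos(a))
        Vector3 k = Vector3.Normalize(axis);
        float c = MathF.Cos(angle), s = MathF.Sin(angle);
        return v * c + Vector3.Cross(k, v) * s + k * (Vector3.Dot(k, v) * (1 - c));
    }

    public static (Vector3 uNew, Vector3 sNew, Vector3 tNew) Align(
        Vector3 u, Vector3 s, Vector3 t, Vector3 o)
    {
        float alpha = MathF.Acos(Vector3.Dot(o, s)); // angle between o and s
        Vector3 axis = Vector3.Cross(s, o);          // rotation axis
        return (Rotate(u, axis, alpha),
                Rotate(s, axis, alpha),              // sNew ends up parallel to o
                Rotate(t, axis, alpha));
    }
}
```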

3) Final Feature Vector: In the solution achieved, each first and second level joint is represented by two angles. Joints such as the hands, the head and the feet are not considered to describe the targeted physiotherapy gestures. The feature set is defined by the eight pairs of angles, θ and φ, representing the joints, and by the angle β = arccos(~t · ~z), which describes the inclination of the body, where ~z is Kinect's depth axis. Therefore, the feature set is a 17-dimensional vector ~f = (θ1, φ1, ..., θ8, φ8, β) ∈ R17.
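Assembling the feature vector is then a matter of concatenation, as sketched below; the pairs array is assumed to hold the (θ, φ) angles of the eight first and second level joints in a fixed order (elbows, knees, wrists, ankles).

```csharp
using System;
using System.Numerics;

static class Features
{
    public static double[] Build((double theta, double phi)[] pairs,
                                 Vector3 t, Vector3 depthAxis)
    {
        var f = new double[17];
        for (int i = 0; i < 8; i++)
        {
            f[2 * i] = pairs[i].theta;      // θ1 .. θ8
            f[2 * i + 1] = pairs[i].phi;    // φ1 .. φ8
        }
        // β: inclination of the body, the angle between t and Kinect's depth axis.
        f[16] = Math.Acos(Vector3.Dot(Vector3.Normalize(t),
                                      Vector3.Normalize(depthAxis)));
        return f;
    }
}
```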

B. Static Pose Recognition

The method used for classifying human poses involves a hierarchy of fully-connected feed-forward Artificial Neural Networks, shown in figure 3, where each circle represents an ANN. This approach was chosen in order to diminish the data complexity for each neural network. The inputs of all the ANNs are the features described in the previous section, which represent a human pose captured by Kinect, while the outputs take values between 0 and 1 (log-sigmoid function). Furthermore, all of them were trained with the Levenberg-Marquardt backpropagation algorithm [35], using the MATLAB Neural Network Toolbox, which supports supervised learning for feed-forward ANNs.

The set of poses used for training the system is shown in figure 4. As shown in figure 3, the ANN hierarchy is formed essentially by four ANNs: the first-level ANN, the right-arm ANN, the left-arm ANN and the both-arms ANN. The first level of the hierarchy is formed by one ANN, which classifies the input pose among three possible outputs: a right-arm-move, a left-arm-move or a both-arms-move. The second level of the hierarchy is composed of three ANNs, each one developed to distinguish the pose according to the first-level classification. For example, if the pose is a right-arm-move, there is one ANN which classifies the pose among the ones represented in figures 4(b), 4(c) and 4(h). Similar second-level ANNs distinguish poses classified in the first level as a left-arm-move or a both-arms-move.

Fig. 3. ANN tree with two levels: the first-level ANN routes each pose to the right-arm, left-arm or both-arms ANN.
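The two-level dispatch can be sketched as follows. Each trained network is abstracted as a function from the 17-dimensional feature vector to its log-sigmoid output activations; the class and member names are illustrative.

```csharp
using System;
using System.Linq;

sealed class PoseHierarchy
{
    readonly Func<double[], double[]> firstLevel;    // right / left / both arms
    readonly Func<double[], double[]>[] secondLevel; // one ANN per category

    public PoseHierarchy(Func<double[], double[]> first,
                         Func<double[], double[]>[] second)
    {
        firstLevel = first;
        secondLevel = second;
    }

    // Winning output: the unit with the highest log-sigmoid activation.
    static int ArgMax(double[] outputs) => Array.IndexOf(outputs, outputs.Max());

    public (int category, int pose) Classify(double[] features)
    {
        int category = ArgMax(firstLevel(features));        // e.g., right-arm-move
        int pose = ArgMax(secondLevel[category](features)); // final pose label
        return (category, pose);
    }
}
```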

Fig. 4. Set of ten body poses, (a) Pose 1 through (j) Pose 10, used for the training algorithm.

The set of poses used only involves arm movements. Nevertheless, this method is scalable and supports an extended data set, including different pose classifications. One option is to increase the number of outputs of the first ANN and train new ANNs for the second level. For example, if there is an intention to classify a new data set which includes leg movements, the first ANN could have two more outputs for distinguishing a right-leg-move and a left-leg-move. Therefore, two more ANNs should be created in the second level to distinguish between all the pose possibilities in case the first ANN output is a right-leg-move or a left-leg-move.

However, if the poses cannot be classified easily with two levels, the hierarchy depth can be increased too. Picking up the previous example, if the set includes leg poses, a zero level can be introduced to classify between leg poses and arm poses. The next level would contain two ANNs: one for classifying the arm poses into categories, equivalent to the one already described, and another one for classifying the leg poses into categories, such as right-leg-move and left-leg-move. The second level is again responsible for providing a final pose classification. The arms' ANNs can remain the same, but it is necessary to create two ANNs, one for classifying the poses characterized by a left-leg-move and another for classifying the poses characterized by a right-leg-move.

C. Arm Gesture Recognition

Similarly to Miranda et al. [30], a gesture is defined as a sequence of detected poses. A gesture is only recognized if all the poses which define it are detected. The seven gestures built for our application are described in figure 5. For example, looking at figure 4, it is possible to observe that if a user performs a gesture equivalent to lifting both arms up, the sequence of poses detected is composed of poses 1, 6 and 7, described respectively by figures 4(a), 4(f) and 4(g).

The gesture recognition was implemented as a Decision Tree with only one branch, where each node represents a pose.


Fig. 5. Sequences of poses defining the seven arm gestures: (a) 1, 2, 3; (b) 1, 4, 5; (c) 1, 6, 7; (d) 1, 8; (e) 1, 9; (f) 1, 10; (g) 6, 1, 6.

When a pose is detected, all the trees are tested. Each gesture detection tree has a pointer indicating the next expected pose in the sequence for the specific gesture. If the detected pose equals the expected one, the pointer moves one position further along the tree. When the last pose is reached, an identifier flag is set to true and the gesture is correctly recognized. Additionally, when building a tree, a maximum time delay between two detected poses is defined, which was set for all trees as 1.5 seconds. As soon as this limit is reached, if the expected pose has not been recognized, the tree pointer is reset to the first position and the timer is reset to zero.
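The mechanism can be sketched as below: a one-branch "tree" is a fixed sequence of pose identifiers matched against the pose classifier's output stream, with the 1.5 s limit between consecutive expected poses. The class name is illustrative.

```csharp
using System;

sealed class GestureMatcher
{
    readonly int[] sequence;  // e.g., {1, 4, 5} for lifting the left arm up
    readonly TimeSpan maxDelay = TimeSpan.FromSeconds(1.5);
    int next;                 // pointer to the next expected pose
    DateTime lastMatch;

    public GestureMatcher(int[] poseSequence) { sequence = poseSequence; }

    // Fed with every detected pose; returns true when the gesture completes.
    public bool OnPoseDetected(int pose, DateTime now)
    {
        if (next > 0 && now - lastMatch > maxDelay)
            next = 0;                       // timeout: reset pointer and timer

        if (pose == sequence[next])
        {
            lastMatch = now;
            if (++next == sequence.Length)  // last pose reached
            {
                next = 0;
                return true;                // gesture recognized
            }
        }
        return false;
    }
}
```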

IV. REMOTE PHYSIOTHERAPY SYSTEM OVERVIEW

The second framework developed is composed of three main independent modules: a serious game, a web application and an intermediate server. A preliminary work on this subject was already described by Cary et al. [36]. The middle server's purpose is to provide an interface between the other two modules and the database, managed by a Microsoft SQL Server. A .NET WCF Service is available for enabling simple communication between the modules. The serious game was developed using Unity 3D and C# scripting. It uses the Kinect sensor to capture the patient's data and sends it to the database through the middle server. Finally, the web application's aim is to allow a simple visualization of the stored data about the patient's performance, mainly through the use of charts. It is also built using Microsoft technology, more specifically ASP.NET.
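A possible shape for the middle server's WCF contract is sketched below; the operation names and the SessionSample data contract are illustrative assumptions, not the authors' actual API.

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

[ServiceContract]
public interface ITherapyDataService
{
    // Called by the serious game to persist captured motion data.
    [OperationContract]
    void StoreSample(int patientId, int sessionId, SessionSample sample);

    // Called by the web application to plot a session's charts.
    [OperationContract]
    SessionSample[] GetSessionSamples(int patientId, int sessionId);
}

[DataContract]
public class SessionSample
{
    [DataMember] public double Time;
    [DataMember] public double LeftShoulderAmplitude;
    [DataMember] public double RightShoulderAmplitude;
    [DataMember] public double LeftHandVelocity;
    [DataMember] public double RightHandVelocity;
}
```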

Fig. 6. System global overview: an unsupervised patient at home and a supervised patient at the physiotherapy clinic interact through Kinect sensors, while the middle server, the database and the therapists' web application are reached through the Internet.

Figure 6 represents the global architecture of the proposed system. The three modules interact through an Internet connection. Patients and physiotherapists are the two kinds of users defined. While playing the serious game, the first type of user interacts with the system and technical data about the gestures performed is recorded. The sessions may be performed in a health facility with the patient under supervision, or in a home environment without supervision. Following this, the therapists access this information through the web application, using a computer or a tablet. The patient's motion data is presented through charts, and a report is generated, allowing the physiotherapists to use this system as a tool for helping them in their assessments.

V. SERIOUS GAME

Microsoft Kinect was chosen for developing the serious game because of its non-intrusive properties and its low-cost/high-precision compromise. Paavola et al. [3] analyzed the benefits of Kinect in the recovery of some functionalities in a patient with a traumatic brain injury. In that work [3], a set of games for Xbox, called Kinect Adventures, was used, and after some sessions improvement was observed. This might mean that existing games have the necessary properties to help in physiotherapy approaches. However, this assumption should be complemented with a precise evaluation.

Serious games are games designed for purposes other than just entertainment. Following this definition, the game developed has two main purposes. Firstly, it is intended to provide the entertainment factor often related with serious games [2], which has an enormous importance in providing encouragement, motivation and well-being during physiotherapy sessions. The other aim of this serious game is to provide feedback according to the patient's performed motion. The serious game does not replace the web application: the feedback given during the game about the user's performance is provided in order to enable self-correction while playing. A developed database assures the physiotherapy data storage as part of an Electronic Health Record.

1) Therasoup Gameplay: A screenshot of the developed serious game, Therasoup, is shown in figure 7. During the game, the player controls the avatar through the Kinect sensor and is supposed to pick the ingredients placed on shelves and to put them in a pan at the center, as can be observed in figure 7(a). This game tries to simulate a daily life activity such as cooking. The box in the bottom-left corner indicates with a semaphore whether any hand has an object attached. Moreover, the box at the top-right corner, shown in detail in figure 7(b), displays information about the amplitudes of the shoulder joints and the velocities of each hand in real time. Finally, the box in the top-left corner presents the playtime and the score chart. The score chart, also represented in figure 7(c), is updated in real time according to the height of the shelves reached by the player, and it follows a decreasing exponential decay along time in order to stimulate dynamic game playing.

In summary, this game intends to work on two parallel skills: coordination and motor capabilities. In fact, while picking the objects from shelves and putting them in a pan, the patient is improving his coordination skills. Besides this advantage, the game is obviously working on the patient's physical capabilities. This serious game is focused on arm movements, more specifically on arm extension and shoulder abduction.


Fig. 7. Therasoup gameplay: (a) Picking an object, (b) Shoulder amplitudes and hand velocities, (c) Time-decreasing score chart.

The lateral shelves are divided in three intervals of height, chosen such that the user must perform 60, 90 and 120 degrees of amplitude in the shoulder joint with the arm stretched. The frontal shelves are divided in four intervals of height, chosen such that the user must perform 60, 90, 120 and 150 degrees of amplitude in the shoulder joint with the arm stretched. Each shelf is characterized by a flag that has one of four colors: green, yellow, red and black. As the required amplitude to reach the shelf increases, the flag color goes from green to black, and the score awarded for picking an object from that shelf also increases.
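The shelf and scoring mechanics can be sketched as below. The text only states that higher shelves award more points and that the score award decays exponentially over time; the base points per tier and the decay rate used here are illustrative assumptions.

```csharp
using System;

static class Scoring
{
    // Flag color for the shoulder amplitude (degrees) a shelf requires.
    public static string FlagColor(double requiredDegrees)
    {
        if (requiredDegrees <= 60)  return "green";
        if (requiredDegrees <= 90)  return "yellow";
        if (requiredDegrees <= 120) return "red";
        return "black"; // 150 degrees, frontal shelves only
    }

    // Points awarded for a correct pick: a larger base reward for higher
    // shelves, modulated by an exponential decay along elapsed playtime.
    public static double PickAward(double basePoints, double elapsedSeconds,
                                   double decayRate = 0.01)
    {
        return basePoints * Math.Exp(-decayRate * elapsedSeconds);
    }
}
```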

2) Therasoup Results Interface: After the playtime expires, the windows shown in figure 8 are presented to the player. Firstly, the user is asked to classify the amount of pain felt during the game. This is a qualitative analysis that should be provided by editing the colored bar, moving the gray rectangle. The patient should open his right or left arm to move the bar to the right or to the left, respectively, as represented in figure 8(a). After choosing the bar's value, the player should open both arms symmetrically and the window represented in figure 8(b) appears.

At this stage, the patient is given some information about his performance. In the top-left corner it is possible to observe the game duration, the points scored and the maximum amplitudes achieved by the shoulder joints. In the top-right corner there is a smiley face representing the pain felt by the user. Finally, at the bottom, two line graphs are shown. In both, the x-axis represents time, the red dots refer to the left arm and the green dots refer to the right arm. Moreover, the left graph's y-axis represents the hand joints' velocities and the right graph's y-axis represents the shoulder joints' amplitudes.

Fig. 8. Therasoup game windows: (a) Pain scale interface, (b) Rehabilitation metrics interface.

Each pick of an object performed during gameplay represents a different instant of time.

The information provided in the serious game must be summarized and simple to analyze. Its purpose is to motivate the patient by showing a quick overview of the progress made. It can also be verified by the physiotherapist, although more detailed information can be found in the web application.

VI. WEB INFORMATION SYSTEM

One of the web information system's main purposes is to provide a simple interface where the physiotherapists can visualize the patient's medical information. It provides a graph tool for analyzing the physiotherapy session data captured by the Kinect sensor and by the serious game. Furthermore, another aim of this framework is to offer an EHR component, so that medical information regarding the patients' health can be inserted and consulted. This application was developed using the ASP.NET framework, compatible with HTML5, CSS3 and JavaScript, and follows the Model-View-Controller (MVC) pattern. The graphical interface is composed of several views, i.e., web pages. The interaction between the web application's views, starting at Home, can be seen in figure 9. Each green square represents one web page and the bullets inside them represent the information that is provided in those views. All the links between pages are represented by arrows.


Fig. 9. Web application view organizational model: from Home, users reach Contact, About, Register, Login, Manage, Admin (user search and list) and PatientAnalysis (patient search and list); PatientManageUser presents the patient's personal information, session list and session data, and links to PatientSessionExportPDF, to the PatientElectronicHealthRecord view (personal and medical information) and to its edit page. Links are color-coded according to the user types that may follow them (administrator, therapist, patient or all users).

A user can be cataloged as one of three types: Administrator, Therapist or Patient. Different users have access to different views. In figure 9, the links are distinguished by the users' type of access through the use of different colors. For example, the PatientManageUser view is available to all users and displays rehabilitation-related data for a given patient. The information presented includes graphs similar to the ones shown in the serious game, however more detailed. Furthermore, it is in this page that, for each arm, the therapist can modify the existing restrictions regarding arm abduction and arm extension exercises. This information is stored in the database so the serious game can automatically adapt to the defined constraints. In this view, there is a button which redirects the user to the patient's EHR page (PatientElectronicHealthRecord view), where the displayed information includes weight, body mass index, past medical history, secondary health problems, important environmental factors, important personal factors, previous exams history and current medication. As described in figure 9, a Therapist or an Administrator can edit the information shown in this page.
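This per-view access control maps naturally onto ASP.NET MVC role-based authorization filters, as sketched below; the controller, action and role names are illustrative and not necessarily the ones used in the actual application.

```csharp
using System.Web.Mvc;

public class PatientController : Controller
{
    // PatientManageUser: available to all three user types.
    [Authorize(Roles = "Administrator,Therapist,Patient")]
    public ActionResult PatientManageUser(int patientId)
    {
        return View();
    }

    // Editing the EHR is restricted to therapists and administrators.
    [Authorize(Roles = "Administrator,Therapist")]
    public ActionResult EditPatientElectronicHealthRecord(int patientId)
    {
        return View();
    }
}
```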

VII. EXPERIMENTAL RESULTS

A. Arm Motion Classification - Results and Discussions

Before training the ANN framework for arm motion recognition, it was necessary to collect real data to build a training set containing pairs of inputs and desired outputs for each ANN. Firstly, a software tool was developed to build the training set by capturing human motion data from Kinect. Each subject from a group of twenty-six people was placed at a distance between 2.30 and 2.60 meters from the sensor and was asked to mimic a pre-programmed skeleton displayed by the training software, which performed the sequence of ten poses defined in figure 4. The users were tracked during the process. Afterwards, the data was used to train the networks using the MATLAB Neural Network Toolbox.

In order to test the algorithms described, another software tool was created for human pose and arm gesture classification, which strictly follows the ANN configurations obtained after the training process. The software interface is shown in figure 10 for a user, in green, performing gesture b, i.e., lifting the left arm up. Gesture b is identified as a sequence of three detected poses: 1, 4 and 5. Recall that all the poses and gestures were defined in figures 4 and 5. During testing, five subjects were placed at a distance between 2.30 and 2.60 meters from the sensor, performing each of the ten poses five times and each of the seven gestures ten times.

Fig. 10. Software developed for testing the arm motion recognition system: Pose 5 identified, gesture b recognized.

The system's assessment was made considering the pose recognition and the gesture recognition classifiers as separate modules, so that they could be tested independently. The experimental results for the pose recognition stage are shown in table I. As is easily observed, although the accuracy is significantly high for all poses, for the last three poses it is slightly smaller. This effect occurs because, while obtaining the training set, these poses were the ones with higher variance between candidates, as they were more difficult and less likely to be executed in the same precise way. This led to higher training errors and therefore to higher classification errors.

TABLE I
RESULTS FOR THE DETECTION OF TEN POSES FOR FIVE INDIVIDUALS PERFORMING THE POSES FIVE TIMES EACH.

Poses        1    2    3    4    5    6    7    8    9    10
User 1       5    5    5    5    5    4    5    5    3    4
User 2       5    4    5    5    5    5    5    4    5    5
User 3       5    5    5    5    5    5    5    3    5    4
User 4       5    4    3    5    4    4    5    5    2    3
User 5       5    5    5    5    5    4    5    4    5    5
Accuracy %   100  92   92   100  92   96   100  84   80   84

Table II presents the results for the gesture recognition with the same five users. One can easily notice that, in general, the accuracy is lower than the one verified in the pose recognition table. This was expected since the training set only includes data retrieved from users performing static poses, whereas the recognition is performed over moving users. In fact, during a gesture, the poses are reached with a given momentum, provoking some oscillations in the body. This can be seen as noise by the gesture classifier and could be suppressed with a training set including poses captured from users in movement, i.e., captured from users performing a gesture continuously, instead of the captured static poses.


TABLE II
RESULTS FOR THE DETECTION OF SEVEN GESTURES FOR FIVE INDIVIDUALS PERFORMING THE GESTURES TEN TIMES EACH.

Gestures     a    b    c    d    e    f    g
User 1       8    7    4    7    8    7    9
User 2       7    8    6    5    8    7    6
User 3       7    8    10   9    7    9    8
User 4       7    10   8    7    6    7    10
User 5       9    10   9    9    8    9    9
Accuracy %   76   86   74   74   74   78   84

Furthermore, we have identified a problem in the classification. While the user is performing the gestures, some poses experience an unstable classification: sometimes a pose is wrongly recognized for milliseconds, followed by the recognition of the correct pose. As all the outputs from the pose classifier are inserted in the decision trees for the gesture classification, this can lead to a correct gesture classification followed by an incorrect one, or vice versa. One way to circumvent this problem is to combine the gesture classifier with some rules based on the joint amplitudes achieved, eliminating impossible classifications. Another solution would be to apply some sort of low-pass filter over the outputs of the pose classifier, in order to eliminate these high-frequency classification errors.
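One simple form of such a low-pass filter is sketched below: a pose label is forwarded to the gesture trees only if it wins a majority vote over a short sliding window, suppressing millisecond-long flips. The window length is an illustrative assumption.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

sealed class PoseDebouncer
{
    readonly Queue<int> window = new Queue<int>();
    readonly int size;

    public PoseDebouncer(int windowSize = 5) { size = windowSize; }

    // Returns the stable pose label, or null while no label dominates.
    public int? Filter(int rawPose)
    {
        window.Enqueue(rawPose);
        if (window.Count > size) window.Dequeue();

        // Majority vote over the sliding window of recent classifications.
        var best = window.GroupBy(p => p)
                         .OrderByDescending(g => g.Count())
                         .First();
        return best.Count() * 2 > window.Count ? best.Key : (int?)null;
    }
}
```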

B. Kinect Rehabilitation Framework - Case Study

As already described, during each session playing the Therasoup game, the user is tracked through the Microsoft Kinect sensor, with all the data sent to a server and stored in a database. Afterwards, using the web application, the information referring to the user's performance can be observed. In order to test the proposed framework, a case study was developed where one user performed ten sessions of one minute each, and the overall results were analyzed. Moreover, data regarding two sessions was compared in detail. To visualize the data, the database was accessed through MATLAB and graphs similar to the ones presented in the web application were plotted.

1) Overall Performance Evolution: At the end of each session, the patient is invited to provide the Pain Score, i.e., a qualitative measure of the pain felt during the game. The Pain Score, together with the Game Score, is sent to the database. During the game, the Game Score increases whenever an ingredient is correctly picked up from a shelf and dropped in the central pan. An object pick is considered correct if the shoulder amplitudes are the ones expected for the given shelf. If the object is picked up with an incorrect body position, a negative score is added, i.e., the Game Score decreases. Furthermore, the higher the shelf the ingredients are picked up from, the higher the reward in points.

Accessing the database, it is possible to compare the overall evolution of the Pain Score and Game Score variables, which are a measure of the patient's evolution. With this in mind, the graph shown in figure 11 was created, representing the Pain Score and the Game Score values over the ten sessions performed by the user testing the application. The x-axis represents each of the referred ten sessions. The red bars display the Pain Score obtained in each session; as can be observed on the right-side y-axis, the Pain Score takes values between 0 and 1. The blue bars represent the Game Score achieved in each session, characterized by the left-side y-axis. This variable takes arbitrary positive values.

Fig. 11. Pain Score and Game Score history over the previous ten rehabilitation sessions.

At first sight, it is possible to observe that the Game Score increases over the ten sessions. When the difference between two sessions is small, it means that the user has improved the overall body positioning when picking up objects, avoiding the negative points that would be added to the score in case of incorrect positioning, while maintaining the average height of the shelves from which the ingredients were picked. When the difference between two sessions is significant, it probably means that the user has continuously reached higher shelves from session to session. This behavior is rewarded with more points added to the score.

The Pain Score classification is obviously related with the Game Score, and similar conclusions can be drawn. While the user's average shelf height remains the same, the user is improving the body position and it is natural that the Pain Score decreases as the body is frequently exercised. On the other hand, when the user starts reaching higher shelves, it is probable that the Pain Score increases in the short term, even though the Game Score can increase. This behavior can be observed when comparing sessions 6 and 7, or sessions 9 and 10. After enough exercising, the user's Pain Score classification decreases.

2) Session Comparison: During each session, information about the user's motion is captured at a constant rate of 4 samples per second. Moreover, whenever an object is picked up, an extra sample is taken. Using MATLAB, the joints' data from two consecutive sessions was retrieved from the database and displayed in the graphs shown in figures 12 and 13. For session 1 and session 2, figures 12(a) and 13(a) represent the shoulder amplitudes performed by the patient, while figures 12(b) and 13(b) represent the hand velocities achieved at the same instants of time. In these graphs, points are distinguished between samples taken at a constant rate (small crosses) and samples taken when an object was picked up (big circles).


Fig. 12. Graphs representing data from the first session: (a) Shoulder joints' amplitudes evolution, (b) Hand joints' velocities graphical representation.

Furthermore, the left arm is characterized by red dots, while the right arm is described by green dots. It is important to notice that the samples taken in a pick-object action (big circles) are more descriptive of the patients' difficulties during the game.

In the first session, the user is in an early rehabilitation stage. Looking at figure 12, the physiotherapist can easily conclude that, in session 1, similar maximum amplitudes and velocities were achieved for both arms, since the red and green big circles have similar maximum values in both graphs. Moreover, it is possible to verify that, along time, both hands' speed decreases from a maximum around 1.0 to a minimum of 0, which suggests that the patient becomes tired very quickly. This assumption is confirmed by the fact that the patient's shoulder amplitude decreases jointly with the hand velocity. At this point, the physiotherapist can give the patient some advice based on the information observed. In the second session, the user can use the feedback received regarding the first session.

Looking at the second session's motion data represented in figure 13, the physiotherapist can analyze the evolution between the two sessions and check the user's performance. At first sight, it is possible to observe that, although the patient picked up one object less than in session 1 (previously 12 objects, currently 11 objects), much higher shoulder amplitudes and hand velocities were achieved with both arms (140 degrees vs. the 90 degrees of the first session). Moreover,

Fig. 13. Graphs representing data from the second session: (a) Shoulder joints' amplitudes evolution, (b) Hand joints' velocities graphical representation.

as observed in the first session, the hands' speed also decreases along time. Nevertheless, in this case, this behavior is accompanied by high values on the shoulders' amplitude graph. This means that the patient has more difficulty reaching higher shelves - in other words, higher shoulder amplitudes - and reaches them with lower velocities when compared to the lower shelves.

VIII. CONCLUSIONS

Using the low cost Microsoft Kinect sensor, this paper presents two rehabilitation frameworks developed for separate purposes: one for technically classifying arm gestures, and the other for helping in the physiotherapy assessment through the use of a serious game connected with a web application. The aim of the developed ANN framework is to track the patient's motion through Kinect and analyze the data through a sequence of ANNs; in the end, the system classifies each gesture as correct or incorrect. Using the Microsoft Kinect sensor, the serious game captures 3D data during the patient's session, while providing entertainment. The application is responsible for creating an interactive game using some background information regarding the patient's physical limitations.

This framework was presented to several physiotherapists in June, during a workshop in Vila Nova de Gaia, and in July, during another workshop at ISCTE, Lisbon. We received enthusiastic feedback as well as many suggestions. There is a lot of data that can be used to build more Key Performance Indicators. For example, the application could be improved to include information regarding more joints, and more rehabilitation games could be developed. In the end, it was consensual that this system can improve patients' motivation and that all of this data can help physiotherapists improve their assessments. Moreover, this framework enables remote physiotherapy sessions, which can help decrease rehabilitation costs and increase the frequency of the exercises.

REFERENCES

[1] T. Truelsen, B. Piechowski-Jóźwiak, R. Bonita, C. Mathers, J. Bogousslavsky, and G. Boysen, "Stroke incidence and prevalence in Europe: a review of available data," European Journal of Neurology, vol. 13, no. 6, pp. 581–598, 2006. [Online]. Available: http://dx.doi.org/10.1111/j.1468-1331.2006.01138.x

[2] P. Fikar, C. Schoenauer, and H. Kaufmann, "The sorcerer's apprentice: a serious game aiding rehabilitation in the context of subacromial impingement syndrome," in Pervasive Computing Technologies for Healthcare (PervasiveHealth), 2013 7th International Conference on, 2013, pp. 327–330.

[3] J. M. Paavola, K. E. Oliver, and K. I. Ustinova, "Use of X-box Kinect gaming console for rehabilitation of an individual with traumatic brain injury: A case report," J Nov Physiother, 3:129, 2013.

[4] G. Saposnik, M. Levin, and the Stroke Outcome Research Canada (SORCan) Working Group, "Virtual reality in stroke rehabilitation: A meta-analysis and implications for clinicians," Stroke, 2011.

[5] M. Roscia, M. Longo, and G. C. Lazaroiu, "Smart city by multi-agent systems," in Renewable Energy Research and Applications (ICRERA), 2013 International Conference on, Oct 2013, pp. 371–376.

[6] V. M. S. Ltd. Vicon systems. [Online]. Available: http://www.vicon.com/

[7] C.-Y. Chang, B. Lange, M. Zhang, S. Koenig, P. Requejo, N. Somboon, A. Sawchuk, and A. Rizzo, "Towards pervasive physical rehabilitation using Microsoft Kinect," in Pervasive Computing Technologies for Healthcare (PervasiveHealth), 2012 6th International Conference on, 2012, pp. 159–162.

[8] S. M. Cowan, K. L. Bennell, and P. W. Hodges, "The test-retest reliability of the onset of concentric and eccentric vastus medialis obliquus and vastus lateralis electromyographic activity in a stair stepping task," Physical Therapy in Sport, vol. 1, no. 4, pp. 129–136, 2000. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1466853X00900361

[9] S. Arteaga, J. Chevalier, A. Coile, A. W. Hill, S. Sali, S. Sudhakhrisnan, and S. H. Kurniawan, "Low-cost accelerometry-based posture monitoring system for stroke survivors," in Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, ser. Assets '08. New York, NY, USA: ACM, 2008, pp. 243–244. [Online]. Available: http://doi.acm.org/10.1145/1414471.1414519

[10] K. Tanaka, J. Parker, G. Baradoy, D. Sheehan, J. Holash, and L. Katz, "A comparison of exergaming interfaces for use in rehabilitation programs and research," 2012. [Online]. Available: http://journals.sfu.ca/loading/index.php/loading/article/view/107/118

[11] J. E. Deutsch, M. Borbely, J. Filler, K. Huhn, and P. Guarrera-Bowlby, "Use of a low-cost, commercially available gaming console (Wii) for rehabilitation of an adolescent with cerebral palsy," Physical Therapy, vol. 88, no. 10, pp. 1196–1207, 2008. [Online]. Available: http://ptjournal.apta.org/content/88/10/1196.abstract

[12] A. Fernández-Baena, A. Susín, and X. Lligadas, "Biomechanical validation of upper-body and lower-body joint movements of Kinect motion capture data for rehabilitation treatments," in Proceedings of the 2012 Fourth International Conference on Intelligent Networking and Collaborative Systems, ser. INCOS '12. Washington, DC, USA: IEEE Computer Society, 2012, pp. 656–661. [Online]. Available: http://dx.doi.org/10.1109/iNCoS.2012.66

[13] R. A. Clark, Y.-H. Pua, K. Fortin, C. Ritchie, K. E. Webster, L. Denehy, and A. L. Bryant, "Validity of the Microsoft Kinect for assessment of postural control," Gait & Posture, vol. 36, no. 3, pp. 372–377, 2012. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0966636212001282

[14] M. Abdur Rahman, A. M. Qamar, M. A. Ahmed, M. Ataur Rahman, and S. Basalamah, "Multimedia interactive therapy environment for children having physical disabilities," in Proceedings of the 3rd ACM Conference on International Conference on Multimedia Retrieval, ser. ICMR '13. New York, NY, USA: ACM, 2013, pp. 313–314. [Online]. Available: http://doi.acm.org/10.1145/2461466.2461522

[15] S. Minocha, "Introducing Second Life, a 3D virtual world, to students and educators," in Technology for Education (T4E), 2010 International Conference on, 2010, pp. 206–208.

[16] J.-D. Huang, "Kinerehab: a Kinect-based system for physical rehabilitation: a pilot study for young adults with motor disabilities," in The Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, ser. ASSETS '11. New York, NY, USA: ACM, 2011, pp. 319–320. [Online]. Available: http://doi.acm.org/10.1145/2049536.2049627

[17] B. Lange, S. Koenig, E. McConnell, C. Chang, R. Juang, E. Suma, M. Bolas, and A. Rizzo, "Interactive game-based rehabilitation using the Microsoft Kinect," in Virtual Reality Short Papers and Posters (VRW), 2012 IEEE, 2012, pp. 171–172.

[18] I. Pastor, H. Hayes, and S. Bamberg, "A feasibility study of an upper limb rehabilitation system using Kinect and computer games," in Engineering in Medicine and Biology Society (EMBC), 2012 Annual International Conference of the IEEE, 2012, pp. 1286–1289.

[19] M. Fraiwan, N. Khasawneh, A. Malkawi, M. Al-Jarrah, R. Alsa'di, and S. Al-Momani, "Therapy central: On the development of computer games for physiotherapy," in Innovations in Information Technology (IIT), 2013 9th International Conference on, 2013, pp. 24–29.

[20] B. Processing, "SeeMe rehabilitation," September 2013. [Online]. Available: http://www.virtual-reality-rehabilitation.com/products/seeme/what-is-seeme

[21] H. Sugarman, A. Weisel-Eichler, A. Burstin, and R. Brown, "Use of a novel virtual reality system for the assessment and treatment of unilateral spatial neglect: A feasibility study," in Virtual Rehabilitation (ICVR), 2011 International Conference on, June 2011, pp. 1–2.

[22] J. Lee, "Smart health: Concepts and status of ubiquitous health with smartphone," in ICT Convergence (ICTC), 2011 International Conference on, Sept 2011, pp. 388–389.

[23] H. Demirkan, "A smart healthcare systems framework," IT Professional, vol. 15, no. 5, pp. 38–45, Sept 2013.

[24] M. Chan, D. Estève, C. Escriba, and E. Campo, "A review of smart homes: present state and future challenges," Computer Methods and Programs in Biomedicine, vol. 91, no. 1, pp. 55–81, 2008. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0169260708000436

[25] O. Patsadu, C. Nukoolkit, and B. Watanapa, "Human gesture recognition using Kinect camera," in Computer Science and Software Engineering (JCSSE), 2012 International Joint Conference on, 2012, pp. 28–32.

[26] J. Nagi, F. Ducatelle, G. Di Caro, D. Ciresan, U. Meier, A. Giusti, F. Nagi, J. Schmidhuber, and L. Gambardella, "Max-pooling convolutional neural networks for vision-based hand gesture recognition," in Signal and Image Processing Applications (ICSIPA), 2011 IEEE International Conference on, 2011, pp. 342–347.

[27] S.-W. Lee, "Automatic gesture recognition for intelligent human-robot interaction," in Automatic Face and Gesture Recognition, 2006. FGR 2006. 7th International Conference on, 2006, pp. 645–650.

[28] M. Raptis, D. Kirovski, and H. Hoppe, "Real-time classification of dance gestures from skeleton animation," in Proceedings of the 2011 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, ser. SCA '11. New York, NY, USA: ACM, 2011, pp. 147–156. [Online]. Available: http://doi.acm.org/10.1145/2019406.2019426

[29] J. Suarez and R. Murphy, "Hand gesture recognition with depth images: A review," in RO-MAN, 2012 IEEE, 2012, pp. 411–417.

[30] L. Miranda, T. Vieira, D. Martinez, T. Lewiner, A. W. Vieira, and M. F. M. Campos, "Real-time gesture recognition from depth data through key poses learning and decision forests," in 2012 25th SIBGRAPI Conference on Graphics, Patterns and Images, 2012, pp. 268–275.

[31] T. Hachaj and M. Ogiela, "Rule-based approach to recognizing human body poses and gestures in real time," Multimedia Systems, pp. 1–19, 2013. [Online]. Available: http://dx.doi.org/10.1007/s00530-013-0332-2

[32] F. d. N. A. P. Shelton and M. J. Reding, "Effect of lesion location on upper limb motor recovery after stroke," Stroke, vol. 32, no. 1, pp. 107–112, 2001. [Online]. Available: http://stroke.ahajournals.org/content/32/1/107.abstract

[33] F. Cary, O. Postolache, and P. Silva Girão, "Kinect based system and artificial neural networks classifiers for physiotherapy assessment," in Medical Measurements and Applications (MeMeA), 2014 IEEE International Symposium on, June 2014, pp. 1–6.

[34] J. Shlens, "A tutorial on principal component analysis," Systems Neurobiology Laboratory, Salk Institute for Biological Studies, 2005.

[35] A. Reynaldi, S. Lukas, and H. Margaretha, "Backpropagation and Levenberg-Marquardt algorithm for training finite element neural network," in Computer Modeling and Simulation (EMS), 2012 Sixth UKSim/AMSS European Symposium on, 2012, pp. 89–94.

[36] F. Cary, O. Postolache, and P. Silva Girão, "Kinect based system and serious game motivating approach for physiotherapy assessment and remote session monitoring," in International Conference on Sensing Technology (ICST), September 2014, pp. 1–6.