
Simulation of Human Behaviours for the Validation of Ambient Intelligence Services: a Methodological Approach

Teresa Garcia-Valverde a, Francisco Campuzano a, Emilio Serrano a, Ana Villa b and Juan A. Botia a,*

a Universidad de Murcia, Campus Espinardo, Murcia, Spain. E-mail: {mtgarcia, fjcampuzano, emilioserra, juanbot}@um.es
b AmI2, Ambiental Intelligence and Interaction, CEEIM, Campus Espinardo, Murcia, Spain. E-mail: {ana.villa}@ami2.net

Abstract. An Ambient Intelligence (AmI) system is a pervasive system in which services have some intelligence in order to interact smoothly with users immersed in the environment. Users are the main entity in AmI systems. Thus, the services are user-centred, with an adaptive interaction that provides users with more personalized facilities in a non-intrusive way. The deployment of AmI services is a hard task, especially when the system includes complex users' behaviours or complex deployments in terms of the devices and software participating in the system. Since one of the intrinsic requirements in these services is smooth interaction with users, the user, or at least a model of the user, should be incorporated in the development process. This makes the process of verification and validation of such services quite difficult. This paper proposes the use of Agent Based Social Simulation for a quick and feasible validation of AmI systems. Simulation in the first stages enables easy validation of the services before real-environment deployment. The main challenge in this approach is how to simulate humans with realistic behaviours. In this paper, Social Simulation (SS) is used to tackle this problem. Specifically, this paper presents a methodology for the validation of AmI systems by SS. The realistic modelling of users and its validation are key elements of this process and involve challenges that the techniques proposed here are able to deal with. A real application example which shows the level of realism reached in the users' models and the benefits of the methodology is presented in this paper.

Keywords: Ambient Intelligence, Ambient Assisted Living, Behaviour Simulation, User Modelling

1. Introduction

An Ambient Intelligence [1] system is a set of appliances, services and applications which silently surrounds and interacts with the user in an intelligent manner. Some of the best-known examples of such projects are MavHome [2], iSpace [3], Aware Home [4] and House_n [5]. In these projects the environment has intelligence and makes decisions regarding the current context and interactions with users.

*Corresponding author. E-mail: [email protected].

Recent years have seen substantial progress in AmI [6]. There have been many developments and projects which apply ambient intelligence in different contexts: smart homes [7], health monitoring and assistance [8], hospitals [9], transportation [10], emergency services [11], education [12], workplaces [13], etc. An in-depth, state-of-the-art view of AmI applications in different fields can be found in [14]. In such scenarios, the user is the central entity. Services and applications are built using the user as the central element of product creation.


One of the central issues of AmI service development is verification and validation of such services by testing. Testing software is the process of executing a program in order to find errors in the code [15]. Such errors must then be debugged. According to the IEEE Standard Glossary of Software Engineering Terminology [16]: "Validation is the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements". Software validation is, accordingly, the process of checking that what the software does is what it is supposed to do. A failure is found when the software is detected doing something different from what its design says it should do. Testing and validation are concepts tightly related to software quality [17].

Considering an AmI service or application as the system under test (SUT), some of the errors may be found by using a unit testing approach (e.g. out-of-range variable values, sloppy checking of return values from methods and so on). Robustness of the code is crucial here. However, in this particular kind of SUT, and provided that robustness is already guaranteed (e.g. by previous unit testing and debugging), the correct functionality of the SUT must still be verified.

There is a variety of AmI-based ubiquitous computing systems. In terms of the number of devices and software subsystems involved, a simple example is a continuous monitoring system for elderly people who live independently in their own houses. A more complex example would be an intelligent system in charge of controlling a whole building. All of them have elements in common: they include users and devices, mainly sensors and actuators. How do we design and perform a validation process for the SUT, when users and devices are central elements, without real users and devices? To put the question another way, how can we attain sufficient SUT quality in a stage prior to deployment of the real system, and so minimize the risk of finding errors when the real system is installed?

This work rests on the following hypothesis: it is possible to validate the SUT by recreating users and devices in a simulated physical environment (e.g. a house or a building). This statement will be validated below. It is important to note that simulation is not an overall solution; further validation tests will probably be needed after deploying the SUT in a real environment. But integrating the real SUT in a previous simulation process may help to validate it, or at least minimize the risk of finding errors when the AmI system is deployed in production mode (i.e. as a product). Moreover, it is possible to achieve more realistic simulations by injecting real elements into the simulated world. Thus, simulations are composed of both real and simulated elements. This realism increases the reliability of validating AmI services which are deployed in simulated environments.

Our particular approach to simulation is based on social simulation (SS). SS is a cross-disciplinary research and application field which combines computer simulation and social science [18]. Human societies are often complex adaptive systems with a high number of complex interactions between their members. SS allows us to deal with these complex social processes, which traditional computational and mathematical models have difficulties handling [19]. To that end, SS uses an extremely simple version of the agent metaphor to specify single components and the interactions among them. But using SS models in simulations implies that such models should also be validated. Validating the SUT with guarantees of success requires that the simulation models reflect reality. So, such models can be seen as social models under test (SMUTs), and some validation process should be applied to them before they are used for validating the SUT. The process of validating SMUTs and using them to validate SUTs is not trivial because it involves many complex components. Thus, this paper presents a methodology intended to guide the developer through the process. Note that, although the methodology copes with validation of both SMUTs and SUTs, the paper only illustrates its application to the validation of a SMUT. This illustration is based on a real domain: AAL (Ambient Assisted Living). An AAL system is an ICT (Information and Communication Technology) based solution devoted to augmenting the quality of life of elderly people. In this case, the interest is focused on a system called Necesity [20]. The paper shows how the methodology is applied to validate models for the elderly. These models can then be used to validate any AmI service to be built for such a domain by following a simulation-based approach.


The paper is structured as follows. Section 2 introduces the methodology. Section 3 details the AmI system used to illustrate the proposal. Section 4 introduces the computational models employed to artificially reproduce the behaviour of the user. Section 5 presents the application of the methodology to validate such computational models. Section 6 discusses some related works. Finally, the conclusions and future works are presented.

2. Proposal of the Methodology

Suppose an AAL system capable of monitoring elderly people who live alone at home [20]. It continuously monitors the elderly person and takes decisions about their state by a fixed mechanism (i.e. it behaves in the same manner with any elderly person and uses no adaptation). Such a mechanism detects when something is wrong by using data coming mainly from activity sensors deployed in the house. Suppose that the requirements of the problem change slightly so that a new service is needed. New adaptive capabilities are required: the mechanism in charge of detecting abnormal situations should now adapt to the elderly person's patterns of behaviour in order to produce a more effective response (i.e. adaptation to particular users implies that the system can react more quickly). How can such a new service be developed? Moreover, how do we assess the effectiveness of such a service and demonstrate that it behaves better than the earlier, rigid version? A first engineering choice would be to program the new service in the lab, make a controlled test of its adaptive capabilities and then test it for some weeks with real users. There are drawbacks in the way: what if we do not have such real users available for test purposes? What if real users are available but no abnormal situations occur during the test phase? An approach based on simulation would be equally effective for validating the adaptive service, provided that the house, the sensors and appropriate elderly people can be simulated. Reality should be preferred to simulation, but when a real set-up is not affordable, simulation may help. Moreover, simulation models can be reused when new AmI services are needed within the same or similar domains.

This section proposes a methodology to address this problem: verification and validation of AmI services by using simulation. Thus, the methodology also addresses the development of simulation models (including their validation as SMUTs), as they are needed to validate the SUT. It is an extension of the methodology promulgated by Gilbert and Troitzsch [21] for the development and use of SS models.

Gilbert and Troitzsch's methodology [21] is interesting in this context for a number of reasons. The first is that it pays the necessary attention to model building (not only is a good design of a model important, but also a good implementation). Other similar methodologies, such as the one proposed by Fishwick [22,23], do not do this. The second is that it is the most popular methodology for developing social simulations [24].

We have extended Gilbert and Troitzsch's methodology with two new and necessary elements: (1) a step to generate simpler simulation models to cope with high complexity, and (2) the possibility of including real elements in the simulation to obtain a more effective validation of SUTs. The methodology is expressed as a flowchart in Figure 1 (see footnote 1 for the notation). It is called AVA, which stands for Ambient intelligence services VAlidation.

Gilbert and Troitzsch [21] define the target of a social simulation as some real-world phenomenon in which the researcher is interested. In this case, the whole phenomenon refers to an ambient intelligence system deployed in a physical space in which users evolve. Thus, the phenomenon includes a physical environment, persons acting within the environment, and the devices and software of the AmI system. Let us suppose that the AmI software is made up of services, applications and a middleware in charge of supporting the services and applications. Let us also suppose that the middleware and some services and applications have been validated previously (i.e. they are already mature). Then, the methodology is proposed to validate those services and applications which are already developed but not totally mature (i.e. the SUTs) by means of creating, validating and using SS models (i.e. the SMUTs) for the physical environment, users and devices.

1 The flowchart uses standard elements of classic flowcharts [25]: flow of control (represented as arrows), processes (rectangles), decisions (rhombuses), input/output (parallelograms), start and end symbols (ovals) and predefined processes (rectangles with vertical lines at the sides).


[Figure 1: flowchart of the AVA methodology. Steps: 0. Start; 1. Identify main elements; 2.1 Physical devices; 2.2 Users; 2.3 Physical environment; 2.4 Middleware; 2.5 Mature software; 2.6 SUT; 3. Model design; 4. Model implementation; 5. Simulation; 6. Forensic analysis; 7. SMUT too complex? (YES: 8. Isolate complex elements; 9. Apply methodology); 10. Validate and verify SMUT? 11. Validate SUT? 12. Inject real elements? 13. Debugging; 14. End.]

Fig. 1. Proposal of a methodology for the validation of AmI services

The application of the methodology is based on the concepts of cycle and iteration. A cycle comprises the application of the methodology to validate a single SUT (e.g. a location service based on Bluetooth for an intelligent building) and is composed of a number of iterations. An iteration inside a cycle refers to one application of the flow diagram of Figure 1 in the context of validating the SUT. Within a cycle, a number of iterations are executed until the SMUTs are validated and, after that, the SUT is validated as well. Thus, given n SUTs to validate, the methodology should be applied by means of n cycles, one for each SUT.

Step 1 of the methodology, labelled Identify main elements, includes the identification and accounting of physical devices (element 2.1), the kind and number of users (element 2.2), a description of the physical environment (element 2.3), the middleware (element 2.4) and the separation of services and applications into two subsets: those which are already mature (element 2.5) and those which are chosen to be validated (element 2.6). A third subset is left out and includes those services which need to be validated but will not be in this application cycle of the methodology. This identification and accounting task provides the familiarization with the domain required by the simulation model developer. Now, the developer of the models is ready to design them [21].

Step 3 of the methodology refers to the design of the SS models for elements 2.1, 2.2 and 2.3 (physical devices, users and physical environment) and how they are integrated in the simulation with the real software (i.e. middleware (2.4), mature software (2.5) and SUT (2.6)). Note that an effective validation of the SUT mainly needs realistic models for users.

Step 4 deals with the necessary implementation activities. On the one hand, the model design outcome must be coded in the language of the specific simulation platform used. On the other hand, elements 2.4, 2.5 and 2.6 (middleware, mature software and SUT) must be integrated with the specifications of the simulation models. The outcome of Step 4 should be something ready for simulation. Passing from a simulation model design to a model implementation ready for simulation in a concrete SS platform is not a trivial task [24]. A general-purpose programming language alone is not a feasible option, because some elements, namely physical devices (2.1), users (2.2) and the physical environment (2.3), usually have considerable complexity. Furthermore, SS platforms can help to alleviate the complexity of coding the model designs (i.e. tools for several of the typical tasks in the construction of SS are included in this kind of software package [21]). The website of the Open Agent Based Modelling Consortium2 nowadays lists over twenty such SS platforms.

Step 5 is in charge of simulation. The first iterations within a methodology application cycle are devoted to validating the SMUTs. In subsequent iterations within the same cycle, once the SMUTs have been validated, the focus of the simulations is put on validating the SUT. SS platforms offer the possibility of easily performing different experiments of interest and generating simulation data for analysis in a convenient way.

2 OpenABM Consortium website: http://www.openabm.org/


It is up to the developer how to design the experiments, depending on the particular scenario and on whether the target is a SMUT or a SUT. Note that these simulation experiments are actually software tests: simulation models are used to run tests on the SUT.

Once the simulations have been run, it is time to analyse the data. Forensic analysis (Step 6) is an offline simulation data analysis conducted on the data generated in the previous step. The analysis should consider three important questions that guide the process. The first has to do with the complexity of the SMUTs. If they are affordable, the second question addresses whether the SMUT of the corresponding methodology iteration is correct, in order to validate it. Finally, the third question, once all the SMUTs have been validated, is about the correctness of the SUT in the corresponding methodology cycle.

One of the innovative points of the methodology is covered in Step 7. It consists of the possibility of dividing complex simulation models into simpler ones, i.e. creating and validating a set SMUT1, ..., SMUTm from a complex SMUT (Step 8). Hence, an iteration of the methodology is applied to each SMUT1, ..., SMUTm (Step 9). Once all of SMUT1, ..., SMUTm are validated, it is easier to validate the original SMUT. Note that a SMUT is considered too complex when it is difficult to assess whether its behaviour is the expected one. For example, some user behaviours can be so complex that they need to be evaluated in isolation. An example of this type of behaviour would be the resolution of collisions in the motion of a large number of dynamic entities. This behaviour is a problem to be studied by itself, and its validation as a SMUT would be much more complicated with additional elements in the simulation (additional users' behaviours, an environment model, the SUT and the rest of the software).

Step 10 is a branch which indicates that, if the SMUT of the corresponding iteration is not mature, the model design must be revised. Only when all the SMUTs have been validated is it time to try to validate the SUT (Step 11). In other words, when the simulation models are validated, the whole simulation is ready and the focus of attention can be directed to the SUT validation.

Step 12 is another innovative aspect of the methodology: the integration of the simulation with real elements. Think, for example, of a very simple simulation in which a real activity sensor is connected with the simulation through the simulation platform. This sensor is associated with a concrete room in the simulated environment of the methodology's element 2.3. In the rest of the rooms, the corresponding activity sensors are simulated. This kind of set-up would be useful for validating aspects like the integration of real sensors with the middleware in charge of reading, storing and redistributing sensor data, by comparing readings generated by the real sensor with those generated by simulated ones. Thus, the benefit of this innovation is that the simulation gains in realism when real devices interact with simulated elements (i.e. injection of real elements). If no injection is necessary in this methodology cycle, and the SUT is still not ready, then debugging the SUT code (Step 13) is necessary (i.e. it is assumed that in Step 5 the SUT was tested and Step 6 generated evidence about possible errors). As debugging means locating and fixing errors in the SUT, it does not imply changing the simulation model produced in Step 4. Thus, from Step 13, the process goes back to simulation, Step 5. A cycle of the methodology, i.e. the validation of a concrete SUT, ends when its behaviour is correctly verified.

2.1. Model Design

The design of a simulation model is performed in Step 3 of the methodology depicted in Figure 1. This step is in charge of producing designs for simulation models of physical devices (2.1), users (2.2) and the physical environment (2.3). Such elements, in the form of simulation models, are the so-called SMUTs. Step 3 can be further refined as a flowchart, which appears in Figure 2.

The first point in the design of a SS model is the definition of classes [21]. Recall from the last section that models have to be designed for physical devices (2.1), users (2.2) and physical environments (2.3). Thus, classes referring to such categories are needed. Additional classes can be specializations of these, depending on the domain and scenario. Such classes must be separated into two main categories: agent-based classes and the rest. A class must be modelled as an agent if objects of that class must evolve over time. A typical complex element which evolves over time is a user.


[Figure 2: flowchart of the model design step. 3.0 Start model design; 3.1 Define classes; 3.2 Select class hierarchy; 3.3 Select agents; 3.4 Define attributes; 3.5 Describe agents' behaviours; 3.6 Finish?; 3.7 End.]

Fig. 2. Flowchart for the model design process of the methodology

But simple elements could also be modelled as agents. For example, regarding classes for the physical environment, a wall in a building does not change; it is fixed over time. A door can be opened or closed, but there is no evolution over time if the change is caused exclusively by a user opening or closing it. Nevertheless, if a door must react autonomously to external events (e.g. be closed in some periods or after some alarms), it could be modelled as an agent.

For each concept modelled as a class, the corresponding attributes describing its properties must be specified. The next step is to describe the behaviour of those elements modelled as agents. These behaviours must include interactions between elements (i.e. interactions between individual agents and the environment, or among agents themselves, when appropriate [21]). Model dynamics may be described with any abstract machine, e.g. finite deterministic automata, Petri nets, etc.
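As an illustration, and only as a sketch with hypothetical names (this code is not part of the paper's models), the autonomous door mentioned above could be given finite-automaton dynamics along these lines:

enum DoorState { OPEN, CLOSED, LOCKED }

class DoorAgent {
    private DoorState state = DoorState.CLOSED;

    // External events (hypothetical names) drive the transitions of the automaton.
    void onEvent(String event) {
        switch (state) {
            case CLOSED:
                if ("open".equals(event)) state = DoorState.OPEN;
                else if ("alarm".equals(event)) state = DoorState.LOCKED;
                break;
            case OPEN:
                if ("close".equals(event)) state = DoorState.CLOSED;
                break;
            case LOCKED:
                if ("alarmCleared".equals(event)) state = DoorState.CLOSED;
                break;
        }
    }

    DoorState getState() { return state; }
}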

It is important to stop modelling at the right level of complexity (i.e. the level of complexity required for a correct validation of the SUT). For example, if the SUT is a location service which depends on the user's trajectory in a house, the user behaviour should produce interesting trajectories in order to generate the tests needed to evaluate the location service's accuracy. If the SUT is an adaptive service which depends on the particular user's habits at home, the behaviour should be able to artificially reproduce such habits. Moreover, it should be customizable so as to reproduce different users. For all of this, it is necessary to clearly identify to what extent behaviours are realistic. In other words, when does the developer stop modelling? (the finish condition in Figure 2).

The condition that must be checked to decide when to stop is whether the model is minimalist (i.e. it includes only the elements relevant to the problem at hand), in order to simplify the analysis phase as much as possible [26,21]. But at the same time, the model must describe reality in some sense. This trade-off introduces the confrontation between the KISS approach ("Keep it Simple, Stupid") [27] and the KIDS approach ("Keep it Descriptive, Stupid") [28] for social simulation based modelling. The KIDS approach leads to design models that are as descriptive as possible, but often these models are quite complex. Later, the model can be simplified if this is sufficiently justified. On the other hand, the KISS approach aims to make models as simple as possible. These models only become more complex if they are unable to reproduce possible and interesting phenomena that may occur in reality. The KISS approach proposed by Axelrod [27] is the mainstream approach in social simulation. The goals of the KISS and KIDS approaches are different: KISS simulations study the macro-micro links in the artificial society, while KIDS simulations are used to produce realistic simulations [29]. To correctly validate a SUT whose correct behaviour depends on users, realistic simulations rather than simple simulations are, in principle, better. Thus, the proposed methodology focuses on KIDS models.

2.2. Forensic Analysis

The forensic analysis step (Step 6 in Figure 1) refers to the study of the results from the simulation runs. Such a study covers data selection, data transformation, and data representation and analysis tasks. However, no conclusions are drawn at this stage. They are generated afterwards, in Steps 7, 10 or 11, depending on whether the SMUT is too complex, whether it is ready to be validated, or whether the SUT is ready to be validated, respectively. Note that the conclusions drawn in Steps 7, 10 and 11 are used to decide whether the object under evaluation should be considered validated. But when a pilot study follows the application of the methodology, such conclusions should again be checked against the testbed [24].


Steps 10 and 11 work similarly: they both address verification first and then validation. In verification, bugs are searched for in the simulation data analysis results. When no bugs are detected, the object (either a SMUT or a SUT) is verified. Then comes validation. Validation checks, again by using simulation data analysis results, that the behaviour of the object is the desired one, by proving that the requirements are fulfilled. One of the most popular ways of verifying SS models is the unit testing approach. It can detect implementation bugs such as out-of-range variable values or incorrect return values from methods. It works by writing tests, simultaneously with the coding activity, and running these tests automatically each time the code is modified. Note that, if the simulation engine incorporates displays of interesting aspects of the simulation, such displays can play a relevant role in the verification process. Through them, a developer can quickly detect bugs in the simulation. In this sense, displays are equivalent to monitoring the evolution of a large number of variables. SS model validation is a more abstract task. Most authors perform the validation by using numerical evidence tests [26].

For example, SMUT simulation results may be compared with real data obtained from the real phenomenon. This data can be obtained from the sensors in particular, the context in general, and information external to the system, such as the users' profiles. Naturally, having real data available is not always possible. In the experiments of this paper, this type of approach is used (see section 5). Another effective but costly option for validating simulation models is to implement the model twice, using two different target simulation platforms and two different programmers [21]. Then, the behaviour of the two simulations is compared. Note that the verification of a SUT is similar to that applied to a SMUT; the unit testing approach is usually followed. In fact, verification of the SMUT happens through the application of the methodology: it follows the unit testing idea, since each simulation run implies that the SMUT is exercised many times with different input values. Such input values are usually conditioned by sensor data, which in turn are conditioned by the evolution of the user in the environment and how the sensors perceive it. The same discussion on validation of the SMUT may be applied to the SUT.

Step 7 detects that the SMUT is too complex when the simulation data analysis results give evidence indicating that the SMUT cannot be validated. Note that, when validation is not possible, this can be due to two reasons. The first is that there are bugs in the model design. In this case, the flow of control should return to Step 3. The second is that the requirements are not fulfilled, but not because of a bug; this is due to handling an extremely complex model.

2.3. Software Tools for Application of the Methodology

The methodology presented above depends on the availability of software tools for its successful application when a real problem is faced. For example, in order to validate a concrete SUT (e.g. a location service) following the methodology, a simulation engine and the corresponding simulation models for elements 2.1 (physical devices), 2.2 (users) and 2.3 (physical environment) are needed. Besides, the necessary middleware and the additional services needed by the SUT to run correctly must also be available. Furthermore, additional tools to perform forensic analysis should also be available in order to avoid performing simulation data analysis from scratch. For this purpose, the UbikSim3 tool [30] has been created together with the methodology described. It has the following main features:

– It is based on the MASON social simulation engine; thus, the implementation of the models must conform to this engine.

– Basic SS models for physical environments (e.g. office building floors, hospital floors, etc.) are already developed and validated.

– Basic SS models for humans are already included in the framework and validated for specific environments: for example, professors in universities, caregivers in hospitals, etc. They were created for former projects and can be reused and/or tailored in similar situations.

– The same goes for simple SS models for sensors (e.g. presence, pressure and open-door sensors, and Bluetooth and RFID tags and antennas).

3 UbikSim website: http://ubiksim.sourceforge.net. The website offers several videos showing the operation and evolution of UbikSim.

– All the releases of UbikSim to date are integrated with a specific ubiquitous computing middleware (i.e. element 2.4 in Figure 1).

– Basic ubiquitous computing and AmI services (i.e. element 2.5 in Figure 1) have been developed, verified and tested as SUTs and can be used as mature software.

The architecture of UbikSim is structured in software components. It is based on an OSGi service-oriented architecture [31]. OSGi brings a very important benefit to the UbikSim architecture: flexibility and the capacity to evolve over time. For example, UbikSim's current configuration is based on MASON, but this could easily be changed by integrating another simulation platform (e.g. Repast or NetLogo) into a bundle with the same API (Application Programming Interface). OSGi is responsible for coordinating components at runtime. Figure 3 shows the interrelations among components by means of a collaboration diagram.

OSGi and MASON are 100% Java. Therefore, SMUTs (which are simulated by MASON) and SUTs (which must be integrated into an OSGi bundle) should also be based on Java. OSGi components are called bundles. Bundles in Figure 3 are labelled Bn, where n is the index of the bundle. For example, B1 is an environment modelling tool based on Java3D for modelling people and their environment. It is possible to create buildings on a 2D plane, and it also offers a navigable 3D view (Figure 4). B2 is a 3D viewer that shows how the simulation evolves while it is running. This display is also very useful for verification (see section 5.1).

B3 is the MASON simulation engine of UbikSim. MASON is the platform of choice because it is one of the most popular and powerful open-source SS platforms. B4 is a middleware for context information management. Recall that, to deploy a ubiquitous computing system, a middleware is necessary to enable context-awareness and flexible management of entities. The middleware used in UbikSim is the Open Context Platform (OCP), developed in our lab [32]. OCP was OSGi-based before UbikSim existed, so integrating it was extremely easy. As the bundle has a clear API, other middlewares (e.g. JCAF or the Context Toolkit, to name a few) might be alternatives to OCP.

Fig. 3. Collaboration diagram for the UbikSim simulator framework

Note from the figure that OCP uses ontologies for the representation and storage of context. This means that any other bundle can use past sensor readings or other user context for its own purposes. B5 is the SUT. Note that the SUT interacts mainly with B4. It could use user context (e.g. location information) or the user's past history to develop an adaptive service by machine learning.
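As a sketch of how a SUT might be packaged as bundle B5, the fragment below registers a hypothetical location service in the OSGi service registry; the LocationService and BluetoothLocationService names are illustrative and not part of UbikSim's real API.

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

interface LocationService {
    String locate(String userId);
}

class BluetoothLocationService implements LocationService {
    public String locate(String userId) {
        return "unknown"; // placeholder logic for the sketch
    }
}

public class SutActivator implements BundleActivator {
    public void start(BundleContext context) throws Exception {
        // Publish the service under test so other bundles (e.g. the middleware
        // bundle B4) can discover and invoke it through the OSGi service registry.
        context.registerService(LocationService.class.getName(),
                                new BluetoothLocationService(), null);
    }

    public void stop(BundleContext context) throws Exception {
        // Services registered by this bundle are unregistered automatically by OSGi.
    }
}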

3. Introducing the Running Example

This section is the first of three parts devoted to giving an example of how to apply the methodology. This first part introduces the Necesity system, verified and validated by using the methodology, and shows how the physical environment and devices were created using UbikSim.


The second part, in section 4, explains the kind of model we use to simulate the user in this example. The third part, in section 5, explains how to verify and validate such a model. Thus, a concrete iteration of the methodology is illustrated, with the user simulation model as a SMUT.

Necesity4 is an Ambient Assisted Living (AAL) system. An AAL system is any ICT system which contributes to extending the amount of time that elderly people living independently can maintain such independent living. The Necesity system is designed to detect falls or other domestic incidents involving elderly people who live alone in their own houses. The system aims to reduce the long waiting times before a person is finally attended after suffering a fall or a faint. Additionally, it prevents such incidents passing unnoticed. It is able to detect these situations automatically and non-intrusively while preserving the subject's privacy.

The Necesity system is in charge of continuously monitoring elderly people who live alone, in good health, in their own houses. Such monitoring is based on a wireless sensor network deployed at home. Sensors connected to the network capture presence, pressure in armchairs and open-door events. All the events are sent to a miniPC (i.e. the size of a small laptop) which takes decisions about the state of the subject (i.e. the elderly person being monitored). When a problem is detected (e.g. long periods of inactivity in a context in which this should not happen according to patterns of behaviour learnt from the subject), an emergency actuation protocol is triggered (usually an alarm is sent to an emergency response team through 3G communication). Such patterns of behaviour are key here. One of the most important elements of Necesity is the adaptation algorithm used to learn from user habits. This algorithm is used in a service in charge of quickly detecting when an unexpected situation is happening, while false positives (i.e. false alarms) are minimized and false negatives (i.e. the algorithm failing to detect an undesired situation which is occurring) are avoided. This service is the SUT considered in this paper.

4 Necesity website: http://www.necesity.net/

Fig. 4. A plane model in UbikSim's editor and its 3D representation

In order to test this algorithm over the long term, some mechanism to generate artificial sensor data is needed (i.e. real data is scarce and it usually takes a long time to obtain a sufficiently long data series). Thus, the idea is to create a SS model for general subjects: a model which can be tailored to manifest different habits in order to differentiate between subjects. This SS model is the SMUT that will be validated in the running example by applying the methodology.

Let us start illustrating the application of the methodology by explaining how to create SS models for elements 2.1 and 2.3 of the methodology. After the creation and validation of these elements, they are incorporated as validated SMUTs that are used, in subsequent iterations of the methodology, to validate the SMUT regarding the subject's behaviour. The creation of such models is based on the UbikSim editor (see bundle B1 in Figure 3) [33]. A snapshot of the editor, with a simple apartment already created, appears in Figure 4 (it is based on the SweetHome3D editor and adapted for the particular domain of ubiquitous computing and AmI).

The editor includes basic furniture (i.e. passive objects) and active objects (i.e. sensors in this case, which include presence, open-door and pressure sensors, among others). The figure shows the physical model of a simple house with a kitchen, a bathroom, a bedroom and a living room. A 2D view edited by the user appears in the upper part, and the corresponding 3D view, which is created on the fly, in the lower part. In the model, presence sensors are included in every room of the house.


Furthermore, an open-door sensor (it is necessary to know when the elderly person leaves the home) is also included in the front door. Pressure sensors are installed in the bed and in the subject's favourite armchair. They allow the system to know when the subject is resting. Simulated sensors are connected to the middleware (i.e. see bundle B4 of the UbikSim architecture) by actually using the bundle's API. Thus, the process of sending events from sensors, in simulation time, is exactly the same as when the system actually runs in a real house.
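A minimal sketch of this connection, with a hypothetical middleware interface (OCP's actual API is not shown in the paper), could look as follows: a simulated presence sensor forwards its events to the middleware exactly as a physical one would.

interface ContextMiddleware {
    // Hypothetical publishing method; the real OCP bundle exposes its own API.
    void publish(String sensorId, String value, long simulationTime);
}

class SimulatedPresenceSensor {
    private final String sensorId;
    private final ContextMiddleware middleware;

    SimulatedPresenceSensor(String sensorId, ContextMiddleware middleware) {
        this.sensorId = sensorId;
        this.middleware = middleware;
    }

    // Called by the simulation engine whenever the simulated subject enters or
    // leaves the room covered by this sensor.
    void onPresenceChanged(boolean present, long simulationTime) {
        middleware.publish(sensorId, present ? "PRESENCE" : "NO_PRESENCE", simulationTime);
    }
}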

4. Realistic Users' Behaviour Modelling

The kind of monitored subject under the Necesity system is a typical aged person who lives independently and alone in his own house. In this scenario, a probable situation is that the person suffers some sudden health problem and, as a consequence, stays immobilized on the floor for a long time before anybody comes and notices that something is wrong. Detecting such situations and acting accordingly is the main task of the Necesity system. This section introduces the application of the methodology to validate a SS model of the subject. This SMUT is used afterwards to validate, among other things, the mechanism in charge of detecting that something is wrong in the house, by using the UbikSim tool.

Basic assumptions for the subjects are: (1) the simulated elderly person must necessarily be simulated 24 hours a day, repeatedly for a determined number of weeks; (2) the day is divided into time slots (i.e. morning, noon, afternoon and night); (3) the simulated subject behaves depending on the time slot the simulation is in.

The development of this SMUT follows the process defined in Figure 2. The MonitoredSubject class is created and added to the model's class hierarchy. The MonitoredSubject class must be declared as an agent: it represents an object that is going to evolve along the simulation. Let us describe the SMUT's dynamics. In the approach followed in this work, the SMUT for a subject is modelled probabilistically. The basic idea is to identify simple behaviours (e.g. sleeping, eating, having a shower, cleaning hands, and so on). The simulation model should then decide, depending on the time of day, what basic behaviour is displayed by the subject at any time. Simple behaviours can be aggregated into ones at a higher abstraction level (e.g. having a shower and cleaning hands could be aggregated into having a wash). Thus, a hierarchical automaton is a natural solution for representing the possible transitions from one behaviour to others while hiding the unnecessary complexity, depending on the granularity level needed. Figure 5 represents the highest abstraction level automaton in (a), with the corresponding set of states Q and the transitions represented as arrows in the graph. In part (b), all the automata which are encapsulated into states of Q are represented. For example, when in state a2 ∈ Q, the subject is having a meal. So Q2 and the corresponding automaton represent that, while eating, the subject can go to the toilet and come back to finish eating. A further refinement of the a20 state is represented by the automaton in (c). It represents the sequence of activities done to prepare a meal before eating (this granularity level is not used in this paper, as explained above).
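A minimal sketch (assumed class and field names; not the authors' actual UbikSim code) of how such a hierarchically structured state could back the MonitoredSubject agent in MASON is:

import sim.engine.SimState;
import sim.engine.Steppable;

public class MonitoredSubject implements Steppable {
    // One index per abstraction level of the hierarchical automaton
    // (e.g. level0 = 2 with level1 = 1 would mean "went to the toilet during a meal").
    private int level0 = 0, level1 = 0, level2 = 0;

    public void step(SimState simState) {
        // On each simulation tick the interpreter (see the pseudocode of Figure 6)
        // decides whether to keep the current behaviour or switch to another one,
        // using the activation times drawn from the probability distributions.
        // ... transition logic omitted in this sketch ...
    }
}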

Depending on how they occur, there are three different types of behaviours:

– Monotonous behaviours: the kind of behaviour the subject always manifests at approximately the same time slot, on a daily basis (e.g. sleeping, having meals, medication and so on).

– Non monotonous behaviours: the kind of behaviour the subject usually manifests, not bounded to a concrete time slot, and repeated with a non-constant period (e.g. going to the toilet, having a shower, cleaning the house and so on).

– Anytime behaviours: these behaviours are interrupted by others and represent the state of the subject when she is not doing anything of interest for the system. The subject displays this behaviour when not displaying any other, either monotonous or non monotonous (i.e. anytime behaviours represent spare time).

We have just introduced the interrelation of the subject's behaviours in terms of the switching between them. Such switching has the form of a hierarchical automaton. But a mechanism is needed to simulate when to switch and to what behaviour. For this purpose, a combination of an interpreter and probability distribution functions (pdfs) is used. These pdfs are members of the exponential family [34,35].


Q = {a0 = SpareTime, a1 = MedicationTime, a2 = MealTime, a3 = SleepTime, a4 = Anomalous}

(a)

Q0 = {a00 = SpareTime, a01 = ToiletTime, a02 = ShowerTime, a03 = CleanTime}
Q1 = {a10 = MedicationTime, a11 = ToiletTime}
Q2 = {a20 = MealTime, a21 = ToiletTime}
Q3 = {a30 = SleepTime, a31 = ToiletTime}

(b)

Q20 = {a200 = GoingToFridge, a201 = GoingToCooker, a202 = Cooking, a203 = GoingToTable, a204 = Eating}

(c)

Fig. 5. (a) Level 0 automaton, (b) Level 1 automata, (c) Level 2 automaton for state a20

Monotonous behaviours must be activated at specific hours or within specific time intervals. For example, in the case of MedicationTime (see Figure 5(a)), a transition to this state is generated exactly at the time of taking medicines. In the case of MealTime and SleepTime, the activation is generated within a time interval. Hence, the pdf which models these transitions is bounded to this time slot. Also, it must be assured that the simulated subject eats and sleeps every day. Because of that, if no transition is probabilistically generated within the time slot, it is necessarily generated at the end of the slot. Such time slots are similar for all subjects (i.e. a person usually goes to sleep or has dinner at roughly the same hours). The gamma pdf is commonly used as a probability model for waiting times, in this case the time the interpreter has to wait to launch the behaviour. Thus, a gamma distribution may be used for modelling when monotonous behaviours occur. By the same reasoning, the occurrence of non monotonous behaviours is simulated with an exponential pdf. Non monotonous behaviours are also waiting-time models, but they are defined over awake periods; the exponential is actually a special case of the gamma distribution. Finally, anytime behaviours are behaviours which are often interrupted by the other kinds of behaviours. The time such a behaviour is displayed is small, due to interruptions. This causes a characteristic heavy-tailed pdf. A well-known example of a pdf with this appearance is the Pareto II or Lomax pdf. All these assumptions will be empirically demonstrated in section 5.
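For reference, the standard textbook parameterizations of the three densities just mentioned (the paper itself does not reproduce them) are:

f_{gamma}(x; k, \theta) = \frac{x^{k-1} e^{-x/\theta}}{\Gamma(k)\,\theta^{k}}, \quad x > 0

f_{exp}(x; \lambda) = \lambda e^{-\lambda x}, \quad x \ge 0 \quad (\text{gamma with } k = 1,\ \theta = 1/\lambda)

f_{Lomax}(x; \alpha, \lambda) = \frac{\alpha}{\lambda}\left(1 + \frac{x}{\lambda}\right)^{-(\alpha + 1)}, \quad x \ge 0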

The relation between behaviours and the basic mechanisms used to simulate when to make the transition to a new behaviour has been explained above. Now the interpreter, which simulates the subject's dynamics, is introduced. Figure 6 presents pseudocode for it. For this SMUT, only levels 0 and 1 of the automaton introduced in Figure 5 are used; only that level of granularity is needed. First of all, behaviours of levels 0 and 1 are initialized. For example, line 15 initializes, for every anytime behaviour, the instants of time, from when the simulation starts until it ends, at which it should be activated. The same goes for monotonous and non monotonous behaviours at lines 20 and 22, respectively. The case of monotonous behaviours, as mentioned above, is special: the pdf used to generate activation times is bounded to the interval in which the behaviour should occur. At line 35, the automaton starts to execute. At each iteration, simulation time advances (line 61) and, when it equals the time at which the active behaviour must end, the automaton returns to the initial behaviour (lines 37-39). At each iteration, the priority of the first behaviour in the pt list (i.e. the list of pending tasks ordered by priority) is compared with the priority of the active behaviour (i.e. currentState).


If the priority of that behaviour in the list is higher, the active behaviour is interrupted and put into the pt list. The behaviour with the higher priority is then activated (lines 43-50). The next lines, 51 to 64, are devoted to maintaining pt, adding new behaviours that should be activated at the current time.

The interpreter can be customized using some configuration parameters. The first is the time limit in each room before entering an anomalous state (tmax); i.e. if the subject stays more than two hours in the bath, an alarm should be raised, so the automaton should go to Anomalous. The second parameter is the pdf used for each kind of behaviour. The third and last one is the set of time slots of the monotonous behaviours.
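The sketch below (illustrative parameter values; real values would be fitted per subject as in section 5) shows how the interpreter's activation waiting times for the three kinds of behaviour could be drawn, using Apache Commons Math for the gamma and exponential cases and inverse-transform sampling for the Lomax case.

import java.util.Random;
import org.apache.commons.math3.distribution.ExponentialDistribution;
import org.apache.commons.math3.distribution.GammaDistribution;

public class WaitingTimeSampler {
    // Illustrative parameters, in minutes; real values would be fitted per subject.
    private final GammaDistribution monotonous = new GammaDistribution(2.0, 30.0);
    private final ExponentialDistribution nonMonotonous = new ExponentialDistribution(90.0);
    private final double lomaxAlpha = 1.5, lomaxLambda = 20.0;
    private final Random rng = new Random();

    // Monotonous behaviours (e.g. meals, sleep): gamma waiting time, which the
    // interpreter then bounds to the behaviour's time slot.
    double monotonousMinutes() { return monotonous.sample(); }

    // Non monotonous behaviours (e.g. going to the toilet): exponential waiting time.
    double nonMonotonousMinutes() { return nonMonotonous.sample(); }

    // Anytime behaviours (spare time): heavy-tailed Lomax (Pareto II) waiting time,
    // drawn by inverting its CDF F(x) = 1 - (1 + x/lambda)^(-alpha).
    double anytimeMinutes() {
        double u = rng.nextDouble();
        return lomaxLambda * (Math.pow(1.0 - u, -1.0 / lomaxAlpha) - 1.0);
    }
}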

5. Forensic Analysis of Users' Behaviour

At this point, the SMUT is designed and implemented. Following the methodology, the model was designed in Step 3 and implemented in Step 4. This section is devoted to verification and subsequent validation of the SMUT.

5.1. Verification

The verification of the SMUT ensures that the implementation of these models is correct. The SMUT has been verified by using the unit testing package JUnit5. JUnit is open source and widely used in the open-source community. This package allows developers to create unit tests which, once written, are run automatically, giving an error report in response.

An example of a unit test is the verification of the transitions in the automata. Figure 7 shows the JUnit code for the automaton of level 2 shown in Figure 5.

Furthermore, UbikSim contributes to verification by means of its displays. The simulation displays are very useful for building the model and observing that the artificial society's behaviour is appropriate and realistic. At the same time, they can help to detect some simple simulation errors. UbikSim works on MASON and can use its features, for example inspectors. These are a means of graphically visualizing the evolution of variables of interest in the simulation. A large number of inspectors for various simulation variables can be used and monitored dynamically as the simulation evolves. They can be used to check that such variables always take reasonable values. Additionally, UbikSim shows a 3D display of the physical environment, the installed devices, furniture and other physical elements, and the simulated humans. This visualization allows designers to evaluate some simple features of the system. Examples of such features are the coherence of the environment (a door must be in a wall) or some behaviours of the users (a user can only move through free areas, i.e. without obstacles).

5 JUnit website: http://www.junit.org/

5.2. Validation

The SMUT in the running example is a model of the elderly person. This model refers to realistic behaviour in terms of where the user is now, towards where he will move and when he will move. Eventually, he can sit down or lie down on the bed. Why this level of abstraction? Why not model the subject cooking or reading a book? The reason is simple: the Necesity system cannot perceive activities at that granularity. It only perceives presence, pressure on beds or armchairs, and open or closed doors. As a consequence, the granularity of the behaviours to simulate is given by the activities the system can perceive. Thus, the SMUT should be validated at the same granularity level.

There are many techniques for validating simulations [36,37] and, more specifically, for validating agent-based simulation models [38]. In this work, we start with fragments of real data obtained from Necesity logs, thanks to a pilot experiment with an early release of the Necesity system. This pilot experience involved about 25 elderly people living independently in their own houses. The basic idea consists of comparing real behaviour data with synthetic data generated by simulations of the SMUT. The rest of the section explains how this idea was realized in two phases. The first is in charge of processing the real data to make it ready for SMUT validation. The second phase, model diagnosis, assesses whether the proposed SMUT resembles the real data.

5.2.1. Data Preprocessing

For validation of the SMUT, we selected data from three subjects of the pilot experiment mentioned above. These data refer to the monitoring of the elderly people over two months.


1  Let pt be a list of pending tasks ordered by priority
2  Let states be a list with all possible automata states
3  Let times be a list of time instants when transitions
4    are going to be generated
5  Distribution functions to model the occurrence
6    of the behaviours:
7  Let mb be the function to model monotonous behaviours
8  Let nb be the function to model non monotonous behaviours
9  Let ab be the function to model anytime behaviours
10
11 // Instants of time initialization
12
13 for all s in states do
14   if isAnytime(s) then
15     times(s) <- ab()
16   else if isMonotonous(s) then
17     // Initial and final instants of bounded time slot
18     i <- ini(s)
19     e <- end(s)
20     times(s) <- mb(i, e)
21   else if isNonMonotonous(s) then
22     times(s) <- nb()
23   endif
24 endfor
25
26 currentTime <- 0
27 // Current state is defined by 3 numbers, one per level
28 level0 <- 0
29 level1 <- 0
30 level2 <- 0
31 currentState <- newState(level0, level1, level2)
32 // Automata begin to move when the times are initialized
33
34 // If an anomalous state is reached, the execution stops
35 while (level0(currentState) <> 4)
36   // If the current task ends, the initial state is activated
37   if timeLeft(currentState) = 0 then
38     currentState <- newState(0, 0, 0)
39   endif
40   // If the next state has higher priority than the current state,
41   // the current state becomes a pending task and it is stored
42   // in pt
43   if size(pt) > 0
44     nextState <- first(pt)
45     if priority(nextState) > priority(currentState) then
46       add(pt, currentState)
47       currentState <- nextState
48       remove(pt, nextState)
49     endif
50   endif
51   for all s in states do
52     time <- first(times(s))
53     // If the current time matches the first time instant in
54     // times, the task associated with this time instant
55     // is added to the pending tasks
56     if currentTime = time then
57       remove(times(s), time)
58       add(pt, s, priority(s))
59     endif
60   endfor
61   currentTime <- currentTime + 1
62   // Decrement the remaining time for finishing the current task
63   time <- timeLeft(currentState) - 1
64   setTimeLeft(currentState, time)
65 endwhile

Fig. 6. Pseudocode for the abstract machine interpreter which simulates subject's behaviour

@Test
public void mealTimeLevel2Transitions() {
    currentState = newState(2, 0, 0); // first state of meal time in level 2
    for (int i = 1; i <= 4; i++) {
        currentState.nextState(); // generating a transition
        // for a201 the index in level 0 is 2, the index in level 1 is 0, and so on.
        int j = currentState.getStateIndexForLevel(2);
        String alertMessage = "The transition from " + currentState.toString()
                              + " should be to state a20" + i;
        assertEquals(alertMessage, i, j); // check whether the transition is right
    }
}

Fig. 7. JUnit code for the automaton of level 2

Source data is in the form of a log file. This log shows in which room of the house the subject is located (also including whether the subject is leaving the house, seated or sleeping).

In this approach, validating the SMUT means assuring that the right probability distribution function is used to reproduce the transitions between the different states of the automaton (i.e. behaviours). Thus, preprocessing the log data means transforming it into a data series which consists of the instants of time at which transitions between behaviours take place. This data series is then compared with a similar data series obtained by means of SMUT simulations under the same conditions. The comparison is made with a goodness-of-fit statistical test.

In the case of the non monotonous behaviours and the anytime behaviours, preprocessing the log data involves extracting the time series of the moments at which each event is produced. These time series only include values within the typical awake period of the corresponding user. Obviously, while the user is sleeping, his daytime routines change. The log treatment for monotonous behaviours is slightly different. This kind of behaviour is usually produced in bounded time slots.


For example, having dinner, having lunch or sleeping are behaviours which occur during specific daily time periods. So, the preprocessing task involves extracting behaviour events inside such time slots. Note that the time slots can be slightly different for each elderly person, but it is possible to define an approximation of them which is valid for all (section 4 describes the configuration parameters). Finally, the time intervals between consecutive events are measured. The extracted time series are composed of these time intervals.
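As a sketch (the real Necesity log format is not detailed in the paper), the preprocessing of one behaviour can be reduced to keeping the event instants inside the awake period and computing the intervals between consecutive events:

import java.util.ArrayList;
import java.util.List;

public class LogPreprocessor {
    // timestamps: minutes from the start of monitoring, assumed sorted;
    // awakeStart/awakeEnd: minutes of the day delimiting the awake period.
    static List<Double> interEventIntervals(List<Double> timestamps,
                                            double awakeStart, double awakeEnd) {
        List<Double> kept = new ArrayList<>();
        for (double t : timestamps) {
            double minuteOfDay = t % (24 * 60);
            if (minuteOfDay >= awakeStart && minuteOfDay <= awakeEnd) {
                kept.add(t);
            }
        }
        List<Double> intervals = new ArrayList<>();
        for (int i = 1; i < kept.size(); i++) {
            intervals.add(kept.get(i) - kept.get(i - 1));
        }
        return intervals;
    }
}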

5.2.2. Model Diagnosis

SS models which describe social processes, like the model proposed here, are generally hard to validate. In this approach, the behaviour is modelled probabilistically. A good way to assess whether the model explains real data with reasonable accuracy is to use a statistical test. This process is called model diagnosis. This section explains the model diagnosis made on the SMUT. In this context, the most serious problem we can find is the lack of real data [39]. Fortunately, as explained above, data is available from three real subjects.

From these preprocessed data, some histograms for different behaviours and people are shown in Figure 8. The sample pdf is shown with a solid line. The dashed line shows the pdf of the theoretical distribution (see section 4) which best fits that behaviour.

Graphs (a) and (b) in Figure 8 show two monotonous behaviours: sleeping and having dinner (having lunch is roughly similar). This kind of behaviour is modelled with a gamma distribution in this approach because it is a suitable probability model for waiting times. Figure 8(c) shows a non monotonous behaviour: going to the toilet. This behaviour is fitted with an exponential distribution in this work, as a special case of the gamma distribution. Finally, Figure 8(d) shows an anytime behaviour: spare time. This behaviour is often interrupted by other behaviours (e.g. going to the toilet). These interruptions make the probability density function of the sample heavy-tailed, and it is fitted by the Lomax distribution. At this point, it is already possible to check visually that the sample density is similar to the theoretical density in all four graphs. However, more statistical evidence supporting the use of gamma, exponential and Lomax distributions for monotonous, non monotonous and anytime behaviours, respectively, is given now.

Table 1
P-values for each subject used in the study and the corresponding behaviours.

                   Behaviour
Person   Sleep    Dinner   Eat      Toilet   Spare time
A        0.404    0.311    0.361    0.111    0.086
B        0.488    0.467    0.542    0.079    0.108
C        0.337    0.489    0.575    0.137    0.103

It is possible to estimate the distance between a time series generated by a sample (i.e. real data) and another generated by simulation of the theoretical distributions (i.e. the theoretical distributions which are used in the SMUTs). One of the most well-known statistical tests to be applied in this case is the Kolmogorov-Smirnov (K-S) test [40]. The K-S test is a nonparametric and distribution-free goodness-of-fit test. It does not rely on parameter estimation or precise distributional assumptions [41]. Considering that the SMUT, as designed, does not assume any concrete probability distribution and does not require parameter estimation, the K-S test is suitable here. Note that, although the K-S and the χ² tests are the most commonly used and both have the same power for large samples, the χ² test requires a sufficient sample size to obtain a valid chi-square approximation [42,43]. The K-S test indicates whether it is reasonable or not to assume that a random sample comes from a specific distribution. It is a form of hypothesis testing where the null hypothesis states that the sample data follow the stated distribution. The hypothesis regarding the distributional form is rejected if the test statistic, Dn, is greater than the critical value (i.e. if the p-value is lower than the significance level). The significance level is fixed in this work at 0.05, which is the value usually given in the statistical literature.
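A minimal sketch of this decision rule, reusing the fitted parameters from the previous snippet (the variable names are assumptions made for illustration):

```python
from scipy import stats

ALPHA = 0.05  # significance level fixed in this work

# Compare the empirical sample against the CDF of the fitted gamma distribution.
d_n, p_value = stats.kstest(dinner, 'gamma', args=gamma_params)
if p_value < ALPHA:
    print(f"Reject H0: the gamma fit is not adequate (Dn={d_n:.3f}, p={p_value:.3f})")
else:
    print(f"Cannot reject H0: the gamma fit is adequate (Dn={d_n:.3f}, p={p_value:.3f})")
```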

Table 1 shows the p-values obtained from the K-S test for each validated behaviour with the adequate distribution. The null hypothesis is that the behaviour sample data come from the stated distribution, and it is rejected if the p-value is lower than the significance level. From these results, none of the stated null hypotheses can be rejected. Therefore, the behaviour of the three considered subjects (i.e. A, B and C) can be fitted by the specified pdfs, as described above.

From the table, all simulated behaviours statistically occur in a similar manner to the real behaviours of the three subjects. Thus, the SMUT is validated.


Fig. 8. (a) Minutes between 21:00 and the instant subject C goes to bed; (b) time between dinners for subject B; (c) time between uses of the toilet for subject A; (d) time between spare time periods for subject A.

By validating this SMUT, the UbikSim tool is ready to work on the validation of SUTs which need such subject behaviour to comply with the required functionality. Specific examples of SUT development may be found in other papers [13,44,45].

6. Related Works

A number of approaches dealing with the testing, validation and verification of AmI systems can be found in the literature. All of them are based on support tools. A well-known tool is Ubiwise [46]. Ubiwise focuses on the use and analysis of environment models for ubiquitous computing systems. For this purpose, sensors and their communications can be defined. People are considered individually in a simulation engine based on Quake II, where real people, acting as players in a game, generate information about their own context, which is captured through simulated sensors. Multiple users can link up to the same server to create interactive ubiquitous computing scenarios. However, the virtual subjects are not autonomous, since they must be controlled by the users. The subjects interact with the environment in order to validate its deployed services.

TATUS [47] is another tool which allows experimentation with adaptive ubiquitous computing systems. It is based on a graphics engine called Half-Life. In this case, the main novelty is that multiple SUTs may be connected to the engine. The SUT is adaptive, so it makes decisions about changing its behaviour in reaction to the users' movements and behaviour. Other environmental factors such as network conditions, ambient noise or social setting can also be considered.


In this case, some virtual subjects can behave autonomously thanks to simple AI scripts, although the tool is mainly focused on user-controlled characters and their interactions with the SUT.

Another interesting tool is UbiREAL [48]. It lets users intuitively grasp how devices are controlled depending on the temporal variation of contexts in a virtual space. The main contribution of UbiREAL is that it simulates physical quantities (e.g. temperature or humidity) and includes a network simulator. This network simulator allows communication between virtual devices and real devices. This is a very important aspect, since it makes it possible to inject reality into the simulations. The behaviour of the virtual subjects must be preconfigured (it is possible to define a route and some actions to perform). These existing simulators only simulate behaviours manually or by means of simple automata. UbikSim contributes to the area with the introduction of autonomous virtual humans into the simulations.

Regarding the engineering of realistic human behaviours, various approaches have been proposed to create autonomous characters. The approach to behavioural autonomy presented in section 4 is based on the proposal of Garcia-Valverde et al. [44], where behaviours are defined as states in a hierarchical automaton and transitions between behaviours are probabilistic. The proposed environment was an office building where the workers followed their usual routines during a working day. The simulations obtained allow them to build advanced AmI services which are able to adapt to the movements of the agents within a building.
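As a rough illustration of the probabilistic transition idea (the states and probabilities below are invented for the example and are not the parameters of [44] or of section 4), the next behaviour can be sampled from a weighted set of outgoing transitions of the current state:

```python
import random

# Outgoing transitions per state: (target behaviour, probability).
transitions = {
    'spare_time': [('toilet', 0.10), ('dinner', 0.05), ('spare_time', 0.85)],
    'dinner':     [('spare_time', 0.30), ('dinner', 0.70)],
    'toilet':     [('spare_time', 1.00)],
}

def next_state(current):
    targets, weights = zip(*transitions[current])
    return random.choices(targets, weights=weights, k=1)[0]

state = 'spare_time'
state = next_state(state)  # sampled according to the transition probabilities
```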

In approaches [49] and [50], the behavioural models described also use a hierarchical structure of finite state automata similar to the model described in [44]. In these cases, each behaviour of a behaviour sequence is called a behaviour cell. A behaviour entity with a finite state automaton composed of at least one behaviour cell is situated at the top of the structure. An elementary behaviour is situated at the bottom of the hierarchical decomposition and encapsulates a specialized behaviour which directly controls one or more actions. The shortcoming of both approaches is the absence of probability. In the work of Chittaro et al. [51], the behaviour sequences are modelled through probabilistic automata (Probabilistic Finite-State Machines, PFSMs). The probabilistic influence of personality implies that one cannot fully predict how a character will react to a stimulus. Because of that, a probabilistic approach gives more realism to simulated behaviours.

The work of Garcia-Valverde et al. [44] includes both ideas: they define a probabilistic hierarchical automaton to achieve realistic and complex behaviours.

An important feature added to our model is the idea of a list of prioritized events. This idea is based on Temime et al. [52], where human agents have a pending task list. At each time tick, the users' agents check their task list in order to deal with outstanding tasks. Following this approach, the more important tasks must be done before the less important ones, as the list is ordered by priority. In the present paper, the same idea is applied to the transitions in the automaton: they are weighted with a constant priority value. Then, if a transition to a new state is generated but its priority is lower than that of the current state, the transition is stored in a pending task list. These priority-based mechanisms give more realism to human behaviours.
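The following sketch illustrates this pending-list mechanism. The class names, the priorities and the way a behaviour "finishes" are assumptions made for the example; they do not reproduce the actual UbikSim implementation.

```python
import heapq

class Transition:
    def __init__(self, target_state, priority):
        self.target_state = target_state
        self.priority = priority  # constant priority weight of the transition

class SubjectAgent:
    def __init__(self, state, priority):
        self.state, self.priority = state, priority
        self.pending = []  # max-heap (negated priorities) of deferred transitions

    def fire(self, t):
        if t.priority < self.priority:
            # Less important than the current behaviour: defer it.
            heapq.heappush(self.pending, (-t.priority, t.target_state))
        else:
            self.state, self.priority = t.target_state, t.priority

    def finish(self, fallback_state='spare_time', fallback_priority=1):
        # The current behaviour ends; fall back and resume pending work if any.
        self.state, self.priority = fallback_state, fallback_priority
        self.tick()

    def tick(self):
        # At each time tick, resume the most important pending transition
        # if it now outranks the current behaviour.
        if self.pending and -self.pending[0][0] >= self.priority:
            neg_prio, target = heapq.heappop(self.pending)
            self.state, self.priority = target, -neg_prio

# Dinner (priority 5) interrupts spare time (priority 1); a toilet event
# (priority 3) generated during dinner is queued and resumed afterwards.
agent = SubjectAgent('spare_time', 1)
agent.fire(Transition('dinner', 5))
agent.fire(Transition('toilet', 3))   # deferred
agent.finish()                        # dinner ends -> agent.state == 'toilet'
```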

Other related approaches have also been reviewed. For example, Arthur [53] develops agents that act and choose in the way actual humans do. The agents are represented using parameterised decision algorithms, which are chosen and calibrated so that the agents' behaviour matches real human behaviour observed in the same decision context. For this purpose, a parameterised learning automaton is used with a vector of associated actions that can be weighted to choose actions over time the way humans would. The structure of the automaton is similar to the one presented in [44], but this work is more focused on machine learning: the virtual humans take decisions and learn from them. For this purpose, a detailed study of the reactions of real subjects to different stimuli must be carried out, which is not within the scope of this paper. Anastassakis et al. [54] present an approach where every character is provided with a small KBS (Knowledge-Based System) for intelligent reasoning. A reasoning system is also used by Noser et al. [55]. Such methods are very flexible, but defining the knowledge base is a complex and time-consuming task.

SS models have already been used for engineering AmI systems in the literature. Reynolds et al. [56] simulate sensors, actuators and the environment in an initial work on the design of a generic simulation tool. Liu et al. [57] propose a scalable framework for testing mobile context-aware applications based on the use of a multi-agent system which models complex and dynamic user behaviour.


Martin et al. [58] propose a simulation which separates agents, the environment and a context defined by variables and maps. UbikSim is more complete than the aforementioned approaches because (1) it covers environment, context-awareness, user and adaptation models and not just a subset of these parts of an AmI service; (2) the user model can easily be extended by adding levels or states to the hierarchical automaton; and (3) it provides a methodological proposal which guides the process of validating an AmI service.

7. Conclusion and Future Works

This work proposes a general methodology for the validation of AmI services and applications. It is a contribution to the engineering process of ubiquitous systems in general and AmI systems in particular, and it is based on the simulation paradigm. When a new AmI system is going to be built, the typical physical environment, the devices (i.e. sensors and actuators) to be used, the kinds of users, and the services and applications must be identified. Then, the environment, devices and users are modelled in the form of SMUTs and validated. By applying the methodology, this validation must be combined with the validation of the corresponding services and applications (i.e. the SUTs). Validation is based on simulation. Simulation runs generate data, which must be analysed by means of forensic analysis. This forensic analysis gives clues about the maturity of the software component being validated. It is also interesting to note that, if the SMUT is too complex as a model, a divide-and-conquer strategy is used to divide it into simpler SMUTs to which the methodology is applied again.

We have also presented the support tools which come with the methodology. UbikSim is a simulator of realistic environments which can be used to test SMUTs and SUTs, and to verify and validate them under the methodology. The application of the methodology has been shown by means of a running example. This example was focused on the Necesity product and how to validate a particular SMUT. This validation can be used afterwards to validate SUTs (e.g. an adaptive algorithm to detect abnormal situations at elderly people's homes).

The approach used to generate realistic user behaviour is shown in the paper, along with how the corresponding simulation model of subjects is validated using a statistical goodness-of-fit test (i.e. the Kolmogorov-Smirnov test).

The methodology and the UbikSim tool are currently being applied in a number of projects related to ambient intelligence. Obviously, Necesity is a clear example, but other projects also use our proposal. For example, in CARONTE⁶ the methodology is being used to provide assistance in emergency situations which may occur in geriatric centres. CARDINEA⁷ develops AmI services for caregivers in hospitals, and the approach of this paper is being used for the verification and validation of its services. Note that in these projects the scenarios are geriatric centres or hospitals, i.e. multi-user environments. In fact, we are working on the model of users in these kinds of environments, where the methodology and the users' model are still suitable, taking into account the relations and interactions between the agents of the simulation. This is possible thanks to the use of social simulations. Although more experiments are still needed, preliminary tests have shown that our approach is able to model the behaviour of users in multi-inhabitant environments.

Future work also includes a deeper study of the source data. Incorporating data from all the available subjects into such a study would help to create a taxonomy of elderly people's behaviours (i.e. kinds of mobility, habits and so on). Such a taxonomy would be useful for automatic parameter tuning of the models of elderly people. The user of the simulator, instead of configuring parameters by hand, would simply choose from a catalogue of elderly people's behaviour patterns.

Acknowledgements

This research work is supported by the Spanish Ministry of Science and Innovation under grants AP2007-04269 and AP2007-04080 of the FPU program and in the scope of the Research Projects TSI-020302-2009-43 and TSI-020302-2010-129, through the Fundación Séneca within the Program "Generación del Conocimiento Científico de Excelencia" (04552/GERM/06).

⁶ CARONTE website: http://caronte.germinus.com/inicio

⁷ CARDINEA website: http://cardinea.grupogesfor.com/home


References

[1] E. Aarts and J. L. Encarnacao. True visions: Tales on the realization of ambient intelligence. In Into Ambient Intelligence, Chapter 1. Springer Verlag, Berlin, Heidelberg, New York, 2005.

[2] D. J. Cook, M. Youngblood, E. O. Heierman III, K. Gopalratnam, S. Rao, A. Litvin, and F. Khawaja. MavHome: An agent-based smart home. In Pervasive Computing and Communications, 2003 (PerCom 2003), Proceedings of the First IEEE International Conference on, pages 521–524. IEEE, 2003.

[3] F. Doctor, H. Hagras, and V. Callaghan. A fuzzy embedded agent-based approach for realizing ambient intelligence in intelligent inhabited environments. Systems, Man and Cybernetics, Part A: Systems and Humans, IEEE Transactions on, 35(1):55–65, 2005.

[4] C. Kidd, R. Orr, G. Abowd, C. Atkeson, I. Essa, B. MacIntyre, E. Mynatt, T. Starner, and W. Newstetter. The aware home: A living laboratory for ubiquitous computing research. Cooperative Buildings. Integrating Information, Organizations, and Architecture, pages 191–198, 1999.

[5] S. S. Intille. Designing a home of the future. Pervasive Computing, IEEE, 1(2):76–82, 2002.

[6] E. Aarts and B. de Ruyter. New research perspectives on ambient intelligence. Journal of Ambient Intelligence and Smart Environments, 1(1):5–14, 2009.

[7] L. Rudolph. Project Oxygen: pervasive, human-centric computing – an initial experience. In Advanced Information Systems Engineering, pages 1–12. Springer, 2001.

[8] F. Cardinaux, D. Bhowmik, C. Abhayaratne, and M. S. Hawley. Video based technology for ambient assisted living: A review of the literature. Journal of Ambient Intelligence and Smart Environments, 3(3):253–269, 2011.

[9] S. Marzano. People as a source of breakthrough innovation. Design Management Review, 16(2):23–29, 2005.

[10] A. Pentland. Perceptual environments. Smart Environments, pages 345–359, 2005.

[11] S. Dashtinezhad, T. Nadeem, B. Dorohonceanu, C. Borcea, P. Kang, and L. Iftode. TrafficView: a driver assistant device for traffic monitoring based on car-to-car communication. In Vehicular Technology Conference, 2004 (VTC 2004-Spring), 2004 IEEE 59th, volume 5, pages 2946–2950. IEEE, 2004.

[12] Y. Shi, W. Xie, G. Xu, R. Shi, E. Chen, Y. Mao, and F. Liu. The smart classroom: merging technologies for seamless tele-education. Pervasive Computing, IEEE, 2(2):47–55, 2003.

[13] T. Garcia-Valverde, A. Garcia-Sola, and J. A. Botia. Improving RFID's location based services by means of hidden Markov models. European Conference on Artificial Intelligence (ECAI '10), Lisbon, Portugal, pages 4–7, 2010.

[14] D. J. Cook, J. C. Augusto, and V. R. Jakkula. Ambient intelligence: Technologies, applications, and opportunities. Pervasive and Mobile Computing, 5(4):277–298, 2009.

[15] G. J. Myers, C. Sandler, T. Badgett, and T. M. Thomas. The Art of Software Testing, Second Edition. Wiley, June 2004.

[16] Institute of Electrical and Electronics Engineers (IEEE). IEEE 90: IEEE Standard Glossary of Software Engineering Terminology, 1990.

[17] W. R. Adrion, M. A. Branstad, and J. C. Cherniavsky. Validation, verification, and testing of computer software. ACM Computing Surveys, 14(2):159–192, 1982.

[18] P. Davidsson. Agent based social simulation: A computer science view. Journal of Artificial Societies and Social Simulation, 5(1), 2002.

[19] X. Li, W. Mao, D. Zeng, and F. Wang. Agent-based social simulation and modeling in social computing. In Proceedings of the IEEE ISI 2008 PAISI, PACCF, and SOCO International Workshops on Intelligence and Security Informatics, PAISI, PACCF and SOCO '08, pages 401–412, Berlin, Heidelberg, 2008. Springer-Verlag.

[20] J. A. Botía, A. Villa, J. T. Palma, D. Pérez, and E. Iborra. Detecting domestic problems of elderly people: simple and unobtrusive sensors to generate the context of the attended. In First International Workshop on Ambient Assisted Living, IWAAL, Salamanca, Spain, 2009.

[21] N. Gilbert and K. G. Troitzsch. Simulation for the Social Scientist. Open University Press, February 2005.

[22] P. A. Fishwick. Simulation Model Design and Execution: Building Digital Worlds. Prentice Hall International Series in Industrial and Systems Engineering. Prentice Hall, Englewood Cliffs, NJ, 1995.

[23] P. A. Fishwick. Computer simulation: growth through extension. Transactions of the Society for Computer Simulation International, 14:13–23, March 1997.

[24] A. Drogoul, D. Vanbergue, and T. Meurisse. In Proceedings of the Third International Workshop on Multi-Agent-Based Simulation, MABS 2002, Bologna, Italy. Berlin, Heidelberg.

[25] Flowcharting Techniques. IBM, GC20-8152-1 edition, 1969.

[26] D. Midgley, R. Marks, and D. Kunchamwar. Building and assurance of agent-based models: An example and challenge to the field. Journal of Business Research, 60(8):884–893, August 2007.

[27] R. Axelrod. Advancing the art of simulation in the social sciences. Complexity, 3(2):16–22, 1997.

[28] S. Ferguson. How computers make our kids stupid. Maclean's, 118(23):24–30, June 2005.

[29] T. Ishida, Y. Nakajima, Y. Murakami, and H. Nakanishi. Augmented experiment: participatory design with multiagent simulation. In IJCAI'07: Proceedings of the 20th International Joint Conference on Artificial Intelligence, pages 1341–1346, San Francisco, CA, USA, 2007. Morgan Kaufmann Publishers Inc.

[30] T. Garcia-Valverde, E. Serrano, J. A. Botia, A. F. Gomez-Skarmeta, and J. M. Cadenas. Social simulation to simulate societies of users immersed in an ambient intelligence environment. 1st IJCAI Workshop on Social Simulation (SS@IJCAI2009), 2009.

[31] OSGi Alliance. OSGi Service Platform, Release 3. IOS Press, Inc., 2003.


[32] I. Nieto, J. A. Botía, and A. F. Gómez-Skarmeta. Information and hybrid architecture model of the OCP contextual information management system. Journal of Universal Computer Science, 12(3):357–366, 2006.

[33] E. Serrano, J. A. Botia, and J. M. Cadenas. Ubik: a multi-agent based simulator for ubiquitous computing applications. Journal of Physical Agents, 3(2):39, 2009.

[34] G. Darmois. Sur les lois de probabilité à estimation exhaustive. CR Acad. Sci. Paris, 260:1265–1266, 1935.

[35] B. O. Koopman. On distributions admitting a sufficient statistic. Transactions of the American Mathematical Society, pages 399–409, 1936.

[36] A. M. Law and W. D. Kelton. Simulation Modeling and Analysis. McGraw-Hill, New York, 1991.

[37] K. G. Troitzsch. Validating simulation models. Networked Simulations and Simulated Networks, pages 265–270, 2004.

[38] M. Richiardi, R. Leombruni, N. Saam, and M. Sonnessa. A common protocol for agent-based social simulation. Journal of Artificial Societies and Social Simulation, 9(1):15, 2006.

[39] F. Klügl. A validation methodology for agent-based simulations. In Proceedings of the 2008 ACM Symposium on Applied Computing, pages 39–43. ACM, 2008.

[40] H. R. Neave and P. L. Worthington. Distribution-Free Tests. Routledge, London, 1989.

[41] D. Sheskin. Handbook of Parametric and Nonparametric Statistical Procedures. CRC Press, 2004.

[42] F. J. Massey Jr. The Kolmogorov-Smirnov test for goodness of fit. Journal of the American Statistical Association, 46(253):68–78, 1951.

[43] F. N. David and N. L. Johnson. The probability integral transformation when parameters are estimated from the sample. Biometrika, 35(1-2):182, 1948.

[44] T. Garcia-Valverde, A. Garcia-Sola, F. Lopez-Marmol, and J. A. Botia. Engineering ambient intelligence services by means of MABS. Trends in Practical Applications of Agents and Multiagent Systems, pages 37–44, 2010.

[45] E. Serrano and J. A. Botia. Validating ambient intelligence based ubiquitous computing systems by means of artificial societies. Information Sciences, November 2010.

[46] J. Barton and V. Vijayaraghavan. Ubiwise: A ubiquitous wireless infrastructure simulation environment. HP Labs, 2002.

[47] E. O'Neill, M. Klepal, D. Lewis, T. O'Donnell, D. O'Sullivan, and D. Pesch. A testbed for evaluating human interaction with ubiquitous computing environments. In Testbeds and Research Infrastructures for the Development of Networks and Communities, 2005 (Tridentcom 2005), First International Conference on, pages 60–69. IEEE, 2005.

[48] H. Nishikawa, S. Yamamoto, M. Tamai, K. Nishigaki, T. Kitani, N. Shibata, K. Yasumoto, and M. Ito. UbiREAL: Realistic smartspace simulator for systematic testing. UbiComp 2006: Ubiquitous Computing, pages 459–476, 2006.

[49] D. Thalmann, S. R. Musse, and M. Kallmann. Virtual humans' behaviour: Individuals, groups, and crowds. Proceedings of Digital Media Futures, pages 13–15, 1999.

[50] P. Bécheiraz and D. Thalmann. A behavioral animation system for autonomous actors personified by emotions. In Proceedings of the 1998 Workshop on Embodied Conversational Characters. Citeseer, 1998.

[51] L. Chittaro and M. Serra. Behavioral programming of autonomous characters based on probabilistic automata and personality. Computer Animation and Virtual Worlds, 15(3-4):319–326, 2004.

[52] L. Temime, Y. Pannet, L. Kardas, L. Opatowski, D. Guillemot, and P. Y. Boëlle. NOSOSIM: an agent-based model of pathogen circulation in a hospital ward. In Proceedings of the 2009 Spring Simulation Multiconference, pages 1–8. Society for Computer Simulation International, 2009.

[53] W. B. Arthur. On designing economic agents that behave like human agents. Journal of Evolutionary Economics, 3(1):1–22, 1993.

[54] G. Anastassakis, T. Panayiotopoulos, and T. Ritchings. Virtual agent societies with the mVITAL intelligent agent system. In Intelligent Virtual Agents, pages 112–125. Springer, 2001.

[55] H. Noser and D. Thalmann. Towards autonomous synthetic actors. In Synthetic Worlds, T. L. Kunii and A. Luciani (eds.). John Wiley and Sons, Ltd, 1995.

[56] V. Reynolds, V. Cahill, and A. Senart. Requirements for an ubiquitous computing simulation and emulation environment. In Proceedings of the First International Conference on Integrated Internet Ad Hoc and Sensor Networks, page 1. ACM, 2006.

[57] Y. Liu, M. O'Grady, and G. O'Hare. Scalable context simulation for mobile applications. In On the Move to Meaningful Internet Systems 2006: OTM 2006 Workshops, pages 1391–1400. Springer, 2006.

[58] M. Martin and P. Nurmi. A generic large scale simulator for ubiquitous computing. In Mobile and Ubiquitous Systems: Networking & Services, 2006 Third Annual International Conference on, pages 1–3. IEEE, 2007.