Touch Screen Interactive Simulations on Android Tablet Computers

Victor Du Preez and Ken Hawick

Computer Science, Massey University, Albany, North Shore 102-904, Auckland, New Zealand

http://complexity.massey.ac.nz

Tablet Computers

Tablet computers are emerging as powerful platforms for educational and demonstration software in areas like computational science and simulation which previously needed higher performance. The human-computer interaction (HCI) mechanisms and potential of tablet devices make them attractive for highly interactive simulations. We describe how a family of complex systems simulation models was developed as a domain-related family of Apps, and discuss the software engineering issues we encountered in generalising the data structures and simulation code patterns to run as Android Apps. We have implemented a number of simulation models including: the Ising model of magnetism; diffusion models; Conway's Game of Life and variants of it; and other cellular automata models. We show how the touch screen capability leads to new ways for users to interact with a running simulation - to steer the computations and to explore model parameter space. We also discuss performance achieved on various models with a range of modern tablet computers and other devices with similar processors. We describe our HCI and user-interface experiments and experiences on various Android Apps and discuss possible future directions for interfacing to such highly interactive models on tablet computers using native App programs.

Although Human-Computer Interfacing is now a relatively mature discipline with many of the key guiding principles well studied [1, 2, 3], the widespread availability of new compute devices is leading to hitherto unexplored ways of interacting with programs. The multi-touch capability of tablet computers is a particularly interesting area that is still being explored by new communities of users across various disciplines.

Gestural interfaces are not new[4], but with the very rapid commoditization of touch-sensitive tablet computing the research and textbook literature on multi-touch and gestural systems has not yet caught up, and there are surprisingly few accounts of multi-touch applications and the associated experiences.

Figure 1: Android tablet running an interactive Ising model simulation, showing the direct manipulation rendered model, interactive widgets, and a running plot of magnetism.

There are good accounts in the literature of HCI experiments and applications[5] for tablet computing on data entry[6], database interaction[7], and interactive training[8]. Software development work is also reported on HCI frameworks that will further enable these applications.

We are interested in highly-interactive computer simulations in a wide range of scientific disciplines, but a particular area of growing interest is that of simulation models that can be steered and interacted with using the direct manipulation paradigm, as it is known in the HCI literature. Relatively low performance tablets can be used for many mobile computing and learning applications, but the platform performance requirement for running a computationally intensive simulation is similar to that of interactive computer gaming[9, 10, 11]. Figure 1 shows an example of the sort of interactive simulation we are interested in. The figure shows an Android computer running a direct manipulation model rendering, a running plot of model properties, and also a more conventional set of graphical user interface control widgets[12]. In this paper we focus on two-dimensional grid models such as cellular automata. We develop some mechanisms for exploring touch and multi-touch interaction paradigms with this family of models and use these to experiment with the capabilities of tablet-based computers.

Although tablet computers have had a somewhat checkered history, they are now emerging as surprisingly good platforms for simulation demonstration models, since they are now relatively cheap, widely accessible to students, and easily held without needing to be wired. Most recently, available devices such as the iPad and a range of vendors' Android-based tablet systems have sufficient computational power to run simulation models of suitable size and complexity to be useful without server-side computations. The platforms we used in this paper had dual-core processors, and quad-core processors are likely to become widely available in tablets later this year.

A number of different tablet platforms are now available, and it is not entirely trivial to determine which vendor invented or originated which feature; there is especial confusion about the origins of the multi-touch capability. The work we report in this paper used a range of Android[13, 14] operating system supported tablets from vendors including Samsung, Asus and Motorola, although we have also developed prototype Apps for the Apple iPad. Putting aside proprietary concerns, the multi-touch capability is here, is becoming widely available, and is not yet a well explored HCI paradigm for many applications and user communities.

These devices thus open up a new area of HCI and interaction capabilities that will be important to other emerging compute-intensive applications such as interactive simulations. The range of HCI possibilities for tablets is quite different from that for desktop computer interaction modes and is not yet a fully explored nor understood space. Making inroads into this space is the main contribution of this paper.

Programming Android

The Android operating system has emerged as a powerful and popular platform for developing Apps to run on mobile platforms such as tablet computers and smart phones. Android is a Unix/Linux-like operating system with many well established, tested, and widely used and understood internal software models. Its relative openness makes it an attractive development platform for mobile Apps[15, 16, 17].

We employed a Java program built with the Android Software Development Kit and its HCI and graphical user interface widgets to build the Apps for our simulations. The interface ideas for Android are relatively mature[18], but we did have to resort to our own development work for some features such as the multi-scale zooming and model navigation. Some of the ideas of gesture-based interfacing for touch-sensitive screens are now well known and have been documented for systems such as the iGesture tablet. Others, such as some of the multi-touch features, are not yet well documented or explored.
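Before looking at the touch handling itself, a minimal sketch of how such an App can be structured with the standard SDK may be helpful: a custom View, hosted by an Activity, renders the cell mesh and is redrawn after each simulation step. All class names, the mesh size, and the rendering details here are illustrative assumptions, not the authors' actual code.

import android.app.Activity;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.os.Bundle;
import android.view.View;

public class SimulationActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(new ModelView(this));  // the direct manipulation rendering area
    }
}

class ModelView extends View {
    private final Paint paint = new Paint();
    final int[][] cells = new int[100][100];  // example 100 x 100 integer cell mesh

    ModelView(Context context) { super(context); }

    @Override
    protected void onDraw(Canvas canvas) {
        // Scale the cell mesh to fill the pixel area of this particular device.
        float w = getWidth() / (float) cells.length;
        float h = getHeight() / (float) cells[0].length;
        for (int x = 0; x < cells.length; x++)
            for (int y = 0; y < cells[0].length; y++) {
                paint.setColor(cells[x][y] == 0 ? 0xFFFFFFFF : 0xFF808080);
                canvas.drawRect(x * w, y * h, (x + 1) * w, (y + 1) * h, paint);
            }
    }
}

The later sketches in this section assume this illustrative ModelView as their context.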

Figure 2: Touch event handling pseudo-code.
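As a hypothetical reconstruction consistent with the description below, one way to distinguish one-, two- and three-finger touches with the standard Android MotionEvent API is sketched here; the helper method names (panModel, pinchZoom, toggleCellAt) are illustrative, not the authors' actual code.

// Inside the custom model View: dispatch on the number of active pointers.
@Override
public boolean onTouchEvent(android.view.MotionEvent event) {
    int action = event.getActionMasked();
    int pointers = event.getPointerCount();

    if (action == android.view.MotionEvent.ACTION_MOVE) {
        if (pointers >= 3) {
            panModel(event);      // three or more fingers: pan the model window
        } else if (pointers == 2) {
            pinchZoom(event);     // two fingers: zoom about their midpoint
        }
    } else if (action == android.view.MotionEvent.ACTION_UP && pointers == 1) {
        toggleCellAt(event.getX(), event.getY());  // single touch: edit a cell
    }
    invalidate();  // request a redraw of the model rendering
    return true;   // consume the event so we keep receiving the gesture
}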

The code listed in Figure 2 is an example of our own development of the multi-touch interface: pseudo-code for the event handling and detection of touches in our App. The code stipulates how we detect multiple touches and how we react to one, two or three fingers. These events then trigger different functionality in the App, which allows real-time interaction between the user and the simulation.

Android supports what is known as a multiple activities model, and the screen area is managed using various "activities." The SDK supports many of the conventional GUI widgets such as sliders, buttons, menus and so forth. A canvas and associated component library is available to manage things such as a plot of model properties. We found it useful to make this plot feature optional - for some models it is a distraction, but for others, such as the Ising model family of simulations, it is very valuable to let the user see the dramatic change in a model property when a parameter is adjusted. The plot (shown as a red tracer in the screenshots below) charts the magnetization - the fraction of "up" cell sites relative to the total number of cells.

The issues concerned with managing the concurrency and parallelism of a running simulation served by interactive controls are a subject for a separate article. In summary, we establish an atomic data field that serves as a shared lock between the stop/start GUI buttons and the thread that runs the simulation. Android provides some suitable software apparatus to manage the concurrency, but we did find we needed to experiment considerably to minimise garbage collection interference that slowed down the running simulation.

The main part of our App is the direct manipulation rendered model of the simulated system. This is a mesh of integer cells on an x-y grid. For the models we experimented with, it was convenient to impose periodic boundary conditions. This means that the model mesh wraps around in both the vertical and horizontal directions - it is in effect simulating the surface of a torus. Consequently we do not need special edge conditions for cells, and all cells have the same number of neighbours. This greatly simplifies the simulation code with no loss of generality as far as our HCI experimentation is concerned.

There are two coordinate systems we must manage: the model coordinates, with x ∈ [0, L_x) and y ∈ [0, L_y) for a model of N ≡ L_x L_y cells, and a pixel coordinate system determined by the properties and limitations of the particular tablet.

A definite limitation of the touch screen technology is the resolution achievable by a human finger. Although capacitive stylus devices can typically be employed to improve spatial resolution, it is desirable to avoid the need for these and let the user use their fingers directly. This means we need to be able to enlarge the chosen cell region so that a specific cell is easily and unambiguously touched and hence edited. For the discrete state models we simulated, it is convenient to adopt the idiom that a touched cell cycles its value through its allowed states; a minimal sketch of this pixel-to-cell mapping is given below.
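As the forward reference above suggests, here is an illustrative sketch of the pixel-to-cell mapping, assuming a zoom scale and pan offset maintained by the illustrative ModelView (model coordinate = (pixel - offset) / scale); none of these names come from the authors' code.

// Hypothetical view-transform state shared by the gesture handlers.
private float scale = 8.0f;                 // pixels per model cell
private float offsetX = 0.0f, offsetY = 0.0f;
private static final int NUM_STATES = 2;    // e.g. Ising up/down spins

// Map a touch in pixel coordinates to a model cell and cycle its state.
void toggleCellAt(float px, float py) {
    int lx = cells.length, ly = cells[0].length;
    int x = (int) Math.floor((px - offsetX) / scale);
    int y = (int) Math.floor((py - offsetY) / scale);
    // Periodic boundary conditions: wrap the indices onto the torus.
    x = ((x % lx) + lx) % lx;
    y = ((y % ly) + ly) % ly;
    cells[x][y] = (cells[x][y] + 1) % NUM_STATES;  // cycle through allowed states
}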
The touch screen and SDK allow multiple touches to be recognized. This offers capabilities not accessible using a mouse on a normal screen. The two-touch "pinch-to-zoom" idiom is now quite well known in the context of manipulating static images. We implemented it for our evolving model renderer by dynamically transforming the rendering to an enlarged or shrunken length scale based on the average of the two identified touch points in pixel space, as they transform to model coordinate space. This enables the user to pinch the touching fingers together to zoom out of the model, or to move them apart to zoom in. The model size itself is fixed at initialisation; although different sizes can be chosen, this only makes sense as a static rather than a dynamic choice.
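A matching sketch of the two-touch zoom, under the same assumed scale/offset transform; the MotionEvent calls are standard Android SDK, while the surrounding structure is illustrative.

// Zoom about the midpoint of the two touch points so that the model
// position under the fingers stays fixed on screen as the scale changes.
private float lastSpan = -1.0f;  // a full version would reset this when a second finger lands

void pinchZoom(android.view.MotionEvent event) {
    float dx = event.getX(1) - event.getX(0);
    float dy = event.getY(1) - event.getY(0);
    float span = (float) Math.sqrt(dx * dx + dy * dy);    // finger separation
    float midX = (event.getX(0) + event.getX(1)) / 2.0f;  // pivot in pixel space
    float midY = (event.getY(0) + event.getY(1)) / 2.0f;

    if (lastSpan > 0.0f) {
        float factor = span / lastSpan;  // > 1 when the fingers move apart: zoom in
        // Rescale while keeping the model point under (midX, midY) fixed:
        offsetX = midX - factor * (midX - offsetX);
        offsetY = midY - factor * (midY - offsetY);
        scale *= factor;
    }
    lastSpan = span;
}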

Touch Screen HCI

There are a number of gestures possible with the touch screen. These can be used to control the simulations directly.

Figure 3: The four HCI interactions used in the simulation App: conventional widget interaction; three finger pan and swipe; two finger pinch-to-zoom; and single touch model cell toggle-edit.

Figure 3 shows the four HCI interaction idioms we experimented with on the tablet screen. A single finger can be used in the normal way to press buttons, adjust sliders, and enter text into fields with a pop-up touch keyboard. Three or more finger touches followed by a swipe motion are recognized as a pan: the model rendering is moved around in model space to view a different windowed region. Two touch points indicate a pinch-to-zoom action, with the enlarged region pivoting around the average of the two touched points. A single finger touch within the direct manipulation model rendering indicates a toggle or cell-edit action.

Figure 4: Ising Model screenshots showing different temperature and field effects.

The top shot shows large spin clusters that have formed in the system for a particular temperature and applied field, and the bottom shot shows the effect of applying a stronger field bias so that the grey spin cells win at the expense of the white cells. The red plot trace charts the magnetization evolving with simulation time. The tablet platform performs quite a lot of computation to run this simulation: each cell is updated at each time-step, which involves gathering information about its neighbours and comparing an exponential function evaluation against a randomly generated number to emulate the thermal heat-bath process.
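As an illustration of the per-cell work this involves, here is a minimal heat-bath style update for a two-state spin on the periodic mesh, run from a simulation thread guarded by an atomic stop/start flag of the kind mentioned earlier; the parameter values, the random-sweep structure, and the reuse of the illustrative cells array are all assumptions, not the authors' code.

import java.util.Random;
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative model parameters: coupling J, applied field B, inverse temperature.
static final double J = 1.0, B = 0.1, BETA = 0.5;
final Random rng = new Random();
final AtomicBoolean running = new AtomicBoolean(false);  // shared with the start/stop buttons

// Heat-bath update of one cell: draw the new spin from the Boltzmann
// distribution given the local field from its four periodic neighbours.
void updateCell(int x, int y) {
    int lx = cells.length, ly = cells[0].length;
    int sum = spin(cells[(x + 1) % lx][y]) + spin(cells[(x + lx - 1) % lx][y])
            + spin(cells[x][(y + 1) % ly]) + spin(cells[x][(y + ly - 1) % ly]);
    double h = J * sum + B;  // local field acting on this site
    double pUp = 1.0 / (1.0 + Math.exp(-2.0 * BETA * h));
    cells[x][y] = (rng.nextDouble() < pUp) ? 1 : 0;
}

static int spin(int cell) { return 2 * cell - 1; }  // map {0,1} to {-1,+1}

// Simulation thread body: one sweep of N random cell updates per time-step,
// running for as long as the GUI's atomic flag stays set.
void runLoop() {
    int lx = cells.length, ly = cells[0].length;
    while (running.get()) {
        for (int i = 0; i < lx * ly; i++)
            updateCell(rng.nextInt(lx), rng.nextInt(ly));
        postInvalidate();  // ask the UI thread to redraw the model view
    }
}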

Figure 5: Model selection screen-shots showing typical menu choices available with the graphical user interface software development kit.

This shows the sort of menu options supported by the graphical user interface software development kit. The top shot shows a global set of choices offered on App startup - this could support some initial startup or "splash" screen information. The middle view is a flat menu of the models supported by our App. Additional help or explanatory information could be provided in this view. The bottom shot shows some model cell pattern initialisation choices offered to the user to initialise the model simulation. These could be built-in choices, or saved from a previous run, or stored in some sort of model-specific formatted file set.

Discussion

The tablet devices we employed for this work all used two-core processors, and the Android operating system makes good use of these to manage the touch screen interactivity and the simulation model updates quite well, using appropriate threading and concurrency. We did find, however, that we needed to specify quite carefully how the model data should be treated to make sure garbage collection inefficiencies did not slow down the model.

Models like those we describe here are interesting to experiment with at a range of different sizes and scales. Generally speaking, the larger the supportable size the richer the set of complex phenomena we can explore, but even relatively small sizes of N ≈ 100×100 cells as shown here are still interesting. Present-level tablet processor performance can perhaps manage to double this model size without noticeable slow-down on some of the simpler (less computationally intensive) models. We estimate that a typical desktop multi-core processor is capable of at least another factor of four in model size scale-up with similar user-perceived model evolution speed. Nevertheless, the ability to experiment with these simulation models in the same way as users interact with casual gaming platforms is useful for model investigation, for building up user intuition about the meaning of model parameters, and of course for disciplinary domain-specific student learning.

We employed the Android software development kit for this work. Prior work by us on building interactive models used Java Swing and OpenGL graphical software technologies. We believe the Android kit performs relatively favourably and supplies many of the GUI widgets and HCI idiomatic tools that are required, but there appears to be room for further development. Although we were able to use SDK Canvas objects to support code for our App, we did need to implement a lot of software infrastructure ourselves to enable the multi-touch direct model object manipulation apparatus.

In the work reported in this paper we have relied on platform-specific operating systems, user interface libraries, and other software apparatus. Since we had implemented these simulation models in other contexts, it was not particularly difficult to implement them for Android platforms. A recent software development trend, however, is to consider platform independence through the use of highly graphical and interactive web applications.

It is interesting to speculate about how future users will expect to interact with touch screen tablet computers and what their performance expectations will be. At the time of writing, vendors are announcing higher screen resolution capabilities for the next generation of tablet devices. Current global concerns will likely drive the need for low power consumption in such devices, and indeed there are obvious heat emission limitations for hand-held devices. We do expect to see greater computational performance become possible with greater numbers of cores available in processors. Even better system- and user-level management of threading concurrency will be needed to make good use of many cores for interactive Apps in the future.

Applications such as we describe here may also be able to make use of gesture-based interactions that go beyond the touch screen[19], with camera-based interaction able to support accurate and platform-portable hand and even finger tracking. The gestures that are both intuitive and easily implemented are likely to be different to the standard mouse idioms that users have become familiar with.
Mode-changing controls such as "shift-click" and so forth are hard to implement without a permanent keyboard. It is possible, however, that a multi-modal approach using a combination of gestures and voice commands[20] may become feasible on commodity devices.

See http://www.massey.ac.nz/~kahawick/cstn/154/cstn-154.html for more information.

Acknowledgments

Thanks to B. Pearce, who helped with this project, and to the IIMS Summer Scholarship Programme for financial support.

References

[1] C. J. Scogings, The Integration of Task and Dialogue Modelling in the Early Stages of User Interface Design. PhD thesis, Massey University, 2003.

[2] D. Diaper and N. Stanton, eds., The Handbook of Task Analysis for Human-Computer Interaction. IEA, 2004.

[3] C. Scogings and C. Philips, The Handbook of Task Analysis for Human-Computer Interaction, ch. Linking Task and Dialogue Modeling: Toward an Integrated Software Engineering Method, pp. 551–568. IEA, 2004.

[4] P. Kortum, HCI Beyond the GUI - Design for Haptic, Speech, Olfactory and other Nontraditional Interfaces. Morgan Kaufmann, 2008.

[5] R. Capra, G. Golovchinsky, B. Kules, D. Russell, C. L. Smith, D. Tunkelang, and R. W. White, "HCIR 2011: The fifth international workshop on human-computer interaction and information retrieval," ACM SIGIR Forum, vol. 45, pp. 102–107, December 2011.

[6] S. J. Castellucci and I. S. MacKenzie, "Gathering text entry metrics on android devices," in Proc. Computer Human Interactions (CHI 2011), (Vancouver, BC, Canada), pp. 1507–1512, 7-12 May 2011.

[7] N. Buchanan, "An examination of electronic tablet based menus for the restaurant industry," Master's thesis, University of Delaware, 2011.

[8] J. E. MacDonald, E. M. Foster, J. M. Divina, and D. W. Donnelly, "Mobile Interactive Training: Tablets, Readers, and Phones? Oh, My!," in Proc. Interservice/Industry Training, Simulation and Education Conference (I/ITSEC 2011), no. 11038, (Orlando, Florida, USA), pp. 1–9, 3-6 December 2011.

[9] K. W. Cheng, "Casual gaming." VU Amsterdam, January 2011.

[10] R. Kemp, N. Palmer, T. Kielmann, and H. Bal, "Opportunistic communication for multiplayer mobile gaming: Lessons learned from photoshoot," in Proc. Second Int. Workshop on Mobile Opportunistic Networking (MobiOpp'10), (Pisa, Italy), pp. 182–184, 22-23 February 2010.

[11] C. Feijoo, S. Ramos, and J.-L. Gomez-Barroso, "An analysis of mobile gaming development - the role of the software platforms," in Proc. Business Models for Mobile Platforms (BMMP10), (Berlin, Germany), October 2010.

[12] B. Shneiderman, Designing the User Interface - Strategies for Effective Human-Computer Interaction. Addison-Wesley, 1998.

[13] J. P. Conti, "The androids are coming," Engineering and Technology Magazine, vol. May-June, pp. 72–75, 2008.

[14] M. J. Johnson and K. A. Hawick, "Porting the google android mobile operating system to legacy hardware," in Proc. IASTED Int. Conf. on Portable Lifestyle Devices (PLD 2010), (Marina Del Rey, USA), pp. 620–625, 8-10 November 2010.

[15] W. Jackson, Android Apps for Absolute Beginners. ISBN 978-1-4302-3446-3, Apress, 2011.

[16] W.-M. Lee, Beginning Android Application Development. ISBN 978-1-118-01711-1, Wiley, 2011.

[17] D. Smith and J. Friesen, Android Recipes: A Problem-Solution Approach. Apress, 2011.

[18] S. Kim, "Logical user interface modeling for multimedia embedded systems," in Proc. Int. Conf. on Multimedia, Computer Graphics and Broadcasting (MulGrab 2011), (Jeju Island, Korea), 8-10 December 2011.

[19] D. Graham-Rowe, "Taking touch beyond the touch screen," MIT Technology Review, September 2011. Online.

[20] H. A. Murphy, C. C. Sekhar, C. S. Ramalingam, and S. Chakravarthy, "Multimodal interfaces to the computer," Info. Tech and Comms. Resources for Sustainable Development, vol. 1, pp. 64–74, 2010. Encyclopedia of Life Support Systems.