Page 1

Noor Azizah KS MOHAMADALI

Supervisor: Dr. Jonathan Garibaldi

IMA SEMINAR, 28th April 2009

mnk@cs.nott.ac.uk

School of Computer Science

EVALUATION STUDIES IN HEALTH INFORMATICS AND A PROPOSED INTEGRATED MODEL OF USER ACCEPTANCE OF TECHNOLOGY

Page 2

Presentation Outline

■ Introduction

■ An overview of Evaluation Study

■ An overview of Existing Models of Technology Acceptance (Information Systems Theory)

■ An overview of Existing Work on Technology Acceptance in Healthcare

■ Proposed Integrated Conceptual Model of Technology Acceptance

■ Current Work and Future Work

■ Conclusion

Page 3

Introduction

The implementation of new information systems in organizations costs many millions of dollars each year.

Decision makers often believe that technology will bring benefits. However, evidence from various studies shows that implementations of new systems in the healthcare sector do fail (Southon et al., 1999).

Effective evaluation of healthcare information systems is necessary to ensure that systems adequately meet the requirements and information-processing needs of users and healthcare organizations.

Page 4

An overview of Evaluation Study in Health Informatics

Evaluation study in health informatics is a study that measures or explores the attributes of health information systems (in planning, development, implementation or operation) with the aim of informing a decision concerning those systems in a specific context (Mohd Yusof and Papazafeiropoulou, 2008).

Evaluation is carried out to seek answers to the following (Friedman and Wyatt, 1997):
■ Why: what is the objective of the evaluation?
■ Who: which stakeholders' perspectives are to be evaluated?
■ What: which aspects are to be evaluated?
■ When: in which phase of the system development life cycle?
■ How: by which method of evaluation?

Page 5

Why: Objective of Evaluation

Authors and purpose of evaluation:

Yusof and Papazafeiropoulou (2008): "Evaluation can be used to improve HIS through past experience, to identify more effective techniques or methods, investigate failure and learn from previous experiences."

Meijden et al. (2003): "Only a thorough evaluation study can show whether or not specific system was successful in a specific settings."

Nahm et al. (2007): "The assessment outcome of CIS implementation is vital not only to justify the cost within organization but also to promote the national agenda to improve healthcare information technology."

Page 6

Who and What?

Stakeholders and their concerns:

Management/Organization: Is the investment justified? Will the users accept the system? How committed are the users to using the system? Etc.

User: Are all necessary facilities (e.g. training support, network infrastructure, etc.) provided? How useful and easy to use is the system? How safe and secure is the system? How good is the quality of the information provided by the system? Etc.

Patient: How will it improve the quality of services?

Developer: Has it met all the user requirements?

Page 7

When: Stage of Evaluation

Evaluation can be carried out during each of the three main phases of the system development life cycle:

Pre-implementation (development)

During implementation

Post-implementation

(Yusof and Papazafeiropoulou, 2008)

Page 8

How: Method of Evaluation

Wyatt and Liu (2002) distinguish two approaches:

Objectivist: an evaluation approach that uses experimental designs and statistical analyses of quantitative data.

Subjectivist: an evaluation approach that relies on qualitative data derived from observation, interviews and analysis of documents and other artefacts.

Limitation of the objectivist approach: it cannot provide answers as to why and how a system works within a specific setting.

Many researchers now tend to use a subjectivist approach when undertaking evaluation work (Collen, 1986; Friedman and Wyatt, 1997; Kaplan, 2001a; Gremy et al., 1999).

Page 9

Among these, user acceptance is one of the most important areas of research. User acceptance is a risk to the success of any IT project (Louise K. Schaper, 2007).

Clinical Information Systems (CIS) have experienced high levels of user resistance; thus, an understanding of what makes a CIS implementation successful is critical to improving health care services as a whole (Jean-Marc Palm, 2006).

Page 10

Research Methodology (Part 1)

Steps involved:
1. Identify critical success factors (literature review).
2. Analyse existing theories/models of user acceptance (IS theory).
3. Analyse existing work on user acceptance of technology.
4. Develop the proposed integrated model of technology acceptance.
5. Evaluate the proposed model.

Page 11

Step 1: Some of the Identified Critical Success Factors

Factor(s) and source(s):
■ It improves job performance: Garfield (2005), etc.
■ Allows users to work more quickly: Despont-Gros et al. (2005)
■ Saves time: Ting-Ting Lee (2008), etc.
■ I find the system easy to use: Doll (1991), etc.
■ Instructions are clear and easy to remember: Doll (1991), etc.
■ User-friendly: Sicotte et al. (2006), etc.
■ Conciseness and completeness: Yusof et al. (2008), etc.
■ Helpdesk support: Martens and van der Weijden (2008), etc.
■ Training, etc.: Mahmood et al. (2000)

Page 12

Step 2: Overview of Existing Model of Technology Acceptance (IS Theory)

■ Unified Theory of Acceptance and Use of Technology (UTAUT) by Venkatesh et al., 2003

The basic concept underlying this model is that individuals will form various beliefs and attitudes regarding the technology; these will, in turn, have an impact on their intentions to use the technology and therefore, affect their actual use of the technology.

■ Information Systems Success Model by DeLone & McLean, 2002, 2003

A system can be evaluated in terms of information, system, and service quality; these characteristics affect the subsequent use or intention to use and user satisfaction.

■ Task-Technology Fit (TTF) by Goodhue and Thompson, 1995

Task-technology fit (TTF) theory holds that IT is more likely to have a positive impact on individual performance and be used if the capabilities of the IT match the tasks that the user must perform.
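These models are typically operationalised through questionnaire items scored per construct and related statistically to intention or use. The sketch below is purely illustrative (the construct names follow UTAUT, but the respondent data, scale and variable names are invented for this example, not taken from the presentation) and shows one common way of estimating such relationships with ordinary least squares:

```python
import numpy as np

# Hypothetical questionnaire scores (1-5 Likert scale) for four UTAUT constructs,
# one row per respondent: performance expectancy, effort expectancy,
# social influence, facilitating conditions.
X = np.array([
    [4.2, 3.8, 3.0, 4.0],
    [2.1, 2.5, 3.5, 2.0],
    [4.8, 4.5, 4.0, 4.5],
    [3.0, 3.2, 2.8, 3.5],
    [1.9, 2.0, 2.5, 2.2],
    [4.5, 4.0, 3.8, 4.2],
])
# Self-reported behavioural intention to use the system (1-5).
y = np.array([4.0, 2.3, 4.7, 3.1, 2.0, 4.4])

# Add an intercept column and fit
# intention = b0 + b1*PE + b2*EE + b3*SI + b4*FC by ordinary least squares.
A = np.column_stack([np.ones(len(y)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

labels = ["intercept", "performance_expectancy", "effort_expectancy",
          "social_influence", "facilitating_conditions"]
for name, b in zip(labels, coeffs):
    print(f"{name:>25s}: {b:+.3f}")
```

In practice such models are estimated on much larger samples with validated instruments and structural equation modelling rather than a single regression; the sketch only illustrates the direction of the hypothesised relationships.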

Page 13

Our Observation

UTAUT and the IS Success Model address the same issue, but with different constructs defined.

Both aim to explain behavioural intention to use / intention to use / use:
- IS Success Model: Information Quality, System Quality and Service Quality.
- UTAUT: Performance Expectancy, Effort Expectancy, Social Influence and Facilitating Conditions.

The importance of fit between the factors (Ammenwerth et al., 2006; Kaplan, 2001b; Goodhue, 1998).

The importance of moderating factors such as age, gender and experience, which may or may not influence the acceptance of new systems.

Page 14

Step 3: Analysis of Existing Work on User Acceptance of Technology

Page 15

The design-reality gap model (Heeks, 2006)

[Figure: the design-reality gap model. Design and reality are each described along the same dimensions (information; technology; processes; objectives and values; staffing and skills; management systems and skills; other resources), and the gap between design and reality is assessed on each dimension.]

Strengths:
► Identification of most of the important factors for evaluation
► Introduction of the gap feature

Limitation:
► Moderating factors
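As an illustration only, a design-reality gap assessment can be reduced to rating the size of the gap on each dimension and aggregating the ratings. The dimension names follow the figure above, but the 0-10 scale, the example ratings and the interpretation bands below are assumptions made for this sketch, not values from Heeks or from this presentation:

```python
# Hypothetical gap ratings, 0 (design matches reality) to 10 (complete mismatch),
# one per dimension of the design-reality gap model.
gap_ratings = {
    "Information": 3,
    "Technology": 5,
    "Processes": 6,
    "Objectives and values": 7,
    "Staffing and skills": 4,
    "Management systems and skills": 5,
    "Other resources": 2,
}

total_gap = sum(gap_ratings.values())   # 0 .. 10 * number of dimensions
max_gap = 10 * len(gap_ratings)

# Purely illustrative interpretation bands for the aggregate score.
if total_gap < 0.3 * max_gap:
    risk = "low risk of failure"
elif total_gap < 0.6 * max_gap:
    risk = "moderate risk; gap-closing actions advisable"
else:
    risk = "high risk of failure unless design or reality is changed"

print(f"Total design-reality gap: {total_gap}/{max_gap} ({risk})")
```

The value of such an exercise lies less in the aggregate number than in seeing which dimensions carry the largest gaps and therefore need gap-closing action.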

Page 16

ICT and OTs: A model of information and communication technology acceptance and utilisation by occupational therapists (Schaper and Pervan, 2007)

[Figure: the model relates performance expectancy, effort expectancy, computer anxiety, computer self-efficacy, computer attitude, social influences, organisational facilitating conditions and compatibility to behavioural intention and use behaviour, with the constructs organised across technological, implementation and individual contexts. Moderators: age, gender, experience, voluntariness of use, access, clinical speciality, clinical workload, setting type, geographic area.]

Limitation:
► Fit factors

Page 17

CHEATS: a generic information and communication technology evaluation framework (Shaw, 2002)

This framework identifies six dimensions for evaluation: clinical, human and organisational, educational, administrative, technical and social.

Strength:
► Detailed measurements for each of the above dimensions

Limitations:
► Fit factors
► Moderating factors

Page 18

Understanding IT acceptance by individual professionals: Towards an integrative view (Yi et al., 2006)

[Figure: the integrated model links personal innovativeness in IT, result demonstrability, image, perceived usefulness, perceived ease of use, subjective norms and perceived behavioural control to behavioural intention, with the hypothesised paths labelled H1-H16.]

Limitation:
► Moderating factors, fit factors, organizational factors

Page 19

Step 4: Proposed Integrated Model of Technology Acceptance

1. To make use of the IS Success Model and the Unified Theory of Acceptance and Use of Technology (UTAUT).
2. To incorporate fit factors as proposed by Goodhue (1995).
3. Proposed conceptual model of user acceptance of technology.

Page 20

Page 21

Step 5: Process of Evaluating a Model
■ Ability to explain past observations
■ Ability to predict future observations

http://en.wikipedia.org/wiki/Scientific_modeling

Page 22

Model Evaluation (Phase 1)

Mapping of identified factors to model constructs:
■ It improves job performance: Performance Expectancy (Perceived Usefulness)
■ Improves communication: Performance Expectancy
■ Saves time: Performance Expectancy
■ I find the system easy to use: Effort Expectancy (Perceived Ease of Use)
■ Instructions are clear and easy to remember: Effort Expectancy
■ User-friendly: Effort Expectancy
■ Conciseness and completeness: Information Quality
■ Excellent helpdesk support: Service Quality
■ Training, etc.: Facilitating Conditions, etc.
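A minimal sketch of how such a mapping can be used when analysing questionnaire data: responses to individual items are grouped by the construct they map to and averaged per respondent. The item names, scores and 5-point scale below are invented for this example and are not taken from the presentation:

```python
from collections import defaultdict

# Mapping from questionnaire item (critical success factor) to model construct,
# following the table above (item wording is hypothetical).
ITEM_TO_CONSTRUCT = {
    "improves job performance": "Performance Expectancy",
    "improves communication": "Performance Expectancy",
    "saves time": "Performance Expectancy",
    "easy to use": "Effort Expectancy",
    "instructions clear and easy to remember": "Effort Expectancy",
    "user-friendly": "Effort Expectancy",
    "concise and complete information": "Information Quality",
    "excellent helpdesk support": "Service Quality",
    "adequate training": "Facilitating Conditions",
}

# One respondent's hypothetical answers on a 1-5 Likert scale.
responses = {
    "improves job performance": 4,
    "improves communication": 3,
    "saves time": 5,
    "easy to use": 4,
    "instructions clear and easy to remember": 3,
    "user-friendly": 4,
    "concise and complete information": 2,
    "excellent helpdesk support": 5,
    "adequate training": 3,
}

# Group the item scores by construct and average them.
scores_by_construct = defaultdict(list)
for item, score in responses.items():
    scores_by_construct[ITEM_TO_CONSTRUCT[item]].append(score)

for construct, scores in scores_by_construct.items():
    print(f"{construct}: {sum(scores) / len(scores):.2f}")
```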

Page 23

Model Evaluation (Phase 2)

• Case study strategy: qualitative methods.
• Initial contact with clinical collaborators from the Nottingham Breast Institute, who have deployed the Distiller software (Slide Path, 2008), was made in April 2008.
• Users: medical research students.
• Data collected through audio recording and hand-written notes.
• The data will then be transcribed into field notes and analysed.
• Emerging themes will be identified.
• NVivo software.

Page 24

Current and Future Work

■ Investigate techniques to assign priorities among factors: Analytic Hierarchy Process (AHP) and Fuzzy Cognitive Maps (FCM); a small illustrative sketch is given below.

■ Work on knowledge representation: identify appropriate techniques to represent the knowledge (critical success factors) and to predict the rate of successful implementation of a new system.
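To make the two named techniques concrete, the sketch below shows a textbook AHP priority calculation and a single Fuzzy Cognitive Map formulation. The factor names, pairwise judgements and causal weights are invented for this illustration; they are not the presentation's actual factors or results:

```python
import numpy as np

# --- Analytic Hierarchy Process (AHP) ---
# Pairwise comparison matrix for three hypothetical factors
# (ease of use, training, information quality) on Saaty's 1-9 scale.
# A[i, j] = how much more important factor i is than factor j.
A = np.array([
    [1.0, 3.0, 0.5],
    [1/3, 1.0, 0.25],
    [2.0, 4.0, 1.0],
])

# Priorities = normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()
print("AHP priority weights:", np.round(weights, 3))

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = 0.58  # Saaty's random consistency index for n = 3
print("Consistency ratio:", round(ci / ri, 3))  # < 0.1 is usually acceptable

# --- Fuzzy Cognitive Map (FCM), one common formulation ---
# Concepts: [ease of use, training, information quality, user acceptance].
# W[i, j] = assumed causal influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0, 0.0, 0.0, 0.6],
    [0.4, 0.0, 0.0, 0.3],
    [0.0, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0],
])
state = np.array([0.7, 0.5, 0.4, 0.2])  # initial activation of each concept

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
for _ in range(10):                      # iterate towards a steady state
    state = sigmoid(state + W.T @ state)
print("FCM concept activations:", np.round(state, 3))
```

The AHP weights could then be used to rank the critical success factors by importance, while the FCM propagates assumed causal influences among factors to a concept representing acceptance of the system.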

Page 25

Conclusion

Identifying the critical factors for successful implementation of new information systems in the healthcare sector may help decision makers make better investment decisions in new technology.

The proposed integrated model will hopefully serve as a guideline for conducting evaluation studies, particularly on user acceptance of technology.

Page 26

References

[1] Ting-Ting Lee, Mary Etta Mills, Barker Bausell, Ming-Hui Lu (2008). Two-stage evaluation of the impact of a nursing information system in Taiwan. International Journal of Medical Informatics, 77, 698-707.
[2] Gray Southon, Chris Sauer, Kit Dampney (1999). Lessons from a failed information system initiative: issues for complex organisations. International Journal of Medical Informatics, 55, 33-46.
[3] F. Gremy, J.M. Fessler, M. Bonnin (1999). Information systems evaluation and subjectivity. International Journal of Medical Informatics, 56, 13-23.
[4] Morris F. Collen (1986). Origins of Medical Informatics. The Western Journal of Medicine, 145, 778-785.
[5] Maryati Mohd Yusof, Anastasia Papazafeiropoulou (2008). Investigating evaluation frameworks for health information systems. International Journal of Medical Informatics, 77, 377-385.
[6] Eun-Shim Nahm, Vinay Vaydia, Danny Ho, Barbara Scharf, Jake Seagull (2007). Outcome assessment of clinical information system implementation: A practical guide. Nursing Outlook, 55, 282-288.
[7] Maryati Mohd Yusof, Jasna Kuljis, Anastasia Papazafeiropoulou, Lampros K. Stergioulas (2008). An evaluation framework for Health Information Systems: human, organization and technology-fit factors (HOT-fit). International Journal of Medical Informatics, 77, 386-398.
[8] Nicola T. Shaw (2002). CHEATS: a generic information communication technology (ICT) evaluation framework. Computers in Biology and Medicine, 32, 209-220.
[9] M.J. van der Meijden, H.J. Tange, Troost, A. Hasman (2003). Determinants of Success of Inpatient Clinical Information Systems: A Literature Review. Journal of the American Medical Informatics Association, 10, 235-243.
[10] Mo Adam Mahmood, Janice M. Burn, Leopoldo A. Gemoets, Carmen Jacquez (2000). Variables affecting information technology end-user satisfaction: a meta-analysis of the empirical literature. International Journal of Human-Computer Studies, 52, 751-771.
[11] J.D. Martens, T. van der Weijden (2008). Feasibility and acceptability of a computerized system with automated reminders for prescribing behavior in primary care. International Journal of Medical Informatics, 77, 199-207.
[12] Jeremy C. Wyatt, Sylvia M. Wyatt (2003). When and how to evaluate health information systems? International Journal of Medical Informatics, 69, 251-259.
[13] J.C. Wyatt, J.L.Y. Liu (2002). Basic concepts in medical informatics. Journal of Epidemiology and Community Health, 56, 808-812.
[14] Bonnie Kaplan (2001). Evaluating informatics applications - clinical decision support systems literature review. International Journal of Medical Informatics, 64, 15-37.
[15] Elske Ammenwerth, Carola Iller, Cornelia Mahler (2006). IT-adoption and the interaction of task, technology and individuals: a fit framework and a case study. BMC Medical Informatics and Decision Making, 6, 1-13.

Page 27

[16] Jean-Marc Palm, Isabelle Colombet, Claude Sicotte, Patrice Degoulet (2006). Determinants of User Satisfaction with a Clinical Information System. AMIA 2006 Symposium Proceedings, 614-618.
[17] Christelle Despont-Gros, Henning Mueller, Christian Lovis (2005). Evaluating user interactions with clinical information systems: A model based on human-computer interaction models. Journal of Biomedical Informatics, 38, 244-255.
[18] Richard Heeks (2006). Health information systems: Failure, success and improvisation. International Journal of Medical Informatics, 75, 125-137.
[19] Louise K. Schaper, Graham P. Pervan (2007). ICT and OTs: A model of information and communication technology acceptance and utilization by occupational therapists. International Journal of Medical Informatics, 76S, S212-S221.
[20] William J. Doll (1991). The Measurement of End-User Computing Satisfaction: Theoretical and Methodological Issues. MIS Quarterly, 5-10.
[21] Gray Southon (1999). IT, change and evaluation: an overview of the role of evaluation in health services. International Journal of Medical Informatics, 56, 125-133.
[22] Clifford S. Goodman, Roy Ahn (1999). Methodological approaches of health technology assessment. International Journal of Medical Informatics, 56, 97-105.
[23] Mun Y. Yi, Joyce D. Jackson, Jae S. Park, Janice C. Probst (2006). Understanding information technology acceptance by individual professionals: Towards an integrative view. Information and Management, 43, 350-363.
[24] Bonnie Kaplan (2001). Evaluating informatics applications - some alternative approaches: theory, social interactionism, and call for methodological pluralism. International Journal of Medical Informatics, 64, 39-56.
[25] C.P. Friedman, J.C. Wyatt (1997). Evaluation Methods in Medical Informatics. Springer-Verlag, New York.
[26] Joan Ash, Marc Berg (2003). Report of conference Track 4: socio-technical issues of HIS. International Journal of Medical Informatics, 69, 305-306.
[27] Monice J. Garfield (2005). Acceptance of Ubiquitous Computing. Information Systems Management, 22, 24-31.
[28] Claudie Sicotte, Guy Pare, Marie-Pierre Moreault, Andree Paccioni (2006). A Risk Assessment of Two Interorganizational Clinical Information Systems. Journal of the American Medical Informatics Association, 13, 557-566.
[29] http://www.istheory.yorku.ca/UTAUT.htm
[30] http://www.fsc.yorku.ca/york/istheory/wiki/index.php/Delone_and_McLean_IS_success_model
[31] http://www.fsc.yorku.ca/york/istheory/wiki/index.php/Task-technology_fit

Page 28

Questions?
