Test Results Summary for 2014 Edition EHR Certification
15‐3498‐R‐0035‐PRA V1.0, January 5, 2016
©2016 InfoGard. May be reproduced only in its original entirety, without revision.

2.2 Gap Certification 

The following identifies the criterion or criteria certified via gap certification. 

§170.314 

  (a)(1)    (a)(19)  (d)(6)    (h)(1) 

  (a)(6)    (a)(20)  (d)(8)    (h)(2) 

  (a)(7)    (b)(5)*  (d)(9)   

  (a)(17)    (d)(1)  (f)(1)   

  (a)(18)    (d)(5)  (f)(7)*   

*Gap certification allowed for Inpatient setting only 

 No gap certification 

 

2.3 Inherited Certification 

The following identifies the criterion or criteria certified via inherited certification. 

§170.314 

  (a)(1)    (a)(16) Inpt. only  (c)(2)    (f)(2) 

  (a)(2)    (a)(17) Inpt. only  (c)(3)    (f)(3) 

  (a)(3)    (a)(18)  (d)(1)    (f)(4) Inpt. only 

  (a)(4)    (a)(19)  (d)(2)    (f)(5) Optional & Amb. only

  (a)(5)    (a)(20)  (d)(3)    (f)(6) Optional & Amb. only

  (a)(6)    (b)(1)  (d)(4)    (f)(7) Amb. only 

  (a)(7)    (b)(2)  (d)(5)    (g)(1) 

  (a)(8)    (b)(3)  (d)(6)    (g)(2) 

  (a)(9)    (b)(4)  (d)(7)    (g)(3) 

  (a)(10)    (b)(5)  (d)(8)    (g)(4) 

  (a)(11)    (b)(6) Inpt. only  (d)(9) Optional    (h)(1) 

  (a)(12)    (b)(7)  (e)(1)    (h)(2) 

  (a)(13)    (b)(8)  (e)(2) Amb. only    (h)(3) 

  (a)(14)    (b)(9)  (e)(3) Amb. only     

  (a)(15)    (c)(1)  (f)(1)     

 No inherited certification 

 


3.2.2 Test Tools 

Test Tool  Version 

Cypress  2.5.1 

ePrescribing Validation Tool  1.0.5 

HL7 CDA Cancer Registry Reporting Validation Tool  n/a 

HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool  n/a 

HL7 v2 Immunization Information System (IIS) Reporting Validation Tool  1.8.1 

HL7 v2 Laboratory Results Interface (LRI) Validation Tool  1.7.1 

HL7 v2 Syndromic Surveillance Reporting Validation Tool  1.7.1 

Transport Testing Tool  180 

Direct Certificate Discovery Tool  3.0.3 

Edge Testing Tool  n/a 

 No test tools required   

 

3.2.3 Test Data 

 Alteration (customization) to the test data was necessary and is described in Appendix A 

 No alteration (customization) to the test data was necessary  

3.2.4 Standards 

3.2.4.1 Multiple Standards Permitted 

The following identifies the standard(s) that has been successfully tested where more than one standard is permitted. 

Criterion #  Standard Successfully Tested 

(a)(8)(ii)(A)(2)    §170.204(b)(1) 

HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain 

  §170.204(b)(2) 

HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide 

(a)(13)    §170.207(a)(3) 

IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

  §170.207(j) 

HL7 Version 3 Standard: Clinical Genomics; Pedigree 

(a)(15)(i)    §170.204(b)(1)  

HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain 

  §170.204(b)(2) 

HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide 

(a)(16)(ii)    §170.210(g)  

Network Time Protocol Version 3 (RFC 1305)  

  §170.210(g) 

Network Time Protocol Version 4 (RFC 5905) 

(b)(2)(i)(A)    §170.207(i)  

The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions  

  §170.207(a)(3) 

IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 


(b)(7)(i)    §170.207(i)  

The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions  

  §170.207(a)(3) 

IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

(e)(1)(i)    Annex A of the FIPS Publication 140‐2 

SHA‐1 with RSA 

(e)(1)(ii)(A)(2)    §170.210(g)  

Network Time Protocol Version 3 (RFC 1305)  

  §170.210(g) 

Network Time Protocol Version 4 (RFC 5905) 

(e)(3)(ii)    Annex A of the FIPS Publication 140‐2 

SHA‐1 with RSA 

Common  MU Data Set (15) 

  §170.207(a)(3) 

IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

  §170.207(b)(2) 

The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT‐4) 

 None of the criteria and corresponding standards listed above are applicable 

 

3.2.4.2 Newer Versions of Standards  

The following identifies the newer version of a minimum standard(s) that has been successfully tested.  

Newer Version  Applicable Criteria 

   

 No newer version of a minimum standard was tested 

 

3.2.5 Optional Functionality 

Criterion #  Optional Functionality Successfully Tested 

(a)(4)(iii)   Plot and display growth charts 

(b)(1)(i)(B)   Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation) 

(b)(1)(i)(C)   Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols) 

(b)(2)(ii)(B)   Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation) 

(b)(2)(ii)(C)   Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols) 

(f)(3)   Ambulatory only – Create syndrome‐based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario) 

Common MU Data Set (15)  

 Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature) 

Common MU Data Set (15) 

 Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD‐10‐PCS) 

 No optional functionality tested 

 


3.2.6 2014 Edition Certification Criteria* Successfully Tested 

Criteria #  TP**  TD***    Criteria #  TP**  TD*** 

  (a)(1)    (c)(1)  1.7.1 2.5.1

  (a)(2)  1.2     (c)(2)  1.7.1 2.5.1

  (a)(3)  1.2 1.4   (c)(3)  1.7.1 2.5.1

  (a)(4)  1.4 1.3   (d)(1)   

  (a)(5)  1.4 1.3   (d)(2)  1.5  

  (a)(6)    (d)(3)  1.3  

  (a)(7)    (d)(4)  1.2  

  (a)(8)  1.2     (d)(5) 

  (a)(9)  1.3 1.3   (d)(6) 

  (a)(10)  1.2 1.4   (d)(7)  1.2  

  (a)(11)  1.3     (d)(8)   

  (a)(12)  1.3     (d)(9) Optional 

  (a)(13)  1.2     (e)(1)  1.8 1.5

  (a)(14)  1.2     (e)(2) Amb. only  1.2 1.6

  (a)(15)  1.5     (e)(3) Amb. only  1.3  

  (a)(16) Inpt. only    (f)(1) 

  (a)(17) Inpt. only    (f)(2)  1.3 1.8.1

  (a)(18)    (f)(3)  1.3 1.7.1

  (a)(19)    (f)(4) Inpt. only 

  (a)(20)    (f)(5) Optional & Amb. only 

  (b)(1)  1.7 1.4   (f)(6) Optional & Amb. only 

  (b)(2)  1.4 1.6   (f)(7) Amb. only 

  (b)(3)  1.4 1.2   (g)(1) 

  (b)(4)  1.3 1.4   (g)(2)  1.8a 2.0

  (b)(5)  1.4 1.7.1   (g)(3)  1.3  

  (b)(6) Inpt. only    (g)(4)  1.2  

  (b)(7)  1.4 1.7   (h)(1) 

  (b)(8)    (h)(2) 

  (b)(9)    (h)(3) 

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method) 

**Indicates the version number for the Test Procedure (TP) 

***Indicates the version number for the Test Data (TD)  


3.2.7 2014 Clinical Quality Measures* 

Type of Clinical Quality Measures Successfully Tested: 

  Ambulatory 

  Inpatient 

  No CQMs tested 

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures) 

Ambulatory CQMs 

CMS ID  Version  CMS ID  Version  CMS ID  Version  CMS ID  Version 

  2      90    136      155   

  22      117    137      156  v3 

  50      122    138  v3    157   

  52      123  v3  139      158   

  56      124     140      159   

  61      125     141      160   

  62      126  v3  142      161   

  64      127  v3  143      163   

  65      128     144      164   

  66      129  v4  145      165  v3 

  68  v4    130  v3  146      166   

  69      131  v3  147  v4    167   

  74      132    148      169   

  75      133    149      177  v3 

  77      134    153      179   

  82      135    154  v3    182   

 

Inpatient CQMs 

CMS ID  Version  CMS ID  Version  CMS ID  Version  CMS ID  Version 

  9      71    107      172   

  26      72    108      178   

  30      73    109      185   

  31      91    110      188   

  32      100    111      190   

  53      102    113   

   55      104    114   

  60      105    171   


3.2.8 Automated Numerator Recording and Measure Calculation 

3.2.8.1 Automated Numerator Recording 

Automated Numerator Recording Successfully Tested 

  (a)(1)    (a)(9)  (a)(16)    (b)(6) 

  (a)(3)    (a)(11)  (a)(17)    (e)(1) 

  (a)(4)    (a)(12)  (b)(2)    (e)(2) 

  (a)(5)    (a)(13)  (b)(3)    (e)(3) 

  (a)(6)    (a)(14)  (b)(4)  

  (a)(7)    (a)(15)  (b)(5) 

 Automated Numerator Recording was not tested  

3.2.8.2 Automated Measure Calculation 

Automated Measure Calculation Successfully Tested 

  (a)(1)    (a)(9)  (a)(16)    (b)(6) 

  (a)(3)    (a)(11)  (a)(17)    (e)(1) 

  (a)(4)    (a)(12)  (b)(2)    (e)(2) 

  (a)(5)    (a)(13)  (b)(3)    (e)(3) 

  (a)(6)    (a)(14)  (b)(4)  

  (a)(7)    (a)(15)  (b)(5) 

 Automated Measure Calculation was not tested  

 

3.2.9 Attestation 

Attestation Forms (as applicable)  Appendix 

 Safety‐Enhanced Design*  B 

 Quality Management System**  C 

 Privacy and Security  D 

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4) 

**Required for every EHR product 

 


Appendix A: Alteration of Test Data 

Criteria  Explanation 

(b)(4)  Determined that the modified test data had an equivalent level of robustness to the NIST test data 

 


Appendix B: Safety Enhanced Design 


Overview of the Process

While designing BlueEHS, usability was given the utmost priority. A dedicated research team collated research on achieving usability in healthcare software. The user interface was developed with an emphasis on cognitive design, so that the UI supports the user's tasks rather than running parallel to them. The principles defined in the sources below, combined with the process described below, guided the design.

University of Maryland – SHARP-C project for cognitive EHR design

Morgan Kaufmann – Designing with the Mind in Mind

http://inspiredehrs.org/

The Process

A five-step process was put in place to maximize usability, save time, and ensure efficiency in the system. Actual users were major stakeholders in confirming that the concepts genuinely helped them do their tasks.


1. Strategize — We believe strategy is the key guiding factor here because it a) articulates the brand, b) defines the guiding principles, and c) puts the organization's long-term vision into perspective. The strategy underpinning the UI design shapes the goals of the project: what the organization hopes to achieve with the project, how its success should be measured, and what priority it should have in the grand scheme of things.

Some of the techniques we use while strategizing include: 1) Contender Analysis – auditing and reviewing competing software, signing up for and user-testing it, and writing a report that summarizes the competitive landscape. 2) Brainstorming – the team gathers to work toward a conclusion on a specific problem by collecting ideas spontaneously contributed by its members.

2. Research — Also called the Discovery phase, the Research phase is probably the most variable between projects. A complex project comprises significant user and competitor research activities. Techniques we use for Research include:

a) Contextual Enquiry – we interview users in the location where they use the software, to understand their tasks and challenges.

b) Content Auditing – we review and audit a client's existing repository of content.

c) User Testing – we ask users to perform tasks in the software and to think out loud while doing so.

d) Persona Creation

3. Analysis — The aim of the Analysis phase is to draw insights from the data collected during the Research phase. Capturing, organizing, and making inferences from the "what" helps our UI designers begin to understand the "why". Communicating the designer's understanding back to end users helps confirm that any assumptions being made are valid. The technique we use for Analysis:

a) Scenarios – we envision scenarios that doctors and nurses encounter in their daily use of the EMR, place ourselves in their shoes to understand how we, as users, would want the EMR to behave, and base our design along those lines.

4. Design — The Design phase of the UX project is a collaborative one (involving input and ideas from different people) and iterative (meaning that it cycles back upon itself to validate ideas and assumptions). Building on the user feedback loop established from the previous phases, the premise of the Design phase is to put ideas in front of users, get their feedback, refine them, and repeat. These ideas may be represented by paper prototypes, interactive wireframes, or semi-functioning prototypes, all deliberately created in low-fidelity to delay any conversation relating to graphic identity, branding or visual details.

Techniques we use for design include:

a) Workflow Diagram: A graphical representation of activities and actions conducted by users of a system.

b) Prototyping: our UI designers create rough sketches of a user interface and use them in usability tests to gather feedback. Participants point to the locations on the page where they would click, and screens are presented manually based on the interactions they indicate.

c) Beta Launch: a closed beta release allows only a select group of users to use the software and provide feedback before it becomes available to the wider public.

5. Production — The Production phase is where the high-fidelity design is fleshed out, content and digital assets are created, and a high-fidelity version of the product is validated with stakeholders and end users through user-testing sessions. The role of the UI designer shifts from creating and validating ideas to collaborating with developers to guide and champion the vision.


The section below explains the usability principles relating to:

• Computerized provider order entry

• Drug-drug, drug-allergy interaction checks

• Medication list

• Medication allergy list

• Electronic prescribing

• Clinical Decision Support Engine

• Clinical Information Reconciliation

As BlueEHS uses NewCrop for eRx, this section explains the design principles connecting the system to NewCrop.

E-Prescription Module in BlueEHS (Design Concept, Access & Usability)

Based on our interactions with the doctors and nurses using BlueEHS, we narrowed e-prescription usage down to three tasks:

1) Viewing medications
2) Performing an action outside a prescription
3) Carrying out a prescription

We realized that these go hand in hand: the doctor can perform task 1 and then move on to task 3, or start with task 3 and move on to task 2 or 1. We wanted a design that facilitates this interaction.

We also understood that doctors wanted a flexible system that let them carry this out from multiple areas in the EMR rather than being confined to a single area.

Viewing Medication Lists – Medication lists record information about all the drugs a patient is currently receiving and their prescribed dosages. We wanted to keep things simple yet detailed, displaying only relevant information. Our simple-list design displays bare-bones basic information, made to be read quickly and scanned at a glance. It is easy to scan visually for the name, strength, and dosing of each medication, and the list makes it easy to search for and locate particular items. It gives the physician a broad overview of the patient's medical history and related medications.


It's a quick snapshot, intended to be viewed at a glance. A physician doing a more complex task, like e-prescribing, might prefer an interactive display with more information. In this context, the physician just needs to see the medications' names quickly, so we've avoided adding extraneous detail; concise lists are also easier to read. Physicians don't need to see the medications' quantities, start dates, or number of refills to perform this task.

The Cognitive Design Concept: Our cognitive design replicates how a doctor looks at medications. In a scenario with, say, 10 medications, the doctor does not look at all 10 at once; the focus is on one medication, and only then does he see the sig and the other details. We designed the medication lists to be easier to read by emphasizing the names of drugs and de-emphasizing everything else, recognizing that a physician's eyes need to notice the names and strengths more than they need to take in the whole line of text. Dosage instructions such as "take 1 tablet daily," while important in some contexts, are secondary pieces of information. One way of denoting that these instructions are secondary is to use gray text. The difference between this gray text and the rest is not extreme, and thus not visually jarring, but it is immediately apparent to the brain's visual processing system. Compared with this light gray text, the blue medicine names possess "preattentive attributes" and are flagged as important. With that in mind, we followed this design strategy: when the doctor chooses one medication, the sig visually becomes more prominent. The design replicates the cognitive methodology of how the doctor looks at a medication, in contrast to simply having all the medications in one place.
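To make the visual-hierarchy idea concrete, here is a minimal TypeScript sketch of a medication-list row rendered with a prominent blue name and strength and a de-emphasized gray sig. It is illustrative only, not BlueEHS source code: the Medication fields, the colors, and the renderMedicationRow helper are all hypothetical.

```typescript
// Hypothetical sketch of the "blue name / gray sig" hierarchy described above.
interface Medication {
  name: string;      // e.g., "Lisinopril"
  strength: string;  // e.g., "10 mg"
  sig: string;       // e.g., "take 1 tablet daily"
}

// Render one list row as an HTML string; the inline styles stand in for the
// report's emphasized-name / de-emphasized-sig treatment.
function renderMedicationRow(med: Medication): string {
  return (
    `<li>` +
    `<span style="color:#1a5dab;font-weight:600">${med.name} ${med.strength}</span> ` +
    `<span style="color:#777">${med.sig}</span>` +
    `</li>`
  );
}

const meds: Medication[] = [
  { name: "Lisinopril", strength: "10 mg", sig: "take 1 tablet daily" },
  { name: "Metformin", strength: "500 mg", sig: "take 1 tablet twice daily" },
];

console.log(`<ul>${meds.map(renderMedicationRow).join("")}</ul>`);
```

The point of the sketch is only the contrast in weight and color: the name carries the preattentive emphasis, while the sig stays legible but secondary.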


Accessibility

We created the system to be flexible, so that the doctor can access the e-prescription module from multiple areas in the EMR without having to return to a single area. Here are some of the areas within the system from which the doctor can easily access the eRx module.

1. Current Meds Widget in the Demographics screen/Facesheet: The doctor can easily view the patient’s current medication from the demographics screen as a widget on the right side of the screen without having to go into the encounter.

This is how we incorporated the design concept into action:

a) Emphasized the drug names (blue)

b) De-emphasized everything else


2. e-prescription from the demographics screen: The demographics screen/facesheet is the most viewed screen in the EMR. We have enabled physicians to e-prescribe directly from here, via a link at the top of the screen, without having to go into the patient encounter. This type of accessibility plays a key role when a patient walks in only for a medication refill or a change in dosage: the doctor does not have to go into the notes, saving time and improving efficiency.


3. Access from the patient note-taking screen: After completing a detailed history and physical examination and diagnosing the condition, the next important area from which the doctor prescribes medications is the note-taking area. In BlueEHS, we have functionality that enables the doctor to access the e-prescription module from multiple forms, based on the doctor's preference. Shown here is an example:


4. Allergies widget from the Demographics screen: The doctor can view the allergies of the patient from the demographics screen as a widget that populates on the right side of the screen. This re-directs the doctor to the e-prescription module.


The allergies are color-coded, with each color representing a different severity of allergy. This makes them visually distinct and easy for the doctor to understand at a glance.
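As an illustration of that color coding (not the actual BlueEHS implementation), the sketch below maps a set of assumed severity levels to badge colors so intensity is readable at a glance; the severity names and hex values are hypothetical.

```typescript
// Hypothetical severity-to-color mapping for allergy badges.
type AllergySeverity = "mild" | "moderate" | "severe";

const severityColor: Record<AllergySeverity, string> = {
  mild: "#f4c542",     // yellow: low intensity
  moderate: "#f4862a", // orange: medium intensity
  severe: "#d93025",   // red: high intensity
};

// Render an allergy as a colored badge so severity is visible without reading.
function allergyBadge(allergen: string, severity: AllergySeverity): string {
  return `<span style="background:${severityColor[severity]};color:#fff">${allergen}</span>`;
}

console.log(allergyBadge("Penicillin", "severe"));
```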

5. Medications tab as a widget within the encounter screen: The idea behind bringing this widget into the encounter screen is for the doctor to view the current medications while entering the patient's history or physical exam, without having to go back to the demographics screen. The widget can be pulled out from the right side of the screen, as seen here.

Clinical Decision Support Engine


The Clinical Decision Rule (CDR) engine drives the Clinical & Patient Reminders. Essentially, it allows rule sets to be incorporated to check, monitor, and report clinical information (vitals, history, medications, procedures, labs, etc.) in real time. It has been built to be very flexible and to support physicians in their areas of practice.

What does it do?

The CDS engine allows a doctor to set rules and reminders pertaining to a particular patient, including time limits; the reminders appear on the patient screen and prompt the doctor to carry out the task.


The Cognitive Approach: When we designed the CDR engine, we had in mind a methodology that saves the doctor time and a system that reminds the doctor to carry out a particular test or task.

For example, if a patient has been diagnosed with diabetes, a clinical reminder can be set that notifies the doctor to carry out a foot exam or an eye exam, so that the doctor doesn't miss it. A sketch of this rule-evaluation idea follows.

Thus the system cognitively thinks on behalf of the doctor, supporting and guiding him in treating the patient the way he is expected to.
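The TypeScript sketch below illustrates the rule-evaluation idea under stated assumptions; the ClinicalRule shape, the field names, and the diagnosis code are hypothetical, not the CDR engine's actual data model. A rule pairs a trigger diagnosis with a target action and a recurrence interval, and the engine flags the action when it is overdue:

```typescript
// Hypothetical sketch: a rule fires when its trigger diagnosis is present
// and the target action has never been done or is past its interval.
interface Patient {
  diagnoses: string[];                              // codes on the problem list
  lastPerformed: Record<string, Date | undefined>;  // action -> last done
}

interface ClinicalRule {
  triggerDiagnosis: string; // rule applies if this diagnosis is present
  action: string;           // e.g., "diabetic foot exam"
  intervalDays: number;     // how often the action should recur
}

function dueReminders(patient: Patient, rules: ClinicalRule[], today: Date): string[] {
  return rules
    .filter((r) => patient.diagnoses.includes(r.triggerDiagnosis))
    .filter((r) => {
      const last = patient.lastPerformed[r.action];
      if (!last) return true; // never performed -> due now
      const elapsedDays = (today.getTime() - last.getTime()) / 86_400_000;
      return elapsedDays >= r.intervalDays;
    })
    .map((r) => r.action);
}

// The diabetes example from the text: a foot exam overdue by more than a year.
const rules: ClinicalRule[] = [
  { triggerDiagnosis: "E11.9", action: "diabetic foot exam", intervalDays: 365 },
];
const patient: Patient = {
  diagnoses: ["E11.9"],
  lastPerformed: { "diabetic foot exam": new Date("2013-01-15") },
};
console.log(dueReminders(patient, rules, new Date("2015-06-26")));
// -> ["diabetic foot exam"]
```

Per the text, the real engine evaluates more than diagnoses (vitals, labs, procedures, and so on); diagnosis codes are used here only to keep the sketch small.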


Reminders can be set as Active, Passive, or Patient reminders.

Active Reminder – Shows as an alert on the patient's page, as shown below. When the doctor opens the patient's demographics screen, the reminder appears as a drop-down, reminding the doctor what to do.

Passive Reminder – The Pending Alerts button, shown below, displays the number (N) of pending alerts for the patient; clicking it shows the alerts.

Patient Reminder – A reminder is sent to the patient, either as a text message or via email.
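A small sketch of these three delivery modes as a discriminated union; the type and field names are assumptions made for illustration, not BlueEHS code:

```typescript
// Hypothetical model of the three reminder modes described above.
type Reminder =
  | { kind: "active"; message: string }                           // alert on the patient's page
  | { kind: "passive"; message: string }                          // counted behind Pending Alerts
  | { kind: "patient"; message: string; channel: "sms" | "email" }; // sent to the patient

function describe(reminder: Reminder): string {
  switch (reminder.kind) {
    case "active":
      return `Drop-down alert on the demographics screen: ${reminder.message}`;
    case "passive":
      return `Pending alert, shown when the button is clicked: ${reminder.message}`;
    case "patient":
      return `Send to patient via ${reminder.channel}: ${reminder.message}`;
  }
}

console.log(describe({ kind: "patient", channel: "sms", message: "Flu shot due" }));
```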


Viewing from the Demographics screen (example)

Viewing the Reminders

The system also gives doctors the option to view the reminders from the Demographics screen, the Encounter screen, or both, keeping them seamless to the clinical workflow.


Clinical Information Reconciliation: Clinical information reconciliation is the process by which a patient's medical record, arriving from a hospital or clinic that uses an EMR system other than BlueEHS, is converted into the BlueEHS format.

This feature lets physicians merge/reconcile patient records (CCDA format) without having to manually enter the patient's previous data, saving time and effort and enabling the physician to focus on care.

The requirement for merging records is that when transitioning a patient to another care setting, the Eligible Provider or Eligible Hospital should provide a summary care record in the CCDA format.

Once the CCDA document has been imported and uploaded into BlueEHS, you can link it to the same patient or any other patient in the system.


Once the details are uploaded and edited, you can merge them into that patient's record by clicking Approve.
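To illustrate the merge-on-approve step, here is a deliberately simplified TypeScript sketch. It assumes the imported CCDA has already been parsed into plain lists, and it deduplicates by code, keeping the existing chart entry on conflict; real CCDA reconciliation covers more sections and conflict handling, and every name here is hypothetical.

```typescript
// Hypothetical reconcile step: merge imported entries into the existing
// chart, deduplicating by code; this is what "Approve" would commit.
interface MedEntry { code: string; name: string }

interface ChartSections {
  medications: MedEntry[];
}

function reconcile(existing: ChartSections, imported: ChartSections): ChartSections {
  const byCode = new Map<string, MedEntry>(existing.medications.map((m) => [m.code, m]));
  for (const m of imported.medications) {
    if (!byCode.has(m.code)) byCode.set(m.code, m); // keep existing entry on conflict
  }
  return { medications: [...byCode.values()] };
}

const chart: ChartSections = { medications: [{ code: "197361", name: "Amlodipine 5 mg" }] };
const ccda: ChartSections = {
  medications: [
    { code: "197361", name: "Amlodipine 5 mg" },  // duplicate, dropped
    { code: "860975", name: "Metformin 500 mg" }, // new, added
  ],
};
console.log(reconcile(chart, ccda).medications.length); // -> 2
```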


EHR Usability Test Report of BlueEHS

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

BlueEHS 1.0

Date of Usability Test: June 26, 2015

Date of Report: June 27, 2015

Report Prepared by: Pablo M. Dapena

TABLE OF CONTENTS

EXECUTIVE SUMMARY………………………………………………………..2

DISCUSSION OF THE FINDINGS……………………………………...4

EFFECTIVENESS………………………………………………………...4

EFFICIENCY……………………………………………………………….4

SATISFACTION……………………………………………………………4

INTRODUCTION…………………………………………………………………5

METHOD…………………………………………………………………...5

RESULTS………………………………………………………………………..10

DATA ANALYSIS AND REPORTING………………………………….10

MAJOR FINDINGS & RECOMMENDATIONS………………………..11

TESTER’S TIMES………………………………………………………..11

APPENDICES…………………………………………………………………...12

APPENDIX 1: DEMOGRAPHICS QUESTIONNAIRE……………….12

APPENDIX 2: INFORMED CONSENT FORM………………………..17

APPENDIX 3: SYSTEM USABILITY SCALE QUESTIONNAIRE…..19

APPENDIX 4: G-3 TESTING WORKFLOWS…………………………21

REFERENCES………………………………………………………………….30


Executive Summary

A usability test of BlueEHS 1.0 was conducted on June 26, 2015 with two testers, one a Medical Assistant and the other a Licensed Practical Nurse. The purpose of this test was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 2 healthcare provider users served as participants and used the EHR in simulated but representative tasks. This study collected performance data on 6 tasks typically conducted on an EHR.

● CPOE/Medication List/Electronic prescribing
● Medication Allergy List
● Drug-drug interaction
● Drug-Allergy interaction
● Clinical Information Reconciliation
● Clinical Decision Support

Prior to the usability test, participants were asked to review and sign an informed consent/release form (included in Appendix 2); they were instructed that they could withdraw at any time. Participants were also instructed to watch training videos prior to the usability test. These videos were provided by ZH Healthcare and gave an overview of how to conduct each test. Each participant had prior experience with the EHR. During each one-on-one usability test, the participant was greeted by the facilitator, who introduced the test and recorded user performance data on paper and electronically. The facilitator did not give the participant assistance in how to complete the task. Participant video and audio were recorded for subsequent analysis. The following types of data were collected for each participant:

● Number of tasks successfully completed without assistance
● Number and types of errors
● Path deviations
● Participant’s verbalizations
● Participant’s satisfaction ratings of the system

All participant data was de-identified - no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.


Tasks | Completed | Effectiveness | Efficiency | Satisfaction

CPOE/Medication List/Electronic prescribing | Yes | 4 | 4 | Excellent
CPOE Labs | Yes | 5 | 5 | Excellent
CPOE Radiology | Yes | 5 | 5 | Excellent
Medication allergy list | Yes | 4 | 4 | Excellent
Drug-Drug interaction | Yes | 4 | 5 | Excellent
Drug-Allergy interaction | Yes | 4 | 5 | Excellent
Clinical Information Reconciliation | Yes | 3 | 3 | Good
Clinical Decision Support | Yes | 3 | 3 | Good

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks; see the SUS table in the Results section. Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.


DISCUSSION OF THE FINDINGS

The following is a discussion of the major areas in which the EHRUT was evaluated (a detailed list of findings and recommendations may be found in the Results section on pg. 11).

EFFECTIVENESS

Overall effectiveness was above average. Most tasks were completed by both testers. One task could not be properly evaluated for effectiveness, as the instructions were not clear enough; the testers did mention that they understood the main idea of the task.

EFFICIENCY

Overall efficiency was high, as there were almost no path deviations on the majority of tasks performed. Both testers verbally reported the EHRUT to be user friendly; specifically, they found the UI very intuitive. Testing did reveal some process inefficiencies, such as missing terms after entry in a prompt, allergies not included in the allergy list, and a rule not appearing after a “warning” was established. Minor improvements and additional changes related to quality, time, and the user interface are recommended to make this platform more efficient.

SATISFACTION

Satisfaction was evaluated for the two testers. Both evaluated the EHRUT on the SUS questionnaire with scores over 80. Information from this test should be taken into consideration in future testing and system development.


INTRODUCTION

The EHRUT tested for this study was BlueEHS 1.0. Designed to present medical information to healthcare providers in all facility types, the EHRUT encompasses demographics, vitals, medication prescribing and management, and decision making. The usability testing attempted to represent realistic exercises and conditions. The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as time to complete a task, number of deviations from the task, and overall impression of tasks, were captured during the usability testing.

METHOD

PARTICIPANTS

A total of 2 participants were tested on the EHRUT. Participants were a Medical Assistant and a Licensed Practical Nurse. Participants were recruited by the testing team; participation was voluntary, with no incentive given to participate. In addition, participants had no direct connection to the development of, or the organization producing, the EHRUT. Participants were given the opportunity to have the same orientation and level of training as actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics. The following table lists participants by characteristics, including demographics, professional experience, computing experience, and need for assistive technology. Participant names were replaced with participant IDs so that an individual's data cannot be tied back to their identity.

Demographics Tester 1 Tester 2

Gender Female Female

Age Range 21-23 40-59

Race/ Ethnicity Caucasian Caucasian

Occupation/Role Medical Assistant Licensed Practical Nurse

Professional Experience(in years) 1 year 10 years

Computer/ HIT Experience Intermediate Intermediate

Experience with similar software Yes Yes(Often)

Assistive Technologies used N/A N/A

Usability Testing Experience None None


STUDY DESIGN

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. During the usability test, participants interacted with 1 EHR. Each participant used the system remotely via GoToMeeting and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

● Number of tasks successfully completed within the allotted time without assistance
● Number and types of errors
● Path deviations
● Participant’s verbalizations
● Participant’s satisfaction ratings of the system

TASKS

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. CPOE/Medication List/Electronic prescribing
2. Medication Allergy List
3. Drug-drug Interaction
4. Drug-Allergy Interaction
5. Clinical Information Reconciliation
6. Clinical Decision Support

PROCEDURES

Prior to the test, each participant reviewed and signed an informed consent form (see Appendix 2) and responded to a demographics questionnaire (see Appendix 1). The facilitator moderated the session, administering instructions and tasks, and obtained post-task rating data. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

● As instructed by the facilitator. The facilitator read the documented tasks aloud step-by-step. The tester was also provided with written documentation of the instructions.

● Without further assistance. The facilitator observed the tester and did not give additional assistance unless the tester could not complete the test as specified by the instructions.


● Complete the task as specified before commenting on functionality and ease of use.

For each task, the participants were given a written copy of the task. Task timing began once the facilitator introduced the question and stopped once the participant had successfully completed the task. Scoring is discussed below. Following the session, the administrator gave the participant the post-test questionnaire (the System Usability Scale, see Appendix 3) and thanked each individual for their participation. Each participant's task success rate, time on task, errors, deviations, verbal responses, and post-test questionnaire were recorded.

TEST LOCATION

Both participants were at 1038 W. North Blvd., Suite 102, Leesburg, FL 34748, which is a physician's office. The test facility included a quiet testing room with a table, a computer for the participants, and a recording computer for the administrator. Only the participants and the administrator were in the test room. To ensure a comfortable environment, noise levels were kept to a minimum and the ambient temperature stayed within a normal range. The test was conducted remotely using GoToMeeting software, downloaded on the personal computers of the facilitator and testers. Each usability test was recorded using the GoToMeeting software; both an audio recording of the conversation and a visual recording of the tester's screen were obtained. All observers and the data logger worked from a separate room where they could see the participant's screen and listen to the audio of the session via a mic.

TEST ENVIRONMENT

For testing, the computers used were a Dell desktop running Windows 7 with the Mozilla Firefox browser and a Lenovo laptop running Windows 8.1 with the Mozilla Firefox browser. The Dell desktop used a 19″ monitor at 1280x1024 resolution (color depth: 32-bit; brightness: −10; contrast: 46; gamma: 0.9). The Lenovo laptop used a 15.6″ monitor at 1366x768 resolution (color depth: 32-bit; brightness: 0; contrast: 50; gamma: 1.0). The participants used the laptop's keyboard and trackpad when interacting with the EHRUT. The test application was set up by the vendor, ZH HealthCare, according to the vendor's documentation describing system set-up and preparation. The application itself was web-based, using a test database. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any default system settings (such as font size).

TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Demographic Questionnaire
2. Informed Consent
3. System Usability Scale
4. G-3 Testing Workflows

PARTICIPANT INSTRUCTIONS

The facilitator read the following instructions aloud to each participant:

Thank you for agreeing to participate in this study. Your input is very important. Our session today will last about 90 minutes. During that time you will use an instance of an electronic health record in a test database. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as I instruct you. Please try to complete the tasks in full before commenting on the system. Please note that we are not testing you, we are testing the system; if you have difficulty, all this means is that something needs to be improved in the system. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Should you feel it necessary, you may withdraw at any time during the testing. This testing will be recorded so we can review the results at a later date.

Following the procedural instructions, participants were shown the EHR and given their first task orally by the facilitator. Each task was done consecutively. Overall impressions of the EHR were also asked of each participant and recorded. Participants were then given 6 tasks to complete.

USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of BlueEHS, by measuring participant success rates and errors
2. Efficiency of BlueEHS, by measuring average task time and deviations
3. Satisfaction with BlueEHS, by measuring ease-of-use ratings

DATA SCORING

The following table details how tasks were scored, errors evaluated, and time data analyzed.

Measures – Rationale and Scoring

Effectiveness: Task Success – A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times the task was attempted. The results are reported as a percentage. Task times were recorded during both testing exercises.

Effectiveness: Task Failures – If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure”. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This is also expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types was collected.

Effectiveness: Task Deviations – The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path; the number of steps in the observed path is divided by the number of optimal steps to give a path-deviation ratio.

Effectiveness: Errors – If the system did not perform as expected, these errors in the system were recorded and tallied.

Satisfaction: Task Rating – The participant's subjective impression of the ease of use of the application was measured by administering the System Usability Scale (SUS) post-test questionnaire. Questions included “I think I would like to use this system frequently”, “I thought the system was easy to use”, and “I would imagine that most people would learn to use this system very quickly”. See the full System Usability Scale questionnaire in the Appendix.
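The task-level formulas above reduce to simple arithmetic, and the SUS questionnaire just described is conventionally scored with Brooke's standard formula (odd items score response − 1, even items score 5 − response, sum × 2.5). The TypeScript sketch below works these out; the function names are illustrative only.

```typescript
// Success rate as reported in the results table: successes over attempts,
// expressed as a percentage.
function successRate(successes: number, attempts: number): number {
  return (successes / attempts) * 100;
}

// Path-deviation ratio: observed steps over optimal steps (1.0 = optimal path).
function pathDeviationRatio(observedSteps: number, optimalSteps: number): number {
  return observedSteps / optimalSteps;
}

// Standard SUS scoring over ten 1-5 responses, item 1 first.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS needs 10 responses");
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r), // odd items: r-1; even: 5-r
    0
  );
  return sum * 2.5; // 0-100 scale
}

// Clinical Decision Support row from the results below: 18 observed steps
// against 14 optimal gives a deviation ratio of about 1.29.
console.log(pathDeviationRatio(18, 14).toFixed(2)); // "1.29"
console.log(successRate(3, 4));                     // 75 (e.g., 3 of 4 attempts)
```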


RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions would have had their data excluded from the analyses; no participant failed to follow the instructions, so no data was excluded. The usability testing results for the EHRUT are detailed below:

Task | # | Task Success | Path Deviations (Observed/Optimal) | Task Time, Mean (min) | Optimal Time (min) | System Errors | User Errors | Task Rating (5 = easy)

CPOE/Medication List/Electronic prescribing | 2 | 100% | 13/13 | 3.12 | 2.5 | 0 | 0 | 4
CPOE Lab | 2 | 100% | 7/7 | 3.19 | 3 | 0 | 0 |
CPOE Radiology | 2 | 100% | 9/9 | 2.72 | 3 | 0 | 0 | 5
Medication Allergy List | 2 | 80% | 9/7 | 2.20 | 1.5 | 0 | 0 | 4
Drug-Drug interaction | 2 | 100% | 6/6 | 2.95 | 3 | 0 | 0 | 4.5
Drug-Allergy interaction | 2 | 100% | 10/10 | 2.27 | 2.5 | 0 | 0 | 4.5
Clinical Information Reconciliation | 2 | 100% | 13/13 | 3.12 | 3 | 0 | 0 | 3
Clinical Decision Support | 2 | 75% | 18/14 | 3.86 | 3.5 | 0 | 2* | 3

* These errors are defined in the ‘Major Findings & Recommendations’ section.

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks, as shown in the table below. Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.


Measure | Tester #1 | Tester #2 | Overall Evaluation

System Usability Scale | 82.5 | 87.5 | 85

MAJOR FINDINGS & RECOMMENDATIONS

- Both participants mentioned that the UI was very easy to navigate and intuitive. There were no major complications during the testing.
- Although there was no visible logout link, participants were able to figure out where to click to log out.
- The workflow document for “Clinical Decision Support” was somewhat difficult for both testers to comprehend.
- Both testers mentioned that the CPOE Lab and CPOE Radiology modules were extremely simple and easy to use with the directions provided.
- Although only 6 specific areas of the EMR were tested, both users mentioned that they were very comfortable navigating the EMR and that it appeared easy to use.
- The CPOE/Medication List/Electronic prescribing task had no complications. Both users mentioned that the instructions were easy to follow.
- The Medication Allergy List task had no complications. Both users completed the task with no issues and mentioned that the instructions were easy to follow.
- The Drug-Drug interaction task had no major complications. Both users completed the task with no issues and mentioned that the instructions were easy to follow.
- The Drug-Allergy interaction task had no complications.
- Clinical Information Reconciliation had no major complications. Both users mentioned that the instructions were easy to follow.
- Clinical Decision Support did have some complications due to user errors. Both users mentioned that the workflow documentation was not clear about the action to be performed on the actual interface, which led the testers to click on multiple areas to understand the usage; this could have been avoided. They did understand the need for the feature and its purpose once they got into the interface fully. Their recommendation was to make the documentation simpler and more closely tied to the activity at hand. This was experienced only in the CDS setup screens, not in the alert sections.

TIME PER TASK:

Task | Tester #1 | Tester #2 | Mean (decimal minutes)

CPOE/Medication List/Electronic prescribing | 2 min 30 sec | 3 min 45 sec | 3.12
CPOE Lab | 2 min 49 sec | 3 min 49 sec | 3.19
CPOE Radiology | 2 min 27 sec | 2 min 59 sec | 2.72
Medication allergy list | 1 min 50 sec | 2 min 35 sec | 2.20
Drug-Drug interaction | 2 min 45 sec | 3 min 9 sec | 2.95
Drug-Allergy interaction | 2 min 11 sec | 2 min 22 sec | 2.27
Clinical Information Reconciliation | 4 min 5 sec | 2 min 10 sec | 3.12
Clinical Decision Support | 3 min 45 sec | 3 min 59 sec | 3.86
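The mean column matches the Task Time column in the results table above: the two testers' times are averaged in seconds and reported in decimal minutes. A small sketch of that computation (the helper names are illustrative):

```typescript
// Convert a min:sec reading to seconds.
function toSeconds(min: number, sec: number): number {
  return min * 60 + sec;
}

// Average a list of [min, sec] readings and report decimal minutes.
function meanDecimalMinutes(times: Array<[number, number]>): number {
  const totalSec = times.reduce((acc, [m, s]) => acc + toSeconds(m, s), 0);
  return totalSec / times.length / 60;
}

// CPOE Radiology: testers took 2:27 and 2:59, averaging 163 s = 2.72 min.
console.log(meanDecimalMinutes([[2, 27], [2, 59]]).toFixed(2)); // "2.72"
```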


APPENDICES

APPENDIX 1: DEMOGRAPHICS QUESTIONNAIRE

Tester #1



Tester #2



APPENDIX 2: INFORMED CONSENT FORM

Consent Form: Remote Usability Test (Adult)

Tester #1


Tester #2


APPENDIX 3: SYSTEM USABILITY SCALE QUESTIONNAIRE

Tester #1


Tester #2


(a)(1) CPOE Lab


(a)(1) CPOE Radiology


CDR ENGINE

© 2013 ZH Healthcare. All Rights Reserved

Contents

Introduction
Getting to the module
1) Basic Details
2) Demographics Filter Criteria
3) Target/Action Groups


Introduction

The Clinical Decision Rule (CDR) engine is the engine behind the Clinical & Patient Reminders.

Essentially, this is an engine that allows rule sets to be incorporated to check, monitor, and report clinical information (vitals, history, medications, procedures, labs, etc.) in real time. It is built to be very flexible and to support internationalization.

Getting to the module

Access the CDR Engine by clicking the icon on the top left side of the screen and finding the CDR engine in the drop-down box, as shown.


This brings you to the CDR engine page, which comes with a predefined set of clinical reminders. You also have the option to Add New Rule.

When you click Add New Rule, a window opens under the headings:

1. Basic Details
2. Demographics Filter Criteria
3. Target/Action Groups
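As a rough illustration of how these three sections could fit together in one rule record, here is a minimal Python sketch; every field name is an assumption made for clarity, not the actual BlueEHS schema:

from dataclasses import dataclass, field

@dataclass
class BasicDetails:
    title: str                  # name of the new rule
    rule_type: str              # "Active", "Passive", or "Patient"
    display: str                # "Demographics" or "Encounter"
    admin_lock: bool = False    # restrict edit/delete to privileged users

@dataclass
class DemographicsFilter:
    field_name: str             # e.g. "Sex", "Age", "Diagnosis"
    value: str                  # e.g. "Male" or an age range
    include: bool = True        # inclusion vs. exclusion

@dataclass
class TargetAction:
    target: str                 # e.g. a Lifestyle or Custom target
    action: str                 # the action tied to that target
    frequency: int = 1          # how often the action applies
    interval_days: int = 365    # interval corresponding to the action

@dataclass
class CDRRule:
    basic: BasicDetails
    filters: list[DemographicsFilter] = field(default_factory=list)
    targets: list[TargetAction] = field(default_factory=list)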


In Basic Details, reminders can be set as Active, Passive, or Patient.

Active Reminder – Shows as an Alert screen on the patient’s page as shown below.


Passive Reminder – Appears under Pending Alerts: clicking the Pending Alerts button, shown below, displays how many alerts (N) are pending for the patient.

Patient Reminder – A reminder is sent to the patient either as a text message or via email.


In the Add New Rule window, under Basic Details you can enter the rule definition:

Clone: takes a clone of an existing rule.

Title: the name of the new rule.

Type: Active, Passive, or Patient.

Display: where you want the reminder displayed – the Demographics or Encounter screen.

Admin Lock: if Admin Lock is checked, only privileged users can edit or delete the rule. Privileges can be set from the module installer screen. The Clinical Warning and Patient Warning options can be found under "Reminder" as shown below. Once you are done entering all the details, click SAVE to save your work.

1) Basic Details


Reminder Settings: In the reminder settings there are three priority levels (Priority 1, Priority 2, and Priority 3) and an option for a message, as shown below.

Clinical Warning: The clinical warning interval can be set in years, weeks, months, or days. This setting causes a warning message to appear on the patient screen that far ahead of the date of action. For instance, if the option is set to "1 week," the warning appears 1 week before the action date.

Priority 3, 2 & 1: Priority dates can be set; they serve as reminders after the set date if no action is taken.

Message: This is where you enter the message you want to display.
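The timing behavior described above amounts to two simple date rules; a hedged sketch in Python (the function names and day-based units are assumptions):

from datetime import date, timedelta

def clinical_warning_active(today, action_date, warn_days=7):
    # The warning shows from `warn_days` before the action date onward.
    return action_date - timedelta(days=warn_days) <= today < action_date

def priority_reminder(today, p1_date, p2_date, p3_date):
    # After a priority date passes with no action, that reminder fires;
    # the most urgent level whose date has passed wins.
    if today >= p1_date:
        return "Priority 1"
    if today >= p2_date:
        return "Priority 2"
    if today >= p3_date:
        return "Priority 3"
    return None

# Example: warn one week ahead of an action due 2015-06-01.
print(clinical_warning_active(date(2015, 5, 27), date(2015, 6, 1)))  # True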


The system takes you to the next heading, Demographics, only once you click SAVE. In the Criteria tab there is an option to add or edit groups. The Demographics Filter Criteria enables you to filter patients based on your preferences under the following headings.

2) Demographics Filter Criteria


You can also edit the Demographics Filter Criteria by clicking the Edit option next to the “Default” drop-down menu.


Sex: Select and enter the necessary details and click Add.

Inclusion: If Male is selected and Inclusion is set to Yes, then only males are included and females excluded, and vice versa.

Age: Select the age under the criteria shown below, along with the unit and inclusion criteria, and click Add.
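In other words, each criterion either includes or excludes the patients it matches. A minimal sketch of that logic (function and parameter names are illustrative):

def passes_sex_filter(patient_sex, filter_sex, include=True):
    # Inclusion = Yes keeps matching patients; Inclusion = No drops them.
    matches = patient_sex.lower() == filter_sex.lower()
    return matches if include else not matches

def passes_age_filter(age, low, high, include=True):
    in_range = low <= age <= high
    return in_range if include else not in_range

# A male-only rule: males pass the filter, females do not.
print(passes_sex_filter("Male", "Male", include=True))    # True
print(passes_sex_filter("Female", "Male", include=True))  # False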


Custom Table: The custom table contains a preset list of tables already entered in the system.

Diagnosis: Filtering can also be based on the patient’s diagnosis, under the following tab.


Lifestyle: Lifestyle filters cover Alcohol, Coffee, and Tobacco. These can be edited on the patient demographics screen.

Medical Issue: This filter is based on the patient’s medical issue; enter the medical issue and click Add.


Medication: This filter is based on the name of the medication; add the medication and click Add.

Allergy: This filter is based on the type of allergy (e.g., pine nuts); enter the allergy description and click Add.


Patient – The filter can also be applied based on the patient name, as shown below. Add the patient name and click Add.


3) Target/Action Groups

Under Target/Action Groups, you can set a target and the action corresponding to that particular target. Targets can be set under Lifestyle and Custom, with the frequency and interval corresponding to that action.


The Custom input Yes/No option produces the screen shown below; it appears as a reminder on the patient screen, and clicking it gives you a drop-down option where you can record whether the necessary action has been taken.

If you want to add more than one target and action to a particular rule, click “Add New Target/Action,” enter the respective details, and click Save.


Clinical Information Reconciliation

Overview: Clinical Information Reconciliation is the ability to enable a user to electronically reconcile the data that represents a patient’s active medication, problem, and medication allergy lists from a Consolidated Patient Summary and consume it into the EHR. This is a requirement of the EHR. Sites are encouraged to utilize it as a best practice; however, it is not an objective that has to be met for Meaningful Use. This is an application that does have to be activated by CPSI.
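Conceptually, reconciliation merges each CCDA list (medications, problems, allergies) into the chart without creating duplicates. A minimal sketch of that merge, assuming entries carry a coded identifier such as an RxNorm code (the codes below are illustrative only):

def reconcile(ehr_list, ccda_list):
    # Merge the CCDA entries into the EHR's active list, skipping
    # anything the chart already holds (matched on the code).
    seen = {item["code"] for item in ehr_list}
    merged = list(ehr_list)
    for item in ccda_list:
        if item["code"] not in seen:
            merged.append(item)
            seen.add(item["code"])
    return merged

active_meds = [{"code": "197361", "name": "amlodipine 5 mg tablet"}]
ccda_meds = [
    {"code": "197361", "name": "amlodipine 5 mg tablet"},   # duplicate
    {"code": "310965", "name": "ibuprofen 200 mg tablet"},  # new entry
]
print(reconcile(active_meds, ccda_meds))  # two entries, no duplicate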

Steps on How to Perform Reconciliation:

Getting into the Care Coordination screen

Log into BlueEHS and go to the “Menu” located on the top left side of the screen. Select “Care Coordination” from the menu. On doing so, you will be directed to the Care Coordination screen.


Importing Patient CCDA

To import a patient CCDA, or Consolidated Clinical Document Architecture, click on the Import tab in the Care Coordination screen and select the CCDA sub-tab. Before we can import the patient CCDA within the system, we first have to upload the patient CCDA from an external source into the system.


Uploading a patient’s CCDA

To upload the CCDA, click on the “Upload” button in the CCDA screen. In the above image, the area marked in red is where we click to upload the patient CCDA.


Once you click on the Upload button you will get a pop-up screen. Here you will have to search for the patient CCDA. Once you select the patient CCDA, click Open and the pop-up will disappear. The below image shows how the CCDA is uploaded.

Once the CCDA is uploaded, you will be able to view the patient CCDA general details, as shown in the below image.


The area marked in red is where you can view the patient’s CCDA general information, such as the date the CCDA was uploaded, the name of the user who uploaded it, and the patient’s name and date of birth. If the patient is already in the system, then once you upload that same patient’s CCDA, the heading “Match found” will show “Yes”.
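Those general details live in the CCDA’s XML header. A hedged sketch of pulling them out with the Python standard library (the element paths follow the HL7 CDA schema; the function itself is illustrative):

import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # namespace used by CDA documents

def ccda_general_details(path):
    # Read the patient name and date of birth from the CCDA header.
    root = ET.parse(path).getroot()
    patient = root.find("hl7:recordTarget/hl7:patientRole/hl7:patient", NS)
    given = patient.find("hl7:name/hl7:given", NS).text
    family = patient.find("hl7:name/hl7:family", NS).text
    dob = patient.find("hl7:birthTime", NS).get("value")  # YYYYMMDD
    return {"name": f"{given} {family}", "dob": dob}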

Viewing Patient Information from the CCDA screen


If you wish to view the patient’s information from the CCDA screen, you can click on the View button. In the picture shown above, the area marked in red is where you can click to view the patient’s information.

The above is a sample of the patient information that you can view from the CCDA screen.


Adding a patient from the CCDA screen

Once you have uploaded the patient’s CCDA, you will be able to view an icon of a person just beside the “Discard” button. On selecting the icon, the patient’s CCDA will disappear from the CCDA screen and will be added into the system as a patient. In the picture shown above, the area marked in red is where you can click to add the patient. Once you have added the patient, you can go into the patient search area to get the patient information. Below is an example of how to search for the patient.


Here you can see that all the vital information and the patient’s last encounter details have been populated in the demographics area.


Once you enter the patient’s demographics, you will see that all the vital, medical, and allergy information has been added automatically; if the patient had any previous encounters, those will also be viewable.

In the future, if you add a new CCDA for the same patient, the system will automatically detect the patient and inform you that a match has been found. The system will show “Yes” under Match found and will also populate the PID and the name of the matching patient.
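A minimal sketch of that match check, comparing the uploaded CCDA’s name and date of birth against patients already on file (the matching rule is an assumption for illustration):

def find_match(ccda_details, existing_patients):
    # Return the first patient whose name and DOB match the CCDA; this
    # is what lets the screen report "Match found: Yes" and surface
    # that patient's PID.
    for patient in existing_patients:
        if (patient["name"].lower() == ccda_details["name"].lower()
                and patient["dob"] == ccda_details["dob"]):
            return patient
    return None

match = find_match({"name": "Jane Doe", "dob": "19700101"},
                   [{"pid": 42, "name": "Jane Doe", "dob": "19700101"}])
print(match["pid"] if match else "No match")  # 42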

Updating a patient’s current information


Once you have uploaded a new CCDA for the same patient, the system will provide you with an option to update the patient’s information. You will be provided an edit icon; on clicking it, you will get a page where you can edit the patient’s information. Shown below are the screens where you will get the edit option and where you can edit the patient details.

Marked in red are the areas where you can change the patient information, and marked in green is the last updated patient information. Once you have updated the information, you can click “Approve” to confirm the changes, or, if you do not wish to, you can click “Discard” to deny the changes.


You can also click on Import all to make all the necessary changes at once.



Appendix C: Quality Management System 


Quality Management System Attestation Form-EHR-37-V03


For reporting information related to testing of 170.314(g)(4).

Vendor and Product Information

Vendor Name Z&H Health Care Solutions, LLC

Product Name BlueEHS

Quality Management System

Type of Quality Management System (QMS) used in the development, testing, implementation, and maintenance of the EHR product.

Based on Industry Standard (for example ISO9001, IEC 62304, ISO 13485, etc.). Standard:

A modified or “home-grown” QMS.

No QMS was used.

Was one QMS used for all certification criteria or were multiple QMS applied?

One QMS used.

Multiple QMS used.

Description or documentation of QMS applied to each criteria:

Not Applicable.

Statement of Compliance

I, the undersigned, attest that the statements in this document are complete and accurate.

Vendor Signature by an Authorized Representative

Date 05/27/2015


Appendix D: Privacy and Security 


Privacy and Security Attestation Form-EHR-36-V04


Vendor and Product Information

Vendor Name Z&H Health Care Solutions, LLC

Product Name BlueEHS

Privacy and Security

170.314(d)(2) Auditable events and tamper-resistance

Not Applicable (did not test to this criterion)

Audit Log:

Cannot be disabled by any user.

Audit Log can be disabled.

The EHR enforces that the audit log is enabled by default when initially configured.

Audit Log Status Indicator:

Cannot be disabled by any user.

Audit Log Status can be disabled

The EHR enforces a default audit log status. Identify the default setting (enabled or disabled):

There is no Audit Log Status Indicator because the Audit Log cannot be disabled.

Encryption Status Indicator (encryption of health information locally on end user device):

Cannot be disabled by any user.

Encryption Status Indicator can be disabled

The EHR enforces a default encryption status. Identify the default setting (enabled or disabled): enabled

There is no Encryption Status Indicator because the EHR does not allow health information to be stored locally on end user devices.

Identify the submitted documentation that describes the inability of the EHR to allow users to disable the audit logs, the audit log status, and/or the encryption status: D2_Audit Reports.docx

Identify the submitted documentation that describes the method(s) by which the EHR protects 1) recording of actions related to electronic health information, 2) recording of audit log status, and 3) recording of encryption status from being changed, overwritten, or deleted by the EHR technology: D2_Audit Reports.docx

Identify the submitted documentation that describes the method(s) by which the EHR technology detects whether the audit log has been altered: D2_Audit Reports.docx

170.314(d)(7) End-user device encryption

Storing electronic health information locally on end-user devices (e.g., temp files, cookies, or other types of cache approaches).

Not Applicable (did not test to this criterion)

The EHR does not allow health information to be stored locally on end-user devices.

Identify the submitted documentation that describes the functionality used to prevent health information from being stored locally: D7_End-user Device Encryption.docx

The EHR does allow health information to be stored locally on end user devices.

Identify the FIPS 140-2 approved algorithm used for encryption:

Identify the submitted documentation that describes how health information is encrypted when stored locally on end-user devices:

The EHR enforces default configuration settings that either enforce the encryption of locally stored health information or prevent health information from being stored locally.

Identify the default setting:

170.314(d)(8) Integrity

Not Applicable (did not test to this criterion)

Identify the hashing algorithm used for integrity (SHA-1 or higher): SHA-1
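As a minimal sketch of what a SHA-1 integrity check involves (the payload is illustrative; hashlib is in the Python standard library):

import hashlib

def sha1_digest(payload):
    # Hash the exact bytes that were transmitted.
    return hashlib.sha1(payload).hexdigest()

document = b"<ClinicalDocument>...</ClinicalDocument>"
sent_digest = sha1_digest(document)

# The receiver recomputes the digest; any mismatch means the payload
# was altered in transit.
assert sha1_digest(document) == sent_digest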

170.314(e)(1) View, Download, and Transmit to 3rd Party

Not Applicable (did not test to this criterion)

Identify the FIPS 140-2 approved algorithm used for encryption:

AES

Identify the FIPS 140-2 approved algorithm used for hashing:

SHA-1
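For illustration, a hedged sketch of AES encryption using the third-party cryptography package; the AES-GCM mode and 256-bit key size are assumptions, since the form only identifies “AES”:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # illustrative key handling
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # must be unique per message

ciphertext = aesgcm.encrypt(nonce, b"patient summary payload", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"patient summary payload"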

170.314(e)(3) Secure Messaging

Not Applicable (did not test to this criterion)

Identify the FIPS 140-2 approved algorithm used for encryption:

AES

Identify the FIPS 140-2 approved algorithm used for hashing:

SHA-1

Statement of Compliance

I, the undersigned, attest that the statements in this document are accurate.

Vendor Signature by an Authorized Representative Shameem C Hameed, Chairman

Date 05/27/2015


  

Test Results Summary Document History  

Version  Description of Change  Date 

V1.0  Initial release  January 5, 2016 

  

END OF DOCUMENT