IG-2552-14-0099 - UL Transaction Security

Transcription

Test Results Summary for 2014 Edition EHR Certification
14‐2552‐R‐0089‐PRA V1.1, February 28, 2016
ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification
Part 1: Product and Developer Information
1.1 Certified Product Information
Product Name: OnCallData
Product Version: 5.0
Domain: Ambulatory
Test Type: Modular EHR
1.2 Developer/Vendor Information
Developer/Vendor Name: InstantDx, LLC
Address: 9801 Washingtonian Boulevard, Suite 240, Gaithersburg, MD 20878
Website: www.oncalldata.com
Email: support@instantdx.com
Phone: (301) 208‐8000
Developer/Vendor Contact: Call Center
Part 2: ONC‐Authorized Certification Body Information
2.1 ONC‐Authorized Certification Body Information
ONC‐ACB Name: InfoGard Laboratories, Inc.
Address: 709 Fiero Lane, Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: ehr@infogard.com
Phone: (805) 783‐0810
ONC‐ACB Contact: Adam Hardcastle
This test results summary is approved for public release by the following ONC‐Authorized Certification Body Representative:
Adam Hardcastle
ONC‐ACB Authorized Representative
Function/Title: EHR Certification Body Manager
Signature and Date: 2/28/2016
©2016 InfoGard. May be reproduced only in its original entirety, without revision
2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification:
§170.314: (a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)
*Gap certification allowed for Inpatient setting only
No gap certification
2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification:
§170.314: (a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) Inpt. only, (b)(7), (c)(1), (c)(2), (c)(3), (d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1), (f)(2), (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1), (g)(2), (g)(3), (g)(4)
No inherited certification
Part 3: NVLAP‐Accredited Testing Laboratory Information
Report Number: 14‐2552‐R‐0089 V1.3
Test Date(s): November 14‐December 24, 2014
Location of Testing: InfoGard and Vendor Site
3.1 NVLAP‐Accredited Testing Laboratory Information
ATL Name: InfoGard Laboratories, Inc.
Accreditation Number: NVLAP Lab Code 100432‐0
Address: 709 Fiero Lane, Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: ehr@infogard.com
Phone: (805) 783‐0810
ATL Contact: Milton Padilla
For more information on scope of accreditation, please reference http://ts.nist.gov/Standards/scopes/1004320.htm
Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
Milton Padilla
ATL Authorized Representative
Function/Title: EHR Test Body Manager
Signature and Date: 2/28/2016
3.2 Test Information
3.2.1 Additional Software Relied Upon for Certification
Additional Software: First Databank
Applicable Criteria: (a)(2), (a)(10), (b)(3)
Functionality provided by Additional Software: Drug database
No additional software required
3.2.2 Test Tools
Test Tool (Version):
Cypress (1.0.4)
ePrescribing Validation Tool
HL7 CDA Cancer Registry Reporting Validation Tool
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool
HL7 v2 Laboratory Results Interface (LRI) Validation Tool
HL7 v2 Syndromic Surveillance Reporting Validation Tool
Transport Testing Tool
Direct Certificate Discovery Tool
No test tools required
3.2.3 Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter ]
No alteration (customization) to the test data was necessary
3.2.4 Standards
3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that have been successfully tested where more than one standard is permitted:
(a)(8)(ii)(A)(2): §170.204(b)(1), HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain; and §170.204(b)(2), HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide
(a)(13): §170.207(a)(3), IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release; and §170.207(j), HL7 Version 3 Standard: Clinical Genomics; Pedigree
(a)(15)(i): §170.204(b)(1), HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain; and §170.204(b)(2), HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide
(a)(16)(ii): §170.210(g), Network Time Protocol Version 3 (RFC 1305); and §170.210(g), Network Time Protocol Version 4 (RFC 5905)
(b)(2)(i)(A): §170.207(i), the code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions; and §170.207(a)(3), IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(b)(7)(i): §170.207(i), the code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions; and §170.207(a)(3), IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(e)(1)(i): Annex A of the FIPS Publication 140‐2 (list encryption and hashing algorithms)
(e)(1)(ii)(A)(2): §170.210(g), Network Time Protocol Version 3 (RFC 1305); and §170.210(g), Network Time Protocol Version 4 (RFC 5905)
(e)(3)(ii): Annex A of the FIPS Publication 140‐2 (list encryption and hashing algorithms)
Common MU Data Set (15): §170.207(a)(3), IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release; and §170.207(b)(2), the code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT‐4)
None of the criteria and corresponding standards listed above are applicable
3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:
Newer Version | Applicable Criteria
No newer version of a minimum standard was tested
3.2.5 Optional Functionality
(a)(4)(iii): Plot and display growth charts
(b)(1)(i)(B): Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C): Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B): Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C): Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(f)(3): Ambulatory setting only. Create syndrome‐based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15): Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD‐10‐PCS)
No optional functionality tested
3.2.6
2014 Edition Certification Criteria* Successfully Tested
Criteria #
Version
TP**
TD***
Criteria #
Version
TP**
TD***
(a)(1)
(c)(3)
(a)(2)
(d)(1)
1.2
(a)(3)
(d)(2)
1.5
(a)(4)
(d)(3)
(a)(5)
(d)(4)
(a)(6)
(d)(5)
(a)(7)
(d)(6)
(a)(8)
(d)(7)
(a)(9)
(d)(8)
(a)(10)
1.2
1.4 (d)(9) Optional
(a)(11)
(e)(1)
(a)(12)
(e)(2) Amb. only
(a)(13)
(e)(3) Amb. only
(a)(14)
(f)(1)
(a)(15)
(f)(2)
(a)(16) Inpt. only
(f)(3)
(f)(4) Inpt. only
(a)(17) Inpt. only
(b)(1)
(f)(5) Optional & Amb. only
(b)(2)
(b)(3)
1.4
1.2 (f)(6) Optional & Amb. only
(b)(4)
(b)(5)
(g)(1)
1.8a
(b)(6) Inpt. only
(g)(2)
(b)(7)
(g)(3)
1.3
(c)(1)
(g)(4)
1.2
(c)(2)
*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested: Ambulatory | Inpatient | No CQMs tested
*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs (CMS ID):
2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68, 69, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156, 157, 158, 159, 160, 161, 163, 164, 165, 166, 167, 169, 177, 179, 182

Inpatient CQMs (CMS ID):
9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190
3.2.8 Automated Numerator Recording and Measure Calculation
3.2.8.1 Automated Numerator Recording
Automated Numerator Recording Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)
Automated Numerator Recording was not tested
3.2.8.2 Automated Measure Calculation
Automated Measure Calculation Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)
Automated Measure Calculation was not tested
3.2.9 Attestation
Attestation Forms (as applicable):
Safety‐Enhanced Design*: Appendix A
Quality Management System**: Appendix B
Privacy and Security: Appendix C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product
Appendix A: Safety Enhanced Design
An inaccurate description of the summative usability testing measures used for Effectiveness, Efficiency, and Satisfaction was provided in the "Results" section of the report. The information provided in the table of results data did not match the results as described for measures of Effectiveness, Efficiency, and Satisfaction.
The following required data was missing:
1. Display information
2. Entity who set up the application
EHR Usability Test Report of OnCallData
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
OnCallData v5.0
Date of Usability Test: November 10, 2014
Date of Report: November 13, 2014
Report Prepared By: Glen Ikeda, InstantDx
800-576-0526
support@instantdx.com
9801 Washingtonian Boulevard, Suite 240
Gaithersburg, MD 20878
Table of Contents
Executive Summary
INTRODUCTION
METHOD
PARTICIPANTS
STUDY DESIGN
TASKS
PROCEDURE
TEST ENVIRONMENT
PARTICIPANT INSTRUCTIONS
USABILITY METRICS
RESULTS
DATA ANALYSIS AND REPORTING
DISCUSSION OF THE FINDINGS
InstantDx User Centered Design Process
EXECUTIVE SUMMARY
A usability test of OnCallData v5.0, a modular EHR, was conducted on November 10, 2014 in
Washington, DC by InstantDx, LLC. The purpose of this test was to test and validate the usability
of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT).
During the usability test, 2 healthcare providers matching the target demographic criteria served
as participants and used the EHRUT in simulated, but representative, tasks.
This study collected performance data on 9 tasks typically conducted on an EHR:
• Search for Patient
• Patient Eligibility Check and Medication History Retrieval
• View Medications
• View Allergies
• Record Allergy
• Record Medication
• Prescribe a Medication
• Check Drug Formulary
• Check Drug/Drug and Drug/Allergy Interaction
During the 45 minute one-on-one usability test, each participant was greeted by the
administrator. Participants had prior experience with the EHR.
The administrator introduced the test, and instructed participants to complete a series of tasks
(given one at a time) using the EHRUT. During the testing, the administrator timed the test and
recorded user performance data on paper. The administrator did not give the participant
assistance in how to complete the task. No compensation was provided to the participants for
their time.
The following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of
the participant to the data collected.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to
the Processes Approach for Improving the Usability of Electronic Health Records, were used to
evaluate the usability of the EHRUT. Following is a summary of the performance and rating data
collected on the EHRUT.
Task | Participant Verbalization | Participant Satisfaction | Errors | Time for Completion
Search Patient | None | High | 0 | 30 Seconds
Patient Drug Insurance Eligibility Check and Med History Retrieval | None | High | 0 | Automatic
View Allergies | None | High | 0 | 30 Seconds
View Medication | None | High | 0 | Under 1 minute
Record Medication | None | High | 0 | Under 1 minute
Record Allergy | None | High | 0 | Under 1 minute
Prescribe Medication | None | High | 0 | Under 2 minutes
Check Drug Formulary | None | High | 0 | Under 2 minutes
Check Drug/Drug and Drug/Allergy Interactions | None | High | 0 | Under 1 minute
The results from the System Usability Scale scored the subjective satisfaction with the system
based on performance with these tasks to be: [xx]. (This information was not collected.)
Major findings
In addition to the performance data, the following qualitative observations were made:
No individual task took over 3 minutes to perform. Much of the time completing the tasks involved
reviewing the data and not application performance. The participants did not ask any questions while
performing the tasks but did provide feedback after the tasks were complete.
No application errors were observed, and no participant errors in performing the tasks were observed.
The participants generally provided positive feedback on the ease and usefulness of the tasks
performed. They found the workflow intuitive and there were positive comments on how the
application fits into their existing office workflow.
The participants liked that patient drug insurance eligibility and medication history from retail
pharmacies and PBMs were automatically retrieved.
The display of drug formulary information was favorably commented on for its usefulness in allowing
them to prescribe a drug which will entail the least expense for the patient. One participant felt we
could better highlight some of the formulary information.
The automatic retrieval of patient medication history from PBMs and pharmacies was favorably
commented on for its ease in providing a complete medication history for the patient. The ease of
reconciling the active medications was also viewed favorably.
The fact that medication history, patient problems, patient allergies, and patient notes are all displayed
on one page was viewed favorably as they could view the information without clicking around to other
pages.
The page where we display drug/drug and drug/allergy interactions contains a lot of information, making
it somewhat hard to read.
Areas for improvement
• Certain drug formulary information could be better highlighted
• Make the page displaying drug/drug and drug/allergy interactions more readable
INTRODUCTION
The EHRUT tested for this study was OnCallData v5.0, designed to present medical information
to healthcare providers in medical practice office settings. The EHRUT includes electronic
prescribing, medication history, patient allergies, and patient problems. The usability testing
attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and
provide evidence of usability in the EHR. To this end, measures of effectiveness, efficiency and
user satisfaction, such as time for completion, ease of use and workflow, were captured during
the usability testing.
METHOD
INTENDED USERS
OnCallData v5.0 is used primarily for electronically prescribing medications.
OnCallData v5.0 is intended to be used by healthcare practitioners and their staff. These users
are primarily doctors and nurses who write medication prescriptions.
PARTICIPANTS
A total of two participants were tested on the EHRUT(s). Participants in the test were nurses.
Participants were recruited by InstantDx, LLC and received no compensation. In addition,
participants had no direct connection to the development of or organization producing the
EHRUT(s). Participants were not from the testing or supplier organization. Participants were
given the opportunity to have the same orientation and level of training as the actual end users
would have received.
The following is a table of participants by characteristics, including demographics, professional
experience, computing experience and user needs for assistive technology. Participant names
were replaced with Participant IDs so that an individual’s data cannot be tied back to individual
identities.
Part ID | Gender | Age | Education | Occupation/role | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs
1 | F | n/a | n/a | Nurse | Nurse | Medium | High | None
2 | F | n/a | n/a | Nurse | Nurse | Medium | High | None
Two participants were recruited and two participated in the usability test. Zero participants
failed to show for the study.
Participants were scheduled for one 45-minute session, with 15 minutes between sessions for
the administrator to debrief and to reset systems to proper test conditions.
STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well –
that is, effectively, efficiently, and with satisfaction – and areas where the application failed to
meet the needs of the participants. The data from this test may serve as a baseline for future
tests with an updated version of the same EHR and/or comparison with other EHRs, provided the
same tasks are used. In short, this testing serves both to record or benchmark current usability
and to identify areas where improvements must be made.
During the usability test, participants interacted with one EHR. Each participant used the system
in the same location, and was provided with the same instructions. The system was evaluated
for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for
each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system
TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of
activities a user might do with this EHR, including:
• Search for patient
  o Search for a patient of the user’s choosing.
     Enter mandatory search criteria of partial or full patient last name and first name
     Enter optional date of birth
     Enter optional sex of patient
  o All patients matching the search criteria entered are displayed
     Display includes full Patient Name, Sex, Date of Birth, Address
  o Select a patient to write a prescription or go to the patient’s medical chart
• Patient PBM insurance coverage data and Medication History retrieval
  o Performed automatically by the application
• View allergies
  o View existing recorded allergies for a patient.
• Record Allergies
  o Record a new patient allergy
     Search for a drug allergen of their choosing.
     All drug allergens matching the search are displayed
     Select the drug to record and optionally enter the reaction and severity
     View the newly recorded allergy along with previously recorded allergies
  o Select a patient with no previously recorded allergies
     Record ‘No Known Allergies’.
     View this patient’s allergy record to see ‘No Known Allergies’ recorded
• View Medications
  o View medications which were prescribed by employees of the practice
  o View new medication records which are retrieved automatically from pharmacies and PBMs.
• Record Medication
  o Search for and select a medication, completing the SIG and optionally the dispense amount
  o Select a patient with no medication history
     Record ‘No Known Active Medications’.
     View the medication record to see ‘No Known Active Medications’
  o Reconcile the patient medication history, marking which medications the patient is currently taking (active) and which medications the patient no longer takes (inactive).
• Prescribe Medication
  o Select the medication to prescribe using each of the following methods:
     Search for the drug to prescribe
      • Search for a medication
        o Select from the list of matching medications
        o Enter the dispense amount and SIG to be included on the prescription
     Choose the medication and SIG from the “Prescribers Favorite Scripts”
     Renew an active medication
  o Select the pharmacy which will fill the prescription(s)
• Check drug formulary
  o View the formulary and benefit information which is automatically displayed to the user based on the medication and patient PBM plan.
  o Select an alternate medication for a ‘non-preferred’ medication.
• Check drug/drug and drug/allergy interactions
  o After all medications to be prescribed are selected, click a ‘Verify Script’ button which will automatically display DUR information including drug/drug and drug/allergy interactions.
  o Review any interactions found and change the medication to be prescribed if necessary.
Tasks were selected based on their frequency of use, criticality of function, and those that may
be most troublesome for users.
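The ‘Verify Script’ check described in the last task can be sketched as a cross-check of the drugs being prescribed against the patient’s recorded allergies and an interaction table. This is a minimal illustrative sketch only; the function name, data shapes, and the tiny interaction table are hypothetical, not OnCallData’s actual data model or its First Databank drug database.

```python
# Hypothetical sketch of a drug/drug and drug/allergy interaction check.
# The interaction table below is illustrative sample data, not clinical content.
INTERACTIONS = {
    frozenset(["warfarin", "aspirin"]): "increased bleeding risk",
}

def verify_script(new_drugs, active_meds, allergies):
    """Return DUR-style warnings for the drugs about to be prescribed."""
    warnings = []
    for i, drug in enumerate(new_drugs):
        # Drug/allergy check against the patient's recorded allergens.
        if drug in allergies:
            warnings.append(f"drug/allergy: patient allergic to {drug}")
        # Drug/drug check against active meds and earlier drugs in this script.
        for other in active_meds + new_drugs[:i]:
            note = INTERACTIONS.get(frozenset([drug, other]))
            if note:
                warnings.append(f"drug/drug: {drug} + {other}: {note}")
    return warnings

print(verify_script(["warfarin"], ["aspirin"], []))
```

If any warning is returned, the prescriber reviews it and changes the medication if necessary, matching the review step described above.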
PROCEDURES
The administrator moderated the session including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, and took notes on
participant comments. Participants were instructed to perform the tasks (see specific
instructions below):
• As quickly as possible, making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think-aloud technique.
Task timing began once the administrator finished reading the question. The task time was
stopped once the participant indicated they had successfully completed the task. Scoring is
discussed below in the Data Scoring section.
Participants' demographic information, task success rate, time on task, errors, verbal responses,
and post-test questionnaire were recorded.
TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the
testing was conducted in a healthcare office. Testing was performed on a desktop
computer running a Windows operating system.
The participants used a mouse and keyboard when interacting with the EHRUT.
The application itself ran in a web browser using a test database and a WAN
connection. Technically, the system performance (i.e., response time) was representative of
what actual users would experience in a field implementation. Additionally, participants were
instructed not to change any of the default system settings (such as control of font size).
PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant:
Thank you for participating in this study. Your input is very important. Our session today will last
about 45 minutes. During that time you will use an instance of an electronic health record. I will
ask you to complete a few tasks using this system and answer some questions. You should
complete the tasks as quickly as possible, making as few errors as possible. Please try to
complete the tasks on your own, following the instructions very closely. Please note that we are
not testing you; we are testing the system, so if you have difficulty, all this means is that
something needs to be improved in the system. I will be here in case you need specific help,
but I am not able to instruct you or provide help on how to use the application.
Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be
useful to you, and how we could improve it. I did not have any involvement in its creation, so
please be honest with your opinions. All of the information that you provide will be kept
confidential and your name will not be associated with your comments at any time. Should you
feel it necessary, you may withdraw at any time during the testing.
Following the procedural instructions, participants were shown the EHR and as their first task, were given
time (15 minutes) to explore the system and make comments. Once this task was complete, the
administrator gave the following instructions:
For each task, I will read the description to you and say “Begin.” At that point, please perform
the task and say “Done” once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you
your impressions about the task once you are done.
Participants were given 9 tasks to complete.
USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic
Health Records, EHRs should support a process that provides a high level of usability for all
users. The goal is for users to interact with the system effectively, efficiently, and with an
acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user
satisfaction were captured during the usability testing. The goals of the test were to assess:
• Effectiveness of OnCallData by measuring participant success rates and errors
• Efficiency of OnCallData by measuring the average task time
• Satisfaction with OnCallData by measuring ease of use ratings
DATA SCORING
The following table details how tasks were scored, errors evaluated, and the time data analyzed.
Measures
Effectiveness:
Task Success
Rationale and Scoring
A task was counted as a “Success” if the participant was able to
achieve the correct outcome, without assistance, within the time
allotted on a per task basis.
The total number of successes were calculated for each task and then
divided by the total number of times that task was attempted. The
results are provided as a percentage.
Task times were recorded for successes. Observed task times divided
by the optimal time for each task is a measure of optimal efficiency.
Optimal task performance time, as benchmarked by expert
performance under realistic conditions, is recorded when constructing
tasks. Target task times used for task times in the Moderator’s Guide
must be operationally defined by taking multiple measures of optimal
performance and multiplying by some factor e.g. 1.25, that allows
some time buffer because the participants are presumably not trained
to expert performance. Thus, if expert, optimal performance on a task
was 30 seconds then allotted task time performance was 30 * 1.25
seconds. This ratio should be aggregated across tasks and reported
with mean and variance scores.
Effectiveness:
Task Failures
If the participant abandoned the task, did not reach the correct answer
or performed it incorrectly, or reached the end of the allotted time
before successful completion, the task was counted as an “Failures.”
No task times were taken for errors.
The total number of errors was calculated for each task and then
divided by the total number of times that task was attempted. Not all
deviations would be counted as errors.11 This should also be
expressed as the mean number of failed tasks per participant.
On a qualitative level, an enumeration of errors and error types should
be collected.
Efficiency:
Task Time
Each task was timed from when the administrator said “Begin” until
the participant said, “Done.” If he or she failed to say “Done,” the time
was stopped when the participant stopped performing the task. Only
task times for tasks that were successfully completed were included
in the average task time analysis. Average time per task was
calculated for each task. Variance measures (standard deviation and
standard error) were also calculated.
Satisfaction:
Task Rating
Participant’s subjective impression of the ease of use of the
application was measured by administering both a simple post-task
question as well as a post-session questionnaire. After each task, the
participant was asked to rate “Overall, this task was:” on a scale of 1
(Very Difficult) to 5 (Very Easy). These data are averaged across
participants. 12
Common convention is that average ratings for systems judged easy
to use should be 3.3 or above.
To measure participants’ confidence in and likeability of the [EHRUT]
overall, the testing team administered the System Usability Scale
(SUS) post-test questionnaire. Questions included, “I think I would like
to use this system frequently,” “I thought the system was easy to use,”
and “I would imagine that most people would learn to use this system
very quickly.”
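The scoring rules above reduce to simple arithmetic: success rate is successes over attempts, allotted time is optimal (expert) time times a buffer factor such as 1.25, and task-time efficiency is summarized with mean, standard deviation, and standard error. A minimal sketch, with illustrative function names and sample data that are not taken from the test records:

```python
import statistics

def success_rate(successes, attempts):
    """Effectiveness: successes divided by attempts, as a percentage."""
    return 100.0 * successes / attempts

def allotted_time(optimal_seconds, buffer_factor=1.25):
    """Allotted task time: expert (optimal) time multiplied by a buffer factor."""
    return optimal_seconds * buffer_factor

def task_time_stats(times):
    """Efficiency: mean, standard deviation, and standard error of task times."""
    mean = statistics.mean(times)
    sd = statistics.stdev(times) if len(times) > 1 else 0.0
    se = sd / (len(times) ** 0.5)
    return mean, sd, se

# Hypothetical data for one task attempted by two participants:
print(success_rate(2, 2))             # -> 100.0
print(allotted_time(30))              # 30 s optimal -> 37.5 s allotted
print(task_time_stats([28.0, 32.0]))  # mean 30.0 s, with sd and se
```

The same pattern applies to the post-task ratings: average the 1–5 scores across participants and compare the mean against the 3.3 ease-of-use convention noted above.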
RESULTS
DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the
Usability Metrics section above. Participants who failed to follow session and task instructions
had their data excluded from the analyses.
The usability testing results for the EHRUT are detailed below. The results should be seen in light
of the objectives and goals outlined in the Study Design section. The data should yield actionable
results that, if addressed, will have a material, positive impact on user performance.
Task | Participant Verbalization | Participant Satisfaction | Errors | Time for Completion
Search Patient | None | High | 0 | 30 Seconds
Patient Drug Insurance Eligibility Check and Med History Retrieval | None | High | 0 | Automatic
View Allergies | None | High | 0 | 30 Seconds
View Medication | None | High | 0 | Under 1 minute
Record Medication | None | High | 0 | Under 1 minute
Record Allergy | None | High | 0 | Under 1 minute
Prescribe Medication | None | High | 0 | Under 2 minutes
Check Drug Formulary | None | High | 0 | Under 2 minutes
Check Drug/Drug and Drug/Allergy Interactions | None | High | 0 | Under 1 minute
Search Patient
• Enter search criteria for locating a specific patient belonging to the practice. At minimum, the
first three letters of the patient’s last name are entered.
• The user selects the patient whose records are to be accessed from the list of patients retrieved
based on the search criteria entered.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 1 minute.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
Retrieve Patient Drug Insurance Eligibility and Medication History
• This task is performed automatically by the application.
View Medications
• Click the Medication List tab to automatically display the patient’s active and inactive
medications.
• In the Manage Medication section, the medication name and strength display, as well as the
initial medication record date, latest medication record date, number of times prescribed, and
the prescribing doctor (if available).
• The medications can be sorted by medication name, initial medication record date, latest
medication record date, number of times prescribed, and prescribing doctor. The user changes
the sort order of the medication list by clicking the appropriate column header.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 1 minute.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
View Allergies
• The user clicks the Medication List tab and the patient’s allergies automatically display in the
Allergies section of the page.
• The allergen name, reaction, date recorded, recording user, and severity level are displayed.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 1 minute.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
Record Medication
• Enter the medication to add and click the ‘Add Drug to Med List’ button.
• Select the appropriate medication from the list of matching medications.
• Enter the dose instructions for the medication and optionally the amount dispensed.
• Click the ‘next’ button to add the medication to the patient’s medication history.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
No individual component took over 1 minute.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
User expressed desire to make entry of SIG optional
Record Allergy
• Click the Manage Allergy button to search for the allergen to record.
• Select the allergen from the matching records.
• Optionally enter an allergic reaction the patient has to the allergen.
• Optionally enter the severity of the allergen.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 1 minute.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
Prescribe Medication
• Click the Script tab to display the script writing page. The user can search for a medication to
prescribe, select from the Prescribers Favorite Scripts, or renew from the patient’s active
medications.
o Searching for a medication
 Enter the medication to prescribe and click the Search Therapy button.
 Select the medication from the list of medications which match what the user entered.
 Enter the amount to dispense and dosing instructions.
o Selecting a Prescriber Favorite Script
 From a dropdown list, select the medication to prescribe.
o Renewing from the patient’s active medication list
 Click the Renew Meds button and select the medication(s) to renew.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 3 minutes.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
Check Drug Formulary
• Select a medication to prescribe. The drug formulary status for that drug is automatically
displayed based on the patient’s drug benefit plan. The patient’s plan information is
automatically retrieved when selecting a patient to prescribe for.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 1 minute.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
Check Drug/Drug and Drug/Allergy Interactions
• After all medications to prescribe and the pharmacy are selected, click the ‘Verify’ button. All
DUR checking, including Drug/Drug and Drug/Allergy interactions, is performed. Any
interactions found are displayed automatically to the user.
• Prescribed drugs may be changed if interactions are found.
EFFECTIVENESS
Users were able to successfully complete the task with no errors.
EFFICIENCY
Task took less than 2 minutes.
SATISFACTION
Users expressed a high level of satisfaction.
AREAS FOR IMPROVEMENT
None expressed by users
InstantDx User-Centered Design Process
InstantDx has applied the same User-Centered Design (UCD) process to each of the following
criteria. The InstantDx UCD process is modeled after ISO 13407.
• §170.314(a)(2) Drug-drug, drug-allergy interaction checks
• §170.314(a)(6) Medication list
• §170.314(a)(7) Medication allergy list
• §170.314(b)(3) Electronic prescribing
InstantDx UCD Process
The InstantDx UCD process for all of the above modules includes the following steps:
1. Gather the base functional specifications based on certification requirements for
Surescripts certification.
2. Conduct user interviews to understand how each of the modules would be utilized by
practitioners and their staff.
3. Conduct user interviews to understand user workflows and workflow bottlenecks during
patient encounters.
4. Modify and add to the functional specifications based on user interviews.
5. Design a user interface prototype based on the user interviews: how the modules will be
used, and where each module fits in the user workflow.
6. Develop/modify the prototype for each module to demonstrate to users.
7. Obtain user feedback to determine whether each module prototype meets user requirements,
Surescripts certification, and Surescripts Whitecoat standards.
8. Repeat steps 6 and 7 as necessary.
9. After obtaining user approval, develop the modules and integrate them into the OnCallData
application.
10. Deploy the application to the production servers.
11. Modify the listed modules in subsequent product releases based on user feedback. Repeat
this step throughout the product lifecycle.
12. Modify the listed modules in subsequent product releases based on ONC certification
requirements for Meaningful Use Stage I and Stage II, retaining the user interface for
each module as much as possible.
Appendix 1: PARTICIPANT DEMOGRAPHICS
Following is a high-level overview of the participants in this study.
Gender
Men: 0
Women: 2
Total (participants): 2

Occupation/Role
RN/BSN: 2
Physician: 0
Admin Staff: 0
Total (participants): 2
Appendix B: Quality Management System
Appendix C: Privacy and Security
Test Results Summary Document History

Version | Date | Description of Change
V1.0 | December 24, 2014 | Initial release
V1.1 | February 28, 2016 | Updated Safety‐Enhanced Design report
END OF DOCUMENT