ONC HIT Certification Program Test Results

Transcription

ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Version EHR-Test-144 Rev 17-Feb-2014
Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: Falcon
Product Version: 2.35.0
Domain: Ambulatory
Test Type: Modular EHR

1.2 Developer/Vendor Information
Developer/Vendor Name: Falcon, LLC.
Address: 1551 Wewatta St., Denver, CO 80202
Website: www.falconehr.com
Email: EHRsupport@davita.com
Phone: 303.681.7226
Developer/Vendor Contact: Laura Whalen
Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: ehr@drummondgroup.com
Phone: 817-294-7339
ONC-ACB Contact: Bill Smith
This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

Bill Smith, ONC-ACB Authorized Representative
Function/Title: Certification Committee Chair
Signature and Date: 6/4/14

2.2 Gap Certification
The following identifies the criterion or criteria certified via gap certification (§170.314; certified criteria marked [x]):

[x] (a)(1)    [ ] (a)(17)    [x] (d)(5)
[x] (a)(6)    [ ] (b)(5)*    [x] (d)(6)
[x] (a)(7)    [ ] (d)(1)     [x] (d)(8)
                             [x] (d)(9)
                             [x] (f)(1)

*Gap certification allowed for Inpatient setting only

[ ] No gap certification
2.3 Inherited Certification

The following identifies criterion or criteria certified via inherited certification (§170.314). The form lists criteria (a)(1)-(a)(17), (b)(1)-(b)(7), (c)(1)-(c)(3), (d)(1)-(d)(9), (e)(1)-(e)(3), (f)(1)-(f)(6), and (g)(1)-(g)(4), with the inpatient-only, ambulatory-only, and optional designations noted on the form; none is marked.

[x] No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information

Report Number: KAM-060314-1867
Test Date(s): 06/06/13, 03/06/14, 06/03/14

3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code 200979-0
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: ehr@drummondgroup.com
Phone: 512-335-5606
ATL Contact: Beth Morrow
For more information on scope of accreditation, please reference NVLAP Lab Code 200979-0.
Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Kyle Meadors, ATL Authorized Representative
Function/Title: Test Proctor
Signature and Date: 6/4/14
Location Where Test Conducted: Nashville, TN
3.2 Test Information
3.2.1 Additional Software Relied Upon for Certification

Additional Software | Applicable Criteria | Functionality Provided by Additional Software
Surescripts Network for Clinical Interoperability | b.1, b.2, e.1 | Direct HISP
Health Companion | e.1 | Portal

[ ] No additional software required
3.2.2 Test Tools

Test Tool | Version (tools used marked [x])
[x] Cypress | 2.4.1
[x] ePrescribing Validation Tool | 1.0.3
[ ] HL7 CDA Cancer Registry Reporting Validation Tool | 1.0.3
[ ] HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool | 1.7
[x] HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | 1.7.1
[x] HL7 v2 Laboratory Results Interface (LRI) Validation Tool | 1.7
[ ] HL7 v2 Syndromic Surveillance Reporting Validation Tool | 1.7
[x] Transport Testing Tool | 178
[x] Direct Certificate Discovery Tool | 2.1

[ ] No test tools required
3.2.3 Test Data

[ ] Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
[ ] No alteration (customization) to the test data was necessary
3.2.4 Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) that has been successfully tested where more than one standard is permitted (tested standards marked [x]):

Criterion # | Standard Successfully Tested

(a)(8)(ii)(A)(2)
  [ ] §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
  [ ] §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
  [ ] §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

(a)(15)(i)
  [ ] §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
  [x] §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)
  [ ] §170.210(g) Network Time Protocol Version 3 (RFC 1305)
  [ ] §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)
  [ ] §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)
  [ ] §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
  [x] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i)
  Annex A of the FIPS Publication 140-2 [list encryption and hashing algorithms]: AES, SHA-1

(e)(1)(ii)(A)(2)
  [ ] §170.210(g) Network Time Protocol Version 3 (RFC 1305)
  [x] §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)
  Annex A of the FIPS Publication 140-2 [list encryption and hashing algorithms]: AES, SHA-1

Common MU Data Set (15)
  [ ] §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
  [x] §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

[ ] None of the criteria and corresponding standards listed above are applicable
3.2.4.2 Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested:

Newer Version | Applicable Criteria

[ ] No newer version of a minimum standard was tested
3.2.5 Optional Functionality

Criterion # | Optional Functionality Successfully Tested
[x] (a)(4)(iii) | Plot and display growth charts
[ ] (b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
[ ] (b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
[ ] (b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
[ ] (b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
[ ] (f)(3) | Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
[ ] Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
[ ] Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)

[ ] No optional functionality tested
3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # | TP** | TD*** (successfully tested criteria marked [x])

[ ] (a)(1) | 1.2 | 1.2
[x] (a)(2) | 1.2 |
[x] (a)(3) | 1.2 | 1.4
[x] (a)(4) | 1.4 | 1.3
[x] (a)(5) | 1.4 | 1.3
[ ] (a)(6) | 1.3 | 1.4
[ ] (a)(7) | 1.3 | 1.3
[x] (a)(8) | 1.2 |
[x] (a)(9) | 1.3 | 1.3
[x] (a)(10) | 1.2 | 1.4
[x] (a)(11) | 1.3 |
[x] (a)(12) | 1.3 |
[x] (a)(13) | 1.2 |
[x] (a)(14) | 1.2 |
[x] (a)(15) | 1.5 |
[ ] (a)(16) Inpt. only | 1.3 |
[ ] (a)(17) Inpt. only | 1.2 |
[x] (b)(1) | 1.6 | 1.3
[x] (b)(2) | 1.4 | 1.5
[x] (b)(3) | 1.4 | 1.2
[x] (b)(4) | 1.3 | 1.4
[x] (b)(5) | 1.4 | 1.7
[ ] (b)(6) Inpt. only | 1.3 | 1.7
[x] (b)(7) | 1.4 | 1.5
[x] (c)(1) | 1.6 | 1.6
[x] (c)(2) | 1.6 | 1.6
[x] (c)(3) | 1.6 | 1.5
[x] (d)(1) | 1.2 | 1.6
[x] (d)(2) | 1.4 |
[x] (d)(3) | 1.3 |
[x] (d)(4) | 1.2 |
[ ] (d)(5) | 1.2 |
[ ] (d)(6) | 1.2 |
[x] (d)(7) | 1.2 |
[ ] (d)(8) | 1.2 |
[ ] (d)(9) Optional | 1.2 |
[x] (e)(1) | 1.7 | 1.4
[x] (e)(2) Amb. only | 1.2 | 1.5
[x] (e)(3) Amb. only | 1.3 |
[ ] (f)(1) | 1.2 | 1.2
[ ] (f)(2) | 1.3 | 1.7.1
[ ] (f)(3) | 1.3 | 1.7
[ ] (f)(4) Inpt. only | 1.3 | 1.7
[ ] (f)(5) Optional & Amb. only | 1.2 | 1.2
[ ] (f)(6) Optional & Amb. only | 1.3 | 1.0.3
[ ] (g)(1) | 1.6 | 1.8
[x] (g)(2) | 1.6 | 1.8
[x] (g)(3) | 1.3 |
[x] (g)(4) | 1.2 |

[ ] No criteria tested

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested:
[x] Ambulatory
[ ] Inpatient
[ ] No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs marked as successfully tested (CMS ID, with version where given): 2; 22 (v2); 50; 68 (v3); 69 (v2); 82; 138; 156 (v2); 163; 165 (v2); 166 (v3)

The remaining ambulatory CQMs listed on the form (52, 56, 61, 62, 64, 65, 66, 74, 75, 77, 90, 117, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 157, 158, 159, 160, 161, 164, 167, 169, 177, 179, 182) are not marked.

Inpatient CQMs listed on the form (9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190): none marked.
3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested: none of the listed criteria ((a)(1), (a)(3)-(a)(7), (a)(9), (a)(11)-(a)(17), (b)(2)-(b)(6), (e)(1)-(e)(3)) is marked.

[x] Automated Numerator Recording was not tested
3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested:

[x] (a)(1)   [x] (a)(9)    [ ] (a)(16)   [ ] (b)(6)
[x] (a)(3)   [x] (a)(11)   [ ] (a)(17)   [x] (e)(1)
[x] (a)(4)   [x] (a)(12)   [x] (b)(2)    [x] (e)(2)
[x] (a)(5)   [x] (a)(13)   [x] (b)(3)    [x] (e)(3)
[x] (a)(6)   [x] (a)(14)   [x] (b)(4)
[x] (a)(7)   [x] (a)(15)   [x] (b)(5)

[ ] Automated Measure Calculation was not tested
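A percentage-based meaningful use measure reduces to a numerator count over a denominator count. The short Python sketch below illustrates the arithmetic only; the function and the CPOE-style example data are hypothetical and are not Falcon's implementation.

# Hedged sketch of an automated measure calculation: the percentage of
# denominator events that also qualify for the numerator.
def measure_pct(denominator_ids, numerator_ids):
    """Return (numerator, denominator, percentage) for one MU measure."""
    denom = set(denominator_ids)
    num = denom & set(numerator_ids)  # numerator is constrained to the denominator
    pct = 100.0 * len(num) / len(denom) if denom else 0.0
    return len(num), len(denom), pct

# Example: a CPOE-style measure over medication orders (illustrative data).
all_med_orders = ["rx1", "rx2", "rx3", "rx4"]        # denominator
cpoe_med_orders = ["rx1", "rx3", "rx4"]              # entered via CPOE (numerator)
print(measure_pct(all_med_orders, cpoe_med_orders))  # -> (3, 4, 75.0)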
3.2.9 Attestation

Attestation Forms (as applicable) | Appendix
[x] Safety-Enhanced Design* | A
[x] Quality Management System** | B
[x] Privacy and Security | C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product
3.3 Appendices
Attached below.
Test Results Summary Document History

Version | Description of Change | Date
17-Feb-2014 | Edited: section header page 3; contact info page 4 | 17-Feb-2014
10-Feb-2014 | Modified layout | 10-Feb-2014
20-Nov-2013 | Updated test tool sections | 20-Nov-2013
25-Oct-2013 | Corrected numbering of 3.2.8 section | 25-Oct-2013
15-Oct-2013 | Modified layout slightly | 15-Oct-2013
01-Oct-2013 | Initial Version | 01-Oct-2013
UX Design & Falcon EHR 7.1.2013

User experience design (UXD) is an approach to the design of products and services that looks beyond the artifacts themselves to the experience they create for the people who use them. UXD incorporates an understanding of human psychology and human behavior into the products we create.
Hypothesis

We start with an idea about what we are trying to accomplish in the context of a project. This is a high-level supposition: "We think that these users want to do something so that…" It could be a problem to be solved, support for a job to be done, and so on. The ideas come from many different sources, including customer feedback, sales, marketing, product, development, client services, and regulatory.
Concept

We conduct user research (interviews, surveys, focus groups) to better understand the problem to be solved or the job to be done. We define and describe the people we are solving the problem for, and why it's a problem in the first place. We define the context of use, which is an important consideration when designing a solution, and how people are currently getting things done without the solution (if they are). We shadow users to understand how they are using the system and what unarticulated issues they may be running into. We include ideas about solutions we think may solve the problem. We frame the usability metrics, if applicable. For example, in one project, the goal was to be able to create a patient note in a single click.

We may include information on what competitors are doing or how unrelated, external software is solving a similar problem in another field. We may benchmark usability metrics of the current software (such as how long it takes users to accomplish certain tasks) in order to demonstrate that our eventual solution improves the efficiency, effectiveness, and satisfaction of those tasks.
Design

The designer assigned to the project wireframes potential solutions based on the concept. We ideate multiple, different solutions, test them with users, and validate their feasibility internally. Based on that feedback, we merge these solutions into a single design, which is built as a high-fidelity, functional prototype. The prototype is then validated internally: reviews happen with the development team to ensure that the proposed solution is possible, and development also provides design input when they have ideas about how certain things can or should work.

We then set up a series of usability studies to test the prototype with customers. In the studies, we ask the users to perform a series of tasks that we know they are trying to accomplish with the software. We observe how they use the prototype, where they succeed, and, more importantly, where they struggle. We alter the design based on our observations during the studies. This cycle of iteration can happen once or many times depending on the problem we are trying to solve and the complexity associated with it.
[Figures: Example Design 1; Example Design 2; Example Design 3; Merged Design, based on usability studies]
The output of these activities is lean documentation of the solution in a User Story.
Build
We work with our developers on a daily basis to build out the solution in the real application.
We provide daily design direction and work through any issues or complications that arise as
the design becomes reality. We create an implementation plan and communicate the product
changes internally and externally.
We collect user feedback one final time once the design has been implemented in the real application, before it is released to the entire user base. We may make small adjustments based on this feedback, if applicable. We then release the finished solution to our customers.
We may conduct additional usability studies with the implemented design a few months after it
has been released. The purpose of these studies is to collect usability metrics such as task time
and compare them to our previous benchmark, so that we understand if we are making the
software more efficient to use.
EHR Usability Test Report of Falcon version 2.35.0
Falcon version 2.35.0
Date of usability test:
Specific dates for each section provided within.
Date of report:
May 18, 2014
Report prepared by:
Falcon Product Team
Table of Contents

Executive Summary
Introduction
Method
Tasks
Procedure
Test Location & Environment
Test Forms and Tools
Participant Instructions
Usability Metrics
DATA SCORING
Specific Test Process/Results by Section
170.314.a.1 Computerized provider order entry
170.314.a.2 Drug-drug, drug-allergy interaction checks
170.314.a.6 Medication list
170.314.a.7 Medication allergy list
170.314.a.8 Clinical decision support
170.314.b.3 Electronic prescribing
170.314.b.4 Clinical information reconciliation
Executive Summary

Usability tests of Falcon version 2.35.0 were conducted on a variety of dates by the Falcon product team. The purpose of these tests was to test and validate the usability of the current user interface, and to provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, healthcare providers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative, tasks. The specific participation numbers are provided in the subsections below.

This study collected performance data on a number of relevant tasks typically conducted on an EHR. The process, content, and results for each functional area are outlined within each subsection.

Participants all had prior experience with the EHR. In all cases, the administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and recorded user performance data. The administrator did not initially give the participant assistance in how to complete the task.
Introduction
The EHRUT tested for this study was Falcon version 2.35.0. Designed to present medical information to healthcare providers in their offices and in dialysis centers, the EHRUT consists of a web-based platform and a mobile platform. The usability testing attempted to represent realistic exercises and conditions. The purpose of these studies was to test and validate the usability of the current user interface, and to provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as time to complete a task and errors made while completing it, were captured during the usability testing.
Method
Current customers were contacted to participate in the studies. In some cases, the customers had
volunteered in advance for potential participation. Participants were chosen to represent a variety of
key characteristics (practice size, EHR experience) that would be important in extending findings to the
user base.
Study Design
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made.
Each participant used the system in the same location, and was provided with the same instructions.
The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures
collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Participant's verbalizations (comments)
• Single Ease Question (SEQ), which we use to measure satisfaction
Tasks
A number of tasks were constructed that would be realistic and representative of the kinds of activities
a user might do with this EHR; tasks specific to each section are outlined within the content below.
The selection and definition of tasks were done in collaboration with the users. We prioritized tasks based on prominence in workflow and the risk associated with possible user errors. Further, we prioritized the steps in the tasks (the user's path) based on the most error-prone path.
If errors exist within any of the critical modules, we will assess risk against the prioritized list below.
Falcon Prioritization of Modules

Priority | Module Name
1 | CPOE
2 | Medication List
3 | Electronic Prescribing
4 | Medication Allergy List
5 | Drug-to-Drug/Drug Allergy Test
6 | Clinical Information Reconciliation
7 | Clinical Decision Support
Procedure
At the scheduled time, the facilitator dialed into the conference line and WebEx meeting. The participant did the same. The facilitator thanked the participant for their participation and described the tasks that they would like the participant to complete for each section. The facilitator explained the purpose of the usability test and asked the participant to think out loud, as this gives the facilitator and his/her team better insight into how the participant understands the user interface. The facilitator explained that he/she may not answer the participant's questions during the test, because one of the goals of the session is to ensure that the participant can use the software when someone is not there to help.
The facilitator moderated the session, including administering instructions and tasks. The facilitator also monitored task times and took notes on participant comments and performance/behavior.

Task timing began once the administrator finished reading the question and gave the participant control of the system via WebEx. The task time was stopped once the participant had successfully completed the task, as determined by predefined success criteria. Scoring is discussed below.

Following the session, the facilitator thanked each individual for their participation.
Each participant's task success rate, time on task, errors, SEQ rating, and comments were recorded into a spreadsheet.
Test Location & Environment
All testing was conducted virtually, via phone and WebEx meeting.
The participants used a mouse and keyboard when interacting with the EHRUT.
Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation.

Test Forms and Tools

During the usability test, various documents and instruments were used, including predefined task scenarios created to be representative of tasks in the system. A timing application on an iPhone/iPad was used to record time on task.
The participant’s interaction with the EHRUT was captured and recorded digitally with WebEx tools.
Participant Instructions
The administrator reads the following instructions aloud to each participant:

• Brief explanation of the purpose of the test – find usability problems and assess satisfaction with the design.
• We are going to ask you to run through a couple of scenarios. I'll read each scenario to you and then ask you to begin. After the scenario is completed, we will ask you to rate it based on how easy or difficult you felt it was, from 1 to 7.
• It's important to note that we are testing the software, not you! Any problems or difficulties you encounter are due to poor design and not anything that you are doing wrong. In fact, this is probably the only time today where you don't have to worry about making any mistakes.
• You are welcome to ask questions since we want to know what you find unclear in the interface, but we will not answer most questions during the test itself, since the goal of the test is to see whether the system can be used without outside help.
• Feel free to think out loud – this helps us understand how you view the system and makes it easier to identify usability errors.
• Please feel free to be honest; you won't hurt our feelings, and the purpose of this session is to ensure the design works when actually used.
• Any questions before we begin?
Usability Metrics
The goals of the test were to assess:
1. Effectiveness of Falcon by measuring task-scenario completion percentage and recording number of
errors
2. Efficiency of Falcon by measuring the average task time
3. Satisfaction with Falcon by measuring ease of use ratings, using the Single Ease Question (SEQ)
DATA SCORING
Effectiveness: Tasks were counted as being completed if the user could meet the task success criteria
Task Success
A task was counted as a “Success” if the participant was able to achieve the correct
outcome, without assistance, within the time allotted on a per task basis.
The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.
Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it
incorrectly, or reached the end of the allotted time before successful completion, the
task was counted as a “Failure.” No task times were counted for failures.
The total number of errors was calculated for each task.
Task Time
Each task was timed from when the facilitator passed control of the screen through
WebEx until the participant had achieved the predefined completion criteria. Only task
times for tasks that were successfully completed were included in the average task
time analysis. Average time per task was calculated for each task. Variance measures
(standard deviation and standard error) were also calculated.
Satisfaction:
After each task scenario, the participant was asked, on a scale of 1 to 7, how difficult or easy the task was, with 1 being very difficult and 7 being very easy.
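The per-task scores defined above are simple aggregates. The Python sketch below shows one way they can be computed from raw observation rows; it is illustrative only, and the helper, field layout, and sample data are hypothetical rather than the study's actual tooling.

# Illustrative scoring of usability observations: completion percentage,
# average time for successful attempts only, variance measures, errors, SEQ.
from statistics import mean, stdev

# Each attempt: (task_id, success, time_seconds_or_None, errors, seq_rating)
attempts = [
    ("order-labs", True, 141.0, 1, 4),
    ("order-labs", False, None, 3, 2),   # failure: time not counted
    ("order-labs", True, 177.5, 2, 5),
]

def score(task_id, rows):
    rows = [r for r in rows if r[0] == task_id]
    times = [r[2] for r in rows if r[1]]          # successful attempts only
    n = len(times)
    return {
        "completion_pct": 100.0 * n / len(rows),
        "avg_time_s": mean(times) if times else None,
        "sd": stdev(times) if n > 1 else 0.0,               # standard deviation
        "se": stdev(times) / n ** 0.5 if n > 1 else 0.0,    # standard error
        "avg_errors": mean(r[3] for r in rows),
        "avg_seq": mean(r[4] for r in rows),  # 1 = very difficult, 7 = very easy
    }

print(score("order-labs", attempts))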
Specific Test Process/Results by Section
170.314.a.1 Computerized provider order entry
Participants
• Physician 1
• Physician 2
• Physician 3
• Physician 4
• Physician 5

Date and location
• June 24 – 28, remote conference call and WebEx

Tasks
• You need to order lab tests for a patient. Order a lipid panel and a hemoglobin and hematocrit test. Save & Print the order.
Metrics
• Avg. Task Time: 159.24 seconds
• Avg. Task Completion: 50%
• Avg. # of Errors: 1.83
• SEQ: 3.7
Results
• This task tested poorly, with only half of the users completing it. The average task time was excessive considering the simple nature of the task. We found many areas of improvement.

Findings
• Please see areas of improvement.

Areas of improvement
• When creating the order, the system defaults to a Diagnostic Order, and the user has to change the order type to a lab order. Several users became confused at this stage and tried searching for lab tests to order in the diagnostic order screen, which accounts for the 50% task completion rate.
• Several lab tests that physicians routinely order are missing from the lab test list, leading to inefficient work-arounds.
• The lack of an eSig function leads to inefficient practice workflow, because the physician must print and sign the order as opposed to just printing it.
• Some of the lab tests have esoteric names, which led to difficulty in finding them.
170.314.a.2 Drug-drug, drug-allergy interaction checks
Participants
• Physician 1
• Physician 2
• Physician 3
• Physician 4
• Physician 5

Date and location
• 3.11.2013 – 3.15.2013, remote conference call and WebEx

Tasks
• We observed providers as they prescribed medications that had various drug-drug and drug-allergy interactions and solicited their feedback on the interactions they were presented.
• We asked users to adjust the severity of the drug-drug interactions that appeared.
Metrics
• Avg. Task Time: 72.4 seconds
• Avg. Completion Rate: 40%
• Avg. # of Errors: 0.6
• SEQ: 2.4
Results
• The most common complaint was that the interactions were too frequent and/or too slow to appear when there was an interaction. The timing of displaying the interactions, however, was very natural for the providers: as soon as they select the drug to be prescribed, we show the interaction, and then allow them to override it or select a different drug.
• Many users could not find the settings screen to adjust the severity of the drug-drug interactions. If they did find the screen, they were easily able to adjust the interaction levels accordingly.

Findings
• Please see areas of improvement.

Areas of improvement
• The readability of the text for each interaction could be greatly improved.
• Many providers noted that the alerts are too frequent, even when turned down to the lowest setting.
170.314.a.6 Medication list
Participants
• Physician 1
• Physician 2
• Physician 3
• Physician 4
• Physician 5

Date and location
• 3.11.2013 – 3.15.2013, remote conference call and WebEx

Tasks
• We focused this test on reviewing the medication list while charting a note on a patient. The goal was to see if the providers had all the information they needed and if common tasks were accessible to them.
Metrics
• Avg. Task Time: 37.8 seconds
• Avg. Completion Rate: 100%
• Avg. # of Errors: 0
• SEQ: 5.6
Results
• Providers liked having the ability to review medications while they chart their note on the patient. It makes it easy to make changes within their workflow, as opposed to making them travel to a different module to make changes. However, there were several aspects of the list that they found lacking, as discussed in the Areas of Improvement below.

Findings
• Please see Areas of Improvement.

Areas of Improvement
• The "SIG" was placed to the right of the medication name, which reduced the overall readability of the medication.
• While providers could "edit" medications, we lacked specific actions for common tasks such as refilling medications or dose changes.
• Many providers requested that we show the allergies along with the medications on the medication list.
• We lacked the ability to print the medication list from the medication list module, which was a sore spot among many providers.
• Under the "Inactive" medications tab, we showed the medication start date as opposed to the medication stop date.
170.314.a.7 Medication allergy list
Participants
• Physician 1
• Physician 2
• Physician 3
• Physician 4
• Physician 5

Date and location
• 3.18.2013 – 3.22.2013, remote conference call and WebEx

Tasks
• We focused this test on reviewing the medication allergy list while charting a note on a patient. The goal was to see if the providers had all the information they needed and if common tasks related to medication allergies were accessible to them.
Metrics
• Avg. Task Time: 15.8 seconds
• Avg. Completion Rate: 100%
• Avg. # of Errors: 0
• SEQ: 6
Results
• The allergy list module tested fairly well and is fairly straightforward. Providers liked the fact that we displayed the allergy, the type (e.g., medication, order, etc.), the severity, and the reaction. There were some deficiencies, as noted in the areas of improvement section.

Findings
• Please see areas of improvement.

Areas of improvement
• The check-box to signify "No Known Drug Allergies" was not present in the allergy list module.
• When adding a medication allergy, we defaulted the selection to "Routed Drug"; however, providers usually found it easier to search by "Drug Name", e.g., Lipitor as opposed to Lipitor 40 MG.
170.314.a.8 Clinical decision support
Participants
• Physician 1
• Physician 2
• Physician 3
• Physician 4
• Physician 5

Date and location
• 3.18.2013 – 3.22.2013, remote conference call and WebEx

Tasks
• We asked providers to create CDS interventions using our customized alert module. We asked them to review interventions and discussed what they consider valuable.
Metrics
• Avg. Task Time: 214.2 seconds
• Avg. Completion Rate: 100%
• Avg. # of Errors: 0
• SEQ: 4.2
Results
• Providers found this feature very helpful. They enjoyed the customizable nature of the alerts, and specifically liked that the alerts could be turned on for only one provider at the practice instead of the entire practice.

Findings
• We did extensive testing on this feature prior to release. Our most pertinent finding was that, in addition to user-defined alerts, what really helps providers is being able to relate certain medications and lab results to problems, so that when they are charting on a problem they can review any medications or the latest lab results associated with it.

Areas of improvement
• N/A
170.314.b.3 Electronic prescribing
Participants
• Physician 1
• Physician 2
• Physician 3
• Physician 4
• Physician 5

Date and location
• June 17 – June 21, 2013, remote conference call and WebEx

Tasks
• Your patient needs a new prescription for Lisinopril. ePrescribe one; feel free to use your local pharmacy.
• Start: eCharting > Encounters
• End: User has ePrescribed the lisinopril Rx
Metrics
• Avg. Task Time: 59.15 seconds
• Avg. Completion Rate: 100%
• Avg. # of Errors: 0
• SEQ: 5.0
Results
• While providers did not make any errors and were able to complete the task, we noticed many areas of improvement, as noted below.

Findings
• Please see areas of improvement.
Areas of improvement
• Pharmacy Search – when searching by zip code, the search returns pharmacies in that zip code, but it also returns pharmacies with the search string in their phone number, which pollutes the list and makes finding a local pharmacy much more difficult (a sketch of a stricter zip-code filter follows this list).
• Saving a default pharmacy – the function to save a default pharmacy to a patient's chart is on another screen, so providers found it very difficult to update the default pharmacy when the patient changed pharmacies.
• Auto focus – when initiating the eRx, the cursor did not auto-focus in the search box, which adds an unnecessary click.
• Quantity – the quantity of pills did not auto-calculate based on the instructions and the number of refills.
• Allergies – while ePrescribing, the allergies were not viewable, which sometimes forced the provider to stop the eRx workflow and go back to the chart to review the patient's allergies.
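A hedged Python sketch of the zip-code fix mentioned in the first bullet: treat a 5-digit query as a zip code and match it against the zip field only, so phone-number substrings can no longer pollute the results. The data model and function are hypothetical, not Falcon's code.

# Restrict 5-digit queries to the zip code field; fall back to name search.
import re

pharmacies = [
    {"name": "Main St Pharmacy", "zip": "80202", "phone": "303-802-0213"},
    {"name": "Downtown Drugs",   "zip": "80205", "phone": "303-555-8020"},
]

def search_pharmacies(query, records):
    if re.fullmatch(r"\d{5}", query):            # looks like a zip code
        return [p for p in records if p["zip"] == query]
    q = query.lower()                             # otherwise, match on name only
    return [p for p in records if q in p["name"].lower()]

print(search_pharmacies("80202", pharmacies))     # zip match only, never phone digits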
170.314.b.4 Clinical information reconciliation
Participants
• Nurse 1
• Nurse 2
• Physician 1
• Physician 2
• Physician 3
• Physician 4

Date and location
• 9.18.2013 – 10.10.2013, remote conference call and WebEx

Tasks
The goal of this workflow is to facilitate the clinical information reconciliation of structured medications, allergies, and problems into a patient's electronic medical record. This helps providers achieve an accurate list of these data elements.
• Task 1 – Review and add a medication from a transition of care summary to the patient's chart (exact match)
• Task 2 – Review and add a medication from a transition of care summary to the patient's chart (one-to-many)
• Task 3 – Review and add an allergy from a transition of care summary to the patient's chart
• Task 4 – Remove an allergy from the patient's chart
• Task 5 – Review and add a problem from a transition of care summary to the patient's chart
Metrics
• Avg. Task Time: 158.4 seconds
• Avg. Completion Rate: 100%
• Avg. # of Errors: 0.4
• SEQ: 6.2

Results
Users were able to complete each task. It took users less than 5 seconds to complete tasks 1, 3, and 4. Tasks 2 and 5 took significantly longer to complete – up to 3 minutes. At times, the workflow was abandoned after starting tasks 2 and 5 because the provider did not know what to select from the list of many offerings or did not have adequate knowledge to select the correct value.
• Overall, the workflows were well received and intuitive.
• Displaying medications, allergies, and problems alphabetically helped users easily locate the data.
• Providers liked the fact that one or many providers could participate in this workflow at different times.
• Providers had a difficult time understanding what was added from the transition of care summary to the patient's chart. Displaying the "source" was simply not enough.
• Providers were not aware that hovering over the patient's name would progressively disclose patient demographics.
Findings
• Providers receive anywhere from 1 to 5 referrals per week.
• 60% of referrals come from a doctor's office; 40% are from the hospital.
• Today, transition of care documents are scanned into the patient's chart. Medications, allergies, and problems may also be manually entered as structured data into the chart prior to a patient's appointment by the medical assistant or nurse.
• Different members of the care team will review and reconcile the different lists. Medical Assistants and Nurses will most likely review and reconcile the medication and allergy lists, whereas physicians will review and reconcile the problem list.
• Several providers were disappointed that the discharge summary (inpatient) and the assessment/plan narrative were not required information in the transition of care CCDA. These "tell the story" of why the patient was hospitalized or why he or she was visiting a doctor.
• There is some value in knowing the source of the data after adding the information to the chart.
Areas of improvement
• To reduce the cognitive load required to differentiate data added from the transition of care summary to the patient's chart, we added a visual indicator next to the row.
• An "i" information icon was added after the patient name to provide an affordance that patient demographic information could be found by hovering over the icon or name.
• Drug-to-drug/allergy/problem checks would be helpful in this workflow. We intend to add this post-certification.
• Not all information is parsed from the CCDA; therefore, we intend to provide a link to open/view the transition of care summary within this workflow post-certification.
Falcon EHR - Quality Management System

Overview:
We use the same homegrown QMS consistently across all modules of Falcon EHR. Below are the processes we use in the development, testing, implementation, and maintenance of each certification criterion.
Development
Systems Requirements and Design
• Template and process for defining system requirements and design
• Review and hand-off process to development and quality assurance team
• Prototyping and physician advisory board review
• Outlines sign-off requirements by key stakeholders
Development
• Development coding standards and best practices – “Falcon EHR Development
Commandments”
• Unit Testing and Code Review templates
• Tool standards for IDE and debugging
Configuration Management
• Source control
• Separation of duties for release management
• Environment management
• Pre-production testing requirements
Testing (Quality Assurance)
• Test case templates
• Defect management tool and process
• Prioritization definitions
• Defined Entry and Exit criteria
• User Acceptance Testing with end users
Implementation
• User security and access management procedures
• Practice training and implementation guidelines
• Electronic help guides available in application
• Assigned Customer Account Managers

Maintenance
• Dedicated 24/7 Helpdesk with trained support specialists
• BMC Remedy tool
• SLA definitions for support tickets
• ITIL-based processes and monitoring tools
• Emergency Bug Fix/Maintenance Release processes
Attestation to QMS Report
I, Kelly Port, attest that the above Quality Management System report is accurate in describing the
approaches utilized by Falcon to ensure consistent quality and performance across all functions within
our system.
Signed: _______________________________________
Title: _Director_________________________________
Date: _5/29/2013_______________________________
Audit Attestations
Test Requirement: DTR 170.314.d.2-4: Protect Audit Log
Test Requirement: DTR 170.314.d.2-5: Detection of Audit Log Alteration
This is a letter of attestation that identifies the method(s) by which Falcon EHR protects 1) the recording of actions related to electronic health information, 2) the recording of audit log status, and 3) the recording of encryption status from being changed, overwritten, or deleted by the EHR technology.
Copying Electronic Health Information
Falcon EHR does not allow the user to copy a patient’s health information. We do allow the user
to deactivate (delete), view, modify, and create the following elements, but we do not provide
the functionality to copy them. We cannot think of a clinical scenario in which copying the
following elements would be relevant to our users:
• Problems
• Medications
• Allergies
• Labs
• Demographics
In support of the clinical summary measure, Falcon does allow download and transmission of a patient's encounter. In the clinical summary screen, the Recipient, Sender, Date, and an indication of whether the summary was Printed or Emailed (with the email address) are recorded. See screen shot below.
Audit Log Protection. Actions and statuses recorded in accordance with paragraph (d)(2)(i)
must not be capable of being changed, overwritten, or deleted by the EHR technology.
The audit screen within Falcon records actions related to create, view and modify transactions.
The screen itself that displays the records of these actions is not editable. It is a view-only
screen where audit transactions can be reviewed or monitored. See screen shot below.
Audit Log Status and Encryption Status
The audit log for a physician practice is enabled by default within Falcon.
As shown above, the screen does not allow a user to disable audit log status or disable
encryption status.
Detection. EHR technology must be able to detect whether the audit log has been altered.

To meet the prevention requirement, Falcon has implemented a trigger on the audit table that stores the audit transactions; the trigger prevents users from accessing or altering the table. If a user tries to access the table, an exception is thrown. Falcon has opted to remove all potential access to the table rather than implement functionality to detect whether the audit log has been altered.

To detect whether the audit log has been compromised, functionality has been implemented to record transactions (username, last setup, and last change date-time) if any inserts or updates are made to the audit log itself. Because of the trigger, the expectation is that this log should have 0 transactions at all times. If any events get logged, the development team will be alerted, and the usernames and transaction date-times will be reviewed to determine any breach that requires follow-up.
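The trigger-plus-monitoring approach described above can be illustrated with a small example. This is a minimal Python sketch using SQLite (chosen only for illustration; the attestation does not name Falcon's database platform), with hypothetical table and column names.

# Any UPDATE or DELETE against the audit table raises an exception, and a
# tamper log, expected to stay empty, is available for monitoring.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE audit_log (id INTEGER PRIMARY KEY, action TEXT, at TEXT);
CREATE TABLE audit_tamper_log (username TEXT, changed_at TEXT);

-- Recorded audit entries are read-only: block alteration at the database level.
CREATE TRIGGER no_audit_update BEFORE UPDATE ON audit_log
BEGIN SELECT RAISE(ABORT, 'audit log is read-only'); END;
CREATE TRIGGER no_audit_delete BEFORE DELETE ON audit_log
BEGIN SELECT RAISE(ABORT, 'audit log is read-only'); END;
""")

db.execute("INSERT INTO audit_log (action, at) VALUES ('view chart', '2014-06-04')")
try:
    db.execute("DELETE FROM audit_log")           # alteration attempt
except sqlite3.IntegrityError as exc:
    print("blocked:", exc)                        # -> blocked: audit log is read-only

# Detection: this monitoring query is expected to return 0 at all times.
print(db.execute("SELECT COUNT(*) FROM audit_tamper_log").fetchone()[0])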
Technical Spec
Project Name: Falcon Device Data Encryption

Table of Contents
1.0 Introduction
2.0 Falcon Core
  2.1 Architectural Principles & Technologies Employed
3.0 Falcon Mobile
  3.1 Architectural Principles
  3.2 Technologies Employed
  3.3 User Ability to Disable Encryption
4.0 Walkthrough of Encryption
  4.1 Initiate a Session
  4.2 Examine stored data

Table of Figures
Figure 1 - Excerpt from FIPS 140-2 showing FIPS acceptance of Falcon Mobile's encryption algorithm
Figure 2 - App settings
Figure 3 - Falcon Mobile icon on iOS home screen
Figure 4 - Falcon Mobile home screen
Figure 5 - Column structure of the PatientRoundingSchedules table
Figure 6 - Patient data is non-readable in the database

DaVita Confidential
February 5, 2011
Falcon Device Encryption Summary

1.0 Introduction

Falcon supports two types of devices utilized by the end customer: personal computers and tablets. Our product offering and technology stack are described as Falcon Core, employed on personal computers, and Falcon Mobile, employed on tablets. Falcon employs encryption processes and protocols to manage and protect patient data. The purpose of this document is to give the reader a detailed overview of the technologies employed and how those technologies work together to ensure that patient data is secure while housed on these end user devices.
2.0 Falcon Core

As a standard practice, Falcon Core does not persist any PHI data on end user devices. Regardless, any data stored on the client is stored in a secure manner.

2.1 Architectural Principles & Technologies Employed

Stored data is inherently de-identified, de-normalized, and then encrypted using AES-256 encryption. Data is never left decrypted at rest; it must be decrypted before it can be accessed.

Accessing decrypted data requires both an access username and an access password. The encryption password, as well as the access username and access password, can be changed at will. Changing any one of these keys will cause all clients to delete their local data stores. By practice, the encryption password is changed with every code release – typically every two weeks.

The encryption password, access username, and access password are stored in the central database in a secure co-location facility which has limited access and follows DaVita access policies. The end user is not able to modify any of these settings.

All data, including PHI data, is encrypted during transmission using standard HTTPS protocols.
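As a concrete illustration of AES-256 encryption at rest, here is a minimal Python sketch using AES-256-GCM from the "cryptography" package. Key handling, rotation, and the de-identified record are simplified placeholders and are not Falcon's implementation.

# Encrypt a de-identified record so it is unreadable at rest; decrypt only
# at access time.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # rotated with each code release in practice
aes = AESGCM(key)

record = b'{"patient_ref": "token-1234", "note": "de-identified payload"}'
nonce = os.urandom(12)                      # unique nonce per encryption
ciphertext = aes.encrypt(nonce, record, None)

# Data stays encrypted at rest; decryption requires the key (plus, in Falcon's
# scheme, the access username and access password checks around it).
assert aes.decrypt(nonce, ciphertext, None) == record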
3.0 Falcon Mobile

Falcon has invested in protecting patient data for our mobile product offering, as PHI data is stored on the local device. Falcon Mobile is designed from the ground up to ensure that patient data is always encrypted and therefore secure to the degree possible using industry standard design patterns and technologies.
3.1 Architectural Principles

Falcon Mobile uses SQLite to store data on a mobile device. In order to make this medium secure for the storage of sensitive information, including patient data, we have made the decision to keep all sensitive data (including patient-identifying data) encrypted at all times. This includes times when the data would be considered "at rest", but it also includes all times when the application is actually running.
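A minimal Python sketch of this pattern: sensitive columns are encrypted before they ever reach SQLite, so rows are unreadable in the database file and plaintext exists only in application memory. The key handling and column names are hypothetical (the PatientRoundingSchedules table name is borrowed from Figure 5), and AES-GCM with a 128-bit key stands in for the product's algorithm.

# Seal fields with AES-128-GCM before insert; unseal only on read.
import os, sqlite3
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

aes = AESGCM(AESGCM.generate_key(bit_length=128))

def seal(value: str) -> bytes:
    nonce = os.urandom(12)
    return nonce + aes.encrypt(nonce, value.encode(), None)   # nonce || ciphertext

def unseal(blob: bytes) -> str:
    return aes.decrypt(blob[:12], blob[12:], None).decode()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE PatientRoundingSchedules (id INTEGER PRIMARY KEY, patient_name BLOB)")
db.execute("INSERT INTO PatientRoundingSchedules (patient_name) VALUES (?)",
           (seal("Jane Doe"),))

stored = db.execute("SELECT patient_name FROM PatientRoundingSchedules").fetchone()[0]
print(stored[:8])        # opaque bytes as stored in the database file
print(unseal(stored))    # "Jane Doe", decrypted only in application memory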
3.2 Technologies Employed

Falcon Mobile uses AES-128 for encryption of all patient-identifying data. Per Drummond Test Scenario 170.314.d.7, we have confirmed that the algorithm used is listed as "FIPS APPROVED" in FIPS 140-2 (see Figure 1).

Figure 1 - Excerpt from FIPS 140-2 showing FIPS acceptance of Falcon Mobile's encryption algorithm
3.3 User Ability to Disable Encryption

Falcon Mobile does not give the end user any ability to disable encryption. It is always active for all users. Please see Figure 2, which shows that there is no setting for adjusting the encryption.

Figure 2 - App settings