EHR Usability Test Report Of Rcopia Product Version 4


EHR Usability Test Report of Rcopia
Product Version: 4
Report based on NISTIR 7742 Common Industry Format for Usability Test Reports

Date of Usability Test: September 1 to September 8, 2017
Date of Report: September 10, 2017

Report Prepared By:
The Usability People, LLC
4000 Legato Road, Suite 1100
Fairfax, VA 22033
www.TheUsabilityPeople.com

TABLE OF CONTENTS

Executive Summary
Introduction
Method
    Participants
    Study Design
    Tasks
    Test Location
    Test Environment
    Test Forms and Tools
    Participant Instructions
    Procedure
    Usability Metrics
    Data Scoring
Results
    Data Analysis and Reporting
    Reliability
    Effectiveness and Efficiency
    Satisfaction
    Specific Task Result Details (Including criteria evaluated)
Discussion of Findings
    Effectiveness
    Efficiency
    Satisfaction
    Summary of Major Findings
    Risk Analysis
    Areas for Improvement
Appendices
    Appendix A: Recruiting Screener
    Appendix B: Informed Consent Form
    Appendix C: Participant Guide
    Appendix D: System Usability Scale Questionnaire
    Appendix E: Computer System Usability Questionnaire
    Appendix F: References

Executive Summary

On September 1 to September 8, 2017, The Usability People, LLC conducted a summative usability test of the DrFirst.com Rcopia V4 system. The test was conducted in the Fairfax, VA office of The Usability People over remote tele-conferencing sessions using GotoMeeting. The purpose was to test and validate the usability of the current user interface and provide evidence of the usability of Rcopia V4 as the EHR Under Test (EHRUT). Ten (10) healthcare providers matching the target demographic criteria participated in the usability test, using the EHRUT in simulated but representative tasks.

The study, based upon the ISO 9241-11 standard, focused on measuring the effectiveness of, efficiency of, and satisfaction with Rcopia among a sample of participants representing potential users of the system. Performance data were collected on nine (9) tasks typically conducted on an EHR. Tasks were created based upon the criteria specified within the test procedure structure for evaluating conformance of Electronic Health Record (EHR) technology to the certification criteria identified in 45 CFR Part 170 Subpart C of the Health Information Technology: 2015 Edition Health Information Technology (Health IT) Certification Criteria [1].

Results of the study indicated that the Rcopia system was satisfactory with regard to effectiveness and efficiency and that the participants were very satisfied with the system.

[1] 2015 Edition Health Information Technology (Health IT) Certification Criteria, 2015 Edition Base Electronic Health Record (EHR) Definition, and ONC Health IT Certification Program Modifications.

Introduction

The Electronic Health Record System Under Test (EHRUT) for this study, Rcopia V4, was specifically designed to present medical information to healthcare providers on desktop computers in standard healthcare settings. This study tested and validated the usability of the Rcopia V4 user interface and provides evidence of the usability of Rcopia with representative exercises and in realistic user conditions. To this end, measures of effectiveness and efficiency, such as time on task, number of errors made, and completion rates, were captured during usability testing. Satisfaction was assessed and user comments collected using two industry-standard questionnaires: the System Usability Scale (SUS) and the Computer System Usability Questionnaire (CSUQ).

Method

Participants

Ten (10) individuals (7 women and 3 men) participated in the usability test of the EHRUT (Rcopia V4). Participants were recruited from contacts obtained from DrFirst.com retail customers and from a database of participants maintained by The Usability People, LLC. Those who responded to the invitation to take part in the study were directed to an online questionnaire that served as the participant screener. (The screening questionnaire is provided as Appendix A.) Participants meeting the criteria for participation in the study were contacted and scheduled via email, telephone, and an online scheduling system, and were confirmed for their testing session.

Participants in the usability test of Rcopia had a variety of healthcare backgrounds and demographic characteristics. Table 1 presents participant characteristics, including demographics, professional experience, computing experience, and previous EHR experience. Participant characteristics reflect the audience of current and future users and meet the criteria designated in the 2015 Edition Certification Companion Guide for Safety-enhanced design, 45 CFR 170.315(g)(3). None of the participants were from the vendor organization (DrFirst.com) that produced and supplied the evaluated system, nor did any participant have any direct connection to the testing organization (The Usability People, LLC). All participants were compensated for their time.

Table 1. Participant Characteristics

Part ID   Gender   Age Range   Education                              Assistive Technology Needs
P01       Female   30 to 39    Trade/Technical/Vocational Training    None
P02       Male     50 to 59    Associate degree                       None
P03       Female   30 to 39    Bachelor's degree                      None
P04       Female   30 to 39    Associate degree                       None
P05       Male     40 to 49    Master's degree                        None
P06       Female   30 to 39    Bachelor's degree                      None
P07       Female   20 to 29    Bachelor's degree                      None
P08       Male     30 to 39    Bachelor's degree                      None
P09       Female   40 to 49    Master's degree                        None
P10       Female   20 to 29    Trade/Technical/Vocational Training    None

Summary of Participant Characteristics:

Participants had occupational experience and expertise that aligned with the capabilities under test. The cohort of users selected as participants was varied, consistent with the product and its intended users, and was not limited to clinicians. The demographic characteristics of the test participants reflected the audience of current and future users.

Gender
    Male: 3
    Female: 7

Age Range
    20 to 29: 2
    30 to 39: 5
    40 to 49: 2
    50 to 59: 1
    60 to 69: 0
    70 to 79: 0

Education
    High school graduate, diploma or the equivalent: 0
    Some college credit, no degree: 0
    Trade/technical/vocational training: 2
    Associate degree: 2
    Bachelor's degree: 4
    Master's degree: 2
    Doctorate degree: 0

Years of Experience with Rcopia
    None: 9
    Up to 3 years: 0
    3 to 5 years: 0
    5 to 10 years: 1
    More than 10 years: 0

Study Design

The overall objective of this usability test was to uncover areas where the Rcopia system performed well – that is, effectively, efficiently, and with satisfaction – and areas where the system failed to serve the clinical documentation and workflow needs of users. Data from this test may be used as a baseline for future tests of updated versions of Rcopia and/or for comparing Rcopia with other EHRs presenting the same tasks. In short, this testing serves both as a means to record or benchmark current usability and to identify areas where improvements must be made.

Participants had a range of experience with EHRs in general, and also had direct experience and/or training with the Rcopia system. Participants completed the test of Rcopia usability during individual 30-40 minute GotoMeeting sessions. During the test, each participant interacted with various components of the Rcopia system. Each participant was provided with the same instructions.

Rcopia was evaluated for effectiveness, efficiency, and satisfaction as defined by the following measures, collected and analyzed for each participant:

- Number of tasks successfully completed without assistance
- Time to complete the tasks
- Number and description of errors
- Path deviations
- Participant's verbalizations (comments)
- Participant's satisfaction ratings of the system
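As an illustration of how these per-participant, per-task measures might be recorded for later analysis, the following sketch is a minimal data structure in Python. It is purely illustrative: the field names and example values are assumptions made for this sketch, not the study's actual data schema.

```python
# Illustrative only: one possible record of the per-task measures listed above.
# Field names are assumptions for this sketch, not the study's actual schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TaskObservation:
    participant_id: str                         # e.g. "P01"
    task_number: int                            # 1 through 9
    completed_unassisted: bool                  # task success without assistance
    task_time_sec: Optional[float]              # None if the task failed or timed out
    error_descriptions: List[str] = field(default_factory=list)
    path_deviations: int = 0
    verbalizations: List[str] = field(default_factory=list)
    ease_of_use_rating: Optional[int] = None    # post-task rating, 1-5

# Example usage with made-up values:
obs = TaskObservation("P01", 1, True, 107.0, [], 1,
                      ["Found the allergy search quickly"], 4)
```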

Tasks

The Usability People constructed a total of nine (9) tasks, in close collaboration with the DrFirst.com team, to be realistic and representative of the clinical documentation and workflow activities a user might engage in with the Rcopia system in actual medical settings. The nine (9) tasks were created based upon the criteria specified within the test procedure structure for evaluating conformance of Electronic Health Record (EHR) technology to the certification criteria as defined in 45 CFR Part 170 Subpart C of the Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology.

The tasks focused on the following 2015 Edition certification criteria specified by ONC:

- Section 170.315(a)(1) Computerized provider order entry – medications
- Section 170.315(a)(4) Drug-drug, drug-allergy interaction checks
- Section 170.315(a)(6) Problem list
- Section 170.315(a)(7) Medication list
- Section 170.315(a)(8) Medication allergy list
- Section 170.315(b)(3) Electronic prescribing

A copy of the tasks presented to participants in the usability test of Rcopia can be found in Appendix C.

Test Location

All participants were tested on the Rcopia system during remote conferencing sessions using GotoMeeting. Each participant was requested in advance to secure a quiet room with minimal distractions and a desktop or laptop computer that could connect to the Internet for a GotoMeeting session. Although the type of computer, operating system, and display resolution of each remote participant's system was unknown, the system that was used by the test administrator and controlled by the remote participant was a Dell Inspiron laptop running the Windows 10 operating system at a resolution of 1366x768 pixels. During a given GotoMeeting session, only the test administrator and the participant communicated with one another.

The GotoMeeting usability test session was conducted by a test administrator from the testing organization (The Usability People, LLC) working from a small conference room at The Usability People's Fairfax, VA location. Seated near the administrator, a data logger from the testing organization also took detailed notes on each session, including user comments and other ratings following each task. During a session, both the test administrator and the data logger(s) could see only the participant's screen and hear the participant's comments, questions, and responses.

Test Environment

While the EHRUT typically would be used in a hospital, healthcare office, or ambulatory care facility, testing of the Rcopia system was conducted via remote connection during individual GotoMeeting sessions. Each participant called into a GotoMeeting session and was connected by the test administrator to the application.

The Rcopia application itself ran in a web browser over a LAN connection, using a sample database that was set up specifically for the test. Participants used a mouse and keyboard when interacting with the EHRUT and were given remote control of the administrator's workstation to perform the tasks.

Test Forms and Tools

As part of the usability test, several documents and instruments were used. Examples of the documents used during the usability test, including an informed consent form, the tasks, and post-test questionnaires, can be found in Appendices B to E, respectively.

Participants' interaction with Rcopia was captured and recorded digitally using the Morae screen-capture software running on the test administrator's workstation. Verbal responses were recorded through either the microphone integrated into the participant's computer or through a telephone connection. This information was electronically transmitted to the administrator and to the data logger during each test session.

Participant Instructions

The administrator read the following instructions aloud to each participant:

Thank you for participating in this study. Your input is very important. Our session today will last about 30 to 40 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions.

Please note that we are not testing you; we are testing the system. Therefore, if you have any difficulty, this may mean that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application.

Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions.

All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.

Participants were then given nine (9) tasks to complete.

Procedure

Upon connection to the online meeting tool (GotoMeeting), each participant was greeted, his or her identity verified, and matched to a name on the participant schedule. Participant names were replaced with participant IDs so that a given individual's data cannot be linked to his or her identity. Prior to beginning testing, each participant reviewed and signed an informed consent form (see Appendix B).

A usability test administrator from the staff of The Usability People administered the test. The administrator moderated the session by providing both verbal and written instructions for the overall usability test and for each of the tasks comprising the test. The administrator also monitored task success, path deviations, and the number and description of errors, and audio-recorded participant verbal comments. A data logger logged task times, obtained post-task rating data, and took notes on participant comments and administrator feedback.

For each of the nine (9) tasks, participants were provided written instructions on screen and were able to flip back and forth between the EHRUT and the written instructions. Following the administrator's instructions, each participant performed each task by first reading the task and then stating in his or her own words his or her interpretation of the task requirements. When the participant's interpretation matched the actual goal of the task, the administrator instructed the participant to begin, and task timing began. Task time was stopped and recorded when the test administrator observed on the administrator's workstation that the participant had successfully completed the task. If a participant failed to complete a task within the expected amount of time for that task, the task was marked as "Timed Out."

After each task, the test administrator asked the participant, "On a scale from 1 to 5, where 1 is 'Very Difficult' and 5 is 'Very Easy,' how satisfied were you with the ease of use for this task?" This same procedure was conducted for each of the nine (9) tasks.

Following completion of the nine (9) EHR tasks, the administrator electronically presented two post-test questionnaires to the participant: the System Usability Scale (SUS; see Appendix D) and the Computer System Usability Questionnaire (CSUQ; see Appendix E). After the participant completed both questionnaires, the administrator thanked each participant for his or her time and allowed the participant to make any comments on or ask any questions about the system and/or the tasks presented. For each session, the participant's schedule, demographic information, task success rate, time on task, errors, deviations, verbal responses, and post-test questionnaires were digitally recorded. The system was then reset to the proper test conditions for the next participant.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741, November 2010), EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

- Effectiveness of Rcopia by measuring participant success rates and errors.
- Efficiency of Rcopia by measuring the average task time and path deviations.
- Satisfaction with Rcopia by measuring ease-of-use ratings.

Table 2 details how tasks were scored, errors evaluated, and the time data analyzed.

Data Scoring

Table 2. Scoring Protocols for Effectiveness, Efficiency, and Satisfaction

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of Successes was calculated for each task and then divided by the total number of times that task was attempted. Results are provided as a percentage.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Fail." No task times were taken for failed attempts. The total number of errors was calculated for each task and divided by the total number of times that task was attempted. Results are presented as the average error rate. Note: not all deviations are counted as errors.

Effectiveness: Prompted Successes
Because some tasks are dependent upon the successful completion of previous tasks, participants could receive a limited number of "prompts" to help prepare the system data as a prerequisite for subsequent tasks. When a participant was able to complete the data entry on a task with 3 or fewer prompts, the task was counted as an "Assisted" completion. No task times were recorded for Assisted completions.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if, for example, the participant navigated to an incorrect screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control.

Efficiency: Task Time
Each task was timed from the administrator's prompt "Begin" until the participant said "Done." If the participant failed to say "Done," timing stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task.

Satisfaction: Ease of Use Ratings
Participants' subjective impression of the ease of use of the application was measured by administering both a single post-task question and two post-session questionnaires. After each task, the participant rated on a scale of 1 to 5 their subjective satisfaction with performance on the task. These data were averaged across participants.

Satisfaction: System Satisfaction
To measure participants' confidence in and likeability of the EHR overall, the testing team administered electronic versions of the System Usability Scale (SUS) and the Computer System Usability Questionnaire (CSUQ). See the SUS questionnaire in Appendix D and the CSUQ in Appendix E.
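To make the scoring protocols in Table 2 concrete, the sketch below (Python, using made-up data for a single task; none of the values or names come from the study) shows how a per-task completion percentage, mean task time, and mean satisfaction rating of the kind reported in Table 3 could be computed. Failed, timed-out, and assisted attempts carry no task time, as Table 2 specifies, so they are excluded from the time average.

```python
# Illustrative scoring for one task across ten participants (hypothetical data).
from statistics import mean, stdev

# outcome: "success" (unassisted), "assisted" (3 or fewer prompts), or "fail";
# time_sec is None when no task time was recorded (fail, timeout, or assisted).
observations = [
    {"outcome": "success",  "time_sec": 95,   "ease_rating": 4},
    {"outcome": "success",  "time_sec": 120,  "ease_rating": 5},
    {"outcome": "fail",     "time_sec": None, "ease_rating": 3},
    {"outcome": "success",  "time_sec": 80,   "ease_rating": 4},
    {"outcome": "assisted", "time_sec": None, "ease_rating": 4},
    {"outcome": "success",  "time_sec": 110,  "ease_rating": 5},
    {"outcome": "success",  "time_sec": 100,  "ease_rating": 4},
    {"outcome": "success",  "time_sec": 90,   "ease_rating": 5},
    {"outcome": "success",  "time_sec": 105,  "ease_rating": 4},
    {"outcome": "success",  "time_sec": 115,  "ease_rating": 5},
]

attempts = len(observations)
successes = sum(1 for o in observations if o["outcome"] == "success")
completion_rate = successes / attempts * 100          # reported as a percentage

completed_times = [o["time_sec"] for o in observations if o["time_sec"] is not None]
ratings = [o["ease_rating"] for o in observations]

print(f"Completion rate: {completion_rate:.0f}%")
print(f"Mean task time: {mean(completed_times):.0f} s (SD {stdev(completed_times):.0f} s)")
print(f"Mean task satisfaction: {mean(ratings):.1f} (SD {stdev(ratings):.1f})")
```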

Results

Data Analysis and Reporting

The results of the usability test of the Rcopia system were analyzed according to the methods described in the Usability Metrics section above and are detailed below. Note that the results should be evaluated relative to the study objectives and goals, as outlined in the Study Design section above. The data should yield actionable results that, if corrected, yield a material, positive impact on user performance.

Reliability

During the entire data collection phase, it was observed that the system provided a consistent and reliable interface to each participant as they completed their tasks. As each participant completed their assigned tasks, the system provided the same information and responded to their input with the same verbiage and the same mode of communication (e.g., pop-up message or embedded assistance).

Effectiveness and Efficiency

Table 3 presents a summary of overall task performance, showing mean time on task, task completion rates, mean path deviations, and mean task satisfaction for each task:

Table 3. Usability Test Results

Task                                              Mean Task Time (SD)   Completion Rate (%)   Mean # Path Deviations (SD)   Mean Task Satisfaction (SD)
Task 1. Med Allergy List - Add Allergy            1:47 (0:54)           100%                  0.70 (0.64)                   4.30 (1.00)
Task 2. Drug-Allergy Interaction Check            0:56 (0:37)           100%                  0.50 (0.67)                   4.10 (1.22)
Task 3. Electronic Prescribing                    2:49 (1:10)           90%                   1.60 (1.74)                   3.50 (1.20)
Task 4. Electronic RX Cancel/Change               3:21 (1:35)           90%                   2.00 (1.84)                   3.30 (1.10)
Task 5. Electronic RX - Renew/Refill              0:42 (0:21)           100%                  0.20 (0.60)                   4.80 (0.60)
Task 6. Problem List CDS Intervention             0:35 (0:12)           100%                  0.00 (0.00)                   5.00 (0.00)
Task 7. Medication List; Medication Stop          0:22 (0:09)           100%                  0.00 (0.00)                   4.80 (0.60)
Task 8. Medication Order; Drug/Drug Interaction
Task 9. Medication Allergy List                                         100%                  0.00 (0.00)                   4.50 (0.67)

As Table 3 shows, relative to optimal performance standards as defined by DrFirst.com and The Usability People, participant performance in the Rcopia EHR usability test was quite satisfactory. The overall average task completion rate was ninety-eight (98) percent.

Satisfaction

Individual Task Satisfaction

Participants verbally indicated their satisfaction with the ease of use for each task using a scale of "1" ("Very Difficult") to "5" ("Very Easy"). As Figure 1 shows, individual task satisfaction ranged from a low of 3.3 out of 5 on Task 4 (Electronic RX Cancel/Change) to a high of 5 out of 5 on Task 6 (Problem List CDS Intervention).

Figure 1. Satisfaction Ratings of Individual Tasks

Individual Participant Satisfaction

In general, the participants were satisfied with the ease of use of the Rcopia system. The following chart displays overall satisfaction for each participant. The average overall task satisfaction rate was 4.33 out of 5.

System Usability Scale

The System Usability Scale (SUS) is a simple, 10-item Likert-type attitude scale providing a global subjective assessment of usability from the user's perspective (John Brooke at Digital Equipment Company developed the SUS in 1986). The SUS is scored from 0 to 100; scores under 60 represent systems with less than optimal usability, while scores over 80 are considered better than average. See Appendix D for a copy of the SUS.

The mean total SUS score for the Rcopia EHR was seventy-six (76), with individual scores ranging from a low of forty-eight (48) to a high of ninety-five (95). Overall, participant-users rated their satisfaction with the Rcopia EHR system to be within the very high range of a usable and satisfying EHR.

The following chart shows the SUS score for each participant:
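The report itself does not reproduce the SUS scoring arithmetic, so the following sketch (Python) applies Brooke's standard scoring convention: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to yield a 0-100 score. The example responses are made up for illustration and are not participant data from this study.

```python
# Standard SUS scoring (Brooke): ten 1-5 responses mapped to a 0-100 score.
def sus_score(responses):
    """responses: list of ten 1-5 ratings, in questionnaire item order."""
    assert len(responses) == 10
    total = 0
    for item_number, r in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (r - 1) if item_number % 2 == 1 else (5 - r)
    return total * 2.5

example_responses = [4, 2, 4, 1, 5, 2, 4, 2, 4, 2]   # hypothetical participant
print(sus_score(example_responses))                   # -> 80.0
```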

Computer System Usability Questionnaire

Using the Computer System Usability Questionnaire (CSUQ; Lewis, J. R. (1995). IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and Instructions for Use. International Journal of Human-Computer Interaction, 7:1, 57-78), participants rated each of the 19 items of the CSUQ on a scale from 1 to 7, with a rating of 7 being most in agreement with the positively-worded item. Responses for each item were summed and averaged into three subscales – System Usefulness, Information Quality, and Interface Quality – and an overall scale. See Appendix E for a copy of the CSUQ.

Figure 2 displays CSUQ ratings for each of the four scales. In general, participants in the Rcopia study rated system usability to be very high. On Interface Quality, the average score for the participants was 5.63/7; on Information Quality, the average score was 5.57/7; on System Usefulness, the average score was 5.96/7; and the overall average CSUQ score was 5.76/7.
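The subscale averaging described above can be illustrated with a short sketch (Python). The item-to-subscale groupings follow the published CSUQ layout in Lewis (1995) – System Usefulness from items 1-8, Information Quality from items 9-15, Interface Quality from items 16-18, and Overall from all 19 items – and the 1-7 responses shown are invented for illustration, not study data.

```python
# CSUQ subscale averaging on a 1-7 scale (higher = more agreement, as in this report).
from statistics import mean

responses = [6, 6, 5, 6, 6, 6, 6, 7,   # items 1-8  (System Usefulness)
             5, 6, 5, 6, 6, 5, 6,      # items 9-15 (Information Quality)
             6, 5, 6,                  # items 16-18 (Interface Quality)
             6]                        # item 19

scales = {
    "System Usefulness":   responses[0:8],
    "Information Quality": responses[8:15],
    "Interface Quality":   responses[15:18],
    "Overall":             responses[0:19],
}
for name, items in scales.items():
    print(f"{name}: {mean(items):.2f}/7")
```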

Specific Task Result Details (Including criteria evaluated)

Task 1. Med Allergy List - Add Allergy

Average Time on Task: 1:47 (SD 0:54)
Average Task Satisfaction: 4.30 (SD 1.00)
Average # Path Deviations: 0.70 (SD 0.64)
Percent Success: 100%

Task 2. Drug-Allergy Interaction Check

Average Time on Task: 0:56 (SD 0:37)
Average Task Satisfaction: 4.10 (SD 1.22)
Average # Path Deviations: 0.50 (SD 0.67)
Percent Success: 100%

Task 3. Electronic Prescribing

Average Time on Task: 2:49 (SD 1:10)
Average Task Satisfaction: 3.50 (SD 1.20)
Average # Path Deviations: 1.60 (SD 1.74)
Percent Success: 90%

Task 4. Electronic RX Cancel/Change

Average Time on Task: 3:21 (SD 1:35)
Average Task Satisfaction: 3.30 (SD 1.10)
Average # Path Deviations: 2.00 (SD 1.84)
Percent Success: 90%
