MODULE 6: Usability Measurement at the Overall Test/Product Level


UX Curriculum 2020
MODULE 6: Usability Measurement at the Overall Test/Product Level
Jeff Sauro, PhD & Jim Lewis, PhD

Framework to Improve the User Experience
[Diagram: the framework combines Six Sigma (Define, Measure, Analyze & Improve) with the Scientific Method (Hypothesize, Operationalize, Randomize, Analyze, Synthesize). Its elements are users, tasks, hypotheses, metrics, and methods: measure before, make changes, measure after.]

Objectives
1. Learn the most common UX metrics for overall assessment
2. Learn a useful combination of overall UX metrics and open-ended questions
3. Go through examples
4. Discussion

Measure the User Experience: Overall Level
Attitudes (subjective):
Perceived usability
Perceived ease-of-use
Perceived usefulness
Satisfaction
Loyalty

Common UX Metrics for Overall Assessment

Attitudes Predict Behavior Through Intentions
TRA (Theory of Reasoned Action, Ajzen & Fishbein, 1967): Attitude toward Behavior → Intention → Behavior
TAM (Technology Acceptance Model, Davis, 1990): Perceived Usefulness (U) and Perceived Ease of Use (E) → Attitude toward Using (A) → Behavioral Intention to Use (BI) → Actual System Use (r ≈ .50)
When measured well, specific attitudes predict many specific behaviors (r ≈ .5).
More info: https://measuringu.com/attitudes-behavior/

Satisfaction: General & Specific
General: Overall, how satisfied are you with the [Brand]/[Product]?
Specific: How satisfied are you with: customer service, setup and customization, the purchasing process?
Response options: Not at All Satisfied, Slightly Satisfied, Somewhat Satisfied, Very Satisfied, Completely Satisfied

Satisfied but not Loyal
90% of car customers are satisfied or very satisfied, but only 40% repurchase the same brand.

Loyalty: Net Promoter Score
"How likely are you to recommend to a colleague or friend?" (0 = Not at all likely, 10 = Extremely likely)
Detractors: 0-6; Passives: 7-8; Promoters: 9-10
NPS = % of Promoters (9s & 10s) - % of Detractors (0-6)
Net Promoter, NPS, and Net Promoter Score are trademarks of Satmetrix Systems, Inc., Bain & Company, and Fred Reichheld.
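To make the arithmetic concrete, here is a minimal Python sketch of the NPS calculation above. The function name and the sample ratings are made up for illustration.

```python
# Hypothetical helper: compute a Net Promoter Score from 0-10
# likelihood-to-recommend (LTR) responses. Promoters rate 9-10,
# passives 7-8, detractors 0-6; NPS = %promoters - %detractors.

def net_promoter_score(ltr_responses):
    """Return NPS as a percentage from a list of 0-10 LTR ratings."""
    n = len(ltr_responses)
    promoters = sum(1 for r in ltr_responses if r >= 9)
    detractors = sum(1 for r in ltr_responses if r <= 6)
    return 100.0 * (promoters - detractors) / n

# Made-up sample: 4 promoters, 3 passives, 3 detractors -> NPS = 10
print(net_promoter_score([10, 9, 9, 10, 8, 7, 8, 5, 6, 3]))  # 10.0
```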

Is the NPS Harmful?
Validity
1. Does the NPS really predict the future (of business metrics)?
2. Is the Net Promoter Score really better than satisfaction?
3. Does the LTR item predict actual recommendation behavior?
4. Do you really need 11 points in the scale? More points are generally better (especially for top-box scoring).
5. Is a single item sufficient to measure a construct? Yes, for simple constructs.
Reliability
6. Is the NPS reliable? Yes, as reliable as satisfaction and brand measures.
7. Is the NPS box scoring screwy? Top boxes are better predictors of behavior.
More info: https://measuringu.com/nps-harmful/; https://tinyurl.com/y4zu75lk

NPS & Future Revenue: Reasonable Evidence
Historical growth: 2000-2002 NPS data was used to predict 1999-2002 growth (Reichheld, 2006).
Future growth: 2013 NPS data was a reasonable predictor of 2013-2015 growth in 11 of 14 industries, r ≈ .35 (Sauro, 2019).

NPS Better than Sat? Not really, but it's short & ubiquitous.
[Charts: NPS vs. satisfaction as predictors of growth. Adjusted R-squared: NPS .90; satisfaction (Consumer Reports) .94; satisfaction (JD Power) .92.]

Does NPS predict actual recommendation behavior?
[Chart: percentage of respondents reporting a recommendation or purchase within 30-90 days, by likelihood-to-recommend rating (0-10).]
Between 51% and 77% of recommendations come from promoters.
More info: https://measuringu.com/promoters-recommend/

Does NPS predict actual recommendation behavior?
[Chart: percentage of negative and positive comments by likelihood-to-recommend rating (0-10).]
Detractors accounted for 90% of negative comments.

NPS Report Card
Predicts growth: C
Better than satisfaction: D
Number of points: A-
Single item sufficient: B
NPS reliable: A-
NPS box scoring wacky: B
Diagnoses UX problems: N/A
Tells you what to fix: N/A
Should always be used: N/A

System Usability Scale (SUS)
1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
Response scale: 1 (Strongly Disagree) to 5 (Strongly Agree)

Using SUS
Item scores are summed and rescaled so the total runs from 0 to 100 instead of from 0 to 40.
SUS scores are not percentages.
[Scale graphic: 0-100, labeled Poor toward the low end, Average near the middle, and Excellent at the high end.]
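A minimal Python sketch of the conventional SUS scoring behind that rescaling; the odd/even item adjustment follows Brooke's standard scoring (not spelled out on the slide), and the function name and sample responses are illustrative.

```python
# Minimal sketch of standard SUS scoring: odd-numbered (positively worded)
# items contribute (response - 1), even-numbered (negatively worded) items
# contribute (5 - response); the 0-40 raw sum is rescaled to 0-100.

def sus_score(responses):
    """responses: list of ten 1-5 ratings in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # rescale the 0-40 raw sum to 0-100

# Made-up example responses (not real data):
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```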

Raw SUS Scores & Percentile Ranks
[Chart: raw SUS scores (0-100) mapped to percentile ranks against a database of about 500 products. The average SUS score of 68 falls at roughly the 50th percentile. Grade bands by percentile rank: A above the 80th, B 60th-80th, C 40th-60th, D 20th-40th, F below the 20th.]
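As a sketch of how those grade bands might be applied in analysis code, here is a hypothetical helper that maps a percentile rank to a letter grade using the cut-offs read off the chart; converting a raw SUS score to a percentile requires a normative dataset and is not shown.

```python
# Hypothetical helper: letter grade from a SUS percentile rank,
# using the 80/60/40/20 bands shown on the chart.

def sus_grade(percentile_rank):
    """percentile_rank: 0-100 relative to a normative database (e.g., ~500 products)."""
    if percentile_rank >= 80:
        return "A"
    if percentile_rank >= 60:
        return "B"
    if percentile_rank >= 40:
        return "C"
    if percentile_rank >= 20:
        return "D"
    return "F"

print(sus_grade(50))  # "C" -- an average score of 68 sits near the 50th percentile
```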

Consumer Software: SUS & NPS
[Scatterplot: SUS scores (roughly 50-100) plotted against NPS (roughly -20 to +50) for consumer software.]
SUS explains 30%-50% of NPS.

SUS & Future Behavior (Nonlinear)
[Chart: return rates by SUS score. A SUS near 50 corresponded to a 24% return rate, near 68 to a 16% return rate, and near 80 to a 10% return rate.]

UMUX-Lite
1. [This system's] capabilities meet my requirements.
2. [This system] is easy to use.
5- or 7-point agree/disagree response options.
UMUX-Lite can predict SUS scores with 90% accuracy.
More info: https://www.researchgate.net/publication/339333658 Perceived Usability and the Modified Technology Acceptance Model
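A minimal sketch of how the two UMUX-Lite items can be rescaled to a 0-100 score, assuming both items share the same agreement scale. The function name and sample ratings are hypothetical, and the published regression that maps this score onto SUS is not reproduced here.

```python
# Hypothetical helper: rescale the two UMUX-Lite items to a 0-100 score.

def umux_lite(item1, item2, scale_points=5):
    """item1/item2: ratings from 1..scale_points; returns a 0-100 score."""
    max_raw = 2 * (scale_points - 1)
    return 100.0 * ((item1 - 1) + (item2 - 1)) / max_raw

# Made-up ratings:
print(umux_lite(4, 5))                   # 87.5 on 5-point items
print(umux_lite(6, 6, scale_points=7))   # ~83.3 on 7-point items
```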

SUPR-Q
A four-factor standardized questionnaire. Average the subscale items to get subscale scores, average the subscales to get the overall score, and interpret scores using normative percentiles.
Usability
  This website is easy to use.
  It is easy to navigate within the website.
Credibility
  The information on the website is credible.
  The information on the website is trustworthy.
Loyalty
  How likely are you to recommend this website to a friend or colleague?
  I will likely visit this website in the future.
Appearance
  I found the website to be attractive.
  The website has a clean and simple presentation.
More info: https://measuringu.com/10-things-suprq/
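A Python sketch of the scoring steps listed above. The item keys and the halving of the 0-10 recommend item (to put it on roughly the same range as the 5-point items) are illustrative assumptions, not an official scoring specification.

```python
# Sketch of SUPR-Q-style scoring: average items within each subscale,
# then average the subscales for an overall score.

def suprq_scores(responses):
    """responses: dict of item -> rating (1-5, except 'recommend' on 0-10)."""
    subscales = {
        "usability":   ["easy_to_use", "easy_to_navigate"],
        "credibility": ["credible", "trustworthy"],
        "loyalty":     ["recommend", "visit_future"],
        "appearance":  ["attractive", "clean_presentation"],
    }
    adjusted = dict(responses)
    adjusted["recommend"] = responses["recommend"] / 2.0  # assumed rescaling of the 0-10 item

    subscale_means = {
        name: sum(adjusted[i] for i in items) / len(items)
        for name, items in subscales.items()
    }
    overall = sum(subscale_means.values()) / len(subscale_means)
    return subscale_means, overall

# Made-up ratings:
subs, overall = suprq_scores({
    "easy_to_use": 4, "easy_to_navigate": 5,
    "credible": 4, "trustworthy": 4,
    "recommend": 9, "visit_future": 4,
    "attractive": 3, "clean_presentation": 4,
})
print(subs, round(overall, 2))  # overall ~4.06
```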

SUPR-Qm
1. I can't live without the app on my phone.
2. The app is the best app I've ever used.
3. I can't imagine a better app than this one.
4. I would never delete the app.
5. Everyone should have the app.
6. I like discovering new features on the app.
7. The app has all the features and functions you could ever want.
8. I like to use the app frequently.
9. The app is delightful.
10. This app integrates well with the other features of my mobile phone.
11. I will definitely use this app many times in the future.
12. The design of this app makes it easy for me to find the information I'm looking for.
13. I find the app to be attractive.
14. The app's capabilities meet my requirements.
15. It is easy to navigate within the app.
16. The app is easy to use.
More info: https://uxpajournal.org/supr-qm-measure-mobile-ux/

A Useful Combination

Attitudes Predict Behavior Through Intentions
TRA (Theory of Reasoned Action, Ajzen & Fishbein, 1967): Attitude toward Behavior → Intention → Behavior
TAM (Technology Acceptance Model, Davis, 1990): Perceived Usefulness (U) and Perceived Ease of Use (E) → Attitude toward Using (A) → Behavioral Intention to Use (BI) → Actual System Use (r ≈ .50)
Mapping UX metrics onto this chain:
Attitude metrics: SUS, SEQ, UMUX-Lite, SUPR-Q (Usability, Appearance, Trust/Credibility, Loyalty)
Intention metrics: Likelihood to Recommend (NPS), Likelihood to Use/Purchase
Behaviors: Referrals (r ≈ .79), Usage (r ≈ .53), Purchases (r ≈ .78)
When measured well, specific attitudes predict many specific behaviors (r ≈ .5).
More info: https://measuringu.com/attitudes-behavior/

Comprehensive Metrics Models

Google HEART Framework
Five components:
Happiness: attitudinal measures (CSAT, Ease, LTR)
Engagement: frequency and amount of usage
Adoption: number of new users
Retention: number of continuing users
Task Success: completions, time, errors
HEART builds on other frameworks. It is light on validation, but it is a catchy mnemonic, and you're likely already using many of the components.
More info: https://measuringu.com/heart-framework/
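Purely as an illustration of how the five components might be tracked side by side, here is a hypothetical record that follows the component-to-metric mapping above; all names and numbers are made up.

```python
# Illustrative only: one way to record HEART signals for a product.
heart_metrics = {
    "happiness":    {"csat": 4.1, "ease": 4.3, "ltr": 8.2},                 # attitudinal measures
    "engagement":   {"sessions_per_week": 5.4, "minutes_per_session": 7.9},  # frequency/amount of usage
    "adoption":     {"new_users_this_month": 12_400},                        # new users
    "retention":    {"pct_active_after_90_days": 0.62},                      # continuing users
    "task_success": {"completion_rate": 0.88, "mean_time_s": 41, "error_rate": 0.07},
}
```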

Leading and Lagging Metrics in UX Management
Lagging measures are outcomes; leading measures are drivers of outcomes. Depending on perspective, some measures are both.
Monitor lags, but focus on affecting leads: leads are more diagnostic than lags.
Mixed-method design and development approaches (e.g., UCD, scientific thinking) improve primary/secondary leads and monitor intermediate lags.

Example of Use of Test-Level Metrics

Facebook Multi-Product Benchmark, August 2019

SUS Scores (System Usability Scale)
Facebook products: Groups 79.88, Events 76.80, Fundraisers 72.60, Gen. Pop 72.60, Watch 70.89, Games 70.43, Marketplace 69.11, Pages 67.30, Gaming Video 55.30
Comparison products:* Twitter 79.4, Instagram 78.1, LinkedIn 76.9
Sample sizes: Groups N=40, Events N=39, Fundraisers N=40, Gen. Pop N=39, Watch N=42, Games N=41, Marketplace N=45, Pages N=42, Gaming Video N=44
*Comparison scores are estimated SUS scores predicted from SUPR-Q ratings in the 2018 Social Media SUPR-Q benchmark.
[Chart plots the scores against the SUS percentile-rank curve with grade bands (A-F), adjective labels from Worst Imaginable to Best Imaginable, and the average SUS score of 68.]

Post-Product Attitudes Graphs
[Bar chart: mean agreement ratings (1 = strongly disagree, 5 = strongly agree) for Pages, Events, Groups, and Fundraisers on seven attributes: Modern, Uncluttered, Fast, Focused, Organized, Attractive, and Understandable. Means ranged from about 3.5 to 4.6. Calculations include 90% confidence intervals.]

Facebook Multi-Product Benchmark, October 2019

UMUX-Lite Scores (Usability Metric for User Experience - Lite) Across Benchmarks

Product                          Q1 UMUX-Lite  Q2 UMUX-Lite  Q3 UMUX-Lite  Q1 Percentile  Q2 Percentile  Q3 Percentile
Gaming Video / Games (Q1, Q2)*   4.07          4.12          4.18          60%            69%            78%
General Product                  --            4.34          4.40          --             94%            97%
Groups                           --            4.37          4.36          --             96%            95%
Marketplace                      4.25          4.35          4.25          87%            95%            87%
News Feed                        4.52          4.58          4.46          99%            99%            98%
Profiles                         --            4.27          4.40          --             89%            97%
Search                           --            4.47          4.33          --             99%            93%
Stories                          3.99          4.15          4.17          44%            74%            77%
Watch                            4.09          4.19          4.20          63%            80%            81%

*Instant Games and Gaming Video were tested together as one product during Q1 and Q2; only Gaming Video tasks were included in
Calculations include 90% confidence intervals.

Example of Use of Product-Level Metrics (Benchmark Survey)

Usability Ratings (SUS) from 2011, 2014, & 2017
[Chart: SUS scores for five products measured in 2011, 2014, and 2017, plotted against grade bands (A, B, C). Scores ranged from roughly 75 to 86.]

NPS from 2011, 2014, & 2017
[Chart: Net Promoter Scores for the same five products in 2011, 2014, and 2017. Scores ranged from single digits to the mid-50s.]

Summary
1. Common standardized UX metrics for overall assessment
   For overall, high-level assessment, the focus is on measuring attitudes.
   Attitudes predict behavior through intentions.
2. A useful combination of standardized UX metrics and questions
   Fundamental: perceived usability (SUS), ease/usefulness (UMUX-Lite), loyalty (NPS), and open-ended questions
   Alternatives: C-SAT, SUPR-Q, SUPR-Qm
Link to Quiz 6 on course website: https://www.measuringu.com/fb, pw: measuringu

Discussion
[Closing slide listing services: Remote UX Testing Platform (Desktop & Mobile), UX Research, UX Curriculum 2020, Measurement & Statistical Analysis, Eye Tracking & Lab-Based Testing, Reports, Books, Training & Workshops.]

