Inside the Mind of the 21st Century Customer
Alan Page
Evaluating a Scenario: Ease of Use, Responsiveness, Usefulness, Visual Appeal
Example:
Showcase (marketing speak): Xbox Knows You Better
Scenario (project manager speak): Identity
End-to-End Test (something actionable): Step in front of the console, get recognized, and see my curated content
(Optional) Variations: Light/Dark settings; Child/Adult/Gender/Height/Size/Apparel; Curated content
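As a rough illustration (not from the original deck), a mapping like the one above could be captured in a small record so the team can track showcases, scenarios, and their variations together; the class and field names below are hypothetical.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ScenarioRecord:
        # Hypothetical structure: maps marketing speak to something a test team can act on.
        showcase: str                 # marketing speak
        scenario: str                 # project manager speak
        end_to_end_test: str          # something actionable
        variations: List[str] = field(default_factory=list)  # optional variations to cover

    record = ScenarioRecord(
        showcase="Xbox Knows You Better",
        scenario="Identity",
        end_to_end_test="Step in front of the console, get recognized, and see my curated content",
        variations=["Light/Dark settings", "Child/Adult/Gender/Height/Size/Apparel", "Curated content"],
    )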
Ease of Use: Am I able to complete the scenario? Is it complicated? Do I need to perform extraordinary steps to get what I need done? Are there glitches in the system that make it difficult? Was it hard to find how to execute this scenario? Are the features hidden? Is the experience consistent?
Responsiveness: Did I feel like the scenario was fast and fluid? At any point did I feel like I had to wait a frustrating amount of time? Was I effectively distracted while waiting for an action to take place (e.g., a movie or animation playing while I wait)?
Usefulness: Would you use this scenario yourself to accomplish this specific outcome? Does this scenario meet a need for our consumer? Does this let me do something that I want to do?
Visual Appeal: Is the experience exciting to see and hear? Is the UI polished? Does the UX make the experience enjoyable?
Evaluation Scale
5 Love It! – You love it so much you'd shout it out at the top of your lungs through a bullhorn from every rooftop you encounter.
4 Like It – You like it a lot and might mention it during a lull in conversation at a dinner party.
3 Meh – You can live with it and neither like it nor hate it. It's nothing special, and there are some improvements that could be made.
2 Don't Like It – The experience leaves a bad taste in your mouth. You'd use it if you really had to; otherwise, you'd stay away from it.
1 Hate It! – You hate it so much that you would only use it if you were under a Hogwarts compulsion spell.
0 Not Implemented
Scenario | Love It | Like It | Meh | Don't Like | Hate
Do this | 43 | 10 | 1 | 1 | 1
Do that | 3 | 2 | 9 | 14 | 20
Do the other thing | 40 | 14 | 5 | 0 | 1
Do something else | 20 | 3 | 5 | 7 | 15
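One way to read a table like this is to roll the counts up into an average score on the 0-5 scale. The sketch below is not part of the original deck; the counts simply mirror the example rows above.

    # Roll up per-scenario rating counts (Love It=5 ... Hate=1) into an average score.
    RATING_VALUES = {"Love It": 5, "Like It": 4, "Meh": 3, "Don't Like": 2, "Hate": 1}

    results = {
        "Do this":            {"Love It": 43, "Like It": 10, "Meh": 1, "Don't Like": 1,  "Hate": 1},
        "Do that":            {"Love It": 3,  "Like It": 2,  "Meh": 9, "Don't Like": 14, "Hate": 20},
        "Do the other thing": {"Love It": 40, "Like It": 14, "Meh": 5, "Don't Like": 0,  "Hate": 1},
        "Do something else":  {"Love It": 20, "Like It": 3,  "Meh": 5, "Don't Like": 7,  "Hate": 15},
    }

    for scenario, counts in results.items():
        total = sum(counts.values())
        weighted = sum(RATING_VALUES[label] * n for label, n in counts.items())
        print(f"{scenario}: {weighted / total:.2f} average ({total} evaluations)")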
Controlled "A/B" Experiments
Concept: Randomly split traffic between two (or more) versions: A (Control) and B (Treatment). Collect metrics of interest, then analyze.
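A minimal sketch of the mechanics, assuming a hash-based bucketing scheme (this is illustrative, not the implementation described in the talk): each user is deterministically assigned to Control (A) or Treatment (B), and the metric of interest is recorded per variant.

    import hashlib

    def assign_variant(user_id: str, experiment: str, treatment_share: float = 0.5) -> str:
        """Deterministically bucket a user into Control (A) or Treatment (B)."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform in [0, 1]
        return "B" if bucket < treatment_share else "A"

    # Collect the metric of interest (e.g., clicks on the search button) per variant.
    clicks = {"A": 0, "B": 0}
    exposures = {"A": 0, "B": 0}

    def record_exposure(user_id: str, clicked: bool) -> None:
        variant = assign_variant(user_id, "find_a_house_widget")
        exposures[variant] += 1
        clicks[variant] += int(clicked)

Hashing on a stable user id keeps a returning user in the same variant for the life of the experiment.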
Let's play a game. The three choices are: A wins, B wins, or A and B are approximately the same. (*) Results are based on statistically significant data.
MSN Real Estate: "Find a house" widget variations. Metric: revenue to Microsoft, generated every time a user clicks the search/find button.
Raise your left hand if you think A wins. Raise your right hand if you think B wins. Don't raise your hand if you think they're about the same.
[Screenshots of variants A and B]
MSN Real Estate: If you did not raise a hand, please sit down. If you raised your right hand, please sit down. A was 8.5% better.
Office Online: Test of a new design for the Office Online homepage. Metric: clicks on revenue-generating links (shown in red in the screenshots).
Raise your left hand if you think A wins. Raise your right hand if you think B wins. Don't raise your hand if you think they're about the same.
[Screenshots of variants A and B]
Office Online: If you did not raise a hand, please sit down. If you raised your right hand, please sit down. A was 64% better. The Office Online team wrote: "A/B testing is a fundamental and critical Web services… consistent use of A/B testing could save the company millions of dollars."
MSN Home Page Search Box. Metric: click-through rate for the search box and popular searches.
Differences: A has a taller search box (overall size is the same), a magnifying glass icon, and "popular searches"; B has a big search button.
Raise your left hand if you think A wins. Raise your right hand if you think B wins. Don't raise your hand if you think they're about the same.
[Screenshots of variants A and B]
Search Box: If you raised any hand, please sit down. A and B performed about the same. Don't change for the sake of change; sometimes what you have is enough.
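Behind each "wins" or "about the same" call is a statistical test on the collected metrics. As a rough sketch (not from the deck), a two-proportion z-test on click-through rates could look like the following; the counts are made-up placeholders.

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
        """Return (relative lift of B over A, two-sided p-value) for two click-through rates."""
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        pooled = (clicks_a + clicks_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return (p_b - p_a) / p_a, p_value

    # Made-up counts: the two variants look nearly identical here.
    lift, p = two_proportion_z_test(clicks_a=4_980, n_a=100_000, clicks_b=5_020, n_b=100_000)
    print(f"lift: {lift:+.1%}, p-value: {p:.3f}")  # a large p-value supports "about the same"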
Tester / Developer / Designer
Math Geek / Developer / Designer
Choose your test / experiment. Measure the results. Determine what to do next.
Build → Measure → Learn
http://kadavy.net/blog/posts/aa-testing/
“Given the baseline conversion rate on opens, the sample size simply isn’t large enough to get a reliable result.”
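One way to avoid that trap is to estimate the required sample size before the experiment starts. The sketch below uses the standard normal-approximation formula for comparing two conversion rates; the baseline rate and minimum detectable lift are illustrative, not numbers from the talk.

    from statistics import NormalDist

    def required_sample_size(baseline_rate, min_detectable_lift, alpha=0.05, power=0.8):
        """Approximate per-variant sample size to detect a relative lift in a conversion rate."""
        p1 = baseline_rate
        p2 = baseline_rate * (1 + min_detectable_lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        pooled = (p1 + p2) / 2
        numerator = (z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
                     + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
        return int(numerator / (p2 - p1) ** 2) + 1

    # Example: a 20% open rate and a hoped-for 5% relative lift need tens of thousands of samples per variant.
    print(required_sample_size(baseline_rate=0.20, min_detectable_lift=0.05))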
Tracing (Build) → Collect logs & upload (Measure) → Analyze (Learn)
startTime = GetTime();
CoolActivity();
Trace(ID, ACTIVITY_LENGTH, GetTime() - startTime);
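The snippet above is pseudocode; a small runnable sketch of the same build/measure/learn idea in Python might look like this (the event name, log file, and logger setup are illustrative).

    import json
    import logging
    import time

    logging.basicConfig(filename="traces.log", level=logging.INFO, format="%(message)s")

    def trace(event_id: str, metric: str, value: float) -> None:
        """Write one trace record; a log uploader ships traces.log for later analysis."""
        logging.info(json.dumps({"id": event_id, "metric": metric, "value": value, "ts": time.time()}))

    def cool_activity() -> None:
        time.sleep(0.1)  # stand-in for the real work being measured

    start = time.perf_counter()
    cool_activity()
    trace("cool_activity", "ACTIVITY_LENGTH", time.perf_counter() - start)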
Is your product ready to ship?
Does your product satisfy your internal quality criteria? Is your product ready for customer feedback? Is your team ready for customer feedback?
There are no Silver Bullets
Customer data can help. Use both qualitative and quantitative data. Customer data can replace some testing, but data does not replace excellent human testing.