Testing Your Taxonomy
Ron Daniel, Jr.
Taxonomy Strategies LLC
November 3, 2006
Copyright 2006 Taxonomy Strategies LLC. All rights reserved.
Testing Your Taxonomy
Your taxonomy will not be perfect or complete, and it will need to be modified based on changing content, user needs, and other practical considerations. Developing a taxonomy incrementally requires measuring how well it is working in order to plan how to modify it. In this session, you will learn qualitative and quantitative taxonomy testing methods, including:
- Tagging representative content to see if it works, and determining how much content is good enough for validation.
- Card sorting, use-based scenario testing, and focus groups to determine if the taxonomy makes sense to your target audiences and to provide clues about how to fix it.
- Benchmarks and metrics to evaluate usability test results, identify coverage gaps, and provide guidance for changes.
Qualitative taxonomy testing methods
- Walk-thru (show & explain). Who: taxonomist, SME team. Requires: rough taxonomy. Validates: approach; appropriateness to task.
- Walk-thru (check conformance to editorial rules). Who: taxonomist. Requires: draft taxonomy; editorial rules. Validates: consistent look and feel.
- Usability testing (contextual analysis: card sorting, scenario testing, etc.). Who: users, test proctor, analyst. Requires: rough taxonomy; tasks & answers. Validates: reaction to taxonomy; tasks are completed successfully.
- User satisfaction (survey). Who: users, test proctor, analyst. Requires: rough taxonomy; UI mockup; search prototype. Validates: reaction to new interface; reaction to search results; time to complete task is reduced.
- Tagging samples (tag sample content with taxonomy). Who: taxonomist, team, indexers. Requires: sample content; rough taxonomy (or better). Validates: content 'fit'; fills out content inventory; training materials for people & algorithms; basis for quantitative methods.
Walk-through method: Show & explain
Example facet structure for ABC Computers.com:
- Audience: All; Business Employee; Education; Gaming Enthusiast; Home; Investor; Job Seeker; Media; Partner; Shopper (First Time, Experienced, Advanced); Supplier
- Line of Business: All; Home & Home Office; Gaming; Government, Education & Healthcare; Medium & Large Business; Small Business
- Region-Country: All; Asia-Pacific; Canada; EMEA; Japan; Latin America & Caribbean; United States
- Product Family: Desktops; MP3 Players; Monitors; Networking; Notebooks; Printers; Projectors; Servers; Services; Storage; Televisions; Other Brands
- Content Type: Award; Case Study; Contract & Warranty; Demo; Magazine; News & Event; Product Information; Services; Solution; Specification; Technical Note; Tool; Training; White Paper; Other Content Type
- Competency: Business & Finance; Interpersonal Development; IT Professionals Technical Training; IT Professionals Training & Certification; PC Productivity; Personal Computing Proficiency
- Industry: Banking & Finance; Communications; E-Business; Education; Government; Healthcare; Hospitality; Manufacturing; Petrochemicals; Retail / Wholesale; Technology; Transportation; Other Industries
- Service: Assessment, Design & Implementation; Deployment; Enterprise Support; Client Support; Managed Lifecycle; Asset Recovery & Recycling; Training
Walk-through method: Editorial rules consistency check
Rule areas to check: abbreviations; ampersands; capitalization; General…, More…, Other…; languages & character sets; length limits; multiple parents; plural vs. singular form; scope notes; serial comma; sources of terms; spaces; synonyms & acronyms; term order (alphabetic or …); term label order (direct vs. inverted); …
Sample editorial rules:
- Abbreviations: Abbreviations, other than colloquial terms and acronyms, shall not be used in term labels. Example: Public Information, NOT Public Info.
- Ampersands: The ampersand [&] character shall be used instead of the word 'and'. Example: Licensing & Compliance, NOT Licensing and Compliance.
- Capitalization: Title case capitalization shall be used. Example: Customer Service, NOT CUSTOMER SERVICE, Customer service, or customer service.
- General…, More…, Other…: The term labels "General…", "More…", and "Other…" shall be used for categories which contain content items that are not further classifiable. Examples: "Other Property", "Other Services", "General Information", "General Audience".
- …
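Several of these rules are mechanical enough to check automatically. Below is a minimal Python sketch of such a checker for three of the rules above; the abbreviation list, small-word allowances, and sample labels are illustrative assumptions, not part of the original deck.

```python
import re

# Illustrative abbreviation list and small-word allowances; a real checker
# would follow the project's own editorial style guide.
ABBREVIATIONS = {"Info.", "Dept.", "Mgmt."}
SMALL_WORDS = {"and", "or", "of", "the", "in", "for", "vs."}

def check_term_label(label):
    """Return a list of editorial-rule violations for one term label."""
    problems = []
    # Abbreviations rule: no abbreviations other than acronyms/colloquial terms.
    if any(abbr in label for abbr in ABBREVIATIONS):
        problems.append("contains an abbreviation")
    # Ampersands rule: use '&' rather than the word 'and'.
    if re.search(r"\band\b", label):
        problems.append("uses 'and' instead of '&'")
    # Capitalization rule: every word capitalized, except allowed small words and symbols.
    words = label.split()
    if not all((not w[0].isalpha()) or w[0].isupper() or w.lower() in SMALL_WORDS for w in words):
        problems.append("not in title case")
    return problems

for term in ["Public Info.", "Licensing and Compliance", "customer service", "Customer Service"]:
    print(term, "->", check_term_label(term) or "OK")
```

A checker like this only catches the rules that can be expressed as string tests; rules such as scope notes or multiple parents still need a human walk-through.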
Usability testing method: Closed card sort
For an alpha test of a grocery site:
- 15 testers put each of 71 best-selling product types into one of 10 pre-defined categories.
- Product types where fewer than 14 of the 15 testers chose the same category were flagged.
Results (for minimum size = 4 votes):
Testers agreeing   Cumulative % of products   With poly-hierarchy
15/15              54%                        69%
14/15              70%                        83%
13/15              77%                        93%
12/15              83%                        100%
11/15              85%                        100%
<11/15             100%
Example: "Cocoa Drinks – Powder" is best categorized in both "Beverages" and "Grocery". How to improve? Allow products in multiple categories.
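The agreement calculation behind a closed card sort like this is a simple tally. A minimal Python sketch follows, assuming the raw data is a mapping from each product type to the list of categories the testers chose; the sample votes are invented.

```python
from collections import Counter

# Hypothetical vote data: product type -> the category each of the 15 testers chose.
votes = {
    "Cocoa Drinks - Powder": ["Beverages"] * 8 + ["Grocery"] * 7,
    "Bananas": ["Produce"] * 15,
}

THRESHOLD = 14  # flag anything where fewer than 14 of 15 testers agreed

for product, choices in votes.items():
    best_category, count = Counter(choices).most_common(1)[0]
    if count < THRESHOLD:
        print(f"FLAG {product!r}: best category {best_category!r} got only {count}/{len(choices)} votes")
```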
Usability testing method: Task-based card sorting (1)
15 representative questions were selected:
- From the perspective of various organizational units
- From the most frequent website searches
- From the most frequently accessed website content
Correct answers to the questions were agreed in advance by the team.
15 users were tested:
- They did not work for the organization
- They represented the target audiences
Testers were asked "Where would you look for …?":
- "Under which facet: Topic, Commodity, or Geography?"
- Then, "Under which category?"
- Then, "Under which sub-category?"
Tester choices were recorded:
- Testers were asked to "think aloud," and notes were taken on what they said
- Pre- and post-test questions were asked, and tester answers were recorded
Usability testing method: Task-based card sorting (2)
Example question (3): What is the average farm income level in your state?
Facets:
1. Topics
2. Commodities
3. Geographic Coverage
Categories under 1. Topics:
1.1 Agricultural Economy
1.2 Agriculture-Related Policy
1.3 Diet, Health & Safety
1.4 Farm Financial Conditions
1.5 Farm Practices & Management
1.6 Food & Agricultural Industries
1.7 Food & Nutrition Assistance
1.8 Natural Resources & Environment
1.9 Rural Economy
1.10 Trade & International Markets
Sub-categories under 1.4 Farm Financial Conditions:
1.4.1 Costs of Production
1.4.2 Commodity Outlook
1.4.3 Farm Financial Management & Performance
1.4.4 Farm Income
1.4.5 Farm Household Financial Well-being
1.4.6 Lenders & Financial Markets
1.4.7 Taxes
Analysis of task-based card sorting (1)
Find-it tasks and the categories users chose:
1. Cotton: Cotton; Asia; Cotton
2. Mad cow: Cattle; Food Safety; Cattle
3. Farm income: Farm Income; US States; Farm Income
4. Fast food: Food Consumption; Diet Quality & Nutrition; Food Expenditures; Diet Quality & Nutrition
5. WIC: WIC Program
6. GE Corn: Corn
7. Foodborne illness: Foodborne Disease; Consumer Food Safety; Foodborne Disease
8. Food costs: Food Prices; Market Structure; Market Analysis; Food Expenditures; Retailing & Wholesaling
9. Tobacco: Tobacco
10. Small Farms: Farm Structure
11. Traceability: Food System; Labeling Policy; Food Safety Innovations; Food Safety Policy; Food Prices
12. Hunger: Food Security
13. Trade balance: Commodity Trade; Trade & Intl Markets; Commodity Trade; Market Analysis; Commodity Trade
14. Conservation: Cropping Practices; Conservation Policy
15. Trade restrictions: Trade Policy; Food Safety & Trade; WTO; Market Analysis; Commodity Trade
Analysis of task-based card sorting (2)
In 80% of the trials, users looked for information under the categories where we expected them to look for it.
Breaking up topics into facets makes it easier to find information, especially information related to commodities.
Analysis of task-based card sorting (3)
Test question          % Correct   % Agree
1. Cotton              91%         82%
2. Mad cow             73%         64%
3. Farm income         100%        55%
4. Fast food           91%         73%
5. WIC                 100%
6. GE corn             100%
7. Foodborne illness   82%
8. Food costs          55%         27%
9. Tobacco             100%
10. Small farms        91%
11. Traceability       36%         18%
12. Hunger             100%        73%
13. Trade balance      36%         64%
14. Conservation       91%
15. Trade restrictions 55%         36%
Low scores flag where a change is required or possibly required:
- Trade balance: possible error in the categorization of this question, because 64% thought the answer should be "Commodity Trade."
- On some trials (e.g., Food costs and Trade restrictions) only about half of the users looked in the right category, and only 27-36% agreed on the category.
- The policy for "Traceability" needs to be clarified; use quasi-synonyms.
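A sketch of how the two columns above could be computed from raw trial records. The deck does not define "% agree" precisely; here it is read as the share of users who chose the single most popular category, which is one plausible interpretation, and the sample choices are invented.

```python
from collections import Counter

def score_question(choices, expected):
    """Score one find-it task given the categories chosen by each user."""
    n = len(choices)
    pct_correct = sum(choice == expected for choice in choices) / n
    # '% agree' taken here as the share of users choosing the most popular category.
    top_category, top_count = Counter(choices).most_common(1)[0]
    return pct_correct, top_count / n, top_category

# Hypothetical trial data for one question.
choices = ["Commodity Trade", "Commodity Trade", "Trade & Intl Markets", "Market Analysis"]
print(score_question(choices, expected="Trade & Intl Markets"))
```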
User satisfaction method: Card sort questionnaire (1)
- Was it easy, medium or difficult to choose the appropriate Topic? (Easy / Medium / Difficult)
- Was it easy, medium or difficult to choose the appropriate Commodity? (Easy / Medium / Difficult)
- Was it easy, medium or difficult to choose the appropriate Geographic Coverage? (Easy / Medium / Difficult)
User satisfaction method: Card sort questionnaire (2)
[Results chart, with responses ranging from "Easier" to "More Difficult".]
Task-based card sorting "bakeoff"
Goal: compare two different sets of headings, "Blue" and "Orange".
Method:
- Scenarios were written for 8 general tasks.
- 15 users used one set of headings, then the other, to accomplish each task.
- Users were surveyed on satisfaction after each task, then again at the end.
Be aware of test design: be sure to counterbalance the order in which people see the different schemes. This is easier with an even number of participants.
Results by task:
Task                                    Both   Blue   Orange   T-Test
Improve Processes with New Technology   6.90   8.80   5.00     0.06
Get Email Remotely                      6.20   5.40   7.00     0.26
Look Up Features/Model Information      7.70   8.40   7.00     0.23
Research a Product                      6.90   8.20   5.60     0.28
Compare Product Features                6.10   7.00   5.20     0.19
Try out a Trial Copy                    8.90   8.40   9.40     0.14
Choose Secure Product                   6.90   6.80   7.00     0.93
Choose Software I Already Know          6.10   6.20   6.00     0.95
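The T-Test column presumably reflects a paired comparison of the Blue and Orange scores for each task; the deck does not state exactly which statistic was run. Below is a minimal sketch of that kind of test with scipy, using invented per-user ratings.

```python
from scipy import stats

# Hypothetical per-user satisfaction ratings (1-10) for one task,
# with the same 15 users rating both heading sets.
blue   = [9, 8, 9, 10, 8, 9, 7, 9, 10, 8, 9, 9, 8, 10, 9]
orange = [5, 6, 4, 5, 6, 5, 4, 6, 5, 5, 4, 6, 5, 5, 4]

# Same users rated both schemes, so a paired (related-samples) t-test is appropriate.
t_stat, p_value = stats.ttest_rel(blue, orange)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```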
Strengths and weaknesses of task-based card sorts
A task-based card sort is a test of the navigation headings alone, without additional context from the viewed pages.
Strengths:
- Due to the low-fidelity interface, it is easy to create and conduct.
- As a pure navigation test, it provides concentrated information about navigation alone, which makes it particularly appropriate for comparing two navigation schemes.
- It provides concentrated information about the wording of headings and spotlights any confusion they may cause; it is a tightly focused method to gather this type of information. These findings appear in the qualitative analysis more than in the quantitative.
Weaknesses:
- Due to the lack of context, it is a difficult test of navigation.
- Due to the lack of content, users will have limited confidence that they have reached the right spot. This will be reflected in lower satisfaction scores than for the fully implemented navigation.
User interface survey: Which search UI is 'better'?
Criteria:
- User satisfaction
- Success completing tasks
- Confidence in results
- Fewer dead ends
Methodology:
- Design tasks ranging from specific to general
- Time performance
- Calculate success rates
- Survey subjective criteria
Pay attention to survey hygiene: participant selection, counterbalancing, T-scores.
Source: Yee, Swearingen, Li, & Hearst
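Counterbalancing can be as simple as alternating which interface each participant sees first, so that order effects cancel out across the group. A small sketch follows; the participant IDs and interface names are placeholders.

```python
# 16 hypothetical participants; alternate which interface each sees first
# so that both orders are covered equally.
participants = [f"P{i:02d}" for i in range(1, 17)]
orders = [("Baseline", "Faceted"), ("Faceted", "Baseline")]

assignments = {p: orders[i % 2] for i, p in enumerate(participants)}
for p, order in assignments.items():
    print(p, "uses", " then ".join(order))
```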
User interface survey: Results (1)
Which interface would you rather use for these tasks?
Task                                           Google-like Baseline   Faceted Category
Find images of roses                           15                     16
Find all works from a certain period           2                      30
Find pictures by 2 artists in the same media   1                      29
…
Overall assessment:
                                     Google-like Baseline   Faceted Category
More useful for your usual tasks     4                      28
Easiest to use                       8                      23
Most flexible                        6                      24
More likely to result in dead-ends   28                     3
Helped you learn more                1                      31
Overall preference                   2                      29
…
Source: Yee, Swearingen, Li, & Hearst
User interface survey: Results (2)
[Comparison of the Faceted Category interface and the Google-like Baseline interface.]
Source: Yee, Swearingen, Li, & Hearst
Tagging samples: How many items?
Goal                                                     Number of items              Criteria
Illustrate metadata schema                               1-3                          Random (excluding junk)
Develop training documentation                           10-20                        Show typical & unusual cases
Qualitative test of small vocabulary (<100 categories)   25-50                        Random (excluding junk)
Quantitative test of vocabularies*                       3-10X number of categories   Use computer-assisted methods when more than 10-20 categories. Pre-existing metadata is the most meaningful.
*Quantitative methods require large amounts of tagged content. This requires specialists, or software, to do the tagging. Results may be very different than how "real" users would categorize content.
Tagging samples: Manually tagged metadata sample
Attribute             Values
Title                 Jupiter's Ring System
URL                   http://ringmaster.arc.nasa.gov/jupiter/
Description           Overview of the Jupiter ring system. Many images, animations and references are included for both the scientist and the public.
Content Types         Web Sites; Animations; Images; Reference Sources
Audiences             Educators; Students
Organizations         Ames Research Center
Missions & Projects   Voyager; Galileo; Cassini; Hubble Space Telescope
Locations             Jupiter
Business Functions    Scientific and Technical Information
Disciplines           Planetary and Lunar Science
Time Period           1979-1999
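A tagged item like the one above can be carried around as a plain record and validated against the controlled vocabularies of the schema. The sketch below does this for two attributes; the vocabulary subsets are illustrative assumptions, not the actual NASA taxonomy.

```python
# Controlled vocabularies for two attributes (illustrative subsets only).
VOCAB = {
    "Content Types": {"Web Sites", "Animations", "Images", "Reference Sources"},
    "Audiences": {"Educators", "Students", "Researchers"},
}

record = {
    "Title": "Jupiter's Ring System",
    "URL": "http://ringmaster.arc.nasa.gov/jupiter/",
    "Content Types": ["Web Sites", "Animations", "Images", "Reference Sources"],
    "Audiences": ["Educators", "Students"],
    "Locations": ["Jupiter"],
}

# Flag any value that is not in the controlled vocabulary for its attribute.
for attribute, allowed in VOCAB.items():
    for value in record.get(attribute, []):
        if value not in allowed:
            print(f"Check tag: {attribute} = {value!r} is not in the controlled vocabulary")
print("Validation pass complete")
```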
Tagging samples: Spreadsheet for tagging tens to hundreds of items
1) Clickable URLs for the sample content
2) Review a small sample and describe it
3) Drop-downs for tagging (including an 'Other' entry for the unexpected)
4) Flag questions
Rough bulk tagging: Facet demo (1)
- Collections: 4 content sources (NTRS, SIRTF, Webb, Lessons Learned)
- Taxonomy: converted from MultiTes format into RDF for Seamark
- Metadata: converted from existing metadata on web pages, or created using a simple automatic classifier (string matching with terms & synonyms)
- Scale: 250k items, ~12 metadata fields, 1.5 weeks of effort
- Interface: out-of-the-box (OOTB) Seamark user interface, plus logo
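The "simple automatic classifier" mentioned above can be little more than matching taxonomy labels and their synonyms against the text. A rough sketch follows, with an invented two-term taxonomy fragment; real bulk tagging would add stemming, scoring, and disambiguation.

```python
import re

# Invented taxonomy fragment: preferred term -> synonyms/variants to match in the text.
TAXONOMY = {
    "Planetary and Lunar Science": ["planetary science", "lunar science", "planets", "moons"],
    "Propulsion": ["propulsion", "rocket engine", "thruster"],
}

def rough_tag(text):
    """Return taxonomy terms whose label or synonyms appear in the text."""
    lowered = text.lower()
    found = []
    for term, synonyms in TAXONOMY.items():
        patterns = [term.lower()] + [s.lower() for s in synonyms]
        if any(re.search(r"\b" + re.escape(p) + r"\b", lowered) for p in patterns):
            found.append(term)
    return found

print(rough_tag("Overview of the Jupiter ring system, with images of the planet's moons."))
```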
Rough bulk tagging: OOTB facet demo (2)
Quantitative methods
Quantitative methods are possible when:
- You have a large quantity of tagged data
- You have logs of how people are using your site
Best quantitative process: Query log & click trail examination
How can we characterize users and what they are looking for? Query log & click trail examination.
- Only 30-40% of organizations interested in taxonomy governance examine query logs.*
- Basic reports provide plenty of real value.
- Greatest value comes from: identifying a person as responsible for search quality; starting a "measure & improve" mindset.
- Greatest challenges: getting a person assigned (≥ 10%); getting logs turned back on.
UltraSeek Reporting: top queries; queries with no results; queries with no click-through; most requested documents; query trend analysis; complete server usage summary.
Click trail packages: iWebTrack, NetTracker, OptimalIQ, SiteCatalyst, Visitorville, WebTrends.
*Source: Metadata Maturity Model presentation, Ron Daniel, ESS'05
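The basic reports listed above fall out of a few passes over the search log. The sketch below assumes a simple CSV log with query, result_count, and clicked_url columns; that format is a placeholder, not UltraSeek's actual log layout.

```python
import csv
from collections import Counter

top_queries = Counter()
no_results = Counter()
no_click = Counter()

# search_log.csv is assumed to have columns: query, result_count, clicked_url.
with open("search_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        q = row["query"].strip().lower()
        top_queries[q] += 1
        if int(row["result_count"]) == 0:
            no_results[q] += 1          # nothing came back for this query
        elif not row["clicked_url"]:
            no_click[q] += 1            # results came back but nothing was clicked

print("Top queries:", top_queries.most_common(10))
print("Queries with no results:", no_results.most_common(10))
print("Queries with results but no click-through:", no_click.most_common(10))
```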
Early quantitative method: How evenly does it divide the content?
- Documents do not distribute uniformly across categories.
- A Zipf (1/x) distribution is the expected behavior.
- The 80/20 rule in action (actually a 70/20 rule here).
- The largest categories are leading candidates for splitting; the smallest are leading candidates for merging.
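One way to check the "how evenly" question is to rank categories by document count and compare each category's share against the share a 1/rank (Zipf) distribution would predict. A sketch with invented counts; categories well above the curve are candidates for splitting, those well below for merging.

```python
# Invented document counts per category.
counts = {"Products": 4200, "Support": 2100, "News": 1300, "Careers": 600, "Legal": 90}

ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(counts.values())
# Normalizing constant so the 1/rank shares sum to 1 over this many categories.
harmonic = sum(1.0 / r for r in range(1, len(ranked) + 1))

for rank, (category, count) in enumerate(ranked, start=1):
    actual_share = count / total
    zipf_share = (1.0 / rank) / harmonic   # share predicted by a pure 1/rank distribution
    flag = "above curve" if actual_share > zipf_share else "below curve"
    print(f"{rank}. {category}: actual {actual_share:.1%} vs Zipf {zipf_share:.1%} ({flag})")
```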
Early quantitative method: How evenly does it divide the content? (2)
Methodology: 115 randomly selected URLs from the corporate intranet search index were manually categorized. Inaccessible files and 'junk' were removed.
Results: slightly more uniform than a Zipf distribution. Above the curve is better than expected.
Late quantitative method: How does taxonomy "shape" match that of the content?
Background: hierarchical taxonomies allow comparison of "fit" between content and taxonomy areas.
Methodology: 25,380 resources were tagged with a taxonomy of 179 terms (an average of 2 terms per resource). Counts of terms and documents were summed within the taxonomy hierarchy.
Results: roughly Zipf distributed (top 20 terms: 79%; top 30 terms: 87%). Mismatches between term % and document % were flagged.
Term group                                % Terms   % Docs
Administrators                            7.8       15.8
Community Groups                          2.8       1.8
Counselors                                3.4       1.4
Federal Funds Recipients and Applicants   9.5       34.4
Librarians                                2.8       1.1
News Media                                0.6       3.1
Other                                     7.3       2.0
Parents and Families                      2.8       6.0
Policymakers                              4.5       11.5
Researchers                               2.2       3.6
School Support Staff                      2.2       0.2
Student Financial Aid Providers           1.7       0.7
Students                                  27.4      7.0
Teachers                                  25.1      11.4
Source: Courtesy of Keith Stubbs, U.S. Dept. of Education
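Flagging "shape" mismatches amounts to comparing each term group's share of the vocabulary with its share of the documents. The sketch below uses a few rows from the table above and an arbitrary 2x ratio threshold for flagging; the threshold is an assumption, not from the deck.

```python
# (term %, document %) per term group, taken from a subset of the table above.
shape = {
    "Federal Funds Recipients and Applicants": (9.5, 34.4),
    "Students": (27.4, 7.0),
    "Teachers": (25.1, 11.4),
    "News Media": (0.6, 3.1),
}

RATIO = 2.0  # arbitrary threshold for calling a mismatch worth reviewing

for group, (term_pct, doc_pct) in shape.items():
    if doc_pct > RATIO * term_pct:
        print(f"{group}: content-heavy (docs {doc_pct}% vs terms {term_pct}%), consider adding terms")
    elif term_pct > RATIO * doc_pct:
        print(f"{group}: term-heavy (terms {term_pct}% vs docs {doc_pct}%), consider pruning or merging")
```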
Conclusion
- Simple walkthroughs are only the start of how to test a taxonomy.
- Tagging modest amounts of content, and usability tests such as task-based card sorts, provide strong information about problems within the taxonomy. Caveat: they may tell you which headings need to be changed, but they won't tell you what they should be changed to.
- If you are not looking at query logs and click trails, you don't know what site visitors are doing.
- Taxonomy changes do not stand alone; they go hand in hand with search system improvements, navigation improvements, content improvements, and process improvements.
Questions?
Ron Daniel, Jr.
rdaniel@taxonomystrategies.com
http://www.taxonomystrategies.com
Taxonomy Strategies LLC, November 3, 2006. Copyright 2006 Taxonomy Strategies LLC. All rights reserved.
Bibliography
K. Yee, K. Swearingen, K. Li, and M. Hearst. "Searching and Organizing: Faceted Metadata for Image Search and Browsing." Proceedings of the Conference on Human Factors in Computing Systems (April 2003). http://bailando.sims.berkeley.edu/papers/flamenco-chi03.pdf
R. Daniel and J. Busch. "Benchmarking Your Search Function: A Maturity Model." http://www.taxonomystrategies.com/presentations/maturity-2005-05-17%28as-presented%29.ppt
Donna Maurer. "Card-Based Classification Evaluation." Boxes and Arrows, April 7, 2003. http://www.boxesandarrows.com/view/card_based_classification_evaluation