Work package 2: User analysis and usability testing. Responsible partner: Danish Data Archives. WP leader: Anne Sofie Fink, DDA.


Setting the scene

Why perform user analysis?
– A user analysis should provide the development team with an understanding of the users, so that they can produce software that will actually be used by the target communities.
– "A user analysis provides an objective basis for making decisions, and is the first step to building a better product faster for less." (softwaredesignworks.com)

Setting the scene (cont. 1)

Benefits from user analysis (softwaredesignworks.com):
– Developers will have fewer discussions about what users want and how best to provide it
– Contributors will spend less time reworking a vague design once they begin coding
– The project team will avoid re-engineering the project to accommodate overlooked but critical product requirements
– Development will cost less

Setting the scene (cont. 2)

Why perform usability testing?
– Usability tests should provide the development team with information about the usability problems users encounter when using the product.
– Definition of usability: "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use." (ISO 9241-11:1998(E))

Setting the scene (cont. 3)

Usability measures are:
– Effectiveness: how well the user achieves the goals he/she sets out to achieve using the system
– Efficiency: the resources consumed in order to achieve those goals
– Satisfaction: how the user feels about his/her use of the system
Additional measures:
– Memorability: the interface is easy to remember
– Errors: a low error rate, and errors that are easy to recover from
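The three core measures above can each be reduced to a number per test task. A minimal sketch of how a test team might compute them from session records; the field names and the 1–5 satisfaction scale are illustrative assumptions, not the project's actual instruments:

```python
from statistics import mean

def usability_metrics(sessions):
    """Compute the three ISO usability measures from test-session records.

    Each session is a dict with keys (assumed layout):
      'completed'    - bool, did the user reach the task goal (effectiveness)
      'seconds'      - float, time spent on the task (efficiency)
      'satisfaction' - int, 1-5 post-task questionnaire rating (satisfaction)
    """
    effectiveness = sum(s["completed"] for s in sessions) / len(sessions)
    # Efficiency is taken here as mean time on *successful* tasks only;
    # other operationalisations are equally common.
    successful = [s["seconds"] for s in sessions if s["completed"]]
    efficiency = mean(successful) if successful else float("inf")
    satisfaction = mean(s["satisfaction"] for s in sessions)
    return {"effectiveness": effectiveness,
            "efficiency_s": efficiency,
            "satisfaction": satisfaction}

sessions = [
    {"completed": True,  "seconds": 95.0,  "satisfaction": 4},
    {"completed": True,  "seconds": 120.0, "satisfaction": 5},
    {"completed": False, "seconds": 300.0, "satisfaction": 2},
]
print(usability_metrics(sessions))
```

Comparing these numbers across the four test sequences gives a simple way to check that each demonstrator version improves on the last.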

Setting the scene (cont. 4)

Summing up:
– User analysis is about forming an idea of the product the users want
– Usability testing is about how well the product performs when used
Cornerstones:
– Getting information about needs and usage
– 'Translating' this information into maps of user requirements
– Creating an information source for developers and other team members

Objectives for WP2

– Ensure that the software tools developed through the MADIERA project offer functionality and solutions that are in accordance with the users' needs (user analysis)
– Ensure that the software tools developed are user-friendly and appropriate for the target user groups (usability testing)

Deliverables of WP2

Milestones for WP2

– The software developed will be in accordance with the needs of the users
– Usability test report on the beta version of the demonstrator (M14)
– Completion of the usability testing and incorporation of the feedback will ensure development in accordance with the users' requirements

About user analysis

– Secondary analysis of existing data: analysis of data collected in the FASTER project
– A new user analysis focused on researchers:
  – Method: a qualitative approach is chosen, since what is needed is understanding, not quantification
  – The technique Contextual Inquiry by Hugh Beyer & Karen Holtzblatt is chosen (prior experience from FASTER)
  – Team members taking part in data collection must have a thorough understanding of the project and of the limitations it works within

About usability testing

Four test sequences:
– No. 1: Test of Demonstrator Prototype
– No. 2: Test of Demonstrator Beta Version
– No. 3: Test of Demonstrator Version 1
– No. 4: Test of final version (M32, August '05)
Tests 1–3 will be carried out by internal teams; test 4 will involve external parties.
Small-scale tests will run whenever a version is released.
Methods: heuristic evaluation, think-aloud tests, and quantitative user tests.
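In a heuristic evaluation, several evaluators independently rate problems, often on Nielsen's 0 (not a problem) to 4 (usability catastrophe) severity scale, and the findings are then merged for triage. A hedged sketch of that merging step; the findings data and record layout are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# One list of (heuristic, problem, severity 0-4) findings per evaluator.
findings = {
    "evaluator_1": [("visibility of system status", "no search progress shown", 3),
                    ("error prevention", "dates accept free text", 2)],
    "evaluator_2": [("visibility of system status", "no search progress shown", 4)],
}

def merge_findings(findings):
    """Merge duplicate problems across evaluators and average their severity."""
    ratings = defaultdict(list)
    for problems in findings.values():
        for heuristic, problem, severity in problems:
            ratings[(heuristic, problem)].append(severity)
    # Most severe problems first, so the development team can triage.
    return sorted(((mean(r), h, p) for (h, p), r in ratings.items()),
                  reverse=True)

for severity, heuristic, problem in merge_findings(findings):
    print(f"{severity:.1f}  [{heuristic}] {problem}")
```

A ranked list like this is a convenient form in which the User Expert Team can report test feedback to the development team.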

The work

– All partners have 2 person-months for the work; as responsible partner, the DDA has 10 person-months
– The work will run throughout the project period
Suggestions for organisation of the work:
– Each partner appoints a member to a user analysis/usability team (WP2 Team/'User Expert Team'); at least one member should be part of the development team
– The project leader will provide the team members with background material and guidelines for carrying out the work
– The team will hold meetings and teleconferences

Responsibilities of the 'User Expert Team'

– Data collection and reporting for the user analysis
– Review of reports written by the WP leader
– Appointing internal members for usability tests
– Reporting feedback from usability tests
– Taking part in discussions about methods, techniques, users' needs, etc.
– Acting as the users' representative within the local project teams and on other occasions
– …

Kick-off of WP2

– Who's who in the User Expert Team: setting up a mail forum
– Analysis of prior user interviews in the light of the aims of the MADIERA project (WP leader)
– Review of this report (User Expert Team)
– Literature and instructions for the new user analysis made available on the internal web site
– Teleconference to discuss the new user analysis
– Interviewing

A few comments

– Provide information that is useful to the development team in the light of the project and its limitations
– Adapt methods and techniques to the project
– Few team members involved directly, but 'deeply'
– Focus on implementation of the knowledge provided by user analysis and usability tests; tight link to the development team
– Procedures for reporting user information to the project
– Use relevant occasions to collect user feedback (this will support the visibility of the project)