L643: Evaluation of Information Systems Week 7: February 18, 2008.


2. A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Most-used organizational approaches and usability methodologies:
- Heuristic evaluation (70%)
- Lab usability testing (65%)
- Fit into current engineering processes (63%)
- Task analysis (62%)

3. A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Approaches and methodologies in more extensive use:
- Usability testing with portable lab equipment
- UI staff members co-located with engineering
- Field studies
- High-level/founder support
- Usage scenarios
- Participatory design

4. A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Approaches and methodologies in less extensive use:
- Educating/training other functional groups
- Focus groups
- Surveys (52%)
- Corporate mandates/usability objectives
- UI group resorts to UI, not development

5. A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Examined the relationships between:
- Effectiveness ratings and the percentage of respondents reporting use (Figure 1)
- Size of organization and usability methods
- Type of company and how successful respondents from these companies rate organizational approaches and usability methods
Hypotheses:
- Do usability consultancies rank some or all usability methods as more effective than in-house usability professionals do? [Yes]
- Do smaller companies have a better focus on their customer populations, and thus find contextual inquiry and task analysis more effective? [No]

6. Website Usability (Palmer, 2002)
Web usability (Nielsen, 2000):
- Navigation
- Response time
- Credibility
- Content
Media richness (Daft & Lengel, 1986, etc.)

7. Website Usability (Palmer, 2002)
- Hypothesis 1: Websites exhibiting lower download delay will be associated with greater perceived success by site users.
- Hypothesis 2: More navigable websites will be associated with greater perceived success by site users.
- Hypothesis 3: Higher interactivity in websites will be associated with greater perceived success by site users.

8. Website Usability (Palmer, 2002)
- Hypothesis 4: More responsive websites will be associated with greater perceived success by site users.
- Hypothesis 5: Higher-quality content in websites will be associated with greater perceived success by site users.

9. Website Usability (Palmer, 2002)
- How did the author collect data? Is it appropriate?
- What is the research design?
- What are the findings? What are the implications?
- Are there any problems with this study?
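Hypotheses 1-5 all predict a monotone association between a usability attribute and perceived success, so the simplest possible check is a correlation. A sketch for Hypothesis 1 with invented data (Palmer's actual analysis was considerably more involved than a single bivariate correlation):

```python
import statistics

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: mean download delay (seconds) and perceived
# success (1-7 rating) for eight sites
delay = [1.2, 2.5, 3.1, 4.0, 5.2, 6.3, 7.1, 8.4]
success = [6.5, 6.0, 5.8, 5.1, 4.4, 3.9, 3.2, 2.8]

print(f"r = {pearson_r(delay, success):.2f}")  # Hypothesis 1 predicts r < 0
```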

10. Are Wikis Usable? (Désilets et al., 2005)
Quasi-ethnographic methods.
In-session data:
- Observing subjects asking questions
- Recorded interactions with the instructor
Post-session data:
- Inspecting the subjects' work

11. Are Wikis Usable? (Désilets et al., 2005)
A-priori categories by severity:
- Catastrophe
- Impasse
- Annoyance
Bottom-up classification of events:
- Hypertext link creation and management
- Image uploading
- Creating/editing pages
- Hypertext authoring, etc.

12. Updated D&M IS Success Model (DeLone & McLean, 2002, 2003)
[Figure: Information Quality, System Quality, and Service Quality feed Intention to Use / Use and User Satisfaction, which together produce Net Benefits; the stages run Creation -> Use -> Consequences]

14. IS Effectiveness: A User Satisfaction Approach (cf. Thong & Yap, 1996)
Criticisms of user satisfaction:
- Questionable operationalizations of the user satisfaction construct
- Poor theoretical understanding of the user satisfaction construct
- Misapplication of user satisfaction instruments

15. IS Effectiveness: A User Satisfaction Approach (cf. Thong & Yap, 1996)
Existing literature:
- Organizational effectiveness: no strong model of organizational effectiveness, and no agreement on its measurement
- Information systems effectiveness: the difficulty of measuring organizational effectiveness led researchers to measure system usage, and then user satisfaction
- User satisfaction: many criticisms of previous measurements; similarity between user satisfaction and the social and cognitive psychologists' notion of an attitude

16. IS Effectiveness: A User Satisfaction Approach
- Definition of satisfaction: the extent to which users believe the IS available to them meets their information requirements
- Assumption: if you are satisfied with the system, it is increasing your effectiveness
- Underlying assumption: workers are rational and want to be effective

17. Theory of Reasoned Action (Fishbein & Ajzen, 1975)
[Figure: Beliefs about the consequences of behavior X -> Attitude toward behavior X; Normative beliefs about behavior X -> Subjective norm concerning behavior X; Attitude and Subjective norm -> Intention to perform behavior X -> Behavior X]
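In TRA, the attitude is a sum of belief strengths weighted by outcome evaluations, the subjective norm is a sum of normative beliefs weighted by motivation to comply, and intention combines the two with empirically estimated weights. A minimal numeric sketch (all ratings are invented, and the equal weights are for illustration only):

```python
def attitude(beliefs, evaluations):
    """Attitude toward the behavior: sum of belief strength x evaluation."""
    return sum(b * e for b, e in zip(beliefs, evaluations))

def subjective_norm(norm_beliefs, motivations):
    """Subjective norm: sum of normative belief x motivation to comply."""
    return sum(n * m for n, m in zip(norm_beliefs, motivations))

def intention(att, sn, w_att=0.5, w_sn=0.5):
    """Behavioral intention; the weights would be estimated empirically."""
    return w_att * att + w_sn * sn

# Hypothetical ratings for the behavior "use the new system"
att = attitude([2, 3, 1], [3, 2, 2])   # 2*3 + 3*2 + 1*2 = 14
sn = subjective_norm([2, 1], [5, 4])   # 2*5 + 1*4 = 14
print(intention(att, sn))              # 0.5*14 + 0.5*14 = 14.0
```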

18. End-User Computing Satisfaction: Figure 1 (Doll & Torkzadeh, 1988, 1994; Doll et al., 2004)
[Figure: an End-User Computing Satisfaction factor with five subscales and their items: Content (C1-C4), Accuracy (A1-A2), Format (F1-F2), Ease of Use (E1-E2), Timeliness (T1-T2)]

19. End-User Computing Satisfaction: Figure 3 (Doll & Torkzadeh, 1988, 1994; Doll et al., 2004)
- The model is robust, i.e., it can be used to measure and compare different subgroups (Hypothesis 1)
- Some differences in structural weights (Hypothesis 2; Table 5)
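The five-factor structure also dictates how the 12-item instrument is typically scored: a mean per subscale plus an overall mean. A minimal sketch (the item keys mirror the factor structure, but the respondent's answers are invented and the actual item wording is omitted):

```python
from statistics import mean

# Item keys follow the 12-item ECUS structure: Content C1-C4,
# Accuracy A1-A2, Format F1-F2, Ease of use E1-E2, Timeliness T1-T2
SUBSCALES = {
    "content": ["C1", "C2", "C3", "C4"],
    "accuracy": ["A1", "A2"],
    "format": ["F1", "F2"],
    "ease_of_use": ["E1", "E2"],
    "timeliness": ["T1", "T2"],
}

def score_ecus(responses):
    """Average 5-point Likert responses per subscale and overall."""
    scores = {name: mean(responses[item] for item in items)
              for name, items in SUBSCALES.items()}
    scores["overall"] = mean(responses.values())
    return scores

# One hypothetical respondent
answers = {"C1": 4, "C2": 5, "C3": 4, "C4": 4, "A1": 3, "A2": 4,
           "F1": 5, "F2": 4, "E1": 2, "E2": 3, "T1": 4, "T2": 4}
print(score_ecus(answers))
```

A profile like this one (low ease-of-use, otherwise positive) is exactly the kind of subgroup contrast the robustness claim in Hypothesis 1 is about.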

20. User Satisfaction with Knowledge Management Systems (Ong & La, 2004)
A 21-item questionnaire covering:
- Knowledge content (5 questions)
- Knowledge map (4 questions)
- Knowledge manipulation (4 questions)
- Personalization (4 questions)
- Knowledge community (4 questions)
Plus 5 global items:
- Intention to use, intention to recommend
- Overall satisfaction, success of the KMS

21. Questionnaire for User Interaction Satisfaction (QUIS)
QUIS 7.0 includes:
- A demographic questionnaire
- 6 scales measuring reactions to the system
- 4 measures of interface factors: screen factors; terminology and system feedback; learning factors; system capabilities
See the questions at: vey.htm
Variations2's use of QUIS: survey.pdf

22. Activity
According to Nevo and Wade (2007), 29% of IT projects failed, and another 53% were challenged.
Suppose you work for the IU COAS IT department. The dean of COAS has decided to install a new system that would facilitate the application process for graduate students, because the current system doesn't work very well. You talked to the vendor of the software, and they mentioned that other schools have already implemented the system and were happy with it.
- Come up with a plan that would reduce "disappointment" within COAS when you introduce this new system
- Make sure to justify your decisions

23. More Instruments for User Satisfaction
- For electronic health records:
- For college computing (our own UITS):