L643: Evaluation of Information Systems Week 11: March 17, 2008.


1 L643: Evaluation of Information Systems Week 11: March 17, 2008

2 Measurement Instrument
- It is not necessary to use the instruments discussed in class
- However, it would be helpful to consider the existing instruments:
  - Review the ones we discussed
  - Search the ISWorld survey instrument database: http://www.isworld.org/surveyinstruments/surveyinstruments.htm

3 Pilot Testing of Instruments
Instrument providers:
- Give instructions to the testers (e.g., their roles, actions, etc.)
- Provide your instrument
- Provide a feedback sheet, or ask a list of questions afterwards (debriefing)
Testers:
- Try to answer the questions
- Ask questions when instructions are not clear
- Be honest and sincere
- Provide constructive feedback

4 Pilot Testing of Instruments
- First Pilot Test Session: Unicoop (5), Library Thing (3), John Fluevog Boots & Shoes (4)
- Second Pilot Test Session: Zipcar (5), Evergreen (4), Quadrem (4)
Send an electronic copy to nansuwan@indiana.edu and cc the message to nhara@indiana.edu

5 Announcements
- The Evaluation Design Proposal, Measurement Instrument, and Instrument Memo will be due at 5pm on Thursday, March 26th, instead of next Monday
- Will any group need a computer lab?

6 Evaluation Project Process
- Problem/Purpose
- Hypotheses or Questions
- Definitions
- Literature Review
- Sample
- Instrumentation
- Procedures/Design
- Data Collection & Analysis
- Recommendation
- Pilot (test)
See also UCLA's report

7 Survey Research (Babbie, 2004)
Choose appropriate question forms:
- Open-ended questions (qualitative)
- Closed-ended questions (quantitative)
  - The response categories provided should be exhaustive
  - The answer categories must be mutually exclusive
- When would it be appropriate to use open-ended questions?
- When would it be appropriate to use closed-ended questions?

8 Survey Research (Babbie, 2004)
Questions:
- Avoid asking multiple questions in a single item
- Questions must be relevant
- Questions must be concise
- Avoid confusing or biased items and terms
Respondents:
- must be capable of answering the questions
- must be willing to answer

9 Survey Research (Babbie, 2004)
Format:
- Clear instructions
- Easy to understand
- Ordering of items
- Pre-testing

10 Survey Research
Types of surveys:
- Telephone surveys (e.g., Pew Internet & American Life: http://www.pewinternet.org/reports.asp)
- Mail questionnaires: low response rates; how to increase response rates?

11 Survey Research (Babbie, 2004)
Administration of surveys:
- Self-administration: via snail mail or via the Web (e.g., Survey Monkey)
- Interview surveys
- Telephone surveys

12 Assessing Program Impact: Alternative Designs (Rossi et al., 2004)
Bias in estimating program effects:
- Selection bias (nonequivalent comparison design): targets drop out; targets refuse to participate
- Environmental trends
- Interfering events (e.g., a natural disaster)
- Maturation: people grow and change over time

13 Risks of Quantitative Studies (Nielsen, 2004)
- List 3 major problems of quantitative studies
- Considering these problems, how would you treat most of the IS research based on quantitative studies? (disregard it, or accept it as is?)
- Can you think of any situations in which quantitative studies would be useful?
- For your group project, what type of research design would be useful? Why?

14 Quality and Effectiveness in Web-based Customer Support Systems (Negash et al., 2003)
Research model (slide figure):
- Information Quality (informativeness, entertainment) → Effectiveness/User satisfaction (H1)
- System Quality (interactivity, access) → Effectiveness/User satisfaction (H2)
- Service Quality (tangibles, reliability, assurance, responsiveness, empathy) → Effectiveness/User satisfaction (H3)

15 Quality and Effectiveness in Web-based Customer Support Systems (Negash et al., 2003)
SERVQUAL dimensions:
- Tangibles: physical features of the system
- Reliability: the system's consistency of performance and dependability
- Responsiveness: readiness of the system to provide service
- Assurance: knowledge & courtesy expressed in the system
- Empathy: care and individualized attention
See: http://www.12manage.com/methods_zeithaml_servqual.html

16 Quality and Effectiveness in Web-based Customer Support Systems (Negash et al., 2003)
Results for the research model (slide figure):
- Information Quality (informativeness, entertainment) → Effectiveness/User satisfaction (H1)
- System Quality (interactivity, access) → Effectiveness/User satisfaction (H2)
- Service Quality (tangibles, reliability, assurance, responsiveness, empathy) → Effectiveness/User satisfaction (H3)
Path coefficients shown on the figure: -.051, .449, .348

17 Zones of Tolerance (Kettinger & Lee, 2005)
- SERVQUAL compares perceived service against expected service (the desired service level)
- ZOT SERVQUAL distinguishes two expectation levels: a minimum (adequate) service level and a desired service level; the zone of tolerance lies between them
- Perceived service relative to the minimum level gives perceived adequacy; perceived service relative to the desired level gives perceived superiority
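The two gap scores on this slide reduce to simple arithmetic. A minimal sketch, assuming hypothetical 7-point ratings (the function name and example values are illustrative, not from Kettinger & Lee):

```python
# Zone-of-tolerance gap scores for one service attribute:
#   adequacy gap    = perceived - minimum (adequate) service level
#   superiority gap = perceived - desired service level

def zot_gaps(perceived, minimum, desired):
    """Return (adequacy_gap, superiority_gap) for one attribute."""
    return perceived - minimum, perceived - desired

msa, mss = zot_gaps(perceived=5, minimum=4, desired=6)
print(msa, mss)  # 1 -1
# adequacy gap > 0 and superiority gap < 0: perceived service exceeds
# the minimum but falls short of the desired level, i.e., the rating
# lies inside the zone of tolerance.
```

A rating at or above the desired level would make both gaps non-negative, signalling perceived superiority rather than mere adequacy.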

18 E-service quality in libraries (Hernon & Calvert, 2005)
- Adaptations of SERVQUAL to include an e-SERVQUAL instrument
- SERVQUAL focuses on 5 dimensions:
  - Assurance
  - Empathy
  - Reliability
  - Responsiveness
  - Tangibles

19 E-service quality in libraries (Hernon & Calvert, 2005)
LibQUAL focuses on 4 dimensions:
- Access to information
- Affect of service
- Library as place
- Personal control

20 E-service quality in libraries (Hernon & Calvert, 2005)
Selected dimensions, ranked by importance and unimportance:
- Customization/personalization: importance #4, unimportance #2
- Security/privacy/trust: importance #5, unimportance #3
- Web site aesthetics: importance #10, unimportance #1

21 E-service quality in libraries (Hernon & Calvert, 2005)
Quadrant chart: ideal (importance) vs. perceived (performance) ratings
- High ideal, high perceived: Retain
- High ideal, low perceived: Improve
- Low ideal, high perceived: Reallocate resources
- Low ideal, low perceived: Revisit
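The four quadrants above amount to a classification rule on two scores. A sketch under stated assumptions (the scale midpoint of 4 on a 1-7 scale and the function name are hypothetical, not from Hernon & Calvert):

```python
# Classify one attribute into an ideal-vs-perceived quadrant by
# comparing both scores against an assumed scale midpoint.

def quadrant(ideal, perceived, midpoint=4.0):
    """Map (ideal, perceived) ratings to an action label."""
    if ideal >= midpoint:
        return "Improve" if perceived < midpoint else "Retain"
    return "Revisit" if perceived < midpoint else "Reallocate resources"

print(quadrant(ideal=6.2, perceived=3.1))  # Improve
print(quadrant(ideal=2.5, perceived=6.0))  # Reallocate resources
```

In practice the cut-points are often the mean or median of each score across all attributes rather than a fixed midpoint; the rule itself stays the same.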

22 Choosing Measures to Evaluate … (Ryan et al., 2001)
- Are you measuring something simply because you can? Consider cost & time
- Know just why you are measuring something
- Pre-test the measure's proposed definition
- Train the relevant staff (see the 5 questions on p. 123)

23 Choosing Measures to Evaluate … (Ryan et al., 2001)
Data collection methodologies:
- Qualitative techniques: case study, content analysis, policy analysis, focus groups, interviews
- Quantitative techniques: in-house, mail, or electronic surveys

24 Choosing Measures to Evaluate … (Ryan et al., 2001)
Data collection methodologies (continued):
- Automated techniques: pop-up surveys, network traffic measures, Web log file analysis
- Are there confidentiality and privacy concerns?
- How can the burden of data collection be balanced or reduced?
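As an illustration of the Web log file analysis mentioned above, the sketch below counts page requests from Common Log Format lines. The sample log lines are fabricated, and real analyses would read from a server's actual access log:

```python
import re
from collections import Counter

# Minimal Common Log Format parsing: extract the requested path
# from each access-log line and tally hits per page.
LOG_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

sample_logs = [
    '127.0.0.1 - - [17/Mar/2008:10:00:00 -0500] "GET /catalog HTTP/1.1" 200 512',
    '127.0.0.1 - - [17/Mar/2008:10:00:05 -0500] "GET /catalog HTTP/1.1" 200 512',
    '127.0.0.1 - - [17/Mar/2008:10:01:00 -0500] "GET /help HTTP/1.1" 200 128',
]

hits = Counter(m.group(1) for line in sample_logs
               if (m := LOG_RE.search(line)))
print(hits.most_common())  # [('/catalog', 2), ('/help', 1)]
```

Note the privacy question raised on the slide applies directly here: log lines carry IP addresses and timestamps, so counts should be aggregated or anonymized before reporting.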

