A Case-Study of a Web-Based Method for Repeated-Measures and Multi-Source Research
Michael J. Walk, M.S., University of Baltimore
michael.walk@ubalt.edu
SCiP, Chicago, IL, Nov. 2008
Web-Based Research
- A valuable research tool for psychology
- Dominated by cross-sectional, between-subjects designs
Speaking Abstractly
- WebRTS (Web-Based Research Task System): a web-based system for poly-task online research designs
- Adhered to methodological and ethical recommendations for online research
- Tested WebRTS in a multi-source research design (N = 28)
The Task Page – Purpose
- To control the ordering of stimulus presentation
- To prevent repeated or premature completion of tasks (Reips, 2000)
- To separate the research design into short, distinct tasks (Reips, 2000)
- To allow participants to return and finish remaining tasks at a later time
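These goals can be met with two small database tables: one holding the tasks in their intended order, one logging which participant has completed which task. The sketch below shows one way to set that up in PHP/MySQL; the table and column names are assumptions for illustration, not the actual WebRTS schema.

<?php
// Hypothetical schema sketch -- table and column names are assumptions,
// not the actual WebRTS database layout.
$db = new mysqli('localhost', 'db_user', 'db_password', 'webrts');

// One row per task, with an explicit presentation order.
$db->query("
    CREATE TABLE IF NOT EXISTS tasks (
        task_id    INT AUTO_INCREMENT PRIMARY KEY,
        title      VARCHAR(100) NOT NULL,
        url        VARCHAR(255) NOT NULL,
        sort_order INT NOT NULL
    )
");

// One row per completed task per participant. The composite primary key
// blocks repeat completions, and the log lets a returning participant
// pick up exactly where he or she left off.
$db->query("
    CREATE TABLE IF NOT EXISTS completions (
        participant_id INT NOT NULL,
        task_id        INT NOT NULL,
        completed_at   DATETIME NOT NULL,
        PRIMARY KEY (participant_id, task_id)
    )
");
?>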
The Task Page – Function
- Find out who the user is
- Find out which tasks are done so far, and put a check mark by those tasks
- Find out what the next task is, and make that task an active hyperlink
- Find out which tasks are still in the queue, and render those tasks as inactive text
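A minimal PHP sketch of that rendering logic, assuming the hypothetical tasks and completions tables above (the session key and markup are likewise assumptions, not the actual WebRTS code):

<?php
// Task page sketch -- assumes the hypothetical tables defined earlier.
session_start();
$db  = new mysqli('localhost', 'db_user', 'db_password', 'webrts');
$pid = (int) $_SESSION['participant_id'];          // who the user is

// Which tasks has this participant already finished?
$done = [];
$res  = $db->query("SELECT task_id FROM completions WHERE participant_id = $pid");
while ($row = $res->fetch_assoc()) {
    $done[] = (int) $row['task_id'];
}

// Walk the tasks in presentation order and render each one.
$nextShown = false;
$res = $db->query("SELECT task_id, title, url FROM tasks ORDER BY sort_order");
while ($task = $res->fetch_assoc()) {
    $title = htmlspecialchars($task['title']);
    if (in_array((int) $task['task_id'], $done)) {
        echo "&#10003; $title<br>";                        // completed: check mark
    } elseif (!$nextShown) {
        echo "<a href=\"{$task['url']}\">$title</a><br>";  // next task: active hyperlink
        $nextShown = true;
    } else {
        echo "$title<br>";                                 // queued: inactive text
    }
}
?>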
WebRTS – Basic Page Structure
- Header: same on every page = coherence (Nosek, Banaji, & Greenwald, 2002)
  - Logout button (if logged in)
  - University logo = trustworthiness (Reips, 2000)
- Content
- Footer: dynamic (links to important pages) = navigability
  - Email-the-researcher link = experimenter presence
  - Quit-the-study link = debriefing (Nosek, Banaji, & Greenwald, 2002)
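A common way to keep that structure identical on every page is a pair of shared include files. The sketch below assumes hypothetical header.php and footer.php files and page names, which are not the actual WebRTS sources.

<?php
// header.php -- sketch of the shared top of every page (file names are assumptions).
session_start();
?>
<div id="header">
    <img src="images/university_logo.png" alt="University of Baltimore"> <!-- trustworthiness -->
    <?php if (isset($_SESSION['participant_id'])): ?>
        <a href="logout.php">Log out</a>                                  <!-- logout button -->
    <?php endif; ?>
</div>
<div id="content">
<!-- each page prints its own content here, then includes footer.php -->

<?php // footer.php -- sketch of the shared bottom of every page. ?>
</div>
<div id="footer">
    <a href="tasks.php">Task page</a>
    <a href="mailto:michael.walk@ubalt.edu">Email the researcher</a>      <!-- experimenter presence -->
    <a href="quit.php">Quit the study</a>                                 <!-- leads to debriefing -->
</div>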
Website Flow
Flowchart of the path participants take through the site:
- New participants: Internet → Homepage → Pre-Consent → Informed Consent; giving consent leads to the processing script, refusing consent leads to the Exit Interview
- Current participants: Homepage → Login Page → Task Page
- Task Page → Task 1, Task 2, … Task k → task update script; after the last task, participants reach the Debriefing, otherwise they return to the Task Page
- Participants who quit at any point are routed to the Exit Interview
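The "task update script" step in that flow could look like the sketch below (a guess at the mechanism, reusing the hypothetical tables from earlier; the file names debriefing.php and tasks.php are assumptions): record the completed task, then decide whether the participant goes back to the task page or on to debriefing.

<?php
// task_update.php sketch -- file, table, and column names are assumptions.
session_start();
$db  = new mysqli('localhost', 'db_user', 'db_password', 'webrts');
$pid = (int) $_SESSION['participant_id'];
$tid = (int) $_POST['task_id'];

// Record this completion; INSERT IGNORE keeps a page reload from double-counting.
$db->query("INSERT IGNORE INTO completions (participant_id, task_id, completed_at)
            VALUES ($pid, $tid, NOW())");

// Last task? Compare the number of completed tasks with the number available.
$done  = $db->query("SELECT COUNT(*) AS n FROM completions WHERE participant_id = $pid")
            ->fetch_assoc()['n'];
$total = $db->query("SELECT COUNT(*) AS n FROM tasks")->fetch_assoc()['n'];

header('Location: ' . ($done >= $total ? 'debriefing.php' : 'tasks.php'));
exit;
?>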
WebRTS – Auxiliary Pages
- Forgotten-password retrieval
- Administrator's page
  - Send reminder emails to participants
  - Add fields to the database
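The reminder-email feature could be as simple as the sketch below, which is an assumption about the mechanism rather than the actual WebRTS admin code: find participants who still have unfinished tasks and send each one a message with PHP's mail() function.

<?php
// Admin reminder sketch -- the participants table and its columns are assumptions.
$db = new mysqli('localhost', 'db_user', 'db_password', 'webrts');

// Participants who have completed fewer tasks than the study contains.
$sql = "SELECT p.email
        FROM participants p
        LEFT JOIN completions c ON c.participant_id = p.participant_id
        GROUP BY p.participant_id, p.email
        HAVING COUNT(c.task_id) < (SELECT COUNT(*) FROM tasks)";

$res = $db->query($sql);
while ($row = $res->fetch_assoc()) {
    mail($row['email'],
         'Study reminder',
         'You still have tasks to complete. Please log in to finish the study.');
}
?>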
WebRTS – Technology Specs
- MySQL database (hosting and database provided free by www.agilityhoster.com)
- PHP server-side scripts
- JavaScript for form validation
- Pilot version: www.ubpsychportal.org/ssa
- Updated version (WebRTS 2.0): www.ubcareerlab.org/cip
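Because client-side JavaScript validation can be bypassed, a PHP processing script would normally re-check the same input on the server before touching the database. The login sketch below illustrates that pattern; the field names, table, and password handling are assumptions, not the actual WebRTS scripts.

<?php
// login.php processing sketch -- field names, table, and hashing are assumptions.
session_start();
$db = new mysqli('localhost', 'db_user', 'db_password', 'webrts');

$email    = trim($_POST['email'] ?? '');
$password = $_POST['password'] ?? '';

// Server-side re-validation of what the JavaScript already checked in the browser.
if ($email === '' || $password === '' || !filter_var($email, FILTER_VALIDATE_EMAIL)) {
    header('Location: login.php?error=1');
    exit;
}

$stmt = $db->prepare('SELECT participant_id, password_hash FROM participants WHERE email = ?');
$stmt->bind_param('s', $email);
$stmt->execute();
$row = $stmt->get_result()->fetch_assoc();

if ($row && password_verify($password, $row['password_hash'])) {
    $_SESSION['participant_id'] = $row['participant_id'];
    header('Location: tasks.php');       // on to the task page
} else {
    header('Location: login.php?error=1');
}
exit;
?>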
Case Study
- Used in a multi-source, poly-task research design testing relationships between self-monitoring (Gangestad & Snyder, 2000) and accuracy in predicting personality ratings (e.g., Walk, Mitchell, & Yun, 2008)
- Anecdotal usability evidence was all positive
- Brief follow-up survey administered online in Aug. 2008 (3 months after the case study); N = 6 (21%)
  - Green & Pearson's (2006) Web-Site Usability Instrument: 16 items rated from (1) strongly disagree to (7) strongly agree
  - 4 open-ended questions
Notable Usability Results
- "I completed the task on the Web site without much effort." (M = 6.33, SD = 0.82)
- "After learning to use part of the Web site, I easily learned to use another part." (M = 6.17, SD = 1.17)
- "The Web site interface was consistent throughout the site." (M = 6.17, SD = 1.17)
- All items had M > 5
Item means [chart of the mean rating for each usability item]
Open-Ended Questions
What part or page of the website did you like the best?
- "I liked it all. It was user friendly."
- "I liked the fact that the survey was multiple choice. This meant that it took less time to complete the tasks."
- "I liked the fact that the website was easy to navigate when completing the tasks."
- "Easy to read/follow"
- "The set-up of the whole thing was easy to use."
- "i dont know"
Open-Ended Responses (cont.)
What part or page of the website did you like the least?
- "No complaints."
- "Unfortunately, I did not like the part of the survey that had someone else fill out a survey…"
- "The part of the website that I did not like was the portion in which you had to have a supervisor or co-worker complete…"
- "lacks color, kind of boring"
- "NA"
- "I dont know"
WebRTS 2.0
- System pages are "easily" configurable: titles, pictures, colors, researcher names, host institution, etc.
- Can set open/close dates
- Can set a maximum number of participants
- Can randomly assign participants to ordering conditions
- Prevents repeat submissions by the same user
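One plausible shape for that configurability is a single settings file that every page reads; the option names and values below are invented for illustration and are not WebRTS 2.0's actual settings.

<?php
// config.php sketch -- option names and values are hypothetical.
return [
    'study_title'      => 'Example Study',
    'host_institution' => 'University of Baltimore',
    'researcher_name'  => 'Michael J. Walk',
    'logo_image'       => 'images/logo.png',
    'theme_color'      => '#00274c',

    'open_date'        => '2008-09-01',   // no sign-ups before this date
    'close_date'       => '2008-12-01',   // no sign-ups after this date
    'max_participants' => 200,            // stop accepting sign-ups at this count

    // Each new participant is randomly assigned one of these task orderings.
    'ordering_conditions' => [
        [1, 2, 3, 4],
        [4, 3, 2, 1],
    ],
];

A page would then load the settings with $config = require 'config.php'; and could, for example, refuse new sign-ups once the participant count reaches $config['max_participants'] or the close date has passed.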
WebRTS – Uses and Applications
- Repeated-measures designs
- Longitudinal designs
- Poly-task cross-sectional designs
- Multi-source (e.g., participant & rater) designs
- Diary designs
References
Gangestad, S. W., & Snyder, M. (2000). Self-monitoring: Appraisal and reappraisal. Psychological Bulletin, 126(4), 530-555.
Green, D., & Pearson, J. M. (2006). Development of a web site usability instrument based on ISO 9241-11. Journal of Computer Information Systems, 66-72.
Nosek, B. A., Banaji, M. R., & Greenwald, A. G. (2002). E-research: Ethics, security, design, and control in psychological research on the internet. Journal of Social Issues, 58(1), 161-176.
Reips, U. (2000). The web experiment method: Advantages, disadvantages, and solutions. In M. H. Birnbaum (Ed.), Psychological experiments on the Internet. San Diego, CA: Academic Press.
Walk, M. J., Mitchell, T., & Yun, G. (2008, May). Know thy social self? Self-monitoring predicts accuracy in rating one's reputation. Poster session presented at the 20th Annual Meeting of the Association for Psychological Science, Chicago, IL.

Questions?