Can Online Course-Based Assessment Methods be Fair and Equitable?


Can Online Course-Based Assessment Methods be Fair and Equitable? Relationships between students' preferences, computer-related attitudes, and performance when taking an assessment either online or offline
Claire Hewson (OU) (John Charlton, Bolton; Mark Brosnan, Bath)
CALRG conference, 12th June, 2013

Background There has recently been considerable interest in the validity of online assessment methods, such as psychometric tests (e.g. Hewson & Charlton, 2005), but little work has assessed the validity of online course-based assessments. However, educators are increasingly being encouraged to use online methods for course delivery and assessment. Research into whether such methods provide a fair and accurate measure of what is intended, i.e. course-related learning outcomes, is therefore important (e.g. Hewson, Charlton & Brosnan, 2007; Whitelock, 2009).

Rationale The validity of online assessment methods (course-based or otherwise) cannot be assumed (e.g. Hewson, 2012). Educators and institutions have expressed concerns about adopting online assessment methods in 'high-stakes' summative contexts (evidence: personal communications; lack of uptake compared with formative contexts; Boyle & Hutchison, 2009; Roy & Armarego, 2003). To alleviate these concerns and justify using online assessment methods, rigorous evaluative research into their validity is needed (Cassady & Gridley, 2005; Hewson, 2012).

Rationale Potential factors which may influence performance:
- Mode of assessment
- Individual difference factors: computer-related attitudes, computer-related experience, demographic factors, perceptions and preferences
- Possible interactions between mode and individual difference factors?

Prior Research: Mode Effects (online vs. offline) Evidence for mode effects exists: e.g. Joinson (2001) found people are more candid online (in a non-course-based context), and people may give less socially desirable responses online (Joinson, 1999). Evidence for mode effects in a proctored (e.g. exam) online assessment context has also been found (e.g. Goldberg & Pedulla, 2002; Ricketts & Wilks, 2002). Others, however, have reported finding no mode effect (e.g. Cassady & Gridley, 2005).

Prior Research: Individual Difference Factors Performance on computer-related tasks has been found to be related to computer attitudes (e.g. computer anxiety: Brosnan, 1998), but findings in relation to course-based tasks have been equivocal (Mahar, Henderson & Deane, 1997). Demographic factors have also been found to be related to performance on computer tasks (especially gender: Brosnan, 1998). Online assessment-related preferences and attitudes have not previously been assessed in terms of their impact upon performance (Hewson, 2012).

Current Project The purpose of this project was to examine, in a quasi-experimental design, whether simple online assessments could be demonstrated to be as valid (as measures of course-related outcomes) as traditional offline assessments. LTSN (HEA) funding was obtained to support the research described here.

Study Aims To assess the validity of a simple online assessment method (MCQ test) by considering whether the following factors influence performance scores:
- Actual mode (online vs. pen and paper)
- Self-reported mode preferences
- Computer attitudes (anxiety, engagement)

Method Participants:
- Cohort 1 (n=31): enrolled on a psychology methods module at Bolton University
- Cohort 2 (n=42): as above, but in the subsequent year
- Cohort 3 (n=21): enrolled on a cognitive psychology module at Bath University
Design: IVs: assessment mode; mode preference. DVs: MCQ assignment scores; computer attitudes scores (using scales from Charlton, 2002).

Method Procedure Cohorts 1 and 2 were pseudo-randomly assigned to take the same compulsory module-related MCQ test either online or offline (pen and paper). Cohort 3 took a non-compulsory module-related MCQ assessment online. All participants were invited to complete a computer attitudes questionnaire (derived from Charlton, 2002), in return for a monetary reward. At the end of the MCQ assignment students were asked to report their preferred assessment mode.
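As a minimal sketch of the kind of pseudo-random allocation to assessment modes described above (illustrative only; the exact allocation procedure used in the study is not detailed on this slide, and the names and seed below are invented):

```python
import random

# Illustrative allocation of one cohort to the two assessment modes
random.seed(2013)  # arbitrary seed, for reproducibility of this example only
students = [f"student_{i:02d}" for i in range(1, 32)]  # e.g. cohort 1, n = 31
random.shuffle(students)
assignment = {s: ("online" if i % 2 == 0 else "offline")
              for i, s in enumerate(students)}
print(sum(m == "online" for m in assignment.values()), "online,",
      sum(m == "offline" for m in assignment.values()), "offline")
```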

Results 1: Mode Effects. No mode effect (n.s.). Main effect of cohort: F(1, 85) = 6.88, p = .01, partial η² = .075. Note: (post hoc) power to detect a medium effect size (η² = .059) was around 0.6 for these analyses.
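As a minimal sketch of how effect size and post hoc power figures of this kind can be reproduced from an F ratio and its degrees of freedom (assuming a standard between-groups ANOVA; the function names are illustrative and not taken from the study's own analysis scripts):

```python
from scipy import stats

def partial_eta_sq(F, df_num, df_den):
    # Partial eta squared recovered from an F ratio and its degrees of freedom
    return (F * df_num) / (F * df_num + df_den)

def posthoc_power(eta_sq, df_num, df_den, alpha=0.05):
    # Approximate post hoc power for an F test: noncentrality = f^2 * N,
    # with Cohen's f^2 = eta^2 / (1 - eta^2) and N = df_num + df_den + 1
    n_total = df_num + df_den + 1
    ncp = (eta_sq / (1 - eta_sq)) * n_total
    f_crit = stats.f.ppf(1 - alpha, df_num, df_den)
    return 1 - stats.ncf.cdf(f_crit, df_num, df_den, ncp)

print(partial_eta_sq(6.88, 1, 85))   # ~0.075, matching the reported partial eta squared
print(posthoc_power(0.059, 1, 85))   # roughly 0.6, consistent with the power reported above
```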

Results 2: Mode Preferences. No main effects. No interaction: F(2, 68) = 0.842, n.s., partial η² = .024, observed power = .019.

Results 3a: Relationship between Computer Attitudes and MCQ Scores. Scores did not differ between cohorts, so cohorts 1 and 2 were combined for the correlational analyses.

Results 3b: Relationship between Computer Attitudes and MCQ Scores. All effect sizes were small (r ≈ .10) to small/medium (r ≈ .30), and n.s. The correlation between computer engagement (CE) and MCQ score for the online group approached significance; however, this coefficient (r = .287) was not significantly different from that for the offline group (r = -.033) (z = 1.30, p = .097, one-tailed).
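The comparison of the two coefficients above is the standard Fisher r-to-z test for independent correlations. A minimal sketch follows; the group sizes are assumed for illustration (inferred from degrees of freedom on a later slide, not stated here), so the result will differ slightly from the reported z = 1.30:

```python
import numpy as np
from scipy import stats

def compare_independent_correlations(r1, n1, r2, n2):
    # Fisher r-to-z test for the difference between two independent correlations
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p_one_tailed = 1 - stats.norm.cdf(z)
    return z, p_one_tailed

# Illustrative group sizes (assumed): online n ~ 42, offline n ~ 36
print(compare_independent_correlations(0.287, 42, -0.033, 36))
```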

Results 4: Computer anxiety (CA), CE and MCQ Scores, controlling for performance on a previous related (research methods) course. For both online and offline groups, MCQ scores were strongly correlated with marks on a previous methods module (r(40) = .640 and r(34) = .551, respectively). Partial correlations were therefore carried out, controlling for marks on this previous module, to assess the unique influence of CA and CE on MCQ scores after prior performance on this strongly related module had been accounted for. The partial correlations were all lower than the corresponding zero-order correlations, except for the offline group's CE-MCQ correlation.
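A first-order partial correlation of this kind can be computed directly from the three zero-order correlations. A minimal sketch, in which the correlation between CE and prior-module marks is a purely hypothetical value (it is not reported on the slide):

```python
import numpy as np

def partial_corr(r_xy, r_xz, r_yz):
    # Correlation between x and y, controlling for z (first-order partial correlation)
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# x = CE, y = MCQ score, z = prior methods module mark (online group)
# r_xy = .287 and r_yz = .640 are reported above; r_xz = .20 is hypothetical
print(partial_corr(0.287, 0.20, 0.640))  # lower than the zero-order .287, as described above
```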

Results 5: Computer Attitudes and Overall Module Marks. Interestingly, CA and CE were more strongly related (in the expected direction) to overall module marks, which were derived wholly from offline assessments, than to online MCQ assessment marks. However, while this was true for the group (cohorts 1 and 2) taking the research methods-related assessments, no such relationship was found for the group (cohort 3) taking a module with no mathematical / statistical content. It may thus be speculated that computer attitudes are related to performance on tasks with mathematical / statistical content, which would account for this finding.

Discussion Summary of Findings Taking the same MCQ assessment in online or offline mode does not significantly impact upon performance, at least in the current context(s). Performance scores did not differ depending on whether the assignment was taken in the preferred or non-preferred mode. Only very weak evidence for any effect of CA or CE on performance on online assessments was found, and these relationships were weaker than those observed for CA / CE and marks based on offline assessments (for a statistics / maths-related module).

Conclusions & Issues for Further Research Overall, the current results provide support for the validity of online assessments of the type used here. However, restrictions on the generalisability of these findings should be noted:
- the task used here was relatively straightforward; future research should consider such issues in relation to more complex and demanding computer-based tasks
- the effect of different contexts (e.g. weighted versus non-weighted assessments) and content (as partially explored here) also deserves further exploration
- the cohorts used in this study had fairly high levels of computer experience and literacy; further research should consider whether these findings generalise to less experienced and more computer-anxious populations.

The End
References
Hewson, C. (2012). Can online course-based assessment methods be fair and equitable? Relationships between students' preferences and performance within online and offline assessments. Journal of Computer Assisted Learning, 28(5), 488-498.
Hewson, C., Charlton, J., & Brosnan, M. (2007). Comparing online and offline administration of multiple choice question assessments to psychology undergraduates: Do assessment modality or computer attitudes influence performance? Psychology Learning and Teaching, 6(1), 37-46.

Appendices

Results (appendix) A. The Role of Demographic Factors. Only weak, non-significant relationships were found: no significant relationships were observed between MCQ test scores and age or gender (gender coded female = 1, male = 2).
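With gender coded numerically (female = 1, male = 2) as above, the gender-score relationship is a point-biserial correlation, which is simply Pearson's r with a dichotomous variable. A minimal sketch using made-up scores (illustrative only, not the study's data):

```python
import numpy as np
from scipy import stats

# Made-up example scores; gender coded female = 1, male = 2 as on the slide
gender = np.array([1, 2, 1, 1, 2, 2, 1, 2, 1, 2])
mcq_score = np.array([62, 55, 70, 66, 58, 61, 73, 49, 68, 57])

# Pearson's r with a dichotomous predictor is the point-biserial correlation
r, p = stats.pearsonr(gender, mcq_score)
print(f"r = {r:.3f}, p = {p:.3f}")
```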

Results (appendix) B. The MCQ Assignment www.claireraq.myzen.co.uk/LTSNproject/example1.html