Can Online Course-Based Assessment Methods be Fair and Equitable?


1 Can Online Course-Based Assessment Methods be Fair and Equitable?
Relationships between students' preferences, computer-related attitudes, and performance when taking an assessment either online or offline. Claire Hewson (OU); John Charlton (Bolton); Mark Brosnan (Bath). CALRG conference, 12th June 2013.

2 Background There has recently been considerable interest in the validity of online assessment methods, such as psychometric tests (e.g. Hewson & Charlton, 2005), BUT little work has assessed the validity of online course-based assessments. However, educators are increasingly being encouraged to use online methods for course delivery and assessment. Thus research into their ability to provide a fair and accurate measure of what is intended, i.e. course-related learning outcomes, is important (e.g. Hewson, Charlton & Brosnan, 2007; Whitelock, 2009).

3 Rationale Validity of online assessment methods (course-based or otherwise) cannot be assumed (e.g. Hewson, 2012). Educators and institutions have expressed concerns about adopting online assessment methods in 'high-stakes' summative contexts (evidence: personal communications; lack of uptake compared with formative contexts; Boyle & Hutchison, 2009; Roy & Armarego, 2003). To alleviate these concerns and justify using online assessment methods, rigorous evaluative research into their validity is needed (Cassady & Gridley, 2005; Hewson, 2012).

4 Rationale Potential factors which may influence performance:
Mode of assessment
Individual difference factors:
Computer-related attitudes
Computer-related experience
Demographic factors
Perceptions and preferences
Interaction?

5 Prior Research Mode Effects (online vs. offline)
Evidence for mode effects exists: e.g. Joinson (2001) found people are more candid online (in a non-course-based context), and people may give less socially desirable responses online (Joinson, 1999). Evidence for mode effects in a proctored (e.g. exam) online assessment context has also been found (e.g. Goldberg & Pedulla, 2002; Ricketts & Wilks, 2002). Others have reported a lack of mode effect (e.g. Cassady & Gridley, 2005).

6 Prior Research Individual Difference Factors
Performance on computer-related tasks has been found to be related to computer attitudes (e.g. computer anxiety: Brosnan, 1998). BUT findings in relation to course-based tasks have been equivocal (Mahar, Henderson & Deane, 1997). Demographic factors have also been found to be related to performance on computer tasks (esp. gender: Brosnan, 1998). Online assessment-related preferences and attitudes have not previously been assessed in terms of their impact upon performance (Hewson, 2012).

7 Current Project The purpose of this project was to examine – in a quasi-experimental design – whether simple online assessments could be demonstrated to be as valid (as measures of course-related outcomes) as traditional offline assessments. LTSN (HEA) funding was obtained to support the research described here.

8 Study Aims To assess the validity of a simple online assessment method (MCQ test) by considering whether the following factors influence performance scores: Actual mode (online, pen and paper) Self-reported mode preferences Computer attitudes (anxiety, engagement)

9 Method Participants Cohort 1 (n=31): enrolled on a psychology methods module at the University of Bolton; Cohort 2 (n=42): as above, but in the subsequent year; Cohort 3 (n=21): enrolled on a cognitive psychology module at the University of Bath. Design IVs: assessment mode; mode preference. DVs: MCQ assignment scores; computer attitudes scores (using scales from Charlton, 2002).

10 Method Procedure Cohorts 1 and 2 were pseudo-randomly assigned to take the same compulsory module-related MCQ test either online or offline (pen and paper). Cohort 3 took a non-compulsory module-related MCQ assessment online. All participants were invited to complete a computer attitudes questionnaire (derived from Charlton, 2002), in return for a monetary reward. At the end of the MCQ assignment students were asked to report their preferred assessment mode.

11 Results 1. Mode Effects
No mode effect (n.s.). Main effect of cohort: F(1,85)=6.88, p=.01, partial η²=.075. Note: (post hoc) power to detect a medium effect size (η²=.059) was around 0.6 for these analyses.
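The "medium effect size" quoted above follows the usual conversion between partial η² and Cohen's f; a minimal sketch of that arithmetic (stdlib only, illustrative rather than a reproduction of the original analysis):

```python
import math

def eta_sq_to_cohens_f(eta_sq):
    """Convert (partial) eta-squared to Cohen's f: f = sqrt(eta² / (1 - eta²))."""
    return math.sqrt(eta_sq / (1.0 - eta_sq))

# eta² = .059 corresponds to Cohen's conventional "medium" f of 0.25,
# consistent with the medium effect size quoted on the slide.
print(round(eta_sq_to_cohens_f(0.059), 2))  # 0.25
```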

12 Results 2. Mode Preferences
No main effects. No interaction: F(2,68)=.842, n.s., partial η²=.024, observed power = .019.

13 Results 3a. Relationship between Computer Attitudes and MCQ Scores
Computer attitude scores did not differ between cohorts, so cohorts 1 and 2 were combined for the correlational analyses.

14 Results 3b. Relationship between Computer Attitudes and MCQ Scores
All effect sizes were in the small (r ≈ .10) to small/medium (r ≈ .30) range, and n.s. The correlation between CE (computer engagement) and MCQ score for the online group approached significance. However, this coefficient (r = .287) was not significantly different from that for the offline group (r = −.033): z = 1.30, p = .097, one-tailed.
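The online-vs-offline comparison of coefficients above is presumably the standard Fisher r-to-z test for two independent correlations; a minimal sketch, with hypothetical group sizes (the per-group n values are not restated on this slide, so the illustrative call will not exactly reproduce z = 1.30):

```python
import math

def fisher_z_compare(r1, n1, r2, n2):
    """Compare two independent Pearson correlations via Fisher's r-to-z
    transform. Returns (z statistic, one-tailed p value)."""
    z1, z2 = math.atanh(r1), math.atanh(r2)          # r-to-z transform
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of the difference
    z = (z1 - z2) / se
    # One-tailed p from the standard normal survival function
    p = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))
    return z, p

# Illustrative call (group sizes are assumptions, not taken from the slide):
# fisher_z_compare(0.287, 42, -0.033, 36)
```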

15 Results 4. CA (computer anxiety), CE and MCQ Scores, controlling for performance on a previous related (research methods) course. For both the online and offline groups, MCQ scores were strongly correlated with marks on a previous methods module (r(40)=.640 and r(34)=.551, respectively). Partial correlations were therefore carried out, controlling for marks on this previous module, to consider the unique influence of CA and CE on MCQ scores after prior performance on this strongly related module had been accounted for. The partial correlations were all lower than the zero-order correlations, except for the offline group's CE–MCQ correlation.
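A first-order partial correlation of the kind described can be computed directly from the three zero-order correlations; a minimal sketch (the values in the example are illustrative only, not the study's data):

```python
import math

def partial_corr(r_xy, r_xz, r_yz):
    """First-order partial correlation of x and y, controlling for z:
    r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz²)(1 - r_yz²))."""
    num = r_xy - r_xz * r_yz
    den = math.sqrt((1.0 - r_xz**2) * (1.0 - r_yz**2))
    return num / den

# When the control variable is uncorrelated with both x and y,
# the partial correlation equals the zero-order correlation.
print(partial_corr(0.3, 0.0, 0.0))  # 0.3
```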

16 Results 5. Computer Attitudes and Overall Module Marks
Interestingly, CA and CE were more strongly related (in the expected direction) to overall module marks, which were derived wholly from offline assessments, than to online MCQ assessment marks. However, while this was true for the group (cohorts 1 and 2) taking the research methods-related assessments, no such relationship was found for the group (cohort 3) taking a module with no mathematical/statistical content. One may thus speculate that computer attitudes are related to performance on tasks with mathematical/statistical content, which would account for this finding.

17 Discussion Summary of Findings
Taking the same MCQ assessment in online or offline mode does not significantly impact upon performance, at least in the current context(s). Performance scores did not differ depending on whether the assignment was taken in the preferred or non-preferred mode. Only very weak evidence for any effect of CA or CE on performance on online assessments was found, and these relationships were weaker than those observed for CA / CE and marks based on offline assessments (for a statistics / maths-related module).

18 Conclusions & Issues for Further Research
Overall, the current results provide support for the validity of online assessments of the type used here. However, restrictions on the generalisability of these findings should be noted:
the task used here was relatively straightforward; future research should consider such issues in relation to more complex and demanding computer-based tasks
the effect of different contexts (e.g. weighted versus non-weighted assessments) and content (as partially explored here) also deserves further exploration
the cohorts used in this study had fairly high levels of computer experience and literacy; further research should consider whether these findings generalise to less experienced and more computer-anxious populations.

19 The End References
Hewson, C. (2012). Can online course-based assessment methods be fair and equitable? Relationships between students' preferences and performance within online and offline assessments. Journal of Computer Assisted Learning, 28(5).
Hewson, C., Charlton, J., & Brosnan, M. (2007). Comparing online and offline administration of multiple choice question assessments to psychology undergraduates: do assessment modality or computer attitudes influence performance? Psychology Learning and Teaching, 6(1).

20 Appendices

21 Results (appendix) A. The Role of Demographic Factors
Only weak, n.s., relationships were found (gender coded: female=1, male=2). No significant relationships were observed between MCQ test scores and age or gender.

22 Results (appendix) B. The MCQ Assignment

