Review of SUNY Oneonta Course Evaluation Form Report and Recommendations from The Committee on Instruction: Part I March 16, 2009
Literature Review
Research Data
One concern frequently expressed in published studies is the response rate.
Oneonta’s Response Rates
Layne et al. (1999)
Thorpe (2002)
Dommeyer et al. (2003)
Incentives for electronic evaluations (randomly assigned):
1) modest grade incentive => 87% response rate
2) in-class demonstration => 53% response rate
3) early grade notification => 51% response rate
4) no incentive => 29% response rate
Dommeyer et al. (2003)
Kulik (2005), Study 1
Kulik (2005), Study 2
Donovan et al. (2006)
Avery et al. (2006)
Heath et al. (2007)
Ardalan et al. (2007) ?
Whose voice is not heard? Is there a non-response bias?
Richardson (2005)
“It is therefore reasonable to assume that students who respond to feedback questionnaires will be systematically different from those who do not respond in their attitudes and experience of higher education.” (p. 406, emphasis added)
Layne et al. (1999)
Statistically significant predictors of responding to electronic course evaluations:
–GPA
–class
–subject area
Dommeyer (2002)
Statistically significant predictors of responding to electronic course evaluations:
–none!
Variables examined:
–gender
–expected grade
–rating of professor’s teaching
Thorpe (2002)
Statistically significant predictors of responding to electronic course evaluations:
–final grade
–gender
–GPA
Avery et al. (2006)
Statistically significant predictors of responding to electronic course evaluations:
–anticipated final grade
–gender
–race/ethnicity
–class size
Conclusion
There is a fairly consistent, documented history of bias in response rates, resulting in some groups being under-represented.
Are paper forms biased?
Perhaps, but response rates for paper forms are much higher, so whatever bias exists is less problematic than with electronic forms, which yield much lower response rates.
Are the averages different with fewer responses?
Does an electronic format result in higher or lower overall average ratings?
Conclusion
Some studies show that electronic evaluations produce higher overall averages than paper-based forms, some show lower averages, and some find no statistically significant difference.
Responses from Survey of Teaching Faculty February 4 - 13, 2009
Procedure
Wednesday, February 4: Survey opened; e-mail invitation sent to all teaching faculty
Monday, February 9: Reminder announcement in Senate
Wednesday, February 11: E-mail sent to all department chairs
Friday, February 13: Survey closed
Survey Responses
Number of respondents: 178
Respondents’ Division
Faculty Rank of Respondents
Respondents’ Length of Service
1. Are you in favor of or opposed to the College conducting all course evaluations online?
2. How strongly do you feel about the College conducting all course evaluations online?
Summary of Written Responses
Faculty (even some who are in favor of online evaluations) say they are “worried” about the following:
–low response rates
–lack of security
–non-discrimination (all instructors get rated the same)
–biased sample (because of who might not respond)
Summary of Written Responses, cont.
One person reported a previous positive experience with online evaluations at another institution.
Summary of Written Responses, cont.
Some faculty who oppose online evaluations have had experience with the pilot project last summer, with online course evaluations at previous institutions, or with other online aspects of their courses. Faculty speaking from first-hand experience explicitly mentioned their concern about low response rates.
Summary of Written Responses, cont.
Faculty are concerned about the emotional/mental state of students when completing evaluations online. They also worry about whether students might be influenced by others around them at the time.
Summary of Written Responses, cont.
Overall, the language and tone of faculty opposed to online evaluations were far stronger and more emphatic than the (rather muffled) approval of those in favor.
Summer 2008 Pilot
Response Rates and Overall Experience
No summary data available.
Anecdotal data (from the survey and personal conversations):
–Percentage of faculty who participated in the pilot who are now in favor of online evaluations: 0%
–Percentage of faculty who participated in the pilot who are now opposed to online evaluations: 100%
Student Feedback
Committee Conclusions
Data Sources
–Survey of teaching faculty
–Published, peer-reviewed literature
–Consultation with Patty Francis and Steve Johnson
–Anecdotal evidence from other institutions
–Local campus experience
Conclusions: Paper Forms
Advantages:
–higher response rate, so results are less likely to be biased
–more faculty are confident about obtaining valid results through this method
–controlled setting for administration
–students are familiar with the format
Conclusions: Paper Forms
Disadvantages:
–time required to process forms
–delay in receiving results
–use of paper resources
=> Note that none of these disadvantages is related to the validity or accuracy of the data.
Conclusions: Digital Forms
Advantages:
–results could be delivered to faculty more quickly
–saves paper and some processing time
Conclusions: Digital Forms
Disadvantages:
–lower response rate
–no good options for incentives
–results are more likely to be biased, raising concerns about validity
–a majority of faculty have significant reservations
–concerns among both faculty and students about security/privacy
Conclusions: Digital Forms
Disadvantages, cont.:
–questions about faculty being able to opt out
–questions about students being able to opt out
–student responses can be posted online for others to see
One Final Consideration
SPI data are currently used to evaluate faculty for:
–merit pay
–contract renewal
–tenure/continuing appointment
–promotion
–performance awards
=> If faculty lack confidence in the integrity and accuracy of course evaluation data, any decisions made on the basis of these data are likely to be questioned in a way that we believe is unhealthy for our institution.
Recommendation #1
All course evaluations should be administered using paper forms. We believe the current consensus among faculty and students will shift at some point toward favoring an electronic format. But we are not nearly there yet.
Recommendation #2
Electronic course evaluations should not even be an option. Aggregated results cannot be interpreted meaningfully (especially if differential incentives are offered).
EXCEPTION: Distance-learning courses
Recommendation #3
Since significant staff time is needed to process course evaluation forms for our campus, the College Senate should advocate strongly for allocating additional (seasonal) help for processing these forms.
Stay tuned...
... for Part II of our recommendations, regarding changes to the form used for course evaluation.