ICE Evaluations: Some Suggestions for Improvement
Outline
Background information and assumptions
Content of evaluation forms
Logistical problems with processing ICE information
Background Information
Exchange of e-mails among professors last summer
An Arts and Sciences “Task Team” is currently looking at various ways of evaluating teaching
My points here are mostly compatible with both
Background Assumptions
Student evaluations will continue to be used
They will be used for two purposes:
  Instructors’ own improvement of courses and teaching
  Assessment of teachers by administrators
We should make ICE evaluations as effective as possible for both purposes
Suggestions About Content of ICE Forms
(go to evaluations file)
Remove the “One Number” Overall Average At Bottom of Page
It gives less information, not more
It is all people will look at if it’s available:
  Administrators assessing teachers
  Teachers planning future courses
Not all the categories have to do with the instructor, so it’s unfair to assign these ratings to the instructor
The FAS Task Team unanimously agreed
Keep the text of individual questions
In some formats of ICE reports, the questions are missing
This encourages looking only at the numbers
So include the actual questions
Why Not Also Get Rid of the “Category” Average Numbers?
All the same reasons apply
But if this is too much, then at the very least, please get rid of the “one number” average
Some Specific Questions on ICE Form Need Revision
#20 “The material was not too difficult” implies that the highest rating goes to material that is far too easy
Combine questions into one: “The difficulty and pace of the course were appropriate”
Question #10 “Demonstrated Favorable Attitude toward students”
Task Team recommendation: change to “treated students with proper respect”
Reason: the old wording favors teachers who are lenient about, for example, plagiarism, arriving to class late, talking during class…
Other questions to revise
#7 “Was readily available for consultation outside of class”
#12 “Evaluated Work Fairly”
Too Many Questions
Researchers seem to agree with the common-sense idea that too many questions on an evaluation form lead students to give up
Some ICE questions seem repetitive or unnecessary
How to include fewer questions
Again, combine questions into one: “The difficulty and pace of the course were appropriate”
Drop Questions #15 and #16 about stating and covering the objectives of the course, since #17 “Course organization was logical and adequate” covers these
“Additional Items” on ICE form
After the university-wide questions, a section of “additional items” is included
Currently, each faculty (FAS, Engineering, etc.) can choose from an “item bank” of approved questions
Instead, each department should be able to choose any questions it wants, whether from the item bank or not
Why let Departments Choose?
Departments are in the best position to design questions that are appropriate for their discipline
For example, why think that the same questions would be appropriate to a chemistry course, an education course, and an English literature course?
Too much bureaucratic regulation is not beneficial to a university
Logistical Problems with Processing ICE Information
Course evaluations are often “lost” or assigned to wrong course
Instructors have students fill out evaluation forms, then no ICE report appears for that course
This has happened at least five times in the philosophy department in three years
Other professors reported the same problem in last summer’s exchange
The Cause?
If students fill in the wrong section number, department number, or course number, then the evaluations are all automatically assigned to the wrong course (or to no course)
The Solution
Is not to assign blame (as in “Well, this is the department’s fault, because the graduate assistant who gave the evaluations must have told students the wrong numbers”)
But instead is to redesign the system so that this mistake (which is easy to make) does not lead to corruption of the data
The Solution (part II)
A simple but less effective solution: tell all instructors to give the course information to students themselves, e.g. by writing it on the board (this at least makes instructors responsible)
A (slightly) more difficult but more effective solution: have some kind of “cover sheet” for each course, which the computer will read. If the individual ICE forms disagree with the information on the cover sheet, automatically assign them to the correct course (see the sketch below)
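Below is a minimal sketch, in Python, of how such a cover-sheet correction could work. The record layouts, course codes, and function names are assumptions for illustration only, not the actual OIRA scanning system.

```python
# Sketch only: hypothetical record formats, not the real OIRA software.
# Every ICE form scanned in the same batch as a course's cover sheet is
# attributed to that cover sheet's course, even if the student bubbled in
# the wrong department, course, or section number.
from dataclasses import dataclass, field

@dataclass
class CoverSheet:
    course: str          # e.g. "PHIL 201" (hypothetical code)
    section: str         # e.g. "1"

@dataclass
class IceForm:
    course: str          # as filled in by the student -- may be wrong
    section: str
    ratings: dict = field(default_factory=dict)

def assign_to_cover_sheet(form: IceForm, cover: CoverSheet) -> IceForm:
    """Trust the cover sheet whenever the student's codes disagree with it."""
    if (form.course, form.section) != (cover.course, cover.section):
        form.course, form.section = cover.course, cover.section
    return form

# Usage: two forms from the same scanned batch, one with a mis-bubbled code.
cover = CoverSheet(course="PHIL 201", section="1")
batch = [IceForm(course="PHIL 210", section="1"),   # wrong course number
         IceForm(course="PHIL 201", section="1")]
corrected = [assign_to_cover_sheet(f, cover) for f in batch]
# Both forms are now attributed to PHIL 201, section 1.
```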
A More Widespread Problem
When the evaluations for a course are mysteriously absent, sometimes evaluations from one or two (or more) students appear anyway
Or, when a teacher doesn’t administer evaluations, she still gets results from one or two students anyway
And probably this “phantom evaluation” process occurs, undetected, in MOST courses
Cause of Phantom Evaluations
It’s the same cause as for the missing evaluations for a whole course
If one or two (or more) students write the wrong course numbers, their evaluations will be assigned to the wrong course (even if all the rest of the student forms go to the right course)
This probably happens VERY OFTEN
So it’s all the more reason to fix the problem
How to Avoid “Phantom Evaluation” Problem
The same way as avoiding the larger-scale assignment of evaluations to wrong courses
Have some kind of “cover sheet” for each course, which the computer will read. If the individual ICE forms disagree with the information on the cover sheet, automatically assign them to the correct course (as in the sketch above)
Another Logistical Problem
The ICE form includes a “response rate” indicating the percentage of enrolled students who fill out an evaluation form
But for at least two of the last four semesters, these figures were inaccurate
Why is the “Response Rate” Often Inaccurate?
The response rate is, of course, meant to indicate the percentage of students enrolled in the course who actually fill out the ICE form
But the total number of “enrolled students” is not accurate
In the fall semesters in question, the AUBsis site gave the total number of enrolled students at the BEGINNING of the term, not at the end
So any students who dropped the class were still included in the “enrolled students” total
Suppose 25 students were enrolled at the beginning of the term, but 5 dropped, and 15 students filled out the ICE form. The official “response rate” would be 60%, but the real response rate, among students still enrolled, would be 75% (see the sketch below)
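The slide’s arithmetic, as a small Python sketch (the numbers come from the example above; the function name is just for illustration):

```python
# Response rate computed against beginning-of-term vs. end-of-term enrollment.
def response_rate(forms_submitted: int, enrolled: int) -> float:
    """Percentage of enrolled students who filled out the ICE form."""
    return 100.0 * forms_submitted / enrolled

enrolled_at_start = 25   # enrollment at the beginning of the term
dropped = 5              # students who dropped before the end
forms_submitted = 15     # ICE forms actually filled out

print(response_rate(forms_submitted, enrolled_at_start))            # 60.0 -- reported rate
print(response_rate(forms_submitted, enrolled_at_start - dropped))  # 75.0 -- actual rate
```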
Solution to the “response rate” problem
If OIRA uses the AUBsis information for this, OIRA and the registrar should coordinate the uses to which the data will be put
The “enrolled students” number must reflect the number of students enrolled at the end of the term, not the beginning
OIRA office responses to faculty
OIRA has not Responded to Faculty Correspondence About Problems
A delicate issue
Numerous examples
Why it matters
Solution? I admit I don’t know. Maybe a full-time office manager?
One final issue: Use of ICE Reports
Literature on evaluations often mentions proper use by administrators
A quick glance is worse than no information at all
Items to focus on: the percentage of students responding; the type of course (graduate vs. undergraduate, introductory vs. advanced); particular questions; the distribution of answers (are one or two terrible ratings dragging the average down? see the sketch below)
NOT ONE NUMBER
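A small illustration, with made-up ratings on the 1–5 scale, of why the distribution matters more than a single average: two very low marks pull the mean down even when most students rate the course highly.

```python
from collections import Counter
from statistics import mean

# Hypothetical ratings for one question in one course (1 = lowest, 5 = highest).
ratings = [5, 5, 5, 4, 5, 4, 5, 1, 1]

print(round(mean(ratings), 2))   # 3.89 -- the "one number" looks mediocre
print(Counter(ratings))          # Counter({5: 5, 4: 2, 1: 2}) -- the fuller picture
```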