IDEA Student Ratings of Instruction: A Diagnostic Guide for Improvement
Dr. Kristi Roberson-Scott
Purpose of Presentation
- Interpretation of the Student Ratings of Instruction forms and reports
- Interpreting the Diagnostic Form Report for improved teaching effectiveness
IDEA is an acronym for…
Individual Development and Educational Assessment
IDEA Uses
You should be able to use the IDEA system for both:
- ID = Individual Development
- EA = Educational Assessment
Improvement of Student Learning
Student ratings can have a positive impact if:
- The instrument is "learning focused" and provides a diagnostic
- The emphasis for "summative" faculty evaluation is appropriate (30%-50% of the overall evaluation of teaching)
- Results are not over-interpreted
- Faculty trust the process
IDEA: What you should know about the student ratings
- Reliability and validity of the IDEA system
- How to interpret IDEA reports and use IDEA resources for faculty improvement plans
- How to interpret adjusted vs. unadjusted scores
- How to use group summary reports for program improvement
IDEA: What you should know about the student ratings
- How to complete the FIF
- How to interpret the reports
- How to use IDEA reports to improve teaching effectiveness
- How to use IDEA resources to improve teaching
Student ratings are data that must be interpreted.
Student Ratings: Reliable? Valid?
In general, student ratings tend to be statistically reliable, valid, and relatively free from bias (probably more so than other data used to evaluate teaching).
- Reliability: the likelihood that you will get the same results if the survey is administered again to the same group of students
- Validity: the instrument measures what it is supposed to/intended to measure
Reliability & Validity
Sample report: Dog, Saul T., IDEA University, Spring 2007, Composition I 1010 (MWF, 10:00). There were 12 students enrolled in the course and 9 students responded. The results are considered unreliable because the number responding is so small, yet the 75% response rate indicates that the results are representative of the class as a whole.
Reliability by number of respondents: fewer than 10 students is Unreliable; intermediate counts range from Marginally Reliable through Fairly Reliable to Reliable (cutoffs not shown on the slide); more than 30 is Highly Reliable.
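To make those caveats concrete, here is a minimal Python sketch (not part of the IDEA system) that reproduces the two checks from the sample report. The function name and the 75% representativeness rule of thumb are assumptions for illustration; the cutoffs between 10 and 30 raters are left out because the slide does not list them.

```python
# Illustrative only: reproduces the reliability caveats from the sample
# report above. Cutoffs between 10 and 30 raters are omitted because the
# slide does not list them.

def reliability_check(enrolled: int, responded: int) -> dict:
    rate = responded / enrolled
    if responded < 10:
        reliability = "Unreliable"
    elif responded > 30:
        reliability = "Highly Reliable"
    else:
        reliability = "Marginally Reliable to Reliable (see IDEA cutoffs)"
    return {
        "response_rate": f"{rate:.0%}",
        "representative": rate >= 0.75,  # assumed rule of thumb from the slide
        "reliability": reliability,
    }

# The sample class: 12 enrolled, 9 responding.
print(reliability_check(12, 9))
# {'response_rate': '75%', 'representative': True, 'reliability': 'Unreliable'}
```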
Understanding the IDEA System's unique value
- Student learning focus
- Diagnostic component
- Scores adjusted for extraneous influences: what was the instructor's influence on learning?
- Documented validity and reliability
- National comparative data
- Group summary reports for program assessment
IDEA as a Diagnostic to Guide Improvement And as a Tool to Evaluate Teaching Effectiveness
Underlying Assumptions
Students are not qualified to assess:
- Faculty expertise
- Appropriateness of goals, content, and organization of the course
- Materials used in delivery
- How student work is evaluated, including grading practices
Underlying Assumptions
Nor are they qualified to assess "indirect" contributions to instruction:
- Support for departmental efforts
- Assistance to colleagues
- Contributing to a positive atmosphere
IDEA Student Ratings of Instruction: The Student Learning Model
Student Learning Model
- Types of learning must reflect the instructor's purpose
- Effectiveness is determined by student progress on the objectives stressed by the instructor
Student Learning Model
Specific teaching behaviors influence certain types of student progress under certain circumstances.
IDEA Student Ratings of Instruction: Forms
- Faculty Information Form (FIF)
- Student Survey (Diagnostic Form)
IDEA: FIF (Faculty Information Form)
Faculty Information Form
- One FIF per class being evaluated
- Course information, including IDEA department codes (extended list)
- 12 learning objectives
- Course description items
- Best answered toward the end of the semester
FIF: Selecting Objectives
Select 3-5 objectives as "Essential" or "Important." For each, ask:
- Is it a significant part of the course?
- Do you do something specific to help students accomplish the objective?
- Does the student's progress on the objective influence his or her grade?
In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3).
Relevant Objectives
- Basic cognitive: Items 1, 2
- Applications of learning: Items 3, 4
- Expressiveness: Items 6, 8
Relevant Objectives
- Intellectual development: Items 7, 10, 11
- Lifelong learning: Items 9, 12
- Team skills: Item 5
Best Practices
- Multi-section courses
- Curriculum committee review
- Prerequisite and subsequent courses
- Incorporate objectives into the course syllabus
Best Practices
Discuss the meaning of the objectives with students early in the semester:
- Inform them that they will be asked to rate their own progress on the objectives
- Have them reflect on their understanding of the course purpose and how parts of the course fit the 12 objectives
- Discuss differences in perception of the objectives' meaning
Student Survey: Diagnostic Form
Student Survey: Diagnostic Form
- Teaching Methods: Items 1-20
- Learning Objectives
- Student and Course Characteristics: Items 36-39, 43
- Course Management/Content
- Global Summary
- Experimental Items
- Extra Questions
- Comments
False Assumptions
- Effective instructors effectively employ all 20 teaching methods.
- The 20 teaching-methods items are used to make an overall judgment about teaching effectiveness.
- Students should make significant progress on all 12 learning objectives.
Using Extra Questions
Up to 20 extra questions are available. They may be used to address questions at various levels:
- Institution
- Department
- Course
- Or all three
Student Survey
- How to use extra questions
- Comments: keep them constructive
Report Background
- Comparison Groups
- Converted Scores
The Report: Comparative Information
Comparison groups:
- IDEA
- Discipline
- Institution
Comparison Groups (Norms): IDEA Comparisons
- Diagnostic Form classes rated in three (unspecified) academic years
- Excludes first-time institutions
- Excludes classes with fewer than 10 students
- No one institution comprises more than 5% of the database
- 128 institutions; 44,455 classes
Comparison Groups (Norms): Discipline Comparisons
- Most recent 5 years of data
- Minimum of 400 classes
- Same exclusions as IDEA comparisons
- Also excludes classes with no objectives selected
Comparison Groups (Norms): Institutional Comparisons
- Most recent 5 years of data
- Minimum of 400 classes
- Excludes classes with no objectives selected
- Includes all class sizes
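The exclusion rules above amount to simple filters. As a hedged sketch only (the DataFrame layout and column names are assumed for illustration, not IDEA's actual data model), they might look like this:

```python
import pandas as pd

def idea_comparison_pool(classes: pd.DataFrame) -> pd.DataFrame:
    """IDEA comparisons: drop first-time institutions and classes
    with fewer than 10 students."""
    return classes[(~classes["first_time_institution"])
                   & (classes["enrollment"] >= 10)]

def discipline_comparison_pool(classes: pd.DataFrame) -> pd.DataFrame:
    """Discipline comparisons: same exclusions, plus drop classes with
    no objectives selected. (Institutional comparisons apply the
    no-objectives rule but keep all class sizes.)"""
    pool = idea_comparison_pool(classes)
    return pool[pool["num_objectives_selected"] > 0]
```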
Report Background
Report: Types of Scores
- Average scores: numerical averages on a 5-point scale
- Converted scores: compensate for different averages among course objectives and provide normative comparisons
- Raw scores: unadjusted scores
- Adjusted scores: compensate for extraneous factors beyond the instructor's control, i.e., they "level the playing field"
Converted Scores: Why?
- A method of standardizing scores that have different averages and standard deviations
- Makes it possible to compare scores on the same scale
Converted Averages
- In classes where "Gaining Factual Knowledge" was an Important or Essential objective, the average student rating of progress was 4.00 (on a 5-point scale).
- In classes where "Gaining a broader understanding of intellectual/cultural activity" was an Important or Essential objective, the average rating of progress was 3.69.
- If only 5-point averages are considered, instructors choosing the second objective would be at a disadvantage.
Norms: Converted Averages
- A method of standardizing scores that have different averages and standard deviations, so scores can be compared on the same scale
- Uses T scores: average = 50, standard deviation = 10
- These are not percentiles
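The T-score conversion itself is one line of arithmetic. In the sketch below, the formula (mean 50, SD 10) follows directly from the slide; the 0.6 standard deviation is an assumed value for illustration, since the slides give only the norm-group mean of 4.00 for "Gaining Factual Knowledge."

```python
def t_score(class_avg: float, norm_mean: float, norm_sd: float) -> float:
    """Convert a 5-point class average to a T score (mean 50, SD 10).
    T scores are standardized comparisons, not percentiles."""
    return 50 + 10 * (class_avg - norm_mean) / norm_sd

# A class averaging 4.2 on "Gaining Factual Knowledge" (norm mean 4.00,
# with an assumed norm SD of 0.6) converts to about 53.3:
print(round(t_score(4.2, 4.00, 0.6), 1))  # 53.3
```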
Standard Deviation: What does it tell us?
What do the converted ratings mean?
- Much Higher: above 63 (highest 10% of courses)
- Higher: 71st-90th percentile (next 20%)
- Similar: 31st-70th percentile (middle 40% of courses)
- Lower: 11th-30th percentile (next 20%)
- Much Lower: below 37 (lowest 10%)
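Because T scores are distributed with mean 50 and SD 10, the percentile bands above determine the categories; assuming normality, the cutoffs of roughly 63 and 37 fall out of the arithmetic. A minimal sketch (function name assumed):

```python
from statistics import NormalDist  # Python standard library

def rating_category(t: float) -> str:
    """Map a converted (T) score to its comparison category via the
    percentile bands on the slide (10 / 20 / 40 / 20 / 10)."""
    pct = NormalDist(mu=50, sigma=10).cdf(t) * 100
    if pct > 90: return "Much Higher"  # highest 10%; T > ~63
    if pct > 70: return "Higher"       # 71st-90th percentile
    if pct > 30: return "Similar"      # middle 40% of courses
    if pct > 10: return "Lower"        # 11th-30th percentile
    return "Much Lower"                # lowest 10%; T < ~37

print(rating_category(64))  # Much Higher
print(rating_category(50))  # Similar
```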
Adjusted Scores
- Control for factors beyond the instructor's control
- Based on regression equations
Adjusted Scores: Diagnostic Form
Adjustments are made for:
- Student motivation (Item 39)
- Student work habits (Item 43)
- Class size (enrollment, from the FIF)
- Course difficulty (multiple items)
- Student effort (multiple items)
Impact of Extraneous Factors
[Table: average progress ratings on "Gaining Factual Knowledge," cross-tabulated by student work habits (Item 43) and student motivation (Item 39), each grouped as Low, Low Avg., Average, High Avg., and High. See Technical Report 12, page 40.]
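IDEA's actual regression equations and coefficients are documented in Technical Report 12 and are not reproduced here; the sketch below only illustrates the idea of "leveling the playing field," using hypothetical weights and hypothetical national averages.

```python
# Hypothetical coefficients and baselines; IDEA's real regression
# equations are documented in Technical Report 12.
B_MOTIVATION = 0.40    # assumed weight for student motivation (Item 39)
B_WORK_HABITS = 0.30   # assumed weight for student work habits (Item 43)
B_CLASS_SIZE = -0.001  # assumed weight for enrollment

def adjusted_progress(raw: float, motivation: float,
                      work_habits: float, enrollment: int) -> float:
    """Remove the estimated effect of extraneous factors from a raw
    progress rating so classes can be compared fairly."""
    expected_boost = (B_MOTIVATION * (motivation - 3.5)
                      + B_WORK_HABITS * (work_habits - 3.5)
                      + B_CLASS_SIZE * (enrollment - 30))
    return raw - expected_boost

# A highly motivated, hard-working small class loses some of its edge:
print(round(adjusted_progress(4.4, motivation=4.2, work_habits=4.0,
                              enrollment=15), 2))  # 3.96
```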
IDEA...The Report
The IDEA Report: Diagnostic Form Report
- What were students' perceptions of the course and their learning?
- What might I do to improve my teaching?
The Report: Questions
- What was the response rate, and how reliable is the information contained in the report?
- What overall estimates of my teaching effectiveness were made by students?
- What is the effect of "adjusting" these measures to take into consideration factors I can't control?
- How do my scores compare to the available comparison groups?
Summary Evaluation of Teaching Effectiveness
[Chart: component weights of the summary evaluation, showing segments of 50% and 25%.]
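The chart shows weights of 50% and 25%; a common IDEA weighting is 50% for Progress on Relevant Objectives and 25% for each of the two global summary items, which the sketch below assumes. A minimal example, treating all three inputs as already converted to the T-score scale:

```python
def summary_evaluation(pro: float, excellent_teacher: float,
                       excellent_course: float) -> float:
    """Weighted summary, assuming 50% Progress on Relevant Objectives
    and 25% for each of the two global summary items."""
    return 0.50 * pro + 0.25 * excellent_teacher + 0.25 * excellent_course

print(summary_evaluation(56, 52, 50))  # 53.5
```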
Questions Addressed: Page 2
- How much progress did students report on the learning objectives that I identified as "Essential"? How does this progress compare to the available comparison groups?
- How much progress did students report on the "Important" objectives? How does this progress compare to the available comparison groups?
- Do conclusions change if "adjusted" rather than "raw" ratings are used?
Progress on Specific Objectives
Questions: Teaching Effectiveness
- Which of the 20 teaching methods are most related to my learning objectives?
- How did students rate my use of these important methods?
- What changes should I consider in my teaching methods?
- Do these results suggest some general areas where improvement efforts should focus?
Improving Teaching Effectiveness
Improving Teaching Effectiveness
- POD-IDEA Center Notes
- POD-IDEA Center Learning Notes
- IDEA Papers
- IDEA Seminars
Questions Addressed: Page 2
- How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?
- How distinctive is this class with regard to student self-ratings?
Description of Course and Students
Questions Addressed: Page 4
- What was the average rating on each of the questions on the IDEA form?
- How much variation was there in these ratings?
- Are the distributions of responses relatively "normal" (bell-shaped), or is there evidence of distinctive subgroups of students?
- What are the results for the additional questions I used?
Statistical Detail
Teaching & Student Learning Improvement
By acting on the recommendations in the IDEA analysis, individual faculty can make changes to their pedagogical methods and course structure that may produce tangible results in subsequent semesters' scores.
IDEA Results
Faculty whose adjusted T scores place them in the Similar, Higher, or Much Higher comparison categories for progress on relevant objectives in the majority of classes evaluated have arguably demonstrated effective teaching performance.
IDEA Results
- One semester of student ratings does not serve any useful long-term evaluation purpose.
- Use multiple (4-5) evaluations spread over time to draw long-term implications.
IDEA Results
- IDEA ratings should count for no more than 33% of the measure of teaching effectiveness.
- Faculty are ill-served when teaching effectiveness is defined by a single measure of teaching performance.
Teaching Effectiveness: IDEA Is One Piece of Data
Triangulation, using other sources of evidence:
- Self-evaluation/reflective statement
- Description of current teaching
- Course materials
- Graded appraisal tools (tests, essays, papers, etc.)
- Feedback from mentors/colleagues
- Peer observations
- Classroom assessment/research efforts
Teaching Effectiveness: IDEA Is One Piece of Data
Consider:
- Teaching occurs over time; each report is just a snapshot (compare progress in the same course over several semesters)
- The type of course being evaluated (general education course vs. major course)
- The number of students responding, and the percentage of students responding
- Global summary items
- Written comments
- The number, kind, and difficulty of the learning objectives selected
Interpreting Diagnostic Form Reports
- Review results
- Use results to identify areas for improvement
- Use IDEA resources: state.edu/podidea/index.html
IDEA Center: POD Resources
These succinct papers were written in collaboration with the Professional and Organizational Development Network in Higher Education (POD). As a resource to support teaching improvement, each is useful to anyone wanting to address specific ways to employ the different teaching methods used in the Diagnostic Form of the IDEA Student Ratings of Instruction System. (Learning Notes)
IDEA Trainers
Faculty trainers within the division: trained faculty members who can assist other faculty with interpreting IDEA data and using IDEA resources to improve teaching effectiveness.
IDEA Resources
- Institutional Effectiveness & Research Office & Website
- IDEA Online
- IDEA Papers
- POD-IDEA Center Notes
Faculty Workshop Dates
- Today, Saturday, August 23rd: Harriman Campus, 1:00-2:30, O-101
- Oak Ridge Campus: Wednesday, Sept. 17th, 12:30-2:00 PM, Room TBA; Wednesday, Sept. 17th, 6:00-7:30 PM, Room TBA
- Harriman Campus: Wednesday, Oct. 1st, 6:00-7:30 PM, Room TBA
- Cumberland Campus: Wednesday, Oct. 8th, 11:00-12:30 CST, Room TBA
- Additional workshops as needed
Questions