IDEA Student Ratings of Instruction Update
Carrie Ahern and Lynette Molstad
Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center
Presentation
- Process at DSU for online IDEA surveys
- Review of the IDEA Student Ratings of Instruction system
  - Forms
  - Reports
- Questions
Process for IDEA Surveys
- Faculty receive an email for each course with a link to the FIF (new copy feature)
- Faculty receive a unique URL for each course and must provide it to students
- Faculty receive status updates on how many students have completed the survey
- Questions
IDEA as a Diagnostic to Guide Improvement and as a Tool to Evaluate Teaching Effectiveness
IDEA Student Ratings of Instruction: The Student Learning Model
Student Learning Model
- The types of learning measured must reflect the instructor's purpose
- Effectiveness is determined by student progress on the objectives the instructor stressed
IDEA Student Ratings of Instruction: Overview
- Faculty Information Form (FIF)
- Student Survey: Diagnostic Form
IDEA: FIF (Faculty Information Form)
Faculty Information Form
- Some thoughts on selecting objectives
- Video for faculty on completing the FIF
Faculty Information Form
- One FIF per class being evaluated
- Course information: IDEA department codes (extended list)
- 12 learning objectives
- Course description items: optional
- Best answered toward the end of the semester
FIF: Selecting Objectives
- Select 3-5 objectives as "Essential" or "Important"
  - Is it a significant part of the course?
  - Do you do something specific to help students accomplish the objective?
  - Does the student's progress on the objective influence his or her grade?
- In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3)
Best Practices
- Multi-section courses
- Curriculum committee review
- Prerequisite and subsequent courses
- Discuss the meaning of the objectives with students
- Incorporate the objectives into the course syllabus
New Feature (as of 2/2010)
- Copy FIF objectives from one course to another
- Previous FIFs are available in a drop-down menu (linked by faculty e-mail address)
Student Survey: Diagnostic Form
…iles/Student_Ratings_Diagnostic_Form.pdf
Student Survey: Diagnostic Form
- Teaching Methods: Items 1-20
- Learning Objectives: Items 21-32
- Student and Course
  - Student Characteristics: Items 36-39, 43
  - Course Management/Content: Items 33-35
- Global Summary: Items 40-42
- Experimental Items
- Extra Questions
- Comments
False Assumptions
- Effective instructors effectively employ all 20 teaching methods.
- The 20 teaching method items are used to make an overall judgment about teaching effectiveness.
- Students should make significant progress on all 12 learning objectives.
Resources: Administering IDEA
- Client Resources / IDEA Resources
  - Best Practices
  - Directions to Faculty
  - Using Additional Questions
  - Some Thoughts on Selecting IDEA Objectives
  - Disciplinary Selection of Learning Objectives
  - Guide to Administering IDEA
  - Team Teaching
- All resources are on our website.
Report Background: Comparison Groups and Converted Scores
The Report: Comparative Information
- Comparison groups:
  - IDEA
  - Discipline
  - Institution
Comparison Groups (Norms): IDEA Comparisons
- Diagnostic Form data
- Exclude first-time institutions
- Exclude classes with fewer than 10 students
- No one institution comprises more than 5% of the database
- 128 institutions; 44,455 classes
- Updated only periodically
Comparison Groups (Norms): Discipline Comparisons
- Updated annually (September 1)
- Most recent 5 years of data (years run approximately July 1-June 30)
- Exclusions same as IDEA Comparisons; also exclude classes with no objectives selected
- Minimum of 400 classes
Comparison Groups (Norms): Institutional Comparisons
- Updated annually (September 1)
- Most recent 5 years of data (years run approximately July 1-June 30)
- Includes both Short and Diagnostic Form classes
- Exclude classes with no objectives selected
- Minimum of 400 classes
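Taken together, the three slides above amount to a per-class eligibility filter plus aggregate checks. A minimal sketch in Python, assuming hypothetical record fields (enrollment, objectives_selected, first_time_institution); the field names and data layout are illustrative, not IDEA's actual data model.

```python
# Illustrative sketch of the norm-group exclusion rules described above.
# Field names are assumptions, not IDEA's actual data model.

def eligible_class(cls: dict, require_objectives: bool = False) -> bool:
    """Apply the per-class exclusions used when building norm groups."""
    if cls["enrollment"] < 10:            # fewer than 10 students: excluded
        return False
    if cls["first_time_institution"]:     # first-time institutions: excluded
        return False
    if require_objectives and cls["objectives_selected"] == 0:
        return False                      # discipline/institutional norms also drop these
    return True

# Aggregate checks sit on top of the per-class filter: IDEA comparisons cap
# any one institution at 5% of the database, and discipline/institutional
# norms require a minimum of 400 eligible classes.
def usable_norm_group(pool: list[dict]) -> bool:
    return len(pool) >= 400
```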
Norms: Converted Averages
- A method of standardizing scores that have different averages and standard deviations
- Allows scores to be compared on the same scale
- Uses T scores: average = 50, standard deviation = 10
- T scores are not percentiles
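A T score is a linear rescaling of a raw average against the comparison group's mean and standard deviation. A minimal sketch, assuming the standard T-score formula (50 + 10z); the numbers in the example are illustrative, not IDEA norms.

```python
def t_score(raw: float, group_mean: float, group_sd: float) -> float:
    """Convert a raw average to a T score (average 50, SD 10)."""
    z = (raw - group_mean) / group_sd  # standard score relative to the norm group
    return 50 + 10 * z

# Illustrative numbers only: a class average of 4.5 against a norm-group
# mean of 4.0 with SD 0.5 sits one standard deviation above average.
print(t_score(4.5, 4.0, 0.5))  # 60.0
```

Because the scale is anchored at 50 with SD 10, a T score of 60 means one standard deviation above the comparison group, regardless of the original rating scale; it does not mean the 60th percentile.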
Report Background: Adjusted Scores
Adjusted Scores
- Control for factors beyond the instructor's control
- Based on regression equations
- Link to video clip explaining Adjusted Scores
Adjusted Scores: Diagnostic Form
- Student work habits (#43)
- Student motivation (#39)
- Class size (enrollment, from the FIF)
- Student effort (multiple items)
- Course difficulty (multiple items)
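A hedged sketch of the general idea, assuming the common covariate-adjustment form: predict the rating a class would get from the extraneous factors alone, then credit or debit the raw rating by how far that prediction sits from the norm. The coefficients, intercept, and functional form below are placeholders, not IDEA's published regression equations.

```python
# Placeholder weights for illustration; IDEA's actual regression equations
# are documented in its technical reports and differ from these values.
COEFS = {"work_habits": 0.3, "motivation": 0.4, "class_size": -0.01}
INTERCEPT = 2.0

def adjusted_rating(raw: float, predictors: dict[str, float],
                    norm_mean: float) -> float:
    """Correct a raw rating for factors beyond the instructor's control."""
    expected = INTERCEPT + sum(COEFS[k] * predictors[k] for k in COEFS)
    # If the class was 'easier than average' (expected > norm_mean), the
    # adjustment subtracts the advantage; a harder class gets credit back.
    return raw + (norm_mean - expected)

# Illustrative call with hypothetical predictor values.
print(adjusted_rating(4.2, {"work_habits": 3.5, "motivation": 3.8,
                            "class_size": 30}, norm_mean=4.0))
# -> about 3.93 with these made-up numbers
```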
IDEA...The Report
The IDEA Report: Diagnostic Form Report
- What were students' perceptions of the course and their learning?
- What might I do to improve my teaching?
Questions Addressed: Page 1
- What was the response rate, and how reliable is the information contained in the report?
- What overall estimates of my teaching effectiveness were made by students?
- What is the effect of "adjusting" these measures to take into consideration factors I can't control?
- How do my scores compare to the available comparison groups?
Summary Evaluation of Teaching Effectiveness
Questions Addressed: Page 2
- How much progress did students report on the learning objectives I identified as "Essential"? How does this progress compare to the available comparison groups?
- How much progress did students report on the "Important" objectives? How does this progress compare to the available comparison groups?
- Do conclusions change if "adjusted" rather than "raw" ratings are used?
Progress on Specific Objectives
Questions Addressed: Page 3
- Which of the 20 teaching methods are most related to my learning objectives?
- How did students rate my use of these important methods?
- What changes should I consider in my teaching methods?
- Do these results suggest some general areas where improvement efforts should focus?
Improving Teaching Effectiveness
Improving Teaching Effectiveness
- IDEA website: IDEA Papers
- …ul-resources/knowledge-base/idea-papers
Questions Addressed: Page 2
- How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?
- How distinctive is this class with regard to student self-ratings?
Description of Course and Students
Questions Addressed: Page 4
- What was the average rating on each of the questions on the IDEA form? How much variation was there in these ratings?
- Are the distributions of responses relatively "normal" (bell-shaped), or is there evidence of distinctive subgroups of students?
- What are the results for the additional questions I used?
Statistical Detail
Questions & Discussion