
IDEA Student Ratings of Instruction: A Diagnostic Guide for Improvement Dr. Kristi Roberson-Scott.



Presentation on theme: "IDEA Student Ratings of Instruction: A Diagnostic Guide for Improvement Dr. Kristi Roberson-Scott."— Presentation transcript:

1 IDEA Student Ratings of Instruction: A Diagnostic Guide for Improvement Dr. Kristi Roberson-Scott

2 Purpose of Presentation: interpretation of the Student Ratings of Instruction forms and reports, and interpreting the Diagnostic Form report for improved teaching effectiveness.

3 IDEA is an acronym for Individual Development and Educational Assessment.

4 IDEA Uses: you should be able to use the IDEA system for ID (Individual Development) and for EA (Educational Assessment).

5 Improvement of Student Learning. Student ratings can have a positive impact if: the instrument is “learning focused” and provides a diagnostic; the emphasis placed on “summative” faculty evaluation is appropriate (30%-50% of the overall evaluation of teaching); results are not over-interpreted; and faculty trust the process.

6 IDEA: What you should know about the student ratings: reliability and validity of the IDEA system; how to interpret IDEA reports and use IDEA resources for faculty improvement plans; how to interpret adjusted vs. unadjusted scores; and how to use group summary reports for program improvement.

7 IDEA: What you should know about the student ratings: how to complete the FIF; how to interpret the reports; how to use IDEA reports to improve teaching effectiveness; and how to use IDEA resources to improve teaching. Student ratings are data that must be interpreted.

8 Student Ratings: Reliable? Valid? In general, student ratings tend to be statistically reliable, valid, and relatively free from bias (probably more so than other data used to evaluate teaching). Reliability – the likelihood that you will get the same results if the survey is administered again to the same group of students. Validity – measures what it is supposed to/intended to measure.

9 Reliability & Validity. Sample report heading: Dog, Saul T., IDEA University, Spring 2007, Composition I 1010 (MWF – 10:00). There were 12 students enrolled in the course and 9 students responded. Your results are considered unreliable because the number responding is so small. The 75% response rate indicates that results are representative of the class as a whole. Reliability bands by number of respondents: fewer than 10 students, Unreliable; 10-14, Marginally Reliable; 15-24, Fairly Reliable; 25-39, Reliable; 40 or more, Highly Reliable.
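A minimal sketch (Python, illustrative only) of the reliability bands above. The function name is hypothetical, and the 40-or-more cutoff for “Highly Reliable” is inferred from the band structure rather than quoted from IDEA documentation.

```python
def reliability_label(num_respondents: int) -> str:
    """Map the number of student respondents to the reliability band on the slide."""
    if num_respondents < 10:
        return "Unreliable"
    elif num_respondents <= 14:
        return "Marginally Reliable"
    elif num_respondents <= 24:
        return "Fairly Reliable"
    elif num_respondents <= 39:
        return "Reliable"
    return "Highly Reliable"  # 40 or more respondents (assumed cutoff)

# Example from the slide: 12 students enrolled, 9 responded.
enrolled, responded = 12, 9
print(reliability_label(responded))                   # Unreliable (fewer than 10 respondents)
print(f"Response rate: {responded / enrolled:.0%}")   # 75% -> representative of the class
```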

10 Understanding the value of the IDEA System’s uniqueness: student learning focus; diagnostic component; scores adjusted for extraneous influences (what was the instructor’s influence on learning?); documented validity and reliability; national comparative data; group summary reports for program assessment.

11 IDEA as a Diagnostic to Guide Improvement And as a Tool to Evaluate Teaching Effectiveness

12 Underlying Assumptions. Students are not qualified to assess: faculty expertise; the appropriateness of goals, content, and organization of the course; the materials used in delivery; or how student work is evaluated, including grading practices.

13 Underlying Assumptions. Nor are students qualified to assess “indirect” contributions to instruction: support for departmental efforts, assistance to colleagues, and contributing to a positive atmosphere.

14 IDEA Student Ratings of Instruction The Student Learning Model

15 Student Learning Model. The types of learning must reflect the instructor’s purpose; effectiveness is determined by student progress on the objectives stressed by the instructor.

16 Student Learning Model. Specific teaching behaviors influence certain types of student progress under certain circumstances.

17 IDEA Student Ratings of Instruction – Forms: Faculty Information Form; Student Survey (Diagnostic Form).

18 IDEA: FIF Faculty Information Form

19 Faculty Information Form. One FIF per class being evaluated. Covers course information, IDEA department codes (extended list: http://www.idea.ksu.edu/StudentRatings/deptcodes.html), the 12 learning objectives, and course description items. Best answered toward the end of the semester.

20 FIF: Selecting Objectives. Select 3-5 objectives as “Essential” or “Important.” For each, ask: Is it a significant part of the course? Do you do something specific to help students accomplish the objective? Does the student’s progress on the objective influence his or her grade? In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3).

21 Relevant Objectives: Basic Cognitive (items 1, 2); Applications of Learning (items 3, 4); Expressiveness (items 6, 8).

22 Relevant Objectives: Intellectual Development (items 7, 10, 11); Lifelong Learning (items 9, 12); Team Skills (item 5).

23 Best Practices: multi-section courses; curriculum committee review; prerequisite-subsequent courses; incorporate into the course syllabus.

24 Best Practices. Discuss the meaning of the objectives with students early in the semester; inform them that they will be asked to rate their own progress on the objectives; have them reflect on their understanding of the course purpose and how parts of the course fit the 12 objectives; and discuss differences in perception of the objectives’ meaning.

25 Student Survey Diagnostic Form

26 Student Survey: Diagnostic Form. Teaching Methods: items 1-20. Learning Objectives: items 21-32. Student and Course: student characteristics (items 36-39, 43) and course management/content (items 33-35). Global Summary: items 40-42. Experimental Items: items 44-47. Extra Questions: items 48-67. Comments.

27 False Assumptions: Effective instructors effectively employ all 20 teaching methods. The 20 teaching methods items are used to make an overall judgment about teaching effectiveness. Students should make significant progress on all 12 learning objectives.

28 Using Extra Questions. 20 extra questions are available and may be used to address questions at various levels: institution, department, course, or all three.

29 Student Survey: how to use the extra questions; comments should be constructive.

30 Report Background: Comparison Groups and Converted Scores.

31 The Report: Comparative Information. Comparison groups: IDEA, discipline, and institution.

32 Comparison Groups (norms): IDEA Comparisons. Classes rated on the Diagnostic Form in 1998-99, 1999-2000, and 2000-2001; excludes first-time institutions and classes with fewer than 10 students; no one institution comprises more than 5% of the database; 128 institutions and 44,455 classes.

33 Comparison Groups (norms): Discipline Comparisons. Most recent 5 years of data (2000-2005); minimum of 400 classes; exclusions are the same as the IDEA Comparisons, plus classes with no objectives selected.

34 Comparison Groups (norms): Institutional Comparisons. Minimum of 400 classes; most recent 5 years of data; excludes classes with no objectives selected; includes all class sizes.

35 Report Background

36 Report: Types of Scores. Average scores – numerical averages on a 5-point scale. Converted scores – compensate for different averages among course objectives and provide normative comparisons. Raw scores – unadjusted scores. Adjusted scores – compensate for extraneous factors beyond the instructor’s control, i.e., they “level the playing field.”

37 Converted Scores – Why? A method of standardizing scores that have different averages and standard deviations, so that scores can be compared on the same scale.

38 Converted Averages. In classes where “Gaining Factual Knowledge” was an Important or Essential objective, the average student rating of progress was 4.00 (on a 5-point scale). In classes where “Gaining a broader understanding of intellectual/cultural activity” was an Important or Essential objective, the average rating of progress was 3.69. If only 5-point averages are considered, instructors choosing the second objective would be at a disadvantage.

39 Norms: Converted Averages. A method of standardizing scores that have different averages and standard deviations so they can be compared on the same scale. Converted averages are T scores (average = 50, standard deviation = 10); these are not percentiles.
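A small sketch of the conversion, assuming the conventional T-score formula (50 plus 10 for each standard deviation above the comparison-group mean). The standard deviation below is a made-up placeholder, not an actual IDEA norm; only the 4.00 and 3.69 comparison means come from the previous slide.

```python
def t_score(class_average: float, norm_mean: float, norm_sd: float) -> float:
    """Convert a 5-point class average to a T score (mean 50, SD 10)."""
    return 50 + 10 * (class_average - norm_mean) / norm_sd

# Hypothetical example: the same 4.2 class average converted against the two objective
# norms from the previous slide (4.00 and 3.69), with an assumed SD of 0.55.
print(round(t_score(4.2, norm_mean=4.00, norm_sd=0.55), 1))  # 53.6
print(round(t_score(4.2, norm_mean=3.69, norm_sd=0.55), 1))  # 59.3
```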

40 Standard Deviation Tells us What?

41 What do the converted ratings mean? Much Higher: 63 or above (highest 10% of classes). Higher: 56-62 (next 20%, the 71st-90th percentiles). Similar: 45-55 (middle 40%, the 31st-70th percentiles). Lower: 38-44 (next 20%, the 11th-30th percentiles). Much Lower: 37 or below (lowest 10%).
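The same bands expressed as a lookup, for illustration only; the function name is hypothetical and the boundary values follow the reading given above (63 or above, 37 or below).

```python
def comparison_category(converted_score: float) -> str:
    """Map a converted (T) score to the five comparison categories."""
    if converted_score >= 63:
        return "Much Higher"   # highest 10% of classes
    if converted_score >= 56:
        return "Higher"        # 71st-90th percentiles
    if converted_score >= 45:
        return "Similar"       # 31st-70th percentiles
    if converted_score >= 38:
        return "Lower"         # 11th-30th percentiles
    return "Much Lower"        # lowest 10%

print(comparison_category(59.3))  # Higher
print(comparison_category(47.0))  # Similar
```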

42 Adjusted Scores: control for factors beyond the instructor’s control, using regression equations.

43 Adjusted Scores: Diagnostic Form. Adjustments are based on student motivation (item 39), student work habits (item 43), class size (enrollment, from the FIF), course difficulty (multiple items), and student effort (multiple items).

44 Impact of Extraneous Factors: average progress ratings on “Gaining Factual Knowledge” by Work Habits (item 43, rows) and Student Motivation (item 39, columns). (Technical Report 12, page 40)

Work Habits \ Motivation   Low    Low Avg.  Avg.   High Avg.  High
Low                        3.51   3.66      3.80   3.95       4.08
Low Avg.                   3.60   3.76      3.91   4.05       4.07
Average                    3.73   3.87      4.02   4.12       4.21
High Avg.                  3.88   3.97      4.13   4.23       4.33
High                       4.01   4.12      4.25   4.33       4.48
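The same figures expressed as a small lookup (values copied from the table above; the helper names are hypothetical), which makes the gap that the score adjustment is meant to offset easy to see.

```python
LEVELS = ["Low", "Low Avg.", "Average", "High Avg.", "High"]

# Rows = work habits (item 43), columns = student motivation (item 39).
AVG_PROGRESS = {
    "Low":       [3.51, 3.66, 3.80, 3.95, 4.08],
    "Low Avg.":  [3.60, 3.76, 3.91, 4.05, 4.07],
    "Average":   [3.73, 3.87, 4.02, 4.12, 4.21],
    "High Avg.": [3.88, 3.97, 4.13, 4.23, 4.33],
    "High":      [4.01, 4.12, 4.25, 4.33, 4.48],
}

def expected_progress(work_habits: str, motivation: str) -> float:
    """Average progress rating on 'Gaining Factual Knowledge' for a given class profile."""
    return AVG_PROGRESS[work_habits][LEVELS.index(motivation)]

# Low/low classes average 3.51; high/high classes average 4.48 --
# roughly a half-point gap that is unrelated to the instructor's teaching.
print(expected_progress("Low", "Low"), expected_progress("High", "High"))
```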

45 IDEA...The Report

46 The IDEA Report: Diagnostic Form Report. What were students’ perceptions of the course and their learning? What might I do to improve my teaching?

47 The Report: Questions. What was the response rate and how reliable is the information contained in the report? What overall estimates of my teaching effectiveness were made by students? What is the effect of “adjusting” these measures to take into consideration factors I can’t control? How do my scores compare to the available comparison groups?

48 Summary Evaluation of Teaching Effectiveness

49 Summary Evaluation of Teaching Effectiveness [report excerpt showing the weights assigned to the summary components; the values shown include 50% and 25%]

50 Summary Evaluation of Teaching Effectiveness

51 Questions Addressed: Page 2 How much progress did students report on the learning objectives that I identified as “Essential”? How does this progress compare to the available comparison groups? How much progress did students report on the “Important” objectives? How does this progress compare to the available comparison groups? Do conclusions change if “adjusted” rather than “raw” ratings are used?

52 Progress on Specific Objectives [report excerpt: progress ratings of 4.1, 4.1, 4.0, 4.0, 3.8, and 3.9, averaged over 6 objective entries]
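As a rough sketch of how the ratings above might roll up into a single progress figure, the example below averages the selected objectives with “Essential” objectives counted twice, which reproduces the arithmetic shown on the slide (a sum of six entries divided by 6). The double weighting, the Essential/Important split of the four ratings, and the helper name are assumptions for illustration, not the documented IDEA formula.

```python
def progress_on_relevant_objectives(essential: list[float], important: list[float]) -> float:
    """Weighted average of progress ratings; Essential objectives counted twice (assumed)."""
    weighted = [(r, 2) for r in essential] + [(r, 1) for r in important]
    return sum(r * w for r, w in weighted) / sum(w for _, w in weighted)

# Hypothetical reading of the slide: Essential objectives rated 4.1 and 4.0,
# Important objectives rated 3.8 and 3.9 -> (4.1 + 4.1 + 4.0 + 4.0 + 3.8 + 3.9) / 6.
print(round(progress_on_relevant_objectives([4.1, 4.0], [3.8, 3.9]), 2))  # 3.98
```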

53 Questions: Teaching Effectiveness Which of the 20 teaching methods are most related to my learning objectives? How did students rate my use of these important methods? What changes should I consider in my teaching methods? Do these results suggest some general areas where improvement efforts should focus?

54 Improving Teaching Effectiveness

55 Improving Teaching Effectiveness: POD-IDEA Center Notes (www.idea.ksu.edu/podidea); POD-IDEA Center Learning Notes; IDEA Papers (www.idea.ksu.edu/resources/Papers.html); IDEA Seminars (www.idea.ksu.edu).

56 Questions Addressed: Page 2 How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter? How distinctive is this class with regard to student self-ratings?

57 Description of Course and Students

58 Questions Addressed: Page 4 What was the average rating on each of the questions on the IDEA form? How much variation was there in these ratings? Are the distributions of responses relatively “normal” (bell-shaped) or is there evidence of distinctive subgroups of students? What are the results for the additional questions I used?

59 Statistical Detail

60 Statistical Detail

61 Teaching & Student Learning Improvement. By using the recommendations afforded by the IDEA analysis, individual faculty can formulate changes in their pedagogical methods and course structure that yield tangible results in the next semester’s scores.

62 IDEA Results. Faculty whose adjusted T scores place them in the Similar, Higher, or Much Higher comparison categories for progress on relevant objectives in the majority of classes evaluated have arguably demonstrated effective teaching performance.

63 IDEA Results. One semester of student ratings does not serve any useful long-term evaluation purpose; multiple (4-5) evaluations spread over time are needed to draw long-term implications.

64 IDEA Results. IDEA ratings should count for no more than 33% of the measure of teaching effectiveness; faculty are ill-served when teaching effectiveness is defined by a single measure of teaching performance.

65 Teaching Effectiveness – IDEA is one piece of data. Triangulate with other sources of evidence: self-evaluation/reflective statement; description of current teaching; course materials; graded appraisal tools (tests, essays, papers, etc.); feedback from mentors/colleagues; peer observations; classroom assessment/research efforts.

66 Teaching Effectiveness – IDEA is one piece of data. Consider: teaching occurs over time, and any one report is just a snapshot (compare progress in the same course over several semesters); the type of course being evaluated; the number and percentage of students responding; whether it is a general education course or a major course; the global summary items; written comments; and the number, kind, and difficulty of the learning objectives selected.

67 Interpreting Diagnostic Form Reports: review the results; use the results to identify areas for improvement; use IDEA resources (http://www.idea.k-state.edu/podidea/index.html).

68 IDEA Center – POD Resources. These succinct papers were written in collaboration with the Professional & Organizational Development Network in Higher Education (POD). As a resource to support teaching improvement, each is useful to anyone wanting to address specific ways to employ different teaching methods – each of which is utilized in the Diagnostic Form of the IDEA Student Ratings of Instruction System. http://www.idea.k-state.edu/index.html (Learning Notes)

69 IDEA Trainers: faculty trainers within the division – trained faculty members who can assist other faculty with interpreting IDEA data and using IDEA resources to improve teaching effectiveness.

70 IDEA Resources: Institutional Effectiveness & Research Office and website (http://www.roanestate.edu/effectiveness/resources/); IDEA Online; IDEA Papers; POD-IDEA Center Notes.


72 Faculty Workshop Dates. Today, Saturday, August 23rd: Harriman Campus, 1:00-2:30, O-101. Oak Ridge Campus: Wednesday, Sept. 17th, 12:30-2:00 PM, Room TBA, and Wednesday, Sept. 17th, 6:00-7:30 PM, Room TBA. Harriman Campus: Wednesday, Oct. 1st, 6:00-7:30 PM, Room TBA. Cumberland Campus: Wednesday, Oct. 8th, 11:00-12:30 CST, Room TBA. Additional workshops as needed.

73 Questions

