Student Evaluation of Teaching Task Force 1 Final Report and Proposal Presented to OSU Faculty Senate February 9, 2012.

Presentation transcript:

Why Conduct SET?
– Improve both teaching and learning
– Provide students a voice in the assessment of instruction and faculty
– Meet state OAR (3) requirements, which do not specify the current format: “Specific provision shall be made for appropriate student input into the data accumulated as the basis for reappointment, promotion, and tenure decisions, and for post-tenure review. Sources of such input shall include, but need not be limited to, solicitation of student comments, student evaluations of instructors and opportunities for participation by students in personnel committee deliberations.”

Previous situation: Paper SET form used
– Questions 1 and 2: intended to be summative
– Questions 3 and on: intended to be formative
– Written comments: instructor only

Current situation: Electronic SET is used for
– Formative evaluation
– Summative evaluation
– Program assessment
Our task force is not involved with the transition from SET to eSET.

Charge of the committee:
1. Identify the university values in teaching expectations.
2. Compare and contrast the advantages and disadvantages of student assessments of teaching (SAT) and student evaluations of teaching (SET) as means of acquiring student input.
3. If SET forms are deemed most appropriate, assess, using informed psychometrics, the validity and reliability of the current SET form and recommend changes as needed.
4. If SAT forms are deemed most appropriate, consider new forms and provide recommendations.
5. Assess the role of student input forms on teaching effectiveness and make recommendations for consistent use of the form in teaching evaluations across academic units.

Extensive literature exists on this topic. Is there valid information for the evaluation of an instructor in SET data? The task force had a split vote on this. Should SET data be used for personnel decisions? A strong no.

Our findings show that student evaluations are strongly related to grades and that learning, as measured by future grades, is unrelated to student evaluations once current grades have been controlled. We also provide evidence that evaluations vary with instructor characteristics, the type of section, and composition of the class. We find, for example, that students sometimes give lower evaluations to women and to foreign-born instructors. We do not believe that our results are specific to our institutional setting, and expect our results to be qualitatively similar for higher education generally.
“Evaluating Methods for Evaluating Instruction: The Case of Higher Education,” Bruce A. Weinberg, Belton M. Fleisher, and Masanori Hashimoto

“Observations on the Folly of Using Student Evaluations of College Teaching for Faculty Evaluation, Pay, and Retention Decisions and Its Implications for Academic Freedom,” William Arthur Wines and Terence J. Lau, William & Mary Journal of Women and the Law, Volume 13, Issue 1, Article 4

For administrators, the attractiveness of student evaluations of faculty is that they provide an easy, seemingly objective assessment of teaching that does not require justification. The ease of student evaluations comes in reducing the complexities of teaching performance to a series of numbers, particularly when commercial forms are used. The most common type of commercial student evaluation form uses a Likert-type scale on which students rate faculty against a series of statements about the course and instruction. Each point on the scale is assigned a numerical value, which allows the computation of composite scores for individual items, groups of items, or all of the items. Finally, the student ratings are often normed nationally and locally in spite of the near-universal recommendation in the literature against norming of student ratings.
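The composite-score arithmetic described above is easy to make concrete. A minimal sketch, using invented 1-5 Likert ratings from three students on three items (all data and item groupings are hypothetical, for illustration only):

```python
from statistics import mean

# Hypothetical Likert responses: each inner list is one student's 1-5
# ratings on three items (e.g., course design, instruction, overall).
responses = [
    [5, 4, 4],
    [3, 4, 2],
    [4, 5, 4],
]

# Composite score for each individual item: average that item across students.
item_means = [mean(item) for item in zip(*responses)]

# Composite score across all items: average every rating together.
overall = mean(r for student in responses for r in student)

print(item_means)  # per-item composites
print(overall)     # overall composite
```

This is exactly the reduction the authors criticize: several students' experiences of a whole course collapse into a handful of averages.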

The literature can be grouped into the following areas:
a. Evaluations used for improper purposes
b. Student evaluations reveal bias against certain groups
   1. Double standard
   2. Beauty bias
   3. Asian bias
   4. “Miss Congeniality” bias
   5. Thirty-second snapshot
   6. Classroom environment
   7. Correlation with anticipated grade
   8. Smaller classes score higher

What the Task Force Learned
From students:
– Expect anonymity
– Like the idea of formative feedback
– Don’t know why student evaluations of teaching are conducted or how the information is used
From administrators:
– Express a need for summative information

What the Task Force Learned
From faculty:
– Worry about inconsistent use of scores in the current system
– Have concerns about variability in value constructs
– Doubt the validity of a single instrument for such a wide range of course types
– Appreciate the customization of the proposed feedback
– Find written comments more useful
– Note that numerical data can show trends over time

Problems with Current SET Form
– Feedback comes too late
– Requires value constructs (“excellent,” etc.), which tend to vary between students
– Global/overall ratings (#1 and #2) ignore the complexity of teaching
– May be influenced by situational factors
– Inconsistent use in faculty evaluation
  – Discourages innovation
  – Creates perverse incentives

Formative, summative, and program assessment goals are contradictory!
– Formative: look for what is or is not working well in my class.
– Summative: show that I deserve a pay raise.
– Program: show that my department deserves more resources.
We need to decouple these three functions! Summative data goes to the personnel file.

Program assessment: How does a course fit program criteria? This is a curricular issue and should be addressed at the departmental level. The department is responsible for ensuring that instructors address learning outcomes. Our task force is not responsible for using eSET for baccalaureate core purposes.

Summative assessment. Examples:
– Student focus groups
– Exit interviews
– Peer review
– Supervisor review
If numerical data are used, proper statistical analysis needs to be performed. If we adopt SAT, use it as the start and context of a discussion between the supervisor and instructor. The current use of SET scores raises legal questions.
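The caution about proper statistical analysis can be illustrated: with the small samples typical of a single section, a mean rating carries substantial uncertainty. A sketch with invented scores (the scale, sample, and values are hypothetical; 2.201 is the standard two-sided 95% t critical value for 11 degrees of freedom):

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical SET scores for one small section (1-6 scale, n = 12).
scores = [5, 4, 6, 3, 5, 5, 4, 6, 2, 5, 4, 5]

n = len(scores)
m = mean(scores)
se = stdev(scores) / sqrt(n)  # standard error of the mean

# 95% confidence interval using the t critical value for n - 1 = 11 df.
t_crit = 2.201
low, high = m - t_crit * se, m + t_crit * se

print(f"mean = {m:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```

For these numbers the interval is (3.76, 5.24): it spans most of the rating scale, which is why ranking instructors on small differences in raw means is statistically dubious.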

Formative assessment: we do it all the time. Clickers are great! But: NOT anonymous! Need for documented assessment.

Time spent on each homework set:
A) Less than 1 hour
B) Between 1 and 2 hours
C) Between 2 and 3 hours
D) Between 3 and 4 hours
E) Between 4 and 5 hours
F) More than 5 hours

Time spent on each homework set:
A) Less than 1 hour: 2%
B) Between 1 and 2 hours: 13%
C) Between 2 and 3 hours: 27%
D) Between 3 and 4 hours: 31%
E) Between 4 and 5 hours: 19%
F) More than 5 hours: 9%
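Percentages like those above come from a simple tally of the raw clicker responses. A sketch with an invented response list (one letter per student; the data are hypothetical):

```python
from collections import Counter

# Hypothetical raw clicker responses, one letter per student.
votes = list("DCCDBEDCBADCEDEF")

# Tally the votes and report each choice as a share of the total.
counts = Counter(votes)
total = len(votes)
for choice in "ABCDEF":
    share = 100 * counts[choice] / total
    print(f"{choice}) {share:.0f}%")
```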

Task Force’s Goals for Assessment Tool
– Focus on improving teaching
– Focus on elements that affect student learning
– Employ a formative approach
– Allow for evaluation of diverse teaching methods and philosophies
– Provide a flexible system that faculty can adapt to their courses

An Assessment Tool Should...
– Permit feedback during the term, when it’s helpful to the class
– Allow instructors to choose items
– Limit access to the data to discourage misleading and invidious comparisons
– Address factors that affect learning (e.g., course design, classroom environment, materials)

Proposed Formative Categories
Instructional design
– Objectives
– Exams and assignments
– Materials and resources
Engaging learning
– Learning activities
– Classroom environment
– Extended engagement
Instructional assessment
– Fairness
– Helpfulness
– Opportunity to demonstrate knowledge

Proposed Formative Categories (continued)
Self-reported course impact on the student
– Motivation
– Cognitive expansion
– Skill development
Alternative and supplemental teaching/learning environments
– Laboratory and discussion
– Clinical
– Seminars
– Team teaching
– Field trips
– Studio

Proposal
– Change to a formative assessment tool
– Create a fully customizable instrument
– Rename it “Student Assessment of Teaching” (SAT)
– Deploy online
– Allow teachers control of the items used, timing/frequency, and access to data
– Report to administrators which items were used and when, but not the results
– Have teachers share with their supervisor the steps taken to improve teaching (Periodic Review of Faculty: PROF)
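One way to picture the proposed instrument is as a per-course configuration recording the instructor's choices. Everything below (field names, course code, item wording) is hypothetical, sketched only to mirror the proposal's access rules:

```python
# Hypothetical instructor configuration for the proposed customizable SAT.
sat_config = {
    "course": "XX 101",  # hypothetical course code
    "items": [           # instructor-chosen formative items
        "The learning objectives for each unit were clear.",
        "Exams gave me a fair opportunity to demonstrate what I learned.",
    ],
    "windows": ["week 4", "week 8"],   # instructor-chosen timing
    "results_access": ["instructor"],  # results stay with the instructor
}

# Per the proposal, administrators see which items were used and when,
# but not the results themselves.
admin_view = {key: sat_config[key] for key in ("items", "windows")}
print(sorted(admin_view))
```

The design choice this sketch highlights is the decoupling argued for earlier: the formative data never leave the instructor's control, while the fact of participation remains visible for review.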

We propose to run a pilot test in the Fall 2012 and Winter terms:
– Find implementation problems.
– Compare results from the new and old formats.
– Gather feedback on summative use.
Four units are willing to participate.