The role of course evals in faculty reviews: One perspective. John Petraitis, CAFE, March 25, 2011.


Summative vs. Formative
Summative:
– Course evals as ends
– 1st page of IDEA report
Formative/Developmental:
– Course evals as means
– Remaining pages of IDEA report

Logic of IDEA
One size does not fit all
– Allows faculty to identify key objectives
– E.g., exploration of personal values vs. creativity vs. understanding principles
Advertise the warts
– Reliability and representativeness

Three Key Pieces
– 12-item Faculty Info Form (FIF)
– 36-item Student Reaction to Instruction Form (SRIF)
– Multi-page report

FIF
– 12 learning objectives

FIF
Which of the 12 objectives do I choose?
– Depends on the course
– Depends on the instructor
– Depends on the section
– Depends on the department
Recipes for disaster:
– Choosing all of them
– Choosing none of them (by not filling out the FIF)

SRIF
Two kinds of Q's:
– 36 questions linked to the 12 learning objectives
– Extraneous influences, e.g., discipline, student motivation, class size

Marriage of FIF & SRIF
Weighting of students' responses (on the SRIF) for the results/report:
– M (Minor) = 0
– I (Important) = 1
– E (Essential) = 2
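The M/I/E weighting above amounts to a weighted mean of the per-objective ratings. A minimal sketch of that idea, assuming a simple weighted average; the function name, data shapes, and example numbers are illustrative, not IDEA's actual (proprietary) scoring code:

```python
# Illustrative sketch of M/I/E weighting, NOT IDEA's actual algorithm.
# Objectives the instructor marks Essential count double, Important count
# once, and Minor objectives drop out of the weighted average entirely.

WEIGHTS = {"M": 0, "I": 1, "E": 2}  # Minor, Important, Essential

def weighted_progress_score(ratings, fif_selections):
    """Weighted mean of per-objective student ratings (e.g., a 1-5 scale),
    weighted by the instructor's M/I/E choices on the FIF.
    Returns None if no objective was marked I or E (the 'disaster' case
    of an unfiled FIF, where everything defaults to Minor)."""
    total = 0.0
    weight_sum = 0
    for objective, rating in ratings.items():
        weight = WEIGHTS[fif_selections.get(objective, "M")]
        total += weight * rating
        weight_sum += weight
    return total / weight_sum if weight_sum else None

# Hypothetical section: one Essential, one Important, one Minor objective.
ratings = {"factual_knowledge": 4.2, "principles": 4.5, "creativity": 3.8}
picks = {"factual_knowledge": "E", "principles": "I", "creativity": "M"}
print(round(weighted_progress_score(ratings, picks), 2))  # 4.3
```

Note how the Minor objective contributes nothing: only what the instructor flagged as Important or Essential shapes the score.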

The Report (page 1)
Reliability and representativeness
– Reliability: based on the number (not percent) of respondents
  Would adding a few more respondents potentially alter the results dramatically?
– Representativeness: based on the percent (not number) of respondents
  A 65% response rate is considered representative

The Report (page 1)
Reliability and representativeness
– Representativeness is key
– If representative, reviewers could use page 1 as both summative and formative
– If not representative, a reviewer might use it if it helps the candidate
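The representativeness rule of thumb (percent, not number, of respondents; 65% counts as representative) reduces to a one-line check. A sketch under that assumption; the function name and threshold parameter are mine, not IDEA's:

```python
# Sketch of the representativeness rule of thumb from the slides:
# representativeness depends on the PERCENT of enrolled students who
# responded, and a 65% response rate is considered representative.

def is_representative(respondents, enrolled, threshold=0.65):
    """True if the section's response rate meets the threshold."""
    if enrolled <= 0:
        return False
    return respondents / enrolled >= threshold

print(is_representative(13, 20))  # 65% response rate -> True
print(is_representative(9, 20))   # 45% response rate -> False
```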

The Report (page 1)
Averages:
– Section A: "I" and "E" objectives
Adjustments to IDEA ratings, which 'level the playing field':
– Discipline (national comparisons)
– "Extraneous influences" on student ratings are used to adjust scores:
  SRIF Item #39: "I really wanted to take this course regardless of who taught it"
  SRIF Item #43: "As a rule, I put forth more effort than other students on academic work"
  Class size

The Report (pp. 2-3)
Improving teaching effectiveness (food for thought)

IDEA and your review file
Do your own formative evaluation
– What can you learn from pages 2-3?
– Show your interest in professional development
– Do it even if IDEA is not reliable or not representative
Do your own summative evaluation
– Make it easy. Make it clear.

Interpreting IDEA results
Reliability
– "Unreliable" if a small number of students responded (even in low-enrollment courses)
– Would results fluctuate significantly if a few more people completed it?
– IDEA's standard is 10 respondents or a 75% response rate
– Unreliable does not mean not useful; it just means the section cannot be part of IDEA's database for comparative purposes
– Consider high response rates in small classes as reliable information FOR that class, if not for normative comparisons
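The reliability screen described above (10 respondents, or a 75% response rate as the escape hatch for small classes) can be sketched the same way. The function name is illustrative; per the slide, failing the screen only excludes a section from IDEA's comparative database, it does not make the feedback useless:

```python
# Sketch of the reliability screen described in the slides: a section
# counts as reliable with at least 10 respondents, or with a 75%
# response rate (covering small, high-response classes).

def is_reliable(respondents, enrolled):
    """True if the section meets the reliability standard stated above."""
    if respondents >= 10:
        return True
    return enrolled > 0 and respondents / enrolled >= 0.75

print(is_reliable(12, 40))  # enough respondents -> True
print(is_reliable(6, 8))    # small class, 75% response rate -> True
print(is_reliable(6, 40))   # few respondents, low rate -> False
```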