An Exploratory Analysis of Teaching Evaluation Data
Michael D. Martinez, Department of Political Science, College of Liberal Arts and Sciences, University of Florida

Similar presentations
End of Course Exam (EOC) for 11th Grade United States History: An Introduction.

NCAA Eligibility Center. NCAA Eligibility Center Responsibilities. Academic Initial-Eligibility Requirements. Amateurism (Sports Participation).
Agenda CCR Assessment Updates Assessment Support for Educators Assessments and Beyond Accountability Updates 1.
Measurement in Psychology: Validity Lawrence R. Gordon Psychology Research Methods I.
Changing the Course Evaluation Process at by Jean-Pierre R. Bayard, PhD.
Academic Planning Workshop The Evergreen State College.
ICE Evaluations Some Suggestions for Improvement.
Allocations for Temporary FTE and Teaching Assistants: How Do They Work? Presentation to ABA Tuesday, January 12, 2010 Kathy Farrelly Director, Budget.
TA-orientation.ppt version: TA Orientation Fall 2007 William J. Rapaport (Outgoing) Director of Graduate Studies, SUNY Chancellor’s Award for.
Review of SUNY Oneonta Course Evaluation Form Report and Recommendations from The Committee on Instruction: Part II October 4, 2010.
UniLOA The University Learning Outcomes Assessment The Center for Learning Outcomes Assessment, Inc. ©
Reliability & Validity the Bada & Bing of YOUR tailored survey design.
Student Technological Mastery: It's Not Just the Hardware Wm. H. Huffman, Ph.D. Ann H. Huffman, Ph.D.
Social Science Research Design and Statistics, 2/e Alfred P. Rovai, Jason D. Baker, and Michael K. Ponton Internal Consistency Reliability Analysis PowerPoint.
Combining Face-to-Face Teaching with Online Content: Students’ Learning in Blended Courses Michael Milburn Psychology Department UMass/Boston.
Reliability, Validity, & Scaling
Fall 2008 Graduate Survey Assessment Day Presentation Office of Assessment Fitchburg State College.
2005 An Evaluation of a Required Upper-Division Liberal Studies Core Curriculum Florida Gulf Coast University College of Arts & Sciences Nora Egan Demers.
NCAA Eligibility Basics. What is the NCAA Eligibility Center? The NCAA Eligibility Center is the organization that determines whether prospective college.
WHAT COMES NEXT? What can you do better than when you arrived at the beginning of the term? What did you put into it? The purpose of the 4K class… Did.
University of Arkansas Faculty Senate Task Force on Grades Preliminary Report April 19, 2005.
“Tools of the Trade” Presented by Gina Bailey, Assistant Director, Contracts and Grants Accounting Rodney Greer, Director Office of Research & Innovation,
MML R2R LSU Precalculus Redesign October 2003 – May 2006 Phoebe Rouse.
Eligibility Standards Rodney Garner. Clearinghouse Information 185,000 students register every year and only about 90,000 are certified Common core requirements.
Overview of College of Social Sciences Academic Departments Data Fall 2011 update Presentation to COSS Chairs Meeting COSS Dean’s staff September 7, 2011.
The role of course evals in faculty reviews: One perspective John Petraitis CAFE March 25, 2011.
Student assessment AH Mehrparvar,MD Occupational Medicine department Yazd University of Medical Sciences.
2010 Graduating Student Survey Assessment Day Presentation Office of Assessment Fitchburg State College.
THE RELATIONSHIP BETWEEN PRE-SERVICE TEACHERS’ PERCEPTIONS TOWARD ACTIVE LEARNING IN STATISTIC 2 COURSE AND THEIR ACADEMIC ACHIEVEMENT Vanny Septia Efendi.
Reliability & Agreement DeShon Internal Consistency Reliability Parallel forms reliability Parallel forms reliability Split-Half reliability Split-Half.
Online Course Evaluations Is there a perfect time? Presenters: Cassandra Jones, Ph.D., Director of Assessment Michael Anuszkiewicz, Research Associate.
Background Method Discussion References Acknowledgments Results: RateMyProfessors.com (RMP.com) launched in 1999 as an avenue for students to offer public.
Exploring race and gender differentials in student ratings of instructors: Lessons from a diverse liberal arts college. Robert L. Moore, Hanna Song.
INTRO TO THEATRE ADJUSTING TO COLLEGE LIFE. Time management.
© 2008 Brigham Young University–Idaho Course Evaluations at BYU-Idaho 1.
Evolution of a Program Assessment Liberal Arts Degree.
Frederic Murray Assistant Professor MLIS, University of British Columbia BA, Political Science, University of Iowa Instructional Services Librarian Al.
Brandon Vaughn University of Texas at Austin
Why? * Department of Education (not LHU) awards PA certification to qualified applicants. * Students must apply for certification individually.
Authentic Discovery Projects in Statistics GCTM Conference October 16, 2009 Dianna Spence NGCSU Math/CS Dept, Dahlonega, GA.
Undergraduate Report Fall CRSS Enrollment Trends.
New Faculty Orientation Institutional Research and Accountability Sarah Logan August 18, 2011.
ALL 100-LEVEL G&G COURSES, FALL 1984 TO FALL 2007 Mahalo to Gary Rodwell and Susan Glanstein (U.H. Mānoa STAR Program) Scott Rowland and Davin Morimoto.
DEVELOPED BY MARY BETH FURST ASSOCIATE PROFESSOR, BUCO DIVISION AMY CHASE MARTIN DIRECTOR OF FACULTY DEVELOPMENT AND INSTRUCTIONAL MEDIA UNDERSTANDING.
Student Responsibilities. Advising: To consult their advisors on all matters pertaining to their academic careers, including changes in their programs.
Exploring the Right Postsecondary Fit Factors to Consider 8 th Grade Advisory Activity.
The collections. SERC: Science Education Resource Center at Carleton College. SERC office and staff helps develop and manage web resources for many projects.
Adventures in flipping a cell biology course Dr. Katie Shannon Biological Sciences Missouri S&T How do online videos and textbook reading engage students.
Lab 4: Alpha and Standard Error of Measurement. Reliability Reliability refers to consistency Types of reliability estimates – Test-retest reliability.
The Process The Results The Repository of Assessment Documents (ROAD) Project Sample Characteristics (“All” refers to all students enrolled in ENGL 1551)
Satisfaction and Perceived Learning Outcomes
Some Suggestions for Improvement
NCAA Eligibility Basics
Peer Learning Assistants –
Research Question and Hypothesis
ROADMAP TO INITIAL ELIGIBILITY
Introduction to Statistics for the Social Sciences SBS200 - Lecture Section 001, Spring 2017 Room 150 Harvill Building 9:00 - 9:50 Mondays, Wednesdays.
Political Science 30: Political Inquiry
assessing scale reliability
Dr. Nina O’Brien Department of Management
Student Evaluation of Teachers Committee Report
NCAA initial Eligibility Standards
Wendi Richardson, Assistant Director Jessica Parker, Program Assistant
COURSE REGISTRATION FOR
Portfolio Development
Validity and Reliability II: The Basics
Chapter 8 VALIDITY AND RELIABILITY
Presentation transcript:

An Exploratory Analysis of Teaching Evaluation Data
Michael D. Martinez, Department of Political Science, College of Liberal Arts and Sciences, University of Florida
August 17, 2011

Questions
- What does our teaching evaluation instrument actually tell us about our teaching?
- Are the items that students use to evaluate us actually measuring different things?
- Do the items in the teaching evaluation instrument actually produce a reliable scale?
- How much, on average, are teaching evaluations affected by level of instruction and size of class?

The teaching evaluation form from a social science perspective: closed- and open-ended questions

Questions          Format        Asked      Publicly visible
Part I, Q 1-7      Closed-ended  UF-wide    Yes
Part I, Q 8-9      Closed-ended  CLAS only  No
Part I, Q 10       Closed-ended  UF-wide    Yes
Part I, Q 11-15    Closed-ended  UF-wide    No
Part II, Q 1-5     Open-ended    UF-wide    No

Closed-Ended Questions

Open-Ended Questions

Data
- From CLAS: course sections, Fall 1995 through Spring 2010
- Only includes publicly visible data
- Excludes the CLAS-only items (Q 8-9)
- Excludes the "control" variables (Q 11-15)
- Excludes the open-ended questions
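These inclusion rules amount to a simple column selection over a section-level file. A minimal sketch in Python; the file name and the columns q1-q15, enrollment, and level are hypothetical, not from the original talk:

```python
import pandas as pd

# Hypothetical layout: one row per course section, columns q1..q15
# for the Part I items, plus enrollment and course level.
evals = pd.read_csv("clas_evaluations_1995_2010.csv")  # hypothetical file

# Publicly visible closed-ended items: Q1-Q7 and Q10.
public_items = [f"q{i}" for i in range(1, 8)] + ["q10"]

# Drop the CLAS-only items (Q8-9) and the "control" items (Q11-15);
# keep the fields needed for the regression later.
analysis = evals[public_items + ["enrollment", "level"]].dropna()
```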

What does our teaching evaluation instrument actually tell us about our teaching?
- Are the items that students use to evaluate us actually measuring different things? Probably not. Students act as though they develop a single attitude about the class and rate it on almost all items based on that attitude (a quick check of this is sketched below).
- Do the items in the teaching evaluation instrument actually produce a reliable scale? Yes.
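One way to probe the "single attitude" reading, not part of the original talk, is to ask how much of the variance in the inter-item correlation matrix the first principal component absorbs. A sketch, assuming `corr` is the 8 x 8 correlation matrix among Q1-Q7 and Q10 (e.g., `analysis[public_items].corr().to_numpy()` from the sketch above):

```python
import numpy as np

def first_component_share(corr: np.ndarray) -> float:
    """Share of total variance captured by the first principal component.

    For a correlation matrix the eigenvalues sum to the number of items;
    a share near 1 suggests the items tap one underlying attitude rather
    than distinct dimensions.
    """
    eigvals = np.linalg.eigvalsh(corr)  # returned in ascending order
    return float(eigvals[-1] / eigvals.sum())
```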

Inter-item correlations (CLAS data, Fall 1995 through Spring 2010): correlation matrix of items Q1-Q7 and Q10 (individual correlations not preserved in this transcript). Cronbach's alpha = 0.978.
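The reported alpha can be reproduced from the raw ratings: Cronbach's alpha compares the sum of the item variances to the variance of the summed scale. A minimal sketch, assuming `items` is a sections-by-items array of ratings on Q1-Q7 and Q10:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_sections x k_items) rating matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the item sum
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# e.g. cronbach_alpha(analysis[public_items].to_numpy())
```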

How much, on average, are teaching evaluations affected by level of instruction and size of class? SOME, but less than might be expected.

Q10 = a + b1*Lower + b2*Grad + b3*log(Enrollment) + e

Dummy coding, with upper-division courses as the reference category:

              Lower  Grad
Lower level     1     0
Upper level     0     0
Grad level      0     1

How much, on average, are teaching evaluations affected by level of instruction and size of class? SOME, but less than might be expected.

Q10 = a + b1*Lower + b2*Grad + b3*log(Enrollment) + e

- b1 is the average effect of teaching a lower-division course, relative to an upper-division course, controlling for the size of the class.
- b2 is the average effect of teaching a graduate course, relative to an upper-division course, controlling for the size of the class.
- b3 is the average effect of the log of class size, controlling for the level of the class.
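This is an ordinary least squares regression with upper-division courses as the omitted category. A sketch of one way to fit it, using the statsmodels formula interface and the hypothetical column names from the data sketch above:

```python
import numpy as np  # np.log is used inside the formula
import statsmodels.formula.api as smf

# level is assumed to take the values "lower", "upper", and "grad";
# Treatment(reference="upper") reproduces the dummy coding above.
model = smf.ols(
    "q10 ~ C(level, Treatment(reference='upper')) + np.log(enrollment)",
    data=analysis,
).fit()
print(model.summary())  # the level and log-enrollment coefficients are b1, b2, b3
```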

Regression of Instructor Evaluation (Q10) on Level of Course and Class Size (log)

                 CLAS
Lower                  (.004)
Graduate    .121       (.007)
Lg enroll              (.003)
Constant               (.008)
R²          .052
N of cases

Entries are unstandardized regression coefficients, with standard errors in parentheses.

Regression of Instructor Evaluation (Q10) on Level of Course and Class Size (log)

                 CLAS          Humanities     Soc & Beh Sci   Phys & Math Sci
Lower                 (.004)        (.006)         (.010)          (.007)
Graduate    .121      (.007)  .095  (.014)   .074  (.013)    .273  (.014)
Lg enroll             (.003)        (.005)         (.005)          (.004)
Constant              (.008)        (.016)         (.017)          (.013)
R²          .052
N of cases

Expected values of Q10 by class size and course level (Lower / Upper / Grad): Humanities; Physical and Mathematical Sciences.

Expected values of Q10 by class size and course level (Lower / Upper / Grad): Social and Behavioral Sciences; Political Science.
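The expected-value tables follow mechanically from the fitted equation: plug in the dummy codes for the course level and the log of class size. A sketch of the arithmetic; the coefficient values in the example call are invented for illustration, not the fitted estimates:

```python
import numpy as np

def expected_q10(a, b_lower, b_grad, b_log_enroll, size, level):
    """Predicted Q10 for a class of a given size; level is "lower", "upper", or "grad"."""
    return (a
            + b_lower * (level == "lower")
            + b_grad * (level == "grad")
            + b_log_enroll * np.log(size))

# Illustrative only: made-up coefficients, not the fitted values.
for size in (10, 40, 160):
    print(size, round(expected_q10(4.5, -0.08, 0.12, -0.05, size, "upper"), 2))
```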

Morals of the Story
- We have a reliable teaching evaluation instrument that is definitely measuring something: sections that are evaluated positively on one item tend to be evaluated positively on the other items.
- Reliability isn't validity. Response set could be a problem, but the cost of fixing it would be a disruption in the continuity of the data that we have.
- Like GRE scores, these scores should be regarded as a good measure, but not the only measure.

Morals of the Story
- Most variation in course evaluations is NOT accounted for by level of instruction or class size.
- Both class size and level of instruction matter, but they should not be regarded as excuses for really low evaluations.

Darts and Laurels
- Laurel – Brian Amos, my graduate research assistant, for help with analyzing these data.
- Laurel – Dave Richardson and the CLAS office, for making these data available to me.
- Dart – Academic Affairs, for not making these data publicly available in a usable form to everyone.
- Laurel – Academic Affairs, for (finally) creating a link that lets promotion candidates and award nominees generate teaching evaluation reports in Word automatically with just a few clicks.

Darts and Laurels
- Dart – Academic Affairs, for not posting the CLAS-only items, and for not posting the teaching evaluations of graduate students who taught their own courses.
- Laurel – Academic Affairs, for an otherwise much improved website showing evaluations.
- Laurel – You, for patiently listening.

Thanks!