Review of SUNY Oneonta Course Evaluation Form Report and Recommendations from The Committee on Instruction: Part II October 4, 2010.

Similar presentations
EVALUATOR TIPS FOR REVIEW AND COMMENT WRITING The following slides were excerpted from an evaluator training session presented as part of the June 2011.

Alternative Strategies for Evaluating Teaching How many have used end-of-semester student evaluations? How many have used an alternative approach? My comments.
Review of SUNY Oneonta Course Evaluation Form Report and Recommendations from The Committee on Instruction: Part I March 16, 2009.
Maximizing Your NSSE & CCSSE Results
Using the Seven Principles as a Framework for the Evaluation of Student Ratings and Teaching Karl Wirth and Adrienne Christiansen Serie Center for Scholarship.
IDEA Student Ratings of Instruction Update Carrie Ahern and Lynette Molstad Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center.
College-wide Meeting October 28, 2011. Success is what counts. Achieving the Dream is a national initiative to help community colleges provide the best.
Midterm Evaluations of Teaching Pilot Project Kiran Mahal & Dr. Simon Bates.
Student Evaluations. Introduction: conducted via Qualtrics survey, Fall 2011. Sample size: 642; FT tenured: 158; FT untenured: 59; adjunct: 190; students:
NSSE 2014: Accolades and Action Items Faculty Senate Nov. 20, 2014 Patrick Barlow, Ph.D., Assessment Coordinator.
Learning Community II Survey Spring 2007 Analysis by Intisar Hibschweiler (Core Director) and Mimi Steadman (Director of Institutional Assessment)
Developing a Statistics Teaching and Beliefs Survey Jiyoon Park Audbjorg Bjornsdottir Department of Educational Psychology The National Statistics Teaching.
Redesign of PSYC 1101 into a 50% Online (Hybrid) Course Sue Spaulding, UNC Charlotte Pearson Education March 9, 2012 Boston Office.
Educational Outcomes: The Role of Competencies and The Importance of Assessment.
ICE Evaluations Some Suggestions for Improvement.
Writing Program Assessment Report Fall 2002 through Spring 2004 Laurence Musgrove Writing Program Director Department of English and Foreign Languages.
Gauging the Impact of Student Characteristics on Faculty Course Evaluations Laura Benson Marotta University at Albany – SUNY.
A Comparison of Traditional, Videoconference-based, and Web-based Learning Environments: A Dissertation Proposal by Ming Mu Kuo.
Increasing Online Survey Response Rates April 7, 2015 Institute for Faculty Development.
METHODS: Study Population: 224 students enrolled in a 3-credit-hour, undergraduate, clinical pharmacology course in Fall 2005 and Spring.
Assessment Surveys July 22, 2004 Chancellor’s Meeting.
Report of the Results of the Faculty Survey of Student Engagement William E. Knight and Jie Wu Office of Institutional Research Presentation to the Faculty.
Brandon Horvath, Ph.D. Associate Professor Turfgrass Pathology University of Tennessee Teaching Documentation.
November 9. Proving that students are learning; reaction to challenges in public education; rising potential, stagnant performance; regional.
NCCSAD Advisory Board. Research Objective Two: Alignment Methodologies. Diane M. Browder, PhD; Claudia Flowers, PhD; University of North Carolina at Charlotte.
-SLO Development Progress -SLO Activity -Assessment Progress -Support Needs.
“if student ratings are part of the data used in personnel decisions, one must have convincing evidence that they add valid evidence of teaching effectiveness”
“Improving Your Program Assessment Report.” UNIVERSITY ASSESSMENT COMMITTEE DEBRA BALLINGER AND ADAM MCGLYNN.
Chemistry B.S. Degree Program Assessment Plan Dr. Glenn Cunningham Professor and Chair University of Central Florida April 21, 2004.
FAEIS Project User Opinion Survey 2005. Thursday, June 23, 2005, Washington, D.C. H. Dean Sutphin, Professor; Yasamin Miller, Director, SRI. Agriculture & Extension.
University of Arkansas Faculty Senate Task Force on Grades Preliminary Report April 19, 2005.
The Genetics Concept Assessment: a new concept inventory for genetics Michelle K. Smith, William B. Wood, and Jennifer K. Knight Science Education Initiative.
Maximizing Learning Using Online Assessment. 2011 SLATE Conference, October 14. P. Boyles, Assistant Professor, Chicago State University,
Evaluating a Research Report
Final Update on the New Faculty Course Evaluation & Online System November, 2003.
Teaching Thermodynamics with Collaborative Learning Larry Caretto Mechanical Engineering Department June 9, 2006.
National Commission for Academic Accreditation & Assessment Developmental Reviews at King Saud University and King Faisal University.
Main components of effective teaching. SRI validity: a measuring instrument is valid if it measures what it is supposed to measure. SRIs measure: Student.
Students Course and Teacher Evaluation Please refer to notes at the bottom of this slide.
Validity of Researcher-Made Surveys. Evidence of Validity.
Exhibit 3.11 Data Report to Unit Assessment Committee Spring 2008.
A Basic Guide to Academic Assessment Presented by Darby Kaikkonen Director of Institutional Research.
PRESENTATION TO ASSOCIATION OF INSTITUTIONAL RESEARCH AND PLANNING OFFICERS BUFFALO, NEW YORK JUNE 11, 2009 How Campuses are Closing the GE Assessment.
WRIT 1122 Faculty meeting, September 23. Satisfaction with goals and features: the survey results showed that faculty are satisfied overall with.
Concerns, Data, Next Steps. New administration software from Scantron; new Academic Senate policy; new items/survey form (ACE, Iowa Item Pool); new.
An Assessment of the Readiness of a Tertiary Healthcare Organization in Saudi Arabia, in Adopting Effective Online Staff Development Programs Adnan D.
Assessing Information Literacy with SAILS Juliet Rumble Reference & Instruction Librarian Auburn University.
Select Slides… Spring 2013 Training Strengthening Teaching and Learning through the Results of Your Student Assessment of Instruction (SAI) For Faculty.
Faculty Forum: Evaluation of Teaching Sponsored by the Faculty Senate November 10, 2006.
Fall 2006 Faculty Evaluation and Tenure Review Process Tenure Review Process Riverside Community College District.
MT ENGAGE Student Learning Outcomes and Assessment April 27, 2015.
Kenneth C. C. Yang The University of Texas at El Paso Presented at 2016 Sun Conference TEACHING INFORMATION LITERACY SKILLS IN COLLEGE CLASSROOMS: EMPIRICAL.
Advanced Writing Requirement Proposal
“Crowdsourcing” a Textbook: 120 Student Authors Writing on a Wiki
Documenting Your Teaching and Student Evaluations
Department of Physics and Goal 2 Committee Chair
Tenure and Recontracting August 29, 2017
–Anonymous Participant
Introduction to the NSU Write from the Start QEP
Tenure and Recontracting February 7, 2018
Tenure and Recontracting August 27, 2018
Tenure and Recontracting February 6, 2018
Tenure and Recontracting October 6, 2017
Course Evaluation Ad-Hoc Committee Recommendations
Student Evaluations of Teaching (SETs)
IDEA Student Ratings of Instruction
Unit 4 - A06 – Review Grade Criteria To get a c
Tenure and Recontracting February 26, 2019
Associate Professor P&T Workshop Associate to Full Professor
COURSE EVALUATION Spring 2019 Pilot August 27, 2019.
Presentation transcript:

Review of SUNY Oneonta Course Evaluation Form Report and Recommendations from The Committee on Instruction: Part II October 4, 2010

Framework for Complete Re-design

Data Sources: –peer-reviewed literature –surveys of teaching faculty (Spring 2009 and Fall 2009)

Final Framework Based on Feldman’s (2007) summary of 23 empirical studies. Advantages of using this framework: –it empirically discriminates which dimensions of teaching are the most important ones by examining both association with student achievement and association with overall evaluations –it offers reliable results (i.e., results consistent across many studies)

Item Pool Items on our final instrument were derived from three major sources: –existing course-evaluation instruments –peer-reviewed literature –surveys of teaching faculty

Criteria for Inclusion To be included on our instrument, each item needed to meet three criteria (see next slide):

Criteria for Inclusion 1) clearly relate to Feldman’s (2007) “high” or “moderate” (in importance) dimensions; 2) be empirically associated with both high student achievement and high overall evaluations; 3) be reasonably applicable to all of the following types of classes: freshman, senior, general education, graduate, and distance-learning.

Our New Instrument: Student Response to Faculty Instruction (SRFI)

SRFI Items 1. The instructor clearly explained his/her grading criteria, including how final grades in this course will be determined. 2. The instructor was clearly interested in the course material. 3. The instructor presented and explained ideas effectively.

SRFI Items, cont. 4. The instructor communicated the significance of the subject. 5. Throughout the course, the instructor made it clear what I should learn and accomplish. 6. The instructor was clearly interested in the learning of each student. 7. I would recommend this instructor to other students.

Pilot Testing

Fall 2009 classes: 775 students responded. Response rate: 81%. Reliability estimate (Cronbach’s alpha): .93

Spring 2010 classes: 638 students responded. Response rate: 87%. Reliability estimate (Cronbach’s alpha): .92
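
A note on the reliability figures above: Cronbach’s alpha is computed from the full student-by-item response matrix. The sketch below shows the standard formula; the function name and the simulated ratings are illustrative stand-ins, not the committee’s actual data or analysis code.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) matrix of ratings."""
    k = scores.shape[1]                         # number of items (7 on the SRFI)
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated stand-in for 775 students rating 7 items on a 1-5 scale.
rng = np.random.default_rng(2009)
ratings = rng.integers(1, 6, size=(775, 7)).astype(float)
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```

(Random ratings like these produce a low alpha; internally consistent real responses are what yield values like the .93 and .92 reported above.)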

Validity Checks

Issue #1 Is students’ interest in the subject matter related to SRFI responses?

Correlations with Student Prior Interest Item text: “I was interested in the subject matter of this course even before it started.” Fall 2009 data: correlations ranged from .10 to .29. Spring 2010 data: correlations ranged from .09 to .25.
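
The ranges above are per-item Pearson correlations between the prior-interest item and each SRFI item. A minimal sketch of that computation, assuming one row per student (the function and array names are hypothetical):

```python
import numpy as np

def item_correlations(prior_interest, srfi_items):
    """Pearson r between the prior-interest item and each SRFI item."""
    return np.array([
        np.corrcoef(prior_interest, srfi_items[:, j])[0, 1]
        for j in range(srfi_items.shape[1])
    ])

# Illustrative data: 300 students, 7 SRFI items, all rated 1-5.
rng = np.random.default_rng(1)
prior = rng.integers(1, 6, 300).astype(float)
srfi = rng.integers(1, 6, (300, 7)).astype(float)
r = item_correlations(prior, srfi)
print(f"range: {r.min():.2f} to {r.max():.2f}")
```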

Issue #1 Conclusion There is at best a weak relationship between prior interest in the subject matter and SRFI responses. This is very good news for gen. ed. instructors! They shouldn’t expect lower ratings on the SRFI simply because their course is gen. ed.

Issue #2 Are course grades related to SRFI responses?

Correlations with Course Grades (class averages) Fall 2009 data: correlations ranged from .01 to .23 (one correlation was negative: -.09). Spring 2010 data: correlations ranged from .02 to .23.
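
Because this check uses class averages, each class is first collapsed to its mean grade and mean rating, and the correlation is then taken across classes. A sketch under assumed column names (class_id, grade, and rating are hypothetical, not the committee’s actual schema):

```python
import pandas as pd

# One row per student; columns are illustrative, not the committee's schema.
responses = pd.DataFrame({
    "class_id": [101, 101, 101, 205, 205, 310, 310, 310],
    "grade":    [3.7, 3.3, 4.0, 2.7, 3.0, 3.3, 3.7, 2.3],  # course grade (4.0 scale)
    "rating":   [4,   5,   4,   3,   4,   5,   4,   4],    # one SRFI item (1-5)
})

# Collapse to class averages, then correlate at the class level.
class_means = responses.groupby("class_id")[["grade", "rating"]].mean()
r = class_means["grade"].corr(class_means["rating"])  # Pearson r across classes
print(f"class-level r = {r:.2f}")
```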

Issue #2 Conclusion There is at best a weak association between course grades and student ratings on the SRFI. This means that “easy” courses don’t necessarily get high ratings from students.

Issue #3 Is class size related to SRFI responses?

Correlations with Class Size (course enrollment) Fall 2009 data: class sizes ranged from 2 to 92; correlations ranged from .00 to -.36. Spring 2010 data: class sizes ranged from 2 to 60; correlations ranged from .02 to .12.

Issue #3 Conclusion There is at best a weak association between class size and student ratings on the SRFI. This is good news for instructors of large classes; lower SRFI responses should not be expected simply because of class size.

Issue #4 Is course level related to SRFI responses?

Correlations with Course Level Fall 2009 data: 5 graduate-level classes, the remainder spread across undergraduate levels; correlations ranged from -.01 to .22. Spring 2010 data: 2 graduate-level classes, the remainder spread across undergraduate levels; correlations ranged from .06 to -.22.

Issue #4 Conclusion There is at best a weak association between course level and student ratings on the SRFI. This is good news for instructors of first-year classes!

Committee Conclusions

SRFI’s Up We believe our pilot data provide satisfactory evidence of both the reliability and the validity of this instrument for all classes at SUNY Oneonta. There are no demographic questions on the SRFI. There will be space for free-response comments on the back of the form.

Validity Studies: SPI vs. SRFI –Fall 2009: N = 775 –Spring 2010: N = 638

SPI < SRFI Advantages of the SRFI over the SPI: –demonstrated reliability and validity, to the extent that we are able to measure validity without being invasive –shorter form requires less class time to administer –contributes to the College’s goal of sustainability: 56 fewer pages of results for each renewal/tenure/promotion application

SPI/SRFI Comparison Item text: “The instructor presented and explained ideas effectively.”

Committee Recommendations

Package A: Recommendation Regarding the New Instrument (1 recommendation in this package)

Recommendation #1 The SRFI should replace the SPI, effective Spring 2011.

Package B: Recommendation Regarding Administration of the SRFI (1 recommendation in this package)

Recommendation #2 Departments should consider having a third party (not a class member) administer the SRFI. This person should verify that all forms correctly identify the course.

Package C: Recommendations Regarding Handling and Reporting of SRFI Results (6 recommendations in this package)

Recommendation #3 SRFI results should be returned to faculty only after grades have been recorded.

Recommendation #4 SRFI results should include: mean, standard deviation, and frequencies (illustrated in the sketch following Recommendation #5).

Recommendation #5 SRFI means should be reported to no more than one decimal place, and standard deviations to no more than two decimal places. Additional decimal places suggest a level of precision that does not exist.
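
A minimal sketch of a results line that follows Recommendations #4 and #5 together (mean, standard deviation, and frequencies; one decimal for the mean, two for the standard deviation). The data are illustrative:

```python
import numpy as np
from collections import Counter

# Illustrative responses to one SRFI item on a 1-5 scale.
item = np.array([5, 4, 4, 3, 5, 4, 2, 4, 5, 3])

mean = item.mean()
sd = item.std(ddof=1)           # sample standard deviation
freqs = Counter(item.tolist())  # frequency of each rating value

print(f"mean = {mean:.1f}, SD = {sd:.2f}")  # mean to one decimal, SD to two
for rating in sorted(freqs):
    print(f"  rated {rating}: {freqs[rating]} students")
```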

Recommendation #6 SRFI results should be compared to department and course-level averages, never to the college-wide average; 50% of SPI results come from 100-level courses.
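
One way to generate the department- and course-level comparison points this recommendation calls for; the column names and the groupby approach are assumptions for illustration, not the College’s actual reporting system:

```python
import pandas as pd

# One row per class section; hypothetical columns.
sections = pd.DataFrame({
    "department": ["BIOL", "BIOL", "BIOL", "BIOL", "HIST", "HIST"],
    "level":      [100,    100,    300,    300,    100,    100],
    "srfi_mean":  [4.1,    3.8,    4.4,    4.2,    4.0,    4.3],
})

# Comparison points by department and course level -- never a college-wide mean.
norms = sections.groupby(["department", "level"])["srfi_mean"].agg(["mean", "std", "count"])
print(norms.round({"mean": 1, "std": 2}))
```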

Recommendation #7 SRFI results should be returned to faculty in a format that can be placed directly into dossiers, without reconfiguration or manual integration of comparative data.

Recommendation #8 Our committee’s published guidelines for interpreting SRFI data should be included as a cover sheet in all dossiers. Available on our web site.

Package D: Recommendations Regarding Faculty Development to Improve Instruction (3 recommendations in this package)

Recommendation #9 Early-career faculty should be encouraged to consider administering anonymous surveys early in the semester. Areas of weakness can be identified and corrected prior to the end-of-semester SRFI.

Recommendation #10 Peer review should be strengthened, including the use of peer reviewers external to the discipline. See our web site for resources.

Recommendation #11 Faculty should be encouraged to seek informal peer review to improve their teaching. This can be mutually beneficial!