Final Update on the New Faculty Course Evaluation & Online System, November 2003

Agenda
- Motivation for New Instrument
- Quick History of the Process
- Demonstration of "Report Back" Features
- Access for Various Audiences
- Template for Reporting to Promotion Committees
- Initial Implementation Period
- Issue of Response Rate
- Endorsements

Motivation for a New Instrument

Why are we doing this?
- Faculty complaints about the current FCE over the years
- The current FCE is not reflective of what faculty do in class
- Questions are too general and global to be informative
- Concern that the two overall questions are the only data used for promotion decisions

Quick History of the Process

Process
Fall 2001
- Committee of faculty & Eberly staff developed the instrument (Akin, Ambrose, Fay, Fischhoff, Kadane, Larkey, Nair)
- Protocol study with 20 students; revised the instrument accordingly
- Pilot study in 23 courses (n = 635 student respondents)
  - Half online, half on paper in class
  - In one large class, half online, half on paper
  - Online response rate = 42% (Spring 2000 in-class response rate = 52%)
  - Respondents from all colleges, freshmen through graduate students
  - Wide range of class types (e.g., small seminars, labs, studios, lectures, project courses)
Spring 2002
- Analyzed data and interviewed pilot faculty and students; revised the instrument accordingly
- Eberly & OTE continued development and testing of the online survey instrument

Process (cont.)
Fall 2002
- Met with former and current members of the university RPT committee (to discuss summative use of the data)
- Continued development of the online system
Spring 2003
- Pilot study: 46 courses (n = 490 student responses)
  - Online response rate = 46%
  - Respondents from all colleges, freshmen through graduate students
  - Wide range of class types (e.g., small seminars, labs, studios, lectures, project courses)
- Analyzed data and secured feedback from pilot faculty and students; revised accordingly
- Met with Executive Committee of Faculty Senate; revised accordingly

Process (cont.)
Fall 2003
- Met with Executive Committee of Faculty Senate
- Further review and revision of the instrument and online system
- Analysis of pilot data to examine relationships among items (regressions, factor analysis, etc.); see the sketch below
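
As one illustration of this analysis step, here is a minimal sketch of an inter-item analysis on pilot data. It assumes the pilot responses sit in a CSV with one row per student response; the column names (overall_instructor, learning_outcomes_avg, and so on) are hypothetical placeholders, not the actual FCE schema, and a factor analysis could be layered on in the same way.

```python
# Sketch only: correlations among items and a regression of the overall
# instructor rating on the three section averages. Column names are
# hypothetical placeholders for the pilot data described above.
import pandas as pd
import statsmodels.api as sm

responses = pd.read_csv("pilot_responses.csv")

items = ["overall_instructor", "overall_course",
         "learning_outcomes_avg", "instructor_behaviors_avg",
         "course_activities_avg"]

# Pairwise correlations among the overall questions and the section averages
print(responses[items].corr().round(2))

# How much of the overall instructor rating do the specific sections explain?
X = sm.add_constant(responses[["learning_outcomes_avg",
                               "instructor_behaviors_avg",
                               "course_activities_avg"]])
print(sm.OLS(responses["overall_instructor"], X, missing="drop").fit().summary())
```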

Our Concerns about Overall Questions
- Past and current promotion committees' emphasis on the two overall questions
- A large body of evidence shows that overall questions have no relationship to questions targeting specific instructor behaviors, learning outcomes, or course activities
- Our analyses from the pilot tests are consistent with these findings

Concerns about Overall Questions (cont.)
Research indicates that overall questions are prone to influence by a variety of factors:
- Course content (e.g., quantitative vs. non-quantitative)
- Anticipated grades
- Class size
- Physical attractiveness of the instructor
- Difficulty of the course
- Workload

Our Recommendation
- Based on the large body of data documenting bias in these measures, we initially excluded the overall questions.
- However, the response from faculty and department heads was overwhelmingly negative.
- Hence, under duress and against our better judgment, we re-inserted the overall questions. :(

Our Recommendations Regarding Use of FCEs
FCEs should be used in conjunction with other sources of data in a portfolio:
- Reflective statement from the faculty member
- Course syllabi
- Examples of assessments (exams, projects, homework, etc.)
- Samples of student work with feedback (essays, creative work, lab reports, etc.)
- Alumni letters/ratings
- Colleague letters (e.g., instructors in follow-up courses commenting on the preparedness of students)

Our Recommendations (cont.)
If the overall questions are used for promotion purposes, then at a minimum the section average ratings (Learning Outcomes, Instructor Behaviors, Course Activities) should also be included.

Advantages of the New Instrument
- More specific feedback on a broader array of relevant issues:
  - Student Effort
  - Learning Outcomes
  - Instructor Behavior
  - Course Activities
- Students in pilot studies indicated that the instrument was more likely to lead them to change their behavior; it forced them to reflect on their learning, which they believe is a positive experience

Advantages of the Online System
- Provides as little or as much data as faculty members want, including relationships among questions, responses by subcategory, etc.
- Results will be available as soon as grades are turned in
- Won't take class time
- Provides extended time to respond (may increase the response rate)
- Reduces current data entry errors

Demonstration of "Report Back" Features

Access

- Faculty Members
- Department Heads
- Deans
- Students
- Promotion Committees

Faculty, Department Heads, Deans
To date, all three groups have had full access to all information. This practice will be maintained. Access will include:
- Course enrollment and response rate
- Average ratings for each section (learning outcomes, instructor behaviors, course activities and resources)*
- Average ratings for overall questions (instructor and course)*
- Average ratings for individual items*
- Comparative statistics for the department and college on the above five items
- Relationships between items
- Filtering of responses by subgroup (e.g., by major, year, etc.)
- Student comments

* Also included are the response distribution, standard deviation, and median; a sketch of these per-item statistics follows below.
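
The per-item statistics listed above can be reproduced with a few lines of analysis code. The sketch below is illustrative only: the file name, the item_ column prefix, and the enrollment figure are assumptions, not part of the actual reporting system.

```python
# Sketch only: response rate plus mean, standard deviation, median, and
# response distribution for each rating item. All names are hypothetical.
import pandas as pd

responses = pd.read_csv("course_responses.csv")  # one row per student response
enrollment = 120                                 # hypothetical course enrollment

rate = len(responses) / enrollment
print(f"Response rate: {rate:.0%} ({len(responses)}/{enrollment})")

item_cols = [c for c in responses.columns if c.startswith("item_")]
for col in item_cols:
    ratings = responses[col].dropna()
    print(col,
          "mean =", round(ratings.mean(), 2),
          "sd =", round(ratings.std(), 2),
          "median =", ratings.median(),
          "distribution =", ratings.value_counts().sort_index().to_dict())
```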

Students
To date, students have had access to average ratings for all questions; they were not privy to comments. This practice will be maintained. Access will include:
- Average ratings for each section (learning outcomes, instructor behaviors, course activities and resources)
- Average ratings for overall questions (instructor and course)
- Average ratings for individual items

Promotion Committees
To date, promotion committees have utilized, among other data, averages of all questions and/or of the two overall questions. The Provost and Deans have agreed to pilot the following template, which includes:
- Course enrollment and response rate
- Average ratings for each section (learning outcomes, instructor behaviors, course activities and resources)
- Average ratings for overall questions (instructor and course)
- Comparative statistics for the department and college on the above five items

Example of Additional Information
Department heads could include, for courses with low [explainable] FCEs:
- Comparison to course ratings over time
- Comparison for the faculty member over time (to show improvement)
- Ratings by subgroup (e.g., majors vs. non-majors)
- Departmental comparisons to similar courses (e.g., large lecture, major-only, upper division, etc.)

Initial Implementation Period
The initial four-semester period (Fall 2004 through Spring 2006) will be used for:
- Analysis of data for recalibration of university, college, and department means
- Analysis of data from the instrument to investigate the relationships between ratings and irrelevant factors (anticipated grades, class size, workload, etc.); see the sketch below
- Analysis of data from the instrument and from students, faculty, department heads, deans, and promotion committees to revise guidelines and policies, if necessary
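
A minimal sketch of the second analysis, assuming the ratings have already been merged with course-level information; the dataset and column names (overall_course, anticipated_grade, class_size, workload_hours) are hypothetical.

```python
# Sketch only: regress the overall course rating on factors that should be
# irrelevant to teaching effectiveness. Small, non-significant coefficients
# would support continued use of the ratings; large ones would not.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("fce_with_course_info.csv")  # hypothetical merged dataset

model = smf.ols(
    "overall_course ~ anticipated_grade + class_size + workload_hours",
    data=data,
).fit()
print(model.summary())
```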

Response Rate Issue
- Student Senate will promote the new instrument with a media blitz
- We will utilize Blackboard, Portal, and … to prompt completion of the instrument each semester, including reminders for non-respondents
- We will extend the length of time available for completing course evaluations
- We will prompt faculty to remind students within their courses

Endorsements to Date
- Student Senate
- Executive Committee of Faculty Senate
- Department Heads
- Deans
- Provost and President
- University Education Council
- Graduate Student Organization
- Faculty Senate?