COURSE EVALUATION Spring 2019 Pilot August 27, 2019

Course Evaluation Ad-Hoc Committee
Co-Chairs: Amy Thompson and Christine Fox
Faculty Advisors on Methodology and Analysis: Svetlana Beltyukova and Christine Fox
The charge of the Course Evaluation Ad-Hoc Committee was to review best practices and procedures for assessing student perceptions of faculty teaching and to make recommendations for a standardized data collection process and a standardized set of common core assessment questions.

Survey Development Procedures
- Reviewed the published literature on best practices in course evaluation
- Reviewed existing course evaluations from each of the UT colleges as well as from some peer institutions
- Conducted a thematic analysis to determine common constructs that reflected best teaching practices
- Developed 12 core questions and 3 open-ended questions

Survey Pilot Procedures
- Feedback and approval were sought from the ad-hoc committee and faculty governing bodies
- Individual faculty members, department chairs, and college deans were contacted to solicit volunteers for the Spring 2019 pilot
- The core questions were administered electronically via Blackboard to participating courses, departments, and colleges
- Each participating department and college was able to include additional questions tailored to its own needs

Survey Pilot Results
The core questions were administered in Spring 2019 to almost 4,000 students (undergraduate and graduate) across 9 colleges.

Psychometric Analysis
Psychometric analyses were conducted:
- For the overall sample (n = 3,893)
- By college
- By course level (graduate and undergraduate)
- By method of delivery (distance learning and face-to-face)
Purposes:
- To determine whether computing a composite course evaluation score was meaningful and justifiable
- To determine whether any questions needed to be analyzed and reported separately
- To determine the extent to which the meaning of the composite score was stable across colleges, student levels, and methods of course delivery
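The slides do not name the specific psychometric method the committee used, so the following is only a minimal illustrative sketch of how such subgroup checks might be run, using a classical reliability approach (Cronbach's alpha and corrected item-total correlations). The column names q1–q9, college, level, and delivery, and the file pilot_responses.csv, are hypothetical.

```python
# Illustrative sketch only: classical-test-theory checks on hypothetical pilot data.
# It is NOT the committee's reported methodology, which the slides do not specify.
import pandas as pd

ITEMS = [f"q{i}" for i in range(1, 10)]  # the 9 rated core questions

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are item responses."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def item_total_corr(items: pd.DataFrame) -> pd.Series:
    """Corrected item-total correlation: each item vs. the sum of the other items."""
    return pd.Series(
        {c: items[c].corr(items.drop(columns=c).sum(axis=1)) for c in items.columns}
    )

responses = pd.read_csv("pilot_responses.csv")            # hypothetical input file
print("alpha (overall):", round(cronbach_alpha(responses[ITEMS]), 3))
print(item_total_corr(responses[ITEMS]))                   # a low value flags a misfitting item

# Rough check of whether the composite behaves similarly across subgroups
for group_col in ["college", "level", "delivery"]:
    for name, grp in responses.groupby(group_col):
        print(group_col, name, round(cronbach_alpha(grp[ITEMS]), 3))
```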

Results of Psychometric Analysis
- A composite score based on 8 of the 9 questions would be meaningful and justifiable.
- Question 1 ("I put forth my best effort in this course") needs to be analyzed and reported separately, as it does not fit either conceptually or statistically with the other questions.
- The remaining 8 questions form 4 meaningful clusters/themes:
  - Cluster 1: Clear expectations and fair grading
  - Cluster 2: Climate
  - Cluster 3: Feedback
  - Cluster 4: Teaching strategies
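A companion scoring sketch, reflecting the reported result that Q1 is summarized on its own while the composite averages the remaining 8 items. The item-to-cluster assignment shown here is an assumption for illustration only; the slides name the clusters but do not list which questions belong to each.

```python
# Hypothetical scoring of the pilot items: Q1 reported separately, composite from the
# other 8 items, plus one subscore per cluster. The CLUSTERS mapping is assumed.
import pandas as pd

CLUSTERS = {
    "expectations_and_fair_grading": ["q2", "q8"],
    "climate": ["q3", "q5"],
    "feedback": ["q6", "q7"],
    "teaching_strategies": ["q4", "q9"],
}

def score(responses: pd.DataFrame) -> pd.DataFrame:
    scored = pd.DataFrame(index=responses.index)
    scored["q1_effort"] = responses["q1"]                     # analyzed and reported separately
    composite_items = [q for items in CLUSTERS.values() for q in items]
    scored["composite"] = responses[composite_items].mean(axis=1)
    for name, items in CLUSTERS.items():                      # one subscore per theme
        scored[name] = responses[items].mean(axis=1)
    return scored
```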

Steps in Course Evaluation Question Revisions
- Think-aloud interviews with 27 graduate and undergraduate students (saturation was reached) representing the following colleges: Arts & Letters, Education, Engineering, HHS, Natural Sciences & Math, Nursing, Medicine
- Review of the think-aloud data and question revision discussion by the survey committee members

Course Evaluation Questions

With the original wording, students immediately thought about the syllabus and did not understand the term 'learning outcomes.' The intent of this question is to emphasize both expectations for performance and clear communication throughout the semester.
Original Q2: The learning outcomes and expectations for performance were clearly communicated throughout the semester.
Revised Q2: Expectations for performance were clearly communicated throughout the semester.

Students interpreted "being encouraged and supported" differently. The question needed to make clear that it was about the course, and "motivated" better captures the intent, since encouragement and support result in motivation.
Original Q3: I felt encouraged and supported to do my best work.
Revised Q3: In this course, I felt motivated to do my best work.

First, the question asks about two different things: the variety of approaches and the needs of all students. Second, students felt they could not speak to everyone's needs. Third, the intent is not the variety of approaches but whether the approaches used helped students succeed in the course.
Original Q4: A variety of teaching approaches were used to meet the needs of all students.
Revised Q4: The teaching approaches used supported my learning needs.

The question should focus on the learning environment rather than on feeling comfortable. Because some students do not feel comfortable speaking up, the original wording made the question about the student rather than the course.
Original Q5: I felt comfortable expressing my views and ideas in this course.
Revised Q5: The course provided a comfortable environment for expressing views and ideas.

Students thought these questions were very similar, as both related to feedback and adjusting performance. The suggestion was to shorten each question and focus it on a single component: feedback timeliness (Q6) and feedback quality (Q7).
Original Q6: I received feedback on my work promptly and in time to adjust my performance in this class.
Original Q7: Feedback I received from the instructor was helpful in improving my performance in the course.

Question revisions based on this feedback:
Revised Q6: I received feedback on my work within a reasonable timeframe.
Revised Q7: The quality of the feedback on my work helped my learning.

Students were unclear whether this question was about everyone being treated the same, getting the grade they deserved, or understanding why they earned that grade. Subsequent discussion with the committee clarified that the intent was the grading criteria and students' understanding of the grade they earned.
Original Q8: The grading in the course was fair.
Revised Q8: The grading in the course fairly reflected the quality of my work.

Students felt that many required courses would "suffer" from this question, since students may be required to take a course in which they already know most of the material. Many also felt the question was vague and should instead address engagement with the course material, its relevance, and so on.
Original Q9: I learned a lot in this course.
Revised Q9: Overall, I had a good learning experience in this course.

Next Steps
- Fall 2019: 'soft launch' across the university using Campus Labs software
- Spring 2020: full implementation
- Department and college questions can be added

THANK YOU