eSET. Stefani Dawn, Assistant Director of Assessment, Office of Academic Programs, Assessment and Accreditation.

Similar presentations
Promotion and Tenure Faculty Senate May 8, To be voted on.
Everything you wanted to know, but were afraid to ask……..
How to Understand and Interpret Reports in the IBTP Heather P. Wright Polk County Schools Created by Heather P. Wright, Polk County Schools.
THIS WORKSHOP WILL ADDRESS WHY THE FOLLOWING ARE IMPORTANT: 1. A comprehensive rationale for funding; 2. Measurable objectives and performance indicators/performance.
1 Selected Results from UNCG’s Sophomore and Senior Surveys Spring 2000 Office of Institutional Research UNCG Planning Council August 24, 2000 The University.
PEER REVIEW OF TEACHING WORKSHOP SUSAN S. WILLIAMS VICE DEAN ALAN KALISH DIRECTOR, UNIVERSITY CENTER FOR ADVANCEMENT OF TEACHING ASC CHAIRS — JAN. 30,
Evaluation of Training
Applying Assessment to Learning
Customisable Online Academic Skills Self-Assessment: Development and Initial Feedback Steve Briggs Nick Collis Nigel Upton.
Teresa Ryerse Overton. Suburbs of Washington DC; 5 Campuses and a separate Medical Education Campus; 78,000 Students; 2,600 Faculty and Staff; ~8,000.
New Web-Based Course Evaluation Services Available to Schools and Departments Presentation to Faculty Council November 6, 2009.
Institution Research Update John Porter AIRPO June 20, 2006.
E-Student Evaluation of Teaching Report to Faculty Senate January 12, 2012 Gita N. Ramaswamy, Director Academic Programs, Assessment, and Accreditation.
Conversion of Faculty Evaluations to an On-Line Format Catherine Hackett Renner SUNY Geneseo Larry Piegza Gap Technologies, Inc. OnlineCourseEvaluations.com.
Gauging the Impact of Student Characteristics on Faculty Course Evaluations Laura Benson Marotta University at Albany – SUNY.
Allocations for Temporary FTE and Teaching Assistants: How Do They Work? Presentation to ABA Tuesday, January 12, 2010 Kathy Farrelly Director, Budget.
AS /AA Clarification of the Formation, Dissolution, Merger or Movement of an Academic Department- Resubmission.
M I L L I K I N U N I V E R S I T Y Critical Writing, Reading & Research I & II MPSL First-Year Writing Requirement Report for Academic Year
Introduction to the MSP Management Information System Molly Hershey-Arista December 16, 2013.
Building a Compliance Risk Monitoring Program HCCA Compliance Institute New Orleans, April 19, 2005 Lois Dehls Cornell, Esq. Assistant Vice President, Deputy.
Registration Satisfaction Survey FAS Report, Fall Presented by: K. El Hassan, PhD. Director, OIRA.
Measuring for Success Module Nine Instructions:
How to Fill Out the CARD Form (Course Assessment Reporting Data Form)
Commission on Accreditation for Respiratory Care The Site Visitors Are Coming! Transitioning from Successful Self-Study to Successful Site Visit Bradley.
Mode Mean Range Median WHAT DO THEY ALL MEAN?.
WHAT DO THEY ALL MEAN?. 4 ways to describe data The Mean – The average number in a data set. The Median – The middle number in a data set….. 50% of the.
Blackboard 9.1 Presented by: Kim Shaver Associate Director of Educational Technology Assisted by : Alicia Harkless, Educational Technology Specialist,
THE 2011 VCC STUDENT CENSUS SURVEY Selected Findings for Overall Census Responses April 2012.
Measures of Central Location (Averages) and Percentiles BUSA 2100, Section 3.1.
EDIT 6900: Research Methods in Instructional Technology UGA, Instructional Technology Spring, 2008 If you can hear audio, click If you cannot hear audio,
University of Arkansas Faculty Senate Task Force on Grades Preliminary Report April 19, 2005.
Outcome Assessment Reporting for Undergraduate Programs Stefani Dawn and Bill Bogley Office of Academic Programs, Assessment & Accreditation Faculty Senate,
SW388R6 Data Analysis and Computers I Slide 1 Central Tendency and Variability Sample Homework Problem Solving the Problem with SPSS Logic for Central.
PORTFOLIO. Student Uses Students Collect, Reflect, Share Collect class requirements Collect graduation requirements Reflect on learning in class Reflect on.
Assessment tool OSCE AH Mehrparvar,MD Occupational Medicine department Yazd University of Medical Sciences.
Promotion and Tenure Faculty Senate June 12, 2014.
Average Percent of 1st & 2nd Year Students in Classes Under 50, by Type of University, Maclean's 2004.
Final Update on the New Faculty Course Evaluation & Online System November, 2003.
Evidence of Student Learning Fall Faculty Seminar Office of Institutional Research and Assessment August 15, 2012.
Integrate the Bacc Core category learning outcomes into the course. Clarify for students how they will achieve and demonstrate the learning outcomes.
Diversity Data Resources from the Office of Academic Planning and Institutional Research apir.wisc.edu/diversity.htm.
Lunch and Learn April 22, 2015 Office of Human Research.
Department of Research and Planning November 14, 2011.
MAE Curriculum Update G. Thompson April 13, 2012.
General Education Assessment Spring 2014 Quantitative Literacy.
Preserving Academic Integrity 1Rev: , NAO Academic Misconduct.
Educational Research: Competencies for Analysis and Application, 9 th edition. Gay, Mills, & Airasian © 2009 Pearson Education, Inc. All rights reserved.
Descriptive Research Study Investigation of Positive and Negative Affect of UniJos PhD Students toward their PhD Research Project Dr. K. A. Korb University.
Confronting and Reporting a Violation Assistance for this presentation was provided by: Camilla J. Roberts, Associate Director, Provost Office.
44 th NYSFAAA Conference “On Track for Excellence” Satisfactory Academic Progress Thomas J. Dalton, Excelsior College & Curt Gaume, Canisius College.
ASEE Profiles and Salary Surveys: An Overview
College Level Cooperatively Taught Information Literacy and Subject Area Course Background and Assignments.
Field of Dreams – If You Build It, Will They Come? PRESENTED BY DR. LARRY BUNCE DIRECTOR OF INSTITUTIONAL RESEARCH.
Graduate Student Teaching Graduate Council. Current Policy Graduate Student Teaching Students working toward graduate certificates or advanced degrees.
ESET: Combined Versus Uncombined Reports Stefani Dawn, PhD Assistant Director of Assessment Academic Programs, Assessment and Accreditation.
How to Understand, Interpret, and Use Student Data to Inform Instruction Heather P. Wright & David Bustos Polk County Schools Created by Heather P. Wright.
1 Mississippi Statewide Accountability System Adequate Yearly Progress Model Improving Mississippi Schools Conference June 11-13, 2003 Mississippi Department.
October 24, 2012 Jonathan Wiens, PhD Accountability and Reporting Oregon Department of Education.
Fall 2010 Program Quality Improvement Report Cameron University Adult and Continuing Education Associate of Science IDS CIP Code: Program.
Indiana University Kokomo Strategic Enrollment Management Consultation Final Report Bob Bontrager December 8, 2007.
© 2014, Florida Department of Education. All Rights Reserved. Developmental Education Accountability Connections Conference 2015 Division.
ESET Timing Close Study Results Stefani Dawn, PhD Assistant Director of Assessment.
CBU CALIFORNIA BAPTIST UNIVERSITY Assessment, Accreditation, and Curriculum Office CBU - OIRPA.
Terri Tommasone & Diana Abinader
of Teaching: Challenges
Gradebook Versatility in Moodle 3.0
NCAA Student-Athlete Eligibility
Course Evaluation Ad-Hoc Committee Recommendations
The Heart of Student Success
Distance Learning Benchmarking
Presentation transcript:

eSET. Stefani Dawn, Assistant Director of Assessment, Office of Academic Programs, Assessment and Accreditation.

Decisions for Faculty Senate

HIGHEST PRIORITY DECISIONS
- Minimum class size for evaluations
- Combined or uncombined reporting

eSET-RELATED ITEMS and PROCESSES
- Demographic data collection
- Courses to include or exclude in eSET
- Financial, resource, and technological limitations on configuring eSET for other evaluation approaches (formative evaluations or department-centric evaluations)

Minimum Class Size for Evaluations

OAR: "If Oregon State University solicits or accepts student survey evaluations of the classroom or laboratory performance of a faculty member, the survey evaluations shall be conducted anonymously…"

History: Requests were made to produce separate reports for slash courses. Separate reporting was implemented last term and quickly put on hold when it was discovered that some cross-listed sections had only a few students in them, even when the combined enrollment across all sections was large. The issue was taken to the Faculty Senate Executive Committee to make interim decisions that ensure the OARs are upheld.

Minimum Class Size for Evaluations

Interim decision: Courses with fewer than 6 students will not be evaluated.

Reference points:
- OSU guidelines
- The University of Oregon does not conduct evaluations in courses with 5 or fewer students.

Question: What should the number be to protect student anonymity?

Factors to consider: student input.
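To make the interplay between the threshold and combined reporting concrete, here is a minimal Python sketch. It is illustrative only: the course names, enrollments, and the names MIN_ENROLLMENT and eligible are hypothetical, not part of eSET; only the fewer-than-6 rule comes from the interim decision.

```python
# Interim rule (assumed here): courses with fewer than 6 students are not evaluated.
MIN_ENROLLMENT = 6

def eligible(enrollment: int) -> bool:
    """True if an evaluation report can be released under the interim rule."""
    return enrollment >= MIN_ENROLLMENT

# Hypothetical slash course with two cross-listed sections.
sections = {"XYZ 461 (undergraduate)": 5, "XYZ 561 (graduate)": 4}

# Reported separately, neither section clears the threshold...
print({name: eligible(n) for name, n in sections.items()})
# -> {'XYZ 461 (undergraduate)': False, 'XYZ 561 (graduate)': False}

# ...but the combined enrollment of 9 does.
print(eligible(sum(sections.values())))  # -> True
```

This is exactly the tension behind the next slide: separate reports can fall below the anonymity threshold even when the combined course is well above it.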

Reporting: Combined or Uncombined

There are advantages and disadvantages to combining reports.
- Combining reports preserves data. Example: in a two-section slash course where one section has 5 students and the other has 4, combining would allow the faculty member to receive the evaluations.
- Some faculty and departments prefer the combined data; others do not.
  - Differences in P & T forms/data per department
  - Graduate versus undergraduate
- Having separate reports and combining the data after the fact is not straightforward (see the sketch below).
  - The median calculation is not a standard approach and cannot easily be done in an Excel spreadsheet by the department or an individual instructor.
  - The Office of APAA and Enterprise Computing do not have the resources to combine reports that have been separated.
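A small illustration of why separate reports cannot simply be merged afterward: the pooled median must be computed from the raw responses, not from the section medians. The response data below are hypothetical, mirroring the 5-student and 4-student slash-course example above.

```python
from statistics import median

# Hypothetical ratings from the two sections of a slash course.
section_a = [5, 5, 4, 3, 5]  # 5 students
section_b = [4, 4, 3, 2]     # 4 students

print(median(section_a))  # 5
print(median(section_b))  # 3.5

# Averaging the two section medians does NOT give the pooled median...
print((median(section_a) + median(section_b)) / 2)  # 4.25

# ...which can only be computed from the combined raw responses.
print(median(section_a + section_b))  # 4
```

Note that this uses the ordinary median; eSET's interpolated median, described on the next slide, makes after-the-fact combination harder still.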

The Median Calculation

Selected by the Faculty Senate in 2004, based on "Measurement and Evaluation in Psychology and Education," Fourth Edition, by Robert L. Thorndike and Elizabeth P. Hagen. This calculation assumes that multiple values within a score's range are distributed evenly over the range.

Worked example: Suppose a distribution with 106 total responses, a cumulative frequency of 48 through score 4, and a cumulative frequency of 77 through score 5 (i.e., 29 responses of "5"). Half of 106 is 53, which falls between the cumulative frequencies 48 and 77, corresponding to scores 4 and 5. Since 53 − 48 = 5, we need 5 more cases from the "5" range. There are 29 responses of "5", so we need 5/29 of that range, or 0.2 (rounded from 0.17). Treating "5" as the range 4.5 – 5.5, we add 0.2 to 4.5 to get 4.7 as the median.
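A minimal Python sketch of this interpolated median, checked against the slide's worked example. Only the totals quoted above are known (106 responses, cumulative frequency 48 through score 4, 29 fives); the per-score frequencies below 5, and the assumption of a 6-point scale with 29 sixes, are hypothetical fillers chosen to reproduce those totals.

```python
def interpolated_median(freq):
    """Median treating each score s as the interval (s - 0.5, s + 0.5),
    with responses spread evenly across it (Thorndike & Hagen)."""
    total = sum(freq.values())
    half = total / 2
    cumulative = 0
    for score in sorted(freq):
        if cumulative + freq[score] >= half:
            # Fraction of this score's interval needed to reach half the cases.
            fraction = (half - cumulative) / freq[score]
            return (score - 0.5) + fraction
        cumulative += freq[score]
    raise ValueError("empty distribution")

# Hypothetical distribution matching the slide's totals:
# cumulative frequency 48 through score 4, 29 fives, 106 responses overall.
freq = {1: 4, 2: 10, 3: 14, 4: 20, 5: 29, 6: 29}
print(round(interpolated_median(freq), 1))  # -> 4.7, matching the slide
```

The exact value is 4.5 + 5/29 ≈ 4.672; the slide rounds the fraction to 0.2 before adding, which gives the same one-decimal result.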

Demographic Data

Previous demographic data collected:
- Grade: Audit
- Gender
- The reason you are enrolled in this course
- Grade you expect to receive in this course
- Class status
- Is this course your major?
- Percent of this class you attended
- Your overall grade point average

Students are still potentially identifiable by:
- Gender
- Class status (e.g., the only graduate student in an undergraduate course)

Demographic Data: Factors to Consider
- Do faculty use the data?
- Survey fatigue among students

Everything Is an "If-Then"

If you decide to combine reports:
- Then how will that impact departmental processes for P & T?

If you decide to have separate reports:
- Then what data will be lost? (Which is more important, the data or the separate reports?)
- Saving data and accumulating it over time is not an option, for two reasons:
  1. OAR: Survey instruments from which evaluation data are obtained shall be delivered to the faculty member.
  2. There are over 4,000 courses and more than 1,500 faculty evaluated per term; tracking this type of data is not feasible with the current FTE arrangements.
- Then how would reports be recombined (e.g., when a person has 7 different slash versions of a course)?
- The median calculation could be changed to a standard calculation.

If you decide to keep the demographic data:
- Then how might it compromise anonymity? Is the information really used and needed? Are there potential issues in the way questions are asked and in how students respond to them?
- Then how might it impact survey fatigue?