Time to Go Online!

Why go online? Paper vs. Online

• Our current paper process costs about $25,000 per year, plus staff time.
• An online process would be virtually free: no processing costs and very little staff time after the first iteration.

• The timing of paper evaluations is not optimal; they must be conducted in mid-November to allow processing time.
• Online evaluations could be conducted at any time during the last two weeks of class.

• Our current paper process takes considerable staff time and effort:
  - generating course lists
  - soliciting and approving two sections per faculty member
  - printing and sorting forms
  - distributing, collecting, and logging forms.
• An online process would eliminate nearly all processing time.

• The current paper process is not used to evaluate part-time faculty; doing so would be time- and cost-prohibitive.
• An online evaluation process would be available to both full- and part-time faculty and would allow us to use the same instrument for both.

• The current paper process allows for evaluation of only two sections per full-time faculty member per year, a sample size of about 20%.
• An online process would allow for evaluation of every class, every faculty member, every semester: a sample size of about 53%, given average return rates (see the rough arithmetic sketched below).
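A minimal sketch of that arithmetic, assuming the roughly ten sections per full-time faculty member per year implied by the "two sections = about 20%" figure; the 53% return rate is the cross-institution average cited on the Myth #1 slide:

    # Hypothetical numbers inferred from the slide above, not institutional data.
    sections_per_year = 10          # assumed annual teaching load implied by the 20% figure
    paper_sections_evaluated = 2    # current policy: two sections per faculty member
    online_return_rate = 0.53       # average online return rate reported at other universities

    paper_sample = paper_sections_evaluated / sections_per_year   # 2 / 10 = 20%
    online_sample = 1.0 * online_return_rate                      # every section evaluated, ~53% respond

    print(f"Paper sample:  {paper_sample:.0%}")    # -> 20%
    print(f"Online sample: {online_sample:.0%}")   # -> 53%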

• Results from paper evaluations are slow to arrive, and distributing them is unwieldy.
• Online evaluation results would be available as soon as grades are submitted and would be directly accessible by faculty.

• Our paper process uses about 95,000 sheets of paper per year, not to mention a whole bunch of large envelopes.
• An online process is environmentally responsible.

• Myth #1: Return rates on online course evaluations are abysmally low.
• Fact: The average return rate at other universities is about 53%. That compares to an average paper return rate of 78% and a current MC sample rate of less than 20%.

• Myth #2: If evaluations are online, only the outliers (the very satisfied or very dissatisfied) will respond.
• Fact: Other institutions report that even when response rates are lower, the faculty member's average rating does not change.

• Myth #3: If evaluations are online, students will rush through them even more than they do with paper evaluations.
• Fact: Research shows that, when provided a free-response section, students write longer comments on electronic forms than on paper forms.

• Myth #4: Many of our students don't have access to computers for completing their evaluations.
• Fact: A 2003 report indicated that 92% of Maryland households had computer access to the Internet. Plus, students can always use a lab on campus.

• Myth #5: MC will have to use draconian measures, such as withholding final grades, to get students to complete the online evaluations.
• Fact: Many schools are getting very respectable return rates with awareness campaigns, friendly competitions, incentives, and other positive strategies.

• What is the best timeline for implementing this change?
• Would it be possible to reduce the number of evaluation questions from 25 to 10?
• If so, what is the best mechanism for selecting the 10 most important questions?
• What strategies should we use to increase return rates? What should we avoid?

• What quality safeguards or comparative data would make faculty more comfortable with this change?
• What are the best venues for soliciting faculty suggestions?
• What other questions or concerns should we be addressing?

• Please contact your dean with ideas, concerns, or other input!

• Corragio, James, and Magaly Tymms. "Transiting to an Online Course Evaluation Model: The Online Student Survey of Instruction." St. Petersburg College, St. Petersburg, FL. February.
• "Facilitating Response Rates in IDEA Online." IDEA Center, Manhattan, KS. August.
• "Information on University-wide Course Evaluations." University of Maryland, College Park, MD.

• Miller, Mary Helen. "Online Evaluations Show Same Results, Lower Response Rates." The Chronicle of Higher Education. 6 May.
• "Online Faculty and Course Evaluation FAQ." Ball State University, Muncie, IN. cms.bsu.edu/About/AdministrativeOffices/Provost/FacResources/CrseResponseFAQs.aspx
• "Pilot Study for Assessing the Viability of Using Online Course Evaluations at California State University Sacramento." Sacramento, CA. 8 October.

• Sorenson, Lynn, and Trav Johnson. "Online Student Ratings of Instruction." Brigham Young University, Salt Lake City, UT. 12 April.
• Thorpe, Stephen W. "Online Student Evaluation of Instruction: An Investigation of Non-Response Bias." Paper presented at the 42nd Annual Forum of the Association for Institutional Research, Toronto, Canada. June.

• Questions or Concerns?