New Web-Based Course Evaluation Services Available to Schools and Departments Presentation to Faculty Council November 6, 2009.

Similar presentations
HERDING JAVELINAS : CoursEval Implementation at a Large Multi-Campus University.
University-Wide Course Evaluation Committee Peter Biehl, Chair, Department of Anthropology Krissy Costanzo, Committee Staff Support; Academic Affairs March.
Team 6 Lesson 3 Gary J Brumbelow Matt DeMonbrun Elias Lopez Rita Martin.
Strengthening Institutions Programs Title III
Institutional Course Evaluation Solution Faculty Senate Executive Committee September 12, 2012 Carol VanZile-Tamsen, Ph.D.; Associate Director, Office.
IN SUPPORT OF STUDENT INVOLVEMENT IN THE COURSE TRANSFORMATION PROGRAM Senate Resolution 1012.
Andrea Eastman-Mullins, Information & Technology Coordinator, UNC TLTC Dr. Ray Purdom, Director of the University Teaching and Learning Center, UNC Greensboro.
October Priority 8 Review Team 8: Planning Subcommittee M. DesVignes, D. Kinney, J. Moore, M. Siegel, R. Tillberg Collect and use data systematically.
Your Logo Here An Administrative Framework for the Blackboard Academic Suite Presented By Chris J Jones University of Oklahoma HSC April 13, 2005.
EXCELLENCE AT CAROLINA SACS REAFFIRMATION PROCESS APRIL 2008 Making Critical Connections Quality Enhancement Plan Annual Report #2 Faculty Council April.
Digital Measures Managing and Reporting on Faculty Accomplishments Steve Hare Project Manager Office of Institutional Research, Assessment, and Effectiveness.
1 Faculty Activity and Assessment Reporting Report to FAAR Advisory Group November 29, 2007.
The SACS Re-accreditation Process: Opportunities to Enhance Quality at Carolina Presentation to the Faculty Council September 3, 2004.
Improving the Online Evaluation Process and Response Rates SAIR 2011.
Copyright Statement © Jason Rhode and Carol Scheidenhelm This work is the intellectual property of the authors. Permission is granted for this material.
NLII Mapping the Learning Space New Orleans, LA Colleen Carmean NLII Fellow Information Technology Director, ASU West Editor, MERLOT Faculty Development.
May 18, Two Goals 1. Become knowledgeable about the QEP. 2. Consider your role in making the QEP a success.
So You Want to Switch Course Management Systems? We Have! Come Find Out What We’ve Learned. Copyright University of Oklahoma This work is the intellectual.
Analyzing and Improving College Teaching: Here’s an IDEA Alan C. Lacy, Associate Dean College of Applied Science and Technology Illinois State University.
Techniques for Improving Student Learning Outcomes Lynn M. Forsythe Ida M. Jones Deborah J. Kemp Craig School of Business California State University,
Cumberland County: May 28 Oak Ridge: June 2 Roane County: June 4 Scott: June 4 Campbell: June 9 Knox: June 10 Loudon: June 11.
Prince George’s Community College Online Express Training Faculty to Teach Online Mary Wells Margo Chaires Andrew Habermacher.
Assessment Surveys July 22, 2004 Chancellor’s Meeting.
Graduate Program Review Where We Are, Where We Are Headed and Why Duane K. Larick, Associate Graduate Dean Presentation to Directors of Graduate Programs.
2010 ASCCC Curriculum Institute Santa Clara Marriott July 8-10, 2010 PARALLEL PATHS: DISTANCE EDUCATION ADVANCED Kevin Bontenbal, Cuesta College Dolores.
Sara Kim, PhD, Director, Associate Professor Instructional Design and Technology Unit, UCLA David Geffen School of Medicine Katherine Wigan, BS, MBA, Senior.
Blackboard Next Generation (Version 9.1) Introduction to New Features Coming Summer 2011.
Managerial Role – Setting the Stage Lesson 6 Jeneen T. Chapman John Madden Facilitators.
The SACS Re-accreditation Process: Opportunities to Enhance Quality at Carolina Presentation to the Chancellor’s Cabinet September 28, 2004.
Enhancing the Environmental Health Content in Community Health Nursing Sharon Burt, RN, PHN, DNSc Nancy Sweeney, RN, PHN, DNSc.
Assessment & Evaluation Committee A New Road Ahead Presentation Dr. Keith M. McCoy, Vice President Professor Jennifer Jakob, English Associate Director.
Academic Assessment Task Force Report August 15, 2013 Lori Escallier, Co-Chair Keith Sheppard, Co-Chair Chuck Taber, Co-Chair.
Financing Online Education: An Integrated Learning Approach Kris Biesinger Assistant Vice Chancellor SHEEO Summer 2004.
Marco Ferro, Director of Public Policy Larry Nielsen, Field Consultant With Special Guest Stars: Tammy Pilcher, President Helena Education Association.
End of Course Evaluation Taimi Olsen, Ph.D., Director, Tennessee Teaching and Learning Center Jennifer Ann Morrow, Ph.D., Associate Professor of Evaluation,
LeBaron Woodyard Dean, Academic Affairs October 30, 2013 CALIFORNIA COMMUNITY COLLEGES CHIEF INSTRUCTIONAL OFFICERS FALL 2013 CONFERENCE.
Unlocking the door: The new Ellingsburg University Web Portal Seattle University Kristen Campbell, Julie Larsen, & Nancy Padgett.
Final Update on the New Faculty Course Evaluation & Online System November, 2003.
Key System Features and Next Steps. Features: Computer Adaptive Testing Adaptive assessment provides measurement across the breadth of the Common Core.
SETE ABOR POLICY ON STUDENT EVALUATIONS 6-221B. General Policy It is the policy of the Arizona Board of Regents that faculty shall be evaluated on their.
Students Course and Teacher Evaluation Please refer to notes at the bottom of this slide.
Assessment of Portal Options Presented to: Technology Committee UMS Board of Trustees May 18, 2010.
What could we learn from learning outcomes assessment programs in the U.S public research universities? Samuel S. Peng Center for Educational Research.
A Perspective on Student Learning at ECU It’s working! Marilyn Sheerer Provost and Senior Vice Chancellor.
ADVISORY COUNCILS Department of Education Bureau of Career and Technical Education.
ACADEMIC PLAN REPORT Faculty Council March 16, 2012 Bruce W. Carney Executive Vice Chancellor & Provost.
Visioning 2 Committee 15,800 by 2018 Joyce Armstrong, Family Sciences; Jessica Gullion, Sociology; Governor Jackson, Financial Aid; Mark Hamner, Institutional Research.
Curriculum at SCC and Role of the Senate Presented by Craig Rutan and Joyce Wagner SCC Academic Senate Fall 2013 Retreat.
Teaching Council Recommendations to the Faculty Senate DRAFT 2/9/09 & 3rd DRAFT Feb 13, 2009 for use by the FS Exec Committee, March 4, 2009.
Systems Accreditation Berkeley County School District School Facilitator Training October 7, 2014 Dr. Rodney Thompson Superintendent.
Blue Sun University Due to rising concerns from faculty, senior administrators, and the occurrence of lawsuits regarding MOOCs at other similar institutions,
IDEA Ad Hoc Committee Report Submitted by: Andreas Veh, Diane Erickson, Nelta Edwards, Kerri Morris UAA Faculty Senate, Dec. 7, 2007.
Assessing Information Literacy with SAILS Juliet Rumble Reference & Instruction Librarian Auburn University.
1 Learning Outcomes Assessment: An Overview of the Process at Texas State Beth Wuest Director, Academic Development and Assessment Lisa Garza Director,
Assessment Small Learning Communities. The goal of all Small Learning Communities is to improve teaching, learning, and student outcomes A rigorous, coherent.
Accreditation Self-Study Progress Update Presentation to the SCCCD Board of Trustees Madera Center October 5, 2010 Tony Cantu, Fresno City College Marilyn.
GAPS OF PRACTICE OF ACADEMIC ADVISING IN YOUR CAMPUS? HOW TO BRIDGE THEM? SPEAK TO THE NACADA ACADEMIC ADVISING CONSULTANT AND SPEAKER SERVICE Selma Haghamed.
Undergraduate Experiential Inquiry and Research at Rice.
Quantitative Literacy Across the Curriculum. Members of the QLAC Committee Beimnet Teclezghi – co-chair Laura Pannaman – co-chair Marilyn Ettinger John.
The NEW Distance Education Guidelines
Division of Talent and Performance
Course Evaluation Committee
Qualtrics Proposal Gwen Gorzelsky, Executive Director, TILT
Preparing for Promotion and Annual Review August 22, 2018
Course Evaluation Ad-Hoc Committee Recommendations
EDUCAUSE MARC 2004 E-Portfolios: Two Approaches for Transforming Curriculum & Promoting Student Learning Glenn Johnson Instructional Designer Penn State.
New Faculty Orientation Non-tenure-track Faculty Appointments
Faculty Governance at NU
COURSE EVALUATION Spring 2019 Pilot August 27, 2019.
Presentation transcript:

New Web-Based Course Evaluation Services Available to Schools and Departments Presentation to Faculty Council November 6, 2009

Course Evaluation at Carolina
UNC Board of Governors policy:
–Requires only that student evaluations of instruction be conducted “…at regular intervals” (at least one semester each year) and “on an ongoing basis” for each instructional faculty member.
Carolina does not have a policy mandating the use of a common method or instrument for student evaluation of instruction.
–Deans are authorized to determine how their courses will be evaluated.
–The Provost is committed to providing centralized services to support the basic course evaluation needs of schools and departments. However, use of these services – including the new product to be implemented this year – is optional.

Methods Currently Used By Professional Schools
Some have developed their own course evaluation processes to meet curriculum, accreditation, or other local needs. Examples:
–Medicine: Comprehensive system of web-based student evaluations tailored for each component of the MD curriculum.
–Business: Web-based system developed in-house to administer common evaluation form.
–Law: Paper survey with open-ended questions distributed in class.
–Others: Customized solution developed by outside vendor.

Current Services Offered Centrally: Carolina Course Evaluations
Standard instrument designed by the 1999 Provost’s Task Force on Student Evaluation of Teaching in response to concerns about the validity of the old Carolina Course Review for use in tenure and promotion decisions.
Instrument contained sets of standard items to provide:
–Instructors with feedback to help improve teaching skills,
–Administrators with data for use in personnel decisions,
–Students with information to aid course selection.
Also intended to include:
–Flexibility to allow departments, schools, and instructors to choose items that would be diagnostic for improving teaching and/or applicable for certain course settings.
–Separate distribution of results to intended audiences: administrators, instructors, and students.
–Ongoing evaluation of psychometric properties.

Development of Current Services
Task Force report endorsed by Faculty Council and Student Government in 1999.
Additional Faculty Council resolution in 2001 to provide funding for complete implementation.
Center for Teaching and Learning charged with implementation.
Two Formats:
–Traditional paper (bubble sheet) version administered in class and taken to Information Technology Services (ITS) for scanning.
–Web-based version developed by ITS in 2006 and piloted with a few schools.
Feedback collected from instructors, administrators, and students by the Office of Institutional Research & Assessment.

Limitations of Current Paper and Web-Based Processes
Many of the flexible features originally intended to allow instructors and departments to add items for instructional improvement purposes were not fully implemented in either the paper or web-based system.
–Example: An item bank was available in the web version, but it was not possible for instructors to create their own items.
One-size-fits-all design – could not adjust the format for multiple instructors or schedule different evaluation periods to meet the needs of courses with non-traditional session lengths.
Required core questions did not appear to be relevant to distance education courses.
Specialized reports for instructors, administrators, and students were difficult to deliver to their intended audiences.
Very resource-intensive.
Would require reprogramming to work with PeopleSoft.

Search for a New System or Services
Course Evaluation Advisory Committee appointed and charged by the Provost to work with the Office of Institutional Research & Assessment (OIRA) to identify a new web-based evaluation system or services from an outside vendor.
Reviewed feedback from pilot studies, surveys, and interviews with campus users to establish requirements.
Researched commercial products and services, surveyed peer institutions, and investigated open-source solutions (e.g., Sakai) as well as the future capacity of Blackboard and PeopleSoft to support course evaluation.
Course Response™ by Digital Measures offered the best combination of functionality, ease of administration, and cost.
Very positive feedback from the advisory committee and academic unit representatives attending demonstrations this summer.

Features of Course Response™
Flexible functions that can be set centrally or controlled at the school, department, or course section level to:
–Customize the contents of evaluation instruments; add items supplied by the Provost, dean, chair, or instructor.
–Schedule evaluations when needed.
–Evaluate multiple instructors, TAs, guest lecturers, etc.
–Define who will receive which results (within limits set by University Counsel).
–Create custom reports and do ad hoc analysis of results.
A hosted, turn-key solution:
–Vendor provides all services, including software, helpdesk, and security, in an IBM-owned and managed data center.
–Students, faculty, and administrators enter through existing campus portals and authenticate using their Onyen.
–Anonymity of student responses; FERPA-compliant.
–Quick implementation; can be ready to use with fall 2009 courses.
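The slide describes layered configuration: campus-wide settings that a school, department, or individual section can extend with its own items, schedule, and result recipients. The transcript does not show how that layering works inside the vendor’s product, so the sketch below is purely illustrative and assumes nothing about Digital Measures’ actual software; the class names, fields, and sample items are invented for this example.

```python
# Hypothetical sketch of layered course-evaluation configuration.
# None of these names come from Course Response; they only illustrate how
# centrally set defaults could be extended at lower organizational levels.
from dataclasses import dataclass, field
from typing import List


@dataclass
class EvalItem:
    text: str
    source: str  # who supplied the item, e.g. "provost", "dean", "chair", "instructor"


@dataclass
class EvalConfig:
    items: List[EvalItem] = field(default_factory=list)
    open_date: str = ""                      # evaluation window start (ISO date)
    close_date: str = ""                     # evaluation window end
    result_recipients: List[str] = field(default_factory=list)

    def extended_with(self, extra_items, extra_recipients=None):
        """Return a new config that keeps inherited settings and adds local ones."""
        return EvalConfig(
            items=self.items + list(extra_items),
            open_date=self.open_date,
            close_date=self.close_date,
            result_recipients=self.result_recipients + list(extra_recipients or []),
        )


# Campus-wide defaults set centrally (all values are made up for the example).
campus = EvalConfig(
    items=[EvalItem("Overall, this course was well organized.", "provost")],
    open_date="2009-11-30",
    close_date="2009-12-11",
    result_recipients=["department chair"],
)

# A course section inherits the campus defaults and adds an instructor-written item.
section = campus.extended_with(
    [EvalItem("The weekly labs helped me understand the lectures.", "instructor")],
    extra_recipients=["instructor"],
)

print(len(section.items))          # 2: the core item plus the instructor's item
print(section.result_recipients)   # ['department chair', 'instructor']
```

The only point of the sketch is that lower-level additions accumulate on top of inherited defaults rather than replacing them, which matches the slide’s description of items supplied by the Provost, dean, chair, or instructor.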

Costs
Total annual cost to Carolina is based on the number of schools participating and faculty FTE.
Cost per school ranges from $2,500 (< 25 FTE) to $4,500 (100+ FTE).
Discounts for additional schools and for a three-year agreement.
Estimated total annual cost for the campus: $35,000 – $40,000, to be funded by the Provost’s Office.
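As a rough sanity check on that estimate, the per-school tiers can be tallied for an assumed mix of participating units. In the sketch below, only the $2,500 and $4,500 endpoints come from the slide; the middle tier price, the counts of schools in each tier, and the omission of the multi-school and three-year discounts are illustrative assumptions.

```python
# Back-of-the-envelope tally under assumed tier prices and school counts.
# Only the $2,500 (< 25 FTE) and $4,500 (100+ FTE) endpoints appear on the slide;
# the middle tier and the number of schools per tier are guesses for illustration,
# and the unspecified multi-school / three-year discounts are ignored.
tier_price = {"under_25_fte": 2_500, "mid_size": 3_500, "over_100_fte": 4_500}
assumed_schools = {"under_25_fte": 4, "mid_size": 5, "over_100_fte": 2}

total = sum(tier_price[tier] * count for tier, count in assumed_schools.items())
print(f"Estimated annual cost before discounts: ${total:,}")
```

With this assumed mix of eleven units the tally comes to $36,500, which sits inside the $35,000 – $40,000 range quoted on the slide.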

Implementation
Participants for Fall 2009 include:
–School of Nursing
–School of Education (selected courses)
–School of Journalism & Mass Communication
–College of Arts & Sciences – Departments of Romance Languages, Music, Sociology, and others
Training and support provided for faculty and staff, with the eventual goal of self-management by academic units.
January 2010 – OIRA will collect user feedback and work with schools to plan for the spring 2010 term.
For more information, contact Larry Mayes or Lynn Williford in the Office of Institutional Research & Assessment.