Using Student Ratings to Improve Program Quality and Student Learning
Dr. Randi Hagen, Flagler College
Dr. Sarah Logan, Angelo State University
Dr. William Pallett, The IDEA Center
Dr. Barry Stein, Tennessee Tech University

Introduction: IDEA Student Ratings
Presenters and content:
– Bill Pallett: IDEA & Program Assessment
– Randi Hagen: Student Learning Outcomes
– Barry Stein: QEP at Tennessee Tech
– Sarah Logan: Involving the Campus
Feel free to ask questions during the presentations. At the end, we want to hear your campus stories.

IDEA and Program Assessment: Aggregating Data
Assessment questions: IDEA as supporting evidence
– Are course emphases consistent with stated curricular purposes?
– Do courses' overall student progress ratings compare favorably to courses at other institutions?
– When a learning objective is selected as Essential or Important, does students' self-reported learning meet our expectations?

Assessment Questions
– What teaching methods might we employ more effectively to support student learning?
– How do students' work habits, motivation, etc. compare to students at other institutions?
– How do students view course work demands?
– What factors do instructors report have a positive/negative influence on student learning?

Longitudinal Questions
– Do results change over time in the desired direction?
– Does IDEA provide supporting evidence that innovations and interventions have been successful?
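As one hedged illustration of the first question, a campus could track term-by-term mean progress ratings and check the direction of change. The sketch below uses invented term labels and values; the presentation reports no numbers.

```python
# Check whether mean "progress on objectives" ratings move in the desired
# direction across terms. Term labels and values are invented for
# illustration; they are not data from the presentation.
term_means = [
    ("Fall 2004", 3.9),
    ("Spring 2005", 4.0),
    ("Fall 2005", 4.1),
]

scores = [mean for _, mean in term_means]
improving = all(later >= earlier for earlier, later in zip(scores, scores[1:]))
print("Trending in the desired direction:", improving)
```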

Student Learning Outcomes
What should a student know and do as a result of taking this course at Flagler College?

IDEA Summary of Your Teaching Effectiveness

Instructor’s Progress on Specific Objectives

Areas to Improve Your Teaching Effectiveness

Getting Started: Improving Your Teaching Effectiveness
Design your college's professional development program around the weaknesses of your faculty.
– Individual use: POD-IDEA Center Notes
– Individual or group use: IDEA Papers; IDEA Seminars

Use of Data File

Instructor   Discipline   Class #   Score
A            ACC
C            ACC
E            ACC
C            ACC
E            ACC
G            ACC
H            ANT
H            ANT
K            ANT
M            ART
N            ART
O            ART

Progress on Relevant Objectives: progress on those objectives selected by the instructor as Important or Essential to this class.
1 = No apparent progress
2 = Slight progress
3 = Moderate progress
4 = Substantial progress
5 = Exceptional progress
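A minimal sketch of aggregating such a data file by discipline, assuming a hypothetical CSV export (idea_scores.csv) with the columns shown above; the file name and column labels are assumptions for illustration, not part of the IDEA system.

```python
# Aggregate "Progress on Relevant Objectives" scores from a hypothetical
# CSV export with columns: Instructor, Discipline, Class, Score.
import pandas as pd

ratings = pd.read_csv("idea_scores.csv")

# Mean progress rating (1 = no apparent ... 5 = exceptional) and the
# number of classes contributing, per discipline.
by_discipline = ratings.groupby("Discipline")["Score"].agg(["mean", "count"])
print(by_discipline.round(2))
```

Comparable rollups by instructor or by course level would support the program-assessment questions listed earlier.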

Quality Enhancement Plan (QEP)
QEP and SACS Accreditation
– IDEA
– NSSE

NSSE and IDEA Relationship

NSSE Benchmarks:
– Level of Challenge
– Active/Collaborative Learning
– Student/Faculty Interaction

IDEA Objectives (use: Individual Report; Group Summary pp. 5-6):
– 3. Apply course material
– 4. Professional point of view
– 11. Analysis/critical evaluation
– 5. Work with others as a team

IDEA Items (use: Individual Report; Group Summary pp. 7-8):
– 8. Stimulate effort beyond most classes
– 15. Inspired to set own challenging goals
– 5. Teams/discussion groups
– 14. "Hands on" projects
– 16. Share ideas with others
– 18. Help each other understand
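Encoded as data, this crosswalk is two keyed lists alongside the NSSE benchmarks. The sketch below deliberately encodes no pairing between a specific benchmark and specific IDEA numbers, since the slide lists the columns side by side without explicit links.

```python
# NSSE benchmarks and the IDEA objectives/items used alongside them,
# transcribed from the slide. No benchmark-to-item pairing is encoded,
# because the slide does not state one explicitly.
NSSE_BENCHMARKS = [
    "Level of Challenge",
    "Active/Collaborative Learning",
    "Student/Faculty Interaction",
]

IDEA_OBJECTIVES = {  # use: Individual Report; Group Summary pp. 5-6
    3: "Apply course material",
    4: "Professional point of view",
    11: "Analysis/critical evaluation",
    5: "Work with others as a team",
}

IDEA_ITEMS = {  # use: Individual Report; Group Summary pp. 7-8
    8: "Stimulate effort beyond most classes",
    15: "Inspired to set own challenging goals",
    5: "Teams/discussion groups",
    14: '"Hands on" projects',
    16: "Share ideas with others",
    18: "Help each other understand",
}
```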

Using IDEA Results for the QEP at Tennessee Tech University
– Measuring Progress
– Selecting a QEP Topic
– Assessment Plan for the QEP
– Identifying Problem Areas

IDEA Teaching Evaluation Instrument
– Frequency goals are selected
– Progress on goals

Frequency of IDEA Objectives Selected

Progress on IDEA Teaching Objectives
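A sketch of how the two tallies behind these charts might be computed, assuming a hypothetical per-class export (idea_objectives.csv) with one row per class-objective pair; the file name and the column names Objective, Weight, and Progress are assumptions, not the IDEA Center's export format.

```python
# Tally how often each IDEA objective is selected as Important/Essential,
# and the mean student progress rating on it. File and column names are
# assumptions for illustration.
import pandas as pd

rows = pd.read_csv("idea_objectives.csv")

selected = rows[rows["Weight"].isin(["Important", "Essential"])]
summary = selected.groupby("Objective").agg(
    frequency=("Objective", "size"),
    mean_progress=("Progress", "mean"),
)
print(summary.sort_values("frequency", ascending=False))
```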

Assessment Plan Options – IDEA

Sample                   Compare
University-Wide Use      Frequency & Student Progress over time
Targeted Disciplines     Frequency & Student Progress over time
Targeted Courses         Old vs. New Course (Frequency & Progress)

Cost

IDEA Works with Other Assessments
– Enrolled Student Surveys (NSSE)
– Alumni Surveys
– Employer Surveys
– Performance Measures*

Involving the Campus: Angelo State University
Diagnostic form features
– Improvement of teaching
– Programs add items of interest to satisfy accrediting agencies and answer internal questions
Group reports
– External comparisons at a point in time
– Internal comparisons across time

Using Group Reports
Campus-wide
– The Provost reviews the university and college reports and compares results from multiple years
– Deans review college and department results
University, college, and department results for Excellent Teacher, Excellent Course, and Progress on Objectives are on a network drive for everyone to use.

Academic Departments
Meet to compare current department ratings to:
– External benchmark
– Former department ratings
– Current college ratings
Questions
– In what ways are comparisons meaningful?
– With what level of ratings are we satisfied?
– What are reasons for lower-than-anticipated ratings?
– In what areas do we want to improve?

Academic Programs
– Faculty teaching the same course discuss the choice of objectives (and list them on syllabi).
– Faculty with highly structured curricula outline objectives for each course level.
Questions
– What objectives are appropriate for certain courses?
– In what ways do objectives differ for upper- vs. lower-level courses so that students receive a well-rounded educational experience?

Summary: IDEA Student Ratings
Assessment of
– Students' perceived learning on course goals
– Efficacy of instructors' teaching methods
Provide evidence for
– Review and improvement of the core curriculum and academic programs
– Accreditation and other reporting
– Faculty development