“Transformative Assessment Case Study” June 20, 2003. Gloria M. Rogers, Ph.D., Rose-Hulman Institute of Technology.

Presentation transcript:

“Transformative Assessment Case Study” June 20, 2003 Gloria M. Rogers, Ph.D. Rose-Hulman Institute of Technology Copyright [Gloria M. Rogers, Rose-Hulman Institute of Technology] [2003]. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.

Rose-Hulman Institute of Technology
- Terre Haute, Indiana
- undergraduate students
- B.S. degrees in engineering, science, and mathematics
- 85%+ engineering students
- 18% female

Catalyst for assessment (note the absence of the word 'transformative')
- Regional accreditation
- ABET accreditation
- Absence of a coherent planning process
- Although the need to have an "assessment plan" was the catalyst for action, assessment 'emerged' from the planning process: you can't assess what you don't have

Assessing Student Learning
- Five strategic goals linked to the mission statement (Input, Quality, Climate, Learning Outcomes, Resources)
- Created dashboard indicators for the four 'non-outcome' goals
- Defined our assessment question for the student learning outcomes goal: "Can students demonstrate the performance criteria at a level appropriate for a student who will graduate from Rose-Hulman?"
- Defined our assessment process

Principles that guided our assessment/evaluation processes
- AAHE 9 Principles of Good Practice for Assessing Student Learning
- Faculty criteria for choosing assessment methods:
  - Direct measures of student learning
  - Results meet external demands
  - Non-intrusive (for students and faculty)
  - Easy

Student portfolios: the primary assessment tool
- RosE-Portfolio developed:
  - Student Module (1997)
  - Faculty Rating Module (1998)
  - Curriculum Mapping Module (1999)
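To make the three modules concrete, here is a minimal data-model sketch in Python. The record names, fields, and types are illustrative assumptions, not the actual RosE-Portfolio schema.

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical records for each RosE-Portfolio module; names and fields
    # are illustrative assumptions, not the system's actual schema.

    @dataclass
    class Submission:          # Student Module (1997): students file evidence
        student_id: str
        outcome: str           # learning outcome the artifact is submitted against
        artifact_path: str     # the uploaded document
        submitted_on: date

    @dataclass
    class Rating:              # Faculty Rating Module (1998): raters judge evidence
        submission_id: str
        rater_id: str
        meets_graduation_level: bool  # the single yes/no assessment question
        comment: str = ""

    @dataclass
    class MapEntry:            # Curriculum Mapping Module (1999): outcome coverage
        course_id: str
        outcome: str
        coverage: str          # "explicit" | "competence" | "feedback" | "not covered"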

Example of Results
Assessment question: Is the submitted material at a level expected of a student who will graduate from RHIT? YES
Performance criteria rated:
1. Identify the readers/audience
2. Technical content
3. Audience response
4. Grammatically correct
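A tally like the one above can be computed mechanically once ratings are exported. The sketch below assumes a simple list-of-dicts shape for the rating data; that shape, like the figures, is invented for illustration.

    from collections import Counter

    def pass_rates(ratings):
        """Share of 'yes' judgments for each performance criterion."""
        yes, total = Counter(), Counter()
        for r in ratings:  # each r: {"criterion": str, "meets_level": bool}
            total[r["criterion"]] += 1
            yes[r["criterion"]] += r["meets_level"]
        return {c: yes[c] / total[c] for c in total}

    # Illustrative data only
    ratings = [
        {"criterion": "Identify the readers/audience", "meets_level": True},
        {"criterion": "Technical content", "meets_level": True},
        {"criterion": "Technical content", "meets_level": False},
    ]
    print(pass_rates(ratings))
    # {'Identify the readers/audience': 1.0, 'Technical content': 0.5}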

Curriculum Map: for each outcome, faculty indicate how their course addresses it.
- Outcome Explicit: this outcome is explicitly stated as being a learning outcome for this course.
- Demonstrate Competence: students are asked to demonstrate their competence on this outcome through homework, projects, tests, etc.
- Formal Feedback: students are given formal feedback on their performance on this outcome.
- Not Covered: this outcome is not addressed in these ways in this course.

Outcomes mapped (each with a "view criteria or make a comment" option):
1. Recognition of ethical and professional responsibilities.
2. An understanding of how contemporary issues shape and are shaped by mathematics, science, and engineering.
3. An ability to recognize the role of professionals in the global society and to understand diverse cultural and humanistic traditions.
4. An ability to work effectively in teams.
5. An ability to communicate effectively in oral, written, graphical, and visual forms.
6. An ability to apply the skills and knowledge necessary for mathematical, scientific, and engineering practices.
7. An ability to interpret graphical, numerical, and textual data.
8. An ability to design and conduct experiments.
9. An ability to design a product or process to satisfy a client's needs subject to constraints.

Example performance criteria for outcome 1:
1. Demonstrate knowledge of code of ethics
2. Evaluate the ethical dimensions of a problem in their discipline
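One payoff of a map like this is that curricular gaps become queryable. Below is a minimal sketch, assuming the entry shape from the earlier data-model sketch, that flags outcomes no course in the curriculum addresses explicitly.

    def uncovered_outcomes(entries, all_outcomes):
        """Outcomes never marked 'explicit' in any course, i.e. curricular gaps."""
        explicit = {e["outcome"] for e in entries if e["coverage"] == "explicit"}
        return sorted(set(all_outcomes) - explicit)

    # Illustrative entries; course IDs and coverage values are assumptions
    curriculum_map = [
        {"course": "RH330", "outcome": "Communicate effectively", "coverage": "explicit"},
        {"course": "EM104", "outcome": "Work effectively in teams", "coverage": "competence"},
    ]
    outcomes = ["Communicate effectively", "Work effectively in teams",
                "Design a product or process"]
    print(uncovered_outcomes(curriculum_map, outcomes))
    # ['Design a product or process', 'Work effectively in teams']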

Transformation?
- Emerged from institutional vision, mission, culture, and context
- Focused on learning outcomes, processes, and purposes
- Qualitative in nature
- Based on an iterative, collaborative framework that explains the relationship between teaching and learning
- Large-scale, systemic, and contagious
- Enabled by intelligent and appropriate technology
- Informed by assessment and a commitment to data-driven decision making

Assessing Transformation
- Assessment Purpose
- Data Acquisition and Analysis
- Application of Findings
- Dissemination

Yr. 2:
1. Assessment process emerged from the planning process and involved key campus constituents and external stakeholders; no impact on decision making.
2. Data documenting student learning collected from multiple sources and evaluated by multi-disciplinary faculty.
3. Results not used to reshape teaching and learning beyond isolated areas.
4. Results only used to change the assessment process itself.

Yr. 3: Rating rubrics refined, performance criteria more focused, curriculum map developed. Focus was primarily on process and on engaging more faculty.

Yr. 4: Curriculum map implemented; new conversations about student outcomes; data used to make course improvements. Some progress, but still isolated. Faculty leaders emerge.

Yr. 5: Departments use the curriculum map to drive discussions and decision making about alignment of courses to outcomes. Faculty-led seminars instituted to inform new faculty and engage other faculty in discussions related to student outcomes and the RosE-Portfolio process.

Yr. 6: VP actively supporting processes. Department heads 'officially' sign on to using RosE-Portfolio data as the primary data source for the 'soft six' (the six non-technical ABET outcomes). Timeline developed that focused the data collection effort.

Yr. 7: Faculty take ownership of the process; significant increase in student participation; faculty-led seminars well attended; departments include the data in their department assessment planning; institutional budget line for the portfolio process and rating.

Assessing Transformation
- Assessment Purpose
- Data Acquisition and Analysis
- Application of Findings
- Dissemination

Barriers to change
- Case for action was externally driven
- Unfamiliar processes (distinctions between classroom and institutional assessment)
- Uninformed faculty
- Uninterested students
- Uninvolved administrators

Assessing Change
- Faculty engagement (how many, in what ways)
- Student participation (who, in what ways)
- "Whine" meter
- Budgetary support
- Data requests
- Department agendas (changing conversations)
- Curriculum changes based on assessment results
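Several of these indicators reduce to simple counts tracked over time. A hypothetical sketch follows; the indicator and all figures are invented for illustration.

    # Year-over-year trend in an engagement indicator; all numbers invented.
    faculty_raters = {2: 12, 3: 18, 4: 25, 5: 41, 6: 58, 7: 74}  # year -> count

    def yoy_change(series):
        """Percent change in the indicator from each year to the next."""
        years = sorted(series)
        return {y: 100 * (series[y] - series[y - 1]) / series[y - 1]
                for y in years[1:]}

    print(yoy_change(faculty_raters))
    # {3: 50.0, 4: 38.9, 5: 64.0, 6: 41.5, 7: 27.6} (values rounded)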

Assessing Technology
- Pilot test each module before implementation with:
  - Students
  - Faculty
  - Data users
- Embedded assessment in the rating module (a usage "log")
- Focus groups with raters following each rating session
- Embedded assessment/feedback in the student module
- Embedded feedback in the curriculum map
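The embedded "log" could be as lightweight as appending timestamped rater events to a file. This is a hypothetical sketch of such instrumentation, not the rating module's actual mechanism.

    import json, time

    def log_event(logfile, rater_id, event, **details):
        """Append one timestamped rater action as a JSON line."""
        record = {"ts": time.time(), "rater": rater_id, "event": event, **details}
        with open(logfile, "a") as f:
            f.write(json.dumps(record) + "\n")

    # e.g., capture how long a rater spent before reaching a decision
    log_event("rating_sessions.jsonl", "rater42", "rating_submitted",
              submission="S-1031", seconds_on_task=210, decision="yes")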

Lessons Learned
- Define your assessment question!
- Develop robust assessment processes appropriate to the question (including closing the loop)
- You can't do everything
  - Pick your battles
- More data are not better
- Be clear about carrots and sticks
- Have to have a 'technology owner' (build confidence)
- Faculty leadership / administrative support
- LEADERSHIP, LEADERSHIP, LEADERSHIP (You're da Dean)