1
“Transformative Assessment Case Study”
June 20, 2003
Gloria M. Rogers, Ph.D., Rose-Hulman Institute of Technology

Copyright [Gloria M. Rogers, Rose-Hulman Institute of Technology] [2003]. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.
2
Rose-Hulman Institute of Technology, Terre Haute, Indiana
- 1,800+ undergraduate students
- B.S. degrees in engineering, science, and mathematics
- 85%+ engineering students
- 18% female
3
Catalyst for assessment (note the absence of the word ‘transformative’)
- Regional accreditation
- ABET accreditation
- Absence of a coherent planning process
Although the need to have an “assessment plan” was the catalyst for action, assessment ‘emerged’ from the planning process: you can’t assess what you don’t have.
4
Assessing Student Learning
- Five strategic goals linked to the mission statement (Input, Quality, Climate, Learning Outcomes, Resources)
- Created dashboard indicators for the four ‘non-outcome’ goals
- Defined our assessment question for the student learning outcome: “Can students demonstrate the performance criteria at a level appropriate for a student who will graduate from Rose-Hulman?”
- Defined our assessment process
5
Principles that guided our assessment/evaluation processes
- AAHE 9 Principles of Good Practice for Assessing Student Learning (www.aahe.org/assessment/principl.htm)
- Faculty criteria for choosing assessment methods:
  - Direct measures of student learning
  - Results meet external demands
  - Non-intrusive (for students and faculty)
  - Easy
6
Student portfolios: the primary assessment tool
RosE-Portfolio developed:
- Student module, 1997
- Faculty rating module, 1998
- Curriculum mapping module, 1999
11
Example of Results
Is the submitted material at a level expected of a student who will graduate from RHIT? YES
1. Identify the readers/audience
2. Technical content
3. Audience response
4. Grammatically correct
12
Curriculum map response options:
- Outcome Explicit: this outcome is explicitly stated as a learning outcome for this course.
- Demonstrate Competence: students are asked to demonstrate their competence on this outcome through homework, projects, tests, etc.
- Formal Feedback: students are given formal feedback on their performance on this outcome.
- Not Covered: this outcome is not addressed in these ways in this course.

Outcomes rated (each marked “Yes” for at least one of the response options above):
1. Recognition of ethical and professional responsibilities.
2. An understanding of how contemporary issues shape and are shaped by mathematics, science, and engineering.
3. An ability to recognize the role of professionals in the global society and to understand diverse cultural and humanistic traditions.
4. An ability to work effectively in teams.
5. An ability to communicate effectively in oral, written, graphical, and visual forms.
6. An ability to apply the skills and knowledge necessary for mathematical, scientific, and engineering practices.
7. An ability to interpret graphical, numerical, and textual data.
8. An ability to design and conduct experiments.
9. An ability to design a product or process to satisfy a client’s needs subject to constraints.

Example performance criteria (ethics outcome):
1. Demonstrate knowledge of a code of ethics
2. Evaluate the ethical dimensions of a problem in their discipline
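To make the structure of a curriculum map concrete, here is a minimal sketch of how course-by-outcome coverage like the map above could be represented and summarized. All names here (the dataclass, the `summarize` helper, the sample course codes) are illustrative assumptions, not the actual RosE-Portfolio schema.

```python
# Hypothetical representation of a curriculum-map record; illustrative only.
from dataclasses import dataclass

# The four response options from the curriculum map above.
COVERAGE_LEVELS = ("Outcome Explicit", "Demonstrate Competence",
                   "Formal Feedback", "Not Covered")

@dataclass
class OutcomeCoverage:
    outcome: str
    level: str  # one of COVERAGE_LEVELS

def summarize(courses: dict[str, list[OutcomeCoverage]]) -> dict[str, int]:
    """Count how many courses address each outcome in any way
    (i.e., any response other than 'Not Covered')."""
    counts: dict[str, int] = {}
    for entries in courses.values():
        for e in entries:
            if e.level != "Not Covered":
                counts[e.outcome] = counts.get(e.outcome, 0) + 1
    return counts

# Hypothetical course codes and ratings:
courses = {
    "RH330": [OutcomeCoverage("Ability to work effectively in teams",
                              "Demonstrate Competence")],
    "EM103": [OutcomeCoverage("Ability to work effectively in teams",
                              "Outcome Explicit"),
              OutcomeCoverage("Ability to design and conduct experiments",
                              "Not Covered")],
}
print(summarize(courses))  # {'Ability to work effectively in teams': 2}
```

A summary like this is what lets departments see which outcomes are addressed in many courses and which fall through the cracks, which is how the map drives the departmental discussions described later in the deck.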
14
Transformation?
- Emerged from institutional vision, mission, culture, and context
- Focused on learning outcomes, processes, and purposes
- Qualitative in nature
- Based on an iterative, collaborative framework… explains the relationship between teaching/learning…
- Large-scale, systemic, and contagious
- Enabled by intelligent and appropriate technology
- Informed by assessment and a commitment to data-driven decision making
15
Assessing Transformation
- Assessment Purpose
- Data Acquisition and Analysis
- Application of Findings
- Dissemination
16
Yr. 2:
1) Assessment process emerged from the planning process and involved key campus constituents and external stakeholders; no impact on decision making.
2) Data documenting student learning came from multiple sources and were evaluated by multi-disciplinary faculty.
3) Results not used to reshape teaching and learning beyond isolated areas.
4) Results used only to change the assessment process itself.
Yr. 3: Rating rubrics refined, performance criteria more focused, curriculum map developed. Focus was primarily on process and on engaging more faculty.
Yr. 4: Curriculum map implemented; new conversations about student outcomes; data used to make course improvements. Some progress, but still isolated. Faculty leaders emerge.
Yr. 5: Departments use the curriculum map to drive discussions and decision making about alignment of courses to outcomes. Faculty-led seminars instituted to inform new faculty and to engage other faculty in discussions related to student outcomes and the RosE-Portfolio process.
Yr. 6: VP actively supporting the processes. Department heads ‘officially’ sign on to using RosE-Portfolio data as the primary data source for the ‘soft six’. Timeline developed that focused the data-collection effort.
Yr. 7: Faculty take ownership of the process; significant increase in student participation; faculty-led seminars well attended; departments include the data in their department assessment planning; institutional budget line for the portfolio process and rating.
18
Assessing Transformation
- Assessment Purpose
- Data Acquisition and Analysis
- Application of Findings
- Dissemination
19
Barriers to change
- Case for action was externally driven
- Unfamiliar processes (distinctions between classroom and institutional assessment)
- Uninformed faculty
- Uninterested students
- Uninvolved administrators
20
Assessing Change
- Faculty engagement (how many, in what ways)
- Student participation (who, in what ways)
- “Whine” meter
- Budgetary support
- Data requests
- Department agendas (changing conversations)
- Curriculum changes based on assessment results
21
Assessing Technology
- Pilot test each module before implementation (students, faculty, data users)
- Embedded assessment in the rating module (“log”)
- Focus groups with raters following each rating session
- Embedded assessment/feedback in the student module
- Embedded feedback in the curriculum map
22
Lessons Learned
- Define your assessment question!
- Develop robust assessment processes appropriate to the question (including closing the loop)
- You can’t do everything: pick your battles
- More data are not better
- Be clear about carrots and sticks
- Have a ‘technology owner’ (builds confidence)
- Faculty leadership and administrative support
- LEADERSHIP, LEADERSHIP, LEADERSHIP (You’re da Dean)