Rose-Hulman Institute of Technology

Presentation transcript:

Assessment Exploration Days: Using (Electronic) Portfolios for Assessment
Julia M. Williams, Associate Professor of English and Coordinator of Technical Communication
Rose-Hulman Institute of Technology

Using Portfolios for Assessment
Portfolios, ABET, and Collecting Data on Student Learning Outcomes
Portfolio Design: Five Principles and the Case of Rose-Hulman Institute of Technology
Using Portfolios to Assess Engineering Communication
Electronic Portfolios
The Future of Portfolios in Engineering Education

Portfolios, ABET, and Collecting Data on Student Learning Outcomes
Promises and Pitfalls of Portfolios as an Assessment Method

Fateful Words
“The assessment process must demonstrate that the outcomes important to the mission of the institution and the objectives of the program, including those listed above [ABET a-k], are being measured. Evidence that may be used includes, but is not limited to, the following: Student Portfolios . . .” (ABET Homepage)

Portfolio Purposes
Importance of clear understanding of outcomes desired from portfolio use
Clear purpose
Design driven by desired outcome and skills to be assessed
Growth model
Showcase
Hybrid

Why should we do portfolios?
What do we want to measure?
In what ways are we currently measuring it?
What will portfolios show us about it that we don’t know already?
Who will teach students to use portfolios?
How will we maintain portfolios?
Who will assess the portfolios?
Who wants this information?
What will we do with the information?

Portfolio Design: Five Principles and the Case of RHIT
Defining, Identifying, Correlating, Facilitating, and Assessing

Five Portfolio Principles
Defining the learning objective (communication)
Identifying appropriate skills and mapping the curriculum to locate where they should be developed
Correlating learning activities to program and course objectives
Facilitating students’ work with portfolios
Assessing student learning for improvement (curricular, program, faculty, student)
Principles in action at RHIT

Rose-Hulman Institute of Technology
Terre Haute, Indiana
1500+ undergraduate students
B.S. degrees in engineering, science, and mathematics
80%+ engineering students

Portfolio Project Context
Institute Project: RosE-Portfolio
Piggyback course portfolio onto Institute portfolio
Solving a student learning disconnect
Student perception: class work unrelated to assignments
Lack of reflection on learning generally

Defining the Learning Objective

Defining Engineering Communication: Institute Level
When given the opportunity, students will:
1. Identify the readers/audience for a communication task by assessing their technical knowledge and information needs.
2. Organize/design information to meet readers/audience needs.
3. Provide technical content that is factually correct, supported with evidence, explained with sufficient detail, and properly documented.
4. Test audience response to communication tasks to determine how well ideas have been relayed.
5. Submit work with a minimum of errors in spelling, punctuation, grammar, and usage.

Defining Engineering Communication: Course Level
1. Familiarity with the forms of communication appropriate to workplace communication
2. Ability to apply organizational patterns to structure technical information
3. Ability to perform audience analysis as the basis for planning and completing any communication task
4. Ability to locate information and assess its accuracy
5. Consistent use of proper spelling, grammar, punctuation, and mechanics
6. Experience with testing audience response through Peer Review Workshops
7. Familiarity with various technologies useful in the completion of communication tasks
8. Ability to complete an electronic technical writing portfolio suitable for presentation during job interviews

Importance of Measurable Learning Objectives
Facilitates meaningful evaluation
Provides a common language
Use tools to facilitate the process
You cannot do it all
Prioritize
Involve your key constituents
Communicate with students

Identifying/Mapping Engineering Communication Objectives

Identifying/Mapping Engineering Communication Objectives
Curriculum Map
RH 330 Technical Communication
Required of juniors in CE, ECE, ME, AO
Teaches principles of effective engineering communication
Preparation for writing required in Senior Design

Student and Faculty Roles in Portfolio Design and Use
Be clear about how faculty are involved in the design and use of the portfolio, regardless of format
Meet their needs for information
Be sensitive to workload
Be as non-intrusive as possible
Involve students in meaningful ways
Be clear about what’s in it for students
The process should reinforce and be aligned with the educational process

Correlating Learning Objectives

Correlating Learning Objectives
Materials that students submit: Audience Analysis Worksheet, Final Tech Comm Report
Course Objectives/Task Matrix (see the sketch below)
Use of portfolios in class
Identifying appropriate materials for submission
Importance of the reflective statement: a brief paragraph that makes the “case” for the relevance of the submitted material to the learning objective
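The Course Objectives/Task Matrix can be kept as a small data structure and inverted to confirm that every learning objective has at least one submitted artifact behind it. The sketch below is a minimal illustration, not the RosE-Portfolio implementation: the objective wordings paraphrase the Institute-level criteria from the earlier slide, and the "Peer Review Workshop notes" artifact is hypothetical.

```python
# Minimal sketch of a course objectives/task matrix; not the RosE-Portfolio
# implementation. Objective wordings paraphrase the five Institute-level
# communication criteria; "Peer Review Workshop notes" is a hypothetical artifact.
objectives = {
    1: "Identify readers/audience and assess their knowledge and information needs",
    2: "Organize/design information to meet readers/audience needs",
    3: "Provide correct, supported, sufficiently detailed, documented content",
    4: "Test audience response to the communication task",
    5: "Submit work with a minimum of errors in spelling, punctuation, grammar, usage",
}

# Which submitted materials a student has tagged against which objectives.
submissions = {
    "Audience Analysis Worksheet": [1, 2],
    "Final Tech Comm Report": [2, 3, 5],
    "Peer Review Workshop notes": [4],  # hypothetical artifact
}

# Invert the matrix: for each objective, list the artifacts (if any) that address it.
coverage = {num: [] for num in objectives}
for artifact, nums in submissions.items():
    for num in nums:
        coverage[num].append(artifact)

for num, artifacts in sorted(coverage.items()):
    status = ", ".join(artifacts) if artifacts else "NOT COVERED"
    print(f"Objective {num}: {status}")
```

An objective that ends up with no artifact behind it flags a submission or curricular gap before any rating takes place.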

Student and Faculty Roles
Who is going to review/rate portfolio contents?
Who is going to be responsible for analysis of the data?
What mechanisms do you have in place to evaluate the data?
How do the results relate to your educational delivery strategies, both curricular and co-curricular?

Facilitating Students’ Work with Portfolios

Facilitating: Reflection Statement
Making a case for the relevance of the submitted material to the learning objective
Students pause to reflect on what exactly they are being asked to do and why
Practice in making arguments: what the student has done and how it meets “company” goals

Questions That the Reflection Statement (RS) Answers
What did I do?
What does it mean?
What have I learned?
How might I do things differently?
Why is it relevant to this objective?

Facilitating: Reflection Statement
Fall Quarter 2000: students go through directed motions to complete course requirements; poor reflection
Winter 2000-01: changes to course portfolio practice

RS Samples: Fall 2000
Criterion 1: Identify readers, assess technical knowledge/information needs
Student A: “This document was prepared in order to analyze the audience for a technical communication paper. It looks at two different audiences and analyzes them with regards to several different aspects.”

RS Samples: Fall 2000
Criterion 1: Identify readers, assess technical knowledge/information needs
Student B: “This document speaks for itself. It details the work that went into writing the technical paper, ‘Recognition of Context-Free Grammars’ for a specific audience.”

RS Samples: Fall 2000
Student C: “This is one of the main purposes of the document I've created. Upon deciding a topic for the tech comm paper, I had to decide who I would write for. My choice was to write for farmers, so I considered how much technical information they would have on the topic. I knew that farmers would have uses for sensors in agriculture, but I also understood that many didn't have college educations, let alone much technical background on how some equipment worked. I understood that they would want some information, but not extremely technical information on how sensors worked.”

RS Samples: Winter 2000-01
Student D: In this document, I looked at two main groups of potential readers for my Tech Comm paper. Those groups were professional experts, and 3rd or 4th year Rose-Hulman students. For the experts, I identified a need to demonstrate my understanding of the topic. There was no reason to teach them anything, as they were most likely very familiar with the research topics I addressed. I also recognized a demand for in depth technical explanations to again prove comprehension on my part.

RS Samples: Winter 2000-01
Student D (cont.): For the students, I immediately recognized the usual response of reluctance towards any long, technical paper. In this audience analysis, I outlined strategies such as using impressive color images, targeted section headings, and understandable analogies to help accomodate [sic] the student reader. Most of the technical terms were considered to be at the level of the student, and so were not explained unless clearly foreign.

Assessing Student Learning for Improvement

Assessment of Student Material
Faculty work in teams; each team assesses one learning objective
Score holistically
Emerging rubrics:
Does the reflective statement demonstrate or argue for the relevance of the submitted material to the criterion?
Is the submitted material at a level expected of a Rose-Hulman graduate?

Assessment
Development of scoring rubrics
Which ones, how often
Linked to performance criteria
Known to students
Scales consistent with purpose of assessment (i.e., student/program)
Interrater reliability (see the sketch below)
Feedback (students/program)
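Interrater reliability for yes/no rubric questions like the two above can be checked with a chance-corrected agreement statistic. The following is a generic sketch of Cohen's kappa for two raters, not the procedure used at Rose-Hulman; the sample ratings are invented.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Inter-rater agreement for two raters scoring the same set of portfolios.

    Ratings are category labels (e.g., "yes"/"no" on a rubric question).
    Returns Cohen's kappa: 1.0 = perfect agreement, 0.0 = chance-level agreement.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)

    # Observed agreement: fraction of portfolios the raters scored identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's marginal rating frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(
        (freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b)
    )

    if p_expected == 1.0:  # both raters used a single category throughout
        return 1.0
    return (p_observed - p_expected) / (1 - p_expected)

# Invented example: two faculty raters answering
# "Is the reflection relevant to the criterion?" for six portfolios.
a = ["yes", "yes", "no", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

Percent agreement alone can look high simply because most portfolios receive a "yes"; kappa corrects for that chance agreement.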

Assessing Student Learning with Electronic Portfolios
A Demonstration

Show Me!

Example of Results
Reflection relevant to criterion?
Expected for R-HIT graduate?

Example of Results
Is the submitted material at a level expected of a Rose-Hulman graduate?
Appropriate for audience
Organization
Content factually correct
Test audience response
Grammatically correct

Linking Results to Practice
Development of Curriculum Map
Linking curriculum content/pedagogy to knowledge, practice, and demonstration of learning outcomes

Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Communication Skills

Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Ethics
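The two results slides above originally carried charts; only their titles survive in this transcript. As a hedged illustration of how such tallies can be produced from a course-level curriculum map, the sketch below counts, for each outcome, how many courses report providing knowledge (K), practice (P), or a demonstration (D) of it, the linkage named on the "Linking Results to Practice" slide. The map entries are invented, not the Fall 1999-2000 data; only RH 330 is a real course from this deck.

```python
# Hypothetical course-level map; the real Fall 1999-2000 map covered 181 courses/labs.
# Each course reports, per outcome, whether it provides Knowledge (K), Practice (P),
# and/or a Demonstration (D) of that outcome.
curriculum_map = {
    "RH 330":  {"Communication": {"K", "P", "D"}, "Ethics": set()},
    "ME 461":  {"Communication": {"P", "D"},      "Ethics": {"K"}},   # invented course
    "ECE 362": {"Communication": {"P"},           "Ethics": set()},   # invented course
}

def tally(outcome):
    """Count how many courses report each level of coverage for an outcome."""
    counts = {"K": 0, "P": 0, "D": 0, "none": 0}
    for outcomes in curriculum_map.values():
        levels = outcomes.get(outcome, set())
        if not levels:
            counts["none"] += 1
        for level in levels:
            counts[level] += 1
    return counts

for outcome in ("Communication", "Ethics"):
    print(outcome, tally(outcome))
```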

Closing the Loop: the annual assessment cycle (Winter, Spring, Summer, Fall)
The Evaluation Committee receives and evaluates all data; it makes a report and refers recommendations to the appropriate areas.
The Institute acts on the recommendations of the Evaluation Committee.
Reports of actions taken by the Institute and the targeted areas are returned to the Evaluation Committee for iterative evaluation.
The Institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to Department Heads.

Using Electronic Portfolios for Assessment
Julia M. Williams, Associate Professor of English and Coordinator of Technical Communication
Rose-Hulman Institute of Technology

Assessment Method Truisms (the Develop, Implement, Revise cycle)
There will always be more than one way to measure any outcome.
No single method is good for measuring a wide variety of different student abilities.
There is generally an inverse relationship between the quality of measurement methods and their expediency.
Pilot testing is important to see if a method is a good fit for your program (students and faculty).

Portfolios are Fun!
My presentation: http://www.rose-hulman.edu/~williams/sdsmt.html
My email: julia.williams@rose-hulman.edu
RosE-Portfolio site: http://www.rose-hulman.edu/IRA/IRA/index.html (click on the RosE-Portfolio link)

Portfolio Bibliography
Accreditation Board for Engineering and Technology, 2000, http://www.abet.org/
Belanoff, 1994, Blair Resources for Teaching Writing: Portfolios, Prentice Hall.
Belanoff and Dickson, eds., 1991, Portfolios: Process and Product, Boynton/Cook Publishers.
Murphy and Grant, 1996, “Portfolio approaches to assessment: breakthrough or more of the same?”, Assessment of Writing: Politics, Policies, Practices, ed. White, Lutz, and Kamusikiri, Modern Language Association.
Paulson, Paulson, and Meyer, 1991, “What makes a portfolio a portfolio?”, Educational Leadership, 48 (5), pp. 60-63.
Rogers and Williams, 1999, “Building a better portfolio,” ASEE Prism, January, pp. 30-32.
Yancey and Weiser, eds., 1997, Situating Portfolios: Four Perspectives, Utah State University Press.