ABET: Assessing Program Outcomes
Amir Rezaei
Outline
− Context of assessment
− Process of assessment
− Similarities and differences between classroom and program assessment
− Developing measurable outcomes
− Developing scoring rubrics
− Assessment methods: direct and indirect measures
− Data collection
− Closing the loop
− Writing the report
Context of Assessment

Input (What comes into the system?)
− Students: student background
− Faculty: faculty background
− Resources: educational resources

Process (What are we doing with the inputs?)
− Students: programs and services offered; population served
− Faculty: teaching loads; class sizes
− Resources: policies, procedures, governance

Output (How many?)
− Students: grades; graduation rates; employment statistics
− Faculty: publication counts; faculty development activities; credit hours delivered
− Resources: statistics on resource availability and participation rates

Outcomes (What is the effect?)
− Students: what students have learned; what skills they have gained; what attitudes they have developed
− Faculty: publication citation data; faculty development
− Resources: student learning and growth

− Assessing inputs and process establishes only the capability or capacity of the program.
− Assessing outputs serves as an indirect measure of effectiveness.
− Assessing outcomes provides a direct measure of effectiveness.
Process of Assessment
− Can we demonstrate that students have learned outcome xx to an appropriate level by the time of graduation? Collect data at graduation.
− Can we demonstrate that we have added value to student learning of outcome xx by the time of graduation? Collect data pre and post.
ABET Terms

Objectives
− Definition: broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve.
− Other terms for the same concept: goals, outcomes, purpose, etc.

Outcomes
− Definition: statements that describe what students are expected to know and be able to do by the time of graduation.
− Other terms: objectives, standards, etc.

Performance Criteria
− Definition: specific, measurable statements identifying the performance(s) required to meet the outcome; the evidence.
− Other terms: standards, rubrics, specifications, metrics, outcomes, etc.

Assessment
− Definition: the process of identifying, collecting, using, and preparing data that can be used to evaluate achievement.
− Other term: evaluation.

Evaluation
− Definition: the process of reviewing the results of data collection and analysis and determining the value of the findings and the actions to be taken.
− Other term: assessment.

ABET uses these terms; use the same language to reduce confusion.
Similarities and differences between classroom and program assessment
− Degree of complexity
− Time span
− Accountability for the assessment process
− Cost
− Level of faculty buy-in
− Level of precision of the measure
Classroom Assessment
Subject: Statics
Topics: statics of particles; equivalent systems of forces; equilibrium of rigid bodies; structures; friction
Concepts: forces in 2D and 3D; moment of a force about ...; equilibrium in 2D and 3D; free-body diagrams; trusses, frames, and machines; friction
Assessment focus:
− Evaluate individual student performance (grades)
− Evaluate teaching/learning
Context: subject matter; faculty member; pedagogy; student; facility
Timeline: one quarter
Program Assessment
Factors feeding into the educational objective:
− Environmental factors: institutional context
− Coursework and curricular patterns: classes chosen; major
− Out-of-class experiences: co-curricular activities; co-ops; internships
− Classroom experiences: pedagogy; facility; faculty and student characteristics
− Student pre-college traits
Timeline: xx years
Developing measurable outcomes
Objective: work effectively with others
Outcome: ability to function on multidisciplinary teams
Performance criteria:
− Researches and gathers information
− Fulfills duties of team roles
− Shares work equally
− Listens to other teammates
− Makes contributions
− Takes responsibility
− Values other viewpoints
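One way to keep this objective-to-criteria hierarchy measurable is to record it explicitly so each performance criterion can later be scored. A minimal sketch in Python; the class names and structure are illustrative, not an ABET format:

```python
# Illustrative only: recording an objective -> outcome -> performance-criteria
# hierarchy so each criterion can be scored individually later.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    statement: str
    performance_criteria: list[str] = field(default_factory=list)

@dataclass
class Objective:
    statement: str
    outcomes: list[Outcome] = field(default_factory=list)

teamwork = Objective(
    statement="Work effectively with others",
    outcomes=[
        Outcome(
            statement="Ability to function on multidisciplinary teams",
            performance_criteria=[
                "Researches and gathers information",
                "Fulfills duties of team roles",
                "Shares work equally",
                "Listens to other teammates",
                "Makes contributions",
                "Takes responsibility",
                "Values other viewpoints",
            ],
        )
    ],
)

for outcome in teamwork.outcomes:
    print(outcome.statement)
    for criterion in outcome.performance_criteria:
        print(" -", criterion)
```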
Developing scoring rubrics
A rubric is a set of categories that defines and describes the important components of the work being completed, critiqued, or assessed.
Purposes:
− Information to/about individual student competence (analytic)
− Overall examination of the status of the performance of a group of students (holistic)
Developing scoring rubrics
Generic rubrics:
− Big-picture approach
− Element of subjectivity
Task-specific rubrics:
− Single task
− Focused approach
− Less subjective
Note: you do not have to develop a rubric for every outcome.
Rubric template: a minimal sketch follows.
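As a rough template, an analytic rubric can be represented as per-criterion ratings on a shared scale, which also roll up to one holistic number. A minimal sketch, assuming a 1-4 scale and a pass threshold of 3.0 (both are illustrative choices, not ABET requirements):

```python
# Illustrative analytic rubric: each performance criterion is rated on a
# shared 1-4 scale; the per-criterion (analytic) scores also average into
# a single holistic result.
RUBRIC_LEVELS = {1: "Unacceptable", 2: "Developing", 3: "Satisfactory", 4: "Exemplary"}

def score_student(ratings: dict[str, int], threshold: float = 3.0) -> tuple[float, bool]:
    """Average per-criterion ratings and compare against a holistic threshold."""
    for criterion, level in ratings.items():
        if level not in RUBRIC_LEVELS:
            raise ValueError(f"{criterion}: level {level} is not on the 1-4 scale")
    average = sum(ratings.values()) / len(ratings)
    return average, average >= threshold

avg, met = score_student({
    "Researches and gathers information": 4,
    "Fulfills duties of team roles": 3,
    "Shares work equally": 2,
    "Listens to other teammates": 3,
})
print(f"average = {avg:.2f}; outcome met: {met}")
```

Keeping one scale across criteria is what lets the same ratings serve both purposes from the previous slide: reported per criterion they are analytic, averaged they are holistic.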
Assessment methods
− Written surveys and questionnaires
− Exit and other interviews
− Commercial, standardized exams
− Locally developed exams
− Archival records
− Focus groups
− Portfolios
− Simulations
− Performance appraisal
− External examiner
− Oral exams
− Behavioral observations
Some of these methods are scored with a rubric; others require no rubric.
Direct and Indirect Measures
− Direct measures provide for the direct examination or observation of student knowledge or skills against measurable learning outcomes.
− Indirect measures ascertain the opinion or self-report of the extent or value of learning experiences.
Direct measures:
− Exit and other interviews
− Standardized exams
− Locally developed exams
− Portfolios
− Simulations
− Performance appraisal
− External examiner
− Oral exams
− Behavioral observations

Indirect measures:
− Written surveys and questionnaires
− Exit and other interviews
− Archival records
− Focus groups
Sampling and Data Collection
For program assessment, sampling is acceptable and desirable for programs of sufficient size. A sampling sketch follows the cycle below.

Four-year assessment cycle:
− Year 1: define outcomes; map the curriculum
− Year 2: data collection
− Year 3: evaluation and design of implementation
− Year 4: implement improvements; continue data collection
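A minimal sketch of drawing a simple random sample of student artifacts for program-level scoring; the sample size, seed, and artifact IDs are illustrative, and stratifying by course section or instructor is a common refinement not shown here:

```python
# Illustrative only: score a manageable random subset of student artifacts
# rather than every submission, as program-level sampling allows.
import random

def sample_artifacts(artifact_ids: list[str], n: int, seed: int = 2024) -> list[str]:
    """Return a reproducible simple random sample of n artifact IDs."""
    if n >= len(artifact_ids):
        return list(artifact_ids)      # small program: assess everything
    rng = random.Random(seed)          # fixed seed keeps the draw auditable
    return rng.sample(artifact_ids, n)

all_reports = [f"ME402-report-{i:03d}" for i in range(1, 121)]  # hypothetical course
print(sample_artifacts(all_reports, n=25))
```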
Closing the Loop
1. The institute assessment committee prepares reports of the collected data (e.g., survey results, e-portfolio ratings) for submission to department heads.
2. The evaluation committee receives and evaluates all data, makes a report, and refers recommendations to the appropriate areas.
3. The institute acts on the recommendations of the evaluation committee.
4. Reports of the actions taken by the institute and the targeted areas are returned to the evaluation committee for iterative evaluation.