What to Expect When You’re Expecting a Visitor from ABET
August 22, 2016

Today’s Learning Objectives
At the end of this discussion, you should be able to:
- List your responsibilities and tentative schedule for the ABET visit
- Describe our previous assessment process
- Describe our future assessment process (and why we changed it)
- List the primary responsibilities of each group in the assessment process
- Develop a rubric for assessing the achievement of a learning outcome

The Big Question
Do we have a good process that:
1. Measures student performance on established outcomes
2. Assesses the results of these measurements to identify real or potential issues
3. Responds by identifying and taking corrective action aimed at improving performance on outcomes
4. Closes the loop by returning to step 1 to measure and reassess the outcomes
There should be a similar process for program objectives.
Perfection is the goal of the process, but not the expectation of ABET.

ABET Visit Schedule & Responsibilities
ME Program Evaluator (PEV): Prof. Imin Kao, Stony Brook University
Sunday, October 2, 2016
- Activity: Review of course materials, tours of facilities
- Responsible: Chair and ABET coordinator to meet with the PEV
- Responsible: Faculty to prepare materials, clean up labs, give lab tours
Monday, October 3, 2016
- Activity: Meetings with faculty, staff, and students (to be arranged)
- Responsible: Faculty, staff, and students
Tuesday, October 4, 2016 (first half of day)
- Activity: More meetings with faculty, staff, and students (to be arranged)

Purpose of Faculty Meetings with Visitor
- Evaluate the overall quality and attitude of the faculty regarding undergraduate education
  - Passion and enthusiasm about the program and about our students are important, but be honest
  - Complaining or whining about known issues does not help, but these issues should be acknowledged
  - We are expected to discuss our challenges, but also be prepared to discuss what we are doing to address them
- Evaluate the faculty’s understanding of:
  - Program Educational Objectives (PEOs) and how they are determined and assessed
  - Student Outcomes and how they are determined and assessed
- Provide examples of improvements (closing the loop)!
- A cheat sheet will be provided

Enrollment Growth & Teaching Capacity
Page 3 of our ABET self-study report:
“The above activities have been accomplished in an environment of extreme enrollment growth in the BSME program. Since 2010, total enrollment in the program has approximately doubled, while the number of faculty has stayed constant. The teaching capacity has been increased through the hiring of three full-time teaching specialists, each of whom teaches three sections per semester. Additional tenure-track faculty and teaching specialist hires are expected in the next few years.”
Update:
- Two teaching specialists and one faculty member recently left the program
- This summer we hired three teaching specialists and one part-time instructor as a result of additional college support

Problem Solving Skills
Page 49 of our ABET self-study report, referring to outcome assessment results:
“… the low scores are largely due to deficiencies in the process of solving for internal loads and stress states at a point. This issue needs further attention. Informal discussions with other ME 471 instructors confirmed the weakness noted above, which is also consistent with the most common mistake in the assessment performed in ME 361. As a result, discussions are underway to modify the series of mechanics and design courses to improve these skills.”
Response:
- A new approach to teaching ME 222 will be attempted during fall 2016, involving aspects of mastery learning
- The mechanical design courses ME 371 and ME 471 are being restructured
- Recitation sections in core courses are being introduced next year to emphasize problem-solving skills and practice

Our Objectives
Program Educational Objectives (PEOs) describe the expected accomplishments of our students several years after graduation.
Our graduates will:
- Be competent and ethical engineers practicing in a diverse range of activities
- Use their mechanical engineering education as a stimulus for personal and professional growth
- Be recognized for their capability, creativity, and application of knowledge
- Be independent and critical thinkers who identify problems and develop effective solutions

Assessment of Our Objectives
1. Groups are surveyed: faculty, students, alumni, Board of Advisors, capstone design sponsors
2. Processed survey data is reviewed by the MEUCC
3. The MEUCC reports to the faculty with recommended changes to the PEOs or to the assessment process
4. The faculty discusses and approves changes to the PEOs or to the assessment process
Administered by the Associate Chair for Undergraduate Studies.

Our Student Outcomes
Student outcomes describe the expected capabilities of our students at graduation.
Our graduates will have:
(a) An ability to apply knowledge of mathematics, science, and engineering
(b) An ability to design and conduct experiments, as well as to analyze and interpret data
(c) An ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability
(d) An ability to function on multidisciplinary teams
(e) An ability to identify, formulate, and solve engineering problems
(f) An understanding of professional and ethical responsibility
(g) An ability to communicate effectively
(h) The broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context
(i) A recognition of the need for, and the ability to engage in, life-long learning
(j) A knowledge of contemporary issues
(k) An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice

Assessment of Student Outcomes
Currently, we use a sampling approach, which takes place every 2-3 years:
- Courses are identified that emphasize one or more student outcomes
- For each outcome emphasized, the instructor identifies an assignment to measure that outcome, as well as a level of minimum competency (e.g., a score of 65% on the assignment)
- Measurement results are provided to the MEUCC, which sets a metric goal (e.g., more than 80% of students meet or exceed minimum competency)
- The Board of Advisors and the MEUCC each assign a grade based on achievement of the metric goals
- Areas of weakness are identified by the MEUCC and discussed with instructors in that area; instructors make any necessary modifications and then reassess the outcomes
In addition, a graduating senior focus group survey is conducted in the spring semester of each year. Data and comments are evaluated by the MEUCC.
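To make the sampling check concrete, here is a minimal sketch (not part of the department’s actual tooling) that applies the 65% minimum-competency score and the 80% metric goal mentioned above to a list of assignment scores. The function name and the score data are illustrative assumptions.

```python
def metric_goal_met(scores, min_competency=65.0, metric_goal=0.80):
    """Return the fraction of students at or above minimum competency,
    and whether that fraction exceeds the metric goal."""
    if not scores:
        raise ValueError("no scores provided")
    competent = sum(1 for s in scores if s >= min_competency)
    fraction = competent / len(scores)
    return fraction, fraction > metric_goal

# Illustrative (made-up) scores for one assignment in one course.
scores = [88, 72, 64, 91, 70, 78, 83, 69, 95, 74]
fraction, goal_met = metric_goal_met(scores)
print(f"{fraction:.0%} of students met minimum competency; goal met: {goal_met}")
```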

NEW Assessment of Student Outcomes
In the future, we will use rubrics to assess student outcomes:
- Courses are identified that emphasize one or more student outcomes
- For each outcome emphasized, the instructor identifies an assignment to measure that outcome and uses an established rubric to determine what percentage of students meet or exceed the minimum competency goal
- Measurement results are provided to the MEUCC, which sets a metric goal (e.g., more than 80% of students meet or exceed minimum competency)
- The Board of Advisors and the MEUCC identify areas of weakness and discuss possible ways to improve student performance on that outcome
- Instructors make any necessary modifications and then reassess the outcomes
In addition, a graduating senior focus group survey is conducted in the spring semester of each year, and survey data from all graduating seniors is collected each year. Data and comments are evaluated by the MEUCC.
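As an illustration of how rubric results might roll up to the metric goal, the sketch below aggregates per-level student counts for a few outcomes and flags any outcome where fewer than 80% of students meet or exceed minimum competency. The flag_weak_outcomes helper and the counts for outcomes (e) and (g) are hypothetical; the counts for outcome (a) mirror the ME 361 example later in these slides.

```python
# Per-outcome rubric tallies: (deficient, satisfactory, superior) counts.
# Outcome (a) follows the ME 361 example; the others are made up.
rubric_counts = {
    "a": (2, 13, 57),
    "e": (15, 30, 27),
    "g": (5, 40, 27),
}

def flag_weak_outcomes(counts, metric_goal=0.80):
    """Return outcomes whose meets-or-exceeds fraction falls below the goal."""
    weak = {}
    for outcome, (deficient, satisfactory, superior) in counts.items():
        total = deficient + satisfactory + superior
        meets_or_exceeds = (satisfactory + superior) / total
        if meets_or_exceeds < metric_goal:
            weak[outcome] = meets_or_exceeds
    return weak

for outcome, fraction in flag_weak_outcomes(rubric_counts).items():
    print(f"Outcome ({outcome}): only {fraction:.0%} met or exceeded minimum competency")
```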

NEW Assessment of Student Outcomes
In the future, we will assess student outcomes on a continuous basis (instead of every 2-3 years):
- Measure and assess approximately one-third of the items of evidence each year (in Sets), with most outcomes represented in each set
- A more continuous improvement process, with more focus on each outcome
- Reduced chance that a cohort of students will pass through the program without being assessed on each outcome
- Proposal: the MEUCC writes a yearly report summarizing assessment results
Schedule (Measure / Assess / Improve):
- Fall 2017 / Spring 2018: measure Set 1; improve based on previous changes
- Fall 2018 / Spring 2019: measure Set 2
- Fall 2019 / Spring 2020: measure Set 3 + PEOs
- Fall 2020 / Spring 2021 through Fall 2022 / Spring 2023: continue the cycle …

Rubrics
A rubric defines the expectations for an assignment by listing the criteria and describing levels of quality.
We have decided to use three (3) levels, or Degrees of Achievement:
1. Deficient – does not meet minimum competency
2. Satisfactory – meets expectations
3. Superior – exceeds expectations
Levels 2 and 3 meet or exceed the expected minimum competency, while level 1 does not.
The expectation for each level needs to be defined explicitly.
For each outcome, there may be multiple Performance Indicators, each representing an individual knowledge or skill being measured, or a particular part of the outcome.
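One way to picture a rubric in data terms: the short sketch below models a performance indicator as score ranges mapped to the three Degrees of Achievement and classifies an individual score. The PerformanceIndicator class and its classify method are illustrative assumptions, not an existing tool; the example ranges (0-5 / 6-9 / 10 on part b) come from the ME 361 rubric shown later.

```python
from dataclasses import dataclass

@dataclass
class PerformanceIndicator:
    """One knowledge or skill being measured, with score ranges per level."""
    name: str
    deficient_max: int     # scores <= this value are Deficient
    satisfactory_max: int  # scores above deficient_max up to this value are
                           # Satisfactory; anything higher is Superior

    def classify(self, score: int) -> str:
        if score <= self.deficient_max:
            return "Deficient"
        if score <= self.satisfactory_max:
            return "Satisfactory"
        return "Superior"

# Illustrative indicator using the part (b) ranges from the ME 361 example.
indicator = PerformanceIndicator("Application of fundamental principles", 5, 9)
print(indicator.classify(7))   # Satisfactory
print(indicator.classify(10))  # Superior
```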

Example – Part 1 (the problem)

Example – Part 2 (first indicator)
Rubric for Outcome (a): An ability to apply knowledge of mathematics, science, and engineering
Course: ME 361 / Semester: Spring 2016 / Instructor: Geoff Recktenwald
Tool: Final Exam, Problem 5 / No. of students: 72
Annotation on slide: TOO GENERIC
Performance Indicator: Application of fundamental principles of science and engineering
- Deficient (does not meet minimum competency): Identifies which fundamental principles govern the process or system. Unable to carry through from knowing principles or theory to generating a solution.
  Grade range: 0-5 on part b. Students: 2 (2.8%)
- Satisfactory (meets expectations): Has good knowledge of governing principles/theory. Generally is able to apply correct principles to problem solutions, but may make inappropriate assumptions or simplifications.
  Grade range: 6-9 on part b. Students: 13 (18.1%)
- Superior (exceeds expectations): Combines scientific and engineering principles/theory to formulate correct solutions to engineering problems.
  Grade range: 10 on part b. Students: 57 (79.2%)

Example – Part 3 (second indicator)
Rubric for Outcome (a): An ability to apply knowledge of mathematics, science, and engineering
Course: ME 361 / Semester: Spring 2016 / Instructor: Geoff Recktenwald
Tool: Final Exam, Problem 5 / No. of students: 72
Annotation on slide: COULD BE BETTER
Performance Indicator: Appropriate mathematical techniques to achieve solutions to engineering problems
- Deficient (does not meet minimum competency): Recognizes what mathematical techniques are required to reach a solution, but can only apply correct mathematical techniques with guidance, or applies them incorrectly.
  Grade range: 0-16 on problem 5. Students: 7 (9.7%)
- Satisfactory (meets expectations): Applies mathematical principles to obtain an analytical or numerical solution to model equations. Generally chooses an appropriate method, but maybe not the best method and/or without analysis of results.
  Grade range: 17-21 on problem 5. Students: 16 (22.2%)
- Superior (exceeds expectations): Has a strong understanding of mathematical techniques and applies the “best” methods to achieve correct solutions.
  Grade range: 22-30 on problem 5. Students: 49 (68.1%)
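To show how the counts and percentages in these example rubrics can be produced, here is a small sketch that bins problem scores into the grade ranges from the second indicator (0-16 / 17-21 / 22-30 on problem 5) and reports counts with percentages. Only the ranges come from the slide; the score list and the tally function are made up for illustration.

```python
from collections import Counter

# Grade ranges from the second-indicator rubric (problem 5, out of 30 points).
RANGES = [
    ("Deficient", range(0, 17)),      # 0-16
    ("Satisfactory", range(17, 22)),  # 17-21
    ("Superior", range(22, 31)),      # 22-30
]

def tally(scores):
    """Count how many scores fall into each rubric level and report percentages."""
    counts = Counter()
    for score in scores:
        for level, grade_range in RANGES:
            if score in grade_range:
                counts[level] += 1
                break
    total = len(scores)
    for level, _ in RANGES:
        n = counts[level]
        print(f"{level}: {n} students ({n / total:.1%})")

# Made-up scores standing in for the 72 exam papers.
tally([12, 25, 30, 18, 22, 9, 27, 20, 24, 29, 16, 23])
```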

Your Turn to Develop a Rubric
Divide into groups of 3-5 based on area of specialization.
Select one of the courses listed below and develop a test problem and a rubric to assess achievement of either outcome (a) or (e) – your choice.
- Dynamics, Systems and Control (DSC): ME 361, ME 451, ME 461
- Solid Mechanics, Design & Manufacturing (SMDM): ME 391, ME 471
- Fluid-Thermal Science and Engineering (FTSE): ME 332, ME 410

Test Problem for Outcome (____):
Course: ME ______

Rubric Template – fill the empty cells
Rubric for Outcome (____): Course: ME ______
Performance Indicator: ____
- Deficient (does not meet minimum competency): ____
- Satisfactory (meets expectations): ____
- Superior (exceeds expectations): ____
Grade range for each level, if used (e.g., 0-60 / 61-80 / 81-100): ____ / ____ / ____
Number of students in each range: X / Y / Z