JIC ABET WORKSHOP No.4 Guidelines on: II Faculty Survey Questionnaire.


JIC ABET WORKSHOP No.4
Guidelines on:
I- Student Course Satisfaction Survey Questionnaire
II- Faculty Survey Questionnaire
III- Program Assessment Planning
IV- Course Description Models
V- Mapping of CHET PEOs to JIC Mission Statement
Presented by: JIC ABET COMMITTEE
Venue: M038 Date: Monday May 02, 2011 Time: 10:00 AM

I- Student Course Satisfaction Survey Questionnaire

II- Faculty Survey Questionnaire

III - Program Assessment Planning

The focus of the data collection is to answer the question, "Can the program demonstrate the level to which students have attained the anticipated student outcomes?" The evidence of student learning is then used to identify student strengths and weaknesses related to each of the student outcomes, in order to make decisions about how to improve the program's teaching and learning processes. This evidence should be the product of faculty reviewing and/or observing student work related to the program requirements.

It is important to understand several principles of a well-constructed process to enable continuous improvement related to program-level student learning.

Each course syllabus should outline the learning that should take place in that course. The course is organized around a series of activities designed to engage the students in learning related to what is indicated on the syllabus. At the end of the course, the faculty member evaluates how well the students learned based on their total performance in the course. At the program level, assessment should focus on the learning that has resulted from the cumulative experiences in the program. This is progressive: students learn, practice and demonstrate their learning as they progress through the curriculum, moving from simple to more complex concepts and skills. To provide evidence of student outcomes for the program, the data used should be the data collected in the course where the experience is summative.

Principles of a well-constructed process to enable continuous improvement:

- The continuous improvement process (Criterion 4) should focus on assessment of the program, not assessment of individual students.
- The focus should be on the cumulative learning of students, not the assessment of individual courses.
- Student outcomes should be defined so that faculty share a common understanding of the expectations for student learning and achieve consistency across the curriculum.
- The program does not have to collect data on every student in every course to know how well it is doing toward the attainment of student outcomes. One data point per student in the program cohort is enough to determine whether the performance level has been met.
- The program does not have to assess every outcome every year to know how well it is doing toward the attainment of student outcomes. Doing so creates unreasonable overhead for faculty and results in massive data collection systems that produce data that is not informative.
- The focus should be on continuous improvement based on information for decision making, not just data collection (i.e., data ≠ information).
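As an illustration of the "one summative data point per student" principle above, the sketch below (all record shapes and names are hypothetical, not part of the workshop materials) reduces many course-level observations to a single summative score per student per outcome, keeping the observation from the latest course in the sequence:

```python
# Illustrative sketch only: keep one summative data point per (student, outcome).
# Each record is (student_id, outcome, course_level, score); the record from the
# highest course level is treated as the summative observation.

def summative_points(records):
    """Reduce many observations to one summative score per (student, outcome)."""
    best = {}
    for student_id, outcome, course_level, score in records:
        key = (student_id, outcome)
        # Keep only the observation from the most advanced (summative) course.
        if key not in best or course_level > best[key][0]:
            best[key] = (course_level, score)
    return {key: score for key, (_, score) in best.items()}

records = [
    ("s1", "a", 200, 68),   # early-course observation
    ("s1", "a", 400, 81),   # capstone-level observation (summative)
    ("s2", "a", 400, 74),
]
print(summative_points(records))  # one data point per student per outcome
```

This mirrors the point that programs need only the summative observation, not every score from every course.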

The program's continuous quality improvement process would contain the following:

- A continuous timeline of activities (not all activity is data collection). Possible question: "What is your data collection and evaluation timeline?"
- Performance indicators defined for each student outcome, with faculty consensus. Possible question: "How do you define your outcomes so that faculty assess them consistently across the program?"
- Systematic data collection that focuses on summative performance related to the indicators. Possible question: "Where do you collect the data that is evidence of the summative learning of students?"
- Summative results with a single data point for each performance indicator for each student. Possible request: "Describe how the data being presented were collected."
- Collected data that enables faculty to identify student strengths and weaknesses related to the outcomes. Possible question: "I see that X% of your students have attained outcome Y. What were the strengths and weaknesses of their performance?"
- An evaluation process that focuses improvements on areas of student weakness and communicates them to faculty. Possible request: "Describe how the proposed actions improved student learning (or are anticipated to improve student learning) related to the weaknesses that were identified."
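The evaluation step described above (identifying which outcomes show student weakness) can be sketched as a small script. The indicator names, target score, and attainment goal below are hypothetical placeholders, not values from the workshop:

```python
# Hypothetical sketch of the evaluation step: given one summative score per
# student for each performance indicator, compute the percentage of students
# meeting a target and flag indicators that fall below an attainment goal.

TARGET_SCORE = 70       # a student "attains" the indicator at or above this score
ATTAINMENT_GOAL = 0.75  # flag indicators where fewer than 75% of students attain

def evaluate(scores_by_indicator):
    """Return {indicator: (attainment %, flagged_as_weakness)}."""
    report = {}
    for indicator, scores in scores_by_indicator.items():
        attained = sum(1 for s in scores if s >= TARGET_SCORE)
        rate = attained / len(scores)
        report[indicator] = (round(rate * 100, 1), rate < ATTAINMENT_GOAL)
    return report

data = {
    "SO-a.1": [82, 75, 68, 90],   # 3 of 4 students attain the indicator
    "SO-a.2": [55, 71, 60, 80],   # 2 of 4 students attain the indicator
}
print(evaluate(data))
```

Flagged indicators would then feed the faculty discussion of strengths and weaknesses rather than stand alone as raw data, which is the data-versus-information distinction made above.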

Example of Continuous Assessment Process

Indirect and Direct Assessment Timeline and Responsibilities

IV- Course Description Models

Sample #1: MME 202 - Mechanical CAD Applications

Sample #2: ELC 205 - Technical Report Writing

V- Mapping of CHET PEOs to JIC Mission Statement

CHEMICAL AND PROCESS ENGINEERING TECHNOLOGY (CHET) PROGRAM

JIC MISSION: To provide the Kingdom with well-educated and highly trained manpower in technical and business-related fields by offering quality technology education and training programs that are career-focused and market-driven, through partnership with business, industry, community and other stakeholders.

MISSION (key elements):
- Provide quality technology education
- Offer career-focused and market-driven programs
- Build partnership with business, industry, and community
- Support and enhance the various developmental projects of the nation

PROGRAM EDUCATIONAL OBJECTIVES:
1. To prepare students who are capable of demonstrating excellence in professional knowledge and technical skills in the field of chemical and process engineering technology in serving local, national and international industries and government agencies;
2. To prepare students with the necessary background and technical skills to work professionally, as individuals or in teams, in their professional practice or in the pursuit of higher education;
3. To enhance students' ability to communicate effectively, technically and professionally, in written, oral and graphic forms;
4. To prepare students to be interested, motivated, and capable of pursuing continued life-long learning through higher education, short courses or other training programs in chemical and process engineering technology and related fields;
5. To prepare students for personal and professional success with an understanding and appreciation of ethical behavior, social responsibility, and diversity, both as individuals and in team environments.

Student Outcomes and Relationship to Chemical and Process Engineering Technology Program Educational Objectives (1-5):

The students will be able to:
a. apply the acquired knowledge, technical skills and the use of modern tools in chemical and process engineering technology to narrowly defined chemical and process engineering technology activities;
b. apply knowledge of mathematics, science, engineering and technology to solve emerging problems in the field of chemical and process engineering technology;
c. identify, formulate and solve technical problems in the field of chemical and process engineering technology using the skills of critical thinking and creative problem solving;
d. conduct standard tests and measurements; conduct, analyze and interpret data; prepare technical reports; and apply experimental results for chemical process and technology improvement;
e. function effectively as a member of a multi-disciplinary team in a variety of working environments;
f. demonstrate an adequate level of competency and proficiency in written, oral and graphical communications;
g. recognize the need for, and be capable of engaging in, life-long learning for their professional development;
h. understand and appreciate professional, ethical and social responsibilities, and diversity, when working as an individual and in teams; and
i. demonstrate a commitment to quality, timeliness, and continuous improvement to enhance their professional career and society.

THANK YOU