Accreditation Evaluation of the BS-CSE Program


Accreditation Evaluation of the BS-CSE Program Neelam Soundarajan Chair, Undergrad Studies Comm. CSE Department

Background
- All undergrad programs in the CoE are accredited by the Engineering Accreditation Commission (EAC)
- BS-CSE: also accredited by the Computing Accreditation Commission (CAC)
- BS-CIS and BA-CIS are in Arts & Sciences and are not accredited … this discussion is only about BS-CSE
- EAC and CAC are both part of ABET (abet.org)
- Normal accreditation cycle is 6 years (our last evaluation: Au 2011)

Process
- Submit a self-study that shows how the program satisfies the EAC+CAC requirements
- EAC+CAC appoint an evaluation team:
  - CAC team chair (Rajendra Raj, RIT)
  - CAC program evaluator (PEV) (Charles Dierbach, Towson University)
  - EAC PEV (Steven Barrett, U. of Wyoming)
- EAC also appoints PEVs for the other programs and a team chair, but we won't see them
- Three-day site visit (Oct. 15-17)

Main point of the site visit …
- Validate the claims made in the self-study by:
  - Talking to faculty, advisors, students, higher admin, …
  - Checking course materials, assessment-related materials, …
  - Looking at computing facilities, classrooms, …
- The goal: evaluate how well the program meets the EAC+CAC criteria requirements

BS-CSE curriculum
- CS core + core choices + tech electives + GE + math, science, engineering
- Students must also meet the requirements of a specialization
- CS Core (28 cr. hrs.):
  - CSE 2221, 2231: Software I, II
  - CSE 2321, 2331: Foundations I, II
  - CSE 2421, 2431: Systems I, II
  - Computing ethics: CSE 2501 or Phil 1338
  - ECE 2020, 2060: Digital logic; Analog circuits

Core Choices
- Junior project: CSE 3901 or 3902
- Capstone design course: one of CSE 5911, 5912, 5914, 5915
- One from each of the following four pairs:
  - CSE 3231 or 3241 (SE or DB)
  - CSE 3321 or 3341 (Formal Lang. or Prog. Lang.)
  - CSE 3421 or 3461 (Architecture or Networking)
  - CSE 3521 or 3541 (AI or Graphics)
- A student may take both courses from a pair; the second course then counts as a tech elective

Tech Electives, Specialization Option
- Must complete 17 cr. hrs. of tech electives:
  - Min. 9 hrs. of CSE courses at the 3000 level or above
  - Rest: a combination of (3000-level or above) CSE and appropriate non-CSE courses
- Core choice courses + tech electives must meet the requirements of a specialization option (AI; Graphics & Game Des.; DB; Systems; …)
- What are the EAC/CAC requirements, and what questions can you expect? …
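The credit-hour rules above can be expressed as a short checker. This is only an illustrative sketch: the function name, the course representation, and the treatment of all non-CSE 3000-level courses as "appropriate" are assumptions, not part of the official degree audit.

```python
# Illustrative sketch of the tech-elective rules; not an official audit tool.
# A course is a (dept, number, credit_hours) tuple, e.g. ("CSE", 3461, 3).

def meets_tech_elective_reqs(courses):
    """Check the two rules stated above: 17+ total credit hours of
    tech electives, with at least 9 hours of CSE courses at the
    3000 level or above. (Assumes every non-CSE course listed is
    an "appropriate" one, which the real rules further restrict.)"""
    eligible = [(d, n, h) for (d, n, h) in courses if n >= 3000]
    total = sum(h for (_, _, h) in eligible)
    cse_hours = sum(h for (d, _, h) in eligible if d == "CSE")
    return total >= 17 and cse_hours >= 9
```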

The Criteria …
- Institution: facilities; support for teaching; support for professional activities; …
- Curriculum: …
- Students: must be advised and monitored
- Assessment/continuous improvement: this is a critical requirement

Caution!
- It is NOT about our ranking or even how good our curriculum is
- It is about whether we meet the letter and intent of the Criteria requirements, including, especially, the ones related to assessment and continuous improvement
- *Most* faculty are expected to be involved in the assessment/improvement activities, and you can expect questions about your role in them when you talk to the evaluator

A *common* misunderstanding …
- "Assessment means assessing students' projects, exams, homeworks, … to assign grades"
- That is not what ABET means by assessment
- Assessment is of the program (not of individual students), to see how well it helps students achieve the program outcomes …
- … and the results are used to identify weaknesses in the program and to develop ways to improve it
- What outcomes?

Objectives, Outcomes …
- ABET also requires programs to have program educational objectives (PEOs)
- PEOs are goals that graduates of the program are expected to achieve after graduation, while student outcomes (SOs) are the knowledge and skills students acquire by the time they graduate
- PEOs have become less important to ABET in the last few years, but that depends on the team

Our Program Objectives
- Graduates will be successfully employed in the computing profession …
- Some will be pursuing or will have completed graduate studies in computing …
- Graduates will be informed/responsible engineering and computing professionals …
- We obtain feedback on these from alumni and Advisory Board members, and we revise them based on that input, most recently last fall (we added a reference to the ACM Code to the last objective)

Some of our outcomes
- Technical outcomes: students will acquire an ability to apply knowledge of computing, mathematics, …; an ability to apply design and development principles in the construction of software systems of varying complexity; an ability to use the techniques, skills, and modern engineering tools necessary for practice as a CSE professional; …
- Professional outcomes: students will acquire an ability to function on multi-disciplinary teams; an ability to communicate effectively with a range of audiences; …
- These are generic … how do you assess them?

Our assessment process
- POCAT: a multiple-choice exit exam for assessing the technical group of outcomes
  - Offered each semester; BS-CSE students take the exam close to their graduation
  - Performance on the test does not affect individual students in any way, nor is it recorded
- Questions on POCAT are drawn from a range of required courses and several popular electives
  - Designed to assess student understanding of key concepts and to identify common misconceptions
  - Each question is mapped to one or more technical outcomes, hence assessing their achievement

Process (for tech. outcomes) (contd.)
- POCAT details:
  - Faculty responsible for the relevant courses suggest questions for POCAT; these are maintained in a question bank (Note: we need questions! UGSC will take care of mapping questions to outcomes)
  - Each semester, the UGSC chair creates the test
  - Students must sign up for the test
  - It is administered by the Advising Office on a weekday evening; pizza and pop are served after the test
  - Students (generally?) complete the exit survey during the same session

POCAT Details (contd.)
- Test results are processed by a script and reported on a web page:
  - For each question, which answer was chosen by which student (identified by a code known only to the student)
  - A summary of the average achievement of each technical outcome (based on the technical outcomes each question corresponds to)
  - Added a couple of years ago: information about the discrimination of the various answers to each question
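To make the "average achievement of each outcome" concrete, here is a minimal sketch of the kind of aggregation the results script performs. The actual POCAT script, its data format, and its statistics are not shown in these slides; the function and field names below are hypothetical.

```python
# Hypothetical sketch of per-outcome aggregation of POCAT results.
# questions: {qid: {"answer": correct_choice, "outcomes": [outcome, ...]}}
# responses: {student_code: {qid: chosen_choice}}  (codes known only to students)

def outcome_achievement(questions, responses):
    """Return {outcome: fraction of correct responses among all
    responses to questions mapped to that outcome}."""
    correct = {o: 0 for q in questions.values() for o in q["outcomes"]}
    total = {o: 0 for o in correct}
    for answers in responses.values():
        for qid, choice in answers.items():
            q = questions[qid]
            for o in q["outcomes"]:
                total[o] += 1
                if choice == q["answer"]:
                    correct[o] += 1
    return {o: correct[o] / total[o] for o in total if total[o]}
```

Because each question can map to several outcomes, one response may count toward more than one outcome's average, which matches the mapping described above.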

The "improvement" part …
- Test results are discussed by UGSC, with a focus on any unusual results
- Typical actions:
  - Refine a question to get better information
  - Discuss possible changes in the relevant courses
  - Etc.
- For all such actions, involve the relevant faculty
- POCAT and the improvements based on its results are a key part of how we meet the ABET requirements … all faculty should be able to discuss it

Process for professional outcomes
- Rubric 1, used in CSE 2501/Phil 1338, assesses communication skills, understanding of professional/ethical issues, …
- The dimensions include: organization, mechanics (of presentation), relating to the audience, awareness of contemporary issues (such as the security of electronic voting machines, …)
- Assessed by the instructor once a year, with the results presented to UGSC
- Goal: identify any weaknesses with respect to these outcomes and possible improvements …

Process for prof. outcomes (contd.)
- Rubric 2, for use by capstone course instructors, assesses team skills, communication skills, etc., as well as other factors applicable to capstone courses that are included in the criteria: quality of problem formulation, quality of implementation
- Rubric 3, for use in the capstone poster session, assesses the same items, but this rubric is completed by visitors to the poster session
- R2 is used once a year in each capstone course; R3 is used in the poster session at the end of Au/Sp semesters
- Results are presented to UGSC to help identify any weaknesses and propose possible improvements …

Summary: Assessments and Improvements
- POCAT and the three rubrics are direct assessments (since they are based on assessing actual student work)
- We also have an indirect assessment, the exit survey, which asks students the extent to which the program enabled them to achieve the outcomes
- We obtain additional feedback from students at the annual UG forum
- UGSC evaluates all of this information and arrives at possible improvements to the program

Plans
- Site visit: Oct. 15-17 (Sunday-Tuesday)
- Meetings with faculty: Monday, Oct. 16
  - But not all faculty; the team will tell us, with input from us, whom they want to talk to
- I will prepare a website with all the relevant information (including these slides)
  - Please go through that information (and ask questions if anything is unclear)
- Look for and read(!) emails from me, and respond …

***Really, Really Important***
- Please do NOT argue with the evaluation team!
- The team members are volunteers; please treat them with respect!
- If you have comments about the accreditation criteria etc., send them to Neelam; do not challenge the team to defend the criteria.