
1. Assessment Plan and Notes on the ABET Program Evaluator Training
J. Fernando Vega-Riveros, Ph.D.
Dept. of Electrical and Computer Engineering, University of Puerto Rico
ECE Department Meeting, October 27, 2005

2. Outline
- Objectives of the presentation
- Assessment and evaluation
- ABET workshop learning objectives
- Key points of the workshop
- Issues and questions a program evaluator may ask for each accreditation criterion
- Discussion

3. Objectives of the presentation
- Present issues and a comparative analysis of assessment approaches and ABET criteria
- Promote discussion of these issues as they pertain to the Department's INEL/ICOM programs
- Make a decision about department assessment instruments and methods

4. Rationale
A unified assessment approach for our two programs would reduce the administrative and academic burden on faculty, students, and staff.
The same instruments and tools would be used for the whole department.

5. Accreditation timeline
- Today: INEL and ICOM accredited until 2009
- December 2005: Assessment plan (define assessment instruments; design metrics, thresholds, and sampling method)
- Summer 2006 to Summer 2007: Conduct assessment and close the loop
- January 2008: Request re-accreditation to ABET
- July 2008: Submit Self-Study Report
- December 2009: ABET visit

6. Assessment and evaluation [1]
- Assessment: the act of collecting data or evidence that can be used to answer classroom, curricular, or research questions.
- Assessment method: the procedures used to support the collection of data for assessment purposes.
- Evaluation: the interpretations that are made of the evidence collected about a given question.
[1] Olds, B.M., Moskal, B.M., and Miller, R. "Assessment in Engineering Education: Evolution, Approaches and Future Collaborations." ASEE Journal of Engineering Education, Vol. 94, No. 1, Jan. 2005, pp. 13-25.

7. Types (components?) of assessment
- Transversal assessment: assessment of outcome achievement in a given academic term (snapshots)
- Longitudinal design: tracks outcome achievement over time
- Formative assessment
- Summative assessment

8. Goals of assessment
- Measure outcome achievement (a-k) and how these outcomes develop along and across the curriculum
- Validate interventions, e.g. curricular revisions, improved or novel pedagogies, new courses, course restructuring, etc.
- Collect data for quality assurance and continuous quality improvement
- Keep the process realistic and minimize the burden

9. Program Educational Objectives
The program will allow its graduates to:
1. Obtain a broad educational experience necessary to understand the impact of electrical/computer engineering problems and solutions within a global and societal context;
2. Possess a combination of knowledge and analytical, computational, and experimental skills necessary to solve practical electrical/computer engineering problems;
3. Have adequate communication skills, both as individuals and as part of a team;
4. Value the importance of lifelong learning;
5. Be aware of contemporary issues and thus be able to make decisions taking into consideration professional and societal needs and ethical implications.

10. Program educational outcomes
Graduates of the program will have:
a) An ability to apply knowledge of mathematics, science, and engineering necessary to carry out analysis and design appropriate to electrical/computer engineering problems;
b) An ability to design and conduct experiments, as well as analyze and interpret data;
c) An ability to design computer systems to meet desired needs;
d) An ability to function in multidisciplinary teams;
e) An ability to identify, formulate, and solve engineering problems;

11. Program educational outcomes (cont.)
f) An understanding of professional and ethical responsibility;
g) An ability to communicate effectively;
h) The broad education necessary to understand the impact of engineering solutions in a global/societal context;
i) A recognition of the need for, and an ability to engage in, lifelong learning;
j) A knowledge of contemporary issues;
k) An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.

12. Current Assessment Approach (ICOM)
Instruments:
- GPA (transcript analysis)
- Student, alumni, and employer surveys
Metrics and satisfaction thresholds:
- Items related to grades are considered satisfactory if they are above 3.0 (B grade)
- Items related to satisfaction ratings are satisfactory if they are above 3.0 (the weight of a "satisfied" response in the surveys), or above the average satisfaction of all items in the same category
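
These two decision rules are mechanical enough to script. Below is a minimal sketch in Python, assuming hypothetical item names, categories, and scores (none of the values come from the actual ICOM instruments); it simply flags each item as satisfactory or not under the rules above.

```python
# Minimal sketch of the two ICOM satisfaction rules described above.
# Item names, categories, and scores are hypothetical examples.
from statistics import mean

GRADE_THRESHOLD = 3.0         # B grade
SATISFACTION_THRESHOLD = 3.0  # weight of a "satisfied" survey response

# (category, item, score) triples; scores are illustrative only
items = [
    ("grades",       "INEL core GPA",        3.2),
    ("grades",       "ICOM core GPA",        2.9),
    ("satisfaction", "design experience",    2.8),
    ("satisfaction", "teamwork preparation", 3.4),
    ("satisfaction", "communication skills", 3.1),
]

def is_satisfactory(category: str, score: float) -> bool:
    """Apply the thresholds from the current ICOM assessment approach."""
    if category == "grades":
        # Grade-related items: satisfactory if above 3.0 (B grade)
        return score > GRADE_THRESHOLD
    # Satisfaction-related items: satisfactory if above 3.0 OR above the
    # average of all items in the same category
    category_avg = mean(s for c, _, s in items if c == category)
    return score > SATISFACTION_THRESHOLD or score > category_avg

for category, item, score in items:
    print(f"{item:25s} {score:.1f}  satisfactory={is_satisfactory(category, score)}")
```

Note that the category average is recomputed from the listed items, mirroring the "above the average of all items in the same category" rule.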

13. Issues of the current approach
- Minimum burden on faculty
- High burden on the accreditation team (coordinator)
- Relies on surveys (three of them: student, alumni, and employer), a practice that ABET discourages for outcome assessment. Surveys can be used as supporting information, but they depend on perception rather than on actual outcome achievement [3]
- Some thresholds move depending on the population and may not be set according to more explicit program criteria
[3] ABET 2005 Program Evaluator Training Materials (EAC Training, Approved December 2004).

14. Issues of the current approach (continued)
- GPA does not discriminate among the outcomes of a given course; it gives only some indication of achievement of the set of outcomes defined for a given course or courses
- ABET has some concerns about the use of course grades [4]
[4] Guidelines to Institutions, Team Chairs and Program Evaluators on Interpreting and Meeting the Standards Set Forth in Criterion 3 of the Engineering Accreditation Criteria. http://www.abet.org/Linked Documents-UPDATE/Program Docs/EAC Guidelines or Criterion3.pdf

15. Assessment methods [2]
Descriptive designs:
- Surveys
- Interviews and focus groups
- Observation
- Conversational analysis
- Ethnographic studies
Experimental designs:
- Randomized controlled trials
- Matching
- Baseline data
- Longitudinal design
- Post-test-only design
[2] Olds et al., op. cit.

16. Some approaches to outcome assessment
1. Student portfolios: manageable with small student populations. How do we store and organize portfolios (and make them easily retrievable)? Who is responsible for the portfolio repository?
2. Periodic comprehensive (standardized) tests administered in key core courses (could be pre- or post-tests). How can we motivate students to take them seriously? How do we assess outcomes d, g, and i? The professional licensure examination could be one of these tests. Who designs the tests?
3. Cornerstone and capstone courses. Do we have cornerstone courses? Can we identify those courses?
4. Assessment across course materials (tests, evidence of presentations and reports, when needed). Requires faculty collaboration plus material sampling and metrics; simply administering a test or presentation does not suffice. Assessment criteria are needed from the faculty, but some unity of criteria must be shown.

17. Approaches to outcome assessment
Alternative: Student portfolios
- Pros: Unified assessment for all students and all academic terms; comprehensive
- Cons: Manageable only with small student populations. How do we store and organize portfolios? Who is responsible for the portfolio repository?
Alternative: Periodic comprehensive tests
- Pros: Standardized; unified assessment for all students and all academic terms; the professional licensure exam could be one of the tests
- Cons: What if a student achieves an outcome but fails the course, or vice versa? How do we assess outcomes d, g, and i? How and when would we give the tests? How can we motivate students to take them seriously? Who designs the tests? Standardized tests measure how well students learned information but may not demonstrate how well they solve problems

18. Approaches to outcome assessment (continued)
Alternative: Cornerstone and capstone courses
- Pros: Comprehensive; assessment instruments could be more standardized than in content courses (no need to come up with new questions for tests or exams)
- Cons: What are the cornerstone courses? Do we have any? We need to identify candidate courses and/or create them
Alternative: Course material-based assessment
- Pros: We may already have many of the necessary elements and evidence materials; probably the smallest additional burden on faculty; allows student progress monitoring; allows multiple measures of each outcome
- Cons: Requires a mapping between materials and outcomes (sketched below); requires sampling materials so that results are statistically significant; not comprehensive
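
To make the mapping and sampling concern concrete, here is a minimal sketch of what a material-to-outcome mapping with sampling could look like. The course codes, materials, rosters, and fixed per-material sample size are all hypothetical; the real plan would have to choose the sample size for statistical significance.

```python
# Sketch: map course materials to outcomes (a-k) and draw a random sample
# of student work for each outcome. All names and sizes are hypothetical.
import random

# Hypothetical mapping: course -> {material: outcomes it can evidence}
material_outcomes = {
    "INEL 4095": {"design project report": {"c", "g", "k"},
                  "midterm exam":           {"a", "e"}},
    "ICOM 4035": {"programming project":    {"c", "k"},
                  "final exam":             {"a", "e"}},
}

# Hypothetical student rosters per course
rosters = {
    "INEL 4095": [f"student_{i}" for i in range(40)],
    "ICOM 4035": [f"student_{i}" for i in range(55)],
}

SAMPLE_SIZE = 10  # per material; illustrative, not a statistically derived value

def sample_evidence(seed: int = 2005):
    """Return, for each outcome, the (course, material, students) artifacts to collect."""
    rng = random.Random(seed)
    plan = {}
    for course, materials in material_outcomes.items():
        for material, outcomes in materials.items():
            k = min(SAMPLE_SIZE, len(rosters[course]))
            students = rng.sample(rosters[course], k=k)
            for outcome in sorted(outcomes):
                plan.setdefault(outcome, []).append((course, material, students))
    return plan

for outcome, evidence in sorted(sample_evidence().items()):
    print(outcome, [(course, material, len(students)) for course, material, students in evidence])
```

One design point worth noting: because several materials can evidence the same outcome, this kind of mapping naturally yields the multiple measures per outcome listed among the pros.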

19. Approaches to outcome assessment (continued)
Alternative: Professional licensure test (Reválida)
- Pros: Almost comprehensive; can be used for comparative analysis with the rest of the nation; shows outcomes of the program
- Cons: Does not allow assessment of outcomes d, g, and i; only a single-point measurement, so it does not let us see how students develop the skills needed to achieve the outcomes; long response time to any curricular changes

20. Assessment Proposal
- Establish cycles of assessment for courses
- Sample course material (no need to be exhaustive, just representative and statistically significant)
- Map course material to educational outcomes (may need the collaboration of the course instructor)
- Measure outcome achievement using the course material and the outcome mapping
- No course is assessed every semester (unless something exceptional is found): a round-robin strategy (sketched below)
- Not all sections of a course need to be assessed; the assessment plan will establish a cycle of section assessments to minimize the burden on faculty members and distribute the assessment load as evenly as possible
- Use the professional licensure examination as a component of summative assessment and benchmarking
- Results are reviewed by the area committees and final results are sent to the steering committees
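
The round-robin strategy noted above can be sketched in a few lines. The course codes below are hypothetical placeholders, and the four terms simply mirror the proposed assessment schedule on the next slide; this is an illustration of the idea, not a committed rotation.

```python
# Sketch of the round-robin strategy: each course is assessed in exactly one
# sampling term per cycle, spreading the load as evenly as possible.
# Course codes are hypothetical; terms follow the proposed assessment schedule.
from itertools import cycle

courses = ["INEL 3105", "INEL 4095", "INEL 4115", "ICOM 4015",
           "ICOM 4035", "ICOM 4075", "INEL 5205", "ICOM 5007"]

terms = ["June 2006", "December 2006", "June 2007", "December 2007"]

def round_robin_schedule(courses, terms):
    """Assign each course to one assessment term, rotating over the terms."""
    schedule = {term: [] for term in terms}
    for course, term in zip(courses, cycle(terms)):
        schedule[term].append(course)
    return schedule

for term, assigned in round_robin_schedule(courses, terms).items():
    print(f"{term:14s} -> {assigned}")
```

A course flagged as exceptional during one term could simply be re-queued for the following term instead of waiting for the next full cycle.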

21. Assessment Proposal: developing the assessment plan
- Determine whom we will assess. Educational objectives: graduates 1 to 3 years after graduation. Educational outcomes: current undergraduate students in the Department.
- Establish a schedule for assessment: the proposed re-accreditation timeline.
- Determine who will interpret the results: the area committees.
- Determine how the results will inform teaching/learning and decision making: the steering committees.
- Determine how and with whom we will share the interpretations: the Department, the College, the Office of Continuous Improvement and Assessment, ABET, Middle States, and CES, through periodic reports and self-studies.
- Decide how our department and institution will follow up on implemented changes: periodic assessment and continuous program quality improvement.

22. Assessment Schedule
- January 2006: Publish the assessment schedule
- June 2006: 1st assessment sampling
- December 2006: 2nd assessment sampling; start closing the loop
- June 2007: 3rd assessment sampling; continue closing the loop
- December 2007: 4th assessment sampling; finish the assessment cycle
- January 2008: Request re-accreditation to ABET
- July 2008: Submit Self-Study Report
- December 2009: ABET visit
Committee evaluations are interleaved with the assessment samplings.

23. Questions, discussion, and decision

