Outcomes Assessment in Computer Science
Robert Lingard, Assessment Coordinator
August 27, 2009

Outline
- The Importance of Assessment
- Problems with Assessment
- Goals of an Assessment Process
- An Approach to Assessment
- Applying the Software Engineering Paradigm
- The Annual Assessment Process at CSUN
- Conclusions

Importance of Assessment
- Assessment is required by accrediting boards (e.g., ABET).
- Those with vested interests (financial and otherwise) in education are demanding accountability.
- Without assessment it is difficult to know what changes to make to improve learning.

Problems with Assessment
- Pressure for assessment can cause a "rush to assessment" that produces meaningless results.
- Pressure to "close the loop" can cause premature decisions regarding program changes.
- There is a tendency to assess only high-performing areas in order to "showcase" a program.
- Results are often not validated.
- The effects of changes are often not assessed.

Goals of an Assessment Process
- To facilitate the direct assessment of student learning
- To measure students' retention of what they have learned and their ability to apply it
- To provide mechanisms to ensure the continuity of the process
- To be an efficient and natural extension of normal operations
- To satisfy the assessment requirements of ABET and the University

Approach to Assessment
- "The first and only goal [in education is to] teach for long-term retention and transfer." – Diane Halpern
- The best assessments of student learning are based on direct measures of achievement.
- How well students have learned is best assessed by looking at their ability to apply what they learned earlier to new situations.

Requirements for a Complete Assessment Process
- It must be comprehensive: it must cover the full range of learning outcomes.
- It must include multiple judgments: multiple sources of evidence must be used.
- It must include multiple dimensions: different facets of student performance must be included.
- It must collect direct evidence: direct measures of student attainment must be used.

The Software Engineering Paradigm
1. Understand the problem
2. Plan a solution
3. Carry out the plan
4. Make sure the solution is correct

Understanding the Problem
- Before beginning assessment, the first step is to decide what to assess, i.e., to pick the most important things to assess.
- Collecting information from faculty, students, alumni, and employers can give hints as to where learning problems exist.
- Surveys, faculty meetings, and student interviews are ways of collecting this information.

Planning a Solution
- Once a learning outcome is selected, a plan for conducting the assessment must be developed.
- It must be determined how the assessment will be done, e.g., with questions embedded in exams, a standard instrument, etc.
- How the results will be evaluated must also be determined; e.g., rubrics must be developed.
- It must also be determined who will do the various tasks required and when they will be done.

Carrying Out the Plan
- The steps of the plan must be carried out, and the plan must be monitored to ensure successful completion of the assessment.
- Someone must be designated as the lead, and this person has the responsibility for monitoring the plan.

Making Sure the Solution is Correct
- Assessment results must be validated; this step is often omitted.
- One way to validate results is to make comparisons among several independent assessments using different approaches.
- Failure to validate the results can result in program changes that may not be appropriate.
- After program changes are made, the effects of the changes must be assessed.
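
One way to make such a comparison concrete is to correlate scores from two independent assessments of the same students. Below is a minimal sketch, not part of the original deck; the paired scores are invented, and it assumes Python 3.10+ for statistics.correlation:

    import statistics

    # Hypothetical paired scores for the same ten students, measured two
    # independent ways: an embedded-question exam and a rubric-scored project.
    exam_scores   = [78, 85, 62, 90, 71, 88, 55, 80, 67, 93]
    rubric_scores = [ 8,  9,  6, 10,  7,  9,  5,  8,  6, 10]

    # Pearson correlation (statistics.correlation requires Python 3.10+).
    r = statistics.correlation(exam_scores, rubric_scores)
    print(f"Correlation between the two assessments: r = {r:.2f}")

    # A strong positive correlation suggests the two approaches measure the
    # same underlying learning; a weak one signals a validity problem.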

Reliability and Validity
[Figure: three diagrams contrasting "reliable but not valid," "valid but not reliable," and "reliable & valid."]
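
Reliability of a rubric-based instrument is often checked by having two raters score the same work and measuring their agreement beyond chance. The following pure-Python sketch of Cohen's kappa is an illustration added here, using made-up ratings on the 0-2 scale of the sample rubric shown later in this deck:

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: agreement between two raters, corrected for chance."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement: product of each rater's marginal frequencies per category.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical scores (0 = weak, 1 = acceptable, 2 = strong) from two
    # faculty members grading the same twelve papers.
    rater_1 = [2, 1, 1, 0, 2, 1, 2, 0, 1, 1, 2, 1]
    rater_2 = [2, 1, 0, 0, 2, 1, 2, 1, 1, 1, 2, 1]
    print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # 1.0 = perfect agreement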

Questions to Ask about an Assessment
- Was the instrument we used reliable?
- Did we measure what we intended to measure?
- If we used a sample, was the sample size large enough to be confident in the results?
- If we are comparing the results of two assessments, are the differences significant?
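
For the last two questions, a standard significance test makes the judgment concrete. A short sketch, again not from the deck itself: it assumes SciPy is available and uses invented rubric totals from two offerings of the same assessment:

    from scipy import stats

    # Hypothetical rubric totals from two offerings of the same assessment.
    fall   = [8, 7, 9, 6, 8, 7, 10, 6, 7, 8]
    spring = [9, 8, 10, 7, 9, 8, 10, 8, 9, 9]

    # Welch's two-sample t-test (does not assume equal variances).
    result = stats.ttest_ind(spring, fall, equal_var=False)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

    # With samples this small, only a fairly large difference reaches
    # significance, which is why sample size matters for confidence.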

Iteration
- Like software engineering, this process is iterative.
- Once program changes have been made, they must be assessed to determine whether the desired result has been accomplished.
- Analysis of this reassessment might indicate the need for further changes.

The Annual Assessment Process at CSUN
1. Department forms the Assessment Committee. The program is divided into seven areas, and a coordinator is chosen for each area. These coordinators constitute the Assessment Committee.
2. Assessment Committee recommends outcomes/objectives to be assessed. The Assessment Committee considers results from previous assessments, informal assessments, and the length of time since particular outcomes were last assessed to determine the current set to be assessed.
3. Department approves assessment goals, and assessment plans are prepared. After department approval, the Assessment Committee decides on an assessment approach and develops a schedule of activities.

The Assessment Committee Consists of the Program Area Coordinators

Fundamental Concepts (Rick Covington)
  Courses: Intro. to Algorithms & Prog.; Data Structures & Prog. Design; Advanced Data Structures; plus electives
  Outcomes: Demonstrate an understanding of algorithms and data structures. Demonstrate proficiency in using a high-level computer language. Demonstrate a problem solving ability.

Systems (Robert McIlhenny)
  Courses: Computer Architecture; Computer Organization; Operating Systems & Sys. Arch.; plus electives
  Outcomes: Demonstrate an understanding of computer organization and architecture.

Language/Theory (Gloria Melara)
  Courses: Concepts of Prog. Languages; Automata, Languages & Comp.; Discrete Mathematics; Combinatorial Algorithms; Symbolic Logic; plus electives
  Outcomes: Demonstrate an understanding of programming language concepts and knowledge of a variety of programming language paradigms. Demonstrate an ability to apply mathematical skills appropriate to the computer science discipline.

Software Engineering (George Wang)
  Courses: Intro. to Software Engineering; plus electives
  Outcomes: Demonstrate proficiency in collecting, analyzing, and interpreting data. Demonstrate an understanding of emerging technologies and a working knowledge of currently available software tools. Demonstrate an understanding of the principles and practices for software design and development. Be able to apply the principles and practices for software design and development to real problems.

Societal Issues (Peter Gabrovsky)
  Courses: Societal Issues in Computing; plus electives
  Outcomes: Demonstrate an awareness of the evolution and dynamic nature of the foundational core of computer science. Demonstrate knowledge of the social impact of computing. Demonstrate an understanding of the professional and ethical considerations of computing.

Communications (Diane Schwartz)
  Courses: Intro. to Software Engineering; Societal Issues in Computing; plus electives
  Outcomes: Be able to effectively communicate orally. Be able to effectively communicate in written form. Be able to work effectively on a team.

Lifelong Learning (Jack Alanen)
  Courses: Societal Issues in Computing; plus electives
  Outcomes: Demonstrate the knowledge and capabilities necessary for pursuing a professional career or graduate studies. Demonstrate recognition of the need for, and ability for, continuing professional development.

Student Learning Outcomes
1. Demonstrate an understanding of algorithms and data structures.
2. Demonstrate an understanding of computer organization and architecture.
3. Demonstrate an understanding of programming language concepts and knowledge of a variety of programming language paradigms.
4. Demonstrate proficiency in using a high-level computer language.
5. Demonstrate an ability to apply mathematical skills appropriate to the computer science discipline.
6. Demonstrate an awareness of the evolution and dynamic nature of the foundational core of computer science.
7. Demonstrate proficiency in collecting, analyzing, and interpreting data and information.
8. Demonstrate a problem solving ability.
9. Demonstrate an understanding of emerging technologies and a working knowledge of currently available software tools.
10. Demonstrate an understanding of the principles and practices for software design and development.
11. Be able to apply the principles and practices for software design and development to real world problems.
12. Be able to effectively communicate orally.
13. Be able to effectively communicate in written form.
14. Be able to work effectively on a team.
15. Demonstrate knowledge of the social impact of computing.
16. Demonstrate an understanding of the professional and ethical considerations of computing.
17. Demonstrate the knowledge and capabilities necessary for pursuing a professional career or graduate studies.
18. Recognize the need for, and show an ability for, continuing professional development.

Program Educational Objectives
A few years after graduation, graduates of the computer science program will:
1. Be able to apply the principles of computer science, mathematics, and scientific investigation to solve real world problems appropriate to the discipline.
2. Be able to apply current industry accepted computing practices and new and emerging technologies to analyze, design, implement, and verify high quality computer-based solutions to real world problems.
3. Exhibit teamwork and effective communication skills.
4. Be able to positively and appropriately apply knowledge of societal impacts of computing technologies in the course of career related activities.
5. Be successfully employed or accepted into a graduate program, and demonstrate a pursuit of lifelong learning.

Sample Assessment Plan
Learning Outcome: Be able to effectively communicate in written form

Tool(s) to be Used in the Assessment:
1. The Writing Proficiency Exam (WPE)
2. Term papers written in Comp 450

Assessment Activities to be Performed, When and by Whom:
1. Establish minimum and average WPE score values considered to be acceptable for each version of the WPE (e.g., 3% better than the average scores over the past 5 years). – May 2005, by the Software Engineering Course Group.
2. Collect the WPE scores for all Computer Science students over the last 5 years (noting which version was taken). – September 2005, obtained from Institutional Research by the Software Engineering Course Group.
3. Statistically analyze the WPE scores relative to past scores of similar students. – October 2005, by the Software Engineering Course Group.
4. Create a set of criteria (rubric) for assessing the quality of student written work, including the setting of standards for acceptability. – November 2005, by the Software Engineering Course Group.
5. Gather the term papers written by all students currently taking Comp 450. – December 2005, obtained from the Comp 450 instructors by the Software Engineering Course Group.
6. Select a sample of the term papers of graduating students and assess them according to the established criteria. – January 2006, by a committee selected by the department for the assessment of written communication.
7. Prepare an assessment summary report for the department, including recommendations for program improvement. – February 2006, by the Assessment Committee.
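
The acceptability target in activity 1 can be computed mechanically once the historical scores are in hand. A minimal sketch added here for illustration: the 3% figure comes from the plan above, but all score values are invented:

    import statistics

    # Hypothetical mean WPE scores for the past five years (one exam version).
    past_year_means = [7.0, 7.2, 6.9, 7.1, 7.3]

    # Target from the plan: 3% better than the five-year average.
    baseline = statistics.mean(past_year_means)
    target = baseline * 1.03
    print(f"Five-year average: {baseline:.2f}, acceptability target: {target:.2f}")

    # Hypothetical current-cohort scores collected in activity 2.
    current = [7.4, 7.1, 7.6, 7.2, 7.5, 7.0, 7.3]
    current_mean = statistics.mean(current)
    verdict = "meets" if current_mean >= target else "falls short of"
    print(f"Current mean {current_mean:.2f} {verdict} the target.")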

The Annual Assessment Process at CSUN
4. Department approves assessment plans, and assessments are conducted. Program Area coordinators ensure that the planned assessment activities are completed and that a final report is prepared.
5. Assessments are analyzed, and program changes are recommended. The Assessment Committee analyzes the results of the completed assessments and makes recommendations for program changes.
6. Department reviews recommendations and makes program changes that are determined appropriate. At a department meeting the recommendations of the Assessment Committee are discussed, and proposed program changes are made only if there is approval from the department as a whole.

Sample Rubric
Learning Outcome: Be able to effectively communicate in written form

Assessment Rubric for Written Communication (Strong = 2, Acceptable = 1, Weak = 0)

EFFECTIVENESS OF THE THESIS: Term papers written in an academic context are expected to contain a thoughtful and insightful thesis, main idea, position, or claim that is sustained throughout the paper.
  Strong: The thesis is clear, insightful, and thought-provoking. It is sustained consistently throughout the paper.
  Acceptable: The thesis is clear and plausible. It is sustained consistently throughout the paper.
  Weak: The thesis is weak or absent. It is not sustained throughout the paper.

RESPONSE TO ASSIGNMENT: Papers written in an academic context are expected to address the topic and issues set forth in the assignment and address all aspects of the writing task. This usually requires some discussion and refutation of an opposing viewpoint.
  Strong: The paper responds to the assignment and addresses the topic and issues. Discussion of a counter-argument is included when appropriate.
  Acceptable: The paper responds to the assignment and addresses the topic and issues. Some discussion of a counter-argument is included when appropriate.
  Weak: The paper does not respond to the assignment or treats the assignment in a superficial, simplistic, or disjointed manner. Little or no discussion of a counter-argument is included.

SUPPORT: Papers written in an academic context are expected to provide support for main points with reasons, explanations, and examples that are appropriate for the intended audience.
  Strong: The thesis is fully and convincingly developed, supported with good reasons, explanations, and examples.
  Acceptable: The thesis is adequately developed, supported with reasons, explanations, and examples.
  Weak: The thesis is inadequately developed, unsupported with reasons, explanations, and examples.

ORGANIZATION: Papers written in an academic context are expected to be well-organized, in both overall structure and paragraphs.
  Strong: The paper is well-structured; its form contributes to its purpose. Paragraphs are well-organized and carefully linked to the thesis.
  Acceptable: The paper is generally well-structured, with only a few flaws in overall organization. Paragraphs are adequately organized and generally linked to the thesis.
  Weak: The paper is poorly structured; organizational flaws undermine its effectiveness. Paragraphs are not well organized, nor are they linked to the thesis.

STYLE: Papers written in an academic context are expected to be stylistically effective, that is, to contain well-structured sentences, well-chosen words, and an appropriate tone, as a means of achieving their purpose.
  Strong: The sentence structure, word choice, fluency, and tone of the paper enhance its effectiveness and reinforce its purpose.
  Acceptable: The sentence structure, word choice, fluency, and tone of the paper contribute to its effectiveness and adequately support its purpose.
  Weak: The sentence structure, word choice, fluency, and tone of the paper detract from its effectiveness or are inappropriate to its purpose.

GRAMMAR AND MECHANICS: Papers written in an academic context are expected to maintain sentence-level correctness in terms of syntax, grammar, spelling, punctuation, and format.
  Strong: The paper is correct in terms of its syntax, grammar, spelling, punctuation, and format.
  Acceptable: Sentence-level errors do not seriously detract from the paper's effectiveness.
  Weak: Sentence-level errors are so frequent and disruptive that they detract from the paper's effectiveness.
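
Scoring with this rubric reduces to summing a 0-2 rating for each of the six dimensions, giving a total out of 12 per paper. The sketch below illustrates that bookkeeping; the dimension names come from the rubric above, but the acceptability cutoff and the sample ratings are hypothetical:

    DIMENSIONS = [
        "thesis", "response_to_assignment", "support",
        "organization", "style", "grammar_and_mechanics",
    ]

    # Hypothetical cutoff: at least "acceptable" (1) per dimension on average,
    # i.e., a total of 6 or more out of 12.
    ACCEPTABLE_TOTAL = 6

    def score_paper(ratings: dict[str, int]) -> int:
        """Sum the 0-2 ratings across all six rubric dimensions."""
        assert all(0 <= ratings[d] <= 2 for d in DIMENSIONS)
        return sum(ratings[d] for d in DIMENSIONS)

    paper = {
        "thesis": 2, "response_to_assignment": 1, "support": 1,
        "organization": 2, "style": 1, "grammar_and_mechanics": 2,
    }
    total = score_paper(paper)
    print(f"Total {total}/12:", "acceptable" if total >= ACCEPTABLE_TOTAL else "weak")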

Sample Assessment Results
Learning Outcome: Demonstrate an understanding of the principles and practices for software design and development

Sample Program Improvements
Learning Outcome: Demonstrate an understanding of the principles and practices for software design and development

The following recommendations for program improvement were made based on the results of the assessment:
1. Modify the course objectives for introductory computer science courses to include an introduction to software engineering concepts.
   STATUS: This was approved by the department, and the members of the Software Engineering Course Group will provide information to the Fundamental Concepts Course Group to assist them in appropriately revising the course objectives for the introductory courses.
2. Modify the course objectives for elective courses with software engineering projects to include the reinforcement of software engineering concepts.
   STATUS: This was approved by the department, and the members of the Software Engineering Course Group will provide information to those teaching elective courses to assist them in appropriately revising the course objectives to reinforce software engineering concepts.
3. Add a senior software engineering design project as a requirement for graduation.
   STATUS: After extensive discussions by the department regarding ways of implementing such a requirement, a committee was formed to develop a recommended approach and report back to the department for approval.

Conclusions
- The process has been used successfully for program assessment.
- It has been embedded into the normal operations of the department and become accepted by faculty.
- Since all faculty are involved, no one has become overburdened.
- It has provided an effective means for directly assessing student learning with a focus on retention and transfer.
- More attention is needed in the areas of validating results and ensuring the reliability of the assessment instruments used.