A Process for the Direct Assessment of Program Learning Outcomes Based on the Principles and Practices of Software Engineering Robert W. Lingard California State University, Northridge ASEE Annual Conference June 25, 2007

Outline
- The Importance of Assessment
- Problems with Assessment
- Goals of an Assessment Process
- An Approach to Assessment
- Applying the Software Engineering Paradigm
- The Annual Assessment Process at CSUN
- Conclusions

Importance of Assessment
- Assessment is required by accrediting boards (e.g., ABET).
- Those with vested interests (financial and otherwise) in education are demanding accountability.
- Without assessment it is difficult to know what changes to make to improve learning.

Problems with Assessment
- Pressure for assessment can cause a “rush to assessment” and produce meaningless results.
- Pressure to “close the loop” can cause premature decisions regarding program changes.
- There is a tendency to assess only high-performance areas in order to “showcase” a program.
- Results are often not validated.
- The effects of changes are often not assessed.

Goals of an Assessment Process
- To facilitate the direct assessment of student learning
- To measure student retention and the ability to apply what they have learned
- To provide mechanisms to ensure the continuity of the process
- To be an efficient and natural extension of normal operations
- To satisfy the requirements of ABET and the University for assessment

Approach to Assessment
- “The first and only goal [in education is to] teach for long-term retention and transfer.” – Diane Halpern
- The best assessments of student learning are based on direct measures of achievement.
- How well students have learned is best assessed by looking at their ability to apply what they learned earlier to new situations.

Requirements for a Complete Assessment Process
- It must be comprehensive – it must cover the full range of learning outcomes.
- It must include multiple judgments – multiple sources of evidence must be used.
- It must include multiple dimensions – different facets of student performance must be included.
- It must collect direct evidence – direct measures of student attainment must be used.

The Software Engineering Paradigm
- Requirements Analysis (understanding the problem)
- Software Design (planning a solution)
- Implementation/Coding (carrying out the plan)
- Testing/Validation (making sure the solution is correct)
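The four-phase analogy can be sketched as a simple data structure. The phase descriptions below are paraphrases of the slides, and all names in the code are illustrative, not part of the talk:

```python
# Sketch (names invented): the four software engineering phases mapped
# onto one pass of an outcomes-assessment cycle.

PHASES = [
    ("Requirements Analysis", "decide which learning outcome to assess"),
    ("Design", "plan instruments, rubrics, schedule, and responsibilities"),
    ("Implementation", "carry out and monitor the planned assessment"),
    ("Testing/Validation", "cross-check results before recommending changes"),
]

def describe_cycle(phases=PHASES):
    """Return one line per phase, pairing it with its assessment analogue."""
    return [f"{phase}: {analogue}" for phase, analogue in phases]
```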

Understanding the Problem
The first step, before any assessment begins, is to decide what to assess – i.e., to pick the most important things to measure. Information collected from faculty, students, alumni, and employers can suggest where learning problems exist. Surveys, faculty meetings, and student interviews are ways of collecting this information.

Planning a Solution
Once a learning outcome is selected, a plan for conducting the assessment must be developed. It must be determined how the assessment will be done – e.g., with questions embedded in exams, a standard instrument, etc. How the results will be evaluated must also be determined – e.g., rubrics must be developed. Finally, it must be determined who will do the various tasks required and when they will be done.
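As a purely hypothetical illustration of the rubric step, one could score an embedded exam question on a 4-level rubric and summarize attainment. The levels, target threshold, and scores below are all invented for the sketch:

```python
# Hypothetical rubric sketch: scoring one embedded exam question against
# a 4-level rubric and summarizing attainment of a learning outcome.

RUBRIC_LEVELS = {1: "unsatisfactory", 2: "developing", 3: "satisfactory", 4: "exemplary"}
TARGET_LEVEL = 3  # invented: minimum level counted as "meeting the outcome"

def percent_meeting(scores, target=TARGET_LEVEL):
    """Percentage of students scoring at or above the target rubric level."""
    if not scores:
        return 0.0
    return 100.0 * sum(s >= target for s in scores) / len(scores)

scores = [4, 3, 2, 3, 1, 4, 3]        # one question, seven students (invented)
attainment = percent_meeting(scores)  # 5 of 7 students at level 3 or above
```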

Carrying Out the Plan
The steps of the plan must be carried out, and progress must be monitored to ensure successful completion of the assessment. Someone must be designated as the lead, with responsibility for monitoring the plan.

Making Sure the Solution is Correct
The assessment results obtained must be validated – a step that is often omitted. One way to validate results is to compare several independent assessments that use different approaches. Failure to validate the results can lead to program changes that are not appropriate. After program changes are made, their effects must be assessed.

Iteration
Like software engineering, this process is iterative. Once program changes have been made, they must be assessed to determine whether the desired result has been achieved. Analysis of this reassessment might indicate the need for further changes.

The Annual Assessment Process at CSUN
- Department forms Assessment Committee: The program is divided into seven areas, and a coordinator is chosen for each area. These coordinators constitute the Assessment Committee.
- Assessment Committee recommends outcomes to be assessed: The Assessment Committee considers results from previous assessments, informal assessments, and the length of time since particular outcomes were last assessed to determine the current set to be assessed.
- Department approves assessment goals, and assessment plans are prepared: After department approval, the Assessment Committee decides on an assessment approach and develops a schedule of activities.

The Annual Assessment Process at CSUN (continued)
- Department approves assessment plans, and assessments are conducted: Program area coordinators ensure that the planned assessment activities are completed and that a final report is prepared.
- Assessment results are analyzed, and program changes are recommended: The Assessment Committee analyzes the results of the completed assessments and makes recommendations for program changes.
- Department reviews recommendations and makes program changes determined to be appropriate: At a department meeting, the recommendations of the Assessment Committee are discussed, and proposed program changes are made only if the department as a whole approves.
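The annual cycle alternates between committee work and whole-department approval, which can be made explicit as a checklist. The owner/task split below paraphrases the slides; the structure itself is an illustrative assumption:

```python
# Sketch of the annual cycle as data, making explicit which group owns
# each step. Step wording is paraphrased; the encoding is illustrative.

ANNUAL_CYCLE = [
    ("Department", "form Assessment Committee of program-area coordinators"),
    ("Assessment Committee", "recommend outcomes to be assessed"),
    ("Department", "approve assessment goals; plans are prepared"),
    ("Department", "approve assessment plans; assessments are conducted"),
    ("Assessment Committee", "analyze results and recommend program changes"),
    ("Department", "review recommendations and make approved changes"),
]

def steps_owned_by(owner):
    """List the step descriptions owned by a given group."""
    return [task for who, task in ANNUAL_CYCLE if who == owner]
```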

Conclusions
- The process has been used successfully for program assessment at CSUN.
- It has been embedded into the normal operations of the department and has become accepted by faculty.
- Since all faculty are involved, no one has become overburdened.
- It has provided an effective means for directly assessing student learning, with a focus on retention and transfer.