1
Assessment Techniques for Curricular Improvement
Roxanne Canosa and Rajendra K. Raj, Department of Computer Science, Rochester Institute of Technology
2
Overview
What is Assessment?
Analytic vs. Holistic Approaches
Assessment == Grading?
Terminology
Assessment vs. Accreditation
Outcomes vs. Objectives
Performance Criteria
Direct vs. Indirect
Evaluation and Continuous Improvement
3
What is Assessment?
“Assessment is one or more processes that identify, collect, and prepare data to evaluate the achievement of program outcomes and educational objectives.”
– Criteria for Accrediting Computing Programs, Appendix A (Proposed Changes); from Section II.D.1 of the ABET Accreditation Policy and Procedure Manual
4
Analytic vs. Holistic Approaches
Analytic approach: all students/courses are analyzed to diagnose areas in need of improvement
Holistic approach: focus on the overall performance of the program, with input from employers, alumni, and the advisory board
Develop efficient and effective processes: a “lean, mean assessment machine”
Don’t commit “random acts of assessment” – Gloria Rogers
5
What is Your Assessment Goal?
Assessing all students or specific groups of students?
Assessing students, the department, or the program?
Assessing for short-term improvement or long-term effect?
Assessing for formative or summative purposes?
6
Grading vs. Assessing
Grading: measures the extent to which a student meets faculty requirements and expectations for a course. Can a student’s achievement of an outcome be inferred from grades? Factors that muddy the inference: student knowledge, work ethic, and faculty variance in course content, grading components, beliefs, bias, …
Assessing: measures the extent to which a student achieves each course (program) outcome. Can we leverage grading components for assessment? Yes: use rubrics, which are pre-announced performance criteria.
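One way to leverage grading components is to tag each graded item with the course outcome it measures and roll the points up per outcome. The following is a minimal, hypothetical Python sketch; the item names, outcome labels, and 70% target are assumptions, not anything prescribed by ABET or the presenters.

```python
# Minimal sketch: reusing graded items for outcome assessment.
# Item names, outcome labels, and the 70% target are hypothetical.

# Each grading item is tagged with the course outcome it measures.
ITEM_OUTCOMES = {
    "exam1_q3": "CO1 written communication",
    "exam2_q1": "CO2 algorithm analysis",
    "project_report": "CO1 written communication",
}

# One student's scores, as (points earned, points possible) per item.
student_scores = {
    "exam1_q3": (8, 10),
    "exam2_q1": (15, 20),
    "project_report": (42, 50),
}

def outcome_achievement(scores, item_outcomes):
    """Aggregate earned/possible points per course outcome."""
    totals = {}
    for item, (earned, possible) in scores.items():
        outcome = item_outcomes[item]
        e, p = totals.get(outcome, (0, 0))
        totals[outcome] = (e + earned, p + possible)
    return {o: e / p for o, (e, p) in totals.items()}

for outcome, ratio in outcome_achievement(student_scores, ITEM_OUTCOMES).items():
    status = "met" if ratio >= 0.70 else "not met"  # assumed 70% target
    print(f"{outcome}: {ratio:.0%} ({status})")
```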
7
Assessment vs. Accreditation
Institutional accreditors (Middle States, SACS, etc.) are increasingly requiring direct assessment of program objectives and outcomes
The jargon may differ, but the essential ideas are the same
8
Terminology (Jargon): from the ABET perspective
Outcomes: describe what students are expected to know and be able to do by graduation (other terms: objectives, goals, standards)
Assessment: processes to identify, collect, analyze, and report data to evaluate achievement
Evaluation: process to review the results of data collection and analysis to determine the value of findings and future action(s) (other term: assessment)
Educational practices: mapping the curriculum (coursework, internships, etc.) to outcomes (other term: educational strategies)
Objectives: describe expected accomplishments of graduates 3-5 years after graduation (other terms: goals, outcomes, standards)
Performance criteria: statements to measure performance on outcomes, backed by evidence (other terms: standards, rubrics, metrics)
9
Terminology Lessons
Use the terminology that fits your situation
Sometimes it is dictated by institutional accreditation (SACS, Middle States)
Sometimes it is dictated by program accreditation (ABET)
Keep a glossary of terms handy for any external evaluators
Stick to your terminology: terms are not fungible, and swapping them causes needless grief
10
Proposed Changes to ABET Criteria for Computing
Old criteria: Intents and Standards
New criteria ( cycle): General and Program-Specific
11
New ABET Criteria
8 General Criteria: Students; Program Educational Objectives; Program Outcomes (a) through (i); Assessment and Evaluation; Curriculum; Faculty; Facilities; Support
CS Program-Specific Criteria: Outcomes and Assessment (a) and (b); Faculty Qualifications; Curriculum (a), (b), and (c)
IT/IS Program-Specific Criteria
12
Program Audit: Concern
“A criterion is currently satisfied; however, potential exists for this situation to change in the near future such that the criterion may not be satisfied. Positive action is required to ensure full compliance with the Criteria.”
13
Program Audit: Weakness
“A criterion is currently satisfied but lacks strength of compliance that assures that the quality of the program will not be compromised prior to the next general review. Remedial action is required to strengthen compliance with the Criteria.”
14
Program Audit: Deficiency
“A criterion is not satisfied. Therefore, the program is not in compliance with the Criteria and immediate action is required.”
15
Program Objectives
“Program educational objectives are broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve.”
Long-term goals
Should be distinct to your program
Should be publicly available
Must be measurable!
16
Program Outcomes
“Program outcomes are narrower statements that describe what students are expected to know and be able to do by the time of graduation. These relate to the skills, knowledge, and behaviors that students acquire in their matriculation through the program.”
Should be publicly available
Must be measurable!
17
Objectives vs. Outcomes
Example objective: Graduates will exhibit effective communication skills
Example outcomes: By the time of graduation, students will:
demonstrate effective written communication skills
demonstrate effective oral communication skills
- Gloria Rogers
18
Performance Criteria
Define and describe progression toward meeting the important components of the work being completed, critiqued, or assessed. For example:
The student provides adequate detail to support his/her solution/argument
The student uses language and word choice appropriate for the audience
The student’s work demonstrates an organizational pattern that is logical and conveys completeness
The student uses the rules of standard English
Well-stated performance criteria provide solid evidence of progression
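Criteria like these translate naturally into a rubric that can be applied consistently across sections. The following is a minimal, hypothetical Python sketch; the 1-4 rating scale and the "acceptable" threshold of 3 are assumptions, not something the slides prescribe.

```python
# Minimal sketch of a rubric built from pre-announced performance criteria.
# The 1-4 rating scale and the "acceptable" threshold of 3 are assumptions.

CRITERIA = [
    "provides adequate detail to support the solution/argument",
    "uses language and word choice appropriate for the audience",
    "organization is logical and conveys completeness",
    "follows the rules of standard English",
]

def score_submission(ratings, acceptable=3):
    """ratings: one 1-4 score per criterion, in CRITERIA order."""
    if len(ratings) != len(CRITERIA):
        raise ValueError("one rating per criterion is required")
    return {
        criterion: {"rating": r, "acceptable": r >= acceptable}
        for criterion, r in zip(CRITERIA, ratings)
    }

# Example: one student's written report, rated by an instructor.
for criterion, result in score_submission([4, 3, 2, 4]).items():
    flag = "OK" if result["acceptable"] else "needs work"
    print(f"{criterion}: {result['rating']} ({flag})")
```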
19
What is Solid Evidence?
Direct evidence: easier to measure and familiar to most faculty (exam or project grades, presentation skills, etc.)
Indirect evidence: harder to measure; captures attitudes or perceptions. For example, a desired outcome of a course may include “improving students’ appreciation of teamwork.”
20
Direct vs. Indirect Assessment
The assessment process should include both direct and indirect measurement techniques
A variety of sources should be used: employers, students, alumni, etc.
Converging evidence from multiple sources can reduce the effect of any inherent bias in the data
21
Direct Assessment
Direct examination or observation of student knowledge or skills against stated, measurable outcomes
Faculty typically assess student learning throughout a course using exams/quizzes, demonstrations, and reports
These measures sample what students know or can do and provide evidence of student learning
22
Direct Assessment of PEOs
Employment statistics
Promotions and career advancement of graduates
Job titles, advanced degrees earned, additional coursework taken after graduation, etc.
PEOs must be assessed separately from POs
23
Direct Assessment of POs
Common final exams
Locally developed exit exams
Standardized regional or national exit exams
External examiner
Co-op reports from employers
Portfolios of student work
24
Indirect Assessment
Indirect assessment of student learning ascertains the perceived extent or value of learning experiences
It assesses opinions or thoughts about student knowledge or skills
It provides information about students’ perceptions of their learning and about how that learning is valued by different constituencies
25
Indirect Assessment Measures
Exit and other kinds of interviews
Archival data
Focus groups
Written surveys and questionnaires
Industrial advisory boards
Employers
Job fair recruiters
Faculty at other schools
26
Survey of Assessment Methods
The following methods were surveyed and classified as direct and/or indirect measures: standardized exams, locally designed exams, oral exams, portfolios, performance appraisal, simulations, behavioral observations, external examiner, written & other surveys, exit & other interviews, focus groups, archived records
27
Direct and Indirect
Some instruments can serve both purposes, e.g., an exit interview
Indirect: a survey of opinions about the perceived value of the program’s components
Direct: if the person asking the questions also uses the interview to assess the student’s skills (e.g., oral communication), then it is being used as a direct measure of the achievement of that outcome
28
Evaluation
“Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment practices. Evaluation determines the extent to which program outcomes or program educational objectives are being achieved, and results in decisions and actions to improve the program.”
29
Continuous Improvement
Accreditation boards are moving toward outcomes-based assessment of CS, IS, and IT programs
Programs must have an established outcomes-based assessment plan in place (or at least be making progress in that direction)
The process must be documented
The process must show continuous improvement, both quantitatively and qualitatively
30
Faculty Responsibility
All faculty must have a commitment to and be directly involved in the evaluation of program educational objectives and program outcomes, as well as the process for continuous improvement of the program
31
Need for Faculty and Staff Buy-In
What makes most academics tick? Rewards: money? fun? appreciation? recognition?
How do we encourage involvement? We all resent extra work!
32
Where to Begin?
Define your Mission Statement
Define your Program Educational Objectives (PEOs)
Define your Program Outcomes (POs)
Define Course Outcomes (COs) and include specific course outcomes on each course syllabus
Make all of these publicly available
33
Then What?
Show how course outcomes map to program outcomes
Show how program outcomes map to program educational objectives
Choose measurement tools, both direct and indirect
Collect data
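The outcome mappings can be kept as a simple matrix that also flags any program outcome not covered by a course. The following is a hypothetical Python sketch; the course names and outcome identifiers are invented for the example.

```python
# Minimal sketch: map course outcomes to program outcomes (POs)
# and flag POs that no course covers. All identifiers are hypothetical.

PROGRAM_OUTCOMES = {"PO-a", "PO-b", "PO-c", "PO-d"}

# Program outcomes supported by each course's outcomes.
COURSE_TO_POS = {
    "CS1": {"PO-a", "PO-b"},
    "CS2": {"PO-a", "PO-c"},
    "Software Engineering": {"PO-b", "PO-c"},
}

def coverage(course_to_pos, program_outcomes):
    """Return (covered, uncovered) program outcomes across all courses."""
    covered = set().union(*course_to_pos.values())
    return covered & program_outcomes, program_outcomes - covered

covered, uncovered = coverage(COURSE_TO_POS, PROGRAM_OUTCOMES)
print("Covered POs:", sorted(covered))
print("POs not covered by any course:", sorted(uncovered))  # here: PO-d
```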
34
Finally
Present the data to faculty in an easily digestible form: charts, graphs, tables, etc.
Faculty evaluate the data: are students actually learning the material that the faculty believe (and claim) they are learning?
Faculty make recommendations for improvement as necessary
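As one possible way to make the data digestible for a faculty meeting, per-student results can be rolled up into a short per-outcome summary. The following is a hypothetical Python sketch; the outcome names, scores, and the 70% target are assumptions.

```python
# Minimal sketch: roll per-student results into a per-outcome summary
# for a faculty meeting. Outcome names, scores, and the 70% target
# are hypothetical.

results = {  # percentage scores per program outcome, one entry per student
    "PO-a written communication": [82, 71, 64, 90, 77],
    "PO-b problem solving": [58, 66, 72, 61, 70],
}

TARGET = 70  # assumed percent score needed to "meet" the outcome

print(f"{'Outcome':<28} {'Avg':>6} {'% meeting target':>18}")
for outcome, scores in results.items():
    avg = sum(scores) / len(scores)
    meeting = 100 * sum(s >= TARGET for s in scores) / len(scores)
    print(f"{outcome:<28} {avg:>6.1f} {meeting:>17.0f}%")
```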
35
The Big Picture
[Diagram: the continuous-improvement loop. The Mission Statement and stakeholders (students, alumni, employers, faculty, …) inform Program Objectives, Program Outcomes, Performance Criteria, Course Outcomes, and Educational Practices/Strategies; the program then assesses (collects and analyzes evidence), evaluates (interprets evidence), takes action, and revises.]
36
The Big Picture
Show the relationships among the mission statement, objectives, and outcomes
Assess and evaluate objectives and outcomes independently
Map program outcomes to program objectives
Map course outcomes to program outcomes
Identify weaknesses and implement focused improvements in targeted areas
37
Issues
All assessment methods have limitations and contain some bias
Meaningful analysis requires both direct and indirect measures from a variety of sources: students, alumni, faculty, employers, etc.
Multiple assessment methods provide converging evidence of student learning
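One lightweight way to check for convergence is to line up each outcome's direct and indirect measures and flag large disagreements for discussion. The following is a hypothetical Python sketch; the measure names, values, and the 15-point spread threshold are assumptions.

```python
# Minimal sketch: compare direct and indirect measures of the same
# outcome and flag large disagreements. Measure names, values, and the
# 15-point spread threshold are hypothetical.

measures = {
    "oral communication": {
        "direct: capstone presentation rubric": 0.78,  # fraction meeting target
        "indirect: alumni survey (self-rating)": 0.91,
        "indirect: employer survey": 0.70,
    },
}

for outcome, by_method in measures.items():
    values = by_method.values()
    spread = max(values) - min(values)
    print(f"{outcome}:")
    for method, value in by_method.items():
        print(f"  {method}: {value:.0%}")
    verdict = "evidence converges" if spread <= 0.15 else "methods disagree; investigate"
    print(f"  -> {verdict} (spread {spread:.0%})")
```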
38
Assessment Lessons
You cannot do everything at once: try an approach for the first round, then learn and refine
Having data isn’t all there is to it! It is easy to generate lots of bad data
One size fits all … NOT! Programs, courses, and instructors all differ
Be ready to compromise: perfection is neither possible nor desirable
Faculty evaluation and promotion: do not tie them to data generated from assessment
39
Resources
http://www.cs.rit.edu/~rlc/Assessment/