1
A Guide for College Assessment Leaders
Ursula Waln, Director of Student Learning Assessment
Central New Mexico Community College
2
› Rubrics help reduce subjectivity in evaluations that are inherently subjective
  – Appropriate when:
    › Learning manifests in varying degrees of sophistication
    › Formative attainment goals can be described
    › Evaluation seeks to differentiate levels of performance
  – “Either/or” dichotomies (such as right or wrong, present or not, can do it or cannot) are more appropriately evaluated with tests or checklists
3
› Clarify program goals
  – Promote a shared understanding of the desired student learning outcomes (SLOs)
  – Help faculty see connections between course and program learning outcomes
4
› Serve as norming devices
  – Identify benchmarking indicators of achievement
  – Describe progressive levels of competency development in ways that clearly guide ratings
› Help aggregate diverse classroom assessments
  – Provide the means to translate findings into a common program assessment language (see the sketch below)
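To make the “common language” idea concrete, here is a minimal Python sketch of rolling course-level rubric ratings up into a program-level summary. The courses, criteria, and scores are invented for illustration; only the idea of a shared 0–3 point scale comes from the sample rubric on the next slide.

```python
# Hypothetical illustration (not from the presentation): aggregating
# course-level rubric ratings onto a shared 0-3 program scale.
from collections import defaultdict
from statistics import mean

# Each record: (course, rubric criterion, rating on the shared 0-3 scale).
# Course names and ratings are invented for illustration.
ratings = [
    ("ENGL 1110", "organization", 3),
    ("ENGL 1110", "audience awareness", 2),
    ("COMM 1130", "organization", 2),
    ("COMM 1130", "audience awareness", 3),
]

# Group ratings by criterion, regardless of which course they came from
by_criterion = defaultdict(list)
for course, criterion, score in ratings:
    by_criterion[criterion].append(score)

# Program-level summary: mean attainment per criterion across all courses
for criterion, scores in sorted(by_criterion.items()):
    print(f"{criterion}: mean {mean(scores):.2f} across {len(scores)} ratings")
```

Because every course reports on the same criteria and scale, results from very different classroom assessments can be combined without forcing the assessments themselves to be identical.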
5
| In written, oral, numeric, or visual formats, the student will: | Did it awesomely (Mastery, 90–100%, A, 3 points) | Did it (Proficient, 80–89%, B, 2 points) | Kind of did it (Developing, 70–79%, C, 1 point) | Didn’t do it (Non-attempt or Emerging, 0–69%, D–F, 0 points) |
|---|---|---|---|---|
| Demonstrate organization and/or coherence of ideas, content, and/or formulas | Material is sharply focused and organized. The student presents a logical organization of ideas around a common theme that demonstrates an advanced understanding of the subject matter. | Material is mostly focused and organized. The student presents logical constructions around a common theme that reflects meaning and purpose. | The student’s ideas and organizational patterns reflect a common theme that demonstrates a basic understanding of the subject matter. Ideas are disorganized or may lack development in some places. | The material lacks focus and organization, with few or no ideas around a common theme. The student struggles to demonstrate her/his understanding of the subject matter. |
| Produce communication appropriate to audience, situation, venue, and/or context | Demonstrates a thorough understanding of context, audience, and purpose that is responsive to the assigned task(s) and focuses on all elements of the work. | Demonstrates adequate consideration of context, audience, and purpose and a clear focus on the assigned task(s) (e.g., the task aligns with audience, purpose, and context). | Demonstrates a basic awareness of context, audience, purpose, and the assigned task(s) (e.g., begins to show awareness of the audience’s perceptions and assumptions). | Struggles to demonstrate attention to context, audience, purpose, and the assigned task(s) (e.g., expectation of instructor or self as audience). |
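For readers who track assessment data programmatically, here is a minimal sketch of the scoring bands above as a lookup, assuming Python. The cutoffs, level names, and point values come from the sample rubric; the function and variable names are ours.

```python
# The sample rubric's scoring bands as a simple lookup table.
# Cutoffs, level names, and point values are taken from the slide above.
LEVELS = [
    (90, "Mastery", 3),                 # "Did it awesomely", A
    (80, "Proficient", 2),              # "Did it", B
    (70, "Developing", 1),              # "Kind of did it", C
    (0, "Non-attempt or Emerging", 0),  # "Didn't do it", D-F
]

def to_rubric_level(percent: float) -> tuple[str, int]:
    """Map a percentage grade to the rubric's level name and point value."""
    for cutoff, name, points in LEVELS:
        if percent >= cutoff:
            return name, points
    raise ValueError("percent must be non-negative")

print(to_rubric_level(84))  # ('Proficient', 2)
print(to_rubric_level(65))  # ('Non-attempt or Emerging', 0)
```

Storing the bands in descending order lets a single linear scan return the first cutoff a score meets, which keeps the mapping easy to audit against the published rubric.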
6
› Course-level assessments
  – Connect to program rubrics through SLO alignment
  – Need to be based on course outcomes to be useful at the course level
  – May be formative or summative, direct or indirect, qualitative or quantitative
  – Can be unique
7
Varying assessment type and focus → a broader body of evidence
8
› Program-level assessments can vary too
  – Internal and/or external stakeholder input
    › Students, employers, faculty, community members
    › Surveys, focus groups, interviews
  – Standardized/licensing/certification exam results
  – Accreditor and/or consultant feedback
  – Practicum/internship evaluations
  – Peer/benchmark comparisons
11
› Start with well-written, measurable program outcome statements
  – “Upon successful completion of this program, the student will be able to…”
› Make any necessary revisions before developing rubrics
12
› Solicit faculty input and agreement through inclusive processes:
  1. Brainstorming
  2. Grouping
  3. Prioritizing
  4. Rubric Refinement
13
› Ask the faculty how students demonstrate learning related to the competency
  – Remind everyone that brainstorming is a process of generating ideas without censure
  – Get the ideas down in writing where all can see (e.g., using a flip chart, projection, or note cards)
  – Note themes that emerge as potential component criteria, but avoid pigeonholing ideas, as this can stifle and/or canalize the flow of ideas
14
› Group ideas to identify key criteria for assessing the competency
› Combine related ideas, with a focus on broad representation
› Rephrase ideas as needed to describe indicators of learning
15
› Reorder or number the indicators of learning based on progressive levels of sophistication/mastery
› Discuss options for performance-level headings and organization
  – Number of levels
  – Words that could be used as headings
  – Ascending or descending order of sophistication
16
› One person (or a small group) can now draft a rubric based on the group’s input
› Strive for clear, concise descriptions of observable demonstrations of learning
  – Broad enough to be applicable to multiple courses
  – Specific enough to provide unambiguous rating scales
17
› Ask the faculty for input
  – Performance-level descriptions need to be truly representative
  – All faculty need to be able to translate their course learning outcomes to the program rubric
› Revisit and revise as needed
  – After implementation, follow up with the faculty to assess how well the rubric worked
18
› Curriculum mapping
› Student Learning Outcome statement writing/refinement
› Facilitating rubric development
› Rubric drafting
› Course-level assessments
› Program-level assessments
› Data analysis
› Representing findings in graphics

Ursula Waln
uwaln@cnm.edu
505-224-4000 x 52943
LSA 101C