Assessment in College Teaching Ursula Waln, Director of Student Learning Assessment Central New Mexico Community College
Overview: Grading, Assessment & the Purpose of these Slides
Grading and Assessment Go Hand-in-Hand

Grading
- Summarizing student performance symbolically: percentages correct, points earned, letter grades
- A holistic evaluation of work
- Used to communicate student success relative to criteria and/or other students

Assessment
- Analyzing student learning: what students learned well, what they didn't learn so well, and the factors that influenced the learning
- A multifaceted evaluation of student progress
- Used to identify ways to improve learning
- Does not have to involve grades
The Purpose of these Slides

These slides aim to provide an overview of three techniques instructors can use to get the most out of their course-level assessment efforts:
1. Classroom Assessment Techniques (CATs), for formative assessment
2. Item Analysis, for objective evaluation
3. Descriptive Rubrics, for subjective evaluation
Classroom Assessment Techniques: For Formative Assessment
A Comprehensive, Authoritative Resource

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.
- Describes 50 commonly used classroom assessment techniques (CATs)
- Emphasizes the importance of having clear learning goals
- Promotes planned, intentional use to gauge student progress
- Encourages discussing results with the students, both to promote learning and to teach students to monitor their own learning progress
- Encourages the use of insights gained to redirect instruction

Examples of CATs are briefly described in the following 10 slides.
Prior Knowledge, Recall & Understanding

- Misconception/Preconception Check: Having students write answers to questions designed to uncover prior knowledge or beliefs that may impede learning
- Empty Outlines: Providing students with an empty or partially completed outline and having them fill it in
- Memory Matrix: Giving students a table with column and row headings and having them fill in the intersecting cells with relevant details, match the categories, etc.
Skill in Analysis & Critical Thinking

- Categorizing Grid: Giving students a table with row headings and having them match by category and write in corresponding items from a separate list
- Content, Form, and Function Outlines: Having students outline the what, how, and why related to a concept
- Analytic Memos: Having students write a one- or two-page analysis of a problem or issue as if they were writing to an employer, client, stakeholder, politician, etc.
Skill in Synthesis & Creative Thinking

- Approximate Analogies: Having students complete the analogy "A is to B as ___ is to ___," with A and B provided
- Concept Maps: Having students illustrate relationships between concepts by creating a visual layout of bubbles and arrows connecting words and/or phrases
- Annotated Portfolios: Having students create portfolios presenting a limited number of works related to the specific course, a narrative, and perhaps supporting documentation
Skill in Problem Solving

- Problem Recognition Tasks: Presenting students with a few examples of common problem types and asking them to identify the particular type of problem each represents
- What's the Principle?: Presenting students with a few examples of common problem types and asking them to state the principle that best applies to each problem
- Documented Problem Solutions: Having students not only show their work but also explain next to it in writing how they worked the problem out ("show and tell")
Skill in Application & Performance

- Directed Paraphrasing: Having students paraphrase part of a lesson for a specific audience and purpose
- Application Cards: Handing out an index card (or slip of scratch paper) and having students write down at least one real-world application for what they have learned
- Paper or Project Prospectus: Having students create a brief, structured plan for a paper or project, anticipating and identifying the elements to be developed
Awareness of Attitudes & Values

- Profiles of Admirable Individuals: Having students write a brief, focused profile of an individual, in a field related to the course, whose values, skills, or actions they admire
- Everyday Ethical Dilemmas: Presenting students with a case study that poses a course-related ethical dilemma and having them write anonymous responses
- Course-Related Self-Confidence Surveys: Having students write responses to a few questions aimed at measuring their self-confidence in relation to a specific skill or ability
Self-Awareness as Learners

- Focused Autobiographical Sketches: Having students write one to two pages about a single, successful learning experience in their past relevant to the learning in the course
- Interest/Knowledge/Skills Checklists: Giving students a checklist of the course topics and/or skills and having them rate their level of interest, skill, and/or knowledge for each
- Goal Ranking and Matching: Having students write down and rank a few goals they hope to achieve in relation to the course or program, then comparing student goals to instructor/program goals to help students better understand what the course or program is about
Course-Related Study Skills & Behaviors

- Productive Study-Time Logs: Having students record how much time they spend studying, when they study, and/or how productively they study
- Punctuated Lectures: Stopping periodically during lectures and having students reflect upon, and then write briefly about, their listening behavior just prior and how it helped or hindered their learning
- Process Analysis: Having students keep a record of the steps they take in carrying out an assignment and then reflect on how well their approach worked
Reactions to Instruction

- Teacher-Designed Feedback Forms: Having students respond anonymously to 3 to 7 questions in multiple-choice, Likert-scale, or short-answer formats to get course-specific feedback
- Group Instructional Feedback Technique: Having someone other than the instructor poll students on what works, what doesn't, and what could be done to improve the course
- Classroom Assessment Quality Circles: Involving groups of students in conducting structured, ongoing assessment of course materials, activities, and assignments and suggesting ways to improve student learning
Reactions to Class Activities & Materials

- Group-Work Evaluations: Having students answer questions to evaluate team dynamics and learning experiences following cooperative learning activities
- Reading Rating Sheets: Having students rate their own reading behaviors and/or the interest, relevance, etc., of a reading assignment
- Exam Evaluations: Having students provide feedback on the degree to which an exam (and preparing for it) helped them learn the material, how fair they think the exam is as an assessment of their learning, etc.
Item Analysis: For Objective Evaluation
Item Analysis

- Looks at the frequency of correct responses (or behaviors) in connection with overall performance
- Used to examine item reliability: how consistently a question or performance criterion discriminates between high and low performers
- Can be useful in improving the validity of measures
- Can help instructors decide whether to eliminate certain items from grade calculations
- Can reveal specific strengths and gaps in student learning
How Item Analysis Works

- Groups students by the highest, mid-range, and lowest overall scores and examines item responses by group
- Assumes that higher-scoring students have a higher probability of getting any given item correct than do lower-scoring students; they may have studied and/or practiced more and understood the material better, or may have greater test-taking savvy, less anxiety, etc.
- Produces a calculation for each item: do it yourself to easily calculate a group difference or discrimination index, or use EAC Outcomes (a Blackboard plug-in made available to all CNM faculty by the Nursing program) to generate a point-biserial correlation coefficient
- Gives the instructor a way to analyze performance on each item
One Way to Do Item Analysis by Hand
Shared by Linda Suskie at the NMHEAR Conference, 2015

For each item, tally how many students missed it within the top 27%, middle 46%, and lower 27% of overall scorers.* Then compute the total percentage who missed the item and the group difference (the number in the lower group who missed it minus the number in the top group who missed it).

  Item | % who missed item | Group difference (# in lower minus # in top)
  1    | 34%               | 17  (good discrimination)
  2    | 40%               | 12
  3    |  5%               | near zero  (an unreliable question)
  4    | 17%               | 11

* You can use whatever portion you want for the top and lower groups, but they need to be equal. Using 27% is the accepted convention (Truman Kelley, 1939).
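The tally-and-subtract method above takes only a few lines of code. This is a sketch with hypothetical tallies (in practice the counts would come from your own exam results):

```python
# Group-difference item analysis, as in the hand-tally method above.
# Tallies are hypothetical; replace them with counts from your exam.
N_TESTED = 115  # total students

# For each item: number of students who MISSED it, by overall-score
# group (top 27%, middle 46%, lower 27%).
missed = {
    "item 1": {"top": 5, "middle": 20, "lower": 14},
    "item 2": {"top": 1, "middle": 3, "lower": 2},
}

for item, m in missed.items():
    pct_missed = 100 * sum(m.values()) / N_TESTED
    group_diff = m["lower"] - m["top"]  # larger = better discrimination
    print(f"{item}: {pct_missed:.0f}% missed, group difference = {group_diff}")
```

With these made-up tallies, item 1 shows a sizable group difference (more low scorers than high scorers missed it), while item 2 shows almost none.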
Another Way to Do Item Analysis by Hand
Rasch Item Discrimination Index (D)
N = 31 because the upper and lower groups each contain 31 students (115 students tested).

  Item | # in Upper Group correct (#UG) | Portion of UG correct (pUG) | # in Lower Group correct (#LG) | Portion of LG correct (pLG) | D = pUG - pLG
  1    | 31 | 1.00 (100%) | 14 | 0.45 (45%) |  0.55  (good discrimination)
  2    | 24 | 0.77 (77%)  | 12 | 0.39 (39%) |  0.38
  3    | 28 | 0.90 (90%)  | 29 | 0.93 (93%) | -0.03  (an unreliable question)
  4    | 31 | 1.00 (100%) | 20 | 0.65 (65%) |  0.35

A discrimination index of 0.4 or greater is generally regarded as high, and anything less than 0.2 as low (R. L. Ebel, 1954).
The Same Thing but Less Complicated
Rasch Item Discrimination Index: D = (#UG - #LG) / N
N in the upper and lower groups is 31 (27% of 115 students).

  Item | #UG | #LG | D
  1    | 31  | 14  |  0.55
  2    | 24  | 12  |  0.38
  3    | 28  | 29  | -0.03
  4    | 31  | 20  |  0.35

It isn't necessary to calculate the portions of correct responses in each group if you use the formula shown here. This is really easy to do!
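Using the simplified formula, the whole table reduces to a short loop. This sketch uses the item counts from the slides; note that computing D from the raw counts can differ by 0.01 from rounding the two portions first:

```python
# Discrimination index D = (#UG - #LG) / N, with the slides' numbers.
N = 31  # students in each of the upper and lower groups (27% of 115)

correct = {  # item: (upper-group correct, lower-group correct)
    1: (31, 14),
    2: (24, 12),
    3: (28, 29),
    4: (31, 20),
}

for item, (ug, lg) in correct.items():
    d = (ug - lg) / N
    # Ebel's rule of thumb: >= 0.4 is high, < 0.2 is low.
    verdict = "high" if d >= 0.4 else ("low" if d < 0.2 else "moderate")
    print(f"item {item}: D = {d:+.2f} ({verdict})")
```

Item 3 comes out slightly negative (lower scorers did better than higher scorers on it), which is the signature of an unreliable question.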
Example of an EAC Outcomes Report
A point-biserial correlation is the Pearson correlation between responses to a particular item (correct/incorrect) and scores on the total test (with or without that item). Correlation coefficients range from -1 to 1: a strongly positive coefficient indicates good discrimination, while a near-zero or negative coefficient flags an unreliable question. EAC Outcomes is available to CNM faculty through Blackboard course tools.
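EAC Outcomes computes this statistic for you inside Blackboard, but the underlying calculation is just a Pearson correlation on 0/1 item responses. A hand-rolled sketch with made-up data (eight hypothetical students):

```python
# Point-biserial = Pearson correlation between an item's 0/1 responses
# and each student's total test score.
import math

def point_biserial(item_correct, total_scores):
    n = len(item_correct)
    mean_x = sum(item_correct) / n
    mean_y = sum(total_scores) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(item_correct, total_scores))
    var_x = sum((x - mean_x) ** 2 for x in item_correct)
    var_y = sum((y - mean_y) ** 2 for y in total_scores)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical data: item responses (1 = correct) and total scores.
responses = [1, 1, 1, 0, 1, 0, 0, 0]
totals    = [95, 90, 88, 80, 78, 70, 65, 60]
print(f"r_pb = {point_biserial(responses, totals):.2f}")  # → r_pb = 0.81
```

A positive coefficient like this one means higher-scoring students tended to answer the item correctly, which is what a reliable item should show.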
Identifying Key Questions

- A key (a.k.a. signature) question is one that provides information about student learning in relation to a specific instructional objective (or student learning outcome statement).
- The item analysis methods shown in the preceding slides can help you identify and improve the reliability of key questions; a low level of discrimination may indicate a need to tweak the wording.
- Improving an item's discrimination value also improves its validity, and the more valid an assessment measure, the more useful it is in gauging student learning.
Detailed Multiple-Choice Item Analysis

The detailed item analysis method shown on the next slide is for use with key multiple-choice items. This type of analysis can provide clues to the nature of students' misunderstanding, provided:
- The item is a valid measure of the instructional objective
- Incorrect options (distractors) are written to be diagnostic (i.e., to reveal misconceptions or breakdowns in understanding)
Example of a Detailed Item Analysis
Item 2 of 4. The correct option is E. (115 students tested.)

  Group       | A         | B        | C       | D        | E          | Row total
  Upper 27%   | 2 (6.5%)  | 5 (16%)  | 0       | 0        | 24 (77.5%) | 31
  Middle 46%  | 3 (6%)    | 14 (26%) | 2 (4%)  | 1 (2%)   | 33 (62%)   | 53
  Lower 27%   | 5 (16%)   | 7 (23%)  | 5 (16%) | 2 (6%)   | 12 (39%)   | 31
  Grand total | 10 (8.5%) | 26 (23%) | 7 (6%)  | 3 (2.5%) | 69 (60%)   | 115

These results suggest that distractor B might provide the greatest clue about the breakdown in students' understanding, followed by distractor A, then C.
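This kind of response-pattern tabulation is easy to automate. The counts below are reconstructed from the slide's row totals and percentages:

```python
# Response-pattern analysis for one multiple-choice item (Item 2 of 4,
# correct option E), with counts per overall-score group.
pattern = {
    "upper 27%":  {"A": 2, "B": 5,  "C": 0, "D": 0, "E": 24},
    "middle 46%": {"A": 3, "B": 14, "C": 2, "D": 1, "E": 33},
    "lower 27%":  {"A": 5, "B": 7,  "C": 5, "D": 2, "E": 12},
}
CORRECT = "E"

for group, counts in pattern.items():
    total = sum(counts.values())
    row = ", ".join(f"{opt} {100 * n / total:.0f}%" for opt, n in counts.items())
    print(f"{group} (n={total}): {row}")

# The distractor chosen most often hints at the most common misconception.
wrong = {opt: sum(g[opt] for g in pattern.values()) for opt in "ABCD"}
print("most-chosen distractor:", max(wrong, key=wrong.get))  # → B
```

Here distractor B draws the most wrong answers overall, matching the slide's conclusion that B offers the greatest diagnostic clue.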
Descriptive Rubrics: For Subjective Evaluation
Rubric: Just Another Word for Scoring Guide

- A rubric is any scoring guide that lists specific criteria, such as a checklist or a rating scale.
- Checklists are used for objective evaluation (did or did not do it).
- Rating scales are used for subjective evaluation (gradations of quality).
- Descriptive rubrics are rating scales that contain descriptions of what constitutes each level of performance. (Call them descriptive scoring guides if you don't like the word rubric.)
- Most people who talk about rubrics are referring to descriptive rubrics, not checklists or rating scales.
The Purpose of Descriptive Rubrics

Descriptive rubrics are used to lend objectivity to evaluations that are inherently subjective, e.g.:
- Grading artwork, papers, performances, projects, speeches, etc.
- Assessing overall student progress toward specific learning outcomes (course and/or program level)
- Monitoring developmental levels of individuals as they progress through a program ('developmental rubrics')
- Conducting employee performance evaluations
- Assessing group progress toward a goal

When used by multiple evaluators, descriptive rubrics can minimize differences in rater thresholds (especially if normed).
Why Use Descriptive Rubrics in Class?

In giving assignments, descriptive rubrics can help clarify the instructor's expectations and grading criteria for students:
- Students can ask more informed questions about the assignment.
- A clear sense of what is expected can inspire students to achieve more.
- The rubric helps explain to students why they received the grade they did.

Descriptive rubrics also help instructors remain fair and consistent in their scoring of student work (more so than rating scales):
- Scoring is easier and faster when descriptions clearly distinguish levels.
- The effects of scoring fatigue (e.g., grading more generously toward the bottom of a stack due to disappointed expectations) are minimized.
Why Use Descriptive Rubrics for Assessment?

- Clearly identifying benchmark levels of performance, and describing what learning looks like at each level, establishes a solid framework for interpreting multiple measures of performance.
- Student performance on different types of assignments, and at different points in the learning process, can be interpreted for analysis using a descriptive rubric as a central reference.
- With rubrics that describe what goal achievement looks like, instructors can more readily identify and assess the strength of connections between course assignments and course goals, and between course assignments and program goals.
Two Common Types of Descriptive Rubrics

Holistic
- Each level of performance has just one comprehensive description.
- Descriptions may be organized in columns or rows.
- Useful for quick and general assessment and feedback.

Analytic
- Each level of performance has a description for each of the performance criteria.
- Descriptions are organized in a matrix.
- Useful for detailed assessment and feedback.
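The difference between the two types can be made concrete with a toy scoring sketch. The criteria, level names, and point values here are hypothetical, not drawn from any particular rubric:

```python
# A toy analytic rubric: a matrix of criteria x levels, where each
# criterion is rated at one level and the levels carry point values.
rubric = {
    "clarity":      {"emerging": 3, "intermediate": 6, "proficient": 10},
    "organization": {"emerging": 3, "intermediate": 6, "proficient": 10},
    "evidence":     {"emerging": 3, "intermediate": 6, "proficient": 10},
}

# One student's ratings: one level per criterion.
ratings = {"clarity": "proficient",
           "organization": "intermediate",
           "evidence": "intermediate"}

score = sum(rubric[crit][level] for crit, level in ratings.items())
print(f"total: {score}/30")  # → total: 22/30
```

A holistic rubric, by contrast, assigns the whole piece a single level, so scoring is one lookup rather than a sum over criteria.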
Example of a Holistic Rubric

Proficient (10 points): Ideas are expressed clearly and succinctly. Arguments are developed logically and with sensitivity to audience and context. Original and interesting concepts and/or unique perspectives are introduced.

Intermediate (6 points): Ideas are clearly expressed but not fully developed or supported by logic and may lack originality, interest, and/or consideration of alternative points of view.

Emerging (3 points): Expression of ideas is either undeveloped or significantly hindered by errors in logic, grammatical and/or mechanical errors, and/or over-reliance on jargon and/or idioms.
Example of an Analytic Rubric
Delve, Mintz, and Stewart's (1990) Service Learning Model
Phases: 1 Exploration, 2 Clarification, 3 Realization, 4 Activation, 5 Internalization

Intervention
- Mode: 1 Group; 2 Group (beginning to identify with group); 3 Group that shares focus, or independently; 4-5 Individual
- Setting: 1 Minimal community interaction (prefers on-campus activities); 2 Trying many types of contact; 3 Direct contact with community; 4 Direct contact with community, with intense focus on an issue or cause; 5 Frequent and committed involvement

Commitment
- Frequency: 1 One time; 2 Several activities or sites; 3 Consistently at one site; 4 Consistently at one site or with one issue; 5 Consistently at one site or focused on particular issues
- Duration: 1 Short term; 2 Long-term commitment to group; 3 Long-term commitment to activity, site, or issue; 4 Lifelong commitment to issue (beginnings of civic responsibility); 5 Lifelong commitment to social justice

Behavior
- Needs: 1 Participate in incentive activities; 2 Identify with group camaraderie; 3 Commit to activity, site, or issue; 4 Advocate for issue(s); 5 Promote values in self and others
- Outcomes: 1 Feeling good; 2 Belonging to a group; 3 Understanding activity, site, or issue; 4 Changing lifestyle; 5 Living one's values

Balance
- Challenges: 1 Becoming involved; concern about new environments; 2 Choosing from multiple opportunities/group process; 3 Confronting diversity and breaking from group; 4 Questioning authority/adjusting to peer pressure; 5 Living consistently with values
- Supports: 1 Activities are non-threatening and structured; 2 Group setting, identification, and activities are structured; 3 Reflective: supervisors, coordinators, faculty, and other volunteers; 4 Reflective: partners, clients, and other volunteers; 5 Community: have achieved a considerable inner support system
Another Example of an Analytic Rubric
AAC&U Ethical Reasoning VALUE Rubric (levels: Capstone 4; Milestones 3 and 2; Benchmark 1)

Ethical Self-Awareness
- 4: Student discusses in detail/analyzes both core beliefs and the origins of the core beliefs, and discussion has greater depth and clarity.
- 3: Student discusses in detail/analyzes both core beliefs and the origins of the core beliefs.
- 2: Student states both core beliefs and the origins of the core beliefs.
- 1: Student states either their core beliefs or articulates the origins of the core beliefs, but not both.

Understanding Different Ethical Perspectives/Concepts
- 4: Student names the theory or theories, can present the gist of said theory or theories, and accurately explains the details of the theory or theories used.
- 3: Student can name the major theory or theories she/he uses, can present the gist of said theory or theories, and attempts to explain the details of the theory or theories used, but has some inaccuracies.
- 2: Student can name the major theory she/he uses, and is only able to present the gist of the named theory.
- 1: Student only names the major theory she/he uses.

Ethical Issue Recognition
- 4: Student can recognize ethical issues when presented in a complex, multilayered (gray) context AND can recognize cross-relationships among the issues.
- 3: Student can recognize ethical issues when issues are presented in a complex, multilayered (gray) context OR can grasp cross-relationships among the issues.
- 2: Student can recognize basic and obvious ethical issues and grasp (incompletely) the complexities or interrelationships among the issues.
- 1: Student can recognize basic and obvious ethical issues but fails to grasp complexity or interrelationships.

Application of Ethical Perspectives/Concepts
- 4: Student can independently apply ethical perspectives/concepts to an ethical question, accurately, and is able to consider the full implications of the application.
- 3: Student can independently apply ethical perspectives/concepts to an ethical question, accurately, but does not consider the specific implications of the application.
- 2: Student can apply ethical perspectives/concepts to an ethical question, independently (to a new example), but the application is inaccurate.
- 1: Student can apply ethical perspectives/concepts to an ethical question with support (using examples, in a class, in a group, or a fixed-choice setting) but is unable to apply ethical perspectives/concepts independently (to a new example).

Evaluation of Different Ethical Perspectives/Concepts
- 4: Student states a position and can state the objections to, assumptions and implications of, and can reasonably defend against the objections to, assumptions and implications of different ethical perspectives/concepts, and the student's defense is adequate and effective.
- 3: Student states a position and can state the objections to, assumptions and implications of, and respond to the objections to, assumptions and implications of different ethical perspectives/concepts, but the student's response is inadequate.
- 2: Student states a position and can state the objections to, assumptions and implications of different ethical perspectives/concepts but does not respond to them (and ultimately objections, assumptions, and implications are compartmentalized by the student and do not affect the student's position).
- 1: Student states a position but cannot state the objections to and assumptions and limitations of the different perspectives/concepts.
36
There are No Rules for Developing Rubrics
Form typically follows function, so how one sets up a descriptive rubric is usually determined by how one plans to use it.
  Performance levels are usually column headings but can function just as well as row headings.
  Performance levels can be arranged in ascending or descending order, and one can include as many levels as one wants.
  Descriptions can focus only on positive manifestations or can include references to missing or negative characteristics.
  Some rubrics use grid lines; others do not.
37
Descriptive rubrics can help pull together results from multiple measures — artifact analyses, interviews, observations, surveys, and written tests — for a more comprehensive picture of student learning.
38
Using the Model
To pull together multiple measures for an overall assessment of student learning:
  Take a random sample from each assignment and re-score those using the rubric (instead of the grading criteria), or rate the students as a group based on overall performance on each assignment.
  Then combine the results, weighting their relative importance based on:
    At what stage in the learning process the results were obtained
    How well you think students understood the assignment or testing process
    How closely the learning measured relates to the instructional objectives
    Factors that could have biased the results
    Your own observations, knowledge of the situations, and professional judgment
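The weighting step above can be sketched as a simple weighted average. Everything in this example is illustrative: the measure names, scores, and weights are invented stand-ins for the kinds of judgments the slide describes (stage of learning, alignment with objectives, potential bias), not prescribed values.

```python
# Hypothetical rubric results from several measures, each with a weight
# reflecting the instructor's judgment about its relative importance.
measures = {
    # measure: (mean rubric score on a 1-4 scale, weight)
    "early-term CAT":     (2.1, 0.1),  # formative, early in the learning
    "midterm exam items": (2.8, 0.3),
    "final project":      (3.4, 0.4),  # most closely tied to the objectives
    "exit survey":        (3.0, 0.2),  # self-report, possible bias
}

def combined_score(measures):
    """Weighted average of the measure scores."""
    total_weight = sum(w for _, w in measures.values())
    return sum(score * w for score, w in measures.values()) / total_weight

overall = combined_score(measures)
print(f"Overall rating: {overall:.2f} on a 1-4 scale")  # 3.01 here
```

The arithmetic is trivial; the assessment work is in choosing defensible weights and being able to explain them.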
39
"Remember that when you do assessment, whether in the department, the general education program, or at the institutional level, you are not trying to achieve the perfect research design; you are trying to gather enough data to provide a reasonable basis for action. You are looking for something to work on."
Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.