Slide 1
Assessment Instruments and Rubrics Workshop Series
Part 4: Data Reporting, Continued
April 6, 2016
Drs. Summer DeProw and Topeka Small
Slide 2
Workshop Agenda
- Follow-up from Part 3's workshop: questions, comments, experiences to share? (Dr. Green… lay it on us!)
- Previous data-reporting discussion: one last look at data from rubrics
- New data-reporting discussion:
  - Standardized/licensure exams: the ETS Major Field Test
  - Homegrown exams: aggregate descriptive statistics and item analysis
- Assessment report forms for 2015-16 (if time allows):
  - Report form for programs without a specialized accreditor
  - Status report form for programs with a specialized accreditor
Slide 3
Follow-up
Part 3's workshop: questions, comments, experiences to share? (Dr. Green… lay it on us!)
Slide 4
One Last Look at Rubric Data: Written Communication

Each dimension is scored on a four-level scale: Capstone (4), Milestone (3), Milestone (2), Benchmark (1).

Context of and Purpose for Writing (includes considerations of audience, purpose, and the circumstances surrounding the writing task(s)):
- Capstone 4: Demonstrates a thorough understanding of context, audience, and purpose that is responsive to the assigned task(s) and focuses all elements of the work.
- Milestone 3: Demonstrates adequate consideration of context, audience, and purpose and a clear focus on the assigned task(s) (e.g., the task aligns with audience, purpose, and context).
- Milestone 2: Demonstrates awareness of context, audience, purpose, and the assigned task(s) (e.g., begins to show awareness of audience's perceptions and assumptions).
- Benchmark 1: Demonstrates minimal attention to context, audience, purpose, and the assigned task(s) (e.g., expectation of instructor or self as audience).

Content Development:
- Capstone 4: Uses appropriate, relevant, and compelling content to illustrate mastery of the subject, conveying the writer's understanding and shaping the whole work.
- Milestone 3: Uses appropriate, relevant, and compelling content to explore ideas within the context of the discipline and shape the whole work.
- Milestone 2: Uses appropriate and relevant content to develop and explore ideas through most of the work.
- Benchmark 1: Uses appropriate and relevant content to develop simple ideas in some parts of the work.

Genre and Disciplinary Conventions (formal and informal rules inherent in the expectations for writing in particular forms and/or academic fields; please see glossary):
- Capstone 4: Demonstrates detailed attention to and successful execution of a wide range of conventions particular to a specific discipline and/or writing task(s), including organization, content, presentation, formatting, and stylistic choices.
- Milestone 3: Demonstrates consistent use of important conventions particular to a specific discipline and/or writing task(s), including organization, content, presentation, and stylistic choices.
- Milestone 2: Follows expectations appropriate to a specific discipline and/or writing task(s) for basic organization, content, and presentation.
- Benchmark 1: Attempts to use a consistent system for basic organization and presentation.

Sources and Evidence:
- Capstone 4: Demonstrates skillful use of high-quality, credible, relevant sources to develop ideas that are appropriate for the discipline and genre of the writing.
- Milestone 3: Demonstrates consistent use of credible, relevant sources to support ideas that are situated within the discipline and genre of the writing.
- Milestone 2: Demonstrates an attempt to use credible and/or relevant sources to support ideas that are appropriate for the discipline and genre of the writing.
- Benchmark 1: Demonstrates an attempt to use sources to support ideas in the writing.

Control of Syntax and Mechanics:
- Capstone 4: Uses graceful language that skillfully communicates meaning to readers with clarity and fluency, and is virtually error-free.
- Milestone 3: Uses straightforward language that generally conveys meaning to readers; the language in the portfolio has few errors.
- Milestone 2: Uses language that generally conveys meaning to readers with clarity, although writing may include some errors.
- Benchmark 1: Uses language that sometimes impedes meaning because of errors in usage.
Slide 5
Note: Each work product was scored on 5 dimensions of written communication using a common AAC&U VALUE Rubric. These results are not generalizable across participating states or the nation in any way. Please use appropriately.
Slide 6
Rubric Data Presented in the Same Layout as the Rubric

Dimension               Capstone 4   Milestone 3   Milestone 2   Benchmark 1   Non-existent 0
Context/Purpose         24.1%        34.8%         29.5%         11.6%         0%
Content Development     22.9%        35.2%         26.9%         14.5%         0.4%
Genre                   19.9%        33.3%         30.7%         14.7%         1.3%
Sources and Evidence    17.1%        27.4%         28.7%         16.1%         10.6%
Syntax and Mechanics    18.7%        34.8%         33.3%         11.8%         1.4%
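For programs building a table like this from their own rubric scores, here is a minimal pandas sketch; the dimension names and scores below are invented illustration data, not the results above.

```python
# Minimal sketch: turning raw rubric scores (levels 0-4) into a
# percentage-distribution table like the one above. The dimension
# names and scores here are invented for illustration.
import pandas as pd

scores = pd.DataFrame({
    "Context/Purpose":     [4, 3, 2, 3, 1, 3, 2, 4],
    "Content Development": [3, 3, 2, 4, 2, 1, 3, 2],
})

levels = [4, 3, 2, 1, 0]  # Capstone 4 ... Non-existent 0
table = (
    scores.apply(lambda col: col.value_counts(normalize=True))
          .reindex(levels)      # keep rubric order, even for empty levels
          .fillna(0)
          .T * 100              # rows = dimensions, columns = levels
).round(1)
print(table)
```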
Slide 7
Purchased/Licensure Exams
Also known as standardized exams:
- ETS Major Field Tests
- CAT (Critical thinking Assessment Test)
- California Critical Thinking Skills Test
- ACAT
- Discipline-specific graduate admissions exams: MCAT, PCAT
- Pre-licensure exams: HESI
- Any licensure exam: PRAXIS, NCLEX, etc.
Slide 8
Purchased Exams

Advantages:
- Tested exam questions
- All statistics are calculated for you, aggregate and disaggregated
- Sub-scores can be directly connected to outcomes
- Comparative data

Disadvantages:
- May include questions that are not part of the curriculum
- Students don't invest themselves
- Most are knowledge-based exams, not application of knowledge
Slide 11
Homegrown Exams

Advantages:
- Professors can align the exam to outcomes
- Professors have confidence in the results
- Can include scenario-based application questions (case study, video, music)

Disadvantages:
- You must create the exam
- Connecting exam questions to multiple outcomes can be laborious
- You must calculate the statistics yourself
- No external comparative data
Slide 12
Aggregate Data
- The entire group over a period of time.
- Example: comprehensive final exam scores (worth 100 points) for senior Plant and Soil Science students in PSSC 4883 Agriculture Systems throughout the 2015-16 academic year (four sections of 25 students).
- Typically descriptive statistics (see the sketch after this list):
  - Central tendency: mean, median, mode
  - Dispersion: range, standard deviation
- Trend analysis (year-to-year percentage changes) can be particularly interesting if you have previously implemented an action plan.
- We are not designing hypothesis tests; in some instances we could, but hypothesis testing is typically not used in student-learning assessment.
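A minimal sketch of these statistics using Python's standard library; all numbers, including the prior-year mean used for the trend, are invented for illustration.

```python
# Hedged sketch of the descriptive statistics named above, using only
# the standard library. All scores are made-up illustration data.
import statistics

scores = [88, 92, 75, 75, 94, 81, 69, 75, 90, 77]  # hypothetical exam scores

print("Mean:   ", statistics.mean(scores))
print("Median: ", statistics.median(scores))
print("Mode:   ", statistics.mode(scores))
print("Range:  ", max(scores) - min(scores))
print("St. dev:", round(statistics.stdev(scores), 2))

# Trend analysis: year-to-year percentage change in the mean.
mean_prior = 79.4                        # assumed prior-year mean
mean_current = statistics.mean(scores)
change = (mean_current - mean_prior) / mean_prior * 100
print(f"Trend: {change:+.1f}% vs. prior year")
```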
Slide 13
Disaggregated Data
Aggregated data broken into parts, such as (a short sketch follows this list):
- Identifiable groups: day classes, night classes, fall or spring semester, race, gender, major
- Identifiable sub-scores: groups of questions that speak to a single subject
- Item analysis: the percentage of responses to each answer option, plus psychometric analysis of the questions and possible answers
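A hedged sketch of the three kinds of disaggregation in pandas; the section labels, question columns, and the mapping of questions to an outcome are all assumptions for illustration.

```python
# Sketch of disaggregation: by group, by sub-score, and by item.
# Column names and the outcome mapping are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "section": ["day", "day", "night", "night", "night"],
    "q1": [1, 0, 1, 1, 0],   # 1 = correct, 0 = incorrect
    "q2": [1, 1, 0, 0, 1],
    "q3": [0, 1, 1, 0, 1],
})

# Identifiable group: mean total score per section.
df["total"] = df[["q1", "q2", "q3"]].sum(axis=1)
print(df.groupby("section")["total"].mean())

# Identifiable sub-score: suppose q1 and q2 both assess one outcome.
df["outcome_sub"] = df[["q1", "q2"]].mean(axis=1)
print(df.groupby("section")["outcome_sub"].mean())

# Item analysis, simplest form: percent correct per question.
print((df[["q1", "q2", "q3"]].mean() * 100).round(1))
```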
Slide 14
Item Analysis
- Blackboard can be very helpful.
- Here is a link to a video from Blackboard on how to use the Item Analysis feature: https://www.youtube.com/watch?v=_ZeyX2yJfFU
Slide 15
Item Analysis
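Blackboard calculates these figures for you; for anyone working by hand, here is a minimal sketch of the two classic item-analysis statistics: difficulty (the proportion answering correctly) and discrimination (how well an item tracks the rest of the test). The response matrix is invented, and statistics.correlation requires Python 3.10 or later.

```python
# Hedged sketch of item difficulty and point-biserial discrimination.
# The 0/1 response matrix is invented; needs Python 3.10+ for
# statistics.correlation.
import statistics

# rows = students, columns = items; 1 = correct, 0 = incorrect
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

totals = [sum(row) for row in responses]

for item in range(len(responses[0])):
    item_scores = [row[item] for row in responses]
    difficulty = statistics.mean(item_scores)            # proportion correct
    rest = [t - s for t, s in zip(totals, item_scores)]  # corrected total
    # Low or negative discrimination flags a question worth revising.
    discrimination = statistics.correlation(item_scores, rest)
    print(f"Q{item + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:+.2f}")
```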
Slide 17
Office of Assessment Report Forms
- Report form for programs without a specialized accreditor
- Status report form for programs with a specialized accreditor
- Feedback, please!
18
Next Workshop: April 27, 2016
- Final workshop for spring 2016
- Focus will be action plans to improve student learning and/or the student-learning assessment process

Summer Workshops 2016
- Digital Books and Student-Learning Outcomes
- Data Capture Using Blackboard and EAC Visual Data

Fall Workshops 2016
- Indirect Assessment Measures: Survey Construction
- Psychometrics of Exam Question Construction