2
› Rubrics promote student performance and encourage reflective teaching practices (Beeth et al., 2001; Luft, 1998)
› Consistent rubric use in the classroom can effectively guide students' revisions; this may lead to improvements in overall writing ability (Andrade, 2001; Schirmer & Bailey, 2000)
› The use of rubrics as a teaching strategy significantly improves writing in the following areas: topic, content, story development, and organization (Schirmer, Bailey, & Fitzgerald, 1999)
3
› Rubrics can give teachers insight into the effectiveness of their instructional practices (Waltman et al., 1998)
› Providing scoring rubrics to educators and students can improve student performance (Johnson, Penny, & Gordon, 2009; U.S. Congress, 1992)
4
Providing rubrics can give teachers a sense of control and agency over student assessment. Releasing scoring rubrics might also allay the fears and mysteries surrounding standardized forms of assessment.
6
Purpose
› Provide arts educators and school administrators with a tool to authentically measure students' arts achievement and to evaluate schools' arts programs
› Improve the quality of arts education in South Carolina
Goal
› Develop standards-based arts assessments in dance, music, theatre, and visual arts in order to encourage implementation of the standards and align curriculum, instruction, and assessment
7
Current Assessments
› Four Entry Level: Music, Visual Arts, Dance & Theatre
› Two Intermediate Level: Music & Visual Arts
Assessment Format
› 2 parallel multiple-choice test forms with 45 items
› 2 performance tasks
8
Performance Task 1 › Compare and contrast two artworks
Performance Task 2 › Draw and evaluate own artwork
9
› Are performance assessment scores improved by releasing rubrics to test administrators?
› What other factors might influence students' scores?
› Are the impacts different across types of performance tasks?
10
Data Collected
› SCAAP visual arts performance assessment results, 2008–2010 (Task 1 & Task 2a; N = 24 schools; mean scores from each school)
› Teacher Feedback Survey
Quantitative Analysis
› Repeated-Measures ANOVA
› Kruskal-Wallis test
Qualitative Analysis of Teacher Feedback Survey
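To make the quantitative approach concrete, the sketch below shows how a repeated-measures ANOVA on school-level mean scores might be run in Python with statsmodels. The data frame layout, the column names, and the generated scores are hypothetical assumptions for illustration, not the actual SCAAP values.

```python
# Minimal sketch of a repeated-measures ANOVA on school mean scores,
# assuming long format: one row per school per year. All names and
# values here are hypothetical, not the actual SCAAP data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(42)
n_schools, years = 24, [2008, 2009, 2010]

data = pd.DataFrame({
    "school": np.repeat(np.arange(n_schools), len(years)),
    "year": years * n_schools,
    "mean_score": rng.normal(loc=1.6, scale=0.6, size=n_schools * len(years)),
})

# Within-subjects factor: year; each school is a repeated "subject".
result = AnovaRM(data, depvar="mean_score", subject="school", within=["year"]).fit()
print(result)  # F statistic and p-value for the year effect
```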
11
Mean score changes for the SCAAP schools

Mean score changes for individual SCAAP schools:
› Twelve of 24 schools (50%) improved mean scores from 2008 to 2009
› Ten of 24 schools (42%) improved mean scores from 2009 to 2010

                      2008   2009   2010
All Schools   Mean    1.61   1.64   1.59
              SD      1.17   1.14   1.19
24 Schools    Mean    1.58   1.70   1.46
              SD      0.58   0.53   0.73
12
Task 1: Compare and Contrast
13
Mean score changes by time used to administer SCAAP Task 1
Note: We suggest students be given about 30 minutes to complete each task.

Time Used (minutes)   2008   2009   2010
≤ 20                  1.38   1.59   1.28
21 to 30              1.39   1.63   1.89
> 30                  1.78   1.67   2.03
N/A                   1.66   1.78   1.32
14
Repeated-Measures ANOVA, to examine differences across years:
› No significant differences were found between the three years, F = 2.39, p = .10

Kruskal-Wallis test on time allocation, to examine how the amount of time used impacted student scores:
› No significant differences were found for any of the years (α = .05)

        2008   2009   2010
H       2.38   0.08   3.76
CV      5.67   5.65   5.38
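As a concrete illustration of the Kruskal-Wallis comparison above, here is a minimal sketch using scipy.stats.kruskal on time-allocation groups for a single year. The group scores and variable names are hypothetical, not the SCAAP data; in the study, H stayed below the critical value in every year.

```python
# Minimal sketch of a Kruskal-Wallis test across time-allocation groups
# for one year. The scores below are illustrative, not SCAAP data.
from scipy.stats import kruskal

le_20 = [1.2, 1.4, 1.5, 1.3, 1.6]       # schools using <= 20 minutes
from_21_to_30 = [1.4, 1.6, 1.7, 1.5]    # schools using 21 to 30 minutes
gt_30 = [1.7, 1.8, 1.9, 1.6, 2.0]       # schools using more than 30 minutes

h_stat, p_value = kruskal(le_20, from_21_to_30, gt_30)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")

# Significance requires H to exceed the critical value (CV) at alpha = .05
# (equivalently, p < .05).
```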
15
Mean score changes for the SCAAP schools

Mean score changes for individual SCAAP schools:
› Thirteen of 24 schools (54%) improved mean scores from 2008 to 2009
› Sixteen of 24 schools (67%) improved mean scores from 2009 to 2010

                      2008   2009   2010
All Schools   Mean    1.88   1.89   2.14
              SD      0.85   0.85   0.80
24 Schools    Mean    1.83   1.81   2.03
              SD      0.37   0.33   0.42
16
Task 2a: Drawing
17
Mean score changes by time used to administer SCAAP Task 2a
Note: We suggest students be given about 30 minutes to complete each task.

Time Used (minutes)   2008   2009   2010
≤ 20                  1.73   1.68   1.93
21 to 30              2.09   1.70   2.47
> 30                  1.63   1.78   2.09
N/A                   1.87   1.95   1.99
18
Repeated-Measures ANOVA: F = 7.46, p = .002
› 2008 vs. 2009: F = 0.12, p = .73
› 2009 vs. 2010: F = 11.26, p = .003
› 2008 vs. 2010: F = 8.26, p = .009

Kruskal-Wallis test on time allocation, to examine how the amount of time used impacted student scores:
› No significant difference was found for any of the years (α = .05)

        2008   2009   2010
H       1.34   0.22   2.85
CV      5.42   5.67   5.60
19
Teachers were asked 3 questions regarding their use of the SCAAP rubrics:
› Have you reviewed the performance task rubrics (provided in the Visual Arts Script)?
› Have you shared the information in the rubrics with your students in any way? (If yes, what methods did you use to share that information?)
› Did you use, or will you use, the information in the rubrics to inform your instructional practices? (If yes, please explain.)
29 visual arts teachers responded to the feedback survey.
20
Have you reviewed the performance task rubrics (provided in the Visual Arts Script)? Yes: 72% No: 28%
21
Have you shared the information in the rubrics with your students in any way? (If yes, what methods did you use to share that information?) Yes: 25% No: 75%
22
For the open-ended portion (n = 6), responses indicated that teachers:
› Read the rubrics aloud to the students
› Summarized the rubrics for the students
› Explained what was required to receive the highest score on the task
23
I believe that last year was the first year that these were provided…As we practiced writing prompts during the year, I tried to get across the need to use terms, refer to the pictures, and explain what is being compared… After receiving materials this year, I tried to show them some examples of sentences that might be used and possible scores they may receive. (This was as we practiced in the classroom using different types of pictures.) We had practiced comparing and contrasting landscapes. I may have written a sentence and said, "This may get you a zero or a one." I would add information to the sentence to show how they could give more information.
24
Did you use, or will you use, the information in the rubrics to inform your instructional practices? (If yes, please share how you have used or will use the information in the rubrics to inform instruction.) Yes: 50% No: 50%
25
For the open-ended portion (n = 10), responses indicated that rubrics inform instruction by:
› Serving as a tool for sharing assessment results
› Assisting teachers in preparing for the assessment
› Informing future instructional planning
› Inspiring teachers to incorporate similar elements in their classrooms
26
Importantly, one teacher responded:

I assumed that the instructions were the same as in the past, so, sadly, I did not know to look for any rubrics. Because of this mistake, I did not share with my students. I do think that sharing this information with the students ahead of taking the test would be helpful.
27
I did like the rubric and if I am allowed to have a copy of it, it would be very helpful to use in future lessons when writing an art critique.

I would like to incorporate elements of your rubrics into my classroom so the terminology is consistent in class and in the rubric. I also think it will help the students on the multiple choice section because they will know the terms.

I will use it to help students understand expectations of what is expected in the art room and what they can do to improve not only their work, but also their grades. It is a tangible and understandable measuring tool for art expectations.
28
› Small sample size: 24 schools
› No control over whether and how teachers use the rubrics
› Lack of information about how teachers administered the test
29
Releasing rubrics may impact student performance, but those impacts might differ based on the nature of the task.
Time allocated to administer the tasks was associated with variation in the scores students achieved, but no evidence was found that it significantly changed the results.
30
› Investigation of how teachers used the rubrics
› Examination of how the nature of the task relates to the differing impacts of rubrics on student scores
31
› Make the intended uses of the rubrics more explicit to teachers
› Facilitate teachers in sharing suggestions for how to use the rubrics
› Create and provide grade-appropriate student versions of the rubrics
32
Andrade, H. G. (2001). The effects of instructional rubrics on learning to write. Current Issues in Education, 4(4). Retrieved from http://cie.ed.asu.edu/volume/number4/
Beeth, M. E., Cross, L., Pearl, C., Pirro, J., Yagnesak, K., & Kennedy, J. (2001). A continuum for assessing science process knowledge in grades K-6. Electronic Journal of Science Education, 5(3).
Johnson, R. L., Penny, J. A., & Gordon, B. (2009). Assessing performance: Designing, scoring, and validating performance tasks. New York, NY: The Guilford Press.
Luft, J. (1998). Rubrics: Design and use in science teacher education. Paper presented at the annual meeting of the Association for the Education of Teachers in Science.
Schirmer, B. R., & Bailey, J. (2000). Writing assessment rubric. Teaching Exceptional Children, 33, 52–58.
Schirmer, B. R., Bailey, J., & Fitzgerald, S. M. (1999). Using a writing assessment rubric for writing development of children who are deaf. Exceptional Children, 65, 383–397.
U.S. Congress, Office of Technology Assessment. (1992). Testing in American schools: Asking the right questions (OTA-SET-519). Washington, DC: U.S. Government Printing Office.
Waltman, K., Kahn, A., & Koency, G. (1998). Alternative approaches to scoring: The effects of using different scoring methods on the validity of scores from a performance assessment (CSE Technical Report 488). Los Angeles, CA.
33
Thank you!

Contact Information
Ashlee Lewis: lewisaa2@mailbox.sc.edu
Min Zhu: zhum@mailbox.sc.edu
Xiaofang Zhang: zhang29@mailbox.sc.edu

Office of Program Evaluation
University of South Carolina